I need to export some data from the open-source Wikipedia database. Please find details in the attached file.
## Deliverables
1) Complete and fully functional working program(s) in executable form, as well as complete source code of all work done.
2) Deliverables must be in ready-to-run condition, as follows (depending on the nature of the deliverables):
a) For web sites or other server-side deliverables intended to exist only in one place in the Buyer's environment: deliverables must be installed by the Seller in ready-to-run condition in the Buyer's environment.
b) For all others, including desktop software or software the Buyer intends to distribute: a software installation package that will install the software in ready-to-run condition on the platform(s) specified in this bid request.
3) All deliverables will be considered "work made for hire" under U.S. Copyright law. Buyer will receive exclusive and complete copyrights to all work purchased. (No GPL, GNU, 3rd party components, etc. unless all copyright ramifications are explained AND AGREED TO by the buyer on the site per the coder's Seller Legal Agreement).
* * *

This broadcast message was sent to all bidders on Tuesday, May 30, 2006, 12:26:25 PM:
Some answers to questions asked:
1. You are not expected to do "screen scraping". I have a Wikipedia database loaded on one of my Linux dedicated servers; it is 8 months old, so it will need to be updated with the latest update SQL files.
2. I do not require the script; the only deliverable is the CSV export.
3. The difficulty of this job depends on your familiarity with the Wikipedia database structure.
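The Q&A above implies pulling rows from the MediaWiki tables and writing them out as CSV. A minimal sketch of the CSV-writing half, with stub rows standing in for a real MySQL query result (the `page` table column names used here are an assumption based on the standard MediaWiki schema; the bid does not specify which fields are wanted):

```python
import csv
import io

def rows_to_csv(rows, header=("page_id", "page_namespace", "page_title")):
    """Serialize rows (e.g. fetched from MediaWiki's `page` table with any
    MySQL client) as CSV text. The default header is an assumption based on
    the standard MediaWiki schema, not a requirement stated in the bid."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(header)
    writer.writerows(rows)
    return buf.getvalue()

# Stand-in rows; in practice these would come from a query such as:
#   SELECT page_id, page_namespace, page_title FROM page WHERE page_namespace = 0;
sample = [(1, 0, "Main_Page"), (2, 0, "Wikipedia")]
print(rows_to_csv(sample))
```

The actual export would replace the stub rows with a cursor over the live database; which tables and columns to dump would need to be settled with the buyer first.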
## Platform
win2k, linux