Looking for someone to teach me how to collect data from the web and from documents on my own, without depending on others. I have wasted so much time and money trying to hire a programmer and/or find a partner that my project, which has serious financial potential, is now in jeopardy.
Specifically, I am looking for someone who can do the following:
1) Walk me through (via screen share) some of the numerous scraping programs currently available, including their pros, cons, and limitations, so I can make an informed decision about whether, and to what extent, such a program would be a more efficient and cost-effective way to collect the data than writing scripts from scratch.
2) Take the mystery out of scripting and help me write a script, either one that picks up where a scraping program leaves off or one built from scratch, that can serve as a template for additional data fields and sources.
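To give a sense of what such a template script might look like, here is a minimal sketch using only Python's standard library. Everything here is illustrative: the sample HTML, the field names, and the class are hypothetical, and a real script would fetch live pages (e.g. with `urllib.request`) and adapt the parsing to each source's actual markup.

```python
from html.parser import HTMLParser

class TableScraper(HTMLParser):
    """Minimal template: collect the text of each <td> cell, grouped by <tr> row."""
    def __init__(self):
        super().__init__()
        self.in_td = False
        self.rows = []      # finished rows, each a list of cell strings
        self.current = []   # cells of the row being parsed

    def handle_starttag(self, tag, attrs):
        if tag == "td":
            self.in_td = True

    def handle_endtag(self, tag):
        if tag == "td":
            self.in_td = False
        elif tag == "tr" and self.current:
            self.rows.append(self.current)
            self.current = []

    def handle_data(self, data):
        if self.in_td and data.strip():
            self.current.append(data.strip())

def scrape_table(html, field_names):
    """Map each table row to a dict keyed by field_names (a reusable template)."""
    parser = TableScraper()
    parser.feed(html)
    return [dict(zip(field_names, row)) for row in parser.rows]

# Hypothetical sample page standing in for a real results page.
SAMPLE = """
<table>
  <tr><td>Horse A</td><td>2:01.3</td><td>5-1</td></tr>
  <tr><td>Horse B</td><td>2:02.0</td><td>8-1</td></tr>
</table>
"""
records = scrape_table(SAMPLE, ["horse", "time", "odds"])
```

The point of the `field_names` parameter is extensibility: adding another data field or source means changing the field list and the parsing rules, not rewriting the whole script, which is the "template" idea described above.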
Time is critical and my budget is limited. I am available to work on this 24/7. The further you can take me, the more hours I am willing to pay for.
---About My Project---
I am developing a formula-based system for betting on horse races by adapting concepts used in trading financial markets (analogous to technical analysis) and eliminating the judgment applied by traditional handicappers (analogous to fundamental analysis). Formulas can be tested; judgment cannot. I see patterns with so much consistency that I put my life on hold to test them, but my failure to find a programmer has prevented me from collecting enough data to put them to the test.
I believe this project has the potential to go all the way--it's sexy, with limitless potential if my patterns play out. However, even if my patterns prove too inconsistent to form the basis of a betting system, the project still has strong commercial viability: first, as an application to collect and store data not currently available, and second, depending on licensing limits, as a fee-based website providing this data on a historical basis.