Optimise the basic algorithm (start with variable run lengths and smart selection of parameter values).
- Include ramping up to 4 machines (named Conner001 etc.): take a snapshot of Conner001 and replicate it to the others.
- Use 10 days of CSV data, to be supplied.
The first milestone (75% of the bid) is to review this strategy and determine whether it enhances the performance of the original algorithm. There should be 3 or 4 queries with a satisfactory positive result (judged against the original objectives), i.e. not something negligible like 0.000000008. Once this has been achieved, milestone 2 will be the successful application of these queries to separate test data (1-3 days) while maintaining reasonable* performance.
*Reasonable: from our recent experience we have both realised it is hard to be specific and that this process is intuitive. I use "reasonable" with fairness and commercial common sense applied.
- Also, generated query files are to be named [url removed, login to view] etc., skipping files that already exist, where ABV is the abbreviation for the algorithm (e.g. IH001 for my first one; project 3 will be IH002).