I need a web scraper written for the following URL:
[login to view URL]
All information needed is available on the main page. The number of rows will vary.
The output should be a pipe (|) delimited file with the following column mappings:
origin_city --> data located in the "Origin City" column
origin_state --> data located in the "ST" column located AFTER the "Origin City" column
ship_date --> data located in the "P/U Date" column, converted to YYYY-MM-DD format
destination_city --> data located in the "Dest. City" column
destination_state --> data located in the "ST" column located AFTER the "Dest. City" column
receive_date --> leave blank
trailer_type --> data located in the "Eq. Type" column
load_size --> data located in the "Full/LTL" column
weight --> data located in the "Weight" column; if the value has only two
digits, append three zeros (e.g. 48 = 48000); if the column is blank, leave blank
length --> data located in the "Length" column
width --> leave blank
height --> leave blank
trip_miles --> data located in the "Miles" column
pay_rate --> data located in the "Rate" column
contact_phone --> leave blank
contact_name --> data located in the "Dispatcher" and "Ext" columns
tarp_required --> leave blank
comment --> data located in the "Special Req." column
load_number --> leave blank
commodity --> data located in the "Commodity" column
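The two non-trivial transforms above (date reformat and weight padding) could be sketched in Perl like this. This is only a sketch: it assumes the site's "P/U Date" values look like MM/DD/YYYY, which would need to be confirmed against the real page. (`use strict; use warnings; use feature 'say';` is used here in place of Modern::Perl, which bundles the same pragmas.)

```perl
use strict;
use warnings;
use feature 'say';

# Convert a date like "3/7/2024" to "2024-03-07".
# ASSUMPTION: the source column is MM/DD/YYYY; adjust the regex if not.
sub ship_date {
    my ($raw) = @_;
    return '' unless defined $raw && $raw =~ m{^(\d{1,2})/(\d{1,2})/(\d{4})$};
    return sprintf '%04d-%02d-%02d', $3, $1, $2;
}

# Append three zeros when the weight has only two digits (48 -> 48000);
# a blank value stays blank, longer values pass through unchanged.
sub weight {
    my ($raw) = @_;
    return '' unless defined $raw && $raw =~ /\d/;
    ($raw) = $raw =~ /(\d+)/;    # keep the digits only
    return length($raw) == 2 ? $raw . '000' : $raw;
}

say ship_date('3/7/2024');    # 2024-03-07
say weight('48');             # 48000
```

Values that already carry more than two digits (e.g. "48000") would be passed through as-is by this rule.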
The first line of the output should contain all of the column headers.
Any field that contains no data should be left blank.
Please do not use words like "null" or "blank" in blank columns.
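The header line and blank-field rules can be sketched as follows; the column order comes from the mapping above, and any field with no data is emitted as an empty string, never the literal words "null" or "blank":

```perl
use strict;
use warnings;
use feature 'say';

# Column order taken from the mapping in the posting.
my @columns = qw(
    origin_city origin_state ship_date destination_city destination_state
    receive_date trailer_type load_size weight length width height
    trip_miles pay_rate contact_phone contact_name tarp_required
    comment load_number commodity
);

# First line of the output: all column headers, pipe-delimited.
say join '|', @columns;

# Each scraped row is a hash of column => value; any missing or
# undefined field becomes an empty string, so blanks stay blank.
sub row_line {
    my ($row) = @_;
    return join '|', map { $row->{$_} // '' } @columns;
}

# Example row with only two fields set: "Dallas|TX" then 18 empty fields.
say row_line({ origin_city => 'Dallas', origin_state => 'TX' });
```

Using `//` (defined-or) rather than `||` preserves legitimate zero values while still mapping missing fields to empty strings.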
Below is a sample output of the first 5 columns using sample data:
The deliverable will be a Perl .pl file that must run on
Ubuntu Linux and must use Modern::Perl. The Perl .pl file
should be called '[login to view URL]' and the output file should be
called '[login to view URL]'
It will be scheduled in cron to run unattended every 15 minutes.
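The 15-minute schedule could be expressed as a crontab entry along these lines; the script path and name below are placeholders, since the real file names are withheld above:

```
# Run the scraper every 15 minutes, unattended (paths are hypothetical).
*/15 * * * * cd /path/to/scraper && /usr/bin/perl scraper.pl >/dev/null 2>&1
```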
Please specify which language/OS/modules you plan to use.
Also, please include the word "raccoon" in your bid so I know that
you read this description.