Currently I have a custom-built, flat-file, fixed-field database. It simply stores usernames and passwords, and a few other pieces of info.
A C# application currently does random-access reads and writes on the
file database, and this application has been running for many months.
It needs to be enhanced. There are 2 main things that need to
be done: (1) an in-RAM data structure to allow caching, so that
reads are done WITHOUT disk access, plus archiving; and (2) the data has to sync with either a Postgres DB or an MS SQL DB, preferably Postgres. The sync only needs to work in one direction at this time.
(1) On startup of the application, all fields should be read into a data structure, probably a .NET hash-table data structure with the
login name as the search key; the data held in each entry will
simply be username, password, and a few other fields.
When the existing application goes to READ, it hits the hard drive each time; we need reads to in fact go only to the
RAM hash data structure (thus not involving a hard-drive access). Now if there is some data change, then the data has to be
updated in the RAM data structure (essentially acting as a cache)
as well as written to the file DB on the hard drive.
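As a minimal sketch of this read-from-RAM, write-through behavior (assuming the record boils down to a username and password; the real record has a few more fields, and the delegate below stands in for the existing random-access flat-file write routine):

```csharp
using System;
using System.Collections.Generic;

public class WriteThroughCache
{
    // Login name is the key, as described above.
    private readonly Dictionary<string, string> cache =
        new Dictionary<string, string>();
    private readonly Action<string, string> writeToFile;

    public WriteThroughCache(Action<string, string> writeToFile)
    {
        this.writeToFile = writeToFile;
    }

    // Called once at startup with every record parsed from the flat file.
    public void Load(IEnumerable<KeyValuePair<string, string>> records)
    {
        foreach (var r in records)
            cache[r.Key] = r.Value;
    }

    // Reads come purely from RAM; the disk is never touched.
    public bool TryRead(string username, out string password)
    {
        return cache.TryGetValue(username, out password);
    }

    // Writes update the cache first, then the flat file on disk.
    public void Write(string username, string password)
    {
        cache[username] = password;
        writeToFile(username, password);
    }
}
```

The `Dictionary<TKey, TValue>` type gives O(1) keyed lookups and is available on Mono, so the same code runs on Linux.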
It is also required that an archive of the DB be created and saved on
disk with a timestamp, every (a) one hour, or (b) after 10 transactions
that involve changing data, whichever is LONGER. So if 40 pieces of data change in one hour, that will still only result in ONE creation
of an archive file at the next hourly interval. If one piece of data
changes, again, the archive is written at the next hourly interval. In other words, if data changes, don't write more frequently than once per hour. If no data changes, then there is no need to write an archive.
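Following the clarification above (an archive at most once per hour, and only when at least one change occurred since the last one), the throttle could be sketched like this; the class and method names are placeholders:

```csharp
using System;

public class ArchiveScheduler
{
    private DateTime lastArchive;
    private int changesSinceLastArchive;
    private readonly TimeSpan interval = TimeSpan.FromHours(1);

    public ArchiveScheduler(DateTime start)
    {
        lastArchive = start;
    }

    // Call on every transaction that changes data.
    public void RecordChange()
    {
        changesSinceLastArchive++;
    }

    // Checked periodically (e.g. from a timer tick): archive only when a
    // full hour has passed AND something actually changed in that time.
    public bool ShouldArchive(DateTime now)
    {
        return now - lastArchive >= interval && changesSinceLastArchive > 0;
    }

    public void Archived(DateTime now)
    {
        lastArchive = now;
        changesSinceLastArchive = 0;
    }
}
```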
The archive just needs to be a copy of the existing flat-file DB,
but compressed, i.e. just provide a means to call
out to a system command to run "gzip", "bzip2", or "zip" on the file.
The archive, after being written (but before being compressed), should be read
back into the application to validate its integrity. This
integrity check should not affect the main "live" database
or the in-RAM state of the database; it is simply an isolated
read of the dump to assure its integrity. If it does not
reload and prove to be accurate data, then the archive must be recreated, and on two failures in a row a flag can be set to
indicate a system error. If the validation is successful, then
the system call is made to compress the data, i.e.
"gzip db_dump_YYYY_MM_DD-HH_MM_SS"; the compression is of course
taken care of by a simple OS call to that program, so the application
just has to issue a call out to the OS.
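The write/validate/compress/retry sequence might look roughly like this; the validation delegate stands in for the isolated re-parse of the dump (which must not touch the live DB), and the dump is written to the current directory for simplicity:

```csharp
using System;
using System.Diagnostics;
using System.IO;

public class Archiver
{
    public bool SystemErrorFlag;   // set after two failed validations in a row

    public bool WriteArchive(string liveDbPath, Func<string, bool> validateFile)
    {
        string name = "db_dump_" + DateTime.Now.ToString("yyyy_MM_dd-HH_mm_ss");
        for (int attempt = 0; attempt < 2; attempt++)
        {
            // The archive is just a copy of the flat-file DB.
            File.Copy(liveDbPath, name, true);

            // Isolated read-back; only on success do we compress.
            if (validateFile(name))
            {
                // Simple OS call-out, exactly as described above.
                Process p = Process.Start("gzip", name);
                p.WaitForExit();
                return p.ExitCode == 0;
            }
        }
        SystemErrorFlag = true;
        return false;
    }
}
```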
On startup of the application, if the primary file DB is corrupt
(i.e. the application crashed in the middle of a write), then the application
should go back to the last archive written, uncompress it by simply
issuing a system command, i.e. "gunzip db_dump_YYYY_MM_DD-HH_MM_SS",
and load it in.
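A sketch of that recovery path, assuming archives live in one directory and are named with the timestamp pattern above (which conveniently sorts chronologically as plain strings):

```csharp
using System.Diagnostics;
using System.IO;
using System.Linq;

public static class Recovery
{
    // Timestamped names (db_dump_YYYY_MM_DD-HH_MM_SS.gz) sort
    // chronologically as strings, so the last one is the newest.
    public static string PickLatest(string[] archiveFiles)
    {
        return archiveFiles.OrderBy(f => f).LastOrDefault();
    }

    // If the primary file DB fails to load, fall back to the newest
    // archive: gunzip it via an OS call and return the dump's path
    // (or null when no archive exists).
    public static string RestoreLatest(string archiveDir)
    {
        string latest = PickLatest(Directory.GetFiles(archiveDir, "db_dump_*.gz"));
        if (latest == null) return null;
        Process p = Process.Start("gunzip", "\"" + latest + "\"");
        p.WaitForExit();
        return latest.Substring(0, latest.Length - 3);  // drop ".gz"
    }
}
```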
There should also be an option to reload from any given archive.
In particular, a routine should be written to make a list of all the
archives and load their names in, and a routine that, given
an archive name as a parameter, loads in that DB and also
refreshes the RAM cache data structure, essentially allowing a
revert to an older archived set of data. In addition, once the
number of archive files is greater than 20, the application issues
a system call out to run a script. The script will not be part
of what needs to be programmed; it will just be a simple Perl script
that moves all files older than the latest 20 to another directory.
Again, the programmer doesn't have to write this, but just needs
to make an OS call out to a script by name once the number of archive files
is more than, say, 20.
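The listing and prune-trigger side of that could be as small as this; the script path is a placeholder for the externally supplied Perl script:

```csharp
using System.Diagnostics;
using System.IO;
using System.Linq;

public static class ArchiveManager
{
    // Every archive in the directory, oldest first (timestamp names
    // sort chronologically as plain strings).
    public static string[] ListArchives(string dir)
    {
        return Directory.GetFiles(dir, "db_dump_*.gz")
                        .OrderBy(f => f).ToArray();
    }

    public static bool NeedsPrune(int archiveCount, int keep)
    {
        return archiveCount > keep;
    }

    // Once more than `keep` archives exist, call out to the external
    // script that moves the older files to another directory.
    public static void PruneIfNeeded(string dir, string scriptPath, int keep)
    {
        if (NeedsPrune(ListArchives(dir).Length, keep))
            Process.Start(scriptPath, "\"" + dir + "\"").WaitForExit();
    }
}
```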
(2) Sync the file DB to a Postgres or MS SQL server.
If a request is made to the local DB for a username that IS NOT
in it, we need to make a DB call out to Postgres or MS SQL and
check whether that record exists there. If it does, the application takes the
data and also updates the local file DB and the RAM cache.
If it doesn't, then nothing happens.
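Using Npgsql (the open-source ADO.NET provider for Postgres, which runs under Mono and so avoids MS-specific classes), the remote lookup could look like this; the table and column names are assumptions standing in for the real schema:

```csharp
using Npgsql;

public static class RemoteLookup
{
    // Hypothetical schema: a "users" table with username/password columns.
    public const string Sql =
        "SELECT password FROM users WHERE username = @u";

    // Returns the password if the user exists remotely, else null.
    // On a hit the caller updates the local file DB and the RAM cache;
    // on a miss, nothing happens.
    public static string FetchPassword(string connString, string username)
    {
        using (var conn = new NpgsqlConnection(connString))
        {
            conn.Open();
            using (var cmd = new NpgsqlCommand(Sql, conn))
            {
                cmd.Parameters.AddWithValue("u", username);
                object result = cmd.ExecuteScalar();  // null when no row matches
                return result == null ? null : (string)result;
            }
        }
    }
}
```

The parameterized query keeps the lookup safe against SQL injection, which matters given the key is a user-supplied login name.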
Also, every half hour (or similar), the application has to
poll the Postgres or MS SQL DB and do a select based on a timestamp
that essentially says: any records inserted or changed in the last half hour
need to be copied to the local file DB. So a simple select
of all recently changed data comes from the Postgres or MS SQL DB
and sits in a DataSet in the application, and then each
row of that DataSet has to insert/update the local record:
update if the local record already exists, else create new.
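A sketch of the polling query and the upsert it feeds, again via Npgsql and assuming the remote table carries a `last_modified` timestamp column (the table and column names are placeholders for the real schema):

```csharp
using System;
using System.Collections.Generic;
using Npgsql;

public static class SyncPoller
{
    // Hypothetical schema: rows changed since the last poll.
    public const string Sql =
        "SELECT username, password FROM users WHERE last_modified > @since";

    // Run every half hour (e.g. from a System.Timers.Timer): pull each
    // recently changed row and push it into the local store.
    public static void PollChanges(string connString, DateTime since,
                                   Action<string, string> upsertLocal)
    {
        using (var conn = new NpgsqlConnection(connString))
        {
            conn.Open();
            using (var cmd = new NpgsqlCommand(Sql, conn))
            {
                cmd.Parameters.AddWithValue("since", since);
                using (var rdr = cmd.ExecuteReader())
                {
                    while (rdr.Read())
                        upsertLocal(rdr.GetString(0), rdr.GetString(1));
                }
            }
        }
    }

    // Update-if-exists, else insert: with a dictionary-backed local store
    // (standing in for the flat file plus cache) both cases are one line.
    public static void Upsert(Dictionary<string, string> local,
                              string username, string password)
    {
        local[username] = password;
    }
}
```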
A simple text log file should also be written that
creates an audit record for each thing done by (1) and (2) above,
except that READS from the local DB don't need to be logged,
just writes to the local DB and any new data that gets taken from the
Postgres or MS SQL DB.
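The audit log needs nothing more than timestamped append-only lines; a minimal sketch (the line format here is an assumption):

```csharp
using System;
using System.IO;

public static class AuditLog
{
    // Append one timestamped line per local write or remote-sync event.
    // Reads are deliberately not logged, per the spec above.
    public static void Record(string path, string action, string username)
    {
        string line = DateTime.Now.ToString("yyyy-MM-dd HH:mm:ss")
                      + " " + action + " " + username;
        File.AppendAllText(path, line + Environment.NewLine);
    }
}
```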
This project requires straightforward C# abilities and experience
with database calls from .NET/C# to Postgres or MS SQL;
however, the code has to be able to run under C# on Linux, which
should happen if it is coded without using any MS-specific .NET DB access
classes. I can provide all necessary Postgres examples, etc.,
but if a programmer only has MS SQL ability, we will have to
verify that the MS SQL code will also work.
Perhaps ODBC can be attempted, if it is straightforward to
get ODBC working with both Postgres and MS SQL.
This is probably 3 solid days of work for someone who has done similar work, perhaps 4 days. It is probably 8-10 pages of code, maybe less.
Only very simple testing is required; I will supply detailed testing,
so the quote on this job doesn't have to include a lot of
testing. Other things are happening on this project, so testing
will happen as part of another effort. The code just generally has
to work, and be reasonably coded and commented.
Keep in mind that all existing working code that works with the
flat file is provided, and plenty of Postgres DB examples
can also be provided.
So with that, plus experience with .NET and simple SQL DB
calls to either Postgres or MS SQL, everything is straightforward.
An even more exact specification will be made available as well.
I need this started in approximately one week or sooner, and completed over
the 1-2 weeks after that; however, part (2) could be done a bit later
if needed. I can be a bit flexible and add some more dollars if
issues pop up, but generally it should be very straightforward.
If someone has used Mono/C# and connected via ODBC to both Postgres AND MS SQL,
this also may be valuable and add to the $$$, but at a minimum we just
require DB connectivity to Postgres or MS SQL. The programmer
must have their own access to these DBs on their own systems;
however, if really necessary, Postgres access can be provided.
It would usually be expected that someone experienced in this stuff
would have a personal install of a Postgres DB (it's free).
Hello, I can start on this today. I have a dev machine loaded with Linux, Mono, and Postgres. I have a lot of experience with C# and PostgreSQL (I have installed many PostgreSQL databases too). Best regards, Max
10 freelancers are bidding on average $218 for this job
Good quality work with on-time delivery of the product. 100% guarantee of high-quality professional work. I have 5+ years of programming experience. I can do it well.
Hi, I am interested in doing this. I am basically a software engineer with 5 years' experience in VB, ASP, Java, and .NET, so you can trust me. I will deliver the project on time or even before, with the quality you expect. Thanks, gmh
I have hands-on experience in Postgres and also in C#. The project will be more professional and reliable; I also have a separate testing member to test it professionally.