Currently one website that I run uses a Perl-based CMS. It does not save any information to a database; it stores everything in flat files.
As traffic on my site has increased, I have often seen that the files that are accessed and written to most often get "truncated". So for example, a data file that holds a list of pictures and keeps track of how many times each picture has been accessed might suddenly get cut in half.
I believe the problem lies in the file-locking (flock) functions that I'm using. Every time my site opens a file, it uses the following functions:
use Fcntl ':flock';
##############################
sub lock {
    flock(MBOX, LOCK_EX) or die "Cannot lock: $!";
    # and, in case someone appended while we
    # were waiting, seek to the end of the file
    seek(MBOX, 0, 2);
}
sub unlock {
    flock(MBOX, LOCK_UN);
}
##############################
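For context, the usual cause of this exact symptom is not the lock itself but opening the file for writing with `>`, which truncates the file immediately, before the lock is acquired. A common safer pattern, sketched below with a hypothetical counter file `hits.txt` for illustration only, is to open in read/write mode (`+<`), lock, and only then rewrite and truncate:

```perl
use strict;
use warnings;
use Fcntl ':flock';

# Hypothetical counter file, used only for illustration.
my $path = 'hits.txt';

# Create the file with an initial count if it does not exist yet.
unless (-e $path) {
    open(my $init, '>', $path) or die "Cannot create $path: $!";
    print $init "0\n";
    close($init);
}

# Open read/write so the file is NOT truncated before we hold the lock.
open(my $fh, '+<', $path) or die "Cannot open $path: $!";
flock($fh, LOCK_EX)       or die "Cannot lock $path: $!";

# Read the current contents while holding the lock.
my $count = <$fh> // 0;
chomp $count;
$count++;

# Rewind, write the new value, then truncate any leftover bytes
# so a shorter write cannot leave stale data at the end.
seek($fh, 0, 0)          or die "Cannot seek: $!";
print $fh "$count\n"     or die "Cannot write: $!";
truncate($fh, tell($fh)) or die "Cannot truncate: $!";

# Closing the handle releases the lock.
close($fh) or die "Cannot close: $!";
```

Because the truncate happens only after the exclusive lock is held and the new data is written, a second process waiting on the lock can never observe a half-written or emptied file.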
What I'm looking for is someone who can improve these functions to ensure that files are no longer lost / truncated / corrupted / deleted on my server.
My guess is that this would only take an expert programmer a few minutes.
Please PM me with your code and I'll test it; if it works and your price is right, you'll get the project.
Thanks so much for your time and help!