OTWarrior
11-16-2010, 02:54 AM
Just something that occurred to me as I was working, more of a discussion than a problem.
We use a frontend/backend setup: each user has a copy of the frontend on their local computer (on the D: drive), with linked tables pointing to a folder on a file server (nothing special, just a normal server for holding documents).
We have 60 users adding and amending records. At peak times (such as logging-out time) the database grinds to a halt from network traffic (the users' computers end up waiting on the held-up server). If the backend files are made local (as I often do when developing), the speed increases massively.
My idea is this:
Rather than linking to the server for the backend, the key tables (the ones taking most of the adding and amending) would be duplicated on each local computer. When a user loads the program in the morning, all their local tables are refreshed from the server.
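Something like this is what I had in mind for the morning refresh (just a rough sketch: tblOrders and the srv_ prefix for the linked server tables are made-up names):

Public Sub RefreshLocalTables()
    ' Rough sketch: wipe the local copy and repopulate it
    ' from the linked server table in one append query.
    Dim db As DAO.Database
    Set db = CurrentDb
    db.Execute "DELETE FROM tblOrders", dbFailOnError
    db.Execute "INSERT INTO tblOrders SELECT * FROM srv_tblOrders", dbFailOnError
    Set db = Nothing
End Sub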
Whilst actually using the database, they would be altering their local copies. At critical points (such as when they finish altering a set of related records) the system would run an update query for those specific records, copying the local version of the records up to the server.
When another user wants to access that same record later that day, it would be pulled from the server into their local copy, and they could carry on as normal.
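The push at those critical points might look something like this (again just a sketch: OrderID, CustomerName and Amount are made-up fields, and BatchID is an assumed marker for the set of records just finished; the pull for a single record would be the same sort of query in the other direction):

Public Sub PushRecordsToServer(ByVal lngBatchID As Long)
    ' Rough sketch: copy one batch of locally edited records
    ' up to the linked server table in a single update query.
    Dim db As DAO.Database
    Set db = CurrentDb
    db.Execute _
        "UPDATE srv_tblOrders AS s INNER JOIN tblOrders AS l " & _
        "ON s.OrderID = l.OrderID " & _
        "SET s.CustomerName = l.CustomerName, s.Amount = l.Amount " & _
        "WHERE l.BatchID = " & lngBatchID, dbFailOnError
    Set db = Nothing
End Sub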
Like I said before, just an idea that came into my head. A few questions though:
1) If everyone is running update queries on the same table at the same time, is it likely to be slower, or even corrupt more easily?
2) Could such a system work without having older data override newer data? (See the sketch after these questions for one idea.)
3) If two people alter the same record at the same time, what would happen?
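On question 2, the only way I can see it working is to stamp every record with a last-modified time and only let newer data through. Something like this in the push query (LastModified is an assumed date/time field that the forms would have to maintain on every edit):

    ' Rough sketch: only overwrite the server row if the
    ' local copy is newer than what the server already holds.
    db.Execute _
        "UPDATE srv_tblOrders AS s INNER JOIN tblOrders AS l " & _
        "ON s.OrderID = l.OrderID " & _
        "SET s.CustomerName = l.CustomerName, s.LastModified = l.LastModified " & _
        "WHERE l.LastModified > s.LastModified", dbFailOnError

Even then, if two people change the same record inside the same window, whichever push runs last wins, so I suspect the earlier edit would just be lost unless the system flags rows it skipped.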
Any thoughts on this would be much appreciated.