If you are accessing the database through a shared folder over a network, the performance problem mostly comes from how Windows handles the file, not from the database itself. I'm not familiar with all the specifics, but if I remember correctly, Windows manages file access for network shares differently than for locally opened files. This has long been a known issue for all file-based databases accessed over a share, and the best recommendation I have heard is to create either a Windows service or a web service, if possible, to manage external access to the database.
I hope that helps.
Remote access over a network share is always going to be slower, often dramatically so. You can mitigate this with a good network and a server that has the lowest possible latency to the client computers. Opening connections in read-only mode also improves performance, though that obviously has limited practical application.
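As an illustration of the read-only idea, here is a minimal VB sketch. It assumes an Access database file on a share (the `\\server\share\MyData.accdb` path is a made-up placeholder) and uses the OLE DB `Mode=Read` connection-string property to request read-only access:

```vb
Imports System.Data.OleDb

Module ReadOnlyExample
    Sub Main()
        ' Hypothetical path to the shared database file.
        ' "Mode=Read" asks the provider for read-only access, which
        ' avoids write-lock negotiation over the network share.
        Dim connStr As String =
            "Provider=Microsoft.ACE.OLEDB.12.0;" &
            "Data Source=\\server\share\MyData.accdb;" &
            "Mode=Read"
        Using conn As New OleDbConnection(connStr)
            conn.Open()
            ' ... run SELECT queries here ...
        End Using
    End Sub
End Module
```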
The *best* way to do this is to avoid using a network share to host the database and instead operate a layer up - for example, using WCF self-hosting so that multiple clients send their queries to one central application, which in turn accesses the database locally. In short, make a dedicated little application server. While this is complicated in the general case (e.g. writing your own database server), it's pretty straightforward with WCF for a specific set of operations.
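A rough sketch of that self-hosting idea in VB follows. The contract name `IRecordService`, the `net.tcp` address, and the table access are all hypothetical - the point is only that the host machine opens the database file locally and clients call the service instead of the file share:

```vb
Imports System.Data
Imports System.ServiceModel

' Hypothetical contract: clients ask the host for rows instead of
' opening the database file themselves over the share.
<ServiceContract()>
Public Interface IRecordService
    <OperationContract()>
    Function GetRecords() As DataTable
End Interface

Public Class RecordService
    Implements IRecordService

    Public Function GetRecords() As DataTable Implements IRecordService.GetRecords
        ' Open the database *locally* on the host machine and return the rows.
        ' Local file access avoids the slow network-share file handling.
        Dim table As New DataTable("Records")
        ' ... fill table with an OleDbDataAdapter here ...
        Return table
    End Function
End Class

Module HostModule
    Sub Main()
        ' Self-host the service; clients connect over TCP instead of SMB.
        Using host As New ServiceHost(GetType(RecordService),
                                      New Uri("net.tcp://localhost:8000/Records"))
            host.AddServiceEndpoint(GetType(IRecordService),
                                    New NetTcpBinding(), "")
            host.Open()
            Console.WriteLine("Service running. Press Enter to stop.")
            Console.ReadLine()
        End Using
    End Sub
End Module
```

Clients then add a matching `NetTcpBinding` endpoint and call `GetRecords()` rather than opening the `.accdb` file directly.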
I have a client that has about 4,000 records in one table. They want to be able to pull up about 2,500 of those records in a grid. How can I speed this up? Performance on the main computer is good, but on the other two computers it's slow.
There is no way around this, as they want all of the current records to be in the grid. All I have to do is run a simple Select command to get these records.
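For reference, the usual pattern for that is to run the Select once into a `DataTable` and bind the grid to it, so the share is only hit a single time. The table name `Records` and the file path below are made-up placeholders:

```vb
Imports System.Data
Imports System.Data.OleDb

Module LoadGridExample
    ' Fill a DataTable with one SELECT; the grid is then bound to the
    ' in-memory table, so filtering never touches the network again.
    Function LoadRecords() As DataTable
        Dim table As New DataTable()
        Dim connStr As String =
            "Provider=Microsoft.ACE.OLEDB.12.0;" &
            "Data Source=\\server\share\MyData.accdb;Mode=Read"
        Using conn As New OleDbConnection(connStr)
            Using adapter As New OleDbDataAdapter("SELECT * FROM Records", conn)
                adapter.Fill(table)   ' one round trip over the share
            End Using
        End Using
        Return table
    End Function
End Module

' Usage (e.g. in Form_Load):
'   C1TrueDBGrid1.DataSource = LoadRecords()
```

Selecting only the columns the grid actually displays, instead of `*`, also cuts down the amount of data pulled across the share.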
I am using C1TrueDbGrid which allows users to filter the grid on multiple columns. This part is fine and fast.
I am working in VB.Net 2012.
PS. How about some samples in VB?