My company uses a shared MS Access database, with a back end stored on a server and a front end copied onto users' desktops.
Recently, our IT department moved us to a new server without giving us any notice, and now our database keeps crashing.
Every 20-40 minutes, users get an error message that says:
Error 3043 Your network access was interrupted. To continue, close the database, and then open it again.
If they close and reopen, it does work. However, I'd like to stop this from happening, since it typically happens when they are in the middle of something and have to re-do everything.
I've already spoken with our IT consultants and they see no issue with our server/network, nor do they know anything about Access and therefore are no help.
Does anyone have any experience with this or have any ideas that may help me repair my database?
Thanks in advance.
Here are some thoughts:
It sounds very much like (short) network interruptions. MS Access doesn't like these at all, in particular it doesn't recover from a broken connection (even if very short) until you restart the frontend.
Network interruptions during write operations on Access backends are the prime cause of backend database corruption. Consider yourself lucky if you haven't experienced that yet. But you should back up and Compact & Repair the backend often (!). A rough sketch of automating this is at the end of this answer.
You can prevent backend corruptions by moving the backend to a server database, e.g. SQL Server Express (free). Errors will still occur ("ODBC call failed" instead of error 3043), but they will only affect the frontends.
You can probably work around all errors by changing the frontend from bound forms to unbound forms. This is a major undertaking.
I don't think there is anything you can do with the backend to prevent the errors.
If this database has value to your company, and IT says there is no problem, I suggest you escalate the problem to someone who can make IT look closer into the issue.
(How to do so would be a separate question, perhaps on SuperUser.)
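To make the backup and Compact & Repair advice concrete, here is a minimal sketch of how it could be automated from a small VBA routine. All paths and names are placeholders for your environment, and the compact step only works while nobody has the backend open:
Sub BackupAndCompactBackend()
    'placeholder paths - adjust for your environment
    Dim strBackend As String, strBackup As String, strTemp As String
    strBackend = "\\YourServer\Data\Backend.mdb"
    strBackup = "\\YourServer\Data\Backup\Backend_" & Format(Now, "yyyymmdd_hhnnss") & ".mdb"
    strTemp = "\\YourServer\Data\Backend_compacted.mdb"
    'take a dated copy first - this is the actual backup
    FileCopy strBackend, strBackup
    'compact into a temporary file, then swap it in
    'this step fails if any user still has the backend open (.ldb lock present)
    DBEngine.CompactDatabase strBackend, strTemp
    Kill strBackend
    Name strTemp As strBackend
End Sub
You could run something like this from a scheduled task or an admin-only button, but treat it as a starting point rather than a finished tool.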
I am trying to understand the trade-offs between going with MySQL or PostgreSQL on AWS.
Some considerations for me are that I am an amateur database user, so I need to be sure resources are available which allow me to overcome problems quickly. Along these lines, I bought the book 'PostgreSQL on the Cloud' and was all set to go with PostgreSQL since the book laid out a great use case.
One thing that held me back, though, is that it is important for my work to be able to easily use Excel as a front end for importing and exporting data into and out of the database on AWS.
It looks like MySQL has an open extension which is fully integrated with Excel and is also well documented. My research into PostgreSQL uncovered a much more uneven integration with Excel, and a lot of long-standing frustration that a closer integration has not already occurred.
Right now, I am leaning to MySQL, but want to make sure I am not missing something.
Thanks!
Microsoft touts a PostgreSQL plugin as well: https://support.office.com/en-us/article/connect-to-a-postgresql-database-power-query-bf941e52-066f-4911-a41f-2493c39e69e4. Never used it, so can't comment on it.
You mention you are a beginner, so I'll add... be careful about security with either of these options. There are options to encrypt the channel between the client and the server, which you indicate is running on AWS. If the connection is not secured, anyone monitoring it could extract credentials and do whatever they like to your AWS-hosted DB. Generally, cloud-hosted DBs should be behind an authentication/authorization login process.
Okay, so I have a couple of hundred reports in my MS Access database (yes, it's a big project, and yes, we should switch to SQL Server). I was working on one of the reports yesterday and was suddenly disconnected from the network. I have been having a lot of network outages at work, and I think it has something to do with the sudden disappearance of all the reports. I have never had admin privileges to set up Backup and Restore on my machine, and have had to back up the database manually myself.
My most recent version is from a week ago, but I have done A LOT of work since then. My question is whether or not a sudden disconnection from the network (and, subsequently, the database I was working in), could have caused the deletion, and whether or not it is possible to restore the database without having Backup and Restore set up on my computer.
Please help.
Edit: My databases are in a front-end/back-end format. It was the front-end database (with the reports, queries, and forms) that crashed, but the only items that were deleted were the reports.
If your database was corrupted, it's hard to tell whether it's possible to restore your missing reports from it.
The only 100% safe way to restore your changes would be to have a recent backup from before the corruption happened.
But it's definitely possible that a network outage, while users were accessing the database and you were making changes to the reports at the same time, led to the file becoming corrupted.
I know that it doesn't help you right now, but here is some advice for the future:
You should think about splitting your Access database in a front-end (which contains forms, reports and code) and a back-end (which only contains tables).
The back-end will be located on a central machine, and each user will have his own copy of the front-end.
Of course splitting the database and ensuring that the front-ends are automatically updated on the users' machines will require some initial work, but you will gain two benefits from it:
When you work on the front-end, you are working on your local machine, so you are less affected by network outages.
The back-end is more unlikely to be corrupted. As you unfortunately experienced yourself, changing reports, forms and code in an Access database that's in production use at the same time can lead to issues.
Plus, even if you're not able to set up Backup and Restore on your computer, you should back up your work more often. I do this several times a day by just copying the Access database manually (you will do this automatically without thinking once you're used to it).
EDIT:
Okay, it wasn't really clear from the question that your database is split into front-end and back-end.
The network outage stuff made me think that you were editing the "one and only central database".
I don't remember ever experiencing something like that myself, so I can only guess what you could do now.
One thing that comes to my mind is that you could try to export your reports (given that you know their names) to text files with the undocumented SaveAsText and LoadFromText commands:
'save your report in a text file on your disk
Application.SaveAsText acReport, "YourReport", "c:\YourReport.txt"
'load your report from the text file in another Access database
Application.LoadFromText acReport, "YourReport", "c:\YourReport.txt"
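If you would rather export everything that Access still lists in one go, a small loop along these lines should work (just a sketch; the output folder is a placeholder and must already exist):
Sub ExportAllReportsAsText()
    'dump every report Access still knows about to a text file
    Dim obj As AccessObject
    For Each obj In CurrentProject.AllReports
        Application.SaveAsText acReport, obj.Name, "c:\ReportBackup\" & obj.Name & ".txt"
    Next obj
End Sub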
But this is just an idea, so I don't know if this will work in your case.
Maybe the reports are still there (and just not shown for some reason), then it might work.
I'm sorry to say this again, but:
Honestly, I don't have a lot of experience in repairing corrupted databases (neither Access nor SQL Server, which I'm responsible for at work!) because I try very hard to never get into a situation like yours by frequently taking backups. LOTS of them (I've been called "paranoid" by co-workers because of that).
EDIT 2:
I just read in your comment to phoog's answer that the corrupted database was on the network (i.e. you edited it while it was on the network).
Advice for the future:
Don't ever do this, for the reasons that phoog already mentioned in his answer. If the database is already split into front-end and back-end, make a copy of the front-end and edit it on your local machine to prevent such stuff from happening again.
Plus, if the front-end with the reports is used by several users, don't let them all work on the same file on the network (for the same reason).
It's incredibly easy to give each user his own copy, including auto-updating when there's a new version of the front-end.
You can read here how I'm doing this at work:
How to automatically update MS-Access 2007 application
Your question leaves open several important points:
How is your application set up?
Is the MDB containing the reports on your network?
Are the tables in the same file or another one?
If the tables are in another file, where is it located?
Access is notoriously susceptible to data loss and file corruption when network connectivity is spotty. See http://support.microsoft.com/kb/303528 (the section "Additional best practices for network environments") for more information.
The information in this article ("How to troubleshoot and to repair a damaged Access 2002 or later database" http://support.microsoft.com/kb/283849) may help.
Any files that you have on the network should have been backed up by your network administrators, so ask them for the most recent backup before you lost your reports. With luck, this will be more recent than your own most recent backup.
I'm using Access 2003 on a dual-core machine with 4GB of RAM, running Windows XP (Service Pack 3) [5.1.2600].
Periodically, I get an error msg "There isn't enough memory to perform this operation. Close unneeded programs and try the operation again."
A check of Task Manager indicates that there is plenty of free memory. Closing other open programs makes no difference.
This happens sporadically, and under different circumstances: sometimes whilst saving Form design or VBA code changes, sometimes when several Forms are open and in use.
If this error occurs while attempting to save design changes, the Access objects are corrupted and can't be recovered.
Any suggestions on what might be causing this would be very welcome.
MTIA
The VBA project in your front end is likely corrupted. You need to rebuild it from scratch and then use proper Access coding practices:
1. In VBE options, turn off COMPILE ON DEMAND (see Michael Kaplan's article on DECOMPILE for details of why).
2. In VBE options, turn on REQUIRE VARIABLE DECLARATION.
3. In the VBE, customize your toolbar so that the COMPILE button is easily accessible (it's on the Debug menu). I also recommend adding the CALL STACK button (from the VIEW menu), as it's handy for debugging errors in break mode. The point here is to make debugging and compiling as easy as possible.
4. Having set up your environment, go through all the modules in your newly recovered project and add OPTION EXPLICIT to the top of every module that lacks it. Then compile. You'll quickly find out where you have invalid code and you'll need to fix it.
5. From now on, when programming, compile frequently, after every two or three lines of code. I probably compile my project 100 or more times a day when coding.
6. Periodically decompile your project and compact and recompile it. This will clean out any crud that accumulates during regular development.
These practices ensure that the code in a non-corrupt project stays in as clean a condition as possible. They will do nothing to recover an already corrupted project.
In regard to how to rebuild the project, I think I'd go the drastic route of exporting all the objects with Application.SaveAsText and importing them into a new blank database with Application.LoadFromText. This is superior to simply importing from your existing corrupted front end because the import can import corrupt structures that won't survive a SaveAsText/LoadFromText cycle.
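As a rough sketch of what that export step could look like (placeholder folder, and you would run the matching LoadFromText calls from the new, blank database afterwards):
Sub ExportAllObjectsAsText()
    'run this in the corrupted front end; the folder c:\Rebuild must already exist
    Dim obj As AccessObject
    For Each obj In CurrentProject.AllForms
        Application.SaveAsText acForm, obj.Name, "c:\Rebuild\frm_" & obj.Name & ".txt"
    Next obj
    For Each obj In CurrentProject.AllReports
        Application.SaveAsText acReport, obj.Name, "c:\Rebuild\rpt_" & obj.Name & ".txt"
    Next obj
    For Each obj In CurrentProject.AllModules
        Application.SaveAsText acModule, obj.Name, "c:\Rebuild\bas_" & obj.Name & ".txt"
    Next obj
End Sub
'then, in the new blank database, reverse the process, e.g.:
'Application.LoadFromText acForm, "MyForm", "c:\Rebuild\frm_MyForm.txt"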
I program daily in Access, working with non-trivial apps that use lots of code, including plenty of standalone class modules. I have not lost an object to code corruption in over 5 years, and that was back in the day when I was still using A97.
Having tripped across this old post of mine, and seeing it's had a fair bit of interest, I thought maybe an update would be in order?
So 2 years down the track, doing a lot of 2007 app work as well as older 2003 (and even '97) apps, I'm finding that 2007 is less prone to really nasty crashes than 2003 - where Access object definitions (forms and reports esp.) would be easily corrupted.
I still follow the suggestions 1-6 (above) by David-W-Fenton religiously though, plus the use of Application.SaveAsText (see Tony Toews' suggestion and link above).
These days, whether it's 97, 2003 or 2007 I'm working on, if Access gives any hint of "being weird | crashing | throwing inexplicable errors" etc, I do the following:
Immediately close the Access app
Backup the mdb/accdb file
Re-open the app whilst holding down [Shift] so nothing runs
Export all objects as text using Application.SaveAsText (as another backup)
Close and re-open the app using the /decompile switch (an example command line is at the end of this answer)
Recompile the VBA code
Do a Compact/Repair.
This doesn't solve everything, but it does significantly reduce the number of corruptions of Access objects from what I'm able to observe.
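For anyone unsure how to apply the /decompile switch: it is just a command-line switch passed when launching Access, typically from a shortcut or the Run dialog. The paths below are only examples for a default Access 2003 install; adjust them for your Office version and the location of your file:
"C:\Program Files\Microsoft Office\OFFICE11\MSACCESS.EXE" "C:\MyApps\MyFrontEnd.mdb" /decompile
Access then opens the file with the compiled VBA state discarded, after which you recompile and compact as described above.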
Oh my.
I worked in a shop for many years that used Access as their platform of choice. The application eventually got so large that it began hitting an internal memory limitation of Access 2003. They began experiencing the exact same problem that you are having. As you have noticed, there is no external indication of memory problems when this happens.
The company talked at length with Microsoft about the problem, and I believe Microsoft eventually supplied them with a patch. So you might want to talk to Microsoft about this, if it sounds like a similar situation to what you are experiencing, as they may be able to supply you with the same patch.
Ultimately the long-term solution is to break the application into smaller pieces. Moving to Access 2007 didn't help; in fact, it made things worse because Access 2007 has more moving parts.
Quick solution; guaranteed to work:
Open the VBA editor (Alt+F11)
In the immediate window enter the following:
Application.SaveAsText acForm, "corrupt form name here", CurrentProject.Path & "\zzTempRevive"
then
Application.LoadFromText acForm, "corrupt form name here", CurrentProject.Path & "\zzTempRevive"
That's it :) Hope this helps others!
This is also the default error message when Access has no idea what the problem actually is. Now if your MDB is particularly large, say more than 800 forms and reports with modules, then yes, the MDB could be too large, although in that case you would have gotten a message when you went to create MDEs. ACC2000: "Microsoft Access Was Unable to Create an MDE Database" Error Message
I have had this happen occasionally myself. And my current MDBs aren't quite that large. Note that compact and repair doesn't detect errors in objects other than tables, indexes or relationships. So importing into another MDB is the only way to correct these errors.
Are you working on this MDB over the network? That's about the only thing I can think of that might cause this problem.
As I know that it's either forms or reports that most likely get corrupted, I created a new mdb, and only imported tables (attached), queries, scripts (one only), modules and menus. Then I used LoadFromText to import Forms and Reports via a function, and then did the usual decompile/compile and compact/repair etc.
So far, touch wood, I haven't had another crash in some days, so I'll probably stick with this recovery method.
Many thanks to all for your suggestions.
I have encountered this problem many times and finally found a solution that worked. I don't know what causes the problem, but I do know how to solve it.
Usually the error occurs when you open a form. What you need to do is completely re-create that form. The easiest way to do so is to first export the form to a text file with the undocumented function Application.SaveAsText. Then you delete the form from your database and re-load it with Application.LoadFromText.
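In code, the sequence looks roughly like this (a sketch; "ProblemForm" and the path are placeholders, and it's worth taking a copy of the database file first):
'1. export the corrupt form's definition to a text file
Application.SaveAsText acForm, "ProblemForm", "c:\Temp\ProblemForm.txt"
'2. delete the corrupt form from the database
DoCmd.DeleteObject acForm, "ProblemForm"
'3. re-create the form from the exported definition
Application.LoadFromText acForm, "ProblemForm", "c:\Temp\ProblemForm.txt"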
I have created an MS Access 2003 application, set up as a split front-end/back-end configuration, with a user group of about five people. The front end .mdb sits on a network file server, and it contains all the queries, forms, reports, and VBA code, plus links to all the tables in the back end .mdb and some links to ODBC data sources like an AS/400. The back end sits on the same network file server, and it just has the table data in it.
This was working well until I "went live" and my handful of users started coming up with enhancement requests, bug reports, etc. I have been rolling out new code by developing/testing in my own copy of the front-end .mdb in another network folder (which is linked to the same back-end .mdb), then posting my completed file in a "come-and-get-it" folder, alerting the users, and they go copy/paste the new front-end file to their own folders on the network. This way, each user can update their front end when they're at a 'stopping point' without having to boot everyone out at once.
I've found that when I'm developing now, sometimes Access becomes extremely slow. Like, when I am developing a form and attempt to click a drop-down on the properties box, the drop-down arrow will push in, but it will take a few seconds before the list of options appears. Or there's tons of lag in selecting & moving controls on a form. Or lots of keyboard lag.
Then, at other times, there's no lag at all.
I'm wondering if it's because I'm linked to the same back end as the other users. I did make a reasonable effort to set up the queries, forms, reports etc. with minimal record locking, if any at all, depending on the need. But I may have missed something, or perhaps there is some other performance issue I need to address.
But I'm wondering if there is an even better way for me to set up my own development back-end .mdb, so I can be testing my code on "safe" data instead of the same live data as the rest of the users. I'm afraid that it's only a matter of time before I corrupt some data, probably at the worst possible moment.
Obviously, I could just set up a separate back-end .mdb and manually reconfigure the table links in the front end every time, using the Linked Table Manager. But I'm hoping there is a more elegant solution than that.
And I'm wondering if there are any other performance issues I should be considering in this multi-user, split database configuration.
EDIT: I should have added that I'm stuck with MS Access (not MS-SQL or any other "real" back end); for more details see my comment to this post.
If all your users are sharing the front end, that's THE WRONG CONFIGURATION.
Each user should have an individual copy of the front end. Sharing a front end is guaranteed to lead to frequent corruption of the shared front end, as well as odd corruptions of forms and modules in the front end.
It's not clear to me how you could be developing in the same copy of the front end that the end users are using, since starting with A2000, that is prohibited (because of the "monolithic save model," where the entire VBA project is stored in a single BLOB field in a single record in one of the system tables).
I really don't think the problems are caused by using the production data (though it's likely not a good idea to develop against production data, as others have said). I think they are caused by poor coding practices and lack of maintenance of your front-end code.
turn off COMPILE ON DEMAND in the VBE options.
make sure you require OPTION EXPLICIT.
compile your code frequently, after every few lines of code -- to make this easy, add the COMPILE button to your VBE toolbar (while I'm at it, I also add the CALL STACK button).
periodically make a backup of your front end and decompile and recompile the code. This is accomplished by launching Access with the /decompile switch, opening your front end, closing Access, opening your front end with Access (with the SHIFT key held down to bypass the startup code), then compacting the decompiled front end (with the SHIFT key held down), then compiling the whole project and compacting one last time. You should do this before any major code release.
A few other thoughts:
you don't say if it's a Windows server. Linux servers accessed over SAMBA have exhibited problems in the past (though some people swear by them and say they're vastly faster than Windows servers), and historically Novell servers have needed to have settings tweaked to enable Jet files to be reliably edited. There are also some settings (like OPLOCKS) that can be adjusted on a Windows server to make things work better.
store your Jet MDBs in shares with short paths. \\Server\Data\MyProject\MyReallyLongFolderName\Access\Databases\ is going to be much slower reading data than \\Server\Databases. This really makes a huge difference.
linked tables store metadata that can become outdated. There are two easy steps and one drastic one to be taken to fix it. First, compact the back end, and then compact the front end. That's the easy one. If that doesn't help, completely delete the links and recreate them from scratch.
you might also consider distributing an MDE to your end users instead of an MDB, as it cannot uncompile (which an MDB can).
see Tony Toews's Performance FAQ for other generalized performance information.
1) Relink Access tables from code
http://www.mvps.org/access/tables/tbl0009.htm
Once I'm ready to publish a new MDE to the users I relink the tables, make the MDE and copy the MDE to the server.
2) I specifically created the free Auto FE Updater utility so that I could make changes to the FE MDE as often as I wanted and be quite confident that the next time someone went to run the app it would pull in the latest version. For more info on the errors or the utility, see http://www.granite.ab.ca/access/autofe.htm on my website; it keeps the FE on each PC up to date.
3) Now when working on site at a client's I make the updates to the table structure after hours when everyone is out of the system. See HOW TO: Detect User Idle Time or Inactivity in Access 2000 (Q210297) http://support.microsoft.com/?kbid=210297 and ACC: How to Detect User Idle Time or Inactivity (Q128814) http://support.microsoft.com/?kbid=128814
However we found that the code which runs on the timer event must be disabled for the programmers. Otherwise weird things start happening when you're editing code.
Also, print preview would sometimes not allow the users to run a menu item to export the report to Excel or other formats. So you had to right-click on the previewed report to get some kind of internal focus back on the report before they could export it. This was also helped by extending the timer to five minutes.
The downside to extending the timer to five minutes was that if a person stayed in the same form and at the same control for considerable parts of the day, i.e. someone doing the same inquiries, the routine didn't realize that they had actually done something. I'll be putting in some logic sometime to reset this timer whenever they do something in the program.
4) In reference to another person commenting about scripts and such to update the schema, see Compare'Em http://home.gci.net/~mike-noel/CompareEM-LITE/CompareEM.htm. While it has its quirks, it does create the VBA code to update tables, fields, indexes and relationships.
Use VBA to unlink and re-link your tables to the new target when switching from dev to prod. It's been too many years for me to remember the syntax--I just know the function was simple to write.
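For what it's worth, a minimal version of such a relink routine might look like the sketch below (the procedure name and backend paths are placeholders; it repoints every Jet-linked table and leaves local tables and ODBC links, such as the AS/400 ones, alone):
Sub RelinkTables(strBackendPath As String)
    Dim db As DAO.Database
    Dim tdf As DAO.TableDef
    Set db = CurrentDb
    For Each tdf In db.TableDefs
        'Jet links have a connect string like ";DATABASE=\\server\share\backend.mdb"
        If Left$(tdf.Connect, 10) = ";DATABASE=" Then
            tdf.Connect = ";DATABASE=" & strBackendPath
            tdf.RefreshLink
        End If
    Next tdf
End Sub
'e.g. RelinkTables "\\YourServer\Dev\Backend_TEST.mdb"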
Or use MS-Access to talk to MS-Access through ODBC, or some other data connection that lives outside of the client mdb.
As with all file-based databases, you will eventually run into problems with peak usage or when you go over a small magical number of users somewhere between 2 and 30.
Also, Access tends to corrupt frequently, so backup, compact and repair need to be done on a frequent basis. 3rd party tools used to exist to automate this task.
As far as performance goes, the data is being processed client side, so you might want to use something like netmeter to watch how much data is going over the wire. The same principles about indexing and avoiding table scans apply to file-based DBs as well.
Many good suggestions from other people. Here's my 2 millicents' worth. My backend data is on a server accessed through a drive mapping; in my case, the Y drive. Production users get the mapping through a login script using Active Directory. Then the following scenarios are easily done by batch file:
Develop against local computer by doing a subst command in a batch file
run reports against last night's data by pointing Y to the backup server (read only)
run reports against end of month data by pointing to the right directory
test against specialized scenarios by keeping a special directory
In my environment (average 5 simultaneous users, 1000's of rows, not 10,000's) corruption has occurred, but it's rare and manageable. Only once in the last several years have we resorted to the previous day's backup. We use SQL Server for our higher volume stuff, but it's not as convenient to develop against, probably because we don't have a SQL admin on site.
You might also find some of the answers to this question (how to extract schemas from access) to be useful as well. Once you've extracted a schema using one of the techniques that were suggested you gain a whole range of new options like the ability to use source control on the schemas, as well as being able to easily build "clean" testing environments.
Edit to respond to comment:
There's no easy way to source control an Access database in its native format, but schema files are just text files like any other. Hence, you can check them in and out of the source control software of your choice for easy version control/rollbacks.
Of course, it relies on you having a series of scripts set up to re-build your database from the schema. Once you do, it's normally fairly trivial to create an option/alternative version that rebuilds it in a different location, allowing you to build test environments from any previous committed version of the schema. I hope that clarifies a bit!
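To make that idea concrete, a rebuild script can be as small as a routine that creates an empty MDB and replays your checked-in DDL against it. This is only a sketch with made-up table names and a placeholder path:
Sub BuildTestBackend(strPath As String)
    'create an empty MDB and replay the schema script against it
    Dim db As DAO.Database
    Set db = DBEngine.CreateDatabase(strPath, dbLangGeneral)
    db.Execute "CREATE TABLE Customers (CustomerID COUNTER CONSTRAINT PK_Customers PRIMARY KEY, CustomerName TEXT(100))", dbFailOnError
    db.Execute "CREATE TABLE Orders (OrderID COUNTER CONSTRAINT PK_Orders PRIMARY KEY, CustomerID INTEGER, OrderDate DATETIME)", dbFailOnError
    db.Execute "CREATE INDEX idxOrdersCustomerID ON Orders (CustomerID)", dbFailOnError
    db.Close
End Sub
'e.g. BuildTestBackend "C:\Dev\TestBackend.mdb"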
If you want to update the back-end MDB schema automatically when you release a new FE to the clients, then see Compare'Em http://home.gci.net/~mike-noel/CompareEM-LITE/CompareEM.htm. It will happily generate the VBA code needed to recreate an MDB, or the code to create the differences between two MDBs so you can do a version upgrade of the already existing BE MDB. It's a bit quirky, but it works.
I use it all the time.
You need to understand that a shared mdb file for the data is not a robust solution. Microsoft would suggest that SQL Server or some other server based database would be a far better solution and would allow you to use the same access front end. The migration wizard would help you make the changeover if you wanted to go that way.
As another user pointed out, corruption will occur. It is simply a question of how often, not if.
To understand the performance issues, you need to understand that to the server the mdb file with the data in it is simply that, a file. Since no code runs on the server, the server does not understand transactions, record locking, etc. It simply knows that there is a file that a bunch of people are trying to read and write simultaneously.
With a database system such as SQL Server, Oracle, DB2, MySQL, etc., the database program runs on the server and looks to the server like a single program accessing the database file. It is the database program (running on the server) that handles record locking, transactions, concurrency, logging, data backup/recovery and all the other nice things one wants from a database.
Since a database program designed to run on the server is designed to do that and only that, it can do it far better and more efficiently than a program like Access reading and writing a shared file (mdb).
There are two rules for developing against live data:
The first rule is... never develop against live data. Not ever.
The second rule is... never develop against live data. Not ever.
You can programmatically change the bindings for linked tables, so you can write a macro to change your links when you're deploying a new version.
The application is slow because it's MS Access, and it doesn't like many concurrent users (where many is any number > 1).