Is it safe to frequently Compact and Repair Access DB? - ms-access

I have an Access database that I populate using a pass-through query.
I get specific data from a SQL database and dump it into Access.
This Access DB is updated weekly, but I cannot simply append the new week's data, because the data for all the past weeks also changes.
What I do instead is truncate the Access tables and then load everything again.
When I do that, I need to Compact and Repair the DB so that the file size doesn't bloat.
My question is: is it OK to do that? Currently I am using the logic posted in this answer.
I have not encountered any problems yet, but I just want to make sure and get an Access guru's thoughts on it. I'm also planning to run the task on a schedule on our server.
I just need to make sure the file will not get corrupted easily (what is the chance of corrupting the file in the first place?).
If you ask why I need to do this at all: the users of the data have no access to the SQL Server, so I pull the data for them and they connect to the Access DB instead.
Just in case you need the code:
Dim sqlDelete As String
Dim sqlAppend As String

sqlDelete = "DELETE * FROM dbo_Table;"
sqlAppend = "INSERT INTO dbo_Table (Col1,Col2) SELECT Col1,Col2 FROM passThrough;"

With DoCmd
    .SetWarnings False
    .RunSQL sqlDelete
    .RunSQL sqlAppend
    .SetWarnings True
End With

' "Auto Compact" is the Compact on Close option for the current database
Application.SetOption "Auto Compact", True
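For what it's worth, here is a minimal sketch of the same delete/append step using CurrentDb.Execute instead of DoCmd.RunSQL, so failures raise trappable errors and SetWarnings is not needed. The table and query names are taken from the snippet above; everything else is an assumption about the setup:

Dim db As DAO.Database

Set db = CurrentDb
' dbFailOnError makes a failed action query raise a trappable error
db.Execute "DELETE * FROM dbo_Table;", dbFailOnError
db.Execute "INSERT INTO dbo_Table (Col1,Col2) SELECT Col1,Col2 FROM passThrough;", dbFailOnError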

If you need to truncate the data and load it again, I would recommend moving all tables that get truncated, along with any temporary tables, into a separate database.
Once they are separated, you can replace that database holding the unneeded data with an empty database file when the application starts. Just keep a copy of the empty database as a template and copy that file over the existing database with the old data. This must be done before opening any form or recordset based on tables from the temp database. It will be faster than a C&R each time, and more reliable, because sometimes C&R may damage the file.
The template database file can also be stored in the main database in a Memo field.
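A minimal sketch of the swap, assuming the temp tables live in a back-end called TempData.accdb and the empty copy is kept as TempData_Template.accdb in the same folder (both names are made up):

Dim strFolder As String

strFolder = CurrentProject.Path & "\"
' Overwrite the bloated temp back-end with the pristine empty template.
' Run this before any form/recordset bound to the linked temp tables is opened.
FileCopy strFolder & "TempData_Template.accdb", strFolder & "TempData.accdb"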

I will refer you to this forum post: http://www.utteraccess.com/forum/index.php?showtopic=1561733

Related

MS Access database with MySQL as the backend

My current team of 20 people uses an Access database front end on their desktops. The back end of the Access database is on a network drive. I have been asked to create an Access database front end with MySQL as the back end.
I have installed MySQL Workbench and the ODBC connector on my computer. I have created the schema and tables, and I have connected the front end of the database to the MySQL tables I created in Workbench. My questions are:
How do I deploy this for the team to use? I believe I can move the front end of the Access DB to the network drive and the team can copy it to their desktops. But what do I do about the back end?
Does the team need to have the ODBC connector installed on their computers?
Should I move MySQL Workbench to the network drive?
PS: I am a new developer just learning about databases, and I am only familiar with Access databases, so please go easy on me.
Thanks.
SO is for questions about coding, so this is OT. However:
How to deploy it: one method is described in my article, Deploy and update a Microsoft Access application with one click. The old back end you can archive; it will not be used anymore.
Does the team need the ODBC connector installed? Yes.
Should you move MySQL Workbench to the network drive? Probably not. It is only for you, should you need to modify the MySQL database.
Well, first up, you STILL need and want to deploy the application part called the front end (FE) to each workstation.
So, after you migrate the data to MySQL, you will of course use the Access Linked Table Manager and re-link the tables to MySQL. How this works is really much the same as what you have now. The only difference is that your linked tables now point to the database server (MySQL). From the application's point of view, it should work as before.
Like all applications, be it Outlook, Excel, or an accounting package, you STILL deploy the application part to each workstation. Just because YOU are now the one developing and writing software with Access does not mean that you suddenly stop deploying the FE part to each workstation. In fact, you should be deploying a compiled version of your application (an accDE).
A few more tips:
WHEN you link the FE, MAKE SURE you use a FILE DSN. The reason for this is that Access converts the links to DSN-less connections for you. What this means is that once you link the tables, you are free to deploy the FE to each workstation and it will retain the linked-table information WITHOUT you having to set up a DSN on each workstation. You will of course also have to deploy the MySQL ODBC driver to each workstation, as that is not part of Access, nor is it part of your application.
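If you ever need to re-point those links in code rather than through the Linked Table Manager, a rough sketch is below. The driver name, server, database and credentials are assumptions; match them to the MySQL ODBC driver actually installed on the workstations:

Dim tdf As DAO.TableDef
Dim strConnect As String

' DSN-less connection string (driver name is an example - check what is installed)
strConnect = "ODBC;DRIVER={MySQL ODBC 8.0 Unicode Driver};" & _
             "SERVER=myserver;DATABASE=mydb;UID=appuser;PWD=secret;"

For Each tdf In CurrentDb.TableDefs
    If Left$(tdf.Connect, 5) = "ODBC;" Then   ' only touch ODBC-linked tables
        tdf.Connect = strConnect
        tdf.RefreshLink
    End If
Next tdf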
So, just because you are now the developer does not get you off the hook of deploying that application to each workstation. The setup you have now, with a FE on each workstation, does NOT change one bit.
For the most part, after you migrate the data to MySQL and have set up your relationships (say, with MySQL Workbench), there are several other things you need to keep in mind.
All tables now need a primary key. You likely have this already, but Access with an Access back end could work with tables without a PK. For SQL Server, MySQL, etc., every table needs that PK.
Next up:
If you have any true/false columns, you MUST set up a default for each of them. True/false columns that lack a default value or that allow nulls will confuse Access, so ensure on the server side that such columns cannot be null and have a default (usually 0).
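If you want to fix this from the Access side, one hedged sketch is a temporary pass-through query that runs the ALTER statement on the server. The DSN, table and column names here are examples only:

Dim qdf As DAO.QueryDef

Set qdf = CurrentDb.CreateQueryDef("")   ' zero-length name = temporary QueryDef
qdf.Connect = "ODBC;DSN=MySQLDSN;"       ' point it at the MySQL ODBC connection
qdf.SQL = "ALTER TABLE tblHotels MODIFY Active TINYINT(1) NOT NULL DEFAULT 0;"
qdf.ReturnsRecords = False
qdf.Execute dbFailOnError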
Add a "rowversion" column. This is not to be confused with a datetime column. In SQL server this rowversion column is of timestamp data type (a poor name, since the column has zero to do with time - it is simply a column that "versions" the row. This will also eliminate many errors. I don't know what this type of column is called in MySQL, but all tables should have this column (there is zero need to see/use/look at this column on your forms - but it should be part of the table.
All of your forms, reports and code should work as before. For VBA recordset code, you need this:
Dim rst As DAO.Recordset
Dim strSQL As String

strSQL = "SELECT * FROM tblHotels"
Set rst = CurrentDb.OpenRecordset(strSQL)
In the past you likely had lots of code like the above.
You now need:
Set rst = CurrentDb.OpenRecordset(strSQL, dbOpenDynaset, dbSeeChanges)
You can generally do a search and replace: find each .OpenRecordset and add the dbOpenDynaset, dbSeeChanges arguments to each line of code you find. (I put ", dbOpenDynaset, dbSeeChanges" in my paste buffer and then do a project-wide search; it takes a few minutes at most to find all the .OpenRecordset calls.)
At this point, 99% of your code, forms and everything should work.
The ONE gotcha?
In Access, be it on a form or in VBA recordset code, when you create a new row or start typing in a form, Access with an Access back end would generate the PK at that point. There was, and is, no need to save the record to get the PK value. The need for this is rare; in a large application I only had about 2 or 3 places where it occurred.
So, if you have code like this:
Dim rstRecords As DAO.Recordset
Dim lngPK As Long    ' get PK value of new record

Set rstRecords = CurrentDb.OpenRecordset("tblHotels")
rstRecords.AddNew
' code here sets/adds data values
rstRecords!HotelName = "Mount top Hotel"
rstRecords!City = "Jasper"
' get PK of this new record
lngPK = rstRecords!ID    ' get PK
rstRecords.Update
So, in the above code, I am grabbing the PK value before the save. But with server systems, you can NOT get the PK value until AFTER the record has been saved. So you have to change the above to:
rstRecords.Update
rstRecords.Bookmark = rstRecords.LastModified
' get PK of this new record
lngPK = rstRecords!ID    ' get PK
Note how we grab/get the PK AFTER we save.
The bookmark above simply re-positions the record pointer on the new record. This is a "quirk" of DAO: when adding new records, the .Update command moves the record pointer, so you move it back with .LastModified as above. You ONLY need the .LastModified trick for NEW records; for existing records you already have the PK, so it doesn't matter.
This type of code is quite rare, but sometimes in a form (as opposed to VBA recordset code) the form's code needs to grab the PK value. In a form, you can do this:
If Me.Dirty = True Then Me.Dirty = False    ' save record
lngPK = Me.ID    ' get PK
So, once again, in the above form code I make sure the record is saved first, and THEN I grab the PK value. Of course, this is ONLY an issue for NEW records. (And you don't have to use the bookmark trick here; that is only for recordsets, not bound forms.)
So, just keep in mind that in the few cases where you need the PK value of a NEW record, you have to grab it AFTER you have updated the recordset (.Update) or, in the case of a form, after you have forced the form to save the record. This requirement applies only to new records, and only when your code needs the new PK value.
Other than the above two issues, all the rest of your existing code and forms should work as before.

How to get Around MS Access Column Number Limitation: Too Many Fields Defined

I inherited an MS Access database that is very poorly designed. I am working on redesign but in the meantime I need to provide enhancements to this current version. Because of varying datatype issues I have had to create many alter table queries to force fields into the proper datatypes. Long story short, I run into a "Too Many Fields Defined" error message from time to time while I am running the alter table/alter column queries. I've read up on it and found that the internal MS Access Column count increases whenever you run an alter column query. I am able to solve this issue by hitting the Compact and Repair button. However when my end users are running the program I don't want them to have to do this. Is there a way to either programmatically Compact and Repair (Checking the Compact on Close option doesn't seem to work) with VBA? Or can I somehow reset the internal table column count with VBA?
You can only compact a closed database, which means your code has to run in a separate database. So you have to split the front end off, run the compact routine against the back-end DB, and then attach the tables programmatically. Compacting has to be done in DAO, so you need a reference to that library.
Sub CompactDatabaseRoutine()
    Dim dbe As DBEngine
    Dim source As String
    Dim dest As String

    Set dbe = DBEngine
    source = "C:\Documents and Settings\user\My Documents\Sales2010.mdb"
    dest = "C:\Documents and Settings\user\tempSales2010.mdb"

    ' Compact into a new file, then swap it in place of the original
    dbe.CompactDatabase source, dest
    If Dir(dest) <> "" Then
        Kill source
        Name dest As source
    End If
End Sub
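If the back end is already linked into the front end, one hedged way to find the path to pass to that routine is to parse it out of a linked table's Connect property (for a linked Access table it looks like ";DATABASE=C:\path\Backend.mdb"). The table name below is a placeholder, and this simple parse assumes DATABASE= is the last parameter in the string:

Dim strConnect As String
Dim strBackEnd As String

strConnect = CurrentDb.TableDefs("SomeLinkedTable").Connect
strBackEnd = Mid$(strConnect, InStr(strConnect, "DATABASE=") + Len("DATABASE="))
Debug.Print strBackEnd    ' use this as the source argument for the compact routine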

How to end Windows lock on file in ODBC connection?

I am using Access to import data from a series of SQLite 3 databases that are all structured the same. I have a system/user DSN for "Import.db". My Access DB has linked tables to those in Import.db.
My goal is to import the data via the linked tables into Access tables, then delete Import.db, then copy the next SQLite3 DB to the same location and call it Import.db, and then keep repeating the process till all are imported.
I took this approach because I don't know how to create DSNs on the fly and link tables for SQLite3 dbs. SQL Server, yes, but not SQLite3. So I thought, just use the same DSN but change the actual file.
The trouble is, after opening my Access DB and opening the linked tables, Access creates a Windows file lock on Import.db. So I can't delete and replace it. Instead, I can import one, then close Access, reopen Access, and repeat. Not so hot.
Suggestions?
You can get the connection string of a linked table with:
Debug.Print CurrentDb.TableDefs("MyLinkedTable").Connect
You now have the information you need to create a query that imports without a linked table.
You might get something like this:
ODBC;DSN=SQLite3;Database=Z:\Docs\Import.db;StepAPI=0;SyncPragma=NORMAL;
NoTXN=0;Timeout=;ShortNames=0;LongNames=0;NoCreat=0;NoWCHAR=0;
FKSupport=0;JournalMode=;OEMCP=0;LoadExt=;BigInt=0;
But you probably won't need most of it, so:
sODBC = "[ODBC;DSN=SQLite3;Database=Z:\Docs\Import.db;]"
''Create table query, but append and update are also easy enough
sSQL = "SELECT * INTO SQLite_Import FROM " & sODBC & ".SQLiteTableNameHere"
CurrentDb.Execute sSQL, dbFailOnError
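Since the Database= part of that connect string points straight at a file, you never need a linked table at all, which also means no file lock. A hedged sketch of the whole loop (the file names, the local table and the SQLite table name are all examples):

Dim varFile As Variant
Dim sODBC As String
Dim sSQL As String

For Each varFile In Array("Z:\Docs\Data1.db", "Z:\Docs\Data2.db", "Z:\Docs\Data3.db")
    sODBC = "[ODBC;DSN=SQLite3;Database=" & varFile & ";]"
    ' append into an existing local table; SELECT INTO would work for the first file
    sSQL = "INSERT INTO SQLite_Import SELECT * FROM " & sODBC & ".SQLiteTableNameHere"
    CurrentDb.Execute sSQL, dbFailOnError
Next varFile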

Access 2007 SCM over multiple databases

I'm looking for a way to implement SCM over multiple Access 07 databases. We just need source control for the forms/code.
Essentially we have 100+ databases that are structured the same, use the same code/modules, but each contains data for just one client. When we make a code change we currently have to manually go through each file to make the change.
Has anyone implemented source control for something similar(God help you if so) or have any ideas?
PS - I realize there is lots of DailyWTFery in here, this is a legacy product I've been assigned to do some emergency maintenance on before we rewrite to .NET/MSSQL, but I think there's enough work to warrant putting this in place if it's possible.
You can find out more about how to do SCM in this question; it does involve exporting and importing the code and forms using some undocumented (or poorly documented) commands. I also recall issues with checksums or version numbers cropping up when you used that method.
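For reference, a minimal sketch of the export side using the undocumented Application.SaveAsText method that approach relies on (the output folder is an example, and you would typically run this in each database or drive it from a controlling database):

Dim obj As AccessObject
Dim strOut As String

strOut = "C:\src\"
For Each obj In CurrentProject.AllForms
    Application.SaveAsText acForm, obj.Name, strOut & "Form_" & obj.Name & ".txt"
Next obj
For Each obj In CurrentProject.AllModules
    Application.SaveAsText acModule, obj.Name, strOut & "Module_" & obj.Name & ".bas"
Next obj
' Application.LoadFromText acForm, "SomeForm", strOut & "Form_SomeForm.txt" re-imports an object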
However, you could solve a lot of this problem by separating the data and the application sides of the DB into separate files, and then adding table links from the application DB to the data DB. Then you would have just one application DB and oodles of client data DBs.
Switching to a different client is as simple as re-linking to the other DB.
You can do that manually or code it with a structure something like this:
Dim myTable As DAO.TableDef
Dim db As DAO.Database
Dim strDBPath As String

strDBPath = ";DATABASE=C:\workbench\MyNewDB.mdb"
Set db = CurrentDb
db.TableDefs.Refresh

For Each myTable In db.TableDefs
    If Len(myTable.Connect) > 0 Then
        myTable.Connect = strDBPath
        myTable.RefreshLink
    End If
Next

'' Note: This whips through every linked table in your database and re-links it
'' to the new DB. ODBC links will have a different value for strDBPath - I'll
'' leave that as an exercise for the reader.

Synchronize data between like tables in an mdb file and a MySQL server schema

I am looking for a way to manage synchronization between an Access mdb file used by an application and a MySQL schema containing a copy of the same tables. This arose from the fact that the application doesn't support MySQL as a backend, but I am looking for a way to utilize MySQL for other in-office applications using the data the first application generates.
Some givens:
1> We cannot abandon the first application, and it's only compatible with Microsoft SQL Server as a backend server to house data.
2> We are not against using Microsoft SQL Server, but the licensing cost is a big concern - as is rewriting some other Access applications written to use linked tables and separate mdb files.
3> The database server should be "PHP friendly" for a future expansion project for an internal corporate intraweb.
4> No data needs to be, nor should be allowed to be, accessed from outside the corporate network.
I hope I am not being too obscure, but I don't want to break confidences either - so I am trying to walk a pretty tight rope. If anyone can help, I'd greatly appreciate it.
Synchronizing two databases is very, very complicated if both databases need to be updatable. If one is a slave of the other, it's not nearly as difficult. I have programmed just this kind of synchronization with Access more than once: once with an MDB on a web server that had to be synched with a local data MDB (to incorporate data edited on the website; no edits went back to the website, so it was a one-way synch, but edits still had to be merged on the non-web side), and once as a synch between MySQL on a website and Access in a master (MySQL) / slave (Access) relationship.
On the website, you program a dump of data for each table to a text file. It's helpful to have timestamp fields in the MySQL tables so that you know when records were created and updated. This allows you to select which records to dump since the last data dump (which makes the synchronization of data on the Access side much simpler).
The way I programmed it was to then import the text files into staging tables that were indexed appropriately and linked to the front-end Access app. Once the data dumps are imported into the staging tables, you then have three tasks:
find the new records and append them to the Access data store. This is easily done with an outer join.
deal with the deletions. I'll discuss this later, as it's, well, complicated.
deal with updated records. For this, I wrote DAO code that would write column-by-column SQL statements.
Something like this:
UPDATE LocalTable INNER JOIN DownloadTable
    ON LocalTable.ID = DownloadTable.ID
SET LocalTable.Field1 = DownloadTable.Field1, LocalTable.Updated = DownloadTable.Updated
WHERE LocalTable.Field1 <> DownloadTable.Field1
Now, obviously, the WHERE clause has to be a bit more complicated than that (you have to deal with NULLs, and you have to use criteria formatted appropriately for the data types, i.e., with "" and ## for text and dates, respectively, and no delimiters for numeric data), but writing the code to do that is pretty easy.
Skeleton code looks something like this:
Dim db As DAO.Database
Dim rsFields As DAO.Recordset
Dim fld As DAO.Field
Dim strSQL As String

Set db = CurrentDb
Set rsFields = db.OpenRecordset("SELECT TOP 1 Field1, Field2, Field3 FROM LocalTable;")

For Each fld In rsFields.Fields
    ' build your UPDATE statement for this field and execute it
Next fld

Set fld = Nothing
rsFields.Close
Set rsFields = Nothing
Set db = Nothing
Now, as I said, the complicated part is writing the WHERE clause for each SQL statement, but that's pretty easy to figure out. Also, note that in your rsFields recordset (which is used only to walk through the fields you want to update) you want to include only the fields that are updatable, so you'd leave out the CREATED field and the PK field (and any other fields that you don't want to update).
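To make the loop above concrete, here is one hedged sketch of how the per-column UPDATE might be built. The table names come from the example; the PK column name "ID" and the use of Nz() for NULL handling are assumptions, so adjust them to your schema:

Private Function BuildFieldUpdate(fld As DAO.Field) As String
    ' Builds an UPDATE that copies one column from the staging table where it differs
    Dim strLeft As String
    Dim strRight As String

    Select Case fld.Type
        Case dbText, dbMemo
            strLeft = "Nz(LocalTable.[" & fld.Name & "],'')"
            strRight = "Nz(DownloadTable.[" & fld.Name & "],'')"
        Case Else   ' numeric and date fields compared via Nz(...,0)
            strLeft = "Nz(LocalTable.[" & fld.Name & "],0)"
            strRight = "Nz(DownloadTable.[" & fld.Name & "],0)"
    End Select

    BuildFieldUpdate = _
        "UPDATE LocalTable INNER JOIN DownloadTable " & _
        "ON LocalTable.ID = DownloadTable.ID " & _
        "SET LocalTable.[" & fld.Name & "] = DownloadTable.[" & fld.Name & "] " & _
        "WHERE " & strLeft & " <> " & strRight & ";"
End Function

Inside the For Each fld loop you would then run something like db.Execute BuildFieldUpdate(fld), dbFailOnError.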
Now, for the DELETES.
You might think it's a good idea to simply delete any record in the local table that's not in the remote table. That works fine if it really is a slave database, but so often what is originally a slave ends up getting its own edits. So, in that case, you need to not delete records from the master MySQL database, and instead have a DELETE flag that marks records deleted. You could have different varieties of logic that could clean the deleted records out of the master database (e.g., if you're using date stamps in the records, you could delete all records flagged DELETED with LastUpdated timestamp that is <= the last time you dumped the data; alternatively, you could have the Access app send a text file up to the server with a list of the records that have been successfully deleted from the Access data store). If there are edits in the Access data store, then you'll need some logic for dealing with an edit there on a record that was deleted from the MySQL "master" database.
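As a rough illustration of the flagged-delete pass on the Access (slave) side - the table, PK and flag names here are placeholders, not from the original setup:

' Remove local rows whose counterpart in the staging download is marked deleted
CurrentDb.Execute _
    "DELETE * FROM LocalTable " & _
    "WHERE ID IN (SELECT ID FROM DownloadTable WHERE Deleted = True);", _
    dbFailOnError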
In summary:
If you have a true master/slave relationship, it's fairly trivial. If you really wanted to do it by brute force, you'd just dump the entirety of all the MySQL data tables to text files, delete all the records in your Access data store and import the text files.
I would tend not to do that, as the first time you need to depart from the pure master/slave relationship, you're hosed and have to rewrite everything from scratch.
The outline I gave above will work very cleanly for master/slave, but will also work well if you have a few fields that are private to the "slave" database, or data that exists in the slave and not in the master (which is the situation I was working with).