solution needed - 2 users running a program - ms-access

So I've developed this Access 2007 application with about 2 forms, a lot of VBA code and a bunch of tables.
Now the business wants to run this off a network drive (call it G:\ for example). My current solution (which I've already implemented) is to have a table similar to:
| Setting  | Value |
|----------|-------|
| Updating | 1     |
| UpdateBy | User1 |
So let me give you some context. When the application runs there is a button called "update" which updates the local table from a remote server so we can apply filtering. Now when two people (user1, user2) launch the application and one of them clicks update, the Updating field is set to true and UpdateBy is set to their name.
When user2 then tries to update, the application checks whether the Updating field is true; if it is, it shows a message to user2 (not to user1).
It works beautifully right now, but here is the problem: let's say user1 is updating and closes his program (or taskkills it), or the power goes out; then the application shuts off with the Updating field still set to true. Now no matter who launches it, they cannot update because it's "already updating".
Can you guys think of a solution to this? Maybe a workaround?

Consider a different locking strategy. In the click event of your "update" button, you can first open a recordset based on your tblUpdateStatus (the table where you've been writing UpdateBy) with dbDenyWrite + dbDenyRead options.
Dim db As DAO.Database, rst As DAO.Recordset
Set db = CurrentDb
Set rst = db.OpenRecordset("tblUpdateStatus", _
    dbOpenTable, dbDenyWrite + dbDenyRead)
Then do your other operations for the "update" button. Afterward, close and release the recordset ... which releases the tblUpdateStatus lock.
Trap the error when a user is unable to open the recordset (because another user has the table locked), give them a message to try later and exit your click event subroutine.
With this approach, when user1 locks tblUpdateStatus but exits Access uncleanly, her lock on tblUpdateStatus is released. You may not even need to update tblUpdateStatus unless you want to record which user has it locked.
See Create and Use Flexible AutoNumber Fields (from the Access Cookbook) for more details about using a recordset with dbDenyWrite + dbDenyRead.
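Putting it together, a minimal sketch of the whole pattern (assuming DAO, your existing tblUpdateStatus table, and a button named cmdUpdate; the trapped error number is an assumption - check which error you actually get when the table is locked):

Private Sub cmdUpdate_Click()
    On Error GoTo ErrHandler
    Dim db As DAO.Database
    Dim rst As DAO.Recordset

    Set db = CurrentDb
    ' Fails immediately if another user already has tblUpdateStatus open exclusively.
    Set rst = db.OpenRecordset("tblUpdateStatus", dbOpenTable, dbDenyWrite + dbDenyRead)

    ' ... do the actual update of the local table here ...

    rst.Close            ' closing the recordset releases the lock on tblUpdateStatus
    Set rst = Nothing
    Exit Sub

ErrHandler:
    If Err.Number = 3262 Then    ' assumed "couldn't lock table" error code
        MsgBox "Another user is updating right now. Please try again later."
    Else
        MsgBox "Unexpected error " & Err.Number & ": " & Err.Description
    End If
End Sub

If user1's session dies while the recordset is open, the lock goes away with it, so user2 can run the update on the next attempt.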

Please do not run an Access application with more than one user when you have not split the database. It will cause endless trouble. The data portion (back-end) should be put on the server and the code and forms (front-end) should be put on each user's desktop.
More information: http://support.microsoft.com/kb/162522

Read my article on why you split here:
http://www.members.shaw.ca/AlbertKallal/Articles/split/index.htm
In the above, I don't just tell you to do this, but I tell you WHY you split.
It should help you a lot in terms of users tripping over each other.

Related

MS Access database with MySQL as the backend

My current team of 20 people uses an Access database front end on their desktops. The back end of the Access database is on a network drive. I have been asked to create an Access database front end with MySQL as the back end.
I have installed the MySQL Workbench and the ODBC connector on my computer. I have created the schema and tables, and I have connected the front end of the database to the MySQL tables I created in Workbench. My questions are:
How do I deploy this for the team to use? I believe I can move the front end of the Access DB to the network drive and the team can copy it to their desktops. But what do I do about the back end?
Does the team need to have the ODBC connector installed on their computers?
Should I move the MySQL workbench to the network drive?
PS: I am a new developer just learning about databases, and I am only familiar with Access, so please go easy on me.
Thanks.
SO is for questions about coding, so this is OT. However:
How do I deploy this for the team to use? One method is described in my article:
Deploy and update a Microsoft Access application with one click.
What do I do about the backend? The old backend you can archive; it will not be used anymore.
Does the team need to have the ODBC connector installed on their computers? Yes.
Should I move the MySQL Workbench to the network drive? Probably not. It is only for you, should you need to modify the MySQL database.
Well, first up, you STILL need and want to deploy the application part called the front end (FE) to each workstation.
So, after you migrate the data to MySQL, you will of course use the Access Linked Table Manager and link the tables to MySQL. How this works is really much the same as what you have now. The only difference is that your linked tables now point to the database server (MySQL). From the application's point of view, it should work as before.
Like all applications, be it Outlook, Excel, accounting packages? You STILL deploy the application part to each workstation. So just because YOU are now developing and writing software with Access does not mean out of the blue that you now for some strange reason STOP deploying the FE part to each workstation. In fact, you should be deploying a compiled version of your application (an ACCDE).
A few more tips:
WHEN you link the FE, MAKE SURE you use a FILE dsn. The reason is that Access converts the links to DSN-less for you. What this means is that once you link the tables, you are free to deploy the FE to each workstation and it will retain the linked table information WITHOUT you having to set up a DSN connection on each workstation. You will also of course have to deploy the MySQL ODBC driver to each workstation, as that is not part of Access nor part of your application.
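If you ever need to re-point those links from code rather than the Linked Table Manager, a rough sketch of a DSN-less relink loop is below (the driver name, server, database and credentials are placeholders, not anything from this answer):

' Re-point every ODBC linked table to a DSN-less MySQL connection.
' DRIVER/SERVER/DATABASE/UID/PWD below are placeholder values.
Public Sub RelinkToMySQL()
    Const CONN As String = "ODBC;DRIVER={MySQL ODBC 8.0 Unicode Driver};" & _
                           "SERVER=myserver;DATABASE=mydb;UID=myuser;PWD=mypassword;"
    Dim tdf As DAO.TableDef
    For Each tdf In CurrentDb.TableDefs
        If Left$(tdf.Connect, 5) = "ODBC;" Then   ' only touch ODBC linked tables
            tdf.Connect = CONN
            tdf.RefreshLink
        End If
    Next tdf
End Sub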
So just because you are now developing software does not suggest nor get you off the hook of deploying that application to each workstation. So, your setup you have now with a FE on each workstation does NOT change one bit.
For the most part, after you migrate the data to MySQL and then set up your relationships (say, with MySQL Workbench), there are several other things you need to keep in mind.
All tables now need a primary key. You likely have this already, but Access with an Access back end did and could work on tables without a PK. However, for SQL Server, MySQL, etc., all tables need that PK.
Next up:
If you have any true/false columns, you MUST set up a default for that column. If such true/false columns lack a default value or allow nulls, this will confuse Access - so ensure that true/false columns can't have nulls and have a default (usually 0) set up on the server side.
Add a "rowversion" column. This is not to be confused with a datetime column. In SQL Server this rowversion column is of the timestamp data type (a poor name, since the column has zero to do with time - it is simply a column that "versions" the row). This will also eliminate many errors. I don't know what this type of column is called in MySQL, but all tables should have this column (there is zero need to see/use/look at this column on your forms - but it should be part of the table).
All of your forms, reports and code should work as before. For VBA recordset code, you need this:
Dim rst As DAO.Recordset
Dim strSQL As String
strSQL = "SELECT * FROM tblHotels"
Set rst = CurrentDb.OpenRecordset(strSQL)
You likely in the past had lots of code as per above.
You now need:
Set rst = CurrentDb.OpenRecordset(strSQL, dbOpenDynaset, dbSeeChanges)
You can generally do a search and replace: find each .OpenRecordset and add the dbOpenDynaset, dbSeeChanges options to each line of code you find. (I put ", dbOpenDynaset, dbSeeChanges" in my paste buffer and then do a system-wide search. It takes a few minutes at most to find all the .OpenRecordset calls.)
At this point, 99% of your code, forms and everything should work.
The ONE gotcha?
In Access, be it on a form or in VBA recordset code? When you create a new row, or start typing on/in a form? Access with an Access back end would generate the PK at that point. There was/is no need to save the record to get the PK value. This need is rare, and in a large application I only had about 2 or 3 places where this occurred.
So, if you have code like this:
Dim rstRecords As DAO.Recordset
Dim lngPK As Long ' get PK value of new record
Set rstRecords = CurrentDb.OpenRecordset("tblHotels")
rstRecords.AddNew
' code here sets/adds data/values
rstRecords!HotelName = "Mount top Hotel"
rstRecords!City = "Jasper"
' get PK of this new record
lngPK = rstRecords!ID ' get PK
rstRecords.Update
So, in the above code, I am grabbing the PK value. But with server systems, you can NOT get the PK value until AFTER the record has been saved. So, you have to change the above to:
rstRecords.Update
rstRecords.Bookmark = rstRecords.LastModified
' get PK of this new record
lngPK = rstRecords!ID ' get PK
Note how we grab/get the PK AFTER we save.
The bookmark above simply re-sets the record pointer to the new record. This is a "quirk" of DAO: WHEN adding new records, an .Update command moves the record pointer, and thus you move it back with the above .LastModified. You ONLY need the .LastModified trick for NEW records. For existing records, you already have the PK - so it doesn't matter.
This type of code is quite rare, but sometimes in a form (and not VBA recordset code), some form code might need to get/grab the PK value. In a form, you can do this:
If Me.Dirty = True Then Me.Dirty = False ' save record
lngPK = Me.ID ' get PK
So, once again, in the above form code, I make sure the record is saved first, and THEN I grab the PK value as above. Of course this is ONLY an issue for NEW records. (And you don't have to use the bookmark trick - that's only for recordsets, not bound forms.)
So, just keep in mind in the few cases when you need the PK value of a NEW record, you have to do this AFTER you saved the recordset (update) or in the case of a form, after you forced a form record save. This requirement is only for new records and ALSO only when your code needs to get/grab the new PK value.
Other than the above two issues? All the rest of your existing code and forms should work as before.

SQL Server rows not editable for Access after Insert

I have this problem: I'm using a SQL Server 2008R2 backend and MS Access 2000 frontend where some tables are connected via ODBC.
Following Structure (Tables all on SQL-Server):
Import (not connected to Access)
Products (connected via ODBC to Access)
Pricing (connected via ODBC to Access)
I want to fill the Pricing table automatically with some data from Products and Import. This is supposed to run as a SQL Agent job with a T-SQL script.
I want to insert the data from "Products" with following command:
INSERT INTO Pricing (Productnr, Manufacturernr)
(SELECT Productnr, Manufacturernr
FROM Products
WHERE Valid = 1
AND Productnr NOT IN (SELECT Productnr FROM Pricing ));
Right after that, the inserted rows are locked for Access; I can't change anything. If I execute SQL queries with SQL Server Management Studio or if I start queries as SQL Agent jobs, everything works fine.
Why are the rows locked in MS Access after the query ran (even if it finished successfully)? And how can I unlock them, or make it unlock itself right after the query/job has run?
Thanks
When SQL Server inserts new rows, those new rows are in fact exclusively locked to prevent other transactions from reading or manipulating them - that's by design, and it's a good thing! And it's something you cannot change - you cannot insert without those locks.
You can unlock them by committing the transaction that they're being inserted under - once they're committed to SQL Server, you can access them again normally.
The error message I get says that the record has been changed by another user, and if I save it, I would undo the changes of the other user (and it asks me to copy my changes to the clipboard).
This is different from "locked", and completely normal.
If you have a ODBC linked table (or form based on the table) open, and change data in the backend, Access doesn't know about the change.
You need to do a full requery (Shift+F9) in Access to reload the data, afterwards all records can be edited again.
Got the solution for my problem now.
I had to add a timestamp column to the Pricing table, so Access could recognize the change.
Access loads the data into the front end when the table is first accessed. If something in the backend changes the data, you need Access to refresh it first, before you can edit it from the front end (or see the changes).
Do this (in Access) by closing and reopening the table, or switching to the table and pressing Shift+F9 as Andre suggested, or programmatically using a requery statement. You must requery, not refresh, for it to release the locks and register the changes made in SQL.
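For the programmatic route, a minimal sketch, assuming a form bound to the linked Pricing table (the button name is just an illustration):

' In the form bound to the linked table: reload the data from SQL Server
' so rows touched by the Agent job become editable again.
Private Sub cmdReload_Click()
    Me.Requery   ' full requery, equivalent to Shift+F9; a plain Refresh is not enough
End Sub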

EJB Timer for deleting database entries

I am currently working on a j2ee web application. The application features a way for users to reset their passwords if they forget them.
I have a database table with 3 columns: username, key, and timestamp.
When the user requests a password change, I add an entry in that table with their username and a random key (making sure that there are no duplicate keys in the table, and that a user can only appear once in the table). I also add the current time. I then send them an e-mail with a link to the application that contains their key, something like:
mysite.com/app/reset?key=abcxyz123
The servlet that handles this request looks at the key in the URL to find the matching entry in the reset table and determine which user the key belongs to. If the key doesn't match an entry, I show an error page; if it does, I show the password reset screen. Once the user changes their password, I manually delete the entry from that reset table.
I am trying to implement the equivalent of a time to live for the password reset links, so that I don't have entries loitering in the table unnecessarily, and I thought of 2 options, the first of which I have implemented:
1) Create an EJB Timer that fires every minute and deletes entries in the reset table whose timestamp is older than 30 minutes. This is a manual process in that I am using Hibernate as my JPA implementation, so I retrieve all the entries from the table, examine their timestamps, and delete the old ones.
2) Create a database job that deletes rows over a certain age?
My question is, does anyone see any drawbacks to the first approach, and second, is the 2nd option even possible with MySQL? I figure that if I can use the 2nd approach, I can get rid of the timer and let the database handle the time-to-live aspect of the password reset links, which may be more efficient.
I haven't been doing j2ee development for that long, but based on the knowledge that I have, these seemed like 2 logical approaches. I welcome any input.
3) Create a script that connects to the DB, executes the delete, and disconnects. You can then schedule this script via the operating system, e.g. crontab.
Regarding option 1 - the drawback of that solution is that it uses application server resources for work that can be done entirely on the database and doesn't depend on (or use) any application logic.
The benefit is that the whole app is self-contained and you don't need any additional installation/setup on the database side as with options 2 and 3.

Excel through Access through ODBC Cache DB - locking licenses issue

I have an excel workbook set up with 25 sheets in it. Each sheet has a data connection to a query in MS Access. Each access query has one or more linked tables from an InterSystems Caché DB. Here is the connection string from one of them.
Provider=Microsoft.ACE.OLEDB.12.0;User ID=Admin;Data Source=\DIR\SUBDIR\XXX\Database\CAST\CAST_CLIENT_SETTINGS.mdb;Mode=Share Deny Write;Extended Properties="";Jet OLEDB:System database="";Jet OLEDB:Registry Path="";Jet OLEDB:Engine Type=5;Jet OLEDB:Database Locking Mode=0;Jet OLEDB:Global Partial Bulk Ops=2;Jet OLEDB:Global Bulk Transactions=1;Jet OLEDB:New Database Password="";Jet OLEDB:Create System Database=False;Jet OLEDB:Encrypt Database=False;Jet OLEDB:Don't Copy Locale on Compact=False;Jet OLEDB:Compact Without Replica Repair=False;Jet OLEDB:SFP=False;Jet OLEDB:Support Complex Data=False;Jet OLEDB:Bypass UserInfo Validation=False
Background refresh is not enabled and the command type is table.
The Access DB has 25 linked ODBC tables to a Caché DB. There are 25 queries in the Access DB that each use one or more of the linked tables. Record Locks is set to No and Recordset Type to Dynaset in each.
There is a button with some vba code in the excel workbook that kicks off a refresh of all 25 sheets to bring back the newest information. This works just fine other than taking a few minutes but the issue is that it locks up 25 licenses in Cache and keeps them locked until the workbook is closed.
The company only has 50 licenses so I can't use this many. Is there any setting I'm missing that would stop this from happening? Could I change the mode, the locking mode, the recordset type, background refresh, use a pass through? Could I write something into the vba to remove these locks after it refreshes? People use this utility to track changes they make in the application corresponding to the Cache DB in relatively real time.
Before it gets totally scrapped, I was hoping someone out there would have an idea that allowed it to use only one license if possible. Thanks for your help.
You have to look at the code behind that button to see what it is doing. I guess it fires off all 25 queries at the same time. You'd probably have to change it to refresh just one query and wait until it has finished before refreshing the next.
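A rough sketch of what that sequential refresh could look like, assuming the sheets are driven by ordinary workbook data connections (the macro name is an illustration):

' Turn off background refresh so each connection finishes before the next starts.
Public Sub RefreshSequentially()
    Dim conn As WorkbookConnection
    For Each conn In ActiveWorkbook.Connections
        Select Case conn.Type
            Case xlConnectionTypeOLEDB
                conn.OLEDBConnection.BackgroundQuery = False
            Case xlConnectionTypeODBC
                conn.ODBCConnection.BackgroundQuery = False
        End Select
        conn.Refresh    ' now runs synchronously, one connection at a time
    Next conn
End Sub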
I know that this is a very old post, but by default Excel keeps OLEDB/ODBC connections open (thus locking the Access database in use).
No amount of configuration via the connection string will stop Excel from doing this. Contrary to expectations, setting the mode to Share Deny None or Read Only won't get you past the trouble. However, you can modify the properties of the connection via code to prevent this behavior, using the .MaintainConnection property. There is no setting in the UI to modify this property.
Try this code:
Function unlock_conns()
    Dim conn As WorkbookConnection
    For Each conn In ActiveWorkbook.Connections
        Select Case conn.Type
            Case xlConnectionTypeOLEDB
                conn.OLEDBConnection.MaintainConnection = False
            Case xlConnectionTypeODBC
                conn.ODBCConnection.MaintainConnection = False
        End Select
    Next conn
End Function
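One way to wire it up, assuming the refresh is triggered from a button macro (the Sub name here is hypothetical, not from the original answer): run the refresh, then drop the connections so the Caché licenses are released without closing the workbook. Since background refresh is disabled in this workbook, RefreshAll completes before the connections are released.

' Hypothetical button handler: refresh everything, then release the connections.
Sub RefreshAllAndRelease()
    ThisWorkbook.RefreshAll
    unlock_conns
End Sub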

Access Continuous Form with Linked Table - How to Avoid Hitting Database Server for Every Row in Form?

I'm migrating the data from an Access database to SQL Server via the SQL Server Migration Assistant (SSMA). The Access application will continue to be used with the local tables converted to linked tables.
One continuous form hangs for 15 - 30 seconds when it's loading. It displays approximately 2000 records. When I looked in SQL Server Profiler to see what it was doing, it was making a separate call to the backend database for each record in the form. So the delay when the form opens is caused by the 2000-odd separate calls to the database.
This is amazingly inefficient. Is there any way to get Access to make a single call to the backend database and retrieve all the records at once?
I don't know if this is relevant but the Record Source for the form is a view in the SQL Server backend database, which is linked to via an Access linked table (so, hopefully, Access just sees it as a table, not a view). I needed an Instead Of trigger on the view in SQL Server, and a unique index on the linked table in Access, to allow the records to be updated via the form.
If the act of opening that continuous form really does generate ~2000 separate SQL queries (one for every row in the view) then that is unusual behaviour for Access interacting with a SQL Server linked "table". Under normal circumstances what takes place is:
Access submits a single query to return all of the Primary Key values for all rows in the table/view. This query may be filtered and/or sorted by other columns based on the Filter and Order By properties of the form. This gives Access a list of the key values for every row that might be displayed in the form, in the order in which they will appear.
Access then creates a SQL prepared statement using sp_prepexec to retrieve entire rows from the table/view ten (10) rows at a time. The first call looks something like this...
declare @p1 int
set @p1=4
exec sp_prepexec @p1 output,N'@P1 int,@P2 int,@P3 int,@P4 int,@P5 int,@P6 int,@P7 int,@P8 int,@P9 int,@P10 int',N'SELECT "ID","AgentName" FROM "dbo"."myTbl" WHERE "ID" = @P1 OR "ID" = @P2 OR "ID" = @P3 OR "ID" = @P4 OR "ID" = @P5 OR "ID" = @P6 OR "ID" = @P7 OR "ID" = @P8 OR "ID" = @P9 OR "ID" = @P10',358,359,360,361,362,363,364,365,366,367
select @p1
...and each subsequent call uses sp_execute, something like this
exec sp_execute 4,368,369,370,371,372,373,374,375,376,377
Access repeats those calls until it has retrieved enough rows to fill the current page of continuous forms. It then displays those forms immediately.
Once the forms have been displayed, Access will "pre-fetch" a couple of more batches of rows (10 rows each) in anticipation of the user hitting PgDn or starting to scroll down.
If the user clicks the "Last Record" button in the record navigator, Access again uses sp_prepexec and sp_execute to request enough 10-row batches to fill the last page of the form, and possibly pre-fetch another couple of batches in case the user decides to hit PgUp or start scrolling up.
So in your case if Access really is causing SQL Server to run individual queries for every single row in the view then there may be something particular about your SQL View that is causing it. You could test that by creating an Access linked table to a single SQL Table or a simple one-table SQL View, then use SQL Server Profiler to check if opening that linked table causes the same behaviour.
Turned out the problem was two aggregate fields. One field's Control Source was =Count(ID) and the other field's Control Source was =Sum(Total_Qty).
Clearing the control sources of those two fields allowed the form to open quickly. SQL Server Profiler shows it calling sp_execute, as Gord Thompson described, to retrieve seven batches of 10 rows at a time. Much quicker than making 2000 calls to retrieve one row at a time.
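If you still want those totals on the form, one hedged workaround (not part of the original answer) is to fill unbound text boxes from code after the form loads, instead of binding them with =Count()/=Sum() expressions over the linked view. The control and view names below are placeholders, and how efficiently Access hands the aggregation to SQL Server will depend on your setup:

' Hypothetical sketch: populate unbound totals once, instead of using
' =Count(ID) / =Sum(Total_Qty) control sources on the continuous form.
Private Sub Form_Load()
    Me.txtRecordCount = DCount("ID", "MyLinkedView")
    Me.txtTotalQty = DSum("Total_Qty", "MyLinkedView")
End Sub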
I've come across the same problem again but this time with a different cause. I'm including it here for completeness, to help anyone in a similar situation:
This time the underlying query was hanging and SQL Server Profiler showed the same behaviour as before, with Access making separate calls to the SQL Server database to bring back one record at a time, for every record in the query.
The cause turned out to be the ORDER BY clause in the query. I guess Access had to pull back all records in the linked table from SQL Server before being able to order them. Makes sense when I think of it. Although I don't know why Access doesn't just pull all records through at once, instead of getting the records one at a time.
I would try setting the Recordset Type to Snapshot (on the Data tab of the form's property sheet, and/or the property sheet of the query you are using for the form's record source).
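If you prefer to do that in code rather than the property sheet, a minimal sketch (the value 2 corresponds to Snapshot; note the form then becomes read-only):

' Switch the form to a snapshot-type recordset when it opens.
Private Sub Form_Open(Cancel As Integer)
    Me.RecordsetType = 2   ' 0 = Dynaset, 1 = Dynaset (Inconsistent Updates), 2 = Snapshot
End Sub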