Copying to/from dBase data using Access - ms-access

I am stuck with some legacy back ends using dBase IV, and would like to be able to copy records from one table to another using an Access front end. The simple answer would be to link to the source and destination tables and run an INSERT query or similar.
However, in my situation, the back end is not a single DBF file, but there are several hundred files that I need to dynamically link to for the copy operation. Currently I have to change and refresh the link definition using the TableDefs property (in VBA) every time I wish to perform the copy operation.
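Roughly like this (a minimal sketch of that relink step; the link name, folder, and file are placeholders):
Dim tdf As DAO.TableDef
Set tdf = CurrentDb.TableDefs("lnkSource")  ' placeholder link name
tdf.Connect = "dBASE IV;DATABASE=z:\docs"   ' point the link at the next folder
tdf.SourceTableName = "dbf1"                ' and the next DBF file
tdf.RefreshLink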
The catch is that the front end is shared, meaning each user has to have a separate copy of the FE so that the linked table definitions are not modified by another user.
Is there an easy way to do this without using linked tables? I can open DAO connections to the source and destination, but I cannot find any simple way of copying the records (apart from one at a time). Is there any way around this?

It is possible to run a query using the linked DBF that inserts into a DBF in another location:
INSERT INTO [dBASE III;DATABASE=z:\docs\].[dbf2.dbf]
SELECT *
FROM dbf1;
Or
INSERT INTO dbf1
SELECT *
FROM [dBASE III;DATABASE=z:\docs\].[dbf2.dbf];
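You can take this further and build the statement in VBA at run time, avoiding linked tables (and the per-user front-end problem) entirely. A minimal sketch, assuming Jet accepts the connect string on both sides (the examples above qualify one side at a time) and that the folders and file names arrive as parameters; all names here are placeholders:
Public Sub CopyDbfRecords(srcFolder As String, srcFile As String, _
                          dstFolder As String, dstFile As String)
    Dim sql As String
    ' Qualify both tables inline so no linked TableDef is needed
    sql = "INSERT INTO [dBASE IV;DATABASE=" & dstFolder & "].[" & dstFile & "] " & _
          "SELECT * FROM [dBASE IV;DATABASE=" & srcFolder & "].[" & srcFile & "];"
    CurrentDb.Execute sql, dbFailOnError
End Sub
Because nothing is linked, each user can run this from the shared front end without disturbing anyone else's table definitions.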

Related

SSIS package design, where 3rd party data is replacing existing data

I have created many SSIS packages in the past, though the need for this one is a bit different from the others I have written.
Here's the quick description of the business need:
We have a small database on our end sourced from a 3rd party vendor, and this needs to be overwritten nightly.
The source of this data is a bunch of flat files (CSV) from the 3rd party vendor.
Current setup: we truncate the tables of this database, and we then insert the new data from the files, all via SSIS.
Problem: There are times when the files fail to arrive, and what happens is that we truncate the old data without having a fresh data set to load. This leaves us with an empty database, when we would prefer to have yesterday's data over no data at all.
Desired Solution: I would like some sort of mechanism to see if the new data truly exists (these files) prior to truncating our current data.
What I have tried: I tried to capture the data from the files into an ADO recordset and only proceed if that step was successful. This doesn't seem to work for me, as all the data capture activities are in one data flow and I don't see a way to reuse that data. It would also seem wasteful of resources to load everything and let the in-memory tables just sit there.
What have you done in a similar situation?
If the files are not present, update flags such as IsFile1Found to false and pass these flags to a stored procedure that truncates conditionally.
To check whether a file is empty, you can use PowerShell through an Execute Process Task to extract the first two rows; if there are two rows (header + data row), the data file is not empty. You can then truncate the table and import the data.
Another approach could be:
Load the data into staging tables, insert from the staging tables into the destination tables using a SQL stored procedure, and truncate the staging tables once the data has been moved to all the destination tables. This way, before truncating a destination table, you can check whether the staging tables are empty.
I looked around and found that some others were struggling with the same issue, though none of them had a very elegant solution, nor do I.
What I ended up doing was to create a flat file connection to each file of interest and have a task count records and save to a variable. If a file isn't there, the package fails and you can stop execution at that point. There are some of these files whose actual count is interesting to me, though for the most part, I don't care. If you don't care what the counts are, you can keep recycling the same variable; this will reduce the creation of variables on your end (I needed 31). In order to preserve resources (read: reduce package execution time), I excluded all but one of the columns in each data source; it made a tremendous difference.

Refresh Data in Access 2016 from Read-Only MySQL ODBC

I am attempting to "sync" data from a read-only ODBC MySQL server to Access 2016. I need to move the data into Access so that I can more easily manipulate and create better customized reports.
I have linked the data tables between Access and MySQL, however I cannot get the data in these tables to automatically refresh. I must go into Access and hit "Refresh All".
What I'm looking to do is update all of my open tables in Access once nightly so that each morning the data used to build these reports is new. Currently if I leave these tables all evening, when I get in the next morning I must hit "Refresh-All" for Access to go retrieve the most recent data.
Any ideas?
The data in linked tables is automatically refreshed by Access when you attempt to read them. You can do that by displaying a datasheet view of the table, or via a form where the linked table is the data source. Beware: we have had problems with tables with lots of records being used as the source for drop-down lists, which locked the database.
Access only does this properly (and at speed) if either the underlying table has a unique clustered index, or you create an index in Access after linking the tables.
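If the server-side table lacks a usable key, you can add one after linking with a one-line DDL statement; Access stores it as a local pseudo-index and nothing changes on the MySQL side. A sketch, where "LinkedTable" and "ID" are placeholders:
CurrentDb.Execute "CREATE UNIQUE INDEX uIdxID ON LinkedTable (ID);"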
If you want to create a copy that you can manipulate (such as write to) and the underlying tables are read-only, then you will have to create matching unlinked tables and execute some form of copy SQL at appropriate points in your application.
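For the nightly refresh asked about above, one pattern is a delete-and-append routine run from an AutoExec macro when Windows Task Scheduler opens the database overnight. A minimal sketch, assuming a local table tblLocalData that mirrors the columns of a read-only linked table tblLinkedData (both placeholder names):
Public Sub RefreshLocalCopy()
    With CurrentDb
        ' Throw away yesterday's snapshot, then pull the current server data
        .Execute "DELETE * FROM tblLocalData;", dbFailOnError
        .Execute "INSERT INTO tblLocalData SELECT * FROM tblLinkedData;", dbFailOnError
    End With
End Sub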

SSIS read text file to temp table before reaching ADO NET Destination?

I have a text file that is uploaded to a SQL Server table. It's straightforward, but due to some inconsistencies, I need to upload this file to a #temp table or @table variable, clean it there, and then upload it to the physical table.
So essentially, I have a Flat File Source that reads the txt file with the inconsistencies and uploads it to the table in an ADO NET Destination. If all the columns in the destination table were varchar, I would be able to save it in the physical table and run some T-SQL script to clean it. But I don't want to do that.
I can also run a T-SQL script before and after to create/drop the temp table, but if there's a way to do it with a #temp table or @table variable, then great.
If this is something you do more than once, a standard ETL practice is to have a persistent all-VARCHAR staging table that the file is loaded into as the first step.
Then the data is inspected for suitability to add to the production table.
This does the following:
Ensures bad data stays out of the production tables where it can break things.
Allows the ETL process to be standardized and repeatable. You no longer have to remember how they broke the data last time; you just add it to the logic that promotes from the staging table to the production table and forget about it.
Allows the easy transition of the process to someone else, as well as the opportunity for proper source control.
I can't think of a single benefit to using a temp table/table variable, unless you are somehow unable to create a physical table.

How to run an append query from a data macro in MS Access?

I have 2 tables: one is a local table named 'Client' and the other, named 'ClientSql', is a linked table to a MySQL DB on my web server.
When I insert data into the table Client, I want it to be inserted into ClientSql too. I tried it with a data macro (After Insert), but it shows me an error saying
It's not possible in linked tables.
I have successfully created an append query, and it works, but only if I execute it manually. My question is:
Is it possible to call it from a data macro? If it is possible, can you show me how? If not, can you point me to a solution?
I'm not sure why you should be able to do it manually, but not via a macro. According to this link
http://dev.mysql.com/doc/connector-odbc/en/connector-odbc-examples-tools-with-access-linked-tables.html
you should be able to do it either way.
Another thought is to eliminate the local Access Client table and have the Access program update the MySQL table directly. However, if another program is accessing the Client table at the same time, this could become tricky due to multi-user and locking situations.
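A third option, if the data macro keeps refusing the linked table, is to run the saved append query from the form's After Insert event instead; form-level VBA has no trouble with linked tables. A minimal sketch, assuming inserts happen through a form bound to Client and a saved append query named qryAppendClientSql (a placeholder name):
Private Sub Form_AfterInsert()
    ' Push the newly saved Client row across to the linked ClientSql table
    CurrentDb.Execute "qryAppendClientSql", dbFailOnError
End Sub
The trade-off is that this only fires for inserts made through that form, whereas a data macro would catch inserts from any source.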

Migrating Table Macros to VBA with a now split Database

What I have is the Issues template from Microsoft, which is all I really need for its purpose. I have converted it so that it is no longer a web database by exporting the objects as client objects.
I want to split this database so that multiple users can use it at the same time, and also so that it performs better.
When I split the database, the data macros attached to the tables go to the back end, so when the front end needs to use them it errors.
Below is an example of the Add Comments macro that gets called when the Add Comment button is pressed on the front end.
And this is the macro embedded in the button:
Can these be converted to VBA so that they interact with the back end the way they are meant to, and if so, where would I start? I have looked for an answer, but all I find is people saying "it is fine now I have gone the VBA route" or similar, without actually showing it working.
Below is the data macro converted to VBA. It isn't 100% there yet, as I have hard-coded the UserID (this will be fixed later today), but I hope it gives a good understanding of how to convert data macros to VBA, because this was a learning experience for me.
Private Sub cmdAddaComment_Click()
    Dim db As DAO.Database, theComments As DAO.Recordset  ' qualify as DAO to avoid clashing with ADO's Recordset
    Set db = DBEngine.Workspaces(0).Databases(0)
    Set theComments = db.OpenRecordset("Comments")
    ' Append a new comment row (no MoveLast needed before AddNew,
    ' and MoveLast would error on an empty table)
    theComments.AddNew
    theComments!IssueID = Me.ID            ' tie the comment to the open issue
    theComments!CommentDate = Now
    theComments!Comment = Me.txtAddComment
    theComments!UserID = 2                 ' hard-coded for now, as noted above
    theComments.Update
    theComments.Close
    ' Clear the input box and repaint the form so the new comment shows
    Me.txtAddComment = ""
    DoCmd.RepaintObject acForm, "IssueDetail"
End Sub
Before doing this, make a backup of your database. You can do that by closing the database, locating the .accdb file on your computer, and pressing Ctrl+C then Ctrl+V to make a copy.
You can do your own split like this:
Create a new, blank database.
Import everything EXCEPT THE TABLES from the old database
In the new database, create links to the old database tables. To do that, click on External Data - Import & Link, and then click on Access. Select the Link option, then locate the old database, and select all the tables you want to link. Note that if you see any tables whose names start with "MSys", do NOT link those. Those are system tables, and Access will handle those internally.
Now delete everything EXCEPT THE TABLES from the old database.
Your new database is now "linked" to the old database's tables, but all Forms, Reports, etc are in the new database.
If you want to split an Access DB, you have to separate TABLES from all the rest.
Queries, Macros, Forms, Reports and VBA code have to stay on the frontend, and only tables on the backend.
What you are missing are the LINKS from the frontend to the backend tables.
To do that, create a copy of your DB renamed "BE.mdb" (or .accdb), then delete all objects except the tables from BE.mdb.
Now, in the original .mdb, delete every table and add a link for each deleted table to the corresponding table in BE.mdb. This can be done from the Import menu by choosing "Link table" instead of "Import table"; a VBA sketch of the linking step follows.
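With many tables, that linking step is worth scripting. A minimal sketch, assuming BE.mdb sits in the same folder as the front end (call it once per table, e.g. LinkBackEndTable "Comments"):
Public Sub LinkBackEndTable(tableName As String)
    ' Create a linked table pointing at the same-named table in BE.mdb
    DoCmd.TransferDatabase acLink, "Microsoft Access", _
        CurrentProject.Path & "\BE.mdb", acTable, tableName, tableName
End Sub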