Updating an Access Table with a CSV File Automatically

Problem Background:
I have a PowerShell script, which I can execute from my Microsoft Access form, that scans through file folders containing information on different facilities and produces a CSV that looks something like this:
SiteCode FacilityNumber DocumentType HyperlinkPath
DKFZ 10 DD1400 C:\FACILITIES DATABASE\path
DKFZ 10 FLRPLN C:\FACILITIES DATABASE\path
SMQL 17 P1 C:\FACILITIES DATABASE\path
SMQL 17 P2 C:\FACILITIES DATABASE\path
So that way every time new files are added to those folders, I can just run this script and produce an updated list of everything I have:
C:\...\Output\scanResults.csv
All I need now is to take that CSV file and update (or even overwrite) a Table that I have in an Access database, which has relationships to other tables and is used by various Queries and Forms in the database. The CSV columns are already named and formatted in the same way as the Access Table.
I've looked at and tried to replicate the following threads:
VBA procedure to import csv file into access
Access Data Project Importing CSV File In VBA
VBA Import CSV file
The closest answer I found is:
Sub Import()
    Dim conn As New ADODB.Connection
    Dim rs As New ADODB.Recordset
    Dim f As ADODB.Field

    ' Open the folder as a text database and read the file as a table
    conn.Open "DRIVER={Microsoft Text Driver (*.txt; *.csv)};DBQ=c:\temp;"
    rs.Open "SELECT * FROM [test.txt]", conn, adOpenStatic, adLockReadOnly, adCmdText

    While Not rs.EOF
        For Each f In rs.Fields
            Debug.Print f.Name & "=" & f.Value
        Next
        rs.MoveNext   ' advance to the next record (otherwise the loop never ends)
    Wend

    rs.Close
    conn.Close
End Sub
But this obviously won't write the data into the table, and I could not understand what the author was trying to say with respect to changing Select to Insert.
I've also found:
DoCmd.TransferText acImportDelim, "YourCustomSpecificationName", _
"tblImport", "C:\SomeFolder\DataFile.csv", False
Since both of these are from 2010, I wonder if there isn't a better way to accomplish this in Access 2013. And while I can do this all manually, I would like to incorporate it into the VBA code I use to tell Powershell to produce the CSV, that way I can make it and then upload it immediately.
Any help or suggestions are greatly appreciated. I'm still very green to Access, VBA, and SQL statements in general, so this has been very much a "learning as I go" process.

I prefer to use SQL clauses and queries to import such data. The details depend on your exact configuration, but it tends to look something like this:
SELECT *
INTO MyTable
FROM [Text;FMT=CSVDelimited;HDR=No;DATABASE=C:\...\Output].[scanResults#csv]
Or append the information to the table instead:
INSERT INTO MyTable
(SiteCode, FacilityNumber, DocumentType, HyperlinkPath)
SELECT *
FROM [Text;FMT=CSVDelimited;HDR=No;DATABASE=C:\...\Output].[scanResults#csv]
This lets you run checks before importing (using a WHERE clause), import only specific values, and customize a great deal without relying on external files.
DATABASE= is followed by your folder name (wrap it in {} if it contains characters that need escaping), and that is followed by your file name with the . replaced by #.
You can execute it either by saving it as a query or by using it as a string in VBA or a macro. Note that I rarely recommend macros, but you can run them from a scheduled task and have Access close itself after importing.
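For example, a minimal sketch of running the append statement as a string from VBA (substitute your actual output folder for the truncated path):
Dim strSql As String
strSql = "INSERT INTO MyTable (SiteCode, FacilityNumber, DocumentType, HyperlinkPath) " & _
         "SELECT * FROM [Text;FMT=CSVDelimited;HDR=No;DATABASE=C:\...\Output].[scanResults#csv]"
CurrentDb.Execute strSql, dbFailOnError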
To back up and restore the relations on a table before and after replacing it, you can use the following functions:
Public Function DeleteRelationsGiveBackup(strTablename As String) As Collection
    Dim ReturnCollection As Collection
    Set ReturnCollection = New Collection
    Dim i As Integer
    Dim o As Integer
    Do While i <= (CurrentDb.Relations.Count - 1)
        Select Case strTablename
            Case Is = CurrentDb.Relations(i).Table
                ReturnCollection.Add DuplicateRelation(CurrentDb.Relations(i))
                o = o + 1
                CurrentDb.Relations.Delete CurrentDb.Relations(i).NAME
            Case Is = CurrentDb.Relations(i).ForeignTable
                ReturnCollection.Add DuplicateRelation(CurrentDb.Relations(i))
                o = o + 1
                CurrentDb.Relations.Delete CurrentDb.Relations(i).NAME
            Case Else
                i = i + 1
        End Select
    Loop
    Set DeleteRelationsGiveBackup = ReturnCollection
End Function
Public Sub RestoreRelationBackup(collRelationBackup As Collection)
    Dim relBackup As Variant
    If collRelationBackup.Count = 0 Then Exit Sub
    For Each relBackup In collRelationBackup
        CurrentDb.Relations.Append relBackup
    Next relBackup
End Sub
Public Function DuplicateRelation(SourceRelation As Relation) As Relation
    Set DuplicateRelation = CurrentDb.CreateRelation(SourceRelation.NAME, SourceRelation.Table, SourceRelation.ForeignTable)
    DuplicateRelation.Attributes = SourceRelation.Attributes
    Dim i As Integer
    Dim fldLoop As Field
    Do While i < SourceRelation.Fields.Count
        Set fldLoop = DuplicateRelation.CreateField(SourceRelation.Fields(i).NAME)
        fldLoop.ForeignName = SourceRelation.Fields(i).ForeignName
        DuplicateRelation.Fields.Append fldLoop
        i = i + 1
    Loop
End Function
And then, when importing:
Dim colRelBackup As Collection
Set colRelBackup = DeleteRelationsGiveBackup("MyTable")
'Delete MyTable
'Import new version
RestoreRelationBackup colRelBackup
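The two commented steps could be filled in along these lines (a sketch that reuses the SELECT ... INTO statement from above; DoCmd.DeleteObject assumes the table already exists, and the truncated path stands in for your real output folder):
DoCmd.DeleteObject acTable, "MyTable"
CurrentDb.Execute "SELECT * INTO MyTable " & _
    "FROM [Text;FMT=CSVDelimited;HDR=No;DATABASE=C:\...\Output].[scanResults#csv]", dbFailOnError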
(Note that the relation backup code above is quite long, was developed for a project several years ago, and has not been extensively tested. If a field name or type is not exactly the same as before the import, restoring the backup may fail and the relations will be permanently lost.)

Some high-level architecture advice: replacing data versus replacing the table.
It is easier to replace the data - the incoming data must then have exactly the same structure as the existing table (i.e. the same field names and no new fields):
just fire a Delete query at the existing table to clear out all of its records,
then fire an Append query at the linked CSV file to write all of those records into the existing table.
Very simple, really.
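For example, a sketch assuming the CSV is attached as a linked table named tblScanResults and the destination table is MyTable:
CurrentDb.Execute "DELETE FROM MyTable", dbFailOnError
CurrentDb.Execute "INSERT INTO MyTable (SiteCode, FacilityNumber, DocumentType, HyperlinkPath) " & _
    "SELECT SiteCode, FacilityNumber, DocumentType, HyperlinkPath FROM tblScanResults", dbFailOnError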
You can replace the table itself if you must - and you are already down this path. You can delete those table relationships entirely. The table relationship feature is useful but not mandatory; you can create relationships at the query level as an alternative. Essentially, table relationships just auto-create the query-level relationships. If you delete the table relationships, you simply have to create the relationships at the query level manually - they don't appear automatically. Note, however, that if you are relying on cascade deletes or referential integrity, removing the table relationships will undo that - so check those points.
Deleting table relationships will not break any existing queries: the join lines already saved in those queries remain intact.
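A query-level relationship is simply a join written into the query itself, for example (the second table and its field names are illustrative):
SELECT MyTable.SiteCode, MyTable.DocumentType, Facilities.FacilityName
FROM MyTable INNER JOIN Facilities
    ON MyTable.FacilityNumber = Facilities.FacilityNumber;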

Related

MS Access delete then append tabledef breaks querydef

Following this question thread, I was able to successfully code the change suggested in "changing the sourcetable of a linked table in access 2007 with C#". However, it appears this customer has queries coded with relationships defined at the query level, and the delete/append process breaks those relationships. Does anyone have any idea how to preserve the relationships? And why is it that the tabledef's SourceTableName can't be updated?
Code snip:
Option Compare Database

Sub test()
    Dim tdf As TableDef
    Dim db As Database

    Set db = CurrentDb
    Open "out.txt" For Output As #1
    For Each tdf In db.TableDefs
        If tdf.Connect <> vbNullString Then
            Print #1, tdf.Name; " -- "; tdf.SourceTableName; " -- "; tdf.Connect
            Select Case tdf.SourceTableName
                Case "CTITLU.txt"
                    'tdf.SourceTableName = "dbo.GRANTSADJS"
                    'tdf.Connect = "ODBC;DRIVER=SQL Server;SERVER=DCFTDBCL01-L01\EDS_DEV;DATABASE=GRANTSDB2;UID=grants_reader;PWD=xxxxx;TABLE=DBO.GRANTSTABL"
                    tdf.RefreshLink
            End Select
        End If
    Next
    Close #1   ' close the output file
End Sub
When I run this with just the tdf.Connect line uncommented, it errors on the tdf.RefreshLink call with "Run-time error '3011': The Microsoft Access database engine could not find the object 'objectname'...". I'm trying to update a text-linked table to the equivalent SQL Server based linked table. The object name does have spaces and hyphens in it, but it is correctly showing the name in the error message. For whatever reason, the previous developer shipped dumps of the tables to a file system instead of linking the tables directly. This is a small DB with very light transactional activity, so there is very little chance this will cause any issues. When the tdf.SourceTableName line is uncommented, it throws "Run-time error '3268': Cannot set this property once the object is part of a collection."
I followed other threads indicating this issue (noted above), and was successful using the tdf.delete / tdf.append calls to duplicate the tabledef with new source tablename and connection info. However, the dependent query's relationship definitions have disappeared and the query is unusable without redefining all of the links.
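For reference, the delete/append approach was roughly as follows (a sketch meant to replace the commented lines inside the loop above; the SourceTableName and Connect values stand in for the real ones):
Dim tdfNew As DAO.TableDef
Dim strName As String

strName = tdf.Name                            ' keep the existing linked table's name
Set tdfNew = db.CreateTableDef(strName)
tdfNew.SourceTableName = "dbo.GRANTSADJS"     ' properties can be set before Append
tdfNew.Connect = "ODBC;DRIVER=SQL Server;SERVER=DCFTDBCL01-L01\EDS_DEV;DATABASE=GRANTSDB2;UID=grants_reader;PWD=xxxxx"
db.TableDefs.Delete strName                   ' remove the old text-linked definition
db.TableDefs.Append tdfNew                    ' append the new ODBC-linked definition
Note that deleting and appending inside the For Each loop can upset the enumeration, so in practice it may be safer to collect the table names first and relink them afterwards.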
C Perkins, thanks, that was it. There was a slight difference in the table definition such that when using delete/append, it 'broke' the relationships (yes joins) in the related query. Using a DB view to fix that, it worked just fine. However it still 'moves' the query from its former place as being related to the original table object. Our customers will at least have their current data and not a weekly snapshot. Thanks again.

Import previously exported Access SQL queries from text files

I used the answer from this post to export my queries to a text file so I could do a find/replace exercise:
Using VBA to export all ms access sql queries to text files
I inherited a database that has object names, banker1, banker2 etc., hard coded and I had to create extras. I exported all of my queries and replaced banker1 with the new names. So far so good.
Is it possible to reverse this process from the single text file generated and load the queries back into Access?
My previous method involved exporting the queries to single text files using Application.SaveAsText, then looping through and doing my find/replace. The issue I encountered using this method is that the file is "formatted", possibly fixed width but not sure, such that some of the names were split across lines and therefore weren't detected by the find/replace. Loading them back in using Application.LoadFromText worked perfectly except I still had to search through queries to find the names that hadn't changed.
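For reference, that export/import loop looked roughly like this (a sketch; the folder path and the re-imported query name are illustrative):
Dim qdf As DAO.QueryDef
For Each qdf In CurrentDb.QueryDefs
    Application.SaveAsText acQuery, qdf.Name, "C:\Temp\" & qdf.Name & ".txt"
Next qdf

' ... find/replace in the exported text files ...

Application.LoadFromText acQuery, "BNK31-AddChargebacks", "C:\Temp\BNK31-AddChargebacks.txt"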
Edit: Sample of queries requested.
BNK30-AddChargebacks
INSERT INTO BNK30EntryTable ( Entry )
SELECT BNK30SelectChargebacks.Entry
FROM BNK30SelectChargebacks
WHERE (((BNK30SelectChargebacks.Amount)<>0));
BNK30-AddCredit
INSERT INTO BNK30EntryTable ( Entry ) SELECT
BNK30EntryQuery.Credit FROM BNK30EntryQuery WHERE
(((BNK30EntryQuery.Amt)<>0));
In the above I would be doing a find/replace of BNK30 with BNK31 etc.
Edit 2:
Operation =3
Name ="BNK01SavedReserves"
Option =0
Where ="(((BNK01Select.Reference) Is Null Or (BNK01Select.Reference)=[forms]![BNK01Nav]!"
"[txtReference]) AND ((BNK01Select.Date) Is Null Or (BNK01Select.Date)=[forms]![B"
"NK01Form]![StartedTime]))"
Begin InputTables
Name ="BNK01Select"
End
Begin OutputColumns
Name ="AssignedTo"
The above is from my original method which works except where the BNK01 is split; just above the Begin InputTables line. Hence trying to switch to exporting the SQL as one big file.
You can use a VBA procedure to modify both your query names and their SQL as needed. That approach should be much simpler than dumping the query definitions to a text file, doing search and replace in the text file, and then (somehow?) modifying your queries based on the text file changes.
For example, using the procedure below, you can do a "find/replace of BNK30 with BNK31" like this ...
ModifyQueries "BNK30", "BNK31"
However, as written, the procedure does not change the queries. It only shows you the changes it would make if you enable the .Name = varNewName and .SQL = strNewSql lines. Please review the output in the Immediate window before enabling those lines.
Public Sub ModifyQueries(ByVal pFind As String, ByVal pReplace As String)
    Dim db As DAO.Database
    Dim qdf As DAO.QueryDef
    Dim strNewSql As String
    Dim varNewName As Variant

    Set db = CurrentDb
    For Each qdf In db.QueryDefs
        With qdf
            varNewName = Null
            strNewSql = vbNullString
            If .Name Like "*" & pFind & "*" Then
                varNewName = Replace(.Name, pFind, pReplace)
                Debug.Print "change " & .Name & " to " & varNewName
                '.Name = varNewName
            End If
            If .SQL Like "*" & pFind & "*" Then
                strNewSql = Replace(.SQL, pFind, pReplace)
                Debug.Print Nz(varNewName, .Name) & " SQL: "
                Debug.Print strNewSql
                '.SQL = strNewSql
            End If
        End With
    Next
End Sub
Beware that code has not been thoroughly tested. It is intended only as a starting point; you must test and refine it.
You should add error handling. The procedure will throw an error if/when it attempts to name a query with a name which matches an existing query or table.
Note, I wrote that procedure to rename queries. If you prefer to create new queries instead, revise the code to do this ...
db.CreateQueryDef varNewName, strNewSql
Finally make sure to backup your database before running the "enabled" version of that code. I doubt you need that warning, Nathan, but I cringe at the thought of anyone else inadvertently hosing their queries.

Comma separating text in field in MS Access 2016 Query

I have a CSV file (updated three times a week; I have no control over its format, so I cannot normalise it) that I have linked into an MS Access 2016 database via ODBC. (I have chosen MS Access to refresh my skills with it; otherwise I would do this in SQL.) I have tried various permutations when setting up the ODBC link for the linked table, but none of them gives the optimum structure for the other fields.
The CSV file looks like this:
Fecha de Sorteo,Numero de sorteo,Numero de Juego,Nombre,Valores Principales,Comodines,DRAWNAME,Ganadores de Premio Mayor,Premio Mayor Garantizado
"8/25/2002 12:00:00 AM","1714","1","main","31,34,26,1,2,28","16","Loto","0",
I am trying to create a query to comma separate Field 3 into its 6 component parts. I have seen many examples that separate either 2 or 3 components (but never more than that) using InStr and the Mid functions such as seen here.
Do I have to create multiple expressions to separate this field into its components, or is there an alternative solution?
I would suggest writing the data to a local table using the Split function. It will allow you to split that field into an array. Then use VBA to write the whole record into Access.
So, something like:
Dim db As Database
Dim rec As Recordset
Dim rec2 As Recordset
Dim strArray As Variant

Set db = CurrentDb
Set rec = db.OpenRecordset("SELECT * FROM MyLinkedTable")
Set rec2 = db.OpenRecordset("SELECT * FROM MyLocalTable")

Do While rec.EOF = False
    rec2.AddNew
    rec2("Field1") = rec("Field1")
    rec2("Field2") = rec("Field2")
    strArray = Split(rec("Field3"), ",")
    rec2("Part1") = strArray(0)
    rec2("Part2") = strArray(1)
    'etc...
    rec2.Update
    rec.MoveNext
Loop
The above is "aircode" and completely untested, but it's probably pretty accurate and should get you started.
So your file seems to follow the CSV spec, where values can be enclosed in double quotes ...
You should then be able to open it directly as an ADODB recordset(*). The column corresponding to the "Valores Principales" data will contain a plain text string, being your 6 values separated by commas.
(*) I don't know if it works with DAO recordsets ... For an example, check this link (it's Excel but it's the same logic): return csv file as recordset
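A rough sketch of that idea (the folder, file name, and provider are assumptions about your setup; the column name comes from the sample header above):
Dim conn As Object, rs As Object
Dim parts As Variant

Set conn = CreateObject("ADODB.Connection")
conn.Open "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\Path\To\CsvFolder;" & _
          "Extended Properties=""Text;HDR=Yes;FMT=Delimited"""
Set rs = conn.Execute("SELECT * FROM [lottery.csv]")

Do While Not rs.EOF
    parts = Split(rs.Fields("Valores Principales").Value, ",")   ' the 6 component values
    ' ... write parts(0) through parts(5) to the local table ...
    rs.MoveNext
Loop
rs.Close
conn.Close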

Copy Access database query into Excel spreadsheet

I have an Access database and an Excel workbook.
What I need to do is query the database and paste the query into a worksheet.
The issue is runtime. I have stepped through the program and everything works, but it works extremely slowly - we're talking up to 30-second run times per query, although most of this run time is coming from the CopyFromRecordset call.
The database has over 800k rows in the table I'm querying.
Currently at my company there are people every morning who manually query the tables and copy and paste them into excel. I'm trying to remove this process.
Here is what I have:
Sub new1()
    Dim objAdoCon As Object
    Dim objRcdSet As Object
    ' gets query information '
    Dim DataArr()
    Sheets("Data2").Activate
    DataArr = Range("A1:B40")
    For i = 1 To UBound(DataArr)
        job = DataArr(i, 1)
        dest = DataArr(i, 2)
        If InStr(dest, "HT") > 0 Then
            OpCode = "3863"
        ElseIf InStr(dest, "HIP") > 0 Then
            OpCode = "35DM"
        End If
        strQry = "SELECT * from [BATCHNO] WHERE ([BATCHNO].[Job]='" & job & "') AND ([BATCHNO].[OperationCode] = " & "'" & OpCode & "')"
        Set objAdoCon = CreateObject("ADODB.Connection")
        Set objRcdSet = CreateObject("ADODB.Recordset")
        objAdoCon.Open "Provider = Microsoft.Jet.oledb.4.0;Data Source = C:\Users\v-adamsje\Desktop\HTmaster.mdb"
        'long run time
        objRcdSet.Open strQry, objAdoCon
        'very long run time
        ThisWorkbook.Worksheets(dest).Range("A2").CopyFromRecordset objRcdSet
        Set objAdoCon = Nothing
        Set objRcdSet = Nothing
    Next i
End Sub
Any help is appreciated. I am new to VBA and Access so this could be an easy fix. Thanks
Excel is very good at getting data for itself, without using VBA.
On the DATA ribbon:
create a connection to a table or view of data somewhere (e.g. an mdb or SQL Server),
then use the "existing connections" button to add data from your connected table to a worksheet table (ListObject).
You can even set the workbook (i.e. the connection) to refresh the data every 12 hours.
Repeat for all the tables/views you need to grab data for. You can even specify SQL as part of the connection.
Let Excel look after itself.
I just grabbed a 250,000-row table from a "nearby" disk in 2 seconds.
It looks after itself and there is no code to maintain!
I don't see how the CopyFromRecordset can be improved. You could copy the records programmatically (in VBA) record by record, but that will probably be slower than CopyFromRecordset.
You can move the CreateObject statements out of the loop. With the connection and recordset already created, this could be faster:
Set objAdoCon = CreateObject("ADODB.Connection")
Set objRcdSet = CreateObject("ADODB.Recordset")
For i = 1 To UBound(DataArr)
...
next i
Set objRcdSet = Nothing
Set objAdoCon = Nothing
You could also try ADO instead of DAO. ADO seems to perform faster on large record sets.
But also the server could be an issue, for example, are there indexes on Job and OperationCode? If not, then the slowness could be the server selecting the records rather than Excel placing them in the worksheet.
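If those indexes are missing, one could be created once against the mdb, for example (the index name is illustrative; this uses the same ADO connection object as in the question):
objAdoCon.Execute "CREATE INDEX idxJobOpCode ON BATCHNO (Job, OperationCode)"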
Whelp, I never found out why the CopyFromRecordset runtime was absurd, but I solved my problem by pulling the whole table into Excel and then into an array, looping through that, and putting the rows into their respective sheets. From a 30-minute runtime to under 1 minute.
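For anyone curious, the single-pull approach described above looks roughly like this (a sketch, not the exact code used; the sheet-routing logic is only hinted at in comments):
Dim conn As Object, rs As Object
Dim data As Variant
Dim r As Long

Set conn = CreateObject("ADODB.Connection")
conn.Open "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\Users\v-adamsje\Desktop\HTmaster.mdb"

' One round trip for the whole table instead of one query per row of Data2
Set rs = conn.Execute("SELECT * FROM [BATCHNO]")
data = rs.GetRows()          ' 2-D array: data(fieldIndex, rowIndex)
rs.Close
conn.Close

For r = 0 To UBound(data, 2)
    ' Pick the destination sheet from the Job / OperationCode columns,
    ' then write the row out to that sheet.
Next r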

Using VBA to create a dynamic table in Access 2010

I have an Access 2010 database with a VBA module that does some statistical analysis on the data. The results of the statistical analysis cannot be generated by SQL, but they can be presented in tabular format. Right now, I can run the VBA function in the Immediate window and it will loop over the results and write them to the terminal using Debug.Print().
I'd like to have the results of this function available to the rest of Access so that I can create queries and reports from the table of results. So what I'm looking for is how to turn my function into a "dynamic table" -- a table that doesn't actually store data, but stores the VBA function that runs and fills in the table data dynamically whenever that table is used.
I've spent quite a bit of time looking at creating tables dynamically via MAKE TABLE queries or using DDL in VBA, but all of these examples use SQL to create the new table from existing records. I can't use SQL to generate the results, so I'm not really sure how to coerce the results into an object that Access will recognize. Part of the problem is that I'm just not familiar enough with Access VBA terminology to know what I should be looking for.
My declaration is just "Public Function GenerateSchedule". It has three code blocks: the first pulls the data I need from the database using a query and processes the RecordSet into an array. The second block performs the statistical analysis on the array, and the third prints the results of the analysis to the terminal. I'd like to replace the third block with a block that provides the results as a table that is usable by the rest of Access.
I use the following code if I don't want to use DDL and SQL queries ...
Dim dbs As DAO.Database
Dim tbl As DAO.TableDef
Dim fld As DAO.Field

Set dbs = CurrentDb
Set tbl = dbs.CreateTableDef("tbl_Name")
Set fld = tbl.CreateField("Field1", dbText, 255)
tbl.Fields.Append fld
Set fld = tbl.CreateField("Field2", dbText, 255)
tbl.Fields.Append fld
Set fld = tbl.CreateField("Field3", dbInteger)
tbl.Fields.Append fld
Set fld = tbl.CreateField("Field4", dbCurrency)
tbl.Fields.Append fld
dbs.TableDefs.Append tbl
dbs.TableDefs.Refresh
and if you want to add a record you could do
Dim dbs As DAO.Database
Dim rs As DAO.Recordset

Set dbs = CurrentDb
Set rs = dbs.OpenRecordset("tbl_Name")
rs.AddNew
rs("Field1").Value = "TEST "
rs("Field2").Value = "TEXT"
rs("Field3").Value = 1991
rs("Field4").Value = 19.99
rs.Update
rs.Close
I am not sure why you need to put the retrieved data into an array; it seems an extra step. If you can generate the statistics from the array, the same thing should be possible in a query: create another query, using the results query as a record source, and make your calculations accordingly for the fields that you want created.
If we saw what you were trying to do, I think it could be made simpler.
This sounds like a disconnected recordset, or maybe "synthetic recordset," which is something ADO can do. I don't use ADO, so can't provide you with instruction, but maybe that will provide you with what you need.
Alternatively, depending on how you want to display it to the users, you might be able to do it native in Access. For instance, if presenting it on a form or report in a listbox is sufficient, then you could write a custom callback function and bind it to the listbox.
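A rough outline of the list-filling callback idea, for the record (the function name and the array handling are illustrative; it assumes GenerateSchedule can return its results as a 2-D Variant array):
Public Function ListSchedule(ctl As Control, id As Variant, row As Variant, _
                             col As Variant, code As Variant) As Variant
    Static results As Variant            ' 2-D array of statistics (assumed output of GenerateSchedule)
    Select Case code
        Case acLBInitialize
            results = GenerateSchedule() ' assumption: the function returns an array
            ListSchedule = True
        Case acLBOpen
            ListSchedule = Timer         ' unique ID for this instance
        Case acLBGetRowCount
            ListSchedule = UBound(results, 1) + 1
        Case acLBGetColumnCount
            ListSchedule = UBound(results, 2) + 1
        Case acLBGetValue
            ListSchedule = results(row, col)
        Case acLBEnd
            ' nothing to clean up
    End Select
End Function
You would then set the listbox's Row Source Type property to the function name (ListSchedule) so Access calls it to fill the list.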