I am using the following code, based on previous posts and answers by Remou and Anthony Jones.
Dim db: db = "C:\Dokumente und Einstellungen\hom\Anwendungsdaten\BayWotch4\baywotch.db5"
Dim exportDir: exportDir = "C:\Dokumente und Einstellungen\hom\Desktop"
Dim exportFile: exportFile = NewFileName(exportDir)
Dim cn: Set cn = CreateObject("ADODB.Connection")
cn.Open _
"Provider = Microsoft.Jet.OLEDB.4.0; " & _
"Data Source =" & db
cn.Execute "SELECT * INTO [text;HDR=No;Database=" & exportDir & _
";CharacterSet=65001]." & exportFile & " FROM tblAuction"
'Export file
'Support functions
Function NewFileName(ExportPath)
Dim NewFileTemp
'Build a date-stamped file name, e.g. CSV2009427.csv
NewFileTemp = "CSV" & Year(Date) _
& Month(Date) & Day(Date) & ".csv"
NewFileName = NewFileTemp
End Function
The problem I am having is that when I export the file, the CSV file contains headers, despite HDR being set to No. It has the names of my columns in quotes before the actual data, which causes problems when attempting to import.
My second problem is that special characters do not seem to be escaped.
I am loading the data into MySQL with:
LOAD DATA LOCAL INFILE 'file' INTO TABLE MYTABLE FIELDS TERMINATED BY ';' ENCLOSED BY '"' ESCAPED BY '\\'
I have also tried without the ESCAPED BY clause.
The problem is that one of the fields contains HTML data, which means quotes, slashes, etc. This causes the data to be imported incorrectly, with date fields being inserted into the username fields and such. How can I escape these characters to stop this from happening, or otherwise import the data correctly?
I am using a schema.ini like the following:
[CSV2009427.csv]
ColNameHeader=No
CharacterSet=65001
Format=Delimited(;)
Col1=article_no Char Width 19
And column headers are still exported. Is it possible to do this without requiring schema.ini, i.e. to use the script in a portable way, where a schema.ini may not always exist?
The problem I am having is that when I export the file, the CSV file contains headers, despite HDR being set to No.
I think you need to include ColNameHeader=False in the schema.ini file.
Example:
C:\schema.ini
[blah.csv]
ColNameHeader=False
CharacterSet=1252
Format=CSVDelimited
Col1=pence_amount Integer
SQL code:
SELECT * INTO [text;Database=C:\].blah#csv FROM Coins;
Note that the schema.ini file is saved in the same directory as the one specified by Database in the SQL statement.
Result: no headers.
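As for doing this without a pre-existing schema.ini: the Text driver always looks for one in the Database directory, but your script can simply write it on the fly before running the export, which keeps the script portable. A minimal sketch, reusing exportDir and exportFile from the question's script (the column definition is illustrative):

'Sketch: generate schema.ini at run time, before cn.Execute runs
Dim fso, ini
Set fso = CreateObject("Scripting.FileSystemObject")
Set ini = fso.CreateTextFile(exportDir & "\schema.ini", True) 'overwrite if present
ini.WriteLine "[" & exportFile & "]"
ini.WriteLine "ColNameHeader=False"
ini.WriteLine "CharacterSet=65001"
ini.WriteLine "Format=Delimited(;)"
ini.WriteLine "Col1=article_no Char Width 19" 'illustrative column definition
ini.Close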
I've got about 100 CSV files that I'm trying to import into Access, and then rename the tables based on the file names.
Here is the code I've found, but "TableName" should be my file name. However, I can't get it to work, as I'm new to scripting.
Function Import_multi_csv()
Dim fs, fldr, fls, fl
Set fs = CreateObject("Scripting.FileSystemObject")
Set fldr = fs.GetFolder("D:\Files\")
Set fls = fldr.files
For Each fl In fls
If Right(fl.Name, 4) = ".csv" Then
DoCmd.TransferText acImportDelim, , "TableName", "D:Files\" & fl.Name, False
End If
Next fl
End Function
Also, I have three columns in my files and I want the third column to be imported as a double.
Any help will be appreciated.
It should be this:
DoCmd.TransferText acImportDelim, , "[" & fs.GetBaseName(fl.Name) & "]", "D:Files\" & fl.Name, False
As for your second question, you could create, save, and use an import specification, for example:
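A sketch of what that could look like; "CSV_Import_Spec" is a placeholder for whatever name you give the saved specification, which is where you mark the third column as Double:

'Sketch: a saved import specification applies the column types for you.
'"CSV_Import_Spec" is a placeholder name, created via the import wizard.
DoCmd.TransferText acImportDelim, "CSV_Import_Spec", _
    "[" & fs.GetBaseName(fl.Name) & "]", "D:\Files\" & fl.Name, False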
I've been assigned the task of importing about 180 CSV files into an Access 2007 database. These files have been put together over the years and will be put into one of three folders. I have not set up any data checks or restrictions on these tables (such as primary keys, validation rules, or relationships); that will be done once the data has been imported. The data contained in these files comes from a survey which has changed over the years, and this change has caused the fields to change: their order has changed, or sometimes a field is there and sometimes it is not. I do have a list of all the possible fields, though, and know which table each CSV file should be imported to, and that all these fields can be text.
Here is my problem: not knowing the order of the columns, or whether a column will exist, is it possible to run a function to import these text files into their relative tables by mapping each column in the text file to its associated column in the Access table?
Each text file has headers, which is useful to see what they actually are, but there is no text qualifier, which can be very annoying when dealing with ID codes consisting entirely of numbers. Below is what I've tried so far. It gets the file location from a function elsewhere, adds each file name in that location to a collection, then for each file in that collection it tries to import it into its relative table.
'Get file names from the folder and store them in a collection
temp = Dir(location & "\*.*")
Do While temp <> ""
fileNames.Add temp
temp = Dir
Loop
'Go through each file in the collection and process it as needed
For Each temp2 In fileNames
If (temp2 Like "trip*") Then 'Import trip files
'Gets the data from a query:
'DoCmd.RunSQL "SELECT * FROM [Text;FMT=Delimited;HDR=YES;IMEX=2;CharacterSet=437;DATABASE=" & location & "].[" & temp2 & "] As csv;"
DoCmd.TransferText acImportDelim, "Trips_Import", "tbl_Trips", location & "\" & temp2, -1
End If
If (temp2 Like "catch*") Then 'Import catch files
DoCmd.TransferText acImportDelim, "Catch_Import", "tbl_Catch", location & "\" & temp2, -1
End If
If (temp2 Like "size*") Then 'Import size files
DoCmd.TransferText acImportDelim, "Size_Import", "tbl_Size", location & "\" & temp2, -1
End If
Next temp2
You can create a SELECT * query for each CSV file and open the query as a recordset. Open another recordset for the destination table.
Then for each row in the CSV recordset, add a row to the destination recordset, loop through the CSV Fields collection, and add each CSV field value to the destination field with the same name.
This approach is independent of the order in which the fields appear in the CSV file. It also doesn't matter if the CSV file includes only a subset of the fields present in the destination table. As long as each CSV field also exists in the table, it should work (assuming compatible data types, the value satisfies validation rules/constraints, etc.).
Dim db As DAO.Database
Dim fld As DAO.Field
Dim rsDest As DAO.Recordset
Dim rsSrc As DAO.Recordset
Dim strSelect As String
Dim strTableName As String
Set db = CurrentDb
'Go through each file in the collection and process it as needed
For Each temp2 In fileNames
Select Case Left(temp2, 4)
Case "trip"
strTableName = "tbl_Trips"
Case "catc"
strTableName = "tbl_Catch"
Case "size"
strTableName = "tbl_Size"
Case Else
' what should happen here?
' this will trigger an error at OpenRecordset(strTableName) ...
strTableName = vbNullString
' figure out a better alternative
End Select
strSelect = "SELECT csv.* FROM " & _
"[Text;FMT=Delimited;HDR=YES;IMEX=2;CharacterSet=437;DATABASE=" & _
Location & "].[" & temp2 & "] As csv;"
Debug.Print strSelect
Set rsSrc = db.OpenRecordset(strSelect, dbOpenSnapshot)
Set rsDest = db.OpenRecordset(strTableName, dbOpenTable, dbAppendOnly)
With rsSrc
Do While Not .EOF
rsDest.AddNew
For Each fld In .Fields
rsDest.Fields(fld.Name).value = fld.value
Next
rsDest.Update
.MoveNext
Loop
.Close
End With
rsDest.Close
Next temp2
Note: This is a RBAR (row by agonizing row) approach, so the performance will be less than stellar. However, I presume you will do this only once, so the performance hit will not be a deal-breaker. If you need a faster set-based approach instead, you can build and execute an "append query" for each CSV file, as sketched below. To do that, you would first need to get the CSV field names, and then build the appropriate INSERT INTO statement.
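For illustration, a rough sketch of that set-based variant, reusing the variables from the loop above (rsSrc is opened only to read the CSV field names, and the sketch assumes the CSV header names match the destination column names):

'Sketch: build and execute one append query per CSV file.
'Assumes the CSV header names match the destination column names.
Dim strColumns As String
Dim strInsert As String
strColumns = vbNullString
For Each fld In rsSrc.Fields
    If Len(strColumns) > 0 Then strColumns = strColumns & ", "
    strColumns = strColumns & "[" & fld.Name & "]"
Next
strInsert = "INSERT INTO [" & strTableName & "] (" & strColumns & ") " & _
    "SELECT " & strColumns & " FROM " & _
    "[Text;FMT=Delimited;HDR=YES;IMEX=2;CharacterSet=437;DATABASE=" & _
    Location & "].[" & temp2 & "] As csv;"
db.Execute strInsert, dbFailOnError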
I need a procedure in VBA to import data into Access from a CSV Excel file, skipping some records such as a header and footer. For example, I have a table in a CSV file which contains some sentences that do not belong to the table data:
A1 this is some sentence title.......
A2 title
A3.......
A7 DATA DATA DATA DATA DATA
A8 rows DATA DATA DATA DATA DATA
......
....
A256 DATA DATA DATA DATA
A257 this is some sentence
My Access table should contain only the rows from A7 to A256. Does anyone know a procedure, or whatever works in VBA, that solves my problem?
Thanks a lot.
Edit
The easiest way to do it is to link the CSV file into the Access database as a table. Then you can work on this table as if it were an ordinary Access table, for instance by creating an appropriate query based on this table that returns exactly what you want.
You can link the table either manually or with VBA, like this:
DoCmd.TransferText TransferType:=acLinkDelim, TableName:="tblImport", _
FileName:="C:\MyData.csv", HasFieldNames:=true
Update
Dim db As DAO.Database
' Re-link the CSV Table
Set db = CurrentDb
On Error Resume Next: db.TableDefs.Delete "tblImport": On Error GoTo 0
db.TableDefs.Refresh
DoCmd.TransferText TransferType:=acLinkDelim, TableName:="tblImport", _
FileName:="C:\MyData.csv", HasFieldNames:=true
db.TableDefs.Refresh
' Perform the import
db.Execute "INSERT INTO someTable SELECT col1, col2, ... FROM tblImport " _
& "WHERE NOT F1 IN ('A1', 'A2', 'A3')"
db.Close: Set db = Nothing
Your file seems quite small (297 lines), so you can read and write it quite quickly. Note that you refer to Excel CSV, which does not exist, and you show space-delimited data in your example. Furthermore, Access is limited to 255 columns and a CSV is not, so there is no guarantee this will work.
Sub StripHeaderAndFooter()
Dim fs As Object ''FileSystemObject
Dim tsIn As Object, tsOut As Object ''TextStream
Dim sFileIn As String, sFileOut As String
Dim sTmp As String, i As Long
Dim aryFile As Variant
sFileIn = "z:\docs\FileName.csv"
sFileOut = "z:\docs\FileOut.csv"
Set fs = CreateObject("Scripting.FileSystemObject")
Set tsIn = fs.OpenTextFile(sFileIn, 1) ''ForReading
sTmp = tsIn.ReadAll
Set tsOut = fs.CreateTextFile(sFileOut, True) ''Overwrite
aryFile = Split(sTmp, vbCrLf)
''Skip the leading lines and the last line (Split returns a
''0-based array; adjust the start index to suit your header)
For i = 3 To UBound(aryFile) - 1
tsOut.WriteLine aryFile(i)
Next
tsOut.Close
DoCmd.TransferText acImportDelim, , "NewCSV", sFileOut, False
End Sub
Edit re various comments
It is possible to import a text file manually into MS Access, and this will allow you to choose your own cell delimiters and text delimiters. You need to choose External Data from the menu, select your file, and step through the wizard.
About importing and linking data and database objects -- Applies to: Microsoft Office Access 2003
Introduction to importing and exporting data -- Applies to: Microsoft Access 2010
Once you get the import working using the wizards, you can save an import specification and use it for your next DoCmd.TransferText, as outlined by @Olivier Jacot-Descombes. This will allow you to have non-standard delimiters, such as semicolons, and single-quoted text.
I want to create a classic ASP (VBScript) function that replaces all 'returns' that occur between double quotes.
The input string is CSV-like:
ID;Text;Number
1;some text;20
2;"some text with unwanted return
";30
3;some text again;40
I want to split the string on Chr(13) (returns) to create single rows in an array. This works well, except for the unwanted Chr(13) contained in the text of ID 2.
I hope someone can help.
Fundamentally, this is going to be difficult to do, as you won't be able to tell whether a carriage return is valid or not. Clearly the ones after 20 and 30 are valid.
The approach I would take would be to scan through each line in the file and count the field separators. If there are fewer than two semicolons (three fields means two separators per complete row), append the next line and use the concatenated string, as in the sketch below. (This of course assumes your CSV structure is consistent and fixed.)
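In VBScript that could look something like this (a sketch assuming semicolon-delimited rows with exactly three fields, so two separators per complete row):

'Sketch: re-join rows that were broken by a stray carriage return.
'Assumes three fields per row, i.e. two semicolons per complete row.
Function JoinBrokenRows(rawText)
    Dim lines, i, buffer, result
    lines = Split(rawText, vbCrLf)
    buffer = ""
    result = ""
    For i = 0 To UBound(lines)
        buffer = buffer & lines(i)
        'A complete row contains at least two field separators
        If Len(buffer) - Len(Replace(buffer, ";", "")) >= 2 Then
            result = result & buffer & vbCrLf
            buffer = ""
        End If
    Next
    JoinBrokenRows = result
End Function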
What I would really be asking here is why the CSV is like this in the first place. The routine that populates it should really be the one stripping the CRs out.
Think of a CSV file like a very crude database or spreadsheet. Considering the above file, it is clear that the 'database'/'spreadsheet' is corrupt.
If the program that generates it is corrupting it, then to what extent should the reading application go to correct these defects? I'm not sure that Excel or SQL Server (for example) would go to great lengths to correct a corrupt data source.
Your text file is just like a CSV file, but with semicolons instead of commas. Use ADO to grab the data and it will handle the line breaks in fields.
Specifically (In ASP VBScript):
On Error Resume Next
Const adOpenStatic = 3
Const adLockOptimistic = 3
Const adCmdText = &H0001
Set objConnection = Server.CreateObject("ADODB.Connection")
Set objRecordSet = Server.CreateObject("ADODB.Recordset")
strPathtoTextFile = server.mappath(".") 'Path to your text file
objConnection.Open "Provider=Microsoft.Jet.OLEDB.4.0;" & _
"Data Source=" & strPathtoTextFile & ";" & _
"Extended Properties=""text;HDR=YES;FMT=Delimited"""
objRecordset.Open "SELECT * FROM test.txt", _
objConnection, adOpenStatic, adLockOptimistic, adCmdText
Do Until objRecordset.EOF
Response.Write "ID: " & objRecordset.Fields.Item("ID") & "<br>"
Response.Write "Text: " & objRecordset.Fields.Item("Text") & "<br>"
Response.Write "Number: " & objRecordset.Fields.Item("Number") & "<br>"
objRecordset.MoveNext
Loop
The code sample is modified from Microsoft's Much ADO About Text Files.
This script assumes your data text file is in the same directory as the ASP file itself. It also needs a schema.ini file in the same directory as your data text file, containing:
[test.txt]
Format=Delimited(;)
Change test.txt in both code samples above to the name of your text file.
If the unwanted CRLF always occurs inside a text field (inside double quotes), it would not be very difficult to use a regular expression to remove these. VBScript has a regex engine at its disposal: http://authors.aspalliance.com/brettb/VBScriptRegularExpressions.asp
It all depends of course on how familiar you are with regular expressions. I couldn't think of the proper syntax off the top of my head, but this is probably quite easy to figure out.
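For what it's worth, a sketch of how it might look (this assumes quotes are always balanced and that fields never contain escaped double quotes):

'Sketch: strip CR/LF inside double-quoted fields only.
'Assumes balanced quotes and no escaped quotes within fields.
Function StripBreaksInQuotes(text)
    Dim re, matches, m, result, lastPos
    Set re = New RegExp
    re.Global = True
    re.Pattern = """[^""]*"""   'a quoted field, line breaks included
    result = ""
    lastPos = 1
    Set matches = re.Execute(text)
    For Each m In matches
        'Copy the text before the quoted field unchanged
        result = result & Mid(text, lastPos, m.FirstIndex + 1 - lastPos)
        'Remove CR and LF inside the quoted field
        result = result & Replace(Replace(m.Value, vbCr, ""), vbLf, "")
        lastPos = m.FirstIndex + m.Length + 1
    Next
    StripBreaksInQuotes = result & Mid(text, lastPos)
End Function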
The solution is pretty easy:
str = "Some text..." & vbCrLf
str = Replace(str, vbCrLf, "")
The secret is to use vbCrLf. I use a simple function to solve the problem and have added it to my framework.
Function performStringTreatmentRemoveNewLineChar(ByVal str)
    If IsNull(str) Then
        str = ""
    End If
    str = Replace(str, vbCrLf, "")
    performStringTreatmentRemoveNewLineChar = Trim(str)
End Function
Of course this will remove all newline characters from the string. Use it carefully.
I have an Access database with a query already made. I need to automate it so that this query runs each night and exports to a tab-delimited CSV file. It is not possible to export a query to a CSV file from within Access. My question is: are there any tools that can select certain tables, or perform an SQL query on an .mdb file, and export the results to a CSV file?
Actually, you can export a query to a csv file from within Access.
You can do this with a Macro using the TransferText method.
Macro:
Name = ExportQuery
Action = TransferText
Transfer Type = Export Delimited
Table Name = [name of your Access query]
File Name = [path of output file]
Has Field Names = [Yes or No, as desired]
You can execute the macro from the command line like this:
"[your MS Office path]\msaccess.exe" [your databse].mdb /excl /X ExportQuery /runtime
Since you're having trouble with TransferText in a macro, try this:
1) Create a Module named "ExportQuery". In this module, create a function called "ExportQuery":
Function ExportQuery()
DoCmd.TransferText acExportDelim, , "[your query]", "[output file].csv"
End Function
2) Create a Macro named RunExportQuery:
Action = RunCode
Function Name = ExportQuery()
VBScript works quite well with the Jet engine. However, I do not see why you say "It is not possible to export a query to a CSV file from within Access."
Sub TransferCSV()
DoCmd.TransferText acExportDelim, , "PutNameOfQueryHere", "C:\PutPathAnd\FilenameHere.csv", True
End Sub
Is the usual way in VBA.
EDIT:
It is possible to run a VBScript file (.vbs) from the command line. Here is some sample VBScript to output a tab-delimited file.
db = "C:\Docs\LTD.mdb"
TextExportFile = "C:\Docs\Exp.txt"
Set cn = CreateObject("ADODB.Connection")
Set rs = CreateObject("ADODB.Recordset")
cn.Open _
"Provider = Microsoft.Jet.OLEDB.4.0; " & _
"Data Source =" & db
strSQL = "SELECT * FROM tblMembers"
rs.Open strSQL, cn, 3, 3
Set fs = CreateObject("Scripting.FileSystemObject")
Set f = fs.CreateTextFile(TextExportFile, True)
a = rs.GetString
f.WriteLine a
f.Close
rs.Close
cn.Close
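By default, GetString separates columns with tabs and rows with carriage returns. If you want the delimiters to be explicit (for example, CRLF row endings), they can be passed in, along these lines:

Const adClipString = 2
'Explicit delimiters: tab between columns, CRLF between rows,
'and an empty string for Null values.
a = rs.GetString(adClipString, , vbTab, vbCrLf, "")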
SQL Server Integration Services is able to do the transformation that you are talking about. Don't be fooled by the name, because you don't need SQL Server in order to automate and run the packages.
http://msdn.microsoft.com/en-us/library/ms141026.aspx