Access importing wrong columns - ms-access

Help! Access is importing the wrong columns from a csv file. There are no stray commas in the csv, and there should not be any double spaces (space)(space), only single spaces. I'm ending up with '' under the SERIAL_NR header, even though the actual csv has values there.
This is also giving me 1000 extra rows. Is there anything else I should check for?

It's sneaky and underhanded, but the Alt+Enter character [Chr(10)] was included in a few of the cells. I used the following VBA code to get rid of it.
Sub Cleaner()
    Dim enterChar As String
    enterChar = Chr(10)    'the Alt+Enter / line-feed character
    'strip the embedded line feeds from every cell in the used range
    ActiveSheet.UsedRange.Replace What:=enterChar, Replacement:="", LookAt:=xlPart, _
        SearchOrder:=xlByRows, MatchCase:=False
End Sub
Pretty nifty for prepping a csv for import into Access. I've created an add-in for cleaning CSVs (eliminating duplicates, special characters, etc.); if anyone's interested, I'll post it.
Thanks all!
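For completeness, the same cleanup can be done straight from Access VBA without round-tripping through Excel. A rough sketch (the file paths are placeholders): it copies the file record by record and strips stray line feeds, relying on the fact that Line Input breaks records on CR/CRLF, so a lone Chr(10) embedded in a cell stays inside the line it reads.
Sub CleanCsvWithoutExcel()
    'Sketch only: source and destination paths are placeholders
    Dim srcPath As String, dstPath As String, rec As String
    srcPath = "C:\Data\raw.csv"
    dstPath = "C:\Data\clean.csv"
    Open srcPath For Input As #1
    Open dstPath For Output As #2
    Do Until EOF(1)
        Line Input #1, rec                     'stops at CR/CRLF, so an embedded lone LF stays in rec
        Print #2, Replace(rec, Chr(10), "")    'drop the embedded line feeds
    Loop
    Close #2
    Close #1
End Sub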

Related

How to remove double quotes from a CSV file generated in Access by DoCmd.TransferText acExportDelim

I am using DoCmd.TransferText in MS Access to export data from a query to a CSV file, but there are double quotes around each field. The CSV file will be consumed by an application that does not accept that format.
I know I can do it manually with the Access Export wizard by setting "Text Qualifier" to 'None' to remove the double quotes, but I need to do it in VBA. I would really appreciate it if anyone can help me do this with VBA.
Here is the code I am using:
DoCmd.TransferText acExportDelim, , "tbl_ScrubCSV", "P:\0TPSScrub\" & Instrumentnu & ".csv", False
Expected result should be like:
E0A,M.V. KULTUS COVE,.,,,CA
but the CSV file exported like:
"E0A","M.V. KULTUS COVE",".","","","CA"

VBA Importing long file names

I have a .csv file on my Desktop. The file name is very long and includes special characters such as brackets, e.g.:
[ABCD 012015] ACCT 1117 - Section A10 Grades-20150316_1937-comma_separated.csv
where the first 36 characters of the filename are a constant. The remainder of the file name changes upon each instance of a download.
I have tried using VBA to import the file into Access with DoCmd.TransferText and I get errors. I have discovered that there is a limit to the length of a filename that DoCmd.TransferText can handle.
So I need to rename the file first, and then I can use VBA. Renaming a file is relatively easy with VBA, but I would like to use the info provided in the original filename, such as Section A10 (which makes it unique, since there are other sections labelled A01, A02, etc.), and rename the .csv file as A10.csv. This means searching for a string AND extracting it. Since the section can differ each time, how do I write the code to rename the file and then import it with VBA?
So far I have bits and pieces, but cannot put them together:
Name OldPathName As NewPathName
DoCmd.TransferText acImport, "AME_Grades", strTable, strPathFile, blnHasFieldNames
I am using an import specification AME_Grades to make it cleaner in the Access table.
Any suggestions? TIA
I'd suggest using the FileCopy function before you start importing data from the csv file into MS Access.
Dim OldPathName As String, NewPathName As String
OldPathName = "FullPathToVeryVeryLonLongLongFileName.csv"
NewPathName = "FullPathToShortFileName.csv"
FileCopy OldPathName, NewPathName
DoCmd.TransferText acImport, "AME_Grades", strTable, NewPathName, blnHasFieldNames
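To cover the other half of the question, extracting the section code from the long name, here is a sketch along the same lines. The Desktop folder, the download-name pattern, the three-characters-after-"Section " assumption, and the target table name are all illustrative, not the asker's actual setup.
Sub RenameAndImportGrades()
    'Sketch: locate the downloaded file, extract the section code, copy to a short name, then import
    Dim folder As String, f As String, sectionCode As String, shortPath As String
    Dim pos As Long
    folder = Environ$("USERPROFILE") & "\Desktop\"
    f = Dir(folder & "*comma_separated.csv")            'first file matching the download pattern
    If Len(f) = 0 Then Exit Sub                         'nothing to import
    pos = InStr(f, "Section ")
    If pos = 0 Then Exit Sub                            'unexpected name, bail out
    sectionCode = Mid$(f, pos + Len("Section "), 3)     'e.g. "A10"
    shortPath = folder & sectionCode & ".csv"
    FileCopy folder & f, shortPath                      'the short name sidesteps the filename-length limit
    DoCmd.TransferText acImportDelim, "AME_Grades", "tblGrades_" & sectionCode, shortPath, True
End Sub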

Convert Windows Char Set to Cyrillic

I have a set of tables from a client that are supposed to be Cyrillic, but I guess the original encoding was wrong or not set. All the text is gibberish.
If I paste the text into an HTML page and set the encoding to Cyrillic (Windows-1251), I see the text as it should be.
Before: Ñèíèöûí À.Â.
After: Синицын А.В.
I've been looking for a VBA solution to convert the text in the tables without success. I thought this would be quick & easy, but so far no luck.
I'm running Win 7 with Access 2010
If you don't know Access, but have any VBA function to do this, I can adapt it to my needs.
Any help would be appreciated.
One possible way to solve the problem would be to export the table to CSV as code page 1252 ("ANSI") and then import it as code page 1251. The text was evidently Windows-1251 bytes decoded as Windows-1252, so writing it back out as 1252 restores the original bytes, and re-reading them as 1251 decodes them correctly. I just tried that, and for an existing table named [OldTable]
the following VBA code
Option Compare Database
Option Explicit

Sub DiskBounce()
    Const tempFilePath = "C:\Users\Gord\Desktop\foo.csv"
    DoCmd.TransferText _
            TransferType:=acExportDelim, _
            TableName:="OldTable", _
            FileName:=tempFilePath, _
            HasFieldNames:=True, _
            CodePage:=1252
    DoCmd.TransferText _
            TransferType:=acImportDelim, _
            TableName:="NewTable", _
            FileName:=tempFilePath, _
            HasFieldNames:=True, _
            CodePage:=1251
    Kill tempFilePath
End Sub
produced a [NewTable] with the text converted correctly.
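If bouncing through a CSV on disk is undesirable, the same re-interpretation can be done per string in memory with ADODB.Stream. Below is a sketch (late-bound, so no ADO reference is needed; the function name is mine) that could be called in an update loop over the affected fields; whether it round-trips your particular data correctly is worth testing first.
Function ReDecode1252As1251(ByVal s As String) As String
    'Sketch: the text is assumed to be windows-1251 bytes that were mis-read as windows-1252,
    'so write it back out as 1252 and read the very same bytes back as 1251
    Dim stm As Object
    Set stm = CreateObject("ADODB.Stream")
    stm.Open
    stm.Type = 2                    'adTypeText
    stm.Charset = "windows-1252"
    stm.WriteText s
    stm.Position = 0
    stm.Charset = "windows-1251"
    ReDecode1252As1251 = stm.ReadText
    stm.Close
End Function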

Import CSV data from web service into Excel

I have written a simple web service that returns large volumes of CSV data. I want to import this into Excel in a tabular format using Excel's "Data From Web" function.
Is there a way to get Excel to automatically parse the csv fields returned into individual columns as part of the import operation?
At present the only means I have for doing this is to first import the data into a single column and then write VBA code to select the data and split it using TextToColumns. This feels messy / error-prone.
The other alternative I have is to modify the web server to serve back the data as HTML. However, I'm reluctant to do this as adding tags around each csv field will greatly impact the volume of data returned.
Adamski,
Here is something that I use. I found the core somewhere on the internet, but I don't know where.
What it does is open a tab-separated file and read the data into an Excel sheet.
If Answer1 = vbYes Then 'asked beforehand whether to import the tab-separated file
    Dim Fname As String, Record As String
    Dim P As Variant, iRow As Long, iCol As Long
    Sheets("ZHRNL111").Select 'select the sheet to dump the data into
    On Error Resume Next
    With ActiveSheet
        If .AutoFilterMode Then .ShowAllData 'undo any autofilters
    End With
    Sheets("ZHRNL111").Cells.Clear 'remove any previous data
    On Error GoTo 0
    Range("A1").CurrentRegion.Delete
    Fname = MyPath & "\LatestReports\Report-111.tsv"
    Open Fname For Input As #1
    iRow = 1
    On Error Resume Next
    Do Until EOF(1)
        Line Input #1, Record                'read at the top of the loop so the last record is not skipped
        P = Split(Record, vbTab)             'split the line on tab characters
        For iCol = 1 To 14
            Cells(iRow, iCol) = P(iCol - 1)  'write each field into its own column
        Next iCol
        iRow = iRow + 1
    Loop
    On Error GoTo 0
    Close #1
End If
Regards,
Robert Ilbrink
Depending on the version of Excel you are running, you should be able to open the .csv in Excel and use the built-in Text to Columns feature.
Also, if you could modify your CSV to split columns on commas "," instead of tabs, Excel would open it directly without any need to reformat it.
I know this can sometimes be a problem depending on the data you are importing, because if the data contains a comma it must be inside quotation marks.
In my experience the best way is to put quotation marks around every field if possible.
Hope this helps.
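For reference, the workaround described in the question, splitting a single imported column, boils down to one call once the raw lines are in column A. A sketch, assuming the data starts in A1 of the active sheet:
Sub SplitImportedCsvColumn()
    'Sketch: split the single imported column on commas in place
    Dim lastRow As Long
    lastRow = Cells(Rows.Count, 1).End(xlUp).Row
    Range("A1:A" & lastRow).TextToColumns _
        Destination:=Range("A1"), _
        DataType:=xlDelimited, _
        TextQualifier:=xlTextQualifierDoubleQuote, _
        Comma:=True
End Sub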
I am actually creating a product right now to do this in both XML and JSON for Excel. I know comma-delimited does work in Excel, with some caveats. One way around it is to put quotes ("") around the text between the delimiters for the "Data From Web" feature. There are still issues with that, however. I did find that despite its increased size, XML was the best option for quick turnaround. I was able to create the service and hand my project manager the Excel document, which he could update at any time.
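Since the question specifically mentions "Data From Web", another route worth trying is a TEXT-type query table pointed at the service URL, which parses the delimiters during the import itself. A sketch only: the URL is a placeholder, and whether a TEXT connection accepts your service's address is worth verifying in your Excel version.
Sub ImportCsvFromService()
    'Sketch: pull the CSV from a (hypothetical) service URL and let the
    'query table split it on commas during the refresh
    Dim qt As QueryTable
    Set qt = ActiveSheet.QueryTables.Add( _
        Connection:="TEXT;http://example.com/myservice/data.csv", _
        Destination:=Range("A1"))
    With qt
        .TextFileParseType = xlDelimited
        .TextFileCommaDelimiter = True
        .TextFileTextQualifier = xlTextQualifierDoubleQuote
        .RefreshStyle = xlOverwriteCells
        .Refresh BackgroundQuery:=False
    End With
End Sub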

Microsoft Access TransferText function: problem with codepage

I inherited a huge, bulky MS Access database and am assigned to solve a problem in it. The problem is as follows...
System A exports its data to a pipe-delimited .txt file. The file has special characters that display correctly; for example, the value "Müller" shows properly when opening this file in Notepad or Excel.
Next, the Access DB imports the .txt file and stores the result in an internal employees table. The last name field is of data type "memo". The method to import data from the .txt file into MS Access is as follows:
Call DoCmd.TransferText(acImportDelim, _
"tblEmployees", _
"tblEmployees", _
me.txtImportFile, _
True)
After running this import and viewing the employees table, I noticed that names with special characters are screwed up; "Müller" becomes "M├⌐ller", for example. I looked through some online help and found out that I can include a CodePage parameter in the TransferText call, so I set it to 65001 (which apparently is the code page for Unicode/UTF-8):
Call DoCmd.TransferText(acImportDelim, _
"tblEmployees", _
"tblEmployees", _
me.txtImportFile, _
True, _
, _
65001)
Now that I have run the import script again, I see no difference whatsoever; the special characters are still malformed. I'm running out of steam, so I hope one of you has some advice on how to resolve this...
Both versions of your TransferText operation are using a SpecificationName named tblEmployees. What Code Page is specified in that Specification?
Try importing the text file manually. Choose "Advanced" from the Import Text Wizard. Then select Unicode in the Code Page list box. You may need to test with different Code Page selections until you find which one imports your text correctly.
Whichever Code Page selection works, save your choices as a specification and use it in your TransferText command, without supplying a separate CodePage parameter.
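In other words, once a specification saved from the wizard carries the right code page, the call from the question needs nothing more than that spec's name (the name "EmployeesImportSpec" below is only an example):
Call DoCmd.TransferText(acImportDelim, _
                        "EmployeesImportSpec", _
                        "tblEmployees", _
                        Me.txtImportFile, _
                        True)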
Using CodePage=1200 (msoEncodingUnicodeLittleEndian) solved the issue in my case.
There is a list of the encoding constants you can use in VBA here:
http://msdn.microsoft.com/en-us/library/office/aa432511(v=office.12).aspx
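In practice that just means substituting the code page in the question's second call; a sketch with everything else unchanged:
Call DoCmd.TransferText(acImportDelim, _
                        "tblEmployees", _
                        "tblEmployees", _
                        Me.txtImportFile, _
                        True, _
                        , _
                        1200)    '1200 = msoEncodingUnicodeLittleEndian (UTF-16)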