Issues with Access parsing double quotes in a CSV file - ms-access

I have a large CSV file that I am trying to import into Microsoft Access, but I am running into issues. Assume the pipes below represent different cells in the database.
Assume my content is as shown below. With the default settings (a comma delimiter and " as the text qualifier), the second entry will only parse the word my and will not import the word content into the database, even though the import wizard implies that it will.
|my content is good|
|my|
Now if I change the text qualifier to NONE, it parses the entire second entry and my content will be imported into the database; however, the first entry will wind up split across four different cells in the database and will show up as
my|content|is|good.
|my content
I used pipes to imply different cells.
This seems like a limitation in Microsoft Access. Is anyone familiar with a workaround for this?
Original content:
,"my,content,is,good","",
,my"content","",
I am using the import wizard.

Yes, this is a limitation of the CSV import capabilities in Access. For whatever reason, Access has always been more restrictive than Excel in its ability to parse CSV files.
So, one workaround would be to open the CSV file in Excel, save the file as an actual Excel sheet, and then import the Excel sheet into Access. For example, the CSV file
this,is,a "test",CSV file,"Ugly, yes, but still parsable."
is "non-standard" (if one is willing to concede that there is such a thing as a CSV "standard"), and Access cannot import it directly. (It either complains of an "Unparsable Record" or it splits the last field on the commas, depending on the "Text Qualifier" setting.)
However, we can open the file in Excel, save it as "foo.xlsx", and then import the .xlsx file into Access.
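If this comes up often, the Excel round-trip can be scripted from Access. Below is a minimal VBA sketch, assuming Excel is installed; the file paths and the table name "foo" are placeholders:

Sub ConvertCsvViaExcel()
    ' Open the problem CSV in Excel, save it as .xlsx, then import it.
    Dim xl As Object, wb As Object
    Set xl = CreateObject("Excel.Application")
    Set wb = xl.Workbooks.Open("C:\data\foo.csv")   ' placeholder path
    wb.SaveAs "C:\data\foo.xlsx", 51                ' 51 = xlOpenXMLWorkbook
    wb.Close SaveChanges:=False
    xl.Quit
    ' 10 = acSpreadsheetTypeExcel12Xml; True = first row has field names
    DoCmd.TransferSpreadsheet acImport, 10, "foo", "C:\data\foo.xlsx", True
End Sub

Excel's own parser copes with the stray quotes, so the .xlsx it writes imports cleanly.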

Related

SSIS Exporting to Flat File Destination (CSV) - Custom Property EscapeQualifier Not Working (Undocumented?)

Many questions have been asked on this topic, but I can't find anything specifically addressing what I see in Visual Studio 2017 (SSDT). A Custom Property named "EscapeQualifier" exists for a flat-file destination component in an SSIS project. Unfortunately, setting this to true doesn't seem to do anything.
Searching the official documentation from MS doesn't even show that the property exists.
On the surface, using this option seems to be a very elegant solution to the common issue of creating a "real" CSV file when the data being exported contains the double-quote character. If it worked as it seems it should, then it would double any double-quotes (or similarly escape whatever character you defined as your text-qualifier) for all quotable fields in the destination.
The solutions for "the CSV problem" that I've been able to find suggest modifying the specific data via transforms or at the data-retrieval level, but that's very impractical to do on each and every text-qualified data column.
To add insult to injury, I found a KB article from MS that suggests "exporting to CSV" is an official thing in SSDT.
KB4135137 - SSMS and SSDT do not escape double quotation marks when you export data as CSV
For example, you export a table into CSV format in a SQL Server Integration Services (SSIS) project.
This article suggests that the double-quotes not being escaped are a bug that has been fixed. Maybe it has, but only for the "Save results as..." option within SSMS. I still don't see any possible way to specify a true CSV export in an SSIS package, and this "EscapeQualifier" option gave me false hope.
Does this "EscapeQualifier" option ever do anything? If so, how do I get it to work? If not, is there another universal solution to the SSIS export to CSV issue?
Note: I created a pull request to add information about this property to Microsoft Docs.
As mentioned in the Flat File Destination properties, the EscapeQualifier property is used to:
When text qualifier is enabled, specifies whether the text qualifier in the data written to the destination file will be escaped or not
To test this property, I created a package that transfers data from one flat file to another.
In the source flat file connection manager, the Text Qualifier is set to <none>, while in the destination flat file connection manager the text qualifier is set to ". The source flat file only contains the following value: my name is "hadi".
I set the EscapeQualifier property to True in the flat file destination and executed the package. The destination file contains the following value: "My name is ""hadi""", which means that the property worked as expected.
Make sure that you have set a text qualifier in the flat file connection manager to ensure that this property works as expected.

Tableau isn't converting my csv data source to tables

When I import a csv into Tableau, it keeps the same format as the original csv file (a single column with every value in it). How can I make Tableau separate the columns based on commas?
I can't see why this is happening, since in every tutorial I checked, Tableau converts the .csv to a tabular format automatically.
Here's what I get
Note: I'm using Tableau's trial version.
Sometimes when you open a csv in Excel it can mess with the formatting like your image shows. If you think you opened it in Excel before connecting, try downloading your source again and connecting first with Tableau. If that doesn't work, I happen to have this dataset in a .tde if you would like to use that. vgsales.tde
Edit: I'm thinking regional settings might be a factor. Click the dropdown on the right of the data source and select Text File Properties to open the properties window. Can you match the settings there (field separator, text qualifier, locale) to what your file actually uses?

Open Import Text Wizard in Access 2013 through VBA

I'm trying to open the Import Text Wizard as part of my code, after it has FTP'd down a text file.
I don't have the metadata of the file, as each file could be different; the only things they have in common are that they are pipe-delimited with no text qualifier and have a header row. There is no consistency in column type or number of columns.
Hence I don't think I can easily use DoCmd.TransferText, as the specification would have to be different each time.
I don't mind pushing people down the manual route, but if I look this up the instructions are to use DoCmd.RunCommand acCmdImport. This appears to have been deprecated after Access 2007: if I run it I get Run-time error 2002 saying the function or feature isn't installed in this version.
What I'm after is either:
A way to open the wizard
or
A way to import / link the text files without knowing the metadata ahead of the import.
The command is:
DoCmd.RunCommand acCmdImportAttachText
That opens the wizard as if you had called it from the ribbon.
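For context, a minimal sketch of how this might be wired into the post-download step (the Sub name is hypothetical; RunCommand raises error 2501 if the user cancels the wizard, so it is worth trapping):

Sub ImportDownloadedTextFile()
    ' Open the built-in Import Text Wizard so the user can describe
    ' the file (delimiter, header row, field types) interactively.
    On Error Resume Next                    ' the user may cancel the wizard
    DoCmd.RunCommand acCmdImportAttachText
    If Err.Number = 2501 Then Debug.Print "Import cancelled by user."
    On Error GoTo 0
End Sub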

"Inconsistent number of matrix lines compared to the number of labels" runtime exception error when importing large CSV file into Gephi

The full error is "java.lang.RuntimeException: java.lang.Exception: Inconsistent number of matrix lines compared to the number of labels."
I am trying to pull an adjacency matrix stored in a CSV file into Gephi so that I can use its modularity optimization tool and make a really slick chart of my data. I compiled the data in Excel (yes, it took forever) and saved it as CSV, and then I opened the file in Notepad and used Ctrl + H to replace all commas with semicolons (and saved it as a CSV file again). My dataset is 5,654 x 5,654 cells, not counting the labels. It is an r-neighborhood graph with r = .6299 (80th percentile and above).
I searched Google and StackOverflow and I only found one solution for my error message: to remove all the spaces in the file. I used Ctrl + H again to remove all spaces, but I received the same error message when I tried to upload the "spaceless" CSV file. Just to double-check that saving it as CSV didn't cause an issue, I checked the CSV by opening it up in Excel. The file opened correctly, but I do not have much experience with CSV files so I do not know if anything was off. It seemed as though all the records were separated by semicolons instead of commas and I did not see any spaces.
Is it the size of my file? I am currently struggling through learning some Python and R, and I would be open to creating this adjacency matrix CSV file in either of those environments and then feeding it to Gephi. I just need a dependable solution that works without bogging my computer down in Excel all afternoon and allows me to be the "slick graph superhero" of my office.
Not a direct answer to your problem, but there is also the Excel/CSV import spigot, for whatever it might be useful. Otherwise you could perhaps try to import the network with NodeXL and then save it in GraphML format, which can then be opened by Gephi.
Good tip from http://social-dynamics.org/gephi-faq/
A. One thing to try is removing any extra spaces from your csv file. Sometimes these trip up the import. Open the csv file using a simple text editor like NotePad or TextEdit, and then use find/replace to remove any spaces. Save the adjacency matrix and then try importing it again.
Removing spaces helped me to fix the issue.
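If the file is too big to edit comfortably in Notepad, the same find/replace can be scripted. A minimal VBA sketch (runnable from Excel, which you already use; the paths are placeholders, and it assumes the whole file fits in a single string):

Sub StripSpacesFromCsv()
    Dim src As String, dst As String, txt As String, f As Integer
    src = "C:\data\matrix.csv"              ' placeholder input path
    dst = "C:\data\matrix_nospaces.csv"     ' placeholder output path
    f = FreeFile
    Open src For Input As #f
    txt = Input$(LOF(f), #f)                ' read the whole file at once
    Close #f
    txt = Replace(txt, " ", "")             ' remove every space, per the tip
    f = FreeFile
    Open dst For Output As #f
    Print #f, txt;                          ' write it back without spaces
    Close #f
End Sub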

Macro For MS Access for Batch Uploading

I want to upload 1000 CSV files to a single table in MS Access. Could someone help me with a macro for that?
First make sure your CSV files are fit to upload: no blank lines, no blank columns, and column headers worded to suit Access (no spaces, no reserved words, no barred characters). Then, on the Access ribbon, click External Data, select the file type you are importing, browse to the file, and create a new table. You will have to help Access with the data types during the import process. With 1000 files, you will want to automate the repeat imports in VBA; see the sketch below.
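A minimal VBA sketch, assuming all the files share the same layout and have a header row; the folder path and the table name "tblImport" are placeholders:

Sub ImportAllCsvFiles()
    ' Import every .csv in a folder into one table, appending each file.
    Dim folder As String, f As String
    folder = "C:\data\csv\"                 ' placeholder folder
    f = Dir(folder & "*.csv")
    Do While Len(f) > 0
        ' No import specification given, so Access uses its default
        ' comma-delimited parsing; True = first row holds field names.
        DoCmd.TransferText acImportDelim, , "tblImport", folder & f, True
        f = Dir()
    Loop
End Sub

Import the first file manually once to create the table and settle the data types, then let the loop append the rest.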