CSV Load data not importing file - mysql

Importing a CSV into MySQL via phpMyAdmin using "CSV LOAD DATA".
The process does not throw any errors during the upload.
(I have made sure my columns are correct.)
But when it returns the results, there are none.
There are over 51,000 rows of data in the spreadsheet, and "Browse" returns zero results.
Any suggestions? Maybe I am uploading too large a spreadsheet?

All the fields need to be enclosed in quotes. Your best bet is to use an Excel file with a macro that does that for you. I wrote a tutorial on how to do just what you're describing, which includes a link to an Excel file with such a macro. Here's a link.
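If Excel isn't handy, the same quoting can be done with a short script. A minimal sketch in Python (the file name and sample rows are placeholders for your real data):

```python
import csv

# Stand-in rows for the real spreadsheet export.
rows = [["id", "name"], ["1", "Alice"], ["2", "Bob, Jr."]]

with open("quoted.csv", "w", newline="") as dst:
    # QUOTE_ALL wraps every field in double quotes, which
    # LOAD DATA can then parse with ENCLOSED BY '"'.
    writer = csv.writer(dst, quoting=csv.QUOTE_ALL)
    writer.writerows(rows)
```

Note how the comma inside "Bob, Jr." is safely contained once the field is quoted.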

Related

Tableau isn't converting my csv data source to tables

When I import a csv into Tableau, it keeps the same format as the original csv file (a single column with every label in it). How can I make Tableau separate the columns based on the commas?
I can't see why this is happening, since in every tutorial I've checked, Tableau converts the .csv to a tabular format automatically.
Here's what I get
Note: I'm using Tableau's trial version.
Sometimes when you open a csv in Excel it can mess with the formatting, like your image shows. If you think you opened it in Excel before connecting, try downloading your source again and connecting with Tableau first. If that doesn't work, I happen to have this dataset as a .tde if you would like to use that: vgsales.tde
Edit: Regional settings might be a factor.
Click the dropdown on the right of the data source and select Text File Properties to get this window:
Can you match these settings?

"Inconsistent number of matrix lines compared to the number of labels" runtime exception error when importing large CSV file into Gephi

The full error is "java.lang.RuntimeException: java.lang.Exception: Inconsistent number of matrix lines compared to the number of labels."
I am trying to pull an adjacency matrix stored in a CSV file into Gephi so that I can use its modularity optimization tool and make a really slick chart of my data. I compiled the data in Excel (yes, it took forever) and saved it as CSV, and then I opened the file in Notepad and used Ctrl + H to replace all commas with semicolons (and saved it as a CSV file again). My dataset is 5,654 x 5,654 cells, not counting the labels. It is an r-neighborhood graph with r = .6299 (80th percentile and above).
I searched Google and StackOverflow and I only found one solution for my error message: to remove all the spaces in the file. I used Ctrl + H again to remove all spaces, but I received the same error message when I tried to upload the "spaceless" CSV file. Just to double-check that saving it as CSV didn't cause an issue, I checked the CSV by opening it up in Excel. The file opened correctly, but I do not have much experience with CSV files so I do not know if anything was off. It seemed as though all the records were separated by semicolons instead of commas and I did not see any spaces.
Is it the size of my file? I am currently struggling through learning some Python and R, and I would be open to creating this adjacency matrix CSV file in either of those environments and then feeding it to Gephi. I just need a dependable solution that works without bogging my computer down in Excel all afternoon and allows me to be the "slick graph superhero" of my office.
Not a direct answer to your problem, but there is also the Excel/CSV import spigot, for whatever it might be worth. Otherwise you could try importing the network with NodeXL and then saving it in GraphML format, which Gephi can open.
Good tip from http://social-dynamics.org/gephi-faq/
A. One thing to try is removing any extra spaces from your csv file. Sometimes these trip up the import. Open the csv file using a simple text editor like NotePad or TextEdit, and then use find/replace to remove any spaces. Save the adjacency matrix and then try importing it again.
Removing the spaces fixed the issue for me.
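Since the asker mentions being open to Python, the semicolon-separated matrix can also be generated directly, which sidesteps the Excel round trips and the find-and-replace cleanup entirely. A minimal sketch (the labels and values are tiny placeholders for the real 5,654-node matrix; the empty top-left cell matches the matrix layout Gephi's importer expects):

```python
import csv

labels = ["A", "B", "C"]          # stand-ins for the real node labels
matrix = [[0, 1, 0],
          [1, 0, 1],
          [0, 1, 0]]

with open("adjacency.csv", "w", newline="") as f:
    # delimiter=";" writes semicolons directly -- no find/replace,
    # and csv.writer never emits stray spaces between fields.
    writer = csv.writer(f, delimiter=";")
    writer.writerow([""] + labels)          # header: empty cell, then labels
    for label, row in zip(labels, matrix):
        writer.writerow([label] + row)      # row label, then the values
```

Because the label list and the matrix rows come from the same `labels`/`matrix` pair, the line count and label count can't drift apart, which is exactly the inconsistency the error message complains about.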

adding text to an xls file with bash script

I'm trying to understand if it's possible to write to an xls file with a bash script. Situation is outlined below.
I have a cronjob that runs every Monday, generates an xls filled with data from a MySQL DB, and emails it to my client. When a report is empty and the client attempts to open it, it shows as corrupt. Originally I addressed this issue by excluding empty files from the email with an if statement. However, the constraint is that all 4 reports must reach the client, empty or not.
So my question is: can I simply add a row of text at the top with a bash script so the file is never "empty"? I'm not an expert in bash scripting by any means, so any feedback here would be great. Thanks!
Tony
I'm not aware of any pure bash implementation for writing XLS files; there are solutions in other languages such as Perl, Python, or PHP. But if you think outside the box, there is another option available to you. You mentioned that you currently use an if statement to skip attaching empty files. Create a blank spreadsheet in a program like MS Excel, optionally enter some text in A1 like "No records", save it, and transfer it to a known location on the server that runs the cronjob. Then, rather than skipping the attachment, whenever your if statement detects an empty file, attach the blank "No records" template XLS instead. You may need to copy the template to a temporary location before attaching it if you need to rename the file.
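The fallback logic described above is only a few lines in any scripting language. A minimal sketch in Python (the paths, the function name, and the file-size test for "empty" are all assumptions; adapt them to however your cronjob actually detects an empty report):

```python
import os
import shutil

def report_to_attach(report_path, template_path, tmp_path):
    """Return the path to attach to the email: the real report if it
    has data, otherwise a copy of the blank "No records" template
    renamed (via tmp_path) to match the expected report file name."""
    if os.path.getsize(report_path) > 0:
        return report_path
    shutil.copy(template_path, tmp_path)
    return tmp_path
```

The copy step covers the renaming concern from the answer: the template keeps its original name on disk, and only the temporary copy takes the report's name.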

Macro For MS Access for Batch Uploading

I want to upload 1,000 CSV files into a single table in MS Access. Could someone help me with a macro for that?
First make sure your CSVs are fit to upload: no blank lines, no blank columns, and column headers worded to suit Access (no spaces, no reserved words, no barred characters). Then on the Access ribbon click External Data, select the file type you are importing, then browse to the file and create a new table. You will have to help Access with the data types during the import process.
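If running that import wizard 1,000 times is impractical, one workaround is to merge the files into a single CSV first and import that once. A minimal Python sketch, under the assumption that all the files share the same header row (the function name and the example glob pattern are illustrative, not part of Access):

```python
import csv
import glob

def merge_csvs(paths, out_path):
    """Concatenate CSV files that share a header row, keeping the
    header from the first non-empty file only."""
    wrote_header = False
    with open(out_path, "w", newline="") as out:
        writer = csv.writer(out)
        for path in paths:
            with open(path, newline="") as f:
                reader = csv.reader(f)
                header = next(reader, None)
                if header is None:
                    continue          # skip completely empty files
                if not wrote_header:
                    writer.writerow(header)
                    wrote_header = True
                writer.writerows(reader)  # data rows, header dropped

# e.g. merge_csvs(sorted(glob.glob("reports/*.csv")), "combined.csv")
```

The resulting combined.csv then goes through the External Data wizard exactly once, with all the header-cleanup advice above applied to a single file.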

Creating a CSV file with the Report Generation Toolkit in Labview

I want to create .csv files with the Report Generation Toolkit in Labview.
They must actually be .csv files which can be opened with Notepad or something similar.
Creating a .csv is not that hard; it's just a matter of adding the extension to the file name that's going to be created.
If I create a .csv file this way it opens nicely in Excel, just the way it should, but if I open it in Notepad it shows all kinds of characters and doesn't come close to the data I wrote to the file.
I create the files with the Labview code below:
Link to image (can't post the image yet because I have too few points)
I know .csv files can be created with the Write to Spreadsheet VI but I would like to use the Report Generation Toolkit because it's pretty easy to add columns and rows to the file and that is something I really need.
You can use the Robust CSV package on the lavag.org forum to read and write 2D arrays to and from CSV files.
http://lavag.org/files/file/239-robust-csv/
Calling a file "csv" does not make it a CSV file. I never used the toolkit to generate an Excel file, but I'm assuming it creates an XLS or XLSX file, regardless of what extension you give it, which is why you're seeing gibberish (probably XLS, since it's been around for a while and I believe XLSX is XML, not binary).
I'm not sure what your problem is with the Write to Spreadsheet VI. It has an append input, so I assume you can use it to at least add rows directly to a file, although I can't say I've ever tried it. I would prefer handling all the data in memory explicitly, where you can easily use the array functions to add rows or columns to the array and then overwrite the entire file.
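For comparison, a genuine plain-text CSV of the kind the asker wants is trivial to produce outside the Report Generation Toolkit. This Python sketch (sample data is made up) mirrors the pattern the answer suggests: keep the 2D data in memory, add a row and a column there, then overwrite the whole file:

```python
import csv

# The report data, held in memory as a 2D array (header row first).
data = [["t", "value"], ["0", "1.5"], ["1", "2.5"]]

# Adding a row and a column is plain list manipulation.
data.append(["2", "3.5"])
for row, extra in zip(data, ["flag", "0", "1", "0"]):
    row.append(extra)

# Overwrite the file with the updated array; the result is plain
# text and opens fine in Notepad.
with open("log.csv", "w", newline="") as f:
    csv.writer(f).writerows(data)
```

The output is readable in any text editor precisely because nothing here writes a binary workbook: "CSV" is just rows of comma-separated text.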