How can I export .csv files correctly? - ms-access

I've got two tables (8 and ~150 columns, respectively). The first one gets filled with personal data, while the second one is a "checklist" whose fields are each filled with a single character.
I created a query that combines both tables so that I can export the result as a .csv file. When I try to do so (with the export wizard), I get this error:
The Microsoft Access database engine could not find the object 'filename.csv'. Make sure the object exists and that you spell its name and the path correctly.
I double-checked everything and also tried the export with VBA, but nothing worked.
I don't know what else I should try. I'm hoping that someone can help me.
Paul
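One frequent cause of this error is handing the export a bare file name, or a path to a folder that doesn't exist, so the database engine looks for an object called 'filename.csv' inside the database instead of a file on disk. As a cross-check outside of Access, here is a minimal Python sketch that pulls the same query out through ODBC and writes the .csv itself; the driver string, database path, output path, and query name (qryExport) are all assumptions:

    import csv
    import pyodbc

    # Assumed connection details: point DBQ at the real .accdb file.
    conn = pyodbc.connect(
        r"Driver={Microsoft Access Driver (*.mdb, *.accdb)};"
        r"DBQ=C:\data\mydb.accdb;"
    )
    cur = conn.cursor()

    # 'qryExport' is a placeholder for the saved query combining the two tables.
    cur.execute("SELECT * FROM qryExport")

    # Write to an absolute path; a bare file name is exactly the kind of input
    # that produces "could not find the object 'filename.csv'".
    with open(r"C:\data\export.csv", "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow([col[0] for col in cur.description])  # header row
        writer.writerows(cur.fetchall())

    conn.close()

If this export succeeds, the data and the query are fine, and the problem lies in the path handed to the wizard or to DoCmd.TransferText.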

Related

SSIS - Exporting data with commas to a csv file

I am trying to export a list of fields to a csv file from a database.
It keeps putting all the data into one column and doesn't separate it. When checking the preview it seems to be okay, but on export it's not working. I am currently trying the following settings; any help would be appreciated.
(Screenshot: the SSIS flat file destination settings)
(Screenshot: the exported file opened in Excel, with everything in one column)
Actually, it seems to work; Excel is just too dumb to recognize the separator.
Mark the whole table, then go to Data -> Text to Columns
and configure the wizard (Delimited, with semicolon as the separator).
Now the rows and cells are separated correctly.
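If you want to confirm that the exported file itself is fine before blaming the export, here is a quick sketch that reads it back with the semicolon delimiter (the file name export.csv is an assumption):

    import csv

    # 'export.csv' is a placeholder for the SSIS output file.
    with open("export.csv", newline="", encoding="utf-8") as f:
        reader = csv.reader(f, delimiter=";")
        for row in reader:
            print(len(row), row[:3])  # field count per row, plus a peek at the data

If every row reports the expected number of fields, the file is correct and only Excel's default parsing is at fault.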

Access Export to CSV, trouble keeping leading zeros, 55-60 Million Records

Let me preface this by saying yes, I do need all of the records. It's part of a FOIL request. Not necessarily in one worksheet or file.
I'm having trouble figuring this out. I am currently able to export about 500k records at a time without timing out or exceeding the Access file size limit (I think this is due to working with the state system's legacy data) or worrying about the Excel row limit. I can preserve the column headers but lose the leading zeros in one field.
This is done using the export wizard to a text file. I change the destination file name extension from .txt to .csv, and that gives me the option to keep headers. In the wizard's preview of the .csv file, and when the file is opened in Notepad, the field shows its leading zeros correctly with double quotes around it, for example "00123456", but when it is opened in Excel it shows as 123456. If I then change the column from General format to Text, the contents remain the same (the zeros do not come back).
I have tried the VBA method DoCmd.TransferSpreadsheet, but when I try to run it I am prompted with a Macros box. And honestly, I am less familiar with VBA than I am with SQL. Overall I consider myself a novice.
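Since the .csv already contains the zeros (quoted), the loss happens when Excel opens the file, not during the export. One workaround is to post-process the export so Excel is forced to treat the field as text, and to split the output into chunks below Excel's 1,048,576-row sheet limit. Here is a minimal Python sketch, assuming the export is named full_export.csv and the zero-padded field is the third column (both are assumptions); wrapping a value as ="00123456" is a common trick that makes Excel display it as text:

    import csv

    SRC = "full_export.csv"   # assumed name of the Access text export
    ZERO_COL = 2              # assumed 0-based index of the zero-padded field
    CHUNK_ROWS = 1_000_000    # stay under Excel's 1,048,576-row sheet limit

    with open(SRC, newline="", encoding="utf-8") as f:
        reader = csv.reader(f)
        header = next(reader)

        part, out, writer = 0, None, None
        for i, row in enumerate(reader):
            if i % CHUNK_ROWS == 0:      # start a new chunk file
                if out:
                    out.close()
                part += 1
                out = open(f"export_part{part:03}.csv", "w",
                           newline="", encoding="utf-8")
                writer = csv.writer(out)
                writer.writerow(header)
            # Wrap the padded field so Excel keeps the zeros.
            row[ZERO_COL] = f'="{row[ZERO_COL]}"'
            writer.writerow(row)
        if out:
            out.close()

Importing the .csv into Excel via Data -> From Text and marking the column as Text during import also preserves the zeros, without touching the file.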

Invalid field count in CSV input on line 1 phpmyadmin

I have read many threads but can't find the right specific answer. I get this error message when I try to import additional data into an existing table. The field names are all aligned correctly, but not every row has data in every field. For example, although I have a field named middle_name, not every row has a middle name in it. During the import process, is this blank field not counted as a field and thus throwing off the field count?
I have managed to get most of the data to import by making sure I had a blank column to allow for the auto-increment of ID, as well as leaving the header row in the file but choosing 1 row to skip on the import.
Now the problem is that the last row won't import; I get the error message "Invalid format of CSV input on line 19". When I copy the file into TextWrangler, the last row ends with ,,,,, (the last 5 columns are blank). I need to know the trick to get the last row to import.
Here are the settings I have been using (screenshot of the import settings omitted).
I’ve had similar problems (with a tab-separated file) after upgrading from an ancient version of phpMyAdmin. The following points might be helpful:
phpMyAdmin must receive the correct number of columns in every row. In older versions of phpMyAdmin you could get away with not supplying empty values for columns at the end of a row, but this is no longer the case.
If you export an Excel file to text and columns at the start or end of rows are completely empty, Excel will not write delimiters for them. You either need to put something into those cells or edit the resulting file in a text editor with regular expressions, e.g. to add a blank first column, search for ^ and replace with , (CSV file) or \t (tab file); to add two blank columns at the end, search for $ and replace with ,, or \t\t, etc. A short script can do the same padding automatically; see the sketch after these points.
Add a blank line to the bottom of the file to avoid the error message referring to the last line of data. This seems to be a bug that has been fixed in newer versions.
While in the text editor, also check the file encoding, as Excel sometimes saves files as things like UTF-16 with a BOM, which phpMyAdmin doesn't like.
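Here is a minimal sketch of that normalization, assuming a comma-separated file and a target table with 8 columns (the file names and the column count are assumptions):

    import csv

    EXPECTED_COLS = 8        # assumed column count of the target table
    SRC = "import.csv"       # assumed input file
    DST = "import_fixed.csv" # padded, UTF-8 output for phpMyAdmin

    # utf-8-sig strips a BOM if present; use "utf-16" instead if Excel
    # saved the file that way.
    with open(SRC, newline="", encoding="utf-8-sig") as fin, \
         open(DST, "w", newline="", encoding="utf-8") as fout:
        writer = csv.writer(fout)
        for row in csv.reader(fin):
            # Pad short rows with empty fields so every row has the expected count.
            row += [""] * (EXPECTED_COLS - len(row))
            writer.writerow(row)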
I saw the same error while trying to import a csv file with 20,000 rows into a custom table in Drupal 7 using phpmyadmin. My csv file didn't have headers and had one column with many blanks in it.
What worked for me: I copied the data from Excel into Notepad (a Windows plain text editor) and then back into a new Excel spreadsheet and re-saved it as a csv. I didn't have to add headers and left the blank rows blank. All went fine after I did that.
You'll never have this problem if you keep your first row as the header row, even if your table already has a header. You can delete the extra header row later.
This helps because MySQL then knows how many cells can possibly contain data, and you won't have to fill in dummy data or edit the CSV or any of those things.
I had a similar problem, then I tried MySQL Workbench instead. Its table data import handled the CSV file easily, and the import worked perfectly.
As long as the CSV file you're importing has the proper number of empty columns, it should be no problem for phpMyAdmin (and that looks like it's probably okay based on the group of commas you pasted from your last line).
How was the CSV file generated? Can you open it in a spreadsheet program to verify the line count of row 19?
Is the file exceptionally large? (At 19 rows, I can't imagine it is, but if so you could be hitting PHP resource limits causing early termination of the phpMyAdmin import).
Make sure you are running the import at the database level, not from inside the table (database > table).
I had this same issue, tried everything listed here, and finally realized I needed to go up a level, to the database.

SSIS Datatype issue with different options

Does anyone know how the options 'Copy data from one or more tables or views' and 'Write a query to specify the data to transfer' in the Import & Export Wizard function differently?
During testing, I tried using both options to transfer a table from a source to a destination, but I am getting different datatype mappings.
With the 'Copy data from one or more tables or views' option, all the datatype mappings are correct. However, if I use the 'Write a query to specify the data to transfer' option, some datatypes appear as numbers instead of their proper names. E.g., under the Type column, it shows 1 instead of char.
I think I can rule out mapping files as the cause, because I would be getting the same error for both options if that were the case. What I would like to ask is whether anyone knows if the query is parsed differently with these options, and if so, how?
Any advice is appreciated. Thanks for your help.
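One possibility worth checking (an assumption, not a confirmed cause): when the wizard gets its metadata from a prepared query rather than from the table catalog, it may surface raw ODBC type codes, and in ODBC's headers the constant SQL_CHAR happens to be 1. A tiny lookup sketch for a few of those codes:

    # A few ODBC SQL data type constants (from sql.h / sqlext.h). If the
    # wizard is showing raw ODBC type codes, 1 would correspond to SQL_CHAR.
    ODBC_TYPE_NAMES = {
        1: "SQL_CHAR",
        4: "SQL_INTEGER",
        8: "SQL_DOUBLE",
        12: "SQL_VARCHAR",
    }

    print(ODBC_TYPE_NAMES.get(1))  # -> SQL_CHAR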

Appending rows in a database using Toad and Excel

Friends, I am using Toad for MySQL and have a huge database ready and validated.
Now I have an Excel file which contains data entries for a particular table, and I am successfully able to import the data into the DB using the import wizard, mapping the first header row to the column names, etc.
But now I have appended a few new entries to the file which I wish to insert into the database. However, the old rows also get selected and cause a primary key violation, as those entries already exist! There is a truncate-table option, but I don't wish to use it, as the data may have been inserted from many different files.
I tried my best but didn't find a solution, at least in Toad for MySQL. Please tell me what to do! The solution may be simple, but I need it urgently.
An option may be to not append records to that Excel file, but to create a new Excel file with only the new records; a script can also generate that file automatically, as sketched below.
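If producing the "new records only" file by hand is impractical, a minimal Python sketch can filter the rows instead. It assumes the sheet was saved as data.csv, the primary key is in the first column, and the existing keys were exported to existing_keys.csv (all three names are assumptions):

    import csv

    SRC = "data.csv"            # assumed CSV export of the full Excel sheet
    KEYS = "existing_keys.csv"  # assumed export of the table's primary keys
    DST = "new_records.csv"     # file to feed to Toad's import wizard

    # Collect the primary keys already present in the database table.
    with open(KEYS, newline="", encoding="utf-8") as f:
        existing = {row[0] for row in csv.reader(f) if row}

    with open(SRC, newline="", encoding="utf-8") as fin, \
         open(DST, "w", newline="", encoding="utf-8") as fout:
        reader = csv.reader(fin)
        writer = csv.writer(fout)
        writer.writerow(next(reader))           # copy the header row
        for row in reader:
            if row and row[0] not in existing:  # keep only genuinely new keys
                writer.writerow(row)

Alternatively, if you can run SQL directly, loading everything into a staging table and using INSERT IGNORE or ON DUPLICATE KEY UPDATE would skip the duplicates without any file editing.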