Importing ODS to phpMyAdmin error 1117: too many columns - mysql

So, I am currently testing a web application, and for that I need to import an Excel file into phpMyAdmin.
I need to import the file as an *.ods. To do that, I know I need to rename the file so that it matches the table name, and set the values in the first row to match the columns. However, whenever I try to import the file, I get an error 1117: too many columns, listing all the unnecessary empty columns in my ods file (F,G,H,I,J...).
Is there any way to remove those columns, or have them be ignored?

A lot of things can go wrong when you're importing a spreadsheet. If your boss highlighted row 70,000 the color "invisible" (yes kids, that's a color now), the row will stretch into infinity and give a too-many-columns error. Save as CSV and you delete all that mess, but then you have to make sure your delimiters are nice and neat or your fields will wander into their neighbors' columns.
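One way to clear out those phantom columns before import is a small script that drops trailing columns which are empty in every row. This is just a sketch of the idea (the sample data is made up), not a feature of phpMyAdmin itself:

```python
def trim_empty_columns(rows):
    """Drop trailing columns that are empty in every row.

    Spreadsheet exports often carry phantom columns (formatting on
    otherwise-blank cells stretches the sheet), which trips MySQL's
    "too many columns" check on import.
    """
    width = 0
    for row in rows:
        # Find the rightmost non-empty cell in this row.
        for i in range(len(row) - 1, -1, -1):
            if row[i].strip():
                width = max(width, i + 1)
                break
    return [row[:width] for row in rows]

# Example: columns C onward are empty in every row, so they are dropped.
raw = [["id", "name", "", "", ""],
       ["1", "alice", "", "", ""],
       ["2", "bob", "", "", ""]]
print(trim_empty_columns(raw))  # [['id', 'name'], ['1', 'alice'], ['2', 'bob']]
```

Run it over the rows parsed from your CSV export, write the result back out, and the "too many columns" error should go away.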

Related

Program to help split and manage 2,000 column excel

I am building a web application that will run off of data that is produced for the public by a governmental agency. The issue is that the CSV file that houses the data I need is a 2,000-column beast of a file. The file is what it is; I need to find the best way to take it and modify it. I know I need to break this data up into much smaller tables within MySQL, but I'm struggling with the best way to do this. I need to make this as easy as possible to replicate next year when the data file is produced again (and every year after).
I've searched for programs to help, and everything I've seen deals with a huge number of rows, not columns. Has anyone else dealt with this problem before? Any ideas? I've spent the last week color-coding columns in Excel and moving data to new tabs, but this is time-consuming, will be super difficult to replicate, and I worry it leaves me open to copy-and-paste errors. I'm at a complete loss here!
Thank you in advance!
I suggest that you use formulas in Excel to give every column an automatic name: "column1", "column2", "column3", etc.
After that, import the entire CSV file into MySQL.
Decide which columns you want to group together into separate tables. This is the longest step, and no program can help you manage this part.
Query your massive SQL table to get just the columns you want for each group. Export these queries to CSV, then import them as new tables in your database.
At the end, if you want, query all the columns you didn't put into separate groups. Make this a new table in the database and delete the original table to save storage space.
Does this government CSV file get updated and republished in the same format every time? If so, you'll want to write a script to do all of the above automatically.
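The splitting step lends itself to a script once the column groups are decided. A minimal sketch (the filenames and column groupings here are illustrative; the manual step of choosing the groups still applies):

```python
import csv

def split_csv_by_groups(src_path, groups):
    """Split one wide CSV into several narrow ones.

    `groups` maps an output filename to the list of column names
    (taken from the header row) that belong in that table.
    """
    with open(src_path, newline="") as f:
        rows = list(csv.DictReader(f))
    for out_path, cols in groups.items():
        with open(out_path, "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=cols)
            writer.writeheader()
            for row in rows:
                # Copy only this group's columns into the output file.
                writer.writerow({c: row[c] for c in cols})
```

Rerunning the same script against next year's file then reproduces the whole split, as long as the agency keeps the column names stable.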

MySQL - importing a CSV file

I am trying to import a large excel file containing text, values and links to websites etc into MySQL.
I have saved my .xlsx file as a .csv file and am trying to import it using the "table data import" tool. When I reach the screen where it asks for the encoding, I see all the correct column headings, but only five out of 15 of my rows. I think this must be due to an unrecognized character being present in the 6th row. However, I do not know how to find what this character might be.
Also, if I select different encodings, then my 6th row appears but not the rest.
So, if anyone can help me to work out which characters are causing this error, I would be very grateful.
Thanks,
Sarah
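One way to hunt down the offending character is to scan the raw bytes of the file and report every line that fails to decode in a given encoding. A sketch (the path and encoding are whatever your file actually uses):

```python
def find_bad_lines(path, encoding="utf-8"):
    """Report (line number, byte offset, offending byte) for every line
    that fails to decode in the given encoding."""
    bad = []
    with open(path, "rb") as f:
        for lineno, raw in enumerate(f, start=1):
            try:
                raw.decode(encoding)
            except UnicodeDecodeError as e:
                # e.start is the byte offset of the first bad byte.
                bad.append((lineno, e.start, raw[e.start:e.start + 1]))
    return bad
```

Once you know the byte and the line, you can fix the cell in the spreadsheet or re-export the CSV in an encoding that covers it (often the issue is a Latin-1 byte like `0xE9` inside a file the importer expects to be UTF-8).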

How to properly format csv data from my sql workbench

Hello, I am trying to export data from my remote database using MySQL Workbench.
I have been able to export successfully, but the records are not properly formatted into their right columns.
Please, is there any way to properly place the text in their columns?
Find below a screenshot.
In the above file there are two fields, insured name and registration number.
They are jumbled together.
Is there a way I can properly format the output?
Thanks
Since it worked... I'll post it as an answer :)
Both the export process and the import process need to have their column delimiters match. CSV normally uses commas, but tabs (\t) are also common. When exporting, look at the various properties during the export process; I'm betting you can find an option to change the character.
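The jumbling is easy to reproduce: read a tab-delimited export with a reader that assumes commas and both fields land in one column. A small sketch with made-up data:

```python
import csv
import io

# Two fields exported with a tab delimiter; the reader's `delimiter`
# must match what the exporter used, or both values end up jumbled
# into a single column.
tab_data = "insured_name\tregistration_number\nAcme Ltd\tAB-123\n"

wrong = list(csv.reader(io.StringIO(tab_data)))             # assumes commas
right = list(csv.reader(io.StringIO(tab_data), delimiter="\t"))

print(wrong[1])  # ['Acme Ltd\tAB-123']  -- one jumbled field
print(right[1])  # ['Acme Ltd', 'AB-123'] -- two clean fields
```

The same principle applies inside Workbench: whatever delimiter the export dialog wrote is the one the import side has to be told about.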

Import csv file into an ms access table- Data ignored [duplicate]

Access is truncating the data in a couple of Memo fields when I append an Excel file. The field in the Access table is already set as a Memo type. I believe the problem is that I do not have any entries in the first few rows of some of the memo fields. Access is assuming the data is a text field, even though I have already set it as a Memo type.
I have tried appending as a CSV. Did not work.
I have put dummy data in the first row that exceeds the 255 character limit and the data is not truncated if I do that.
I do not want to have to put dummy data in every time I have to import an Excel file. This is a process that will be completed at least biweekly, maybe more frequent. I would like to set up an easy way to import the data for future employees that work with the same database. Any ideas?
Update: Even with dummy data in the first couple of rows, Access is truncating the data for 3 out of the 10 Memo fields when I import the Excel file (the character length of the dummy data is 785). Now I am really at a loss for ideas.
It has been a while, but I was having the same issues as you.
After much digging, I found that the wonderful world of microsoft explains:
To avoid errors during importing, ensure that each source column contains the same type of data in every row. Access scans the first eight source rows to determine the data type of the fields in the table. We highly recommend that you ensure that the first eight source rows do not mix values of different data types in any of the columns. Otherwise, Access might not assign the correct data type to the column.
Apparently, this means that when appending an Excel file to an existing table, even when columns are formatted and saved as Memo fields, if all of the first 8 rows in the Excel file are less than 256 characters, Access assumes you actually meant to specify text, thus truncating the remaining rows after 255 characters. I have performed several tests placing "dummy" rows within the top 8 rows, and each triggered the import of more than 255 characters.
Now, if you import to a new table, the wizard allows you to pick all of the formatting options.
Importing to a new table is convenient if you are okay with overwriting all of the data already in the table. However, if you truly need to append, I would suggest importing to a temporary table, then appending from there. An easy way to do this is to save an import and then execute it from VBA, like Elliot_et_al wanted to do. You could then also run the append query in VBA as well. If you set up your tables correctly you may be able to get away with
INSERT INTO [MyTable]
SELECT [MyTable_temp].*
FROM [MyTable_temp];
For what it's worth... I ran into a similar problem with Access 2013: it was truncating fields to 255 characters on import from XLS, even when the Import Wizard selected LONG TEXT as the field type, and even when I had fields with > 255 characters in the first few rows.
A colleague suggested that I link the spreadsheet instead of importing to a new table, and the issue went away. I also created a new table based on the linked one, and all is good.
EDITED TO ADD: In Access 2013, if you've already imported the XLS file into Access and cannot go back to it to try to link first, try this instead:
Go to Design View of the table, go to Field Properties at the bottom of that screen and set the Long Text "Text Format" to "Rich Text". Just today, I found that this saved me from having to recreate a table that I'd imported from excel months ago and found that even though I had the "Notes" column set to Long Text, it was still truncating text that I was manually entering in to 255 characters regardless. Switching to Rich Text made this text visible.
I use excel to communicate with external partners and capture reports from them into an access database. I've found the best way to do this is to insert a "dummy" first row into the worksheet that contains greater than 255 characters in any given column where the user-populated data is likely to exceed 255 characters.
In this way when I import the data it always imports all the text, and then I can simply delete the "dummy" row from the database table.
I frequently use an "import template" workbook that I link to from my access database. I set the template page to be formatted as a table before linking (so that the import contains all data without having the specify the range each time), and make the first "dummy" row hidden in there.
In this way I can simply copy and paste the data into the import template and then run a database query to import (and if necessary, transform) the data into the database, with a second query to delete the "dummy" record afterwards.
Hope this helps..?
Excel and Access are quirky. Apparently, appending Excel or CSV data to the end of an existing Access table which has the same properties of Long Text is an issue: appending data will default all Long Text to Short Text. The workaround was to output the data to Excel, append the data into one table, then import it as a new table in Access. Access has a problem treating appended data as Short Text instead of Long Text regardless of what you do.
Do make sure, when using the import wizard, to change the properties of the column to Long Text.
I hope this helps.
I faced the same issue in MS Access 2013. When I imported an Excel sheet with one of the column texts greater than 255 characters, it was truncating. I did a lot of research and finally found a workaround. Somehow, Access determines the size of the text based on the first record's column text length and fixes that length for the subsequent records. If its length is < 255, Access automatically limits further records to 255 characters, or whatever the first record's column length is. I ensured the first record had the maximum length of all the records' text column (sorted), then imported, and it worked well for me.
I had the same exact problem with Access 2010. I found two different workarounds after finding out Access looks at the first 25 records to determine the type of data in each column when importing.
Sorted the imported records by length of column in descending order. This means records with more than 255 characters in some column will be among the first 25 records. Then, Access was able to import those records without truncating.
Created a link table specifying the column data type as memo, and then appended to the table.
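The first workaround is easy to script before import. A sketch, assuming the long text lives in a column named `notes` (the field name is illustrative):

```python
def sort_rows_by_field_length(rows, field):
    """Return rows sorted so the longest values of `field` come first.

    Putting the >255-character rows among the first records the import
    scans nudges the type-guessing toward Memo/Long Text instead of
    truncated Short Text.
    """
    return sorted(rows, key=lambda r: len(r.get(field, "")), reverse=True)

rows = [{"notes": "short"}, {"notes": "x" * 300}]
ordered = sort_rows_by_field_length(rows, "notes")
print(len(ordered[0]["notes"]))  # 300 -- the long row now comes first
```

Write the reordered rows back to CSV and import that file instead of the original.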
I've had luck in the past with Rich Text solution offered above as well as using "dummy rows" as the first record imported. Thank you for those! However, today I think I've come across a more efficient/consistent solution for imports you'll repeat many times. I tried this in Access 2007.
Use the import wizard as if you're importing the data to a new table. Go through all the screens setting your specifications. Most important, check or specify the data type for each field in the tedious Field Options / Data Type area (for my recent text file, this was the 3rd screen of the Import Text Wizard)--be sure to specify your Memo fields here. (Don't worry, you'll only have to do this once!)
When you arrive at the final "That's all the info the wizard needs..." screen, look for the "Advanced..." button on the lower left. This brings up a screen summarizing everything you just did. Look for "Save as..." on the right. Save these specs with a helpful name. (You can confirm you saved your specs by clicking "Specs..." directly below.) Click "Okay" to leave the advanced screen.
You can now cancel out of the wizard if you don't actually need to create a new table. Next--and this is what you can do every time from now on to avoid truncations--go to the normal import wizard with "Append a copy of the records to the table..." In the wizard, you should see that same "Advanced..." button. Open it, click "Specs...", and double-click your saved specification. Say "OK" to exit "Advanced," and complete the wizard. This should tell Access to keep your memo fields as memo fields!
When importing CSVs to existing tables, I find I need to go through a couple of the normal wizard screens (e.g. specify the Text Qualifier) before going to the "Advanced" screen. Not sure why this makes it happy, just FYI.
I hope this helps someone else who has struggled with Field Truncation import errors like me!
In many cases you can just change the text format of the Memo field from plain text to Rich Text; now if you open the table data you can see all the imported text.

ACCESS: Truncation error when appending CSV data to tables?

I am currently experiencing difficulties when trying to append data to existing tables.
I have about 100 CSV files that I would like to create a single table from; the files have different column structures, but this isn't really an issue, as the associated field names are in the first row of each file.
First, I create a new table from one of the files indicating that my field names are in the first row. I change the particular fields that have more than 256 characters to memo fields and import the data.
I then add to the table the fields that are missing.
Now, when I try to append more data, I again select that my field names are in the first row, but now I receive a truncation error for data that is destined for the memo fields.
Why is this error occurring? Is there a workaround for this?
edit
Here is an update regarding what I've attempted to solve the problem:
Importing and appending tables will not work unless they have the exact same structure. Moreover, you cannot create a Master table with all fields and properties set, then append all tables to the master. You still receive truncation errors.
I took CodeSlave's advice and attempted to upload the table, set the fields that I needed to be Memo fields, and then append the table. This worked, but again, the memo fields are not necessarily in the same order in every data file, and I have 1200 data files to import into 24 tables. Importing the data table by table is just NOT an option for this many tables.
I expect what you are experiencing is a mismatch between the source file (CSV) and the destination table (MS Access).
MS Access will make some guesses about what the field types are in your CSV file when you are doing the import. However, it's not perfect. Maybe it's seeing a string as a memo, or a float as a real. It's impossible for me to know without seeing the data.
What I would normally do, is:
Import the second CSV into its own (temporary) table
Clean up the second table
Then use an SQL query to append those records from the second table to the first table.
Delete the second table
(repeat for each CSV file you are loading).
If I knew ahead of time that every CSV file was already identical in structure, I'd be inclined to instead concatenate them all together into one, and only have to do the import/clean-up once.
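The temp-table workflow above can be sketched end to end. This uses SQLite in place of Access purely for illustration; the table names and the "clean up" step (here just trimming whitespace) are made up:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE master (name TEXT, notes TEXT)")
conn.execute("CREATE TABLE staging (name TEXT, notes TEXT)")

# Step 1: load the CSV's rows into the scratch table.
rows = [("alice", "first note"), ("bob", "  second note  ")]
conn.executemany("INSERT INTO staging VALUES (?, ?)", rows)

# Step 2: clean up the scratch table.
conn.execute("UPDATE staging SET notes = TRIM(notes)")

# Step 3: append the cleaned rows to the master table.
conn.execute("INSERT INTO master SELECT * FROM staging")

# Step 4: drop the scratch table, then repeat for the next file.
conn.execute("DROP TABLE staging")
print(conn.execute("SELECT * FROM master").fetchall())
```

Because the cleanup happens in the staging table, a bad file never touches the master table until its types and values are sorted out.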
I had a very similar problem: trying to import a CSV file with large text fields (>255 chars) into an existing table. I declared the fields as Memo, but they were still being truncated.
Solution: start an import to link a table and then click on the Advanced button. Create a link specification which defines the relevant fields as memo fields and then save the link specification. Then cancel the import. Do another import this time the one you want which appends to an existing table. Click on the Advanced button again and select the link specification you just created. Click on finish and the data should be imported correctly without truncation.
I was having this problem, but noticed it always happened only to the first row. So by inserting a blank row in the CSV it would import perfectly; then you just need to remove the blank row from the Access table.
Cheers,
Grae Hunter
Note: I'm using Office 2010