I am using the built-in Import in the WordPress dashboard to get CSV data from a file created by Excel into a table. It creates a new table and enters the data as expected, but one field that it automatically defined as varchar(8) should really be varchar(10), so some characters are dropped. I can't find any settings that would fix the problem, and even if I create a new table with the correct definition, I can't see how to force the import into that new table. Any advice would be appreciated. Thanks.
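One workaround, if the table has already been created with the column too narrow, is to widen it after the fact and then re-run the import. A minimal sketch, assuming a table and column along these lines (the names below are placeholders, not the ones the wizard actually generates):

-- Widen the auto-created column so values up to 10 characters are no longer truncated.
-- "imported_data" and "product_code" are placeholder names; use whatever the import created.
ALTER TABLE imported_data
  MODIFY COLUMN product_code VARCHAR(10);

After that, truncating the table and importing the CSV again should keep the full values.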
Related
Every week I import a table of our clients into MySQL, and every week a random client is missing from the import. The same client might have been imported without problems the week before, and the next week imported perfectly fine again, yet MySQL always omits one client, which makes my job very frustrating: I have to find out who it was and fix it manually.
Has anyone run into similar issues in the past or have any idea why they might be happening?
I am using the MySQL table import wizard. Normally I don't have issues, but this specific client table export, which I get from another platform, has this problem every single week.
I would do the import from Excel into a TEMPORARY table (i.e. not the final table), but one with the same structure. Then, once the import is done and all names and columns are parsed, you can LEFT JOIN from the LIVE table to the TEMP table to find entries that exist in LIVE but not in TEMP. That will help find the missing import. Then, since the data is already in a valid table structure, you can pull all new values from the temp table into production, including adding any rows that did not yet exist, as sketched below.
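A minimal sketch of that approach, assuming a live table called clients keyed on client_id (all names here are placeholders for your actual schema):

-- 1) Stage the weekly file in a temp table with the same structure as the live table.
CREATE TEMPORARY TABLE clients_temp LIKE clients;
-- ...load the CSV into clients_temp here (import wizard or LOAD DATA INFILE)...

-- 2) Rows that exist in LIVE but are missing from this week's import (the "lost" client).
SELECT c.*
FROM clients AS c
LEFT JOIN clients_temp AS t ON t.client_id = c.client_id
WHERE t.client_id IS NULL;

-- 3) Pull genuinely new rows from the temp table into the live table.
INSERT INTO clients
SELECT t.*
FROM clients_temp AS t
LEFT JOIN clients AS c ON c.client_id = t.client_id
WHERE c.client_id IS NULL;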
If you exported to Excel using the Export wizard, it creates a sheet with the name of the table, but it also creates a named range with the same name. The named range contains only the region that held data at export time, so when you add new rows, the named range is not extended. When you later import the data, you probably chose the name without the $ character on the end. The name without the $ refers to the named range, which probably doesn't contain your newly inserted rows. To load all the data, choose the name with the $ character, or extend the named range to cover all the data in the sheet.
I want to load data into an already created table called 'Deporte'. I follow the process, and when I get to the end, Workbench shows me:
table Deporte has been used, 0 records imported
The table I want to load data into has more fields than my CSV file, but during the import process I can select the pairs of fields I want to load.
However, if I create a new table from my data, the import process completes correctly. These are the captures of the procedure I want:
Thanks to all.
The problem was that my URL fields were not long enough.
I have to import CSV files for different clients into my system, some comma-separated [,], some pipe-separated [|], etc. They are always very big files.
While importing, I need to filter out duplicate records; duplicates should not be inserted into the DB.
The problem is that the columns can differ from client to client. I have a database for every client's CSV data, which I import every day, week, or month depending on the client's needs. I keep the data from every import so we can generate reports on what we received in each CSV file; our system does further processing after the import.
Data structure example:
Client 1 database:
First_Name | Last_Name | Email | Phone | etc.
95% of the data is the same in every new CSV file. Some new records come in and some records are deleted from the CSV, so our system only needs to process the newly imported records.
Currently we import the data into a new table every time. We name the table with a timestamp so we can keep track of each import. It is an expensive process, and it duplicates records and tables.
I'm thinking about a solution and I need your suggestions on it.
Keep one table, and every time I import CSV file data into it, alter the existing table to add a new column named with the current date (byte or Boolean) and set it to true/false on import?
My other question is about the first time I import a CSV file. I need to write a script that does the following:
While importing CSV data, if the table already exists, my date logic applies; if the table does not exist, it should create one using the given 'client name' as the table name. The challenge is the columns: I don't know them in advance, so the script should create the columns from the CSV file.
If the table already exists, new records that come in should be inserted, and existing ones updated.
Is this doable in MySQL?
(I also have to do something similar for MSSQL, but right now I need a solution for MySQL.)
Please help me... I'm not good with MySQL :(
You can certainly do an insert-or-update statement when importing each record; see here:
https://dev.mysql.com/doc/refman/5.0/en/insert-on-duplicate.html
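For example, something along these lines (table and column names are illustrative, and it assumes a UNIQUE or PRIMARY KEY on email so duplicates can be detected):

-- Requires a UNIQUE or PRIMARY KEY on the column used to spot duplicates (email here).
INSERT INTO client1_data (first_name, last_name, email, phone)
VALUES ('Jane', 'Doe', 'jane@example.com', '555-0100')
ON DUPLICATE KEY UPDATE
  first_name = VALUES(first_name),
  last_name  = VALUES(last_name),
  phone      = VALUES(phone);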
I propose you create a script to dynamically create your table if it doesn't exist.
What language would you use to insert your CSV?
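Whatever language the script ends up in, it could issue something like the following before loading each file; the table name would come from the client name and the column list from the CSV header row, so everything below is a placeholder:

-- Create the per-client table only if it is not already there.
-- Column names and types are assumptions; in practice they would be derived from the CSV header.
CREATE TABLE IF NOT EXISTS client1_data (
  email      VARCHAR(255) NOT NULL,
  first_name VARCHAR(100),
  last_name  VARCHAR(100),
  phone      VARCHAR(30),
  UNIQUE KEY uq_email (email)
);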
Is there a way to import a CSV into a SQL table without having a previously constructed table? I know how to import a CSV into an existing table, but is there a way to create the table from the CSV?
You can do this using phpMyAdmin.
(In this method, the elements of the CSV file's first row are used as the column names for the SQL table.)
1) Select the database.
2) Go to the Import tab and select the CSV file.
3) (screenshot of the CSV import options)
4) After the above steps a new table will be created. If you want to change the table names instead of having table1, table2, select the table and go to the Operations tab :)
(phpMyAdmin 4.1.14)
I am no expert in MySQL, but I don't believe there is such an import process, and there might not be one in other database servers like Oracle, SQL Server, or PostgreSQL either. In fact, it may not be a desirable automation: a table should be user-defined and created to fit the database's relational model, with appropriate data types, indexes, and keys.
Almost all SQL dialects require setting up the database table beforehand. If not, how would the system know whether you intended an integer or a long, a double or a decimal, a TINYTEXT or a LONGTEXT, which fields are to be indexed or serve as the primary key, and so on?
You might argue that MS Access allows a CSV import with an optional table name. However, the ribbon wizard walks the user through setting up the field types, primary key, and table name. And going the non-wizard automation route, the DoCmd.TransferText method requires a table name when using the acImportDelim argument.
So your best process in MySQL may be LOAD DATA INFILE to run a bulk import of an external CSV into an existing table.
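A minimal sketch, assuming the target table already exists and the file is comma separated with one header row (the path, table, and column names are placeholders):

-- Bulk-load a CSV into an existing table; adjust the delimiters to match the file.
LOAD DATA INFILE '/path/to/clients.csv'
INTO TABLE clients
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(first_name, last_name, email, phone);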
I already have a table in phpMyAdmin that contains user records. Each user has a unique admission number. I now want to add a new column to this table, and I am wondering how I can import data for this new column using just the admission number and the new data.
Is this possible? I have a CSV, but I can't work out the best way to import the data without overwriting any existing records.
Thanks.
As far as I can see, this is not possible directly: phpMyAdmin's import features work on whole rows only.
You could write a small PHP script that opens the CSV file using fgetcsv(), walks through every line, and creates an UPDATE statement for each record:
UPDATE tablename SET new_column = "new_value" WHERE admission_number = "number"
You can then either output the commands and copy and paste them, or execute them directly in the script.
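A rough sketch of such a script; the file name, connection details, table, and column names are assumptions, and it uses a PDO prepared statement rather than building SQL strings by hand:

<?php
// Read the CSV and run one UPDATE per row, keyed on the admission number.
$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8mb4', 'user', 'pass');
$stmt = $pdo->prepare(
    'UPDATE users SET new_column = :value WHERE admission_number = :number'
);

$handle = fopen('new_data.csv', 'r');
fgetcsv($handle); // skip the header row, if the file has one
while (($row = fgetcsv($handle)) !== false) {
    // Assumes column 0 holds the admission number and column 1 the new value.
    $stmt->execute([':number' => $row[0], ':value' => $row[1]]);
}
fclose($handle);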
If you want to do it using just CSV, here are the steps you could perform.
1) In a text editor, make a comma-separated list of all the column names for the final table (including your new column). This will be useful when importing the new data.
2) Add the new column to your table using phpMyAdmin.
3) Export the current table in CSV format and sort it by admission number in Excel.
4) In your new-data CSV, sort by admission number as well.
5) Copy the column over from your new data to your exported CSV and save it for re-import.
6) Back up your users table (export to CSV).
7) Truncate the contents of your table (Operations, Truncate).
8) Import your updated CSV.
Optional / recommended: when you import the CSV into phpMyAdmin, use the column names option to specify the columns you are using, separated by commas (no spaces).
Assumptions:
1. You are using a spreadsheet program such as Excel or OpenOffice to open your CSV files.
Any problems?
Truncate the table again and re-import the backup file.