I have 100+ XLSX files that I need to get into a MySQL Database. Each file is a bit different, so I've created a massive table that contains a field for each possible column header across all files. This way they can auto-map on import.
Using Navicat I can import the files one at a time, but I'm wondering if there is a way to import all of the files at once?
I'm thinking 'no', but if you're going to do this a lot, there are ways to automate it.
Export your Excel files as CSV files instead, and write a small script to transform them from Excel-style CSV to MySQL-style CSV.
Create tables for importing with the CSV engine. Put your files in place of the ones created by MySQL and flush tables. Now your data is ready to be read inside MySQL and copied over to more powerful storage engines.
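To make the CSV-engine idea concrete, here is a rough sketch with placeholder table and column names (note that CSV tables cannot have indexes and every column must be declared NOT NULL):

    -- Staging table backed by the CSV storage engine; MySQL creates
    -- an import_stage.CSV file in the database's data directory.
    CREATE TABLE import_stage (
        name  VARCHAR(100) NOT NULL,
        email VARCHAR(100) NOT NULL
    ) ENGINE=CSV;

    -- Overwrite import_stage.CSV with your exported file, then make
    -- MySQL reread it from disk:
    FLUSH TABLES;

    -- Copy the rows into a proper InnoDB table:
    INSERT INTO real_table (name, email)
    SELECT name, email FROM import_stage;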
Another way is to write a VBA script that exports the data in a format recognized by LOAD DATA INFILE and then load the files using MySQL.
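A minimal LOAD DATA INFILE sketch, assuming a tab-delimited export at /tmp/export.txt with a header row (the path and table name are placeholders):

    LOAD DATA INFILE '/tmp/export.txt'
    INTO TABLE my_big_table
    FIELDS TERMINATED BY '\t'
    LINES TERMINATED BY '\n'
    IGNORE 1 LINES;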
I have to load several CSVs into some tables with MySQL LOAD DATA INFILE, and I want to save the discarded records that could not be loaded (because of failed FKs, duplicates, etc.) to a discard file, as Oracle's SQL*Loader does.
Any suggestions?
Thanks!
You can import data easily using the MySQL Workbench import wizard with CSV files. It has an option to import all the data and create a new table. From there you can alter the tables later on, add any necessary indexes, or change data types as you go.
Another option is to use LOAD DATA statements as usual. Import the data into newly created tables without the foreign keys. You can target CSV columns to specific table columns as well; see https://dev.mysql.com/doc/refman/8.0/en/load-data.html
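For example, here is a sketch of targeting specific table columns, with placeholder file, table, and column names; the @skipped variable discards a CSV column you don't need, and @raw_date lets you transform a value before it is stored:

    LOAD DATA INFILE '/tmp/data.csv'
    INTO TABLE customers
    FIELDS TERMINATED BY ',' ENCLOSED BY '"'
    LINES TERMINATED BY '\n'
    IGNORE 1 LINES
    (name, @skipped, email, @raw_date)
    SET signup_date = STR_TO_DATE(@raw_date, '%m/%d/%Y');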
I've got a problem: I'm totally new to SQL and have to learn it during an internship. I had to import huge txt files into a database in phpMyAdmin (it took me forever, but I managed it with LOAD DATA INFILE). Now my task is to find a way to check whether the data in the tables is the same as the data in the given txt files.
Is there any way to do this?
Have you tried exporting the data through phpMyAdmin using a different file format instead of .sql? phpMyAdmin gives you several choices, including CSV and OpenOffice spreadsheets. That would make your comparison easier: you could open the export in Excel, sort the data, and compare much more quickly.
The best way to do this is to load, and then extract.
Compare your extract with the original file.
Another way would be to count the number of lines in both the table and the file, then extract a few lines and verify that they exist in both. This is less precise.
But this has nothing to do with SQL. It is just test logic.
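A rough sketch of the count check (the table name is a placeholder; compare the result against a line count of the original file, e.g. from wc -l):

    -- Row count to compare against the file's line count:
    SELECT COUNT(*) FROM imported_table;

    -- Spot-check a handful of rows to diff against the file by eye:
    SELECT * FROM imported_table ORDER BY id LIMIT 10;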
I am new to databases. I have a table and I need to export it and save its structure. I'm using MySQL Workbench. It is my first task and I have no idea; I know only a few things about databases.
Your question is a bit imprecise. What exactly do you want to export? The table structure + data for a later restore (if so, use a dump), or just the table data in a common format like CSV for further processing (if so, use the table data export wizard)?
A dump is what is usually used to store SQL data + structure in text files (conventionally tagged with an .sql extension). These contain Data Definition Language (DDL) constructs which define the metadata (e.g. CREATE TABLE) as well as Data Manipulation Language (DML) commands to manage the content (e.g. INSERT or DELETE). This structure serves well for copying content between servers and such, as it is what a database server speaks natively. In MySQL Workbench you can import and export such dumps via the Management tab -> Data Import/Restore.
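Schematically, a dump file is just a series of such statements; the names here are illustrative:

    -- DDL: recreates the table structure.
    CREATE TABLE customers (
        id   INT AUTO_INCREMENT PRIMARY KEY,
        name VARCHAR(100) NOT NULL
    );

    -- DML: restores the content.
    INSERT INTO customers (id, name) VALUES (1, 'Alice'), (2, 'Bob');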
For importing and exporting data via a common file format like CSV or JSON, use the table data import/export feature, reachable via the context menu for a given table.
This does, however, not preserve the structure of the table in a way that would allow it to be recreated automatically (as SQL statements do).
I have 2 separate SQL databases. They both have the same fields, but they are not attached and are completely separate files. One of the database files has a few hundred rows of data, and I want to copy a few of those rows into the other database file. Some people have said to use SQL statements to copy the data, but the databases are not linked in any way, so I am not sure how those statements would work. Is there no software where I can just select the correct rows and copy them over, or create a new database with the selected ones?
I hope this makes sense, thanks.
Regardless of the database platform you are using, there should be commands/tools that will allow you to perform bulk data imports and exports from/to a file (e.g. a CSV file). Try exporting the rows you wish to copy from the database on the first server into an intermediate file, copying that file to the second server, and then importing it into that database.
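In MySQL, for instance, that round trip could look roughly like the following (all table names and paths are placeholders):

    -- On the source database, dump just the rows you want:
    SELECT id, name, email
    INTO OUTFILE '/tmp/rows.csv'
    FIELDS TERMINATED BY ',' ENCLOSED BY '"'
    LINES TERMINATED BY '\n'
    FROM source_table
    WHERE id IN (1, 2, 3);

    -- Copy /tmp/rows.csv to the second server, then load it:
    LOAD DATA INFILE '/tmp/rows.csv'
    INTO TABLE target_table
    FIELDS TERMINATED BY ',' ENCLOSED BY '"'
    LINES TERMINATED BY '\n';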
I have an Excel sheet with some columns and I want to import them into a MySQL table. The problem is that there are more columns in the MySQL table than in the sheet (which is absolutely fine). What would be the easiest way to get the data into the right fields?
My solution would be to export to CSV and load it into MySQL via PHP, but there must be a simpler way.
The mysqlimport command-line tool has support for importing CSV files, and IIRC it supports mapping different CSV columns onto different columns in your table.
http://linux.die.net/man/1/mysqlimport
I realize that it's just a command-line wrapper around the LOAD DATA INFILE SQL statement, which can be used instead.
If you need to reorganize the data, you could just import the CSV flat into an equivalent staging table and, from there, do an INSERT ... SELECT, as sketched below.
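A sketch of that staging approach, with placeholder names; columns of the wider table that are not mentioned simply keep their defaults:

    -- Stage the CSV exactly as it is laid out in the sheet:
    CREATE TABLE sheet_stage (
        name  VARCHAR(100),
        email VARCHAR(100)
    );

    LOAD DATA INFILE '/tmp/sheet.csv'
    INTO TABLE sheet_stage
    FIELDS TERMINATED BY ','
    IGNORE 1 LINES;

    -- Map the staged data onto the wider target table:
    INSERT INTO wide_table (name, email)
    SELECT name, email FROM sheet_stage;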
You can use mysqlimport to import CSV data.
Although the mysqlimport solution is absolutely feasible, it can be cumbersome (e.g. NULL handling) if you have to import a lot of files or if you have to import them regularly. I sometimes use Toad® for MySQL, which is able to import XLS(X) files directly into MySQL. This is surely overkill if you only import some Excel data now and then, but it's an alternative, and it can be automated since Toad supports automation workflows.