I did some research on how to import XML data into MySQL, possibly with MySQL Workbench.
However, I was unable to find any easy tutorial on how to do that. I have 6 XML files, and all of them contain only data, no schema.
From what I understand, the process consists of two parts:
1. Making the table (this is the part that is unclear to me) - is there a way to make the table from the XML data file alone?
2. Importing the data into the MySQL table. I think I understand this one; it could be done by executing this query:
LOAD XML LOCAL INFILE '/pathtofile/file.xml'
INTO TABLE my_tablename(personal_number, firstname, ...);
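For what it's worth, the full form of this statement also accepts a ROWS IDENTIFIED BY clause when each record is wrapped in an element other than <row>; the element name below is just an assumption about the XML layout:

LOAD XML LOCAL INFILE '/pathtofile/file.xml'
INTO TABLE my_tablename
ROWS IDENTIFIED BY '<person>'   -- hypothetical element that wraps one record
(personal_number, firstname, ...);   -- same column list as above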
I've done something like this before, where I read the XML files into a MySQL database with the column data type set to BLOB.
Related
I am new to databases. I have a table and I need to export it and save its structure. I'm using MySQL Workbench. It is my first task, and I know only a few things about databases.
Your question is a bit imprecise. What exactly do you want to export? The table structure plus data for a later restore (if so, use a dump), or just the table data in a common format like CSV for further processing (if so, use the table data export wizard)?
A dump is what is usually used to store SQL data + structure in text files (conventionally tagged with an .sql extension). These contain Data Definition Language (DDL) constructs which define the metadata (e.g. CREATE TABLE) as well as Data Manipulation Language (DML) commands to manage the content (e.g. INSERT or DELETE). This structure serves well for copying content between servers and the like, as it is what a database server speaks natively. In MySQL Workbench you can import and export such dumps via the Management tab -> Data Import/Restore.
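To make the DDL/DML distinction concrete, a dump file is essentially just SQL text along these lines (the table and values here are invented purely for illustration):

-- DDL: defines the table structure
CREATE TABLE customers (
  id INT NOT NULL AUTO_INCREMENT PRIMARY KEY,
  name VARCHAR(100) NOT NULL
);

-- DML: restores the content
INSERT INTO customers (id, name) VALUES (1, 'Alice'), (2, 'Bob');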
For importing and exporting data via a common file format like CSV or JSON, use the table data import/export feature, reachable via the context menu for a given table.
This does, however, not preserve the structure of the table in a way that would allow it to be recreated automatically (as SQL statements do).
I need to fill several tables from CSV files. I tried to use a loop that does an INSERT for each row, but a file with 65,000 records takes me more than 20 minutes.
I want to use the MySQL command LOAD DATA LOCAL INFILE, but I received this message:
LOAD DATA LOCAL INFILE forbidden in C:\xampp\htdocs\myProject\apps\backend\modules\member\actions\actions.class.php on line 112
After a little research, I understand that I need to set one of PDO's security parameters (PDO::MYSQL_ATTR_LOCAL_INFILE) to true.
In Symfony2 you would change it in your app's config.yml, but I can't find the equivalent in symfony 1.4.
Let me try to understand the question (or questions?!).
If you need to optimize the INSERT queries, you should probably batch them into a single INSERT query (or a few), but definitely not one per row. Apart from that, INSERT in MySQL will always be slow for a large amount of data; the speed also depends on the indexes, storage engine, and schema structure of the database.
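To illustrate the batching idea, a multi-row INSERT sends many records in one statement and one round trip (the table and column names here are hypothetical):

INSERT INTO members (personal_number, firstname)
VALUES (1001, 'Anna'),
       (1002, 'Ben'),
       (1003, 'Carla');   -- one statement instead of three single-row INSERTs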
About the second question, take a look here, maybe it will help.
Currently, I have a CSV file with data in it. I want to turn it into a SQL table so I can run SQL queries on it. I want the table to be within a web-based database that others in my organization can also access. What's the easiest way to go from CSV file to this end result? I would appreciate insight on setting up the database and table, giving others access, and getting the data inside. Preferably PostgreSQL, but MySQL is fine too.
Creating the table depends on the number of columns you have. If you have only a few, do it manually:
CREATE TABLE <table name> (<column name> <column type, e.g. INT or VARCHAR(100)>, <etc.>);
If you have many columns, you can open the CSV file in Excel and get 'SQL Converter for Excel', which will build a CREATE statement for you using your column headings (and auto-detect column types too).
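As a concrete example for a small CSV with three columns (the table and column names are made up for illustration), the manual statement might look like:

CREATE TABLE cities (
    id INT NOT NULL,
    name VARCHAR(100),
    population INT
);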
Loading data from a CSV is also pretty straightforward:
LOAD DATA INFILE '<filepath, e.g. C:/Users/<username>/Desktop/test.csv>'
INTO TABLE <table name>
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;   -- only include this line if the CSV has a header row with column names
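Filled in with concrete (hypothetical) values for the cities table sketched above, that template would read:

LOAD DATA INFILE 'C:/Users/alice/Desktop/test.csv'
INTO TABLE cities
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;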
As for a web-based solution: https://cloud.google.com/products/cloud-sql/
That's a relatively open-ended question. A couple of noteworthy pointers off the top of my head:
MySQL allows you to store your data in different formats, one of them being CSV. That's a very straightforward solution if you're happy with it and don't mind a few limitations (see http://dev.mysql.com/doc/refman/5.0/en/csv-storage-engine.html).
Otherwise you can import your data into a table with a full-featured engine (see other answer(s) for details).
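Regarding the CSV storage engine mentioned above, a minimal sketch of such a table looks roughly like this (table and column names are placeholders; among the limitations, the engine requires all columns to be NOT NULL and supports no indexes):

CREATE TABLE csv_data (
    id INT NOT NULL,
    name VARCHAR(100) NOT NULL
) ENGINE=CSV;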
If you're happy with PostgreSQL and are looking for a fully web-based solution, have a look at Heroku.
There are a great many ways to make your data available through web services without accessing the back-end data store directly. Have a look at REST and SOAP for instance.
HTH
Working with MySQL, are stored procedures suitable for populating 6 different tables belonging to a database?
Data is listed in a CSV file.
By the way, should I have 6 different CSV files or just a single one?
My idea is that I'd like to avoid the LOAD DATA LOCAL INFILE command.
Thanks very much
Mauro
I think this is not a good idea, because it would be platform dependent and you can't check important things from a stored procedure. I'd consider bash scripting instead.
I'm working on a large project and haven't had to do what I need help with before. I have a CSV file containing a large amount of data, namely all of the cities, towns, and suburbs in Australia. I need to convert the CSV file to SQL for MySQL and then import it into the database.
What would be the best way to achieve this?
Use LOAD DATA INFILE or the equivalent command-line tool mysqlimport.
These are easy to use for loading CSV data, and this method runs 20x faster than importing rows one at a time with SQL.
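A rough sketch of what that could look like for this particular file (the table name, columns, and path are assumptions about the CSV's layout, not taken from the question):

CREATE TABLE au_localities (
    name VARCHAR(100),
    state VARCHAR(3),
    postcode VARCHAR(4)
);

LOAD DATA INFILE '/path/to/australian_localities.csv'
INTO TABLE au_localities
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;   -- skip the header row, if the file has one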