Load XML data into SQL database using phpMyAdmin - mysql

I know this is a really basic question but I am struggling with my first import of data from an XML file. I have created the table "Regions", which has just two columns, ID and Name. The XML file contains the same column names.
In order to bulk import the data, I am using the following SQL command:
LOAD XML LOCAL INFILE 'C:\Users\Dell\Desktop\regions.xml'
INTO TABLE Regions (ID, Name)
but I am getting the error #1148 - The used command is not allowed with this MySQL version
Now, having researched the internet, it seems that allowing this command requires a change in one of the server configuration files, but my service provider doesn't allow me access to it. Is there an alternative way to write the SQL code that does exactly the same thing as the code above, which is basically just importing the data from an XML file?
Many thanks

Since LOAD DATA INFILE isn't enabled for you, it appears you have only one option left, and that's to create a set of INSERT statements, one for each row. If you convert your XML file to CSV using Excel, that's an easy step. Assuming you have rows of data like this:
A | B
--|---------
1 | Region 1
2 | Region 2
I would create a formula like this in column C
=CONCATENATE("INSERT INTO Regions(ID,Name) VALUES(",A1,",'",B1,"');")
This will result in INSERT INTO Regions(ID,Name) VALUES(1,'Region 1'); for your first row. Fill this down to the last row of your spreadsheet, then select all the insert statements and copy them into a query text box inside phpMyAdmin, and you should be able to insert your values.
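Filled down over the two example rows above, the batch you would paste is:
INSERT INTO Regions(ID,Name) VALUES(1,'Region 1');
INSERT INTO Regions(ID,Name) VALUES(2,'Region 2');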
I've used this method many times when I needed to import data into a database.


Google BigQuery bulk load data into table

I think that what I want to do is not feasible at the moment but want to clarify.
I have a bucket, say bucketA, with files served to the public, and a bucket, say bucketB, where the access logs of bucketA are stored in a specific CSV format.
What I want to do is run SQL queries over these access logs. The problem is that the logs are stored in different CSVs (one per hour, I think). I tried to import them through the BigQuery UI, but it seems there is a one-to-one CSV-to-table mapping. When you define the input location, the placeholder and documentation ask you to put a gs://<bucket_name>/<path_to_input_file>.
Based on the above, my question is: Is it possible to upload all files in a bucket to a single BigQuery table, with something like an "*" asterisk operator?
Once the table is constructed, what happens when more files with data get stored in the bucket? Do I need to re-run, or is there a scheduler?
Based on the above, my question is: Is it possible to upload all files in a bucket to a single BigQuery table, with something like an "*" asterisk operator?
You can query them directly in GCS (as a federated source) or load them all into a native table, using * in both cases:
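For example, a single bq load with a wildcard URI picks up every matching file into one native table (the dataset, table, and path below are assumptions; --autodetect infers the schema from the CSV headers):
bq load --source_format=CSV --autodetect mydataset.access_logs "gs://bucketB/logs/*"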
Once the table is constructed, what happens when more files with data get stored in the bucket? Do I need to re-run, or is there a scheduler?
If you leave it as an external table, then each time you query, BigQuery will scan all the files, so you'll pick up new files/data automatically. If you load it into a native table, then you'll need to schedule a job yourself to append each new file to your table.
Using the BigQuery web UI, I have created a new table plus some initial data with the standard CSV upload method.
For quick testing, how can I use the BigQuery web UI to insert more new data into the existing table?
I realized I CANNOT copy and paste multiple insert statements in the Query editor textbox.
INSERT INTO dataset.myschema VALUES ('new value1', 'more value1');
INSERT INTO dataset.myschema VALUES ('new value2', 'more value2');
Wow, then it would be tedious to insert new rows of data one by one.
Luckily, BigQuery supports INSERT statements that use the VALUES syntax to insert multiple rows:
INSERT INTO dataset.myschema VALUES ('new value1', 'more value1'),
('new value2', 'more value2');

Bulk CSV file import in MySQL, removing duplicates while importing dynamic columns from CSV

I have to import CSV files for different clients into my system, some separated with [,], some with [|], etc. They are always very big files.
While importing I need to filter duplicate records; duplicate records should not be inserted into the DB.
The problem is that the columns can be different for different clients. I have a database for every client for the CSV data, which I import every day, week or month depending on the client's needs. I keep the data for every import so we can generate reports on what data we receive in each CSV file; our system does processing after the import.
Data structure example:
Client 1 database:
First_Name | Last_Name | Email | Phone | Etc…
95% of the data is the same in every new CSV file. Some new records come in and some records are deleted from the CSV, so our system only processes the newly imported records.
Currently we import the data into a new table every time. We keep the table name with a timestamp so we can keep track of the imports. It is an expensive process that duplicates records and tables.
I'm thinking about a solution and I need your suggestions on it.
Keep one table per client: every time I import CSV file data into the table, I'll alter the existing table to add a new column whose name will be the current date (byte or Boolean), and set it to true/false on import?
My other question is about the first time I import a CSV file. I need to write a script:
While importing CSV data, if the table already exists then my date logic will work; if the table does not exist, it should create the table using the given or provided "client name" as the table name. The challenge is the columns: I don't know them in advance, so it should create the columns from the CSV file.
If the table already exists and some new records come in, it should insert them, else update.
Is this doable in MySQL?
(Although I will have to do something for MSSQL as well, right now I need a solution for MySQL.)
Please help me... I'm not good at MySQL :(
You can certainly do an insert-or-update (INSERT ... ON DUPLICATE KEY UPDATE) statement when importing each record.
See here:
https://dev.mysql.com/doc/refman/5.0/en/insert-on-duplicate.html
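As a hedged sketch using the column names from the question (the client1 table name and the UNIQUE index on Email are assumptions):
-- Requires a UNIQUE index on whichever column(s) define a duplicate,
-- assumed here to be Email
INSERT INTO client1 (First_Name, Last_Name, Email, Phone)
VALUES ('Jane', 'Doe', 'jane@example.com', '555-0100')
ON DUPLICATE KEY UPDATE
  First_Name = VALUES(First_Name),
  Last_Name = VALUES(Last_Name),
  Phone = VALUES(Phone);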
I propose you create a script to dynamically create your table if it doesn't exist.
What language would you use to insert your CSV?

MySQL select from table left join with CSV export

I have tables that are on different mysql instances. I want to export some data as csv from a mysql instance, and perform a left join on a table with the exported csv data. How can I achieve this?
Quite surprisingly, that is possible with MySQL, though there are several steps that you need to go through.
1. First create a template table using the CSV engine and the desired table layout. This is the table into which you will import your CSV file. For example: CREATE TABLE yourcsvtable (field1 INT NOT NULL, field2 INT NOT NULL) ENGINE=CSV. Please note that NULL values are not supported by the CSV engine.
2. Perform your SELECT to extract the CSV file, e.g. SELECT * FROM anothertable INTO OUTFILE 'temp.csv' FIELDS TERMINATED BY ',';
3. Copy temp.csv into your target MySQL data directory as yourcsvtable.CSV. The location and exact name of this file depend on your MySQL setup. You cannot perform the SELECT in step 2 directly into this file as it is already open - you need to handle this in your script.
4. Use FLUSH TABLE yourcsvtable; to reload/import the CSV table.
5. Now you can execute your query against the CSV file as expected.
Depending on your data, you may need to ensure that the data is correctly enclosed by quotation marks or escaped - this needs to be taken into account in step 2.
The CSV file can be created by MySQL on some other server or by some other application, as long as it is well-formed.
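Putting the steps together, a minimal sketch (the table layout, file names, and the local table in the final join are assumptions; step 3 happens outside MySQL):
-- Step 1: staging table; the CSV engine does not support NULL,
-- so every column must be NOT NULL
CREATE TABLE yourcsvtable (field1 INT NOT NULL, field2 INT NOT NULL) ENGINE=CSV;
-- Step 2: on the source instance, export the rows
SELECT field1, field2 FROM anothertable
INTO OUTFILE 'temp.csv' FIELDS TERMINATED BY ',';
-- Step 3 (outside MySQL): copy temp.csv over yourcsvtable.CSV
-- in the target server's data directory
-- Step 4: re-read the file, then join against it
FLUSH TABLE yourcsvtable;
SELECT t.* FROM some_local_table t
LEFT JOIN yourcsvtable c ON c.field1 = t.id;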
If you export it as CSV, it's no longer SQL, it's just plain row data. Suggest you export as SQL, and import into the second database.

Can I import tab-separated files into MySQL without creating database tables first?

As the title says: I've got a bunch of tab-separated text files containing data.
I know that if I use 'CREATE TABLE' statements to set up all the tables manually, I can then import them into the waiting tables, using 'load data' or 'mysqlimport'.
But is there any way in MySQL to create tables automatically based on the tab files? Seems like there ought to be. (I know that MySQL might have to guess the data type of each column, but you could specify that in the first row of the tab files.)
No, there isn't. You need to CREATE a TABLE first in any case.
Automatically creating tables and guessing field types is not part of the DBMS's job. That is a task best left to an external tool or application (that then creates the necessary CREATE statements).
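As a rough illustration of such an external tool, here is a minimal PHP sketch; the file name and the convention of column names in the first row and MySQL types in the second are assumptions based on the question's suggestion:
<?php
// Sketch: emit a CREATE TABLE statement from a tab-separated file whose
// first row holds column names and second row holds MySQL types,
// e.g. "id<TAB>name" followed by "INT<TAB>VARCHAR(255)". File name is hypothetical.
$lines = file('mydata.tsv', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
$names = explode("\t", $lines[0]);
$types = explode("\t", $lines[1]);
$cols = array();
foreach ($names as $i => $name) {
    $cols[] = "`$name` " . $types[$i];
}
echo "CREATE TABLE `mydata` (\n  " . implode(",\n  ", $cols) . "\n);\n";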
If you're willing to type the data types in the first row, why not type a proper CREATE TABLE statement?
Then you can export the Excel data as a txt file and use
LOAD DATA INFILE 'path/file.txt' INTO TABLE your_table;

How to import an excel file in to a MySQL database

Can anyone explain how to import a Microsoft Excel file into a MySQL database?
For example, my Excel table looks like this:
Country   | Amount | Qty
----------|--------|------
America   | 93     | 0.60
Greece    | 9377   | 0.80
Australia | 9375   | 0.80
There's a simple online tool that can do this called sqlizer.io.
You upload an XLSX file to it, enter a sheet name and cell range, and it will generate a CREATE TABLE statement and a bunch of INSERT statements to import all your data into a MySQL database.
(Disclaimer: I help run SQLizer)
Below is another method to import spreadsheet data into a MySQL database that doesn't rely on any extra software. Let's assume you want to import your Excel table into the sales table of a MySQL database named mydatabase.
Select the relevant cells:
Paste into Mr. Data Converter and select the output as MySQL:
Change the table name and column definitions to fit your requirements in the generated output:
CREATE TABLE sales (
    id INT NOT NULL AUTO_INCREMENT PRIMARY KEY,
    Country VARCHAR(255),
    Amount INT,
    Qty FLOAT
);
INSERT INTO sales
(Country,Amount,Qty)
VALUES
('America',93,0.60),
('Greece',9377,0.80),
('Australia',9375,0.80);
If you're using MySQL Workbench or already logged into mysql from the command line, then you can execute the generated SQL statements from step 3 directly. Otherwise, paste the code into a text file (e.g., import.sql) and execute this command from a Unix shell:
mysql mydatabase < import.sql
Other ways to import from a SQL file can be found in this Stack Overflow answer.
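One such way, if you're already inside a mysql session, is the client's built-in source command, which runs the statements from the file:
source import.sql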
Export it into some text format. The easiest will probably be a tab-delimited version, but CSV can work as well.
Use the load data capability. See http://dev.mysql.com/doc/refman/5.1/en/load-data.html
Look halfway down the page, as it gives a good example for tab-separated data:
FIELDS TERMINATED BY '\t' ENCLOSED BY '' ESCAPED BY '\\'
Check your data. Sometimes quoting or escaping has problems, and you need to adjust your source or import command, or it may just be easier to post-process via SQL.
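Built out into a full statement for tab-delimited data (the file path, table name, and column list here are assumptions):
LOAD DATA LOCAL INFILE '/path/to/export.txt'
INTO TABLE sales
FIELDS TERMINATED BY '\t' ENCLOSED BY '' ESCAPED BY '\\'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(Country, Amount, Qty);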
There are actually several ways to import an Excel file into a MySQL database, with varying degrees of complexity and success.
Excel2MySQL. Hands down, the easiest and fastest way to import Excel data into MySQL. It supports all versions of Excel and doesn't require an Office install.
LOAD DATA INFILE: This popular option is perhaps the most technical and requires some understanding of MySQL command execution. You must manually create your table before loading and use appropriately sized VARCHAR field types. Therefore, your field data types are not optimized. LOAD DATA INFILE has trouble importing large files that exceed 'max_allowed_packet' size. Special attention is required to avoid problems importing special characters and foreign unicode characters. Here is a recent example I used to import a csv file named test.csv (see the sketch after this list).
phpMyAdmin: Select your database first, then select the Import tab. phpMyAdmin will automatically create your table and size your VARCHAR fields, but it won't optimize the field types. phpMyAdmin has trouble importing large files that exceed 'max_allowed_packet' size.
MySQL for Excel: This is a free Excel Add-in from Oracle. This option is a bit tedious because it uses a wizard and the import is slow and buggy with large files, but this may be a good option for small files with VARCHAR data. Fields are not optimized.
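A hedged sketch of such a LOAD DATA INFILE command for test.csv (the table name, header row, and options are assumptions, not the author's exact command):
LOAD DATA LOCAL INFILE 'test.csv'
INTO TABLE test_table -- hypothetical table name
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES; -- assumes the first row holds column headers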
Not sure if you have all this set up, but for me, I am using PHP and MySQL. So I use a PHP class, PHPExcel. This takes a file in nearly any format (xls, xlsx, csv, ...) and then lets you read and/or insert.
So what I wind up doing is loading the Excel file into a PHPExcel object and then looping through all the rows. Based on what I want, I write a simple SQL insert command to insert the data from the Excel file into my table.
On the front end it is a little work, but it's just a matter of tweaking some of the existing code examples. Once you have it dialed in, making changes to the import is simple and fast.
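A minimal sketch of that load-and-loop approach, using PHPExcel plus PDO for the inserts (the file, database, table, and column names are all hypothetical):
<?php
// Load the spreadsheet and dump the active sheet to a plain array
require_once 'Classes/PHPExcel.php'; // adjust to your PHPExcel install path
$rows = PHPExcel_IOFactory::load('data.xlsx')->getActiveSheet()->toArray();

// A prepared statement keeps the row-by-row inserts safe and fast
$pdo = new PDO('mysql:host=localhost;dbname=mydatabase', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO sales (Country, Amount, Qty) VALUES (?, ?, ?)');

foreach (array_slice($rows, 1) as $row) { // skip the header row
    $stmt->execute(array($row[0], $row[1], $row[2]));
}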
The best and easiest way is to use the "MySQL for Excel" app, a free app from Oracle. It adds a plugin to Excel to export and import data to MySQL. You can download it from here.
When using text files to import data, I had problems with quotes and how Excel was formatting numbers. For example, my Excel configuration used the comma as decimal separator instead of the dot.
Now I use Microsoft Access 2010 to open my MySQL table as a linked table. There I can simply copy and paste cells from Excel into Access.
To do this, first install the MySQL ODBC driver and create an ODBC connection.
Then in Access, in the "External Data" tab, open the "ODBC Database" dialog and link to any table using the ODBC connection.
Using MySQL Workbench, you can also copy and paste your Excel data into the result grid of MySQL Workbench. I gave detailed instructions in this answer.
The fastest and simplest way is to save the XLS as ODS (open document spreadsheet) and import it from phpMyAdmin.
For a step by step example for importing Excel 2007 into MySQL with correct encoding (UTF-8) search for this comment:
"Posted by Mike Laird on October 13 2010 12:50am"
in the next URL:
http://dev.mysql.com/doc/refman/5.1/en/load-data.html
You could use DocChow, a very intuitive GUI for importing Excel into MySQL, and it's free on most common platforms (including Linux).
Especially if you are concerned about date and datetime datatypes, DocChow handles them easily. If you are working with multiple Excel spreadsheets that you want to import into one MySQL table, DocChow does the dirty work.
Step 1: Create your CSV file.
Step 2: Log in to your MySQL server:
mysql -uroot -pyourpassword
Step 3: Load your CSV file:
LOAD DATA LOCAL INFILE '//home/my-sys/my-excel.csv'
INTO TABLE my_tables
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
(Country, Amount, Qty);
Another useful tool, and as a MySQL front-end replacement, is Toad for MySQL. Sadly, no longer supported by Quest, but a brilliant IDE for MySQL, with IMPORT and EXPORT wizards, catering for most file types.
If you are using Toad for MySQL, the steps to import a file are as follows:
1. Create a table in MySQL with the same columns as the file to be imported.
2. Now that the table is created, go to Tools > Import > Import Wizard.
3. In the Import Wizard dialogue box, click Next.
4. Click Add File, then browse and select the file to be imported.
5. Choose the correct delimiter ("," separated for a .csv file).
6. Click Next and check that the mapping is done properly.
7. Click Next, select the "A single existing table" radio button, and select the table to be mapped from the dropdown menu of Tables.
8. Click Next and finish the process.
If you don't like plugins, VBA and external tools, I have an Excel file that, using formulas only, allows you to create INSERTs/UPDATEs. You only have to put the data in the cells:
As an extra, there's another tab in the file to CREATE TABLES:
The file can be found on the following link:
EXCEL FILE
I've had good results with the Tools / Import CSV File feature in HeidiSQL, with CSV files directly exported from Excel 2019 with "Save As..."
It uses LOAD DATA INFILE internally, but with a GUI, and it also analyzes the CSV file before passing it to LOAD DATA INFILE, so it can, for example, create the table using the first row as column names and guess the column data types (the <New table> option).