How can I turn a CSV file into a web-based SQL database table, without having any database already set up? - mysql

Currently, I have a CSV file with data in it. I want to turn it into a SQL table, so I can run SQL queries on it. I want the table to be within a web-based database that others in my organization can also access. What's the easiest way to go from CSV file to this end result? Would appreciate insight on setting up the database and table, giving others access, and getting data inside. Preferably PostgreSQL, but MySQL is fine too.

To create the table, it depends on the number of columns you have. If you have only a few, then do it manually:
CREATE TABLE <table name> (<column name> <column type (e.g. INT or VARCHAR(100))>, <etc.>)
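For example, a minimal sketch with made-up column names and types (match them to your CSV's headers and data):
CREATE TABLE sales_data (
    id INT,
    region VARCHAR(100),
    amount DECIMAL(10,2),
    sale_date DATE
); /* column names and types are placeholders; adjust them to your CSV */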
If you have many columns, you can open the CSV file in Excel and use 'SQL Converter for Excel', which will build a CREATE statement for you from your column headings (and autodetect column types too).
Loading data from a CSV is also pretty straightforward:
LOAD DATA INFILE <filepath (e.g. 'C:/Users/<username>/Desktop/test.csv')>
INTO TABLE <table name>
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS; /* only include this line if the CSV has a header row of column names */
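Filled in for the hypothetical sales_data table above (the file path is an assumption; point it at your actual CSV):
LOAD DATA INFILE '/path/to/test.csv'
INTO TABLE sales_data
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS; /* skip the header row */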
As for a web-based solution: https://cloud.google.com/products/cloud-sql/

That's a relatively open-ended question. A couple of noteworthy pointers off the top of my head:
MySQL allows you to store your data in different formats, one of them being CSV. That's a very straightforward solution if you're happy with it and don't mind a few limitations (see http://dev.mysql.com/doc/refman/5.0/en/csv-storage-engine.html).
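As a rough sketch of that approach (table and column names are hypothetical; note that the CSV engine does not support indexes and requires all columns to be NOT NULL):
CREATE TABLE sales_data_csv (
    id INT NOT NULL,
    region VARCHAR(100) NOT NULL,
    amount DECIMAL(10,2) NOT NULL
) ENGINE=CSV; /* the table's data is kept as a plain .CSV file in the database directory */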
Otherwise you can import your data into a table with a full-featured engine (see other answer(s) for details).
If you're happy with PostgreSQL and are looking for a fully web-based solution, have a look at Heroku.
There are a great many ways to make your data available through web services without accessing the back-end data store directly. Have a look at REST and SOAP for instance.
HTH

Related

Need to load excel data into sql, then I need to use sql data for forecasting

I am interning at a company and they want to load their Excel data (with formulas) into SQL. There are two types of Excel sheets: the first contain their initial budgets, while the second contain the total expenses by each department. After loading the data into SQL, they want to use the first sheets (containing the budgets) for forecasting and compare them to the second sheets (containing the real expenses).
I was thinking about saving the data as CSV (not sure if the formulas will let me do it), then pulling everything into MySQL using Workbench. After that, I am not sure how to forecast with the data from the first sheet for comparison.
I am not sure if I should use MySQL and Workbench due to licensing issues; I thought about using PostgreSQL and pgAdmin.
Any ideas will be greatly appreciated, thanks
Perhaps this can help:
LOAD DATA INFILE 'c:/tmp/file_name.csv'
INTO TABLE table_name
FIELDS TERMINATED BY ',' /* if applicable */
ENCLOSED BY '"' /* if applicable */
LINES TERMINATED BY '\n'
IGNORE 1 ROWS; /* IF you want to exclude header row */
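Once both sheets are loaded into their own tables, the budget-versus-actuals comparison described in the question could start from a join along these lines (the table and column names here are purely hypothetical placeholders):
SELECT b.department,
       b.budget_amount,
       e.total_expense,
       b.budget_amount - e.total_expense AS remaining
FROM budgets b
JOIN expenses e ON e.department = b.department; /* map these names to however the two sheets are imported */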
I do that task really often too. I found a lot of great conversion applications that help you convert your Excel sheets to SQL or MySQL databases, and nearly every database management application has such a tool as well, so you could search for that in your management studio.
Otherwise I recommend reading this tutorial from MySQL itself, which shows how to export an Excel sheet to a MySQL database:
https://dev.mysql.com/doc/mysql-for-excel/en/mysql-for-excel-export.html
Other converters:
https://www.rebasedata.com/convert-excel-to-mysql-online
https://sqlizer.io/#/
Then you simply have to import the result into your database using the standard database import tool.

Importing Data MySQL

I have a huge dataset. What is the fastest way to upload the data into a MySQL database from PHP, and is there any way to verify that all the data was imported?
Any suggestions or hints will be greatly appreciated. Thanks.
If the data set is merely huge (it can be transferred within hours), it is not worth the effort of finding an especially efficient way - any script should be able to do the job. I am assuming you are reading from some non-DB format (e.g. plain text)? In that case, simply read and insert.
If you require careful processing before you insert the rows, you might want to consider creating real objects in memory and their sub-objects first and then mapping them to rows and tables - Object-Relational data source patterns will be valuable here. This will, however, be much slower, and I would not recommend it unless it's absolutely necessary, especially if you are doing it just once.
For very fast access, some people wrote a direct binary blob of objects on the disk and then read it directly into an array, but that is available in languages like C/C++; I am not sure if/how it can be used in a scripted language. Again, this is good for READING the data back into memory, not transferring to DB.
The easiest way to verify that the data has been transferred is to compare the COUNT(*) of the table with the number of items in your file. The more advanced way is to compute a hash (e.g. SHA-1) of the primary key sets.
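A quick sketch of both checks in MySQL (the table and key names are assumptions; note that GROUP_CONCAT output is truncated at group_concat_max_len, so raise that setting for large tables):
SELECT COUNT(*) FROM imported_table; /* compare this to the line count of the source file */
SELECT SHA1(GROUP_CONCAT(id ORDER BY id SEPARATOR ',')) FROM imported_table; /* compute the same digest over the keys in the file and compare */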
I used LOAD DATA, which is the standard MySQL loader tool. It works fine and is fast, and there are many options.
You can use:
a data file named export_du_histo_complet.txt with multiple lines like this:
"xxxxxxx.corp.xxxxxx.com";"GXTGENCDE";"GXGCDE001";"M_MAG105";"TERMINE";"2013-06-27";"14:08:00";"14:08:00";"00:00:01";"795691"
an SQL file with the following (because I use a Unix shell which calls the SQL file):
LOAD DATA INFILE '/home2/soron/EXPORT_HISTO/export_du_histo_complet.txt'
INTO TABLE du_histo
FIELDS
TERMINATED BY ';'
ENCLOSED BY '"'
ESCAPED BY '\\'
LINES
STARTING BY ' '
TERMINATED BY '\n'
(server, sess, uproc, ug, etat, date_exploitation, debut_uproc, fin_uproc, duree, num_uproc)
I specified the table fields that I wanted to import (my table has more columns).
Note that there is a MySQL bug, so you can't use a variable to specify your INFILE.

How to move data from one SQLite to MySQL with different designs?

The problem is:
I've got a SQLite database which is constantly being updated through a proprietary application.
I'm building an application which uses MySQL and the database design is very different from the one of SQLite.
I then have to copy data from SQLite to MySQL but it should be done very carefully as not everything should be moved, tables and fields have different names and sometimes data from one table goes to two tables (or the opposite).
In short, SQLite should behave as a client to MySQL, inserting what is new and updating what is old in an automated way. It doesn't need to update in real time; every X hours would be enough.
A google search gave me this:
http://migratedb.sourceforge.net/
And asking a friend I got information about the Multisource plugin (Squirrel SQL) in this page:
http://squirrel-sql.sourceforge.net/index.php?page=plugins
I would like to know if there is a better way to solve the problem or if I will have to make a custom script myself.
Thank you!
I recommend a custom script for this:
If it's not a one-to-one conversion between the tables and fields, tools might not help there. In your question, you've said:
...and sometimes data from one table goes to two tables (or the opposite).
If you only want the differences, then you'll need to build the logic for that unless every record in the SQLite db has timestamps.
Are you going to be updating the MySQL db at all? If not, are you okay to completely delete the MySQL db and refresh it every X hours with all the data from SQLite?
Also, if you are comfortable with a scripting language (like PHP, Python, Perl, Ruby, etc.), they have APIs for both SQLite and MySQL; it would be easy enough to build your own script which you can control and customise more easily based on program logic. Especially if you want to run "conversions" between the data from one to the other and not just do simple mapping.
I hope I understand you correctly that you want to flush the data stored in a SQLite DB periodically to a MySQL DB. Right?
So this is how I would do it.
Create a cron job which starts the script every X minutes.
Export the data from SQLite into a CSV file.
Do a LOAD DATA INFILE and import the CSV data into MySQL.
Code example for LOAD DATA INFILE
LOAD DATA INFILE 'PATH_TO_EXPORTED_CSV'
REPLACE INTO TABLE your_table
FIELDS TERMINATED BY ';' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(@value_column1, @unimportant_value, @value_column2, @unimportant_value, @unimportant_value, @value_column3)
SET diff_mysql_column1 = @value_column1,
    diff_mysql_column2 = @value_column2,
    diff_mysql_column3 = @value_column3;
You can use this query against as many DB tables as you want. You can also rename the variables such as @value_column1.
I'm in a hurry, so that's it for now. Ask if something is unclear.
Greets Michael

Importing a CSV file into mysql. (Specifically about create table command)

I have a text file full of values. The first line is a list of column names like this:
col_name_1, col_name_2, col_name_3 ......(600 columns)
and all the following lines have values like this:
1101,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,1101,1,3.86,65,0.46418,65,0.57151...
What is the best way to import this into mysql?
Specifically, how do I come up with the proper CREATE TABLE command so that the data will load properly? What is the best generic data type that would take in all the above values like 1101 or 3.86 or 0.57151? I am not worried about the table being inefficient in terms of storage as I need this for one-time usage.
I have tried some of the suggestions in other related questions, like using phpMyAdmin (it crashes, I am guessing due to the large amount of data).
Please help!
Data in CSV files is not normalized; those 600 columns could probably be spread across a couple of related tables, which is the recommended way of treating such data. You can then use fgetcsv() to read the CSV file line by line in PHP.
To make MySQL process the CSV, you can create a 600 column table (I think) and issue a LOAD DATA LOCAL INFILE statement (or perhaps use mysqlimport, not sure about that).
The most generic data type would have to be VARCHAR or TEXT for bigger values, but of course you would lose semantics when used on numbers, dates, etc.
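As a rough sketch of that route (only the first few of the 600 columns are shown, and the names, types, and file path are assumptions), the table can be created with generic VARCHAR columns and filled with LOAD DATA LOCAL INFILE:
CREATE TABLE wide_import (
    col_name_1 VARCHAR(255),
    col_name_2 VARCHAR(255),
    col_name_3 VARCHAR(255)
    /* ...repeat for the remaining columns... */
);
LOAD DATA LOCAL INFILE '/path/to/data.csv'
INTO TABLE wide_import
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES; /* skip the header row of column names */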
I noticed that you included the phpmyadmin tag.
phpMyAdmin can handle this out of the box. It will decide "magically" which type to make each column, and will CREATE the table for you, as well as INSERT all the data. There is no need to worry about LOAD DATA INFILE, though that method can be safer if you want to know exactly what's going on without relying on phpMyAdmin's magic tooling.
Try convertcsvtomysql: just upload your CSV file, and then you can download and/or copy the MySQL statements to create the table and insert the rows.

MySQL export to MongoDB

I am looking to export an existing MySQL database table to seed a MongoDB database.
I would have thought this was a well trodden path, but it appears not to be, as I am coming up blank with a simple MySQLDUMP -> MongoDB JSON converter.
It won't take much effort to code up such a conversion utility.
There is a method that doesn't require you to use any software other than the mysql and mongodb utilities. The disadvantage is that you have to go table by table, but in your case you only need to migrate one table, so it won't be painful.
I followed this tutorial. Relevant parts are:
Get a CSV with your data. You can generate one with the following query in mysql.
SELECT [fields]
INTO OUTFILE 'user.csv'
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
FROM [table];
Finally, import the file using mongoimport.
That's all
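Filled in for a hypothetical user table (the field names and file path are assumptions), the export might look like this; the resulting user.csv is then fed to mongoimport using its CSV input type:
SELECT id, name, email
INTO OUTFILE '/tmp/user.csv'
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
FROM user;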
If you're using Ruby, you can also try: Mongify
It will read your mysql database, build a translation file and allow you to map the information.
It supports:
Updating internal IDs (to BSON ObjectID)
Updating referencing IDs
Type Casting values
Embedding Tables into other documents
Before filters (to change data manually)
and much much more...
Read more about it at: http://mongify.com/getting_started.html
MongoVue is a new project that contains a MySQL import. I have not used that feature.
If you are a Mac user you can use MongoHub, which has a built-in feature to import (and export) data from MySQL databases.
If you are using Java you can try this:
http://code.google.com/p/sql-to-nosql-importer/
For a powerful conversion utility, check out Tungsten Replicator
I'm still looking into this one called SQLToNoSQLImporter, which is written in Java.
I've put a little something up on GitHub - it's not even 80% there, but it's growing for work and it might be something others of you could help me out on!
https://github.com/jaredwa/mysqltomongo