Tool for creating (My)SQL table from CSV header (no script) - mysql

I have to run some queries on several Excel sheets, and I think it would be easier if I could put them in a DB and write the queries in SQL.
Is there a tool that creates SQL tables from a CSV file with headers?

MySQL has LOAD DATA INFILE built in to load data from CSV files. This is probably your best option and the one that gives you the most control.
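For example, assuming you have already created a table matching the CSV header (the table, columns, and file path below are made up for illustration), a minimal sketch looks like this:

```sql
-- Hypothetical table matching a CSV header of: order_id,order_date,amount
CREATE TABLE sales (
    order_id   INT,
    order_date DATE,
    amount     DECIMAL(10,2)
);

-- Load the file, skipping the header row.
-- Use LOAD DATA LOCAL INFILE instead if the file sits on the client machine.
LOAD DATA INFILE '/tmp/sales.csv'
INTO TABLE sales
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;
```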

Have a look at the Data Import tool (Excel, CSV, or text formats) in dbForge Studio for MySQL. The import wizard lets you create a new table and customize its fields.

Related

Link Excel model to interactive web dashboard via bulk import of CSV into SQL

I have a few Excel workbooks which will be generating dozens of CSV files that need to be imported into their corresponding SQL tables every 5 minutes. I wonder what would be an elegant way to do this? Can a SQL script import multiple CSV files into their respective tables in one batch run (we would upload all CSV files in one FTP run)? What would you recommend?
How error-proof is this setup? We need this for a production environment: we need to connect local Excel models with interactive charts on the web, fast.
Just to clarify, we will be using the wpDataTables plugin to draw charts from the imported CSV data in WordPress. In combination with WPBakery you can create a responsive dashboard in a matter of minutes using any result from your Excel models, of any complexity, and without the need to pay thousands for a black-box dashboard solution.
Won't constant bulk imports of the CSV tables corrupt the database? I assume you can run multiple SQL commands in a stream?
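One plain-MySQL way to handle the batch question is a single script with one LOAD DATA per file, loading into a staging table and atomically swapping it in so dashboard queries never see a half-imported table. All table names and paths here are hypothetical:

```sql
-- Repeat this pattern once per CSV/table pair in the batch script.
TRUNCATE TABLE sales_staging;

LOAD DATA INFILE '/ftp/incoming/sales.csv'
INTO TABLE sales_staging
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;

-- RENAME TABLE is atomic in MySQL, so readers never see a partial load.
RENAME TABLE sales TO sales_old,
             sales_staging TO sales,
             sales_old TO sales_staging;
```

Running such a script every 5 minutes (cron, or the MySQL event scheduler) also addresses the corruption worry, since the live table is replaced in a single step.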

Import multiple Access tables into a MySQL db

I receive an Access file with multiple tables that I need to import into a MySQL db. I need either to convert all the tables to txt files for loading, or a Perl script or something to convert them, and then I can write out LOAD statements. Doing this manually takes forever, and I need to do it for about 20 Access files with 16 tables each. The company where I am does not want to use anything other than Navicat, Perl scripts, VB scripts, or straight MySQL code to process this. Any ideas? Is there something in Access 2007 that can do this for multiple tables, instead of the export tool doing them one by one?
Any idea or suggestion will help.
You can download this utility to help you convert an Access file into a .sql file. Then you can import the .sql file into MySQL.
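Once you have the .sql file, loading it into MySQL is a one-liner (the database and file names are placeholders):

```sql
-- From the shell:
--   mysql -u username -p target_db < access_export.sql
-- or, from inside an open mysql client session:
SOURCE /path/to/access_export.sql;
```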

PostgreSQL Import Tool (RDF/JSON/CSV)

I'm looking for a (GUI) tool that helps me convert various data formats (mostly RDF, JSON, and CSV) into a PostgreSQL database.
Is there any nice tool that helps me define the table schema for the files I want to import (and interlink), and then import the data?
1. You can try to link the tables in Microsoft Access with pgODBC.
2. You can import the .csv files into Access and run INSERT commands.
3. You can export the tables from Access and choose the PostgreSQL ODBC source you want.
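If you only need the CSV part, PostgreSQL's built-in COPY can skip Access entirely. A minimal sketch, with made-up table and file names:

```sql
-- Hypothetical table matching the CSV columns
CREATE TABLE people (
    id    integer,
    name  text,
    email text
);

-- Server-side load; use psql's \copy instead if the file is on the client
COPY people FROM '/tmp/people.csv' WITH (FORMAT csv, HEADER true);
```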

Copy data from Hypersonic to SQL Server 2008

I created a website using Liferay with some sample pages. Later we wanted to import some 23,000 users into Liferay. The data is in CSV format.
I don't know Java. I tried to insert the data using C#/.NET and the Liferay web services, with no luck, so I changed the settings to point to SQL Server.
Now all the data I created for the sample site resides in the Hypersonic DB.
Is there any way to copy the data to SQL Server?
The simplest way would be to get all your table data out of HSQLDB in .csv format.
If you can use your original .csv file, that's great. If not, you'll need to find a way to get your data from HSQLDB into a text file. You may already have a tool for that; if not, RazorSQL's export tool, for example, can do it.
Then you can import the .csv into SQL Server using the SQL Server Management Studio Import Wizard. The screenshots are a bit out of date, but you'll have no trouble figuring it out.
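If you prefer a script over the wizard, T-SQL's BULK INSERT can load the same .csv file (the table name and path are placeholders):

```sql
-- Hypothetical target table for the exported HSQLDB data
BULK INSERT dbo.Users
FROM 'C:\exports\users.csv'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n',
    FIRSTROW        = 2   -- skip the CSV header row
);
```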

How to load Excel or CSV file into Firebird?

I'm using a Firebird database and I need to load Excel files into a database table. I need a tool that does this well. I tried some I found on Google, but all of them have bugs.
Since the Excel data is not created by me, it would be good if the tool could scan the file, discover what kind of data is inside, and suggest a table to be created in the database.
Also, it would be nice if I could compare the file against the data already in the database table, and pick which data to load and which to skip.
Tools that load CSV files are also fine, I can "Save as" CSV from Excel before loading.
Well, if you can use CSV, then I guess XMLWizard is the right tool for you. It can load a CSV file and compare it with the database data, and you can select the changes you wish to make to the table.
Don't let the name fool you: it does work with XML, but it also works very well with CSV files. It can also estimate the column datatypes and offer a CREATE TABLE statement for your file.
Have you tried FSQL?
It's freeware, very similar to Firebird's standard ISQL, but with some extra features, like importing data from CSV files.
I've used it with DBF files and it worked fine.
There is also the EMS Data Import tool for Firebird and InterBase:
http://www.sqlmanager.net/en/products/ibfb/dataimport
It's not free, but it accepts a wide variety of formats, including CSV and Excel.
EDIT
Another similar payware tool is Firebird Data Wizard: http://www.sqlmaestro.com/products/firebird/datawizard/
There are some online tools which can help you generate DDL/DML scripts from a CSV header or sample dump file; check out http://www.convertcsv.com/csv-to-sql.htm
You can then use sql-workbench's Data Pumper or its WbImport tool from the command line.
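To give an idea of the output, for a CSV starting with the header "id,name,price", such a generator typically produces something along these lines (the types are guessed from sample rows; this is an illustration, not the site's exact output):

```sql
CREATE TABLE products (
    id    INTEGER,
    name  VARCHAR(50),
    price DECIMAL(10,2)
);

INSERT INTO products (id, name, price) VALUES (1, 'Widget', 9.99);
INSERT INTO products (id, name, price) VALUES (2, 'Gadget', 19.50);
```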
Orbada also has a GUI that supports importing CSV files.
DBeaver's free edition also supports importing CSV out of the box.
BULK INSERT
Another way: in Excel, build formulas in new cells that format the data you want to export as strings, padded to the length of the corresponding field in Firebird. You can then copy all these cells from Excel and paste them into a text editor, which makes it possible to use the BULK INSERT strategy in Firebird.
See more details at http://www.firebirdfaq.org/faq209/
The problem is if you have BLOB or NULL data to import, so check whether you have those kinds of values and whether this approach works for you.
If you have the formatted data in a txt file, BULK INSERT will be a quick way.
Hint: you can also disable the triggers and indexes associated with your table to speed up the BULK INSERT, and re-enable them afterwards.
Roberto Novakosky
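A minimal sketch of the Firebird side of this strategy, assuming a fixed-width text file produced as described above (the table names, column widths, and file path are made up; see the FAQ link for the full details):

```sql
-- External table mapped onto the fixed-width text file;
-- requires ExternalFileAccess to be enabled in firebird.conf.
CREATE TABLE import_ext EXTERNAL FILE '/data/import.txt' (
    name   CHAR(30),
    amount CHAR(12),
    eol    CHAR(1)     -- holds the line terminator (CHAR(2) for CRLF files)
);

-- Cast and copy the rows into the real table.
INSERT INTO customers (name, amount)
SELECT TRIM(name), CAST(amount AS DECIMAL(10,2))
FROM import_ext;
```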
I load the Excel file into a Lazarus spreadsheet (fpspreadsheet) and then export it to the Firebird DB. Everything is fine; the only problem is that fpspreadsheet will treat a string field containing only numbers as a number field. I can check the titles in the first row to see whether the Excel file is valid or not.
As far as I can see, all the replies so far focus on tools that essentially read the Excel (or CSV) file and use SQL INSERTs to insert the records into the Firebird database. While this works, I have always found this approach painstakingly slow.
That's why I created a tool that reads an Excel file and writes one file in a (text) format suitable for a Firebird external table (including support for UTF8 char columns), plus one DDL file to create the external table in Firebird.
I then use regular SQL to select from the external table, cast as needed, and insert into whatever normal Firebird table I want. The performance with this approach is orders of magnitude faster than SQL inserts from a client app in my experience.
I would be willing to publish the tool. It's written in C#. Let me know if there's any interest.