How can I import an Excel file containing data from three different tables with foreign keys into a mysql database?
A very simple way to do it is to work within Excel itself. To help you further we would need more details about the tables in the Excel file and the tables in the database, but the general idea is this:
Add a new column and, using a CONCATENATE formula, build a sample "INSERT INTO" statement. Fill the formula down so the INSERTs are generated automatically for every row, then copy and paste the column into a .sql file and execute it.
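As an illustration only (the customers table and its columns are made up here, since the real tables are not in the question), a helper-column formula along these lines generates one statement per row:

=CONCATENATE("INSERT INTO customers (id, name, city) VALUES (", A2, ", '", B2, "', '", C2, "');")

For row 2 that would produce something like INSERT INTO customers (id, name, city) VALUES (1, 'Alice', 'Lisbon'); and filling the formula down repeats it for every row.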
If you need more detail, edit your question and I will be glad to improve this answer.
I am trying to copy the data from one table to another table. Normally, using the SELECT command we can read the whole table, and using the INSERT command we can insert the data into another table. But I don't want to use raw SQL commands; I want to use the SQLAlchemy ORM to copy and insert. Is there any way to do it?
Are you just trying to add an entry to a database, or are you trying to duplicate an entry?
Adding would be done by simply doing:
ed_user = User(name='ed', fullname='Ed Jones', nickname='edsnickname')
session.add(ed_user)
session.commit()
The example was taken from the official documentation. The commit actually writes the data added to the session to the database.
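For the table-to-table copy the question actually asks about, a minimal ORM sketch could look like the following; the SourceUser and ArchiveUser models and the SQLite URL are made up for illustration, not taken from the original answer.

import csv
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class SourceUser(Base):                 # hypothetical source table
    __tablename__ = "source_users"
    id = Column(Integer, primary_key=True)
    name = Column(String(50))
    fullname = Column(String(100))
    nickname = Column(String(50))

class ArchiveUser(Base):                # hypothetical target table with matching columns
    __tablename__ = "archive_users"
    id = Column(Integer, primary_key=True)
    name = Column(String(50))
    fullname = Column(String(100))
    nickname = Column(String(50))

engine = create_engine("sqlite:///example.db")
Base.metadata.create_all(engine)

with Session(engine) as session:
    for src in session.query(SourceUser).all():
        # Build a new object for the target table from the source row's columns,
        # letting the target table assign its own primary key.
        session.add(ArchiveUser(name=src.name, fullname=src.fullname,
                                nickname=src.nickname))
    session.commit()                    # write the copied rows to the database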
EDIT:
You'll have to write something that parses the file into objects and adds those objects to the database. It depends on what kind of file it is: if it's a database export, then you can just import it with your preferred database tool. You can have a look at this blog post as well. The bottom line is that if you want to import from CSV / Excel / txt, you'll have to write something for it.
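For example, a rough sketch of doing that for a CSV file, reusing the User model from the snippet above (the file name and column names are assumptions):

import csv
from sqlalchemy import create_engine
from sqlalchemy.orm import Session

engine = create_engine("sqlite:///example.db")

with Session(engine) as session, open("users.csv", newline="") as f:
    for row in csv.DictReader(f):
        # Turn each CSV row into an ORM object and stage it for insertion.
        session.add(User(name=row["name"],
                         fullname=row["fullname"],
                         nickname=row["nickname"]))
    session.commit()                    # write all parsed rows in one transaction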
I need help creating a table in a DB from Excel.
A user has a table in an Excel sheet that does not have a fixed number of columns. The user can add or remove columns, and of course add or remove rows too.
I need a script, for example for ODBC Microsoft Query, which selects the whole table range in the Excel sheet and creates a table with that data in the DB (MySQL).
It has to work with one click, not manually.
Thank you
This is the solution for me. It's not exactly what I wanted, but it's enough for now. After some modifications, it works with MySQL.
tomaslind.net - export data excel to sql server
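The linked post targets SQL Server, but the general idea can be sketched in Python with pandas and SQLAlchemy; the connection string, file name, sheet name and table name below are placeholders, and it assumes pandas, openpyxl and PyMySQL are installed.

import pandas as pd
from sqlalchemy import create_engine

# Placeholder connection string; adjust user, password, host and database.
engine = create_engine("mysql+pymysql://user:password@localhost/mydb")

# Read the whole sheet; pandas picks up however many columns it currently has.
df = pd.read_excel("source.xlsx", sheet_name="Sheet1")

# Create (or replace) the MySQL table from the current sheet contents.
df.to_sql("excel_import", engine, if_exists="replace", index=False)

Because pandas reads whatever columns the sheet currently has, the changing column count is handled automatically; run it from a shortcut or a scheduled task and it gets close to the one-click requirement.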
I was given an Excel (CSV) sheet containing database metadata.
Is there a simple way to import the CSV and create the tables from it?
Data is not part of this question. The CSV looks like this:
logical_table_name, physical_table_name, logical_column_name, physcial_column_name, data_type, data_length
There's about 2000 rows of metadata. I'm hoping I don't have to manually create the tables. Thanks.
I don't know of any direct import or creation. However, if I had to do this and I couldn't find one, I would import the Excel file into a staging table (just a direct data import). I'd add a unique auto-ID column to the staging table to keep the rows in order.
Then I would use some queries to build the table and column creation commands from the raw data. Unless this was something I was setting up to do a lot, I would keep it dead simple and not try to get fancy. Build an individual ADD COLUMN command for each column and a CREATE TABLE command for the first row of each table. Sort them all by the order ID, tables before columns. Then you should be able to just copy the script column, check the commands, and go.
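The same idea can also be scripted directly, skipping the staging table; a rough sketch in Python, using the column order shown in the question (the file name is a placeholder, and the generated DDL should still be reviewed since not every data type takes a length):

import csv

tables = {}  # physical_table_name -> list of column definitions

with open("metadata.csv", newline="") as f:
    reader = csv.reader(f)
    next(reader)  # skip the header row
    for row in reader:
        # Column order per the question: logical_table_name, physical_table_name,
        # logical_column_name, physical_column_name, data_type, data_length
        table, column, dtype, length = row[1], row[3], row[4], row[5]
        col_def = f"{column} {dtype}({length})" if length.strip() else f"{column} {dtype}"
        tables.setdefault(table, []).append(col_def)

for table, cols in tables.items():
    print(f"CREATE TABLE {table} (\n    " + ",\n    ".join(cols) + "\n);")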
I have tried to find an answer for this elsewhere but cannot, so I hope someone can help me.
I am trying to import the MySQL sample database into Oracle SQL Developer.
I have an existing database/connection I want to dump it into, and I have created an empty table named classicmodels in that existing connection. Yes, that name is really just one table within the sample DB; ignore the error in naming convention.
When I right-click on it and try 'Import Data' I cannot import a .sql file; I can only do it with Excel, CSV, etc.
When I try to run a script I found on dba.stackexchange, #\path\mysqlsampledatabase.sql, I get a series of 'please provide substitution value' messages, which does not make sense to me given that I am importing a database that is built for SQL (i.e., what reason is there to substitute?).
Pictures below:
The 'UnseenCollection' is a single table I imported as a CSV file. I need to import the mysqlsampledatabase file so that it shows up the same way and I can access all tables within the sample DB.
If anyone can help I would appreciate it. I need the end result to be the entire mysqlsampledatabase populated within the 'classicmodels' node.
Thank you.
Connect to MySQL.
Connect to Oracle.
For a single MySQL table, right-click it and choose 'Copy to Oracle'.
For a few tables, select them, then drag and drop onto the Oracle connection (requires a newer version of SQL Developer).
For an entire MySQL database, use the migration project feature.
I am trying to update one of my SQL tables with new columns from my source CSV file. The records in this CSV file are already in the SQL table, but the table is lacking some of the new columns from the file.
I have already added the new columns to my SQL table structure via ALTER TABLE. Now I just need to import the data from this CSV file into the new columns. How can I do this? I am trying to use SSIS and SQL Server to accomplish this, but I am pretty new to Excel.
This is probably too late to solve salvationishere's problem, but I'm posting it for future readers!
You could just generate the SQL INSERT/UPDATE/etc. commands by parsing the CSV file (a simple Python script will do; there is a rough sketch below).
You could alternatively use this online parser:
http://www.convertcsv.com/csv-to-sql.htm
(Hoping that it'd still be available when you click!)
to generate your SQL commands. The interface is extremely straightforward and it does the entire job in an awesome way.
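For the 'simple Python script' route mentioned above, a rough sketch (table name, key column, new column names and file name are all placeholders, since the real schema is not in the question):

import csv

# Placeholders: adjust to the real table, key column and newly added columns.
TABLE = "my_table"
KEY = "id"
NEW_COLUMNS = ["new_col_a", "new_col_b"]

with open("source.csv", newline="") as f:
    for row in csv.DictReader(f):
        # One UPDATE per CSV row, filling only the newly added columns.
        # Values are quoted naively; escape quotes or use parameters for real data.
        assignments = ", ".join(f"{col} = '{row[col]}'" for col in NEW_COLUMNS)
        print(f"UPDATE {TABLE} SET {assignments} WHERE {KEY} = '{row[KEY]}';")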
You have several options:
If you are loading the data into a non-production system where you can edit the target tables, you could load the data into a new table, rename the old table to obsolete, and rename the new table to the old table name.
You can load the data into a staging table and then write a SQL statement to update the target table from the staging table (see the sketch after this list).
You can open the CSV file in Excel and write a formula to generate an update script, drag the formula down across all rows so that you get a separate update statement for each row, and then run the separate update statements in Management Studio.
If you have the full history in your CSV file, you can truncate the target table and update your existing SSIS package that imports the file to use the new columns.
There are more options, but any of the above would probably be more than adequate solutions.
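As a rough sketch of the staging-table option (the connection string, table, key and column names are all placeholders), you could load the CSV into a staging table with pandas and then run one set-based UPDATE:

import pandas as pd
from sqlalchemy import create_engine, text

# Placeholder SQL Server connection via a pyodbc DSN.
engine = create_engine("mssql+pyodbc://user:password@my_dsn")

# Load the CSV into a staging table, replaced on every run.
pd.read_csv("source.csv").to_sql("staging_csv", engine, if_exists="replace", index=False)

# Fill only the newly added columns on the target table from the staging table.
update_sql = text("""
    UPDATE t
    SET    t.new_col_a = s.new_col_a,
           t.new_col_b = s.new_col_b
    FROM   target_table AS t
    JOIN   staging_csv  AS s ON s.id = t.id
""")

with engine.begin() as conn:   # begin() commits automatically on success
    conn.execute(update_sql)

The set-based UPDATE avoids running one statement per row, which matters once the CSV gets large.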