I am trying to copy the data from one table to another table. Normally, using a SELECT statement we can read the whole table, and using an INSERT statement we can insert that data into another table. But I don't want to use raw SQL commands; I want to use the SQLAlchemy ORM to do the copy and insert. Is there any way to do it?
Are you just trying to add an entry to a database, or are you trying to duplicate an entry?
Adding would be done by simply doing:
ed_user = User(name='ed', fullname='Ed Jones', nickname='edsnickname')
session.add(ed_user)
session.commit()
The example was taken from the official documentation. The commit will actually write the data added to the session to the database.
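If the goal is instead to duplicate rows from one table into another, the same session pattern applies. This is only a rough sketch, assuming two hypothetical mapped classes, SourceUser and ArchiveUser, with matching columns and an existing session:
# Hypothetical: copy every row of SourceUser into ArchiveUser via the ORM.
for src in session.query(SourceUser).all():
    session.add(ArchiveUser(
        name=src.name,
        fullname=src.fullname,
        nickname=src.nickname,
    ))
session.commit()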
EDIT:
You'll have to write something that parses the file into objects and adds those objects to the database. It depends on what kind of file it is: if it's a database export, you can just import it with your preferred database tool. You can have a look at this blog post as well. The bottom line is that if you want to import from CSV / Excel / txt, you'll have to write something for it.
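As a sketch of that kind of import script (assuming a hypothetical CSV file named users.csv with name, fullname and nickname columns, and the User class and session from above):
import csv

with open("users.csv", newline="") as f:   # hypothetical file name
    for row in csv.DictReader(f):
        session.add(User(
            name=row["name"],
            fullname=row["fullname"],
            nickname=row["nickname"],
        ))
session.commit()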
I'm trying to insert some initial data into a Django table.
I've tried sqlcustom and sqlall, but they don't seem to work...
I followed the instructions and created a .sql file with some INSERT statements in it, like this:
INSERT INTO charts_wave (id, ...) VALUES (...);
This file is at the following path:
project/charts/sql/wave.sql
Maybe I didn't understand the purpose of sqlcustom and sqlall.
After running the commands, I run the syncdb command one final time.
Next I test it by calling Wave.objects.all() in the Python shell, and it returns an empty list.
What am I doing wrong? How can I insert this data with the .sql file?
The fixtures method seems too hardcoded for the really extensive data that I have to insert.
Thanks for now.
To me it seems you are looking for Django fixtures. They provide a way to import initial data into your Django apps. Instead of importing the data yourself, you include a fixture in your app, and when you run syncdb the data is imported automatically.
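For illustration, a fixture can also be generated by a small script rather than written by hand. This is only a sketch, assuming the charts app has a Wave model and using made-up field names:
import json

# Hypothetical rows; replace with the data you already have.
rows = [
    {"id": 1, "name": "Alpha"},
    {"id": 2, "name": "Beta"},
]

# Django fixture format: one dict per object with "model", "pk" and "fields".
fixture = [
    {"model": "charts.wave", "pk": row["id"],
     "fields": {k: v for k, v in row.items() if k != "id"}}
    for row in rows
]

# A fixture named initial_data.json in charts/fixtures/ is picked up by syncdb;
# any other name can be loaded with "manage.py loaddata <name>".
with open("charts/fixtures/initial_data.json", "w") as f:
    json.dump(fixture, f, indent=2)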
Friends, I am using Toad for MySQL, and have a huge database ready and validated.
Now I have an Excel file which contains data entries for a particular table, and I am able to import the data into the DB using the import wizard, mapping the first row header to the column names, etc.
But now I have appended a few data entries to it which I wish to insert into the database. However, the old values also get selected and hence cause a primary_key_violation exception, as those entries already exist! There is also a truncate table option, which I don't wish to use, as there may be many files from which I have inserted data.
I tried my level best but didn't find a solution, at least in Toad for MySQL. Please tell me what to do! The solution may be simple, but I need it urgently.
One option may be to not append records to that Excel file, but to create a new Excel file containing only the new records.
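If splitting the file by hand is tedious, a small script can do it. Purely a sketch, assuming the sheet's first column is the primary key and every id above some threshold is new (file names and the threshold are made up):
from openpyxl import Workbook, load_workbook

LAST_IMPORTED_ID = 1000                     # hypothetical: highest id already in the table

src = load_workbook("entries.xlsx").active  # hypothetical source file
out = Workbook()
dst = out.active

rows = src.iter_rows(values_only=True)
dst.append(next(rows))                      # copy the header row
for row in rows:
    if row[0] > LAST_IMPORTED_ID:           # keep only the new entries
        dst.append(row)

out.save("new_entries.xlsx")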
I am trying to update one of my SQL tables with new columns in my source CSV file. The CSV records in this file are already in this SQL table, but this SQL table is lacking some of the new columns from this CSV file.
I already added the new columns to my SQL table structure via ALTER TABLE. But now I just need to import the data from this CSV file into the new columns. How can I do this? I am trying to use SSIS and SQL Server to accomplish this, but am pretty new to Excel.
This is probably too late to solve salvationishere's problem, but I'm posting it for future readers!
You could just generate the SQL INSERT/UPDATE/etc. commands by parsing the CSV file (a simple Python script will do).
You could alternatively use this online parser:
http://www.convertcsv.com/csv-to-sql.htm
(Hoping that it'd still be available when you click!)
to generate your SQL commands. The interface is extremely straightforward and it does the entire job in an awesome way.
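As a rough illustration of the script route (table, key, and column names are hypothetical; the script prints the statements instead of running them, and does no real quoting or escaping):
import csv

TABLE = "dbo.MyTable"            # hypothetical target table
KEY = "RecordID"                 # hypothetical primary key column
NEW_COLUMNS = ["ColA", "ColB"]   # the columns added via ALTER TABLE

with open("source.csv", newline="") as f:   # hypothetical file name
    for row in csv.DictReader(f):
        assignments = ", ".join(f"{col} = '{row[col]}'" for col in NEW_COLUMNS)
        print(f"UPDATE {TABLE} SET {assignments} WHERE {KEY} = '{row[KEY]}';")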
You have several options:
If you are loading the data into a non-production system where you can edit the target tables, you could load the data into a new table, rename the old table to obsolete, and rename the new table to the old table name.
You can load the data into a staging table and then write a SQL statement to update the target table from the staging table (a rough sketch of that step follows this list).
You can open the CSV file in Excel and write a formula to generate an update script, drag the formula down across all rows so that you get a separate update statement for each row, and then run the separate update statements in management studio.
You can truncate the target table and update your existing SSIS package that imports the file to use the new columns, if you have the full history in your CSV file.
There are more options, but any of the above would probably be more than adequate solutions.
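For the staging-table route mentioned above, the update step might look roughly like this (a sketch using pyodbc; server, database, table, and column names are all hypothetical):
import pyodbc

# Hypothetical connection details.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=myserver;"
    "DATABASE=mydb;Trusted_Connection=yes"
)
cur = conn.cursor()

# Copy the new columns from the staging table into the target table,
# matching rows on a hypothetical key column.
cur.execute("""
    UPDATE t
    SET t.ColA = s.ColA,
        t.ColB = s.ColB
    FROM dbo.TargetTable AS t
    JOIN dbo.StagingTable AS s
      ON s.RecordID = t.RecordID
""")
conn.commit()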
I wonder if there is a (native) possibility to create a MySQL table from an .xls or .xlsx spreadsheet. Note that I do not want to import a file into an existing table with LOAD DATA INFILE or INSERT INTO, but to create the table from scratch, i.e. using the header row as column names (with some default field type, e.g. INT), and then insert the data in one step.
So far I have used a Python script to build a CREATE statement and imported the file afterwards (roughly as sketched below), but the approach feels clumsy.
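Something along these lines, as a sketch (using openpyxl; the file name, table name, and the default INT type are just placeholders):
from openpyxl import load_workbook

ws = load_workbook("data.xlsx").active        # hypothetical file
rows = ws.iter_rows(values_only=True)
header = next(rows)

# CREATE TABLE with a default field type for every column.
columns = ", ".join(f"`{name}` INT" for name in header)
print(f"CREATE TABLE `my_table` ({columns});")

# One INSERT per data row; values are left unquoted since the default type is INT.
column_list = ", ".join(f"`{name}`" for name in header)
for row in rows:
    values = ", ".join(str(v) for v in row)
    print(f"INSERT INTO `my_table` ({column_list}) VALUES ({values});")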
There is no native MySQL tool that does this, but the MySQL PROCEDURE ANALYSE might help you choose suitable column types.
With a VB Script you could do that. At my client we have a script which takes the worksheet name, the heading names and the field formats and generates a SQL script containing a CREATE TABLE and the INSERT INTO statements. We use Oracle, but for MySQL the principle is the same.
Of course, you could do it in an even more sophisticated way by accessing MySQL from Excel via ODBC and posting the CREATE TABLE and INSERT INTO statements that way.
I cannot provide you with the script, as it belongs to my client, but I can answer your questions about how to write such a script if you want to write one.
As the title says: I've got a bunch of tab-separated text files containing data.
I know that if I use 'CREATE TABLE' statements to set up all the tables manually, I can then import the files into the waiting tables using 'load data' or 'mysqlimport'.
But is there any way in MySQL to create tables automatically based on the tab files? Seems like there ought to be. (I know that MySQL might have to guess the data type of each column, but you could specify that in the first row of the tab files.)
No, there isn't. You need to CREATE a TABLE first in any case.
Automatically creating tables and guessing field types is not part of the DBMS's job. That is a task best left to an external tool or application (which then creates the necessary CREATE statements).
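Such an external tool can be quite small. As a sketch, assuming the first line of each tab file holds column definitions like id:INT or name:VARCHAR(50) and the data starts on the second line (file and table names are made up):
table = "my_table"
with open("data.tsv") as f:
    header = f.readline().rstrip("\n").split("\t")

# Turn "name:TYPE" entries into column definitions.
columns = ", ".join("`{}` {}".format(*field.split(":", 1)) for field in header)
print(f"CREATE TABLE `{table}` ({columns});")

# LOAD DATA handles tab-separated fields by default; skip the definition line.
print(f"LOAD DATA INFILE 'data.tsv' INTO TABLE `{table}` IGNORE 1 LINES;")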
If you're willing to type the data types in the first row, why not type a proper CREATE TABLE statement?
Then you can export the Excel data as a txt file and use
LOAD DATA INFILE 'path/file.txt' INTO TABLE your_table;