Saving MySQL records as a series of Insert Statements - mysql

Having carefully and painfully populated/manipulated a series of records in a table by hand, I want to retain them for reuse. As the table is rewritten daily, I'd like to save JUST these particular records as a series of INSERTs. The only way I know how to do this is to dump the whole table as SQL using a GUI, e.g. SQLyog.
But is there any quicker/better way to do this?

Would mysqldump help? (It's not a GUI.)
Edit: note that you can save just part of a table using this tool. Since it's command line, you can automate the task easily.
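For example, a minimal sketch using mysqldump's --where and --no-create-info options (the user, database, table, and condition are placeholders for your own):
mysqldump -u myuser -p --no-create-info --where="id BETWEEN 100 AND 120" mydb mytable > saved_rows.sql
The resulting file contains only INSERT statements for the matching rows, which you can replay later with the mysql client.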

Create a copy of your table with a meaningful name and, using an INSERT ... SELECT, copy in the records you're interested in.
Doing it this way gives you the most flexibility should you need to copy them back or compare them.
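A minimal sketch, assuming the hand-curated rows can be identified by some condition (the table name, the id column, and the WHERE clause are placeholders):
CREATE TABLE mytable_curated AS
SELECT * FROM mytable WHERE curated = 1;
Later you can compare the copy against the rebuilt table, e.g. to list curated rows the daily rewrite has dropped:
SELECT c.* FROM mytable_curated c
LEFT JOIN mytable t ON t.id = c.id
WHERE t.id IS NULL;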

Provided you still have your clean, manually populated table available, copy it into another table:
CREATE TABLE mytable_backup SELECT * FROM mytable;
Then you can re-introduce these to your daily rebuild table using a similar method:
INSERT INTO mytable SELECT * FROM mytable_backup;
Does this help?

How to create a SQL INSERT query from a PHP SELECT query

My problem:
I am trying to delete some important rows from multiple tables, around 20 of them. I am afraid that deleting the rows might cause some problem (I am not the creator of this website), so before deleting the rows I am selecting them and writing them to a file. But I write it as an array.
Is there a way to write it to a file as SQL INSERT statements instead, so that it would be easy for me to update the database if there is some problem?
For me it would be easier to store the information in a way that would allow me to understand the data. Then, IF I need it, I could mutate the data into an INSERT statement.
I strongly encourage you, as a professional software engineer, not to try to solve a problem that you might encounter until you DO encounter it.
If you use phpMyAdmin you can run a query that selects those rows, then click the Export link under Query results operations.
On the next page, select Custom - display all possible options and SQL format.
Then, further down the page, select data under Format specific options.
And then press Go. You will be prompted to save or open a file, which will include the appropriate INSERT statements to recreate the data from those rows.
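If you'd rather script it than click through phpMyAdmin, mysqldump can write the same kind of INSERT-only file; a sketch, assuming one condition can select the rows in every listed table (the names and id list are hypothetical):
mysqldump -u myuser -p --no-create-info --where="id IN (101, 102, 103)" mydb table1 table2 > rows_backup.sql
Replaying rows_backup.sql through the mysql client would re-insert the saved rows.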

Insert static data in database in SSDT

I want to pre-populate my database table with some defined data in SSDT, such that the insertion takes place only once. I am using SQL Server 2005. How can I do this? In SQL Server 2008 there is MERGE, but I am not finding a solution for 2005!
I was all excited until I saw you used SQL 2005 ;) Poor you, I feel your pain, as up until recently I was working with a SQL 2005 db.
You will need to do something like:
IF NOT EXISTS (SELECT * FROM MyTable WHERE Col = 'blah')
    INSERT INTO MyTable (Col) VALUES ('blah')
If you have lots of rows then you could look at doing something like checking one of the rows and doing a BULK INSERT or OPENROWSET from a CSV file, or use Red Gate's SQL Data Compare to manage it for you.
Ed
If it's pre-defined, you might be able to get away with saving that out and inserting using BCP or something similar. You can also write the script and call it as part of some sort of "IF NEW" check. However, your best bet for an ongoing script would be to insert into a temp table, then do some sort of EXCEPT or LEFT JOIN to figure out what doesn't exist and update or insert as appropriate.
There's not a really clean way to do it, but it is doable. My concern would be that you may want to consider a separate "New" script and tell people to create the database, then run that script afterwards if they re-created the DB. This would keep the size of your main release scripts a bit more manageable.
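A minimal sketch of that staging pattern (table and column names are hypothetical; SQL Server 2005 has no multi-row VALUES, hence the separate INSERTs):
-- stage the predefined rows
CREATE TABLE #staging (Id INT PRIMARY KEY, Name VARCHAR(50));
INSERT INTO #staging (Id, Name) VALUES (1, 'Alpha');
INSERT INTO #staging (Id, Name) VALUES (2, 'Beta');
-- insert only what is missing from the target table
INSERT INTO dbo.RefData (Id, Name)
SELECT s.Id, s.Name
FROM #staging s
LEFT JOIN dbo.RefData r ON r.Id = s.Id
WHERE r.Id IS NULL;
DROP TABLE #staging;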

Automated Data Import Stored Procedure From an Excel File

I have this Excel file:
Based on this data, I want to create a stored procedure that will identify the correct meter, if it exists, and perform either an insert or update to the monthly data.
Here is the MonthlyData table:
I really have no idea where to get started on this. Sorry about the tables, I am new here and I cannot post pictures yet. Please copy the tables and paste them into Excel.
Thank you
It's probably easiest to create an SSIS package for this if you're going to do this repeatedly.
First, create two tables:
myDataRaw
myDataCleaned
With myDataRaw, you truncate the table and then upload the Excel file into that table using a data upload object.
Create the stored procedure to work with the raw data. I would truncate the myDataCleaned table and then do an INSERT ... SELECT into it, making the WHERE clause specific to finding the account meters that you're looking for. If there are a lot, you can create another table to hold the specific account meters you want to import and use it in your WHERE clause.
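A sketch of that step (all table and column names are hypothetical, since the original tables weren't posted):
TRUNCATE TABLE dbo.myDataCleaned;
INSERT INTO dbo.myDataCleaned (MeterId, ReadingMonth, UsageKwh)
SELECT r.MeterId, r.ReadingMonth, r.UsageKwh
FROM dbo.myDataRaw r
WHERE r.MeterId IN (SELECT MeterId FROM dbo.WantedMeters);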
I hope that helps get you started.
Have you considered using a MERGE query? I have no idea what 'meter' means in this context, but if it's something that can be checked in the database itself, then a MERGE query will be the best solution to your problem.
http://www.jooq.org/doc/2.6/manual/sql-building/sql-statements/merge-statement/
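As a sketch of the upsert shape (MERGE needs SQL Server 2008 or later; the key and value columns here are hypothetical):
MERGE dbo.MonthlyData AS target
USING dbo.myDataRaw AS source
    ON target.MeterId = source.MeterId AND target.ReadingMonth = source.ReadingMonth
WHEN MATCHED THEN
    UPDATE SET target.UsageKwh = source.UsageKwh
WHEN NOT MATCHED THEN
    INSERT (MeterId, ReadingMonth, UsageKwh)
    VALUES (source.MeterId, source.ReadingMonth, source.UsageKwh);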

partial restore from sql dump?

I have a table with 7000 rows.
I added a new column to this table.
The table also has a MySQL DATETIME column.
When I updated the table to fill in this new column, it also updated the DATETIME values.
I took an SQL dump just before I did the update, so now I need to use the dump to revert the DATETIME values back (and only that column).
How do I do that?
There are a couple ways I can think of to do this off the top of my head.
The first is to create another MySQL database and load the dump into that database (make sure it's not going to load into the first database from a USE command in the dump), and then use the data from that database to construct the update queries for the first.
The second, easier, more hackish way, is to open the dump in a text editor, pull out just that table, and find and replace to make update statements for just that column based on primary key instead of inserts. You'd need to be able to find and replace on patterns.
A third way would be to load the dump in an abstract sql tool letting it do the parsing for you, and write new queries from the data in the abstract syntax trees.
A fourth, again hackish, possibility, if this isn't a live system, is to roll back and re-perform the more recent transformations (only if they are simple).
Restore the dump to a second table. Select the ID and datetime from that table. Use those results to update the rows in the original table corresponding to the IDs you got.
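A sketch of that approach, assuming the dump was loaded into a scratch database and the table has an id primary key and a created_at DATETIME column (all three names are placeholders):
UPDATE mytable t
JOIN scratch.mytable o ON o.id = t.id
SET t.created_at = o.created_at;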

MySQL: Dump a database from a SQL query

I'm writing a test framework in which I need to capture a MySQL database state (table structure, contents etc.).
I need this to implement a check that the state was not changed after certain operations. (Autoincrement values may be allowed to change, but I think I'll be able to handle this.)
The dump should preferably be in a human-readable format (preferably SQL, like mysqldump produces).
I wish to limit my test framework to using a MySQL connection only. To capture the state it should not call mysqldump or access the filesystem (like copying *.frm files or doing a SELECT ... INTO OUTFILE; pipes are fine, though).
As this would be test-only code, I'm not concerned by the performance. I do need reliable behavior though.
What is the best way to implement the functionality I need?
I guess I should base my code on some of the existing open-source backup tools... Which is the best one to look at?
Update: I'm not specifying the language I'm writing this in (no, it's not PHP), as I don't think I would be able to reuse code as is; my case is rather special (for practical purposes, let's assume the MySQL C API). The code will run on Linux.
Given your requirements, I think you are left with something like this (pseudo-code + SQL):
tables = mysql_fetch "SHOW TABLES"
foreach table in tables
    create = mysql_fetch "SHOW CREATE TABLE table"
    print create
    rows = mysql_fetch "SELECT * FROM table"
    foreach row in rows
        // or use the multi-row VALUES (v1, v2, ...), (v1, v2, ...) syntax (maybe preferable for smaller tables)
        insert = "INSERT INTO table (field1, field2, field3, ...) VALUES (value1, value2, value3, ...)"
        print insert
Basically, fetch the list of all tables, then walk each table and generate INSERT statements for each row by hand (most APIs have a simple way to fetch the list of column names; otherwise you can fall back to calling DESCRIBE table).
SHOW CREATE TABLE is done for you, but I'm fairly certain there's nothing analogous such as a SHOW INSERT ROWS.
And of course, instead of printing the dump you could do whatever you want with it.
If you don't want to use command-line tools (in other words, you want to do it completely within, say, PHP or whatever language you are using), then why not iterate over the tables using SQL itself? For example, to check the table structure, one simple technique would be to capture a snapshot of it with SHOW CREATE TABLE table_name, store the result, and then later make the call again and compare the results.
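As a sketch, two built-in statements give you comparable snapshots without leaving the connection (the table name is a placeholder; note CHECKSUM TABLE only tells you that something changed, not what):
SHOW CREATE TABLE mytable;
CHECKSUM TABLE mytable;
Run both before and after the operations under test and compare the results.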
Have you looked at the source code for mysqldump? I am sure most of what you want would be contained within that.
DC
Unless you build the export yourself, I don't think there is a simple solution to export and verify the data. If you do it table by table, LOAD DATA INFILE and SELECT ... INTO OUTFILE may be helpful.
I find it easier to rebuild the database for every test. At least then I know the exact state of the data. Of course, it takes more time to run those tests, but it's a good incentive to abstract away the operations and write fewer tests that depend on the database.
Another alternative I use on some projects where the design does not allow such a clean division: using InnoDB or some other transactional database engine works well. As long as you keep track of your transactions, or disable them during the test, you can simply start a transaction in setUp() and roll back in tearDown().
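A sketch of that pattern (this assumes InnoDB tables and that the test issues no DDL, since DDL statements in MySQL implicitly commit):
-- in setUp()
START TRANSACTION;
-- ... exercise the code under test ...
-- in tearDown()
ROLLBACK;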