Query 2 databases on 2 different SQL Servers - sql-server-2014

In reviewing many of the answers, I don't see a solution to something I feel should be simple.
I'm attempting to update a couple of fields in my Production database on one server from a restored database on another server, because of a loss of data caused by our ERP vendor's update.
Anyway, I have both servers connected in SSMS and just want to run the query below:
USE coll18_production;
GO
USE coll18_test2;
GO
UPDATE coll18_Production.dbo.PERSON
SET W71_ID_CRD_NO = T2.PERSON_USER1, W71_ID_CRD_DATE = T2.PERSON_USER9
FROM coll18_test2.dbo.PERSON as T2
WHERE coll18_Production.dbo.PERSON.ID = T2.ID;
I would think this would be a simple update, but I can't get a single query to span databases on two different servers this way.
Thanks if anyone can make this simple,
Donald
Okay, thanks for the input. In the essence of time I'm going to do something similar to what cpaccho recommended: create a temp table in my Production database containing the 2 fields that I want to update from. Then I'm going to connect to my Test2 database that I restored from backup, export those two fields along with the primary key as a CSV file, and load that data into the temp table in my production database. Then I'll simply run my update from this temp table into the 2 fields in my production PERSON table where the IDs are equal.
Have a great weekend,
Donald
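For anyone following the same route, that plan could be sketched roughly as below. This is only a sketch: the staging table name, CSV path, and column types are assumptions, so adjust them to the real PERSON schema.

```sql
-- On the production server: a staging table for the two fields plus the key
CREATE TABLE #PersonFix (
    ID            varchar(10)  NOT NULL,   -- assumed key type
    PERSON_USER1  varchar(20)  NULL,
    PERSON_USER9  datetime     NULL
);

-- Load the CSV exported from the restored Test2 database (path is hypothetical)
BULK INSERT #PersonFix
FROM 'C:\restore\person_fix.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

-- Apply the fix where the keys match
UPDATE p
SET p.W71_ID_CRD_NO   = t.PERSON_USER1,
    p.W71_ID_CRD_DATE = t.PERSON_USER9
FROM dbo.PERSON p
INNER JOIN #PersonFix t ON p.ID = t.ID;
```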

The problem is that since the databases are on 2 different servers in order to join between them you will need a way for the servers to talk to each other.
The way to do that is through linked servers. Then you can set up your query to join the 2 tables together using four-part naming (Server.DB.Schema.Table) and accomplish your goal. The query will look something like this:
UPDATE a
SET a.column = b.column
FROM Server1.DB.Schema.Table1 a
INNER JOIN Server2.DB.Schema.Table2 b
    ON a.column = b.column
WHERE a.column = something
You will only need to set up the linked server on one side, and the server name in the query will be the name you give the linked server. The only caveat is that this can be slow, because in order to join the tables SQL Server may have to copy the entire table from one server to the other. I would also set up the linked server on the server you are updating, so that you run the update on the same server as the DB you are updating.
How to set up Linked Server Microsoft KB
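Creating the linked server itself can also be done in T-SQL rather than through the SSMS dialog. A minimal sketch, assuming SQL authentication; the linked server name, address, and credentials are all placeholders:

```sql
-- Register the remote server as a linked server
EXEC sp_addlinkedserver
    @server     = N'TEST2SRV',          -- name you will use in 4-part queries
    @srvproduct = N'',
    @provider   = N'SQLNCLI',
    @datasrc    = N'123.155.12.2';      -- remote instance address

-- Map a login for the linked server (placeholder credentials)
EXEC sp_addlinkedsrvlogin
    @rmtsrvname  = N'TEST2SRV',
    @useself     = N'FALSE',
    @rmtuser     = N'restore_user',
    @rmtpassword = N'********';
```

After this, `TEST2SRV.coll18_test2.dbo.PERSON` can be referenced directly in the update.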

A simple, rather hacky way would be to hard copy the table from database to database...
First create a table that contains the changes you want:
USE coll18_test2;
GO
SELECT PERSON_USER1, PERSON_USER9, ID
INTO dbo.MyMrigationOrWhateverNameYouLike
FROM coll18_test2.dbo.PERSON
Then go to SSMS, right-click on the coll18_test2 database --> Tasks --> Generate Scripts, and follow the wizard to generate a script for the newly created table. Don't forget to set, in the advanced options, "Types of data to script" to "Schema and data".
Now that you have your script, just run it in your production database, and base your update query on that table:
UPDATE dbo.PERSON
SET W71_ID_CRD_NO = T2.PERSON_USER1, W71_ID_CRD_DATE = T2.PERSON_USER9
FROM dbo.MyMrigationOrWhateverNameYouLike AS T2
WHERE dbo.PERSON.ID = T2.ID;
Finally drop the MyMrigationOrWhateverNameYouLike table and you're done...

Related

MySQL - Auto/Schedule SQL Updating Tables

I am not a database engineer, but I have a question about what is possible with a MySQL database.
Is it possible to write SQL to get data from several tables and then use that data to update another table?
Also, this work should be scheduled daily.
The reason why I ask this question is because I am in this situation:
Our IT department maintains a big database, but the database/tables do not meet our department's business needs (we only have read permission). Our department has a small database (where we have all permissions), which we can use to create some special tables with custom SQL and update them daily.
So, back to the question: is it possible to set up the SQL and schedule it so that it keeps updating our tables?
Thank you so much!!!
Is it possible to write SQL to get data from several tables and
then use that data to update another table?
Yes, it is possible. You can use an UPDATE ... JOIN construct: get the data from several tables using a SELECT statement, then JOIN with that inline view and perform the update operation on your other table.
Example:
UPDATE Your_Table a
JOIN (
    -- SELECT query to get data from multiple other tables
) xxx ON a.some_column = xxx.some_matching_column
SET a.column_c = xxx.column_c;
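A filled-in sketch of that construct, with hypothetical table and column names standing in for the real ones:

```sql
UPDATE dept_summary a
JOIN (
    SELECT o.customer_id, SUM(o.amount) AS total_amount
    FROM it_db.orders o
    GROUP BY o.customer_id
) xxx ON a.customer_id = xxx.customer_id
SET a.total_amount = xxx.total_amount;
```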
Also, this work should be scheduled daily
Sure, use the MySQL Event Scheduler.
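A sketch of such a daily event, assuming the Event Scheduler is enabled; the event, table, and column names are all hypothetical:

```sql
-- The scheduler must be on for events to fire
SET GLOBAL event_scheduler = ON;

-- Run the refresh once a day
CREATE EVENT refresh_dept_summary
ON SCHEDULE EVERY 1 DAY
DO
    UPDATE dept_summary a
    JOIN it_db.big_table b ON a.id = b.id
    SET a.value = b.value;
```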

SQL Server 2008 : Update table records from one database to another database present on two different servers

I have two databases with the same schema, name, stored procedures, same tables, same records on two different servers.
For example, database test is present on both of the following servers with everything the same, including data: server 1 = 123.155.12.1 and server 2 = 123.155.12.2.
Now I need to update records in a table on server 2 whenever there is an update done to the same table on server 1. Is there any query syntax for this
(excluding the replication option)? If yes, please help me with an example.
Thanks in advance!
Regards.
Aksh
Hi #Akki, you could make use of the OPENQUERY option to achieve this. I am not sure if you want to perform this action for every update done on the table in SERVER1, or if this is just a one-time update.
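For the one-time case, a sketch of the OPENQUERY approach, assuming a linked server named SERVER1 has already been set up (table and column names are placeholders):

```sql
UPDATE t
SET t.col1 = s.col1
FROM test.dbo.MyTable t
INNER JOIN OPENQUERY([SERVER1],
    'SELECT id, col1 FROM test.dbo.MyTable') s
    ON t.id = s.id;
```

Run this on server 2, so only the remote SELECT travels over the link.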

how can I inner join 2 mysql tables on 2 different mysql server

I am trying to inner join 2 tables located on 2 different MySQL Servers.
I am trying to do something like this:
INSERT INTO server1.db2.account (id, name)
SELECT id, name FROM server2.db1.account AS new_data
ON DUPLICATE KEY UPDATE name = new_data.name;
In SQL Server there is the linked server feature which allows you to do this, but I am not sure how this is done with MySQL Server.
Note: I need to be able to inner join all the tables from one server to another.
I am looking for a solution where I don't have to do every table separately as I have multiple servers and many tables on each database.
Thanks for your time and help.
Go read about the federated engine. It is the MySQL version of linked servers. You will be able to query a remote table like a local table using the federated engine. Read this link.
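A sketch of a FEDERATED table definition on server1 pointing at the account table on server2; the connection details and column types here are assumptions and must match the real remote table:

```sql
CREATE TABLE db2.account_remote (
    id   INT NOT NULL PRIMARY KEY,
    name VARCHAR(64)
) ENGINE=FEDERATED
CONNECTION='mysql://remote_user:secret@server2:3306/db1/account';

-- Once defined, the remote table can be used like a local one,
-- e.g. for the upsert from the question:
INSERT INTO db2.account (id, name)
SELECT id, name FROM db2.account_remote AS new_data
ON DUPLICATE KEY UPDATE name = new_data.name;
```

Note that you would still need one FEDERATED definition per remote table, so this does not remove the per-table setup work.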

Setting up a master database to control the structure of other databases

I got a case where I have several databases running on the same server. There's one database for each client (company1, company2 etc). The structure of each of these databases should be identical with the same tables etc, but the data contained in each db will be different.
What I want to do is keep a master db that will contain no data, but manage the structure of all the other databases, meaning if I add, remove or alter any tables in the master db the changes will also be mirrored out to the other databases.
Example: If a table named Table1 is created in the master DB, the other databases (company1, company2 etc) will also get a table1.
Currently it is done by a script that monitors the database logs for changes made to the master database and runs the same queries on each of the other databases. I looked into database replication, but from what I understand this would also bring along the data from the master database, which is not an option in this case.
Can I use some kind of logic against database schemas to do it?
So basically what I'm asking here is:
How do I make this sync happen in the best possible way? Should I use a script monitoring the logs for changes or some other method?
How do I avoid existing data getting corrupted if a table is altered? (data getting removed if a table is dropped is okay)
Is syncing from a master database considered a good way to do what I wish (having an easily maintainable structure across several databases)?
How will making updates like this affect the performance of the databases?
Hope my question was clear and that this is not a duplicate of some other thread. If more information and/or a better explanation of my problem is needed, let me know :)
You can get the list of tables for a given schema using:
select TABLE_NAME from information_schema.tables where TABLE_SCHEMA='<master table name>';
Use this list in a script or stored procedure, a la:
create database if not exists <name>;
use <name>;
for each ( table_name in list )
    create table if not exists <name>.table_name like <master_table>.table_name;
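The pseudocode above could be made concrete as a MySQL stored procedure; this is a sketch, where `master_db` and the target database name are placeholders:

```sql
DELIMITER //
CREATE PROCEDURE sync_tables(IN target_db VARCHAR(64))
BEGIN
    DECLARE done INT DEFAULT 0;
    DECLARE tbl VARCHAR(64);
    DECLARE cur CURSOR FOR
        SELECT TABLE_NAME FROM information_schema.tables
        WHERE TABLE_SCHEMA = 'master_db';
    DECLARE CONTINUE HANDLER FOR NOT FOUND SET done = 1;

    OPEN cur;
    read_loop: LOOP
        FETCH cur INTO tbl;
        IF done THEN LEAVE read_loop; END IF;
        -- Create the table in the target db if it is missing
        SET @ddl = CONCAT('CREATE TABLE IF NOT EXISTS `', target_db, '`.`',
                          tbl, '` LIKE `master_db`.`', tbl, '`');
        PREPARE stmt FROM @ddl;
        EXECUTE stmt;
        DEALLOCATE PREPARE stmt;
    END LOOP;
    CLOSE cur;
END //
DELIMITER ;
```

Usage would then be one call per client database, e.g. `CALL sync_tables('company1');`. Note this only creates missing tables; it does not propagate ALTERs to existing ones.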
Now that I'm thinking about it, you might be able to put a trigger on 'information_schema.tables' that would call the create/maintain script. Look for inserts and react accordingly.

How do I rescue a small portion of data from a SQL Server database backup?

I have a live database that had some data deleted from it and I need that data back. I have a very recent copy of that database that has already been restored on another machine. Unrelated changes have been made to the live database since the backup, so I do not want to wipe out the live database with a full restore.
The data I need is small - just a dozen rows - but those dozen rows each have a couple rows from other tables with foreign keys to it, and those couple rows have god knows how many rows with foreign keys pointing to them, so it would be complicated to restore by hand.
Ideally I'd be able to tell the backup copy of the database to select the dozen rows I need, and the transitive closure of everything that they depend on, and everything that depends on them, and export just that data, which I can then import into the live database without touching anything else.
What's the best approach to take here? Thanks.
Everyone has mentioned sp_generate_inserts. When using this, how do you prevent identity columns from messing everything up? Do you just turn IDENTITY_INSERT on?
I've run into similar situations before, but found that doing it by hand worked best for me.
I restored the backup to a second server and ran my query to get the information that I needed. I then built a script to insert the data using sp_generate_inserts, and repeated this for each of my tables that had related rows.
In total I only had about 10 master records with related data in 2 other tables. It only took me about an hour to get everything back the way it was.
UPDATE: To answer your question about sp_generate_inserts, as long as you specify @owner='dbo', it will set IDENTITY_INSERT ON and then set it OFF at the end of the script for you.
You'll have to restore by hand. sp_generate_inserts is good for new data, but to update data I do it this way:
SELECT 'UPDATE YourTable '
    + 'SET Column1 = ' + COALESCE('''' + CONVERT(varchar, Column1Name) + '''', 'NULL')
    + ', Column2 = '   + COALESCE('''' + CONVERT(varchar, Column2Name) + '''', 'NULL')
    + ' WHERE Key = '  + COALESCE('''' + CONVERT(varchar, KeyColumn)  + '''', 'NULL')
FROM backupserver.databasename.owner.YourTable
You could create inserts this way too, but sp_generate_inserts is better. Watch those identity values, and good luck (I've had this problem before and know where you're at right now).
useful queries:
-- find out if there are missing rows, and which ones
SELECT
    b.key, c.key
FROM backupserver.databasename.owner.YourTable b
LEFT OUTER JOIN YourTable c ON b.key = c.key
WHERE c.key IS NULL
-- find differences
SELECT
    b.key, c.key
FROM YourTable c
LEFT OUTER JOIN backupserver.databasename.owner.YourTable b ON c.key = b.key
WHERE b.key IS NOT NULL
  AND ( ISNULL(c.column1, -9999)    != ISNULL(b.column1, -9999)
     OR ISNULL(c.column2, '~')      != ISNULL(b.column2, '~')
     OR ISNULL(c.column3, GETDATE()) != ISNULL(b.column3, GETDATE())
      )
SQL Server Management Studio for SQL Server 2008 allows you to export table data as insert statements. See http://www.kodyaz.com/articles/sql-server-script-data-with-generate-script-wizard.aspx. This approach lacks some of the flexibility of sp_generate_inserts (you cannot specify a WHERE clause to filter the rows in your table, for example) but may be more reliable since it is part of the product.