SSIS: Copying tables from one server to another - sql-server-2008

I have been tasked with moving 500+ tables from one server to another. Both servers have the same tables, and the tables have the same schema. The issue is that some tables have a read-only column, so I can't just import and dump. In addition, the servers are not linked, and the company has a no-linking policy.
This is where I stand:
I have the table names stored in a table on my source server/database. I have an Execute SQL Task that passes each table name to a Foreach Loop Container. My thought is to write a script, but this is where I am scratching my head: I do not know what the next step would be. Can someone point me in the right direction?
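For illustration, here is a minimal sketch of the kind of query the Execute SQL Task could run to build the list that feeds the Foreach Loop Container; the control table and column names are assumptions, not anything from the original post:

    -- Hypothetical table-list query for the Execute SQL Task
    -- (ResultSet = Full result set). The Foreach Loop Container then
    -- shreds the result into variables, one table name per iteration,
    -- which a script or data flow can use to copy that table.
    SELECT SchemaName, TableName
    FROM dbo.TablesToMigrate      -- assumed control table on the source
    ORDER BY LoadOrder;           -- assumed column controlling copy order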
Edit:
This is a graphical representation of my package.

SQL Server Data Tools (SSDT) has schema and data comparison and synchronization tools. There are also third-party tools available.

Related

Create a simplified version of a database on another server

I am required to create a test environment for some of our .Net applications, and some of these applications use only a small portion of some rather large databases. My idea is to create a 'small' database which would hold only the tables, stored procedures, views, etc. that are being used by the application.
This will hopefully speed up refresh time on these 'small' databases. However, I can't see a simple way of doing this. Is there an option to do this easily within SQL Server, or via a T-SQL script?
Currently the best method I have is to generate a script from the database, select only the tables I require with the 'data only' option, and then run these scripts on the 'small' database to get the data up to date. As you can imagine, this is a lengthy process, and I would prefer something a bit more automated.
Any suggestions you can provide are very much appreciated.
Thanks,
Michael Tempest
Replication can be a solution for this problem: publish only the items/articles you want to your test database, and you can pause and restart synchronization when needed.
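As a rough sketch (the publication and table names here are invented, and the database must already be enabled for publishing with a Distributor configured), publishing a single table as an article with transactional replication looks something like this:

    -- Hypothetical sketch: publish just one table as an article,
    -- rather than the whole database. All names are made up.
    EXEC sp_addpublication
        @publication = N'TestDbPub',
        @repl_freq   = N'continuous';

    EXEC sp_addarticle
        @publication   = N'TestDbPub',
        @article       = N'Customers',
        @source_object = N'Customers';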
SSMS Script As: Another easy way would be to go into SSMS, right-click the objects you want to copy to the test database, and choose Script As > CREATE. Do this for all the items you want to move, save the scripts in the right order (tables first, then the objects that depend on them) in one file, and run that file on the target database.
Since it's only you who knows which items to move over to the test db, I think it will be difficult to find a ready-made script that suits your needs.
Some useful tips for using the Script As option.
To generate the SQL script for the objects:
SQL Server Management Studio > Databases > Database1 > Tasks > Generate Scripts...
The SQL Server Scripts Wizard will start, and you can choose the objects and settings to export into scripts.
By default the scripting of indexes and triggers is not included, so make sure to turn these on (along with any others that you are interested in).
To export the data from the tables:
SQL Server Management Studio > Databases > Database1 > Tasks > Export Data...
Choose the source and destination databases.
Select the tables to export. Make sure to check the Identity Insert checkbox for each table so that new identity values are not created.
Then create the new database, run the scripts to create all of the objects, and then import the data.
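If you ever move rows by hand instead of through the wizard, the identity-preserving behaviour that checkbox enables looks roughly like this in T-SQL (the database, table, and columns are made up for illustration):

    -- Hypothetical manual copy that keeps the original identity values.
    SET IDENTITY_INSERT dbo.Customers ON;

    INSERT INTO dbo.Customers (CustomerId, Name)   -- explicit column list required
    SELECT CustomerId, Name
    FROM SourceDb.dbo.Customers;

    SET IDENTITY_INSERT dbo.Customers OFF;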
For the Dev database we just keep a structural copy of the Production one with some data. Periodically we compare the databases with a tool that compares and syncs database structure (there are plenty of such tools now - we use Redgate's).
For the prod_copy database we just do a backup and restore of the prod db, then truncate the biggest tables and shrink the database if needed.
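A minimal sketch of that refresh, with invented file, table, and path names:

    -- Hypothetical prod_copy refresh from the latest production backup.
    RESTORE DATABASE prod_copy
        FROM DISK = N'D:\Backups\prod_full.bak'
        WITH MOVE N'prod_data' TO N'D:\Data\prod_copy.mdf',
             MOVE N'prod_log'  TO N'D:\Log\prod_copy.ldf',
             REPLACE;                        -- overwrite the previous copy

    TRUNCATE TABLE dbo.BigAuditLog;          -- drop bulk data dev doesn't need

    DBCC SHRINKFILE (N'prod_data', 1024);    -- optional: shrink data file to ~1 GB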
If you want to completely automate the procedure, you can run both SQL Compare and SQL Data Compare from the command line. I am not sure whether other vendors' SQL tools have such an option.

Import table structure but no data from one database to another

I have a database with multiple tables in Microsoft SQL Server, with the tables under the schema "xyz".
I am able to copy these tables along with their data from one SQL Server to another using the Export and Import Wizard of SQL Server.
I want to find a way to:
1. Copy only the tables, with no data.
2. Is it possible to convert the current database design to a script and then run it on another server, which will create all these tables empty?
Thanks in advance.
Best Regards
Yes, you can do that with Management Studio. Right-click your database and then select Tasks -> Generate Scripts.
There are some settings there you should tweak, such as whether it should generate scripts for indexes and statistics. They are all in plain sight.
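For a quick one-off on individual tables there is also the old SELECT ... INTO trick; note that it copies columns, nullability, and the identity property, but not constraints, indexes, or triggers (the database and table names below are invented):

    -- Hypothetical structure-only copy: the WHERE clause that is never
    -- true means no rows are copied, only the column definitions.
    SELECT *
    INTO   TargetDb.xyz.MyTable
    FROM   SourceDb.xyz.MyTable
    WHERE  1 = 0;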
An alternative is SQL Server Data Tools. It's relatively new (ex-Data Dude). It's not as straightforward, but it's better in the long term, for database versioning and for creating migration scripts.

How to keep table data the same in Oracle and SQL Server

I am trying to build a database in SQL Server that replicates the exact data present in tables in an Oracle production database. The database in SQL Server will be used for reporting and analysis. I want every new or updated row in the Oracle tables to be present in the SQL Server tables within around a one-hour time span. Does SQL Server Integration Services help with this? Is there any tool that does this, i.e. makes sure that the data present in the Oracle tables and the SQL Server tables is always the same (neglecting the one-hour lag)?
There are two things you could look into: replication and SSIS. SQL Server replication allows you to replicate data from Oracle to MSSQL so that would be one way to handle the data copy. On the other hand, if you plan on doing data transformations, mappings etc. then you might want to use SSIS because it's a full ETL tool.
One important question is how you can identify new data in Oracle, because that may determine at least the first part of your solution. And you then have to decide what transformations are necessary once you've copied the data into SQL Server; perhaps you will need to run some stored procedures to clean the data and put it into reporting tables. Since your reporting system is a different platform from the source, you will need to handle data type transformations at some point, whatever solution you choose.
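For instance, if the Oracle tables carry a last-modified column (an assumption; the table and column here are invented), the hourly pull can be a simple parameterized extract:

    -- Hypothetical hourly extract run against Oracle. The ? marker is
    -- how an SSIS OLE DB source passes in a parameter - here, the
    -- cutoff timestamp recorded by the previous run.
    SELECT ORDER_ID, CUSTOMER_ID, AMOUNT, LAST_UPDATED
    FROM   ORDERS
    WHERE  LAST_UPDATED > ?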
Your question is quite general, and it isn't really possible to say what you should do without a lot more detail about your environment, your requirements, your resources and so on. I suggest that you try to break down your task into smaller ones, and then you should be able to ask more specific questions.

Refreshing a reporting database

We currently have an OLTP SQL Server 2005 database for our project. We are planning to build a separate, de-normalized reporting database so that we can take the load off our OLTP DB. I'm not quite sure which approach is best for syncing these databases. We are not looking for a real-time system, though. Is SSIS a good option? I'm completely new to SSIS, so I'm not sure about its feasibility. Kindly provide your inputs.
Everyone has their own opinion of SSIS, but I have used it for years for data marts, and my current environment is a full BI installation. I personally love its capabilities for moving data, and it still holds the world record for moving 1.13 terabytes in under 30 minutes.
As for setup, we use log shipping from our transactional DB to populate a second box, then use SSIS to de-normalize and warehouse the data. The community around SSIS is also very large, and there are tons of free training materials and helpful resources online.
We build our data warehouse using SSIS, and we run our reports from it. It's a big learning curve, and the errors it throws aren't particularly useful. It helps to be good at SQL rather than treating SSIS as a 'row by row' transfer tool - what I mean is that you should be creating set-based queries in SQL command tasks rather than using lots of SSIS components and data flow tasks.
Understand that every warehouse is different and you need to decide how to build yours best. This link may give you some good ideas.
How we implement ours (we have a Postgres backend and use the PGNP provider; making use of linked servers could make your life easier):
First of all, you need a timestamp column in each table so you can tell when a row was last changed.
Then write a query that selects the data that has changed since the package last ran (an audit table helps here) and get that data into a staging table. We run this as a data flow task because (using Postgres) we don't have any other choice, although you may be able to use a normal cross-database reference (dbname.schemaname.tablename or something like that) or a linked server query. Either way the idea is the same: you end up with the data that has changed since your last run.
We then update (based on id) the data that already exists, and insert the new data (by left-joining to the warehouse table to find out what doesn't already exist there).
So now we have one denormalised table that shows, in this case, jobs per day. From this we calculate other tables based on aggregated values from this one.
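A minimal sketch of that update-then-insert step, with invented staging and warehouse table names:

    -- Hypothetical upsert from staging into the warehouse table.
    -- 1) Update rows that already exist (matched on id).
    UPDATE w
    SET    w.JobCount   = s.JobCount,
           w.LastChange = s.LastChange
    FROM   dbo.WarehouseJobs AS w
    JOIN   dbo.StagingJobs   AS s ON s.JobId = w.JobId;

    -- 2) Insert rows the warehouse has never seen (the left join finds them).
    INSERT INTO dbo.WarehouseJobs (JobId, JobCount, LastChange)
    SELECT s.JobId, s.JobCount, s.LastChange
    FROM   dbo.StagingJobs      AS s
    LEFT JOIN dbo.WarehouseJobs AS w ON w.JobId = s.JobId
    WHERE  w.JobId IS NULL;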
Hope that helps; here are some good links that I found useful:
Choosing .Net or SSIS
SSIS Talk
Package Configurations
Improving the Performance of the Data Flow
Transformations
Custom Logging / Good Blog

Migration strategies for SQL 2000 to SQL 2008

I've perused the threads here on migration from SQL 2000 to SQL 2008 but haven't really run into my question, so here we go with another one.
I'm building a strategy to move specific SQL 2000 databases to a new SQL 2008 R2 instance. My question concerns the best method for transferring the schema and data. One way I know of is the quick 'n' dirty detach-copy-attach method, which should work as long as I've done my homework with regard to compatibility, code, and such.
What if, though, I wrote the schema and logins via script and then copied the data via SSIS? I'm thinking of trying that so I can more easily integrate some of my test cases into the package (error handling and whatnot). What would I be setting myself up for if I did this?
Since you are moving the data between servers or instances, I would recommend moving the data via data flows. If you don't expect to run the move more than once, then you can let the wizard generate the package for you. However, when I did this 2+ years ago, the wizard generated Execute SQL Tasks that combined many "create table" commands into one task, plus a few data flow tasks that each had multiple sources and destinations for inserting data into the destination. This was good for getting up and running, but it was inadequate when I wanted to refresh the tables one more time after I had modified the schema of the new target tables. If you expect to run the refresh more than once, you may want to take the time to create the target schema first and then build the data flows manually.
Once you have moved the data, you can enable full-text search on the new server. I don't believe you will need it enabled for your first load.
One reason I recommend against the detach-attach method for migration is that you bring all the dirty laundry from the 2000 database over to the 2008 R2 database. If security was too lax on the 2000 server, or there are many ancient users that shouldn't exist, it can be easier to clean things up by starting from scratch. If you use the detach-attach method, then you have to worry about orphaned users.
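For example, after an attach or restore, SQL logins can become "orphaned" database users; on 2008 R2 the classic (now deprecated) way to find them is:

    -- Lists database users whose SID no longer matches a server login.
    EXEC sp_change_users_login 'Report';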