Can I compare the data from two SQL Server 2008 R2 databases using the built-in tools in SQL Server Management Studio, or do I need to look for a third-party tool to do this?
I've used "the Google" but only seem to get information about third-party tools.
Thanks.
Use the EXCEPT operator or CHECKSUM, as I explain in this post.
Just note that the CHECKSUM method can produce collisions, so matching checksums don't guarantee the data is identical.
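A minimal sketch of both ideas, assuming the two databases live on the same server and the database/table names below are placeholders:

    -- rows present (or different) in A but not in B
    SELECT * FROM DatabaseA.dbo.MyTable
    EXCEPT
    SELECT * FROM DatabaseB.dbo.MyTable;

    -- and the other direction
    SELECT * FROM DatabaseB.dbo.MyTable
    EXCEPT
    SELECT * FROM DatabaseA.dbo.MyTable;

    -- quick (collision-prone) per-table fingerprint
    SELECT CHECKSUM_AGG(BINARY_CHECKSUM(*)) FROM DatabaseA.dbo.MyTable;
    SELECT CHECKSUM_AGG(BINARY_CHECKSUM(*)) FROM DatabaseB.dbo.MyTable;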
If your database is big, you may want to look into SQL Data Compare.
If you need very sophisticated comparisons, you can use the open-source DiffKit:
www.diffkit.org
You can use SQL Server Data Tools. For the most part it works well, but it is light on features and shows its limitations on complex tasks.
Third-party tools like xSQL Data Compare give you a huge amount of control over the comparison and synchronization process, so I suggest you take a look at those as well.
Disclaimer: I'm affiliated with xSQL.
Does anyone know of a quick and easy test to see if a query is properly formatted for both MySQL and MSSQL (SQL Server)? Perhaps for other database types as well? I only have access to MySQL at this time.
Info: I'm working on an open-source project called JJWDesign Google Maps for SugarCRM. Some of the queries use the SugarCRM classes; others I have to write by hand. For example, some are special distance calculations against the geocode information stored in the tables.
http://www.sugarforge.org/projects/jjwgooglemaps/
More importantly, while there is an accepted syntax, each flavour of database has its own specific functions, features and things you can do.
The best you can do is stick to the most basic features. Oracle has different date/time functions compared to MySQL compared to DB2. While I would love to assist in a 'free as in beer' project, you really will need to check each function to see if it is the same across all major vendors. General functions most often are, so abs() will be fairly consistent, but others simply won't be.
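For example, something as simple as getting the current date and time is spelled differently on each platform:

    SELECT GETDATE();                                -- SQL Server
    SELECT NOW();                                    -- MySQL
    SELECT SYSDATE FROM dual;                        -- Oracle
    SELECT CURRENT TIMESTAMP FROM sysibm.sysdummy1;  -- DB2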
You're talking about a SQL parser so by definition it either isn't going to be quick and easy or it will do only the simplest checking.
Each RDBMS has its own flavour of SQL too, so you'd really be limited to testing whether it was ANSI SQL.
I'm trying to find a good tool (open source or commercial) for doing comparisons of database instances, for example:
Compare 2 database schemas; generate platform specific change script (either direction) to bring one into synch with the other
Compare data (table contents), generate platform specific change script (either direction) to bring one into synch with the other
Migrate an entire database schema + data from one platform to another, i.e., port an Oracle database to SQL Server (hopefully including support for sequences and identity columns)
For 1 & 2 above, by platform specific I mean the script native to the specific database platform, not, for example, an ODBC equivalent script.
Personally, I am mainly concerned with SQL Server and Oracle support, but MySQL support would be very nice to have as well.
Quest Toad or Red Gate SQL Compare would be the most likely options.
I'd give a try to "SQL Server Migration Assistant for Oracle (SSMA for Oracle)"
http://www.microsoft.com/sqlserver/2005/en/us/Migration-oracle.aspx
It will be really useful to you for points 1 and 3.
I used it for some data migration work, and although it didn't solve everything by itself, it really saved me a lot of time.
I'd like to clarify Red Gate's position. Schema Compare for Oracle is now available. For those who have used SQL Compare (for SQL Server), this tool will be very familiar. MySQL Compare also exists.
http://www.red-gate.com/Products/schema_compare_for_oracle/index.htm
ER/Studio (see http://www.embarcadero.com) will do most of what you're looking for, except for the data comparison. It is pricey, however.
I am a .NET developer with average SQL skills. I am working on a web app that is 'database heavy'. I have been using the Profiler to debug queries and procs. Is there a way to use this tool to look at the performance of a query or procedure?
How do I make the most of the SQL Profiler?
I am using SQL Server 2008.
There are no hard and fast rules per se, as it depends on the type of database system you are working with.
As a general starting point with SQL Server performance tuning you will find the following reference to be very useful indeed. It contains a variety of considerations and instructions.
http://www.brentozar.com/sql-server-performance-tuning/
Also take a look at the following article, "Identifying Performance issues using SQL Server Profiler"
http://vyaskn.tripod.com/analyzing_profiler_output.htm
If you need additional assistance just drop me a line.
I am not familiar with SQL Server 2008, but in SQL Server 2005 your best bet is the "Display Execution Plan" feature. It will let you know what parts of your query are taking up the most time. Typically adding indexes will help immensely, and this tool will help you identify where they are most needed.
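Roughly how that looks in practice: in Management Studio enable "Include Actual Execution Plan" from the Query menu (Ctrl+M) and run the statement. The SET options below complement it with timing and I/O figures; the sample query is just a placeholder:

    SET STATISTICS IO ON;
    SET STATISTICS TIME ON;

    SELECT TOP (100) * FROM dbo.Orders;   -- replace with your own query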
It's a wonderful and powerful tool, but be careful and don't do the following:
1) Don't Capture Everything. You can have so many events/objects that you won't be able to find anything. You can set filters by app/user/database.
2) Don't run profiler on the production machine.
3) Don't save the trace info on a production database. Save it to a file that is stored separately from production files.
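If you do save the trace to a file, you can later load it into a table on a non-production server for analysis; something along these lines (path and table name are placeholders):

    SELECT TextData, Duration, Reads, Writes, CPU
    INTO dbo.ProfilerResults
    FROM sys.fn_trace_gettable('C:\traces\slow_queries.trc', DEFAULT);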
Do you know of any applications to synchronize two databases? During development it's sometimes necessary to add one or two table rows, or a new table or column.
Usually I write every SQL statement to a file, and during deployment I execute those lines against my production database (after backing it up first).
I work with MySQL and PostgreSQL databases.
What is your practice, and what applications help you with it?
You asked for a tool or application answer, but what you really need is a process answer. The underlying theme here is that you should be versioning your database DDL (and DML, when needed) and providing change scripts to be able to update any version of your database to a higher version.
This set of links, provided by Jeff Atwood and written by K. Scott Allen, explains in detail what this ought to look like - and they do it better than I could possibly write up here: http://www.codinghorror.com/blog/2008/02/get-your-database-under-version-control.html
For PostgreSQL you could use Another PostgreSQL Diff Tool (apgdiff). It can diff two SQL dumps very fast (a few seconds on a DB with about 300 tables, 50 views and 500 stored procedures), so you can find your changes easily and get an SQL diff which you can execute.
From the APGDiff Page:
Another PostgreSQL Diff Tool is a simple PostgreSQL diff tool that is useful for schema upgrades. The tool compares two schema dump files and creates an output file that is (after some hand-made modifications) suitable for upgrading the old schema.
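The workflow is roughly this (database and file names are placeholders):

    pg_dump --schema-only old_db > old.sql
    pg_dump --schema-only new_db > new.sql
    java -jar apgdiff.jar old.sql new.sql > upgrade.sql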
Have scripts (under source control, of course) that you only ever add to the bottom of. Combine that with regular restores from your production database to dev and you should be golden. If you are strict about it, this works very well.
Otherwise I know lots of people use the Red Gate stuff for SQL Server.
Another vote for RedGate SQL Compare
http://www.red-gate.com/products/SQL_Compare/index.htm
Wouldn't want to live without it!
Edit: Sorry, it seems this is only for SQL Server. Still - if any SQL Server users have the same question I'd definitely recommend this tool.
If you write your SQL statements for your development database (which are, I imagine, a series of DDL instructions such as CREATE, ALTER and DROP), why don't you keep track of them by recording them in a table, with a "version" index? You will then be able to:
track your version changes
make a small routine allowing the "automatic" update of your production database by sending the recorded instructions to the database.
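A minimal sketch of such a table (names are made up; this form works in both MySQL and PostgreSQL):

    CREATE TABLE ddl_history (
        version     INTEGER PRIMARY KEY,
        ddl_text    TEXT NOT NULL,
        applied_on  TIMESTAMP DEFAULT CURRENT_TIMESTAMP
    );

    INSERT INTO ddl_history (version, ddl_text)
    VALUES (12, 'ALTER TABLE customer ADD COLUMN phone VARCHAR(32)');

The "small routine" then just selects every row with a version higher than the last one applied on production and executes its ddl_text in order.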
I really like the EMS tools.
Their tools are available for all popular DBs and you get the same user experience for every type of DB.
One of the tools is the DB Comparer.
TOAD has saved many an ass several times in the past. Why do people run SQL with no exit strategy?
The Red Gate one is good also.
Siebel (CRM, Sales, etc. management product) has a built-in tool to align the production database with the development one (dev2prod).
Otherwise, you've got to stick with manually executed scripts.
Navicat has a structure synchronisation wizard that handles this.
I solve this by using Hibernate. It can detect and auto-create missing tables, columns, etc.
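If you go that route, the relevant setting is Hibernate's schema-management property; a sketch, assuming a hibernate.properties file (the same key works in hibernate.cfg.xml or persistence.xml):

    # 'update' adds missing tables/columns but never drops anything
    hibernate.hbm2ddl.auto=update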
You could add some automation to your current way of doing things by using dbDeploy or a similar script. This will allow you to keep track of your schema changes and to upgrade/rollback your schema as you see fit.
Here's a straightforward Linux bash script I wrote for syncing Magento databases... but you can easily modify it for other uses :)
http://markshust.com/2011/09/08/syncing-magento-instance-production-development
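The core of that kind of sync is essentially a dump-and-reload; a simplified sketch with placeholder hosts, users and database names (the full script in the link layers Magento-specific cleanup on top of this):

    mysqldump -h prod.example.com -u deploy -p magento_prod > prod_dump.sql
    mysql -h localhost -u deploy -p magento_dev < prod_dump.sql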
DBV - "Database version control, made easy!" (PHP)
I often have data in Excel or text files that I need to get into SQL Server. I can use ODBC to query the Excel file and I can parse the text file. What I want, though, is some tool that will just grab the data and put it into tables with little or no effort. Does anyone know of such a tool?
Have you tried the SQL Server Import/Export Wizard?
In SQL Server Management Studio, right-click your Database Name, and select Tasks menu, Import Data. For Data Source, select Microsoft Excel, browse to the .XLS...
If you are using SQL Server, look at Integration Services (SSIS).
You can also take a look at parse-o-matic
Use DTS or SSIS depending on which version of SQL Server you have. There is an import wizard which can get you started, but data imports are rarely simple and usually involve some sort of data cleanup so that your incoming data is acceptable to the table where you intend to store it. Excel data, in my experience, is usually particularly bad in this respect because it often isn't stored properly in Excel to begin with.
I haven't seen commercial tools that do this. I create this kind of tool at work all the time, and the data validation is not trivial. It just makes sure that you don't have bad data making it into your database.
I found that for simple data conversion needs something like FileHelpers is pretty good. It still needs programming though. This framework is fairly easy to use, and somebody with a little bit of experience could bang something out for you.
On further thought, you can use the SQL Server bcp utility to upload the contents of a text file. This is a command-line utility and has a lot of switches. I would suggest you experiment on a test table before you use this in a production table.
It's been a while since I used it, so I can't remember if you can directly use an Excel spreadsheet. Text files are always the easiest to deal with in any case.
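For what it's worth, a typical character-mode import looks something like this (server, database, table, file and delimiter are all placeholders):

    bcp MyDatabase.dbo.MyTable in C:\data\import.txt -S myserver -T -c -t,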
Seems like it'd be pretty easy to write a script that reads the text file and converts each line into an "INSERT INTO table ... VALUES ..." SQL statement. I suspect this has already been done, but a simple implementation would be less than 100 lines of code in your favorite scripting language.
Hey, Google says SQL Server comes with such a tool, BULK INSERT:
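A sketch of what that looks like (table name, path and delimiters are placeholders):

    BULK INSERT dbo.MyTable
    FROM 'C:\data\import.txt'
    WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);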