Update MySQL database from SQL Server in different domain

I am a SQL Server developer, and my current assignment is a little different from what I have done in the past; I found Stack Overflow very promising for my problem. I am working on a SQL Server 2005 database for an internal application for my client, and the client also has a public-facing web application with a MySQL database. I do not have any details about this web application, but my assignment is to update the MySQL database (on the public domain) from the SQL Server database (on the internal domain) on a daily basis, as an automated process. How can I achieve this through SQL Server?

You might want to try Pentaho Data Integration (aka Kettle).
http://wiki.pentaho.com/display/EAI/Latest+Pentaho+Data+Integration+%28aka+Kettle%29+Documentation
The product can speak to both data technologies (MSSQL + MySQL), and you will find it similar to DTS. You may be able to construct your solution with little to no code.

SSIS will do this just fine. The hard part is determining how you want to transform the data from one structure to the other (I assume they are not exactly alike in terms of table design.)
But basically, you create a data flow task, connect to SQL Server for the source data and use a query to define what data you are going to copy, then apply any transformations needed to make the data fit the MySQL structure, and connect to a MySQL destination.
Repeat this process for each data set you want to send to a different place.
Once the SSIS package is done, set up configurations so that you can run the package on the production server (you will want to test development-to-development first, of course!), then schedule the package to run at an appropriate time.
Depending on how different the two databases are and how much data you need to move, this can be a relatively simple process or very complicated.
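If SSIS turns out not to be an option on your side, the same nightly extract-transform-load flow can be scripted directly. Below is a minimal Python sketch of that flow; the server names, table, and columns are hypothetical, and it assumes the pyodbc and mysql-connector-python packages. A real job would add batching, logging, and error handling.

```python
import pyodbc
import mysql.connector

# Source: the internal SQL Server database (hypothetical server/db names)
src = pyodbc.connect(
    "Driver={SQL Server};Server=INTERNALSQL;Database=ClientDB;Trusted_Connection=yes;"
)
# Destination: the public-facing MySQL database (hypothetical credentials)
dst = mysql.connector.connect(
    host="public.example.com", user="syncuser", password="secret", database="webdb"
)

src_cur = src.cursor()
dst_cur = dst.cursor()

# Extract: the query defines exactly which data gets copied
src_cur.execute("SELECT CustomerID, Name, Email FROM dbo.Customers")

# Transform + load: reshape each row to fit the MySQL schema, then upsert
for customer_id, name, email in src_cur.fetchall():
    dst_cur.execute(
        "REPLACE INTO customers (id, name, email) VALUES (%s, %s, %s)",
        (customer_id, (name or "").strip(), (email or "").lower()),  # trivial example transforms
    )

dst.commit()
dst.close()
src.close()
```

Scheduled with Windows Task Scheduler or a SQL Agent job step, this gives the daily automated push the question asks for.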

Related

Connecting an already existing database in a local environment

I have an Access database that connects to a VB6 application, and this whole thing is shared between two computers via a local network, one running Windows 8 and the other Windows 7. There is no internet involved in any sort of way, nor should there be; in fact, that is a requirement.
Sorry in advance: I have tried researching on the net, but time is really short and there is a lot of confusing material online.
I am creating a WPF app connected to a MySQL DB. I have copied the Access file and imported the contents of the DB into MySQL. Things are a real mess in the imported DB, so I am fixing it. What I am confused about is how I am going to make it work there. Do I:
go and install MySQL and do the whole process manually there, repeating all the steps and changes I made;
make a document that contains the code/script for all the changes I have made and run the data through it (and is there even a way to implement that as a whole in a single go); or
connect both databases together? I don't even know if this is possible.
Yes: in place of a simple "file share" of the Access file, you are now going to run some kind of SQL server system, in this case MySQL, but it could be PostgreSQL or any kind of "server" database.
That instance of MySQL thus has to be set up and installed, and you must ensure that the "box" running it also allows external connections (often the computer's default firewall settings prevent this).
At that point, 2 or 10 different computers on that same network can simply connect to the SQL server. The code is of course going to be very similar. You almost certainly used the OleDb provider with Access; however, you can use the ODBC provider, or the provider from MySQL. Changing providers means you change the connection object, data reader object, etc. However, the base .NET types such as DataRow, DataTable, or DataSet can remain as before (so you only change the provider). If you have a lot of code based on OleDb, then you could well consider continuing to use that OleDb provider code in .NET, and thus only change the connection strings to point to MySQL.
If you don't have a lot of code, then by all means adopt the MySQL provider for .NET. But as noted, continuing to use an OleDb-style provider against MySQL means the least amount of code has to change.
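To make the "only the connection changes" point concrete, here is the same idea sketched in Python with pyodbc; the driver names are the stock Access and MySQL ODBC drivers, while the file path, server address, and credentials are assumptions. In .NET the equivalent move is swapping the connection string on your OleDb/Odbc connection object.

```python
import pyodbc

# Before: the app opened the shared Access file directly (hypothetical path)
old_conn = pyodbc.connect(
    r"Driver={Microsoft Access Driver (*.mdb, *.accdb)};"
    r"Dbq=\\SHAREDPC\data\app.mdb;"
)

# After: the app connects to the MySQL instance over the network
# (hypothetical host and credentials); everything below this point,
# cursors, rows, result handling, stays the same
new_conn = pyodbc.connect(
    "Driver={MySQL ODBC 8.0 Unicode Driver};"
    "Server=192.168.1.10;Database=appdb;Uid=appuser;Pwd=secret;"
)

cur = new_conn.cursor()
cur.execute("SELECT id, name FROM customers")
for row in cur.fetchall():
    print(row.id, row.name)
```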
As for the Access data migration: it's not clear what tools you are using or how you are doing that now. But once you have transferred the data to the MySQL server (assuming you installed and set up MySQL to run on one computer), it is then a simple matter to point the .NET connection(s) in your code at MySQL as opposed to Access. As a result, most if not all of your code logic for working with the tables can remain as before; but as noted, you have to swap out the provider parts in .NET.
Now, if you're really lucky and the .NET code used the ODBC provider, then all you have to do is change your connection strings. Of course, some SQL syntax in your code may have to be tweaked: Oracle, MS SQL Server, PostgreSQL, and MySQL all have some features and syntax that differ, especially in regard to date/time calculations such as DATEDIFF() (for example, SQL Server's DATEDIFF(day, start, end) becomes DATEDIFF(end, start) in MySQL). But the general SQL you have in your .NET code should continue to run mostly unchanged against MySQL tables.
As for how to migrate the data? I think a really good tool is, of course, MS Access itself. Get MySQL up and running, then use MS Access to open that database and add linked tables from Access to the MySQL tables.
At that point, you can run append queries from Access to move/send the data to MySQL. How hard this is really depends on how many tables, and how many related tables, are in that database. The more complex the database and the greater the number of related tables in Access, the greater the challenge of moving that data up to MySQL.
Transferring Excel data or a small or even big table is a breeze (again, use MS Access and link to the tables on the SQL server). Where things can become messy is if you have, say, 25 tables that are all related, many with cascade delete and enforced parent-to-child relationships. The more tables, and especially the larger the number of related data tables, the more work such a data migration task becomes.
I think MS Access is a really good tool here, since once you set up a connection to MySQL, you can execute a TransferDatabase command in Access to send up one table to MySQL, and even the columns and data types for those columns will be created automatically for you. So not only can Access transfer the data; more valuably, it has the ability to create the target tables on MySQL for you, and that will save you large amounts of time building and setting up the tables on MySQL.
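That TransferDatabase step can even be driven from outside Access. Here is a hedged Python sketch using pywin32 COM automation; the file path, DSN name, and table name are hypothetical, and it assumes MS Access plus a configured MySQL ODBC DSN on the machine.

```python
import win32com.client  # pip install pywin32; requires MS Access installed

access = win32com.client.Dispatch("Access.Application")
access.OpenCurrentDatabase(r"C:\data\app.mdb")  # hypothetical path

# DoCmd.TransferDatabase(TransferType, DatabaseType, DatabaseName,
#                        ObjectType, Source, Destination)
# 1 = acExport, 0 = acTable; "mysql_dsn" is a hypothetical ODBC DSN
access.DoCmd.TransferDatabase(
    1, "ODBC Database", "ODBC;DSN=mysql_dsn;", 0, "Customers", "Customers"
)

access.CloseCurrentDatabase()
access.Quit()
```

Run once per table, this both creates the target table on MySQL and copies its rows.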

SQL Server 2012 Data Integration

I'm writing an intranet application (in a LAMP environment) that uses data from sections of an MSSQL 2012 database (used by another much larger application).
As I see it my options are to:
Directly query the database from the application.
Create a web service.
Use Microsoft SQL Server Integration Services to have the data automatically integrated into my application's database.
I'm sure the best solution here would be using SSIS; however, I've not done this before and am on a deadline, so if that's the case could someone let me know:
a) with my limited experience in that area, would I be able to set that up, and
b) what are the pros and cons of the above options?
Any other suggestions outside of the options I've thought of would also be appreciated.
Options:
Directly query the database from the application.
Upside:
Never any stale data
Downside:
Your application now contains application-specific code and is tied to that application
If you are in the common situation where the business buys another application containing the same master data, you now need special code to connect to two applications
The vendor might not like it
There might be performance impacts on the source application
Use Windows Task Scheduler / SQL Agent to run a script or SSIS package to replicate data at x-minute intervals or so (a sketch of such a script follows this list).
Upside:
Your application is only tied to your local copy of the database, which you can customise as required. If your source app gets moved to the cloud or something then you don't need to make application changes, just integration changes
If another source application appears with the same type of master data, you can now replicate that into your local DB rather than making application changes to connect to 2 databases.
Downside:
Possibility of stale data
Even worse: possibility of stale data without users realising it, with subsequent loss of confidence in the application
Another component to maintain
Whether you write a batch script, a .NET app, or an SSIS package, it is another piece of logic that needs to be scheduled to run
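Here is a minimal sketch of the scheduled-replication option above, in Python with pyodbc. It assumes (hypothetically) that the source tables carry a ModifiedDate column for change detection, and the server, database, and table names are all placeholders.

```python
import datetime
import pyodbc

WATERMARK_FILE = "last_sync.txt"  # remembers how far the previous run copied

def read_watermark():
    try:
        with open(WATERMARK_FILE) as f:
            return f.read().strip()
    except FileNotFoundError:
        return "1900-01-01 00:00:00"  # first run: copy everything

src = pyodbc.connect("Driver={SQL Server};Server=SOURCESRV;"
                     "Database=SourceApp;Trusted_Connection=yes;")
dst = pyodbc.connect("Driver={SQL Server};Server=LOCALSRV;"
                     "Database=LocalCopy;Trusted_Connection=yes;")

since = read_watermark()
run_started = datetime.datetime.now().strftime("%Y-%m-%d %H:%M:%S")

rows = src.cursor().execute(
    "SELECT ID, Name, ModifiedDate FROM dbo.MasterData WHERE ModifiedDate > ?",
    since,
).fetchall()

dst_cur = dst.cursor()
for r in rows:
    # naive delete-then-insert upsert keeps the sketch short
    dst_cur.execute("DELETE FROM dbo.MasterData WHERE ID = ?", r.ID)
    dst_cur.execute(
        "INSERT INTO dbo.MasterData (ID, Name, ModifiedDate) VALUES (?, ?, ?)",
        r.ID, r.Name, r.ModifiedDate,
    )
dst.commit()

with open(WATERMARK_FILE, "w") as f:
    f.write(run_started)  # next run picks up from here
```

Scheduled every few minutes, the staleness window stays bounded by the schedule interval.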
Another option is replication: if your source database is Oracle or SQL Server, you can use differential replication to replicate it into another database.
You need to consider where you will be in a few years. The data copy method probably gives you more flexibility to adapt to changes in the source system as you only need to change your integration, not your whole app if something drastic changes with your source system.
You also need to consider: will you ever be asked to propagate changes back the other way, i.e. update data in your local copy and have it pushed back to the source systems?

How to keep table data the same in Oracle and SQL Server

I am trying to build a database in SQL Server that replicates the exact data present in tables in an Oracle production database. The database in SQL Server will be used for reporting and analysis. I want every new or updated row in the Oracle tables to be present in the SQL Server tables within around one hour. Does SQL Server Integration Services help with this? Is there a tool that does this, i.e. that makes sure the data present in the Oracle tables and the SQL Server tables is always the same (neglecting the one-hour lag)?
There are two things you could look into: replication and SSIS. SQL Server replication allows you to replicate data from Oracle to MSSQL so that would be one way to handle the data copy. On the other hand, if you plan on doing data transformations, mappings etc. then you might want to use SSIS because it's a full ETL tool.
One important question is how you can identify new data in Oracle, because that may determine at least the first part of your solution. And you then have to decide what transformations are necessary once you've copied the data into SQL Server; perhaps you will need to run some stored procedures to clean the data and put it into reporting tables. Since your reporting system is a different platform from the source, you will need to handle data type transformations at some point, whatever solution you choose.
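On the "identify new data" point, one common pattern (an assumption here, since the schema is unknown) is a LAST_UPDATED timestamp column on each replicated table, queried against a watermark. A short sketch of the Oracle side using the python-oracledb driver, with hypothetical connection details and table:

```python
import datetime
import oracledb  # python-oracledb driver for the Oracle source

conn = oracledb.connect(user="rpt", password="secret", dsn="prodhost/ORCLPDB")
cur = conn.cursor()

# In practice this would be persisted from the previous hourly run
previous_run = datetime.datetime(2024, 1, 1)

# Watermark query: relies on the source table having a maintained
# LAST_UPDATED column (an assumption; if the schema lacks one, you would
# need triggers, materialized view logs, or CDC instead)
cur.execute(
    "SELECT id, amount, last_updated FROM orders WHERE last_updated > :since",
    since=previous_run,
)
new_rows = cur.fetchall()
```

Whatever then loads new_rows into SQL Server, whether SSIS or code, is where the Oracle-to-SQL-Server data type mapping mentioned above has to happen.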
Your question is quite general, and it isn't really possible to say what you should do without a lot more detail about your environment, your requirements, your resources and so on. I suggest that you try to break down your task into smaller ones, and then you should be able to ask more specific questions.

Migration strategies for SQL 2000 to SQL 2008

I've perused the threads here on migration from SQL 2000 to SQL 2008 but haven't really run into my question, so here we go with another one.
I'm building a strategy to move specific SQL 2000 databases to a new SQL 2008 R2 instance. My question comes with regards to the best method for transferring the schema and data. One way I know of is to do the quick 'n' dirty detach - copy - attach method, which should work so long as I've done my homework wrt compatibility and code and such.
What if, though, I wrote the schema and logins via script and then copied the data via SSIS? I'm thinking of trying that so I can more easily integrate some of my test cases into the package (error handling and whatnot). What would I be setting myself up for if I did this?
Since you are moving the data between servers or instances, I would recommend moving it via data flows. If you don't expect to run the code more than once, you can let the wizard generate the package for this move. However, when I did this 2+ years ago, the wizard-generated package combined many "create table" commands into a single Execute SQL task and created a few data flow tasks that had multiple sources and destinations in them to insert data into the destination. That was good for getting up and running, but it was inadequate when I wanted to refresh the tables one more time after I had modified the schema of the new target tables. If you expect to run the refresh more than once, you may want to take the time to create the target schema first and then manually create the data flows.
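If you would rather script the repeatable refresh than rebuild the wizard package, the per-table copy can also be driven by the bcp utility. A hedged Python sketch, where the server names and table list are hypothetical and the target schema is assumed to already exist:

```python
import subprocess

TABLES = ["dbo.Customers", "dbo.Orders"]  # hypothetical table list

for table in TABLES:
    datafile = table.replace(".", "_") + ".dat"
    # Export from the SQL 2000 source: -n native format, -T trusted connection
    subprocess.run(["bcp", f"LegacyDB.{table}", "out", datafile,
                    "-S", "sql2000srv", "-T", "-n"], check=True)
    # Import into the SQL 2008 R2 target; rerunnable after truncating targets
    subprocess.run(["bcp", f"NewDB.{table}", "in", datafile,
                    "-S", "sql2008srv", "-T", "-n"], check=True)
```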
Once you have moved the data, then you can enable full-text search on the new server. I don't believe you will need to have this enabled on your first load.
One reason I recommend against the detach-attach method for migration is that you bring all the dirty laundry from the 2000 database into the 2008 R2 database. If security on the 2000 server was too lax, or there are many ancient users that shouldn't exist, it can be easier to clean this up by starting from scratch. If you use the detach-attach method, then you have to worry about those users (e.g. remapping orphaned users with sp_change_users_login).

Getting MySQL code from an existing database

I have a database (mdb file) that I am currently busy with. I would like to know if it is possible to generate MySQL code that would be used to create this database?
There are a couple of tools you can look at to try to do the conversion.
DataPump
Microsoft DTS (now called SQL Server Integration Services)
Another option might be to generate the MySQL code from the Access DB's metadata, which you can read via JDBC, ODBC, ADO.NET or any other database access technology with metadata support. For this option you need to write a piece of code (a script), so it will only make sense if your Access database has a lot of tables with a lot of columns, or if you are planning to do this task several times.
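A hedged sketch of that metadata-driven approach in Python with pyodbc, reading the Access schema over ODBC and printing rough MySQL DDL; the file path is hypothetical and the type map is deliberately minimal (a real script would cover more types plus keys and indexes):

```python
import pyodbc

conn = pyodbc.connect(
    r"Driver={Microsoft Access Driver (*.mdb, *.accdb)};Dbq=C:\data\legacy.mdb;"
)
cur = conn.cursor()

# Very rough Access/ODBC -> MySQL type mapping; extend as needed
TYPE_MAP = {"VARCHAR": "VARCHAR(255)", "INTEGER": "INT", "DOUBLE": "DOUBLE",
            "DATETIME": "DATETIME", "BIT": "TINYINT(1)"}

# Materialize the table list first, since the next query reuses the cursor
tables = [t.table_name for t in cur.tables(tableType="TABLE")]

for name in tables:
    cols = []
    for col in cur.columns(table=name).fetchall():
        mysql_type = TYPE_MAP.get(col.type_name.upper(), "TEXT")
        cols.append(f"`{col.column_name}` {mysql_type}")
    print(f"CREATE TABLE `{name}` (\n  " + ",\n  ".join(cols) + "\n);")
```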
Of course, using one of the mentioned tools will be faster if it works.
You can certainly write DDL to create and populate a MySQL database from the work that you've already done in Microsoft Access. Just put it in a text file that you execute in MySQL batch mode (e.g. mysql -u user -p mydb < create_db.sql) and you're all set.
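Equivalently, the batch step can be run from code. A minimal sketch with mysql-connector-python, where the connection details and file name are assumptions (and the statement split is naive; it would break on semicolons inside string literals):

```python
import mysql.connector

conn = mysql.connector.connect(host="localhost", user="root",
                               password="secret", database="mydb")
cur = conn.cursor()

# Execute each statement from the generated DDL/data script in order
with open("create_db.sql") as f:
    for statement in f.read().split(";"):
        if statement.strip():
            cur.execute(statement)

conn.commit()
conn.close()
```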
If you intend to keep going with developing both, you'll want to think about how you'll keep the two in synch.