Need to update MySQL periodically from Oracle

I have a user table in MySQL, and this table is copied from another DB (Oracle). The two tables have exactly the same structure, and the MySQL copy needs to be updated once a day (MySQL ← Oracle).
Is it possible to access Oracle and retrieve the data from within MySQL, for example with a stored procedure? (It seems to be possible between MySQL servers.)
Or do I have to find another way?

Try the Data Import tool in dbForge Studio for MySQL:
Start Data Import.
Check the ODBC format option, specify its settings, and select the table to import from.
Specify the mapping and other options.
Select the Append, Update, or Append/Update import mode.
You can also run this tool in command-line mode: save a data import template file and run it from the command line once a day.
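If you would rather script the daily refresh yourself, here is a minimal sketch of the same idea in Python. This is not the dbForge approach, just one way to do the copy; it assumes the oracledb and mysql-connector-python drivers, and the table and column names are placeholders.

import oracledb              # assumed driver: pip install oracledb
import mysql.connector       # assumed driver: pip install mysql-connector-python

# Pull every row from the Oracle source table.
ora = oracledb.connect(user="scott", password="tiger", dsn="orahost/orclpdb")
rows = ora.cursor().execute("SELECT id, name, email FROM users").fetchall()
ora.close()

# Refresh the MySQL copy wholesale: truncate, then bulk insert.
my = mysql.connector.connect(host="myhost", user="app",
                             password="secret", database="appdb")
cur = my.cursor()
cur.execute("TRUNCATE TABLE users")
cur.executemany("INSERT INTO users (id, name, email) VALUES (%s, %s, %s)", rows)
my.commit()
my.close()

Schedule the script with cron or Windows Task Scheduler to get the once-a-day refresh.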

Related

How to import Oracle sql file into MySQL

I have a .sql file from Oracle which contains create table/index statements and a lot of insert statements (around 1M inserts).
I can manually modify the create table/index part (there is not too much of it), but the insert part uses some Oracle functions like TO_DATE.
I know MySQL has a similar function, STR_TO_DATE, but its format parameter works differently.
I can connect to MySQL, but the .sql file is the only thing I got from Oracle.
Is there any way I can import this Oracle .sql file into MySQL?
Thanks.
Although the job above can be done by manually editing the script appropriately, there are products available which can be of use. Refer to the link for more information on one such product.
P.S. I am not affiliated in any way with the product.
Since you mention an insert script, you will basically be inserting data, and for that you can use any ETL tool, such as the open-source Pentaho Data Integration. It is pretty simple to do; just search YouTube for a table-to-table transformation between different database connections to learn how. You do need to be able to connect to both the MySQL and Oracle databases, otherwise this won't help. You should create all the table structures manually in the target database; the data itself can then be loaded with the ETL tool, so there is no need to edit every single insert line, which would be very painful if there are more than a hundred of them.
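For the TO_DATE calls specifically, a small script can rewrite them into STR_TO_DATE by translating the Oracle format tokens into MySQL ones. A rough sketch in Python; it assumes only simple TO_DATE('literal', 'format') calls and only the handful of tokens mapped below:

import re

# Common Oracle date-format tokens and their MySQL equivalents (assumed
# to be the only ones present in the dump).
TOKEN_MAP = {"YYYY": "%Y", "MM": "%m", "DD": "%d",
             "HH24": "%H", "MI": "%i", "SS": "%s"}

def oracle_fmt_to_mysql(fmt):
    # Replace longer tokens first to avoid partial matches.
    for ora, my in sorted(TOKEN_MAP.items(), key=lambda kv: -len(kv[0])):
        fmt = fmt.replace(ora, my)
    return fmt

def convert_line(line):
    # TO_DATE('2011-01-31', 'YYYY-MM-DD') -> STR_TO_DATE('2011-01-31', '%Y-%m-%d')
    return re.sub(
        r"TO_DATE\('([^']*)'\s*,\s*'([^']*)'\)",
        lambda m: "STR_TO_DATE('%s', '%s')" % (m.group(1),
                                               oracle_fmt_to_mysql(m.group(2))),
        line,
        flags=re.IGNORECASE)

with open("oracle_dump.sql") as src, open("mysql_dump.sql", "w") as dst:
    for line in src:
        dst.write(convert_line(line))

The create table/index statements still need the manual edits mentioned in the question; this only handles the date calls in the inserts.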

Import MySQL file into Oracle SQL Developer

I have tried to find an answer for this elsewhere but cannot; I hope someone can help me.
I am trying to import the MySQL sample database into Oracle SQL Developer.
I have an existing database/connection I want to dump it into. I have created an empty table named classicmodels in that connection. Yes, that name is just one table within the sample DB; ignore the error in naming convention.
When I right-click on it and try 'Import Data', I cannot import a .sql file; I can only do it with Excel, CSV, etc.
When I try to run a script I found on dba.stackexchange,
#\path\mysqlsampledatabase.sql, I get a series of 'please provide substitution value' messages, which does not make sense to me given that I am importing a database built for SQL (i.e., what reason is there to substitute?).
The 'UnseenCollection' is a single table I imported as a CSV file. I need to import the mysqlsampledatabase file so that it shows up the same way and I can access all the tables within the sample DB.
If anyone can help, I would appreciate it. I need the end result to be the entire mysqlsampledatabase populated within the 'classicmodels' node.
Thank you.
Connect to MySQL.
Connect to Oracle.
For a single MySQL table, right-click it and choose 'Copy to Oracle'.
For a few tables, select them, then drag and drop onto the Oracle connection (requires a newer version of SQL Developer).
For an entire MySQL database, use the migration project feature.

SQLyog to import only new records

I have a Pervasive SQL database that I connect to with ODBC.
I can run the Import External Data tool on a one-hour schedule to copy data over ODBC into MySQL, but it imports all of the records every time instead of importing only the rows that have changed. Does anyone have any ideas on this?
Thanks!
You can use a WHERE clause on a column that identifies new rows. The WHERE clause option can be found in the Import External Data wizard.
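The same incremental idea outside the wizard, for clarity: remember the highest key already copied and pull only the rows above it. A minimal Python sketch, assuming pyodbc, a numeric auto-incrementing id column, and placeholder DSN/table/column names:

import pyodbc

src = pyodbc.connect("DSN=PervasiveSrc")   # assumed ODBC DSN for Pervasive
dst = pyodbc.connect("DSN=MySQLDst")       # assumed ODBC DSN for MySQL

# High-water mark: the largest id already present in the MySQL copy.
last_id = dst.cursor().execute(
    "SELECT COALESCE(MAX(id), 0) FROM records").fetchval()

# This WHERE clause is exactly what the wizard's option expresses:
# only the rows added since the last run are pulled.
new_rows = src.cursor().execute(
    "SELECT id, col1, col2 FROM records WHERE id > ?", last_id).fetchall()

dst.cursor().executemany(
    "INSERT INTO records (id, col1, col2) VALUES (?, ?, ?)", new_rows)
dst.commit()

If rows can be updated in place rather than only appended, key on a last-modified timestamp column instead of the id.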

How to load Excel data into a SQL Server 2008 table?

DBMS: SQL Server 2008 R2 with Management Studio (GUI).
Excel file: columns exactly like the columns in the destination table.
How do I load the data from the Excel file into the SQL table? Are there any potential problems with this way of doing things? For example, a column does not allow NULL, but the Excel data has a null.
The simplest way to import XLS data directly into SQL Server is to use the Import Wizard.
You can do this either by having the Import Wizard create the table for you when it runs, or by creating the table beforehand and using it as the wizard's target table.
I prefer the latter method, as I like to control my table creation with my desired data types.
To find the Import Wizard, simply right-click on the database name, then Tasks -> Import Data.
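On the NULL problem raised in the question: a quick pre-flight check can flag offending rows before the wizard runs. A small sketch using pandas; the file name, sheet name, and NOT NULL column names are assumptions:

import pandas as pd

# Load the sheet that will be imported.
df = pd.read_excel("import.xlsx", sheet_name="Sheet1")

# Columns the destination table declares NOT NULL (assumed names).
not_null_columns = ["id", "email"]

# List any rows that would violate the constraint during the import.
bad = df[df[not_null_columns].isnull().any(axis=1)]
if not bad.empty:
    print("Rows with empty cells in NOT NULL columns:")
    print(bad)

Fix or drop the listed rows in Excel first, and the wizard will not fail mid-load on the constraint.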
You can also use an Excel Data Source in SQL Server Integration Services.
Some caveats though:
If you are running on a 64-bit OS, make sure your package properties are set to disable the 64-bit runtime, or the import from Excel 2003-2007 will fail with a connection error.
Take the time to use the advanced editor to shrink your columns to the size you need, or every field will come out as nvarchar(255) by default, which is a waste of space. If you are mapping to an existing table, you may need to convert the data types as well.
The advantage of this approach is that you can re-use it more easily, though of course you can save the package the wizard creates as well.

MS Access to MySQL Conversion help (Gigantic table)

So I have this gigantic table, containing approximately 7 million records, in MS Access (*.mdb). I want to transfer it into a much more workable MySQL format and store it on my web server. The file itself weighs 2 GB.
The problem is that since the table is so large, Access won't let me export it normally (it says the limit is 65,536 records).
I've tried some 3rd-party software, but to no avail.
Can anyone recommend a clean way of doing so without damaging the data inside?
Thanks in advance for any help.
Install an ODBC driver for MySQL, if you don't have one already. The latest version is available here: Download Connector/ODBC
Create a DSN (Data Source Name) for your MySQL server from the Windows ODBC Data Source Administrator.
Then from Access 2003, select your table in the Database Window, and choose File->Export from Access' main menu. In the "Export Table 'yourtablename' To ..." dialog, select "ODBC Databases()" from the "Save as type" drop-down list (at the bottom of the dialog). The next dialog allows you to specify the name MySQL will use for the exported table, and it defaults to the Access table name. After you click OK, you will get another dialog, "Select Data Source", where you can select your DSN for MySQL. After you click OK on that dialog, you will probably get one more asking you for user name and password. Supply them, and click OK.
Hopefully your table will then transfer without errors. However, I've never done that operation with MySQL. It has worked for me with ODBC transfers to SQL Server and PostgreSQL, so I don't see why it wouldn't work with MySQL, too.
Also I've never attempted to export 7 million records in one go. If it chokes, we'll have to figure out a work-around.
If you're using Access 2007 instead of 2003, look for a similar option starting with the Export section of the ribbon.
I suggested this approach because my impression is this export will be a one-time deal, so I think the Access UI export method would be easiest. However, you can do essentially the same operation with VBA code using the DoCmd.TransferDatabase Method with your ODBC DSN.
Yet another alternative would be to create a compatible table structure in MySQL, create a link in Access to the MySQL destination table (using your DSN again), then run an "append query" from Access:
INSERT INTO link_to_mysql_table (field1, field2, field3, etc)
SELECT field1, field2, field3, etc
FROM access_table;
The append query approach could be useful in case the export chokes on 7 million records. You could add a WHERE clause to limit the SELECT query's output record set to a manageable chunk size, and then repeat with a different WHERE to specify another chunk.
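That chunked append, sketched in code for anyone driving the transfer from outside Access rather than with saved queries: the same statement is repeated over ID ranges. The DSNs, table, and column names are all assumptions, and the table is assumed to have a numeric autonumber ID:

import pyodbc

src = pyodbc.connect("DSN=AccessSrc")   # assumed DSN for the .mdb file
dst = pyodbc.connect("DSN=MySQLDst")    # assumed DSN for the MySQL server

CHUNK = 500000                          # rows per chunk; tune as needed
max_id = src.cursor().execute("SELECT MAX(ID) FROM access_table").fetchval()

for lo in range(1, max_id + 1, CHUNK):
    hi = lo + CHUNK - 1
    # The WHERE clause limits each pass to a manageable slice.
    rows = src.cursor().execute(
        "SELECT ID, field1, field2 FROM access_table "
        "WHERE ID BETWEEN ? AND ?", lo, hi).fetchall()
    if rows:
        dst.cursor().executemany(
            "INSERT INTO mysql_table (ID, field1, field2) VALUES (?, ?, ?)",
            rows)
        dst.commit()                    # one commit per chunk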
Is that 7 million value after a compact + repair? I mean, if each record is about 120 chars in length, 2 gigs can hold well over 17 million records.
Also, I am not aware of a 65,000-record limit on exporting, only in regard to Excel.
So, you should be able to export the data to a CSV and then use a bulk text import in MySQL to pull that data in. Try exporting the table as CSV; that should work.
You could link a table via ODBC if you have a good local connection to the SQL server, but if not, then I would export to CSV (it is VERY fast). I would then zip the file (CSV compresses fantastically well), upload it to the server, un-zip it, and use a bulk text import. Such a zipped file is VERY small and will save huge amounts of transfer time.
You can also consider using tab-delimited output, as MySQL can import that too, but a simple text file should work just fine.
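The bulk text import mentioned above is MySQL's LOAD DATA INFILE. A minimal sketch of running it from Python, assuming mysql-connector-python, a table already created to match, and the CSV already unzipped on the server (the file, table, and connection details are placeholders):

import mysql.connector

conn = mysql.connector.connect(
    host="localhost", user="app", password="secret", database="appdb",
    allow_local_infile=True)    # needed for LOAD DATA LOCAL INFILE

cur = conn.cursor()
# One bulk load is far faster than millions of individual INSERTs.
cur.execute("""
    LOAD DATA LOCAL INFILE 'access_export.csv'
    INTO TABLE big_table
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\\r\\n'
    IGNORE 1 LINES
""")
conn.commit()
conn.close()

Drop the IGNORE 1 LINES clause if the CSV was exported without a header row.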
I would use pyodbc, as described in
http://en.wikibooks.org/wiki/Python_Programming/Database_Programming
Download Python 2.7 from
http://python.org/
Download pyodbc from
http://code.google.com/p/pyodbc/
Modify the following code to set myfile.mdb and MyTable according to your file and table, and save the code in a file named translate.py:
import csv
import pyodbc

# Open the output CSV ('wb' because this is Python 2's csv module).
mycsv = csv.writer(open('result.csv', 'wb'), delimiter=',',
                   quotechar='"', quoting=csv.QUOTE_MINIMAL)

# Connect to the Access database through the Access ODBC driver.
DBfile = 'myfile.mdb'
conn = pyodbc.connect('DRIVER={Microsoft Access Driver (*.mdb)};DBQ=' + DBfile)
cursor = conn.cursor()

# Stream every row of the table into the CSV file.
SQL = 'SELECT * FROM MyTable;'
for row in cursor.execute(SQL):  # cursors are iterable
    mycsv.writerow(row)

cursor.close()
conn.close()
Run python translate.py; the table ends up in result.csv, ready for a bulk import into MySQL.
Install MySQL on your own system and upsize to it locally, rather than trying to upsize directly to your remote server. Then run an append query from your local MySQL to the server instance.