I have just finished building a Django website and am now using phpMyAdmin to import large data sets into my models. However, it appears that something is wrong with some of the column values I am trying to import. I see the following error:
#1366 - Incorrect string value: '\x93Desig...' for column 'sku_description' at row 1
If it were only one or two columns, I could have fixed them manually. However, as I mentioned, there is a ton of data in there. What would be the most practical solution to this problem?
Never mind. I solved it by changing the encoding in Excel.
In Excel 2010, save the file as CSV, and before clicking Save, click Tools -> Web Options -> Encoding -> Unicode (UTF-8), then save it. Now import it into MySQL via phpMyAdmin.
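If you'd rather not round-trip through Excel, the same re-encoding can be scripted. The \x93 byte in the error is Windows-1252's left curly quote, so the CSV is almost certainly cp1252 while MySQL expects UTF-8. A minimal Python sketch (the file names are placeholders):

import io

# Re-encode the CSV from Windows-1252 to UTF-8, streaming line by line
# since the data set is large.
with io.open('export.csv', 'r', encoding='cp1252') as src, \
     io.open('export_utf8.csv', 'w', encoding='utf-8') as dst:
    for line in src:
        dst.write(line)

Then import export_utf8.csv and make sure the target column's character set is utf8 (or utf8mb4).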
I am using the migration wizard provided by MySQL Workbench 6.3 to convert a SQL Server database into a MySQL database. I tested the connection between both DBs, and they are valid for the migration wizard. Once the migration wizard has completed, I am left with 22 migration warnings, and they are all the same warning:
Truncated key column length for column 0 to 16
I am having a hard time finding any similarities between the tables that are receiving warnings, which makes it difficult to narrow down the issue. There are tables with the same types of data that are not receiving these warnings.
Here is an example of one of the tables affected by this warning.
Does anyone know what is, or could be, causing these migration warnings?
If you need more information or images, please let me know.
The migration wizard shows this warning when it finds an index whose key length differs between the source and target databases. In fact, you should also get the index name in that message (... for column <name> from ...), but it's empty. I guess something went wrong, but to investigate I need to reproduce the issue on my machine. Please file a bug report on bugs.mysql.com and attach a sample database there (you can make it private if you wish). Then paste the link here.
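If you want to check which indexes actually ended up truncated on the MySQL side, information_schema records the indexed prefix length. A quick Python sketch with pyodbc, assuming a hypothetical DSN that defaults to the migrated schema:

import pyodbc

conn = pyodbc.connect('DSN=mysql_target')  # hypothetical DSN for the migrated DB
cursor = conn.cursor()

# SUB_PART is non-NULL when only a prefix of the column is indexed,
# i.e. when the key column length was truncated.
SQL = ("SELECT TABLE_NAME, INDEX_NAME, COLUMN_NAME, SUB_PART "
       "FROM information_schema.STATISTICS "
       "WHERE TABLE_SCHEMA = DATABASE() AND SUB_PART IS NOT NULL;")

for row in cursor.execute(SQL):
    print(row)

cursor.close()
conn.close()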
The warning doesn't matter. Just remember to rename the schema while migrating. I have attached an image for better understanding:
https://i.stack.imgur.com/v6PGK.png
I am trying to export a table from MS Access (2007) to DB2 (9.7 LUW) using ODBC. I can do this, as long as I export the table into my own schema.
However, I would like to export the table to another schema. How can I get Access to put the table into another schema? In DB2, the table foo within schema bar is normally referred to as bar.foo. However, if I enter this as the export target, Access gives me an error:
The object name 'bar.foo' you entered doesn't follow Microsoft Access object-naming rules.
This is because it won't accept a period in the table name. Does anyone know how I can overcome this limitation? I can just copy the table over after the export, but some other users don't have permission to create tables in their own schema; thus, this is preventing them from exporting from Access.
Thanks for your help.
Remou suggested using a query like this:
SELECT *
INTO [ODBC;<db2 connection string>].schema.table
FROM ms_access_table1
I believe something similar could be made to work for DB2, and I suggest trying it if anyone needs a solution. I have decided to stick with my current kludgy approach, however. It is a shame that Access's export feature can't do this, for such a dumb reason.
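For what it's worth, the same copy can also be scripted outside Access. A rough Python sketch with pyodbc, assuming a DB2 ODBC DSN named mydb2 and a made-up column list; the schema-qualified name bar.foo is exactly what the export dialog refuses to accept:

import pyodbc

# Source: the Access database. Destination: DB2 (DSN and credentials are placeholders).
access = pyodbc.connect('DRIVER={Microsoft Access Driver (*.mdb)};DBQ=myfile.mdb')
db2 = pyodbc.connect('DSN=mydb2;UID=user;PWD=password')

src = access.cursor()
dst = db2.cursor()

for row in src.execute('SELECT col1, col2 FROM ms_access_table1'):
    # Qualifying the target as bar.foo writes into the other schema directly.
    dst.execute('INSERT INTO bar.foo (col1, col2) VALUES (?, ?)',
                row.col1, row.col2)

db2.commit()
access.close()
db2.close()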
So I have this gigantic table in MS Access (*.mdb), containing approximately 7 million records, and I want to transfer it into a much more workable MySQL format and store it on my web server. The file itself weighs 2 GB.
The problem is that, since the table is so large, Access won't let me export it normally (it says the limit is 65,536 records).
I've tried some 3rd party software but to no avail.
Can anyone recommend a clean way of doing so, without damaging the data inside?
Thanks in advance for any help.
Install an ODBC driver for MySQL, if you don't have one already. The latest version is available here: Download Connector/ODBC
Create a DSN (Data Source Name) for your MySQL server from the Windows ODBC Data Source Administrator.
Then from Access 2003, select your table in the Database Window, and choose File->Export from Access' main menu. In the "Export Table 'yourtablename' To ..." dialog, select "ODBC Databases()" from the "Save as type" drop-down list (at the bottom of the dialog). The next dialog allows you to specify the name MySQL will use for the exported table, and it defaults to the Access table name. After you click OK, you will get another dialog, "Select Data Source", where you can select your DSN for MySQL. After you click OK on that dialog, you will probably get one more asking you for user name and password. Supply them, and click OK.
Hopefully your table will then transfer without errors. However, I've never done that operation with MySQL; it has worked for me with ODBC transfers to SQL Server and PostgreSQL, so I don't see why it wouldn't work with MySQL, too.
Also, I've never attempted to export 7 million records in one go. If it chokes, we'll have to figure out a work-around.
If you're using Access 2007 instead of 2003, look for a similar option starting with the Export section of the ribbon.
I suggested this approach because my impression is this export will be a one-time deal, so I think the Access UI export method would be easiest. However, you can do essentially the same operation with VBA code using the DoCmd.TransferDatabase Method with your ODBC DSN.
Yet another alternative would be to create a compatible table structure in MySQL, create a link in Access to the MySQL destination table (using your DSN again), then run an "append query" from Access:
INSERT INTO link_to_mysql_table (field1, field2, field3, etc)
SELECT field1, field2, field3, etc
FROM access_table;
The append query approach could be useful in case the export chokes on 7 million records. You could add a WHERE clause to limit the SELECT query's output to a manageable chunk size, and then repeat with a different WHERE clause for the next chunk, as sketched below.
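If you'd rather script the chunking than re-run the append query by hand, the same idea can be sketched in Python with pyodbc (which another answer here also uses). This assumes the table has a numeric primary key named id; all table and field names are placeholders:

import pyodbc

access = pyodbc.connect('DRIVER={Microsoft Access Driver (*.mdb)};DBQ=myfile.mdb')
mysql = pyodbc.connect('DSN=mysql_dsn')  # hypothetical MySQL ODBC DSN

src = access.cursor()
dst = mysql.cursor()

CHUNK = 100000  # rows per batch; tune to what the connection tolerates
last_id = 0
while True:
    rows = src.execute('SELECT id, field1, field2 FROM access_table '
                       'WHERE id > ? ORDER BY id', last_id).fetchmany(CHUNK)
    if not rows:
        break
    for row in rows:
        dst.execute('INSERT INTO mysql_table (id, field1, field2) '
                    'VALUES (?, ?, ?)', row.id, row.field1, row.field2)
    mysql.commit()         # commit once per chunk
    last_id = rows[-1].id  # resume after the last row copied

access.close()
mysql.close()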
Is that 7 million figure from after a compact + repair? I mean, if each record is about 120 characters in length, you can fit roughly 17 million records in 2 gigs.
Also, I am not aware of any 65,000-record limit on exporting, except in regards to Excel.
So, you can and should be able to export the data to a CSV, and then use a bulk text import in MySQL to pull that data in. So, try exporting the table as CSV. That should work.
I mean, you could link a table via ODBC if you have a good local connection to the server, but if not, then I would export to CSV (it is VERY fast). I would then zip the file (they compress fantastically), upload it to the server, un-zip it, and then use a bulk text import. Such a zipped file is VERY small and will save huge amounts of transfer time.
You can also consider using tab-delimited files, as MySQL can import those too, but a simple text file should work just fine.
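On the MySQL side, that bulk text import is the LOAD DATA INFILE statement. A minimal Python sketch via pyodbc, with the DSN, file path, and table name all made up; it also assumes the CSV has a header row and that local_infile is enabled on both the driver and the server:

import pyodbc

conn = pyodbc.connect('DSN=mysql_dsn')  # hypothetical DSN; see Connector/ODBC above
cursor = conn.cursor()

# Access CSV exports are comma separated, double-quoted, with Windows
# (\r\n) line endings; IGNORE 1 LINES skips the header row.
cursor.execute(
    "LOAD DATA LOCAL INFILE '/tmp/access_table.csv' "
    "INTO TABLE mysql_table "
    "FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' "
    "LINES TERMINATED BY '\\r\\n' "
    "IGNORE 1 LINES;")
conn.commit()
cursor.close()
conn.close()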
I would use pyodbc as described in
http://en.wikibooks.org/wiki/Python_Programming/Database_Programming
download python 2.7 from
http://python.org/
download
http://code.google.com/p/pyodbc/
modify the following code to set myfile.mdb and MyTable according to your file and table
save the code in a file translate.py
import csv
import pyodbc

# Open the output CSV (Python 2.7: binary mode keeps the csv module happy).
outfile = open('result.csv', 'wb')
mycsv = csv.writer(outfile, delimiter=',',
                   quotechar='"', quoting=csv.QUOTE_MINIMAL)

DBfile = 'myfile.mdb'
conn = pyodbc.connect('DRIVER={Microsoft Access Driver (*.mdb)};DBQ=' + DBfile)
cursor = conn.cursor()

SQL = 'SELECT * FROM MyTable;'
for row in cursor.execute(SQL):  # cursors are iterable
    mycsv.writerow(row)

outfile.close()
cursor.close()
conn.close()
run python translate.py
Install MySQL on your own system and upsize to it first, rather than trying to upsize straight to the remote server. Then run an append query from your local MySQL to the server instance.
I am trying to create an SSIS package for integration between MSSQL and MySQL. I have no prior experience working with BIDS or SSIS, and I am following the instructions from here.
I added the OLE DB Source, Lookup, Conditional Split, OLE DB Destination and OLE DB Command components to the Data Flow, and configured the connection managers and column mappings up to the Conditional Split component.
From here, I am facing two problems -
1) After configuring the OLE DB Destination, an error symbol appears on the component saying it could not convert between Unicode and non-Unicode string data types. To solve this, I tried inserting a Data Conversion component between the Conditional Split and the Destination and configured it for the problematic column, but that doesn't seem to help.
2) While configuring the OLE DB Command, the right-hand column in the Column Mappings tab shows zero columns. I added the SQL command with question marks, so I guess it should be showing columns named "Param_0", "Param_1", etc., if I am not wrong. I even tried to add them manually from the Input and Output Properties tab, but then it shows the warning: external columns for OLE DB Command are out of sync with the data source.
What am I missing here ?
Thanks
The way you describe your first problem, it sounds like it should work. Here are a couple of things to check.
The data conversion component creates a new column for the converted data. Make sure you are referring to it in your following transformations and destination.
Right-click on the Data Conversion component and select Advanced Editor. Select the Input and Output Properties tab in the Advanced Editor. Expand the Data Conversion Output branch of the tree view and select your new column. Ensure that the Data Type Properties show the data type that you want to convert to. If these values are not right, then something is wrong with the component's setup.
For your second problem, the issue can frequently be caused by an error with the SqlCommand value. First, make sure the Connection Manager is correct on the Connection Manager tab. Switch to the Column Mappings tab. Near the bottom of the form, there may be a warning message that indicates that the SQL statement cannot be prepared. In other words, SSIS can't figure out what the statement is supposed to do. Address any problems with the SQL statement and switch back to the Column Mappings tab. The columns will appear once the SQL statement can be parsed.
If you want to avoid the conversion issues, then change your destination table column types from char/varchar to nchar/nvarchar. I'm pretty sure you will need to use an ADO connector for the MySQL source and destination; you should be able to read data from the MySQL source and write to the MSSQL database without using anything other than source and destination components.
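The char/varchar to nchar/nvarchar change itself is a one-line ALTER on the SQL Server side. A sketch via pyodbc, with server, table, and column names all made up (it could just as well be run from SSMS):

import pyodbc

# Connection string fields are placeholders.
conn = pyodbc.connect('DRIVER={SQL Server};SERVER=myserver;'
                      'DATABASE=mydb;Trusted_Connection=yes')
cursor = conn.cursor()

# Widen the destination column to a Unicode type so the data flow
# no longer needs a Data Conversion step for it.
cursor.execute('ALTER TABLE dbo.my_table ALTER COLUMN my_column NVARCHAR(255)')
conn.commit()
conn.close()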