I am trying to export a table from Microsoft Access 2016 to a MariaDB database via ODBC export. I have tried:
right-clicking the table and choosing "Export" --> "ODBC Database"
then choosing the preconfigured ODBC User DSN
Then I get the ODBC call error:
"ODBC-Driver[...] Data truncated for column 'TotRev' at row 1 [#1265]"
I have tried different encodings, as I got other error codes before that were related to that.
I would really appreciate a hint towards a solution. The target database is MariaDB with utf8mb4 encoding.
Having no familiarity at all with MariaDB, my only suggestion is to export out of Access to a neutral format: either a text file or Excel.
Then on the MariaDB side - import the neutral file.
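On the MariaDB side, the import of such a neutral file can be sketched with LOAD DATA; the file path and table name here are hypothetical, and this assumes the target table already exists with matching columns:

```sql
-- Import a CSV file exported from Access into an existing MariaDB table.
-- '/tmp/export.csv' and 'mytable' are placeholder names.
LOAD DATA LOCAL INFILE '/tmp/export.csv'
INTO TABLE mytable
CHARACTER SET utf8mb4        -- match the database's utf8mb4 encoding
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'   -- Windows line endings, typical for Access exports
IGNORE 1 LINES;              -- skip the header row
```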
I have solved the problem: one specific characteristic of Access is its Currency data type. This is the problem, so the question is how to get rid of it. Simply changing the data type did not work, as Access ran out of memory. The reason is that Access tries to keep both tables (old data type + new data type) in memory.
To solve this problem, I found a nice explanation on the Microsoft pages. What I did was follow the hint on this page:
Microsoft forum entry by John W. Vinson/MVP
Here his advice:
"[...]An alternative way to accomplish this task requires a couple of steps but works with any size table:
Rename the table to tablename_old
Copy and paste it to tablename, using the option design mode only
Change the datatype in the new empty table
Run an Append query to migrate the data
It may be necessary to drop and reestablish relationships.[...]"
As I am not familiar with Access, here is the link to Office support on how to run an append query:
Add records to a table by using an append query
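The append-query step above boils down to an INSERT ... SELECT; here is a minimal sketch, where tablename, tablename_old, and the column names are hypothetical (TotRev standing in for the former Currency column):

```sql
-- Copy all rows from the renamed table into the new table,
-- whose TotRev column now has a non-Currency data type.
INSERT INTO tablename (ID, Customer, TotRev)
SELECT ID, Customer, TotRev
FROM tablename_old;
```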
I have tried to find an answer for this elsewhere but cannot; I hope someone can help me.
I am trying to import the MySQL sample database into Oracle SQL Developer.
I have an existing database/connection into which I want to dump it. I have created an empty table named classicmodels in my existing connection. Yes, that name is just one table within the sample DB; ignore the error in naming convention.
When I right-click on it and try 'Import Data', I cannot import a .sql file; I can only do it with Excel, CSV, etc.
When I try to run a script I found on dba.stackexchange,
#\path\mysqlsampledatabase.sql, I get a series of 'please provide substitution value' messages, which does not make sense to me given that I am importing a database built for SQL (i.e., what reason is there to substitute?).
Pictures below:
The 'UnseenCollection' is a single table I imported as a CSV file. I need to import the mysqlsampledatabase file so that it shows up the same way and I can access all tables within the sample DB.
If anyone can help, I would appreciate it. I need the end result to be the entire mysqlsampledatabase populated within the 'classicmodels' node.
Thank you.
Connect to MySQL.
Connect to Oracle.
For a single MySQL table, right-click it and choose 'Copy to Oracle'.
For a few tables, select them, then drag and drop onto the Oracle connection (requires a newer version of SQL Developer).
For an entire MySQL database, use the migration project feature.
I have a user table in MySQL, and this table is copied from another DB (Oracle).
They have exactly the same structure, and the MySQL table needs to be updated once a day (MySQL ← Oracle).
Is it possible to access Oracle and retrieve data from within MySQL? I mean something like a stored procedure.
(It seems possible between MySQL servers.)
Or do I have to find another way?
Try the Data Import tool in dbForge Studio for MySQL:
Start Data Import.
Check the ODBC format, specify its options, and select the table to import from.
Specify mapping and other options.
Select the Append, Update, or Append/Update import mode.
You can also run this tool in command-line mode: save a data import template file and run it from the command line once a day.
Greetings.
We are running Microsoft SQL Server 2008 on one machine with a single license. We need to create an identical development instance of a database held on this server, including tables, triggers, default values, data, views, keys, constraints and indexes.
As a temporary solution, I downloaded and installed SQL Server 2008 Express R2 along with the SQL Server 2008 Toolkit on a separate machine. I then used DTSWizard.exe and pointed it at the remote host as the data source and the local machine as the target.
Transfer of data at first appeared to be fine, as the tables, indexes, etc. were created, but after a little more digging I realized it was NOT transferring/setting the default values of any fields! Many of the fields have NOT NULL constraints, and we're interfacing with a COM API (Response RCK) that does not allow us to manually edit the queries, so we're stuck with how they interface with the database and insert entries (including the use of default values to satisfy the NOT NULL constraints).
As a second option, we used the "Generate Scripts" option and exported all tables, constraints, indexes, default values, data, etc. as a .SQL file, but now I'm not sure how to load this SQL file into SQL Server, because it is 4.9 GB, all of which is required; there is no circumventing the size of this monster.
So my questions are:
- Is there a way I can make a complete copy of SQL database to another server including default values?
- Or is there a way to import a .SQL file without copying and pasting it as a New Query?
P.S.: Apologies if my "Microsoft" lingo is not perfect; I'm a Linux guy familiar with PostgreSQL and MySQL.
Why not just take a complete backup of the database and restore it to the new server? That will include everything, including default values.
Here is some SQL that should make it happen (edit paths and logical file names to fit your needs):
-- On the source server run:
BACKUP DATABASE [TestDb]
TO DISK = N'C:\TEMP\TestDb.bak'
WITH
NOFORMAT,
NOINIT,
NAME = N'SourceDb-Full Database Backup',
SKIP,
NOREWIND,
NOUNLOAD,
STATS = 10
GO
-- On the other server run
RESTORE DATABASE [DestDb]
FROM DISK = N'C:\Temp\TestDb.bak'
WITH
FILE = 1,
MOVE N'TestDb' TO N'C:\TEMP\DestDb_data.mdf',
MOVE N'TestDb_log' TO N'C:\TEMP\DestDb_log.ldf',
NOUNLOAD, STATS = 10
GO
and you need to move the backup file between the servers if it is not accessible over the network...
Finally came across a solution that works.
In SQL Server 2008, there appears to be a bug when exporting a database: DEFAULT values are not carried over with the table structures.
Here is my solution for circumventing this:
Right-click on the database you wish to back up and select Tasks > Generate Scripts....
If the "Welcome to the Generate SQL Server Scripts wizard" dialog appears, click Next. Otherwise, continue to the next step.
Select the database you wish to transfer.
The key things to ensure you select properly are as follows:
Set Script Defaults to True
Script USE DATABASE to False
Script Data to True
Script Indexes to True
Script Primary Keys to True
Script Triggers to True
Script Unique Keys to True
Once you've finished setting other optional parameters, click Next >.
Check Stored Procedures, Tables and Views (do not check Users unless you want/need to) and click Next >.
Click Select All to select all Stored Procedures and click Next >.
Click Select All to select all Tables and click Next >.
Click Select All to select all Views and click Next >.
Under Script mode, select Script to file.
Click the Browse... Button and select the folder and filename you wish to save the SQL script under. In this example we'll use my_script.sql.
Click Finish.
Now that we have the entire database backed up including tables, views, stored procedures, indexes, data, etc. it's time to import this data into a new database.
On the machine you wish to restore this information to, perform the following steps:
Open your command prompt by clicking Start -> Run... or Pressing Windows (Super) + R on your keyboard.
Type "cmd" without the quotes in the Run dialog and click OK.
Browse to the directory your SQL file is located at. In my case, cd "C:\Documents and Settings\Administrator\Desktop"
Type "sqlcmd -S [server]\[instance] -i my_script.sql" where [server] is the name of your Windows machine and [instance] is the name of your SQL Server instance (for SQL Server Express it is "SQLEXPRESS", without the quotes). Note that the server switch is a capital -S; lowercase -s sets the column separator.
Press Enter and you're on your way!
Hope this helps someone else who has encountered this myriad of issues!
Is it possible to run both queries from a single server?
-- On the source server run:
BACKUP DATABASE [TestDb]
TO DISK = N'C:\TEMP\TestDb.bak'
WITH
NOFORMAT,
NOINIT,
NAME = N'SourceDb-Full Database Backup',
SKIP,
NOREWIND,
NOUNLOAD,
STATS = 10
GO
-- On the other server run
RESTORE DATABASE [DestDb]
FROM DISK = N'C:\Temp\TestDb.bak'
WITH
FILE = 1,
MOVE N'TestDb' TO N'C:\TEMP\DestDb_data.mdf',
MOVE N'TestDb_log' TO N'C:\TEMP\DestDb_log.ldf',
NOUNLOAD, STATS = 10
GO
Can any one explain how to import a Microsoft Excel file in to a MySQL database?
For example, my Excel table looks like this:
Country | Amount | Qty
----------------------------------
America | 93 | 0.60
Greece | 9377 | 0.80
Australia | 9375 | 0.80
There's a simple online tool that can do this called sqlizer.io.
You upload an XLSX file to it, enter a sheet name and cell range, and it will generate a CREATE TABLE statement and a bunch of INSERT statements to import all your data into a MySQL database.
(Disclaimer: I help run SQLizer)
Below is another method to import spreadsheet data into a MySQL database that doesn't rely on any extra software. Let's assume you want to import your Excel table into the sales table of a MySQL database named mydatabase.
Select the relevant cells:
Paste into Mr. Data Converter and select the output as MySQL:
Change the table name and column definitions to fit your requirements in the generated output:
CREATE TABLE sales (
id INT NOT NULL AUTO_INCREMENT PRIMARY KEY,
Country VARCHAR(255),
Amount INT,
Qty FLOAT
);
INSERT INTO sales
(Country,Amount,Qty)
VALUES
('America',93,0.60),
('Greece',9377,0.80),
('Australia',9375,0.80);
If you're using MySQL Workbench or already logged into mysql from the command line, then you can execute the generated SQL statements from step 3 directly. Otherwise, paste the code into a text file (e.g., import.sql) and execute this command from a Unix shell:
mysql mydatabase < import.sql
Other ways to import from a SQL file can be found in this Stack Overflow answer.
Export it into some text format. The easiest will probably be a tab-delimited version, but CSV can work as well.
Use the load data capability. See http://dev.mysql.com/doc/refman/5.1/en/load-data.html
Look halfway down the page, as it gives a good example for tab-separated data:
FIELDS TERMINATED BY '\t' ENCLOSED BY '' ESCAPED BY '\\'
Check your data. Sometimes quoting or escaping has problems, and you need to adjust your source, import command-- or it may just be easier to post-process via SQL.
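Put together, a LOAD DATA statement for such a tab-delimited export might look like this; the file path and table name are assumptions, and the column list matches the example table from the question (Country, Amount, Qty):

```sql
-- Load a tab-delimited text file into an existing table.
-- '/tmp/excel_export.txt' and 'sales' are placeholder names.
LOAD DATA LOCAL INFILE '/tmp/excel_export.txt'
INTO TABLE sales
FIELDS TERMINATED BY '\t' ENCLOSED BY '' ESCAPED BY '\\'
IGNORE 1 LINES          -- skip the header row
(Country, Amount, Qty);
```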
There are actually several ways to import an Excel file into a MySQL database, with varying degrees of complexity and success.
Excel2MySQL: Hands down the easiest and fastest way to import Excel data into MySQL. It supports all versions of Excel and doesn't require an Office install.
LOAD DATA INFILE: This popular option is perhaps the most technical and requires some understanding of MySQL command execution. You must manually create your table before loading and use appropriately sized VARCHAR field types. Therefore, your field data types are not optimized. LOAD DATA INFILE has trouble importing large files that exceed 'max_allowed_packet' size. Special attention is required to avoid problems importing special characters and foreign unicode characters. Here is a recent example I used to import a csv file named test.csv.
phpMyAdmin: Select your database first, then select the Import tab. phpMyAdmin will automatically create your table and size your VARCHAR fields, but it won't optimize the field types. phpMyAdmin has trouble importing large files that exceed 'max_allowed_packet' size.
MySQL for Excel: This is a free Excel Add-in from Oracle. This option is a bit tedious because it uses a wizard and the import is slow and buggy with large files, but this may be a good option for small files with VARCHAR data. Fields are not optimized.
Not sure if you have all this set up, but for me, I am using PHP and MySQL, so I use the PHP class PHPExcel. It takes a file in nearly any format (xls, xlsx, csv, ...) and then lets you read and/or insert it.
So what I wind up doing is loading the Excel file into a PHPExcel object and then looping through all the rows. Based on what I want, I write a simple SQL INSERT command to insert the data from the Excel file into my table.
On the front end it is a little work, but it's just a matter of tweaking some of the existing code examples. Once you have it dialed in, making changes to the import is simple and fast.
The best and easiest way is to use the "MySQL for Excel" app, a free app from Oracle. It adds a plugin to Excel to export and import data to MySQL. You can download it from here.
When using text files to import data, I had problems with quotes and with how Excel formatted numbers. For example, my Excel configuration used the comma as the decimal separator instead of the dot.
Now I use Microsoft Access 2010 to open my MySQL table as a linked table. There I can simply copy and paste cells from Excel to Access.
To do this, first install the MySQL ODBC driver and create an ODBC connection.
Then, in Access, on the "External Data" tab, open the "ODBC Database" dialog and link to any table using the ODBC connection.
Using MySQL Workbench, you can also copy and paste your Excel data into the result grid of MySQL Workbench. I gave detailed instructions in this answer.
The fastest and simplest way is to save the XLS as ODS (OpenDocument Spreadsheet) and import it from phpMyAdmin.
For a step by step example for importing Excel 2007 into MySQL with correct encoding (UTF-8) search for this comment:
"Posted by Mike Laird on October 13 2010 12:50am"
in the next URL:
http://dev.mysql.com/doc/refman/5.1/en/load-data.html
You could use DocChow, a very intuitive GUI for importing Excel into MySQL, and it's free on most common platforms (including Linux).
Especially if you are concerned about date and datetime datatypes, DocChow handles them easily. If you are working with multiple Excel spreadsheets that you want to import into one MySQL table, DocChow does the dirty work.
Step 1: Create your CSV file.
Step 2: Log in to your MySQL server:
mysql -uroot -pyourpassword
Step 3: Load your CSV file:
LOAD DATA LOCAL INFILE '/home/my-sys/my-excel.csv'
INTO TABLE my_tables
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
(Country, Amount, Qty);
Another useful tool, and a MySQL front-end replacement, is Toad for MySQL. Sadly it is no longer supported by Quest, but it is a brilliant IDE for MySQL, with import and export wizards catering for most file types.
If you are using Toad for MySQL, the steps to import a file are as follows:
Create a table in MySQL with the same columns as the file to be imported.
Now that the table is created, go to Tools > Import > Import Wizard.
In the Import Wizard dialog box, click Next.
Click Add File, then browse and select the file to be imported.
Choose the correct delimiter ("," separated for a .csv file).
Click Next and check that the mapping is done properly.
Click Next, select the "A single existing table" radio button, and select the table to be mapped from the Tables drop-down menu.
Click Next and finish the process.
If you don't like plugins, VBA, and external tools, I have an Excel file that, using formulas only, allows you to create INSERTs/UPDATEs. You only have to put the data in the cells:
As an extra, there's another tab in the file to CREATE TABLES:
The file can be found on the following link:
EXCEL FILE
I've had good results with the Tools / Import CSV file feature in HeidiSQL, with CSV files exported directly from Excel 2019 with "Save As...".
It uses LOAD DATA INFILE internally, but with a GUI, and it analyzes the CSV file before passing it to LOAD DATA INFILE. It can, for example, create the table using the first row as column names and guess the column data types (the <New table> option shown in the picture).