How to load Excel data into a SQL Server 2008 table? - sql-server-2008

DBMS - SQL Server 2008 R2 with Management Studio (GUI)
Excel file - columns exactly like the columns in the destination table
How do I load the data from the Excel file into the SQL table? Are there any potential problems with this approach? For example, a column does not allow NULL, but the Excel file contains a null value.

The simplest way to import XLS data directly into SQL Server is to use the Import Wizard.
You can either have the wizard create the table for you when it runs, or you can create the table beforehand and then point the wizard at that table as the target.
I prefer the latter method, as I like to control my table creation with my desired data types.
To find the Import Wizard, simply right-click on the database name, then Tasks -> Import Data
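If you prefer a scripted alternative to the wizard, a distributed query with OPENROWSET can read the worksheet straight into an existing table. This is only a minimal sketch: it assumes the Microsoft ACE OLE DB provider is installed, that 'Ad Hoc Distributed Queries' has been enabled via sp_configure, and it uses hypothetical file, sheet, table, and column names. The ISNULL call is one way to deal with the NULL-versus-NOT-NULL mismatch raised in the question.
-- Hypothetical workbook C:\data\source.xlsx (sheet Sheet1, header row)
-- and existing table dbo.TargetTable(CustomerName varchar(100) NOT NULL, Amount int NULL).
INSERT INTO dbo.TargetTable (CustomerName, Amount)
SELECT ISNULL(CustomerName, 'Unknown'),  -- substitute a default where the Excel cell is blank
       Amount
FROM OPENROWSET(
    'Microsoft.ACE.OLEDB.12.0',
    'Excel 12.0;Database=C:\data\source.xlsx;HDR=YES',
    'SELECT CustomerName, Amount FROM [Sheet1$]'
);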

You can also use an Excel Data Source in SQL Server Integration Services.
Some caveats though:
If you are running on a 64-bit OS, make sure your package properties are set to disable the 64-bit runtime, or the import from Excel 2003-2007 will fail with a connection error.
Take the time to use the advanced editor to shrink your columns to the size you need, or every field will come out as nvarchar(255) by default, which is a waste of space. If you are mapping to an existing table, you may need to convert the data types as well.
The advantage of this approach is that you can re-use it more easily, though of course you can save the package the wizard creates as well.

Related

SQL Server Import and Export Wizard is crashing when trying to export to Oracle

I am trying to migrate SQL Server 2008 table records to an Oracle 11g database using the SQL Server Import and Export Wizard.
Steps:
Select the SQL database -> right-click
Tasks -> Export Data
Choose a data source (keep the default options)
Click the "Next" button
Choose a Destination: (Oracle Provider for OLE DB)
Select the table and click the Finish button
There are millions of records in the table, but after copying a couple of million records, the SQL Server Import and Export Wizard crashes.
You need to specify the Rows per batch or Maximum insert commit size options if you are dealing with millions of rows.
I am not sure whether the wizard has an option to specify Rows per batch or Maximum insert commit size.
If not, you need to create an SSIS package to do so.
I agree with Henry. If you want to deal with millions of records, create an SSIS package instead of using the old Import and Export Wizard, and set options such as Rows per batch and Maximum insert commit size. You need to be aware of your system's capacity when setting these options.

Need to update Mysql periodically from Oracle

I have a user table in MySQL, and this table is copied from another database (Oracle).
The two tables have exactly the same structure, and the MySQL table needs to be updated once a day (MySQL ← Oracle).
Is it possible to access Oracle and retrieve the data from within MySQL, for example with something like a stored procedure?
(It seems to be possible between MySQL servers.)
Or do I have to find another way?
Try the Data Import tool in dbForge Studio for MySQL.
Start Data Import
Check ODBC format, specify its options and select table to import from
Specify mapping and other options
Select Append, Update or Append/Update import mode.
You can also run this tool in command-line mode: save a data import template file and run it from the command line once a day.

Reading excel from DB (varbinary) in SSIS

First I'd like to say that I'm brand new to SSIS so bear with me if this is a very basic question. I've searched and cannot find an answer.
I need to read data from SQL Server that is stored in a varbinary column containing an Excel document. I then need to store this data in another table with the appropriate columns (a pre-defined format).
My question is essentially... How do I read this varbinary data into something I can work with and then insert into another table?
You could use the Export Column transformation, available within the Data Flow Task, to read the varbinary data and then save it as a file on the local disk where the SSIS package is running.
MSDN documentation about Export Column transformation.
Sample: The Export Column Transformation on BI Monkey
Using another data flow task, you can read the saved file and import the data into the table of your choice.
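For reference, here is a sketch of the kind of source query that could feed the Export Column transformation; the transformation expects one input column holding the binary data and another holding the path of the file to write. Table, column, and folder names are hypothetical.
-- Hypothetical source table dbo.Documents(DocumentId int, ExcelData varbinary(max))
SELECT d.DocumentId,
       d.ExcelData,                                            -- the stored workbook
       'C:\ssis\extracts\' + CAST(d.DocumentId AS varchar(10))
           + '.xls' AS ExportFilePath                          -- per-row destination file path
FROM dbo.Documents AS d;
In the Export Column editor you would then map ExcelData as the Extract Column and ExportFilePath as the File Path Column, and point the second data flow's Excel Source at the extracted files.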

SQL Server 2005/2008 - Import a fixed width text file via the command line?

In MySQL I am able to create a table with fixed column widths and then can use the load data infile command to import a fixed width file.
For example:
Fixed width text file = JOHN 1234
Imports into table:
Username - CHAR(8)
Password - CHAR(4)
The beauty of this approach is that the file is 'chopped' up based on the column sizes defined in the MySQL table.
Now there is a new project requiring SQL Server 2005.
Does SQL Server have a function similar to load data infile? Or is there a better approach than the one I'm taking?
You do have similar functionality with SQL Server. I would encourage you to learn about format files. This page, from Microsoft, does a fairly good job of explaining it.
http://msdn.microsoft.com/en-us/library/ms178129.aspx
I would also encourage you to read this blog:
6 ways to import data into SQL Server
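As a rough illustration of what a format-file approach looks like, here is a minimal sketch assuming a table dbo.Users(Username CHAR(8), Password CHAR(4)), a data file C:\data\users.txt with Windows line endings, and a trusted connection; widths, paths, and the server name are placeholders.
users.fmt (non-XML format file; 9.0 is the format version for SQL Server 2005):
9.0
2
1  SQLCHAR  0  8  ""      1  Username  SQL_Latin1_General_CP1_CI_AS
2  SQLCHAR  0  4  "\r\n"  2  Password  SQL_Latin1_General_CP1_CI_AS
Load from the command line with bcp:
bcp MyDatabase.dbo.Users in C:\data\users.txt -f C:\data\users.fmt -S myserver -T
or from T-SQL with BULK INSERT (the file path must be visible to the SQL Server service):
BULK INSERT dbo.Users
FROM 'C:\data\users.txt'
WITH (FORMATFILE = 'C:\data\users.fmt');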

How to import an Excel file into a MySQL database

Can any one explain how to import a Microsoft Excel file in to a MySQL database?
For example, my Excel table looks like this:
Country | Amount | Qty
----------------------------------
America | 93 | 0.60
Greece | 9377 | 0.80
Australia | 9375 | 0.80
There's a simple online tool that can do this called sqlizer.io.
You upload an XLSX file to it, enter a sheet name and cell range, and it will generate a CREATE TABLE statement and a bunch of INSERT statements to import all your data into a MySQL database.
(Disclaimer: I help run SQLizer)
Below is another method to import spreadsheet data into a MySQL database that doesn't rely on any extra software. Let's assume you want to import your Excel table into the sales table of a MySQL database named mydatabase.
Select the relevant cells.
Paste into Mr. Data Converter and select the output as MySQL.
Change the table name and column definitions to fit your requirements in the generated output:
CREATE TABLE sales (
id INT NOT NULL AUTO_INCREMENT PRIMARY KEY,
Country VARCHAR(255),
Amount INT,
Qty FLOAT
);
INSERT INTO sales
(Country,Amount,Qty)
VALUES
('America',93,0.60),
('Greece',9377,0.80),
('Australia',9375,0.80);
If you're using MySQL Workbench or already logged into mysql from the command line, then you can execute the generated SQL statements from step 3 directly. Otherwise, paste the code into a text file (e.g., import.sql) and execute this command from a Unix shell:
mysql mydatabase < import.sql
Other ways to import from a SQL file can be found in this Stack Overflow answer.
Export it into some text format. The easiest will probably be a tab-delimited version, but CSV can work as well.
Use the load data capability. See http://dev.mysql.com/doc/refman/5.1/en/load-data.html
Look halfway down the page, as it gives a good example for tab-separated data:
FIELDS TERMINATED BY '\t' ENCLOSED BY '' ESCAPED BY '\\'
Check your data. Sometimes quoting or escaping causes problems, and you need to adjust your source or import command, or it may just be easier to post-process via SQL.
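Putting those pieces together, a complete statement might look like the sketch below; the file path, table name, and header-row assumption are illustrative.
-- Assumes a tab-delimited export at /tmp/sales.txt and an existing table sales(Country, Amount, Qty)
LOAD DATA INFILE '/tmp/sales.txt'
INTO TABLE sales
FIELDS TERMINATED BY '\t' ENCLOSED BY '' ESCAPED BY '\\'
LINES TERMINATED BY '\n'
IGNORE 1 LINES    -- skip the header row exported from Excel
(Country, Amount, Qty);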
There are actually several ways to import an Excel file into a MySQL database, with varying degrees of complexity and success.
Excel2MySQL. Hands down, the easiest and fastest way to import Excel data into MySQL. It supports all versions of Excel and doesn't require an Office install.
LOAD DATA INFILE: This popular option is perhaps the most technical and requires some understanding of MySQL command execution. You must manually create your table before loading and use appropriately sized VARCHAR field types. Therefore, your field data types are not optimized. LOAD DATA INFILE has trouble importing large files that exceed 'max_allowed_packet' size. Special attention is required to avoid problems importing special characters and foreign unicode characters. Here is a recent example I used to import a csv file named test.csv.
phpMyAdmin: Select your database first, then select the Import tab. phpMyAdmin will automatically create your table and size your VARCHAR fields, but it won't optimize the field types. phpMyAdmin has trouble importing large files that exceed 'max_allowed_packet' size.
MySQL for Excel: This is a free Excel Add-in from Oracle. This option is a bit tedious because it uses a wizard and the import is slow and buggy with large files, but this may be a good option for small files with VARCHAR data. Fields are not optimized.
Not sure if you have all this set up, but I am using PHP and MySQL, so I use the PHP class PHPExcel. It takes a file in nearly any format (xls, xlsx, csv, ...) and then lets you read and/or insert the data.
What I wind up doing is loading the Excel file into a PHPExcel object and then looping through all the rows. Based on what I want, I write a simple SQL INSERT command to insert the data from the Excel file into my table.
On the front end it is a little work, but it's just a matter of tweaking some of the existing code examples. Once you have it dialed in, making changes to the import is simple and fast.
The best and easiest way is to use the "MySQL for Excel" app, a free app from Oracle. It adds a plugin to Excel for exporting and importing data to MySQL. You can download it from here.
When using text files to import data, I had problems with quotes and how Excel was formatting numbers. For example, my Excel configuration used the comma as decimal separator instead of the dot.
Now I use Microsoft Access 2010 to open my MySQL table as a linked table. There I can simply copy and paste cells from Excel into Access.
To do this, first install the MySQL ODBC driver and create an ODBC connection.
Then in Access, on the "External Data" tab, open the "ODBC Database" dialog and link to any table using the ODBC connection.
Using MySQL Workbench, you can also copy and paste your Excel data into the result grid of MySQL Workbench. I gave detailed instructions in this answer.
The fastest and simplest way is to save the XLS as ODS (OpenDocument Spreadsheet) and import it from phpMyAdmin.
For a step-by-step example of importing Excel 2007 into MySQL with the correct encoding (UTF-8), search for this comment:
"Posted by Mike Laird on October 13 2010 12:50am"
at the following URL:
http://dev.mysql.com/doc/refman/5.1/en/load-data.html
You could use DocChow, a very intuitive GUI for importing Excel into MySQL, and it's free on most common platforms (including Linux).
Especially if you are concerned about DATE and DATETIME data types, DocChow handles them easily. And if you are working with multiple Excel spreadsheets that you want to import into one MySQL table, DocChow does the dirty work.
Step 1: Create your CSV file.
Step 2: Log in to your MySQL server:
mysql -uroot -pyourpassword
Step 3: Load your CSV file:
LOAD DATA LOCAL INFILE '//home/my-sys/my-excel.csv'
INTO TABLE my_tables
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
(Country, Amount, Qty);
Another useful tool, and a MySQL front-end replacement, is Toad for MySQL. Sadly, it is no longer supported by Quest, but it is a brilliant IDE for MySQL, with import and export wizards catering for most file types.
If you are using Toad for MySQL, the steps to import a file are as follows:
Create a table in MySQL with the same columns as the file to be imported.
Once the table is created, go to Tools > Import > Import Wizard.
In the Import Wizard dialog box, click Next.
Click Add File, then browse and select the file to be imported.
Choose the correct delimiter ("," separated for a .csv file).
Click Next and check that the mapping is done properly.
Click Next, select the "A single existing table" radio button, and select the table to be mapped from the Tables dropdown menu.
Click Next and finish the process.
If you don't like plugins, VBA, or external tools, I have an Excel file that, using formulas only, allows you to create INSERT/UPDATE statements; you only have to put the data in the cells (a sketch of such a formula appears after the link below).
As an extra, there's another tab in the file to CREATE TABLEs.
The file can be found at the following link:
EXCEL FILE
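As an illustration of the formula-only idea (not the actual contents of the linked file), a single cell formula can concatenate row values into an INSERT statement; the layout assumed here is data in columns A-C starting at row 2 and a target table named sales:
="INSERT INTO sales (Country, Amount, Qty) VALUES ('" & A2 & "', " & B2 & ", " & C2 & ");"
Fill the formula down the column and paste the resulting text into your SQL client; text values containing single quotes would need to be escaped first.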
I've had good results with the Tools / Import CSV file feature in HeidiSQL, using CSV files exported directly from Excel 2019 with "Save As...".
It uses LOAD DATA INFILE internally, but with a GUI, and it analyzes the CSV file before passing it to LOAD DATA INFILE, so it can, for example, create the table using the first row as column names and guess the column data types (the <New table> option).