Import .CSV values into SQL Server 2008 using an MFC application - sql-server-2008

I have created a dialog-based application in MFC, and from it I am connecting to a SQL Server 2008 database.
So far the connection works fine.
Now I have a .CSV file containing a large amount of data. How can I insert the values from the .CSV file into SQL Server 2008 using the MFC application?

You have to read the CSV (Comma Separated Values) file character by character: whenever you encounter a ',' you can treat it as the end of one field, and a line break marks the end of a record.
Please see this link for the detailed code.
Then you simultaneously write into the database table whenever you reach the end of a record.
This link or this one provides a description of how to insert data into a database table using MFC.
Hopefully these will get you started.
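As an alternative to parsing the file yourself, you can let SQL Server 2008 parse the CSV with its built-in BULK INSERT statement, executed over the connection your MFC application already has. A minimal sketch; the table and file names here are placeholders, not from the question:

BULK INSERT dbo.MyImportTable          -- placeholder target table
FROM 'C:\data\values.csv'              -- placeholder path, resolved on the server
WITH (
    FIELDTERMINATOR = ',',             -- field separator
    ROWTERMINATOR   = '\n',            -- record separator
    FIRSTROW        = 2                -- skip a header row, if present
);

Note that the file path is resolved on the SQL Server machine, not the client, so this only works if the server can see the file.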

Related

Add VBA code in order to amend Excel output in SSIS before sending to Excel destination?

I have a current package which basically copies an Excel template sheet, writes data into it from a table in the SQL database, and then sends it to an Excel destination (ishare drive and server drives). We have repeated rows with specific data which need to be removed from the Excel sheet before it is sent out. We have VBA code which can be run in Excel as a macro to achieve this result. I am wondering how I can automate this in SSIS in the data flow?
I have a current package which basically copies an Excel template
sheet, writes data into this from a table in the SQL db
You can use a Data Flow Task where you import data from the SQL Server database:
Create a connection manager:
When you click on the New Connection Manager option, an Add SSIS Connection Manager window will open where you select the connection manager type from the given list.
Then select the Provider, Server Name, and Database Name (you will point to a SQL Server database, where you can either select a table or use a query).
If you want to remove duplicate entries, you can drag a Sort Transformation onto the data flow and connect your OLE DB Source to it. Double-click the Sort Transformation and choose the columns to sort on. Also tick the check box Remove rows with duplicate sort values, then click OK.
If you have other specific rules, you can implement them in a T-SQL query when importing the data at the beginning (in the OLE DB Source); a sketch follows below.
Then you can link it to an Excel destination.
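A minimal sketch of such a source query, assuming a hypothetical table dbo.SourceTable with columns col1 to col3; SELECT DISTINCT drops fully duplicated rows before they ever enter the data flow:

-- Hypothetical table and column names; replace with your own.
SELECT DISTINCT col1, col2, col3
FROM dbo.SourceTable;

Paste this into the OLE DB Source (data access mode "SQL command") instead of selecting the table directly.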

Import MySQL file into Oracle SQL Developer

I have tried to find an answer for this elsewhere but cannot, I hope someone can help me.
I am trying to import the MySQL sample database into Oracle SQL Developer.
I have an existing database/connection I want to dump it into. I have created an empty table named classicmodels in my existing connection. (Yes, that name is really just one table within the sample DB; ignore the error in naming convention.)
When I right-click on it and try 'Import Data', I cannot import a .sql file; I can only do it with Excel, CSV, etc.
When I try to run a script I found on dba.stackexchange,
@\path\mysqlsampledatabase.sql, I get a series of 'please provide substitution value' messages, which does not make sense to me given that I am importing a database which is built for SQL (i.e., what reason is there to substitute?).
(Screenshots omitted. In them, 'UnseenCollection' is a single table I imported as a CSV file.) I need to import the mysqlsampledatabase file so that it shows up the same way and I can access all tables within the sample DB.
If anyone can help, I would appreciate it. I need the end result to be the entire mysqlsampledatabase populated within the 'classicmodels' node.
Thank you.
Connect to MySQL.
Connect to Oracle.
For a single MySQL table: right-click it and choose 'Copy to Oracle'.
For a few tables: select them, then drag and drop onto the Oracle connection (requires a newer version of SQL Developer).
For an entire MySQL database: use the migration project feature.

Why are all the records not being copied from CSV to SQL table in an SSIS package

I am trying to copy data from a flat file to a SQL table using SSIS.
I have a Data Flow Task where I have created a Flat File Source pointing to the csv file and an OLE DB Destination pointing to the table I want the data in.
The problem I am facing is that when I run the package, only 2621 rows are copied to the SQL destination table, whereas I have about 170,000 records in the CSV. Not sure why this is happening.
Thanks in advance.
This could be a number of things. This is what comes to mind:
The connection string to your flat file is overwritten by a variable expression or a package configuration. Check SSIS -> Package configurations or the Expressions properties on your connection manager.
The DataRowsToSkip property on your flat file connection manager is set to a non-zero value.
The metadata definition of your flat file is incorrectly configured in your connection manager. See properties such as Format, Row delimiter, Column delimiter, etc. Use the preview function to check the output.
The error output on your flat file source is set to Ignore failure, meaning that lines which SSIS cannot process (due to, e.g., incompatible data types) are ignored without warning.

MS Access to MySQL Conversion help (Gigantic table)

So I have this gigantic table, containing approx 7 million records, in MS Access (*.mdb), I want to transfer it into a much more workable MySQL format, and store it on my webserver. The file itself weighs 2GB.
The problem is, since the table is so large, it won't let me export it normally (Access says the limit is 65,536 records.)
I've tried some 3rd party software but to no avail.
Can anyone recommend a clean way of doing so, without damaging the data inside?
Thanks in advance for any help.
Install an ODBC driver for MySQL, if you don't have one already. The latest version is available here: Download Connector/ODBC
Create a DSN (Data Source Name) for your MySQL server from the Windows ODBC Data Source Administrator.
Then from Access 2003, select your table in the Database Window, and choose File->Export from Access' main menu. In the "Export Table 'yourtablename' To ..." dialog, select "ODBC Databases()" from the "Save as type" drop-down list (at the bottom of the dialog). The next dialog allows you to specify the name MySQL will use for the exported table, and it defaults to the Access table name. After you click OK, you will get another dialog, "Select Data Source", where you can select your DSN for MySQL. After you click OK on that dialog, you will probably get one more asking you for user name and password. Supply them, and click OK.
Hopefully your table will then transfer without errors. However, I've never done that operation with MySQL. It has worked for me with ODBC transfers to SQL Server and PostgreSQL, so I don't see why it wouldn't work with MySQL, too.
Also I've never attempted to export 7 million records in one go. If it chokes, we'll have to figure out a work-around.
If you're using Access 2007 instead of 2003, look for a similar option starting with the Export section of the ribbon.
I suggested this approach because my impression is this export will be a one-time deal, so I think the Access UI export method would be easiest. However, you can do essentially the same operation with VBA code using the DoCmd.TransferDatabase Method with your ODBC DSN.
Yet another alternative would be to create a compatible table structure in MySQL, create a link in Access to the MySQL destination table (using your DSN again), then run an "append query" from Access:
INSERT INTO link_to_mysql_table (field1, field2, field3, etc)
SELECT field1, field2, field3, etc
FROM access_table;
The append query approach could be useful in case the export chokes on 7 million records. You could add a WHERE clause to limit the SELECT query's output record set to a manageable chunk size, and then repeat with a different WHERE to specify another chunk.
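For example, a minimal sketch of one such chunk, assuming (hypothetically) that the Access table has a numeric primary key column named id:

INSERT INTO link_to_mysql_table (field1, field2, field3)
SELECT field1, field2, field3
FROM access_table
WHERE id BETWEEN 1 AND 1000000;   -- first chunk; id is a hypothetical key column

Repeat with WHERE id BETWEEN 1000001 AND 2000000, and so on, until all rows are transferred.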
Is that 7 million figure from after a compact + repair? I mean, if each record is about 120 characters in length, 2 gigs has room for roughly 17 million records.
Also, I'm not aware of a 65,000-record export limit, except in regard to Excel.
So, you can/should be able to export the data to a CSV, and then use a bulk text import in MySQL to pull that data in. So, try exporting the table as CSV. That should work.
I mean, you could link a table via ODBC if you have a good local connection to the SQL server, but if not, then I would export to CSV (it is VERY fast). I would then zip the file (they zip fantastically well). Upload the file to the server, un-zip it, and then use a bulk text import. Such a zipped file is VERY small and will save huge amounts of transfer time.
You can also consider using tab-delimited files, as MySQL can import those too, but a simple text file should work just fine.
I would use pyodbc as described in
http://en.wikibooks.org/wiki/Python_Programming/Database_Programming
Download Python 2.7 from
http://python.org/
Download pyodbc from
http://code.google.com/p/pyodbc/
Modify the following code to set myfile.mdb and MyTable according to your file and table.
Save the code in a file named translate.py:
import csv
import pyodbc

# Writer for the output CSV file.
mycsv = csv.writer(open('result.csv', 'wb'), delimiter=',',
                   quotechar='"', quoting=csv.QUOTE_MINIMAL)

# Connect to the Access database through the Access ODBC driver.
DBfile = 'myfile.mdb'
conn = pyodbc.connect('DRIVER={Microsoft Access Driver (*.mdb)};DBQ=' + DBfile)
cursor = conn.cursor()

SQL = 'SELECT * FROM MyTable;'
for row in cursor.execute(SQL):  # cursors are iterable
    mycsv.writerow(row)          # write each record out as one CSV row

cursor.close()
conn.close()
Then run python translate.py.
Install MySQL on your own system and upsize to it, rather than trying to work against the remote server directly. Then run an append query from your local MySQL to the server instance.

How to import an Excel file into a MySQL database

Can anyone explain how to import a Microsoft Excel file into a MySQL database?
For example, my Excel table looks like this:
Country | Amount | Qty
----------------------------------
America | 93 | 0.60
Greece | 9377 | 0.80
Australia | 9375 | 0.80
There's a simple online tool that can do this called sqlizer.io.
You upload an XLSX file to it, enter a sheet name and cell range, and it will generate a CREATE TABLE statement and a bunch of INSERT statements to import all your data into a MySQL database.
(Disclaimer: I help run SQLizer)
Below is another method to import spreadsheet data into a MySQL database that doesn't rely on any extra software. Let's assume you want to import your Excel table into the sales table of a MySQL database named mydatabase.
Select the relevant cells:
Paste into Mr. Data Converter and select the output as MySQL:
Change the table name and column definitions to fit your requirements in the generated output:
CREATE TABLE sales (
id INT NOT NULL AUTO_INCREMENT PRIMARY KEY,
Country VARCHAR(255),
Amount INT,
Qty FLOAT
);
INSERT INTO sales
(Country,Amount,Qty)
VALUES
('America',93,0.60),
('Greece',9377,0.80),
('Australia',9375,0.80);
If you're using MySQL Workbench or already logged into mysql from the command line, then you can execute the generated SQL statements from step 3 directly. Otherwise, paste the code into a text file (e.g., import.sql) and execute this command from a Unix shell:
mysql mydatabase < import.sql
Other ways to import from a SQL file can be found in this Stack Overflow answer.
Export it into some text format. The easiest will probably be a tab-delimited version, but CSV can work as well.
Use the LOAD DATA capability. See http://dev.mysql.com/doc/refman/5.1/en/load-data.html
Look halfway down the page; it gives a good example for tab-separated data:
FIELDS TERMINATED BY '\t' ENCLOSED BY '' ESCAPED BY '\\'
Check your data. Sometimes quoting or escaping has problems, and you need to adjust your source or import command, or it may just be easier to post-process via SQL.
There are actually several ways to import an Excel file into a MySQL database, with varying degrees of complexity and success.
Excel2MySQL. Hands down the easiest and fastest way to import Excel data into MySQL. It supports all versions of Excel and doesn't require an Office install.
LOAD DATA INFILE: This popular option is perhaps the most technical and requires some understanding of MySQL command execution. You must manually create your table before loading and use appropriately sized VARCHAR field types; therefore, your field data types are not optimized. LOAD DATA INFILE has trouble importing large files that exceed the 'max_allowed_packet' size. Special attention is required to avoid problems importing special characters and foreign unicode characters. A sketch of such an import for a CSV file named test.csv appears after this list.
phpMyAdmin: Select your database first, then select the Import tab. phpMyAdmin will automatically create your table and size your VARCHAR fields, but it won't optimize the field types. phpMyAdmin has trouble importing large files that exceed the 'max_allowed_packet' size.
MySQL for Excel: This is a free Excel add-in from Oracle. This option is a bit tedious because it uses a wizard, and the import is slow and buggy with large files, but it may be a good option for small files with VARCHAR data. Fields are not optimized.
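A minimal sketch of a LOAD DATA INFILE import for test.csv; the table name, column layout, and line endings here are assumptions, so adjust them to your data:

LOAD DATA INFILE 'test.csv'
INTO TABLE mytable                  -- hypothetical, manually created table
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'          -- Windows line endings; use '\n' for Unix files
IGNORE 1 LINES;                     -- skip the header row

If the file lives on the client machine rather than the server, use LOAD DATA LOCAL INFILE instead.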
Not sure if you have all this set up, but I am using PHP and MySQL, so I use the PHP class PHPExcel. It takes a file in nearly any format (xls, xlsx, csv, ...) and then lets you read and/or insert.
So what I wind up doing is loading the Excel file into a PHPExcel object and then looping through all the rows. Based on what I want, I write a simple SQL INSERT command to insert the data from the Excel file into my table.
On the front end it is a little work, but it's just a matter of tweaking some of the existing code examples. Once you have it dialed in, making changes to the import is simple and fast.
The best and easiest way is to use the "MySQL for Excel" app, a free app from Oracle. It adds a plugin to Excel for exporting and importing data to MySQL. You can download it from here.
When using text files to import data, I had problems with quotes and with how Excel was formatting numbers. For example, my Excel configuration used the comma as the decimal separator instead of the dot.
Now I use Microsoft Access 2010 to open my MySQL table as a linked table. There I can simply copy and paste cells from Excel into Access.
To do this, first install the MySQL ODBC driver and create an ODBC connection.
Then in Access, on the "External Data" tab, open the "ODBC Database" dialog and link to any table using the ODBC connection.
Using MySQL Workbench, you can also copy and paste your Excel data into the result grid of MySQL Workbench. I gave detailed instructions in this answer.
The fastest and simplest way is to save the XLS as an ODS (OpenDocument Spreadsheet) and import it from phpMyAdmin.
For a step-by-step example of importing Excel 2007 into MySQL with correct encoding (UTF-8), search for this comment:
"Posted by Mike Laird on October 13 2010 12:50am"
at the following URL:
http://dev.mysql.com/doc/refman/5.1/en/load-data.html
You could use DocChow, a very intuitive GUI for importing Excel into MySQL, and it's free on most common platforms (including Linux).
In particular, if you are concerned about date and datetime datatypes, DocChow handles them easily. And if you are working with multiple Excel spreadsheets that you want to import into one MySQL table, DocChow does the dirty work.
Step 1: Create your CSV file.
Step 2: Log in to your MySQL server:
mysql -uroot -pyourpassword
Step 3: Load your CSV file:
load data local infile '//home/my-sys/my-excel.csv' into table my_tables fields terminated by ',' enclosed by '"' (Country, Amount, Qty);
Another useful tool, and a MySQL front-end replacement, is Toad for MySQL. Sadly it is no longer supported by Quest, but it is a brilliant IDE for MySQL, with import and export wizards catering for most file types.
If you are using Toad for MySQL, the steps to import a file are as follows:
Create a table in MySQL with the same columns as the file to be imported.
Now that the table is created, go to Tools > Import > Import Wizard.
In the Import Wizard dialog box, click Next.
Click Add File, then browse and select the file to be imported.
Choose the correct delimiter ("," separated for a .csv file).
Click Next and check that the mapping is done properly.
Click Next, select the "A single existing table" radio button, and select the table to be mapped from the Tables drop-down menu.
Click Next and finish the process.
If you don't like plugins, VBA, or external tools, I have an Excel file that, using formulas only, allows you to create INSERTs/UPDATEs. You only have to put the data in the cells:
As an extra, there's another tab in the file to CREATE TABLEs:
The file can be found at the following link:
EXCEL FILE
I've had good results with the Tools / Import CSV File feature in HeidiSQL, with CSV files exported directly from Excel 2019 with "Save As...".
It uses LOAD DATA INFILE internally, but with a GUI, and it also analyzes the CSV file before passing it to LOAD DATA INFILE, so it can, for example, create the table using the first row as column names and guess the column data types (the <New table> option shown in the screenshot).