SQL Import Wizard errors on importing a psv file - sql-server-2014

I am trying to import a psv (pipe-delimited csv) file into a Microsoft SQL Server 2008 R2 Express database table.
There are only two fields in the psv, and each field holds more than 1,000 characters.
In the import wizard I double-checked the column mappings and set both the On Error and On Truncation options to Ignore, and as usual, I get an error:
Error 0xc02020a1: Data Flow Task 1: Data conversion failed. The data conversion for column "Comm" returned status value 4 and status text "Text was truncated or one or more characters had no match in the target code page.". (SQL Server Import and Export Wizard)
UPDATE:
So, following Marc's suggestion, though extremely reluctant, I spent about 3 hours getting SQL 2014 installed on my computer, hoping to import the psv there. As expected, the same error shows up again.
I really cannot understand why a company like Microsoft does not do thorough QA on its products?!

After being tortured by Microsoft for a whole morning, I finally got this task done. For future readers, you can follow the steps below to import a csv/psv data source into your SQL Server:
1) Import the CSV/PSV into an Access database. Note that it must be saved as the mdb type (yes, the format from the 20th century); you might want to read my story here: how to import psv data into Microsoft Access.
2) In your SQL Server (mine is 2014), start the Import Wizard and select the data source type (Microsoft Access) and the file. Why do you have to use the mdb type of Access database? Because there is no option in SQL 2014 for the accdb type of Access database.
3) DO NOT forget to select the right Destination (yes, even though you started the wizard by right-clicking the destination database and choosing Import): you want to select the last option, SQL Server Native Client 11.0. That will bring up the SQL 2014 instance and the database.
4) Now the import completes as expected.
Thanks to the great design logic in this SQL Server (2014? No, essentially unchanged from 2008), such a humble expectation and requirement cost me 4-5 hours to complete.
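If you end up using the Access detour regularly, the same pull can be scripted instead of clicked through. The sketch below is only an illustration under assumptions: the Jet OLE DB provider is available (32-bit SQL Server; a 64-bit instance needs the ACE provider instead), ad hoc distributed queries are enabled, and the file, table, and column names are made up.

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'Ad Hoc Distributed Queries', 1;
RECONFIGURE;
GO

-- pull the Access table straight into a new SQL Server table (names here are hypothetical)
SELECT *
INTO dbo.psv_import
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'C:\tmp\usecase0.mdb'; 'Admin'; '',
                MyPsvTable);
GO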

Alternatively, you can use bulk insert to import any flat file.
if (object_id('dbo.usecase1') is not null)
    drop table dbo.usecase1
go

create table dbo.usecase1
(
    Descr nvarchar(2000) null,
    Comm  nvarchar(2000) null
)
go

-- the source file is pipe-delimited, so the field terminator must be '|', not ','
bulk insert dbo.usecase1
from 'C:\tmp\usecase0.psv'
with (
    FIELDTERMINATOR = '|',
    ROWTERMINATOR = '\n'
)
go
BULK INSERT (Transact-SQL)

Related

I can't import a CSV into Microsoft SQL Server Management Studio 2014 due to Pre Execute error 0xc020802e

I'm trying to import a CSV file into SQL Server Management Studio 2014 but keep hitting errors every time I try. Specifically, I get a pre-execute error:
Messages
Error 0xc020802e: Data Flow Task 1: The data type for "Source download_fresh_filename_com_06_Apr_17_4EB41F5D720E569B7AD1D854B1EC3142_csv.Outputs[Flat File Source Output].Columns[Target URL]" is DT_NTEXT, which is not supported with ANSI files. Use DT_TEXT instead and convert the data to DT_NTEXT using the data conversion component. (SQL Server Import and Export Wizard)
Error 0xc0202094: Data Flow Task 1: Unable to retrieve column information from the flat file connection manager. (SQL Server Import and Export Wizard)
Error 0xc004701a: Data Flow Task 1: Source - download_fresh_filename_com_06_Apr_17_4EB41F5D720E569B7AD1D854B1EC3142_csv failed the pre-execute phase and returned error code 0xC0202094. (SQL Server Import and Export Wizard)
Information 0x4004300b: Data Flow Task 1: "Destination - download_fresh_filename_com_06_Apr_17_4EB41F5D720E569B7AD1D854B1EC3142" wrote 0 rows. (SQL Server Import and Export Wizard)
The CSV is UTF-8 encoded, ~114,900 rows by 20 columns. Here's what I've tried so far with no success:
Under Choose a Data Source > Advanced I've set the data type to [DT_TEXT], which didn't work; I then tried [DT_NTEXT], but that still didn't work.
Under Review Data Type Mapping I've set On Error (global) to Ignore; that still didn't work.
Any help would be appreciated.
Thanks.
You can try importing a limited number of rows to narrow down which row has the rogue values.
Also, funny though it may sound, in such cases I have had better success importing the file into Excel first and then on to SQL Server. In other cases, into Excel, then into Access, and from there to SQL Server. Strange world.
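If re-running the wizard over and over gets tedious, the same narrowing-down can be scripted with BULK INSERT's FIRSTROW/LASTROW options. This is only a rough sketch with made-up table and file names (and note that, depending on the SQL Server version, BULK INSERT may not read UTF-8 directly, but the row-range idea still applies):

-- hypothetical staging table: one oversized nvarchar column per CSV column
create table dbo.csv_probe
(
    col01 nvarchar(4000) null,
    col02 nvarchar(4000) null,
    col03 nvarchar(4000) null   -- ...and so on, one per column in the file
);
go

-- load only a slice of the file; adjust FIRSTROW/LASTROW to binary-search for the bad row
bulk insert dbo.csv_probe
from 'C:\tmp\data.csv'
with (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n',
    FIRSTROW        = 2,      -- skip the header row
    LASTROW         = 57450,  -- roughly half of ~114,900 rows; halve again if it fails
    MAXERRORS       = 0       -- stop at the first bad row
);
go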
I noticed a box I was missing before in the Import Wizard: under Choose a Data Source > Flat File Source there is a "text qualifier" setting, which is set to <none> by default; I changed it to " (a double quote).
Also, in the same section under Advanced, I changed the OutputColumnWidth from the default 50 to 1000 for every column.
Worked perfectly.
The error clearly points to the file being treated as ANSI; the solution is to use Unicode. On the flat file data source page there is a Unicode check box. When you tick that box, it works fine.

SQL Bulk Insert CSV

I have a csv (comma-separated) file containing hundreds of thousands of records in the following format:
3212790556,1,0.000000,,0
3212790557,2,0.000000,,0
Now, using the SQL Server Import wizard's flat file method works just dandy. I can edit the SQL so that the table name and column names are something meaningful, and I also change the data types from the default varchar(50) to int or decimal. This all works fine and the SQL import completes successfully.
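For reference, a destination table shaped to match those five sample columns might be declared roughly as below; the column names are invented, and the first column needs bigint (or decimal) because 3212790556 is too large for int.

-- hypothetical table matching the sample rows above
create table dbo.temp1
(
    id     bigint        null,  -- 3212790556 overflows int
    seq    int           null,
    amount decimal(18,6) null,
    extra  varchar(50)   null,  -- the empty fourth field
    flag   int           null
);
go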
However I am unable to do this same task using the Bulk Insert Query which is as follows:
BULK
INSERT temp1
FROM 'c:\filename.csv'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
GO
This query returns the following 3 errors which I have no idea how to resolve:
Msg 4866, Level 16, State 1, Line 1
The bulk load failed. The column is too long in the data file for row 1, column 5. Verify that the field terminator and row terminator are specified correctly.
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
The purpose of my application is that there are multiple csv files in a folder that all need to go into a single table so that I can query for the sum of values. At the moment I was thinking of writing a program in C# that would execute the BULK INSERT in a loop (once per file) and then return my results. I am guessing I don't need to write code and can just write a script that does all of this - can anyone guide me to the right path? :)
Many thanks.
Edit: just added
ERRORFILE = 'C:\error.log'
to the query and I am getting 5221 rows inserted. Sometimes it's 5221, sometimes it's 5222, but it just fails beyond that point. Don't know what the issue is??? The CSV is perfectly fine.
SOB. WTF!!!
I can't believe that replacing \n with "0x0A" in the ROWTERMINATOR worked!!! I mean seriously. I just tried it and it worked. Total WTF moment!!
However, what is a bit interesting is that the SQL Import wizard took only about 10 seconds to import, while the BULK INSERT query took well over a minute. Any guesses??
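For anyone landing here later, here is a consolidated sketch: the single-file statement that ended up working, plus a plain T-SQL loop over every csv in a folder. The folder path is made up, and the loop assumes xp_cmdshell is enabled, which many environments deliberately forbid; a small C# or PowerShell loop is a perfectly reasonable alternative.

-- working single-file version: the file has bare LF line endings, hence the hex terminator
BULK INSERT temp1
FROM 'c:\filename.csv'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '0x0A',
    ERRORFILE       = 'C:\error.log'
);
GO

-- rough sketch: load every csv in a folder into the same table (assumes xp_cmdshell is enabled)
DECLARE @files TABLE (name nvarchar(260));
INSERT INTO @files (name)
EXEC xp_cmdshell 'dir /b "C:\csvfolder\*.csv"';

DECLARE @name nvarchar(260), @sql nvarchar(max);
DECLARE file_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT name FROM @files WHERE name LIKE '%.csv';
OPEN file_cursor;
FETCH NEXT FROM file_cursor INTO @name;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = N'BULK INSERT temp1 FROM ''C:\csvfolder\' + @name
             + N''' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''0x0A'');';
    EXEC sp_executesql @sql;
    FETCH NEXT FROM file_cursor INTO @name;
END
CLOSE file_cursor;
DEALLOCATE file_cursor;
GO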

import database dump to mysql using visual foxpro

I used Leafe's stru2mysql.prg and vfp2mysql_upload.prg to create a .sql dump file from DBFs. I connect to the MySQL database from VFP using ODBC. I know how to upload the SQL dump file, but I need to automate the whole process, i.e. after creating the dump file, my Visual FoxPro program should upload it without a third party (automatically). I thought of using the source command, but that needs to be run at the mysql prompt. The assumption here is that my end users don't know how to import (which most of them don't). Please advise on how I can automate importing the SQL file into the MySQL database. Thank you.
I think what you are looking for are the various SQL* functions in FoxPro. See the VFP help or MSDN on the SQLCONNECT (or SQLSTRINGCONNECT), SQLEXEC, and SQLDISCONNECT functions to get you started. Microsoft provided good examples of each in the documentation.
You may also want to use FILETOSTR to get the output from Leafe's programs into a string for the SQLEXEC function.
Here are the steps I use to take data from a Visual FoxPro database and upload it to a MySQL database. They are all put into a custom method on a form, which is fired by a command button. For example, the method would be 'uploadnewdata' and I pass parameters for whichever data tables I need.
1) Connect to the server - I use MySQL ODBC.
2) Validate the user (this uses SQLEXEC to pull the matching record from the users table):
IF M.WorkingDatabase <> -1
    * pull the users table from the server into a local cursor
    nRetVal = SQLEXEC(m.WorkingDatabase, "SELECT * FROM users", "csrUsersOnServer")
    SELECT csrUsersOnServer
    SELECT userid FROM csrUsersOnServer ;
        WHERE ALLTRIM(UPPER(userid)) = ALLTRIM(UPPER(lcRanchUser)) ;
        AND ALLTRIM(UPPER(lcPassWord)) = ALLTRIM(UPPER(lchPassWord)) ;
        INTO CURSOR ValidUsers
    IF _TALLY >= 1
        * at least one matching user - validation succeeded, carry on
    ELSE
        =MESSAGEBOX("Your Premise ID Does Not Match Any Records On The Server", "System Message")
        RETURN 0
    ENDIF
ELSE
    =MESSAGEBOX("Unable To Connect To Your Database", "System Message")
    RETURN 0
ENDIF
3) Once that is successful, I create my base cursor (this is the one I'm sending from).
4) I then loop through that cursor, creating variables for the values in the fields.
5) Then, using SQLEXEC and INSERT INTO, I upload each record.
6) Once the program is finished processing the cursor, it generates a messagebox with the 'finished' message and control returns to the form.
All the user has to do is select the starting table and enter their login information.

When Trying to Import Old Visual FoxPro Database into SQL receiving "Cannot find column -1"

I have a ton of Visual FoxPro db files that I am trying to import into an empty SQL Server 2008 Express database. When I run through the SQL Server Import and Export Wizard, everything seems to communicate fine. When I get to the mappings section I can click on Preview and see the data in the selected FoxPro table. When I click on Edit Mappings or Next I get:
===================================
Column information for the source and destination data could not be retrieved.
"eqr_sellers" -> [dbo].[eqr_sellers]:
- Cannot find column -1.
(SQL Server Import and Export Wizard)
===================================
Cannot find column -1. (System.Data)
------------------------------
Program Location:
at System.Data.DataColumnCollection.get_Item(Int32 index)
at Microsoft.DataTransformationServices.Controls.ProviderInfos.MetadataLoader.LoadColumnsFromTable(IDbConnection myConnection, String[] strRestrictions)
at Microsoft.SqlServer.Dts.DtsWizard.OLEDBHelpers.LoadColumnsFromTable(MetadataLoader metadataLoader, IDbConnection myConnection, String[] strRestrictions, DataSourceInfo dsi)
at Microsoft.SqlServer.Dts.DtsWizard.TransformInfo.PopulateDbSourceColumnInfoFromDB(IDbConnection mySourceConnection)
at Microsoft.SqlServer.Dts.DtsWizard.TransformInfo.PopulateDbSourceColumnInfo(IDbConnection mySourceConnection, ColumnInfoCollection& sourceColInfos)
Any insight would be appreciated.
What are the data types? Auto-Increment Integer fields are not supported by the ODBC connectors I have used in the past.

Create a new table and import data from csv file into SQL Server 2005

I have several CSV files, about 15k each, that I need to import into SQL Server 2005.
What would be the simplest way to import the csv data into SQL Server? Ideally, the tool or method would create the table as well; since there are about 180 fields in it, that would simplify things.
BULK INSERT is your friend. Something like:
BULK INSERT MyTable
FROM 'c:\data.csv'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
EDIT
Note, though, that BULK INSERT will not create the table for you. You could look at using SQL Server Integration Services, which will infer the schema from the data file. Take a look at http://www.kodyaz.com/articles/import-csv-flat-file-into-sql-server-using-ssis-integration-services.aspx as an example.
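If you would rather stay in plain T-SQL and still have the table created for you, OPENROWSET(BULK ...) combined with SELECT ... INTO is one option, though you still have to describe the columns once in a bcp format file (for ~180 fields that is nearly as much work as writing the CREATE TABLE yourself, and everything arrives as character data). A hedged sketch with made-up names and only three columns shown:

-- c:\data.fmt (bcp non-XML format file; one line per column, 180 lines in the real case)
-- 9.0
-- 3
-- 1  SQLCHAR  0  8000  ","     1  Col001  ""
-- 2  SQLCHAR  0  8000  ","     2  Col002  ""
-- 3  SQLCHAR  0  8000  "\r\n"  3  Col003  ""

-- SELECT ... INTO creates the table using the column names defined in the format file
SELECT *
INTO dbo.MyNewTable
FROM OPENROWSET(BULK 'c:\data.csv', FORMATFILE = 'c:\data.fmt') AS src;
GO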
Use the Import and Export tool in SQL Server. Should be under programs -> SQL Server 2005 -> Import and Export (32) & Import and Export (64)
You can use the MSSQL wizard:
1) Select your database in SQL Server Management Studio, right-click on it and select "Tasks".
2) Under Tasks you'll find "Import Data"; in the new window click Next.
3) As the data source select "Flat File Source" to import the csv, and then follow the wizard.
In my experience csv files sometimes aren't imported correctly, so if you can, convert the file to an Excel file and select "Microsoft Excel" as the data source.