My text file looks like this:
name :Settings, id :1, preference :Mail
name:test ,id:2,preference :testMail
I need to read that text file and insert it as records in a SQL Server database, like this:
name      id   preference
settings  1    Mail
test      2    testMail
How can I parse the text file into the form above? Please tell me.
Use the SQL Server Management Studio 'Import Data' task, which generates an SSIS package.
Use the SqlBulkCopy method of C#/.NET to insert the text file data into a SQL Server table.
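If you prefer to stay in T-SQL, here is a minimal sketch: bulk-load each raw line into a single-column staging table, then split out the three values. The file path, the staging table, and the destination table dbo.Target are assumptions, not from the question.
if object_id('dbo.RawLines') is not null
    drop table dbo.RawLines
go
create table dbo.RawLines (Line nvarchar(4000) null)
go
-- each line of the file becomes one row; there are no tab characters,
-- so the default field terminator leaves the whole line in one column
bulk insert dbo.RawLines
from 'C:\tmp\input.txt'    -- assumed path
with (ROWTERMINATOR = '\n')
go
-- split "name :Settings, id :1, preference :Mail" on the two commas,
-- then keep whatever follows the ':' in each piece
-- dbo.Target (name, id, preference) is assumed to exist already
insert into dbo.Target (name, id, preference)
select
    ltrim(rtrim(substring(s1, charindex(':', s1) + 1, len(s1)))),
    ltrim(rtrim(substring(s2, charindex(':', s2) + 1, len(s2)))),
    ltrim(rtrim(substring(s3, charindex(':', s3) + 1, len(s3))))
from dbo.RawLines
cross apply (select charindex(',', Line) as c1) a
cross apply (select charindex(',', Line, c1 + 1) as c2) b
cross apply (select
    substring(Line, 1, c1 - 1)              as s1,
    substring(Line, c1 + 1, c2 - c1 - 1)    as s2,
    substring(Line, c2 + 1, len(Line) - c2) as s3) c
go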
I am trying to import a psv (pipe delimited csv) into Microsoft SQL Server 2008R2 Express database table.
There are only two fields in the psv; each field has more than 1000 characters.
In the import wizard, I have the following settings:
Double checked in the mapping:
Note I set the option of Ignore on fail/truncate:
and as usual, I get an error:
Error 0xc02020a1: Data Flow Task 1: Data conversion failed. The data
conversion for column "Comm" returned status value 4 and status text
"Text was truncated or one or more characters had no match in the
target code page.". (SQL Server Import and Export Wizard)
UPDATE:
UPDATE: So, following @Marc's suggestion, though very/extremely reluctant, I spent 3 hours or so to finally get SQL 2014 installed on my computer, hoping to import the psv. As expected, the error shows up again:
I really cannot understand why a company like Microsoft did not do thorough QAT on their products?!
After being tortured by Microsoft for the whole morning, I finally got this task done. For future readers, you can follow the steps below to import a CSV/PSV data source into SQL Server:
1) Import the CSV/PSV into an Access database. Note, it must be saved to the mdb type (yes, the type from the 20th century); you might want to read my story here: how to import psv data into Microsoft Access
2) In your SQL Server (mine is 2014), start the Import Wizard and select the data source type (ACCESS) and the file. Why do you have to use the mdb type of Access database? Because there is no option in SQL 2014 for the accdb type of Access database.
3) DO NOT forget to select the right Destination (yes, even though you started the wizard by right-clicking on the destination database and choosing Import); you want to select the last option: SQL Native Client 11.0. That will show the SQL 2014 server and the database.
Now the import can be completed as expected.
Thanks to the great design logic in this SQL (2014? No, essentially no change compared to 2008), such a humble expectation and requirement cost me 4-5 hours to complete!!!
Alternatively, you can use bulk insert to import any flat file.
if (object_id('dbo.usecase1') is not null)
drop table dbo.usecase1
go
create table dbo.usecase1
(
Descr nvarchar(2000) null,
Comm nvarchar(2000) null
)
go
bulk insert dbo.usecase1
from 'C:\tmp\usecase0.psv'
with (
FIELDTERMINATOR = '|',    -- the .psv file is pipe-delimited
ROWTERMINATOR = '\n'
)
go
BULK INSERT (Transact-SQL)
Requirement: run the XMLA script below against a particular node of SQL Server 2012 and store the output data in a file or a SQL table.
<Discover xmlns="urn:schemas-microsoft-com:xml-analysis">
<RequestType>DISCOVER_XML_METADATA</RequestType>
<Restrictions>
<RestrictionList>
<ObjectExpansion>ExpandObject</ObjectExpansion>
</RestrictionList>
</Restrictions>
<Properties>
<PropertyList>
</PropertyList>
</Properties>
</Discover>
I'm not finding any component in SSIS that could run the XMLA script and store the results. Please let me know if it is possible in any other way.
As @sebTHU said, you could use an SSAS DMV:
Select [CUBE_NAME], LAST_DATA_UPDATE from $system.MDSCHEMA_CUBES
However, if you want to list all the databases and their last processed date, install ASSP and then run this:
CALL ASSP.DiscoverXmlMetadataFull("\Databases\Database", "" ,"<ObjectExpansion>ExpandObject</ObjectExpansion>")
You could use a linked server or a .NET source component in SSIS.
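For the linked server route, a minimal sketch might look like the following; the linked server name SSAS_LINK, the SSAS instance MYSSAS, the catalog MyOlapDb, and the destination table are assumptions:
exec master.dbo.sp_addlinkedserver
     @server = N'SSAS_LINK',
     @srvproduct = N'',
     @provider = N'MSOLAP',
     @datasrc = N'MYSSAS',       -- assumed SSAS instance
     @catalog = N'MyOlapDb'      -- assumed SSAS database
go
-- run the DMV query on the SSAS instance and land the result in a table
select *
into dbo.CubeLastProcessed
from openquery(SSAS_LINK,
     'select [CUBE_NAME], LAST_DATA_UPDATE from $system.MDSCHEMA_CUBES')
go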
I used Leafe's stru2mysql.prg and vfp2mysql_upload.prg to create a .sql dump file from DBFs. I connect to the MySQL database from VFP using ODBC. I KNOW how to upload the sql dump file, but I need to automate the whole process, i.e. after creating the dump file, my Visual FoxPro program should upload the dump file without a third party (automatically). I thought of using the source command, but that needs to be run at the mysql prompt. The assumption here is that my end users don't know how to import (which most of them don't). Please advise on how I can automate importing the sql file into the mysql database. Thank you.
I think what you are looking for are the various SQL* functions in Foxpro. See the VFP help or MSDN on SQLCONNECT (or SQLSTRINGCONNECT), SQLEXEC, and SQLDISCONNECT functions to get you started. Microsoft provided good examples on each in the documentation.
You may also want to use FILETOSTR to get the output from Leafe's programs into a string for the SQLEXEC function.
Here are the steps I use to take data from a Visual FoxPro database and upload it to a MySQL database. These are all put into a custom method on a form, which is fired by a command button. For example, the method would be 'uploadnewdata' and I pass parameters for whichever data tables I need.
1) Connect to the Server - I use MySql ODBC
2) Validate the user (this uses a SQLEXEC to pull the correct matching record from the users table):
IF M.WorkingDatabase<>-1
    nRetVal=SQLEXEC(m.WorkingDatabase,"SELECT * FROM users", "csrUsersOnServer")
    SELECT csrUsersOnServer
    SELECT userid FROM csrUsersOnServer;
        WHERE ALLTRIM(UPPER(userid))=ALLTRIM(UPPER(lcRanchUser));
        AND ALLTRIM(UPPER(lcPassWord))=ALLTRIM(UPPER(lchPassWord));
        INTO CURSOR ValidUsers
    IF _TALLY>=1
        * user found - continue with the upload
    ELSE
        =MESSAGEBOX("Your Premise ID Does Not Match Any Records On The Server","System Message")
        RETURN 0
    ENDIF
ELSE
    =MESSAGEBOX("Unable To Connect To Your Database", "System Message")
    RETURN 0
ENDIF
3) Once that is successful, I create my base cursor (this is the one I'm sending from).
4) I then loop through that cursor, creating variables for the values in the fields.
5) Then, using SQLEXEC and INSERT INTO, I insert each record.
6) Once the program is finished processing the cursor, it generates a messagebox with the 'finished' message and control returns to the form.
All the user has to do, is select the starting table and enter their login information
I have several files, about 15k each, of CSV data that I need to import into SQL Server 2005.
What would be the simplest way to import the CSV data into SQL Server? Ideally, the tool or method would create the table as well; since there are about 180 fields in it, this would simplify things.
BULK INSERT is your friend. Something like:
BULK INSERT MyTable
FROM 'c:\data.csv'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
EDIT
Note that BULK INSERT will not create the table for you. If you need that, you could look at using SQL Server Integration Services, which will infer the schema from the data file. Take a look at http://www.kodyaz.com/articles/import-csv-flat-file-into-sql-server-using-ssis-integration-services.aspx as an example.
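Since there are several files, one way to reuse the same statement is a small dynamic SQL loop. This is just a sketch; the file names and the target table MyTable are placeholders:
declare @files table (id int identity(1,1), path nvarchar(260))
insert into @files (path) select 'c:\data1.csv'   -- assumed file names
insert into @files (path) select 'c:\data2.csv'

declare @i int, @n int, @path nvarchar(260), @sql nvarchar(max)
select @i = 1, @n = max(id) from @files

while @i <= @n
begin
    select @path = path from @files where id = @i
    -- build and run the same BULK INSERT for each file
    set @sql = N'BULK INSERT MyTable FROM ''' + @path
             + N''' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'')'
    exec (@sql)
    set @i = @i + 1
end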
Use the Import and Export tool in SQL Server. It should be under Programs -> SQL Server 2005 -> Import and Export (32) & Import and Export (64).
You can use MSSQL wizard:
1) Select your database in SQL Server Management Studio, right click on it and select "Tasks"
2) Under Tasks you'll find "Import Data"; on the new window click Next
3) Select "Flat File Source" as the data source to import the CSV, and then follow the wizard.
In my experience, sometimes CSVs aren't imported correctly, so if you can, convert the file to an Excel file and select "Microsoft Excel" as the data source.
I need to use text files as a data source in SSRS. I tried accessing this with an ‘OLEDB provider for Microsoft directory services’ connection, but I could not. The query is given below.
Also let me know how to query the data
I know this thread is old, but as it came up in my search results this may help other people.
There are two 'sort of' workarounds for this. See the following:
http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=130650
So basically you should use OLEDB as the data source, then in the connection string type:
Provider=Microsoft.Jet.OLEDB.4.0;Data Source=xxxx;Extended Properties="text;HDR=No;FMT=Delimited"
Then make sure your file is saved in .txt format, with comma delimiters. Where I've put xxxx you need to put the FOLDER directory - so C:\Temp - don't go down to the individual file level, just the folder it's in.
In the query you write for the dataset, you specify the file name as though it were a table - essentially your folder is your database, and the files in it are tables.
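For example, assuming a file named sales.txt in that folder (the name is just an illustration), the dataset query could be as simple as the following; with HDR=No the columns come through as F1, F2, and so on:
SELECT F1, F2, F3
FROM [sales.txt]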
Thanks
I have had great success creating linked servers in SQL to link to disparate text files for creating SSRS reports. Below is sample SQL to link to your txt files:
EXEC master.dbo.sp_addlinkedserver @server = N'', @srvproduct=N'', @provider=N'Microsoft.Jet.OLEDB.4.0', @datasrc=N'', @provstr=N'text'
EXEC master.dbo.sp_addlinkedsrvlogin @rmtsrvname=N'YourLinkedServerName',@useself=N'False',@locallogin=NULL,@rmtuser=NULL,@rmtpassword=NULL
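Once the linked server exists, the text file can be queried with a four-part name; here the linked server name and file name are placeholders (note the # in place of the dot in the file name):
SELECT * FROM YourLinkedServerName...[myfile#txt]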
I simply used the BULK INSERT command to load the flat file into a temporary table in the SSRS dataset query, like this:
CREATE TABLE #FlatFile
(
Field1 int,
Field2 varchar(10),
Field3 varchar(15),
Field4 varchar(20),
Field5 varchar(50)
)
BEGIN TRY
BULK INSERT #FlatFile
FROM 'C:\My_Path\My_File.txt'
WITH
(
FIELDTERMINATOR ='\t', -- TAB delimited
ROWTERMINATOR ='\n', -- or '0x0a' (whatever works)
FIRSTROW = 2, -- has 1 header row
ERRORFILE = 'C:\My_Path\My_Error_File.txt',
TABLOCK
);
END TRY
BEGIN CATCH
-- do nothing (prevent the query from aborting on errors...)
END CATCH
SELECT * FROM #FlatFile
I don't think you can. See Data Sources Supported by Reporting Services: in that table, your only chance would be "Generic ODBC data source"; however, a text file is not ODBC compliant AFAIK. No types, no structure, etc.
Why not just display the text files? It seems a bit strange to query text files to bloat them into formatted HTML...
I'm not of the mind that you can, but a workaround for this, if your text files are CSVs or the like, is to create an SSIS package which brings that data into a table in SQL Server, which you can then query like there's no tomorrow. SSIS does Flat File Sources with ease.
You can even automate this by right clicking the database in SSMS, doing Tasks->Import Data. Walk through the wizard, and you can then save off the package at the end.