I am trying to insert a record into a table. I don't need to access a .dat file, because all values are either constants or generated in the control file itself.
An example would be -
OPTIONS(LOAD=1)
LOAD DATA
APPEND
INTO TABLE table_name
(P_ID CONSTANT 202, NAME CONSTANT "ABC", NUM CONSTANT 1, CREATED_BY CONSTANT "DEF",
CREATION_DATE EXPRESSION "current_timestamp(6)")
However, when I execute the sqlldr command for the above .ctl file, I get the following errors -
SQL*Loader-500: Unable to open file (<file_name>.dat)
SQL*Loader-553: file not found
SQL*Loader-509: System error: No such file or directory
I think you need to correct your control file:
LOAD DATA
INFILE *
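INFILE * tells SQL*Loader to read its input from the control file itself instead of looking for an external .dat file. A minimal sketch of the full control file, keeping your columns (since every field is a CONSTANT or EXPRESSION, the single dummy line after BEGINDATA should just drive one insert - I have not tested this exact file):
OPTIONS(LOAD=1)
LOAD DATA
INFILE *
APPEND
INTO TABLE table_name
(P_ID CONSTANT 202, NAME CONSTANT "ABC", NUM CONSTANT 1, CREATED_BY CONSTANT "DEF",
CREATION_DATE EXPRESSION "current_timestamp(6)")
BEGINDATA
x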
My file has data as below
abc|def|I completed by degreeSymbol®|210708
My load/import statement, which is run in a Linux shell script, is below. The LANG environment variable value is en_US.UTF-8
load client from filename of del MODIFIED BY CHARDEL timestampformat="YYYYMMDD" coldel| usedefaults fastparse messages logfilename insert into tablename nonrecoverable;
In the table the data is getting loaded as
abc def (null) 210708
Also, when I run a select query I get the below error in DB2:
com.ibm.db2.jcc.am.SqlException: Caught java.io.CharConversionException
It's my first time asking a question on here, so please bear with me.
I am trying to create a data pipeline to upload a CSV file in an S3 bucket to a MySQL database table (Production1) using the template provided by AWS, but it fails when executing RdsMySqlTableCreateActivity.
The SQL statement that I'm using (all column names match the CSV file) in the myRDSTableInsertSql parameter:
INSERT INTO `Production1` (`API`, `Normalized Month`, `DATE`, `Monthly Liquid`, `Cum Oil`, `BOPD`, `Monthly Gas Mcf/Month`, `Cum Gas`, `MCFPD`) VALUES(?,?,?,?,?,?,?,?,?);
The RdsMySqlTableCreateActivity error:
errorId
ActivityFailed:SQLException
errorMessage
No value specified for parameter 1
errorStackTrace
amazonaws.datapipeline.taskrunner.TaskExecutionException:
private.com.amazonaws.services.datapipeline.redshift.QueryStatementException: Exception No value specified for
parameter 1 while executing INSERT INTO `Production1` (`API`, `Normalized Month`, `DATE`, `Monthly Liquid`, `Cum Oil`, `BOPD`, `Monthly Gas Mcf/Month`, `Cum Gas`, `MCFPD`) VALUES(?,?,?,?,?,?,?,?,?);...
I ran the insert command in MySQL Workbench, replacing the (?,?,?,?,?,?,?,?,?) with (1,2,3,4,5,6,7,8,9), and it worked. The CSV file that I'm using only has 2 rows: the column names, and the values 1-9 for each column respectively. I'm really not sure what it means by "No value specified for parameter 1"; any help/guidance would really be appreciated!
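For reference, the hard-coded version I tested in Workbench (same columns as above, with the placeholders replaced by the literal values from my CSV) was:
INSERT INTO `Production1` (`API`, `Normalized Month`, `DATE`, `Monthly Liquid`, `Cum Oil`, `BOPD`, `Monthly Gas Mcf/Month`, `Cum Gas`, `MCFPD`) VALUES (1,2,3,4,5,6,7,8,9);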
For anyone that runs into the same issue using the "Load S3 data into RDS MySQL table" template:
My values for each parameter were the following:
myRDSTableInsertSql:
INSERT INTO tableName(`col_name1`, `col_name2`, `col_name3`, `col_name4`, `col_name5`, `col_name6`, `col_name7`, `col_name8`, `col_name9`) VALUES(?,?,?,?,?,?,?,?,?);
myRDSTableName: tableName
myRDSCreateTableSql:
CREATE TABLE tableName(`col_name1` type, `col_name2` type, `col_name3` type, `col_name4` type, `col_name5` type, `col_name6` type, `col_name7` type, `col_name8` type, `col_name9` type);
The main issue was with the actual CSV file format: you have to make sure there is no header, and that the types are exactly the same. Also make sure that your separators are "," and that each value is not quoted within your CSV file.
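For example, with the nine-column table above, a CSV that loads cleanly contains only data rows like this (no header line, "," as the separator, nothing quoted; the 1-9 values are just the placeholders from my test file):
1,2,3,4,5,6,7,8,9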
This template is a good starting point, but for more detailed/complex CSV files, making your own data pipeline is a must!
So I want to import a datetime from a txt file:
2015-01-22 09:19:59
into a table using a data flow. I have my Flat File Source and my destination DB set up fine. I changed the data type for the txt input for that column in the advanced settings and the input and output properties to:
database timestamp [DT_DBTIMESTAMP]
This is the same data type as the DB uses for the table, so this should work.
However, when I execute the package I get an error saying the data conversion failed... How do I make this work?
[Import txt data [1743]] Error: Data conversion failed. The data conversion for column "statdate" returned status value 2 and status text "The value could not be converted because of a potential loss of data.".
[Import txt data [1743]] Error: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "output column "statdate" (2098)" failed because error code 0xC0209084 occurred, and the error row disposition on "output column "statdate" (2098)" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure.
[Import txt data [1743]] Error: An error occurred while processing file "C:\Program Files\Microsoft SQL Server\MON_Datamart\Sourcefiles\tbl_L30T1.txt" on data row 14939.
On the row where it gives the error, the datetime is filled with spaces. That is why "allow nulls" is checked on the table, but my SSIS package gives the error anyway... Can I tell the package to allow nulls as well?
I suggest you import the data into a character field and then parse it after entry.
The following function should help you:
SELECT IsDate('2015-01-22 09:19:59')
, IsDate(Current_Timestamp)
, IsDate(' ')
, IsDate('')
The IsDate() function returns a 1 when it thinks the value is a date and a 0 when it is not.
This would allow you to do something like:
SELECT value_as_string
, CASE WHEN IsDate(value_as_string) = 1 THEN
Cast(value_as_string As datetime)
ELSE
NULL
END As value_as_datetime
FROM ...
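A rough sketch of that staging approach, putting the pieces together (the table and column names stg_statdate and statdate_raw are made-up placeholders, not anything from your package):
-- staging table that accepts the raw text, spaces and all
CREATE TABLE dbo.stg_statdate
(
    statdate_raw varchar(30) NULL
);

-- after the flat file has been loaded into the staging table,
-- convert in a second step; blank values fail IsDate() and become NULL
SELECT statdate_raw
     , CASE WHEN IsDate(statdate_raw) = 1
            THEN Cast(statdate_raw As datetime)
            ELSE NULL
       END As statdate
FROM dbo.stg_statdate;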
I solved it myself. Thank you for your suggestion gvee, but the way I did it is way easier.
In the Flat File Source, when making a new connection, in the Advanced tab I fixed all the data types according to the table in the database EXCEPT the column with the timestamp (in my case it was called "statdate")! I changed this data type to a STRING, because otherwise my Flat File Source would give me a conversion error even before any script would have been able to execute, and the only way around this was setting the error output to ignore failure, which I don't want. (You still have to change the data type after you set it to a string in the advanced settings, by right-clicking the Flat File Source -> Show Advanced Editor -> going to the output columns and changing the data type there from Date to string.)
After the timestamp was set to a string, I added a Derived Column with this expression to trim all the spaces and, when nothing is left, give it a NULL value:
TRIM(<YourColumnName>) == "" ? (DT_STR,4,1252)NULL(DT_STR,4,1252) : <YourColumnName>
Next I added a Data Conversion to turn the string back into a timestamp. The Data Conversion is finally connected to the OLE DB Destination.
I hope this helps anyone with the same problem in the future.
End result: Picture of data flow
I have a comma-separated CSV file containing hundreds of thousands of records in the following format:
3212790556,1,0.000000,,0
3212790557,2,0.000000,,0
Now, using the SQL Server Import Flat File method works just dandy. I can edit the SQL so that the table name and column names are something meaningful. Plus, I also edit the data types from the default varchar(50) to int or decimal. This all works fine and the SQL import is able to complete successfully.
However, I am unable to do the same task using a BULK INSERT query, which is as follows:
BULK
INSERT temp1
FROM 'c:\filename.csv'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
GO
This query returns the following 3 errors which I have no idea how to resolve:
Msg 4866, Level 16, State 1, Line 1
The bulk load failed. The column is too long in the data file for row 1, column 5. Verify that the field terminator and row terminator are specified correctly.
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
The purpose of my application is that there are multiple CSV files in a folder that all need to go into a single table so that I can query for the sum of values. At the moment I am thinking of writing a program in C# that will execute the BULK INSERT in a loop (once per file) and then return with my results. I am guessing I don't need to write code and that I can just write a script that does all of this - can anyone guide me to the right path? :)
Many thanks.
Edit: just added
ERRORFILE = 'C:\error.log'
to the query and I am getting 5221 rows inserted. Sometimes it's 5221, sometimes it's 5222, but it just fails beyond this point. Don't know what the issue is... The CSV is perfectly fine.
SOB. WTF!!!
I can't believe that replacing \n with "0x0A" as the ROWTERMINATOR worked!!! I mean seriously, I just tried it and it worked. WTF moment!! Totally.
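For anyone else hitting this, the statement that ended up working for me looks like this (same placeholder table name and path as above; the ERRORFILE line is optional but handy for seeing rejected rows):
BULK INSERT temp1
FROM 'c:\filename.csv'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '0x0A',
    ERRORFILE = 'C:\error.log'
)
GO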
However, what is a bit interesting is that the SQL Import wizard took only about 10 seconds to import, while the BULK INSERT query took well over a minute. Any guesses??
I want to store images and .docx/.doc, .pptx/.ppt, .pdf files using the front end of my software. I don't understand how to implement this and how to insert the BLOB and CLOB files into the table. Please help.
I am using Kubuntu 11.04, MySQL5, Qt 4.7.3.
Two ways:
1 - Use a LOAD_FILE function -
INSERT INTO table1 VALUES(1, LOAD_FILE('data.png'));
2 - Insert file as hex string, e.g. -
INSERT INTO table1 VALUES
(1, x'89504E470D0A1A0A0000000D494844520000001000000010080200000090916836000000017352474200AECE1CE90000000467414D410000B18F0BFC6105000000097048597300000EC300000EC301C76FA8640000001E49444154384F6350DAE843126220493550F1A80662426C349406472801006AC91F1040F796BD0000000049454E44AE426082');
INSERT INTO MY_TABLE(id, blob_col) VALUES(1, LOAD_FILE('/full/path/to/file/myfile.png'));
LOAD_FILE has many conditions attached to it. From the MySQL documentation:
LOAD_FILE(file_name)
Reads the file and returns the file contents as a string. To use this
function, the file must be located on the server host, you must
specify the full path name to the file, and you must have the FILE
privilege. The file must be readable by all and its size less than
max_allowed_packet bytes. If the secure_file_priv system variable is
set to a nonempty directory name, the file to be loaded must be
located in that directory.
If the file does not exist or cannot be read because one of the
preceding conditions is not satisfied, the function returns NULL.
Also, there are bugs with LOAD_FILE in Linux. See http://bugs.mysql.com/bug.php?id=38403 for the bug, and MySQL LOAD_FILE returning NULL for workarounds. On Ubuntu 12.04, MySQL 5.5.32, this works for me (a command sketch follows the steps):
Copy the file to /tmp
Change ownership to the mysql user: chown mysql:mysql /tmp/yourfile
Log in to mysql as the MySQL root user so you are sure you have the FILE privilege
Run your insert statement
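Roughly, from a shell (the file name, table, column, and database names here are placeholders):
# copy the file and hand it to the mysql user (chown may need sudo)
cp /path/to/yourfile /tmp/yourfile
chown mysql:mysql /tmp/yourfile

# run the insert as the MySQL root user, which should have the FILE privilege
mysql -u root -p your_database -e "INSERT INTO your_table (id, blob_col) VALUES (1, LOAD_FILE('/tmp/yourfile'));"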
Or you could simply use MySQL Workbench: select the table's rows, go to the last row, insert a row without the blob, then just right-click the blob cell and select "Load Value From File".
INSERT INTO table1 VALUES(1, LOAD_FILE(data.png));
won't work but
INSERT INTO table1 VALUES(1, LOAD_FILE('data.png'));
should work (assuming data.png exists in the local directory).
For those people who are getting the "Column 'image' cannot be null" error while saving a BLOB through a query:
Open your MySQL Command Line Client, log in as the root user, and type
mysql> SHOW VARIABLES LIKE "secure_file_priv";
This will show you the secure path used by MySQL to access files. Something like:
+------------------+-----------------------+
| Variable_name | Value |
+------------------+-----------------------+
| secure_file_priv | /var/lib/mysql-files/ |
+------------------+-----------------------+
You can either put your files inside this folder, or change the "secure_file_priv" variable value to an empty string so that MySQL can read files from anywhere.
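If you go the empty-string route, note that as far as I know secure_file_priv is read-only at runtime, so it has to be set in the server configuration and the server restarted; roughly, in my.cnf:
[mysqld]
# empty value removes the restriction; a directory path restricts file access to that directory
secure_file_priv = ""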
If you are using MySQL Workbench, just right-click on the field (cell), select the 'Load Value From File' option, browse to the file, click Open, and then click Apply. It will automatically generate a query like this:
UPDATE `dbname`.`tablename` SET `columnname` = ? WHERE (`row` = '1');