SQL loader, insert filename in import - sql-loader

I'm trying to import a data file that looks like this:
FileName: BND20160114.dat
The rows look something like this:
SPSTART;BN;20160114;083422;000026
SPINFO;15165446456;A1;20160114
SPINFO;54645465456;A1;20160114
SPSLUT;BN;20160114;083422;4
I've created a simple control file that imports the rows into a table.
load data characterset WE8MSWIN1252
APPEND INTO TABLE CSSE_IMP.STAGED_PERSON_LKD
fields terminated by ';'
TRAILING NULLCOLS
(
FILE_ROWNUM RECNUM,
FILE_DATA POSITION(1:400)
)
My issue is that I'm going to import more files that are similar to this one, and I need to know which file every imported row came from.
Either I import the filename for every row.
Or I take the letters after SPSTART; in the first row (BN here) and add those to every row.
I'm stuck. Could someone shed some light on this?
Thanks!
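One way to handle the first option (just a sketch, assuming the control file is generated or edited per data file so the filename is known up front, and that the staging table has an extra FILE_NAME column) is a CONSTANT field in the control file:
load data characterset WE8MSWIN1252
APPEND INTO TABLE CSSE_IMP.STAGED_PERSON_LKD
fields terminated by ';'
TRAILING NULLCOLS
(
FILE_ROWNUM RECNUM,
FILE_DATA POSITION(1:400),
-- FILE_NAME is an assumed extra column on the staging table; a wrapper
-- script would substitute the real filename before each sqlldr run
FILE_NAME CONSTANT 'BND20160114.dat'
)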

Related

CSV import (neo4j browser) returning empty nodes only i.e. without properties

I am unable to successfully import a csv file in the neo4j browser, as the nodes are created but they do not show the properties. Does anyone see the problem? I will describe how I proceeded:
This is how the csv file looks
I have tested the csv file with LOAD CSV WITH HEADERS FROM "file:///testCSV3.csv" AS line
WITH line LIMIT 4
RETURN line
and the result is ok (I guess?):
I then tried various things, as e.g. this query:
LOAD CSV WITH HEADERS FROM "file:///testCSV3.csv" AS line
CREATE (:Activity {activityName: line.MyActivity, time: toInteger(line.Timestamp)})
The outcome is nodes without properties:
Any ideas what I am missing? Why are the properties activityName and time not showing up? - Thanks in advance!
(You should have shown your raw CSV file, to make the problem clearer.)
I assume your raw file starts out like this:
ID ;Timestamp;MyActivity
1;1;Run
2;2;Talk
3;3;Eat
LOAD CSV is sensitive to extra spaces, so your ID header should not be followed by a space. Also, the default field terminator is a comma, not a semicolon, so you need to specify the FIELDTERMINATOR option to override the default.
Your results would be more reasonable if you removed the extra space and changed your query to this:
LOAD CSV WITH HEADERS FROM "file:///testCSV3.csv" AS line FIELDTERMINATOR ';'
WITH line LIMIT 4
RETURN line
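Applying the same FIELDTERMINATOR fix to your original CREATE query (your query from above, unchanged apart from the terminator) should then populate the properties:
LOAD CSV WITH HEADERS FROM "file:///testCSV3.csv" AS line FIELDTERMINATOR ';'
CREATE (:Activity {activityName: line.MyActivity, time: toInteger(line.Timestamp)})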

import csv to sqlite3, but change the type of column before import

I am working with one csv, which has many columns. I decided I'll import the data via sqlite3 shell and I found this very useful:
.mode CSV
.import my_table.csv my_sqlite_table
This saves me a lot of work; on the other hand, it gives me no control over column and value characteristics, as all the data is imported as TEXT.
Is there any elegant way within the shell to first specify what type each column should be, or to convert particular blank values to NULL?
Values in CSV files are always strings.
To change that, import into a temporary table, and then modify the values appropriately:
INSERT INTO the_actual_table(a, b, c)
SELECT a, CAST(b AS INTEGER), nullif(c, '') FROM temp_table;
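Put together in the sqlite3 shell, the whole flow could look like this (a sketch only: the three TEXT columns a, b, c and the table names mirror the example above, and if your CSV has a header row it will be imported as data too):
-- staging table: everything lands here as TEXT, exactly as .import reads it
CREATE TABLE temp_table(a TEXT, b TEXT, c TEXT);
.mode csv
.import my_table.csv temp_table
-- copy into the real table, casting and NULL-ing as needed
INSERT INTO the_actual_table(a, b, c)
SELECT a, CAST(b AS INTEGER), nullif(c, '') FROM temp_table;
DROP TABLE temp_table;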

Can I selectively import data from a text file into MySQL?

I have a 13 GB .txt file which I am importing into MySQL; however, I don't want to import all of the data. For example, there are many columns that are either completely empty or contain irrelevant information, and I only want to import ~100 of the 360 columns I've been provided. If I only create headers for the columns I want, can I select the specific corresponding data from the .txt file to be uploaded?
Normally I would use a text editor to remove the superfluous data, but I do not possess a text editor that can handle a file of this size.
You can ignore specific columns in the input file by assigning them to a user-defined variable instead of a database column.
For example, if you had a CSV file with 4 columns and just wanted to import columns 1 and 4 into your table, you could do something like this:
load data infile '/tmp/so42140337.csv'
into table so42140337
fields terminated by ','
lines terminated by '\n'
(c1, @dummy, @dummy, c2);
Given the size of your input file it may be more efficient to import it in chunks rather than importing the entire file in one command. You can use the pt-fifo-split tool for this, following the pattern in this blog post.
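The pattern from that post looks roughly like this (a sketch, not a tested script: the file path and database name yourdb are placeholders, /tmp/pt-fifo-split is the tool's default fifo path, and the LOCAL variant assumes local_infile is enabled):
# in one session: feed the big file through a fifo, one million lines at a time
pt-fifo-split --lines 1000000 /tmp/so42140337.csv
# in a second session: keep loading from the fifo until the whole file is consumed
while [ -e /tmp/pt-fifo-split ]; do
  mysql yourdb -e "load data local infile '/tmp/pt-fifo-split'
    into table so42140337
    fields terminated by ','
    lines terminated by '\n'
    (c1, @dummy, @dummy, c2);"
done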

Importing PIPE delimited format txt into MySQL via PHPMyAdmin

I am importing some thousands of lines of data from a .txt file containing two columns; the format is as follows:
A8041550408#=86^:|blablablablablablablablablablablablablablablablablablablabla1
blablablablablablablablablablablablablablablablablablablabla2
blablablablablablablablablablablablablablablablablablablabla3
A8041550408#=86^:|blablablablablablablablablablablablablablablablablablablabla1
blablablablablablablablablablablablablablablablablablablabla2
A8041550408#=86^:|blablablablablablablablablablablablablablablablablablablabla1
blablablablablablablablablablablablablablablablablablablabla2
blablablablablablablablablablablablablablablablablablablabla3
blablablablablablablablablablablablablablablablablablablabla4
etc....
What I have done so far is create a table with the two fields, but when I try to import the .txt file as a CSV, setting "Columns separated with" to |, I get an error:
"Invalid column count in CSV input on line 2."
Which is quite obvious since the second line of the .txt file is empty.
Moreover, I have tried importing the file as a CSV using LOAD DATA, and that didn't work either; it just filled the table with random words and phrases from the .txt file.
So my question is: how can I import the data from this file?
You have to fix your file; in its current state you cannot expect the import module to be able to understand it. First step would be to remove the empty lines: How to remove blank lines from a Unix file
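Assuming the offending lines are completely empty, a one-liner is enough to produce a cleaned copy to import instead (original.txt and cleaned.txt are placeholder names):
# drop every empty line; the sed equivalent would be: sed '/^$/d' original.txt > cleaned.txt
grep -v '^$' original.txt > cleaned.txt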

Importing an excel .csv file and adding it to a column in phpMyAdmin

I've read through some other posts and nothing quite answers my question specifically.
I have an existing database in phpMyAdmin - a set of pin codes we use to collect contest entries.
The DB has about 10,000 pin codes in it.
I need to add 250 "New" codes to it. I have an excel file that is stripped down to a single column .csv, no header - just codes.
What I need to do is import this into the table named "pin2" and add the codes to the column called "pin".
The other columns are where entrants would add names and phone numbers, so they are all NULL.
I've uploaded a screen grab of the structure.
DB Structure http://www.redpointdesign.ca/sql.png
any help would be appreciated!
You need to use a LOAD DATA query similar to this:
LOAD DATA INFILE 'pincodes.csv'
INTO TABLE pin2 (pin)
If the pin codes in the csv file are enclosed in quotes you may also need to include an ENCLOSED BY clause.
LOAD DATA INFILE 'pincodes.csv'
INTO TABLE pin2
FIELDS ENCLOSED BY '"'
(pin)
If you want to do it using a CSV file,
then you need to follow these steps.
Manually define the auto-incremented value in the first column.
In every other column you have to explicitly put NULL,
otherwise you will get "Invalid column count in CSV input on line 1.",
because a column with no value is not considered by phpMyAdmin.
Then click on Import in phpMyAdmin and you are done.
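For example, with a hypothetical column order of id, pin, name, phone (check your actual structure in the screenshot above), each line of the CSV would need to look something like this, where 10001 is the next auto-increment value and ABC123 a made-up pin code:
10001,ABC123,NULL,NULL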