If query result is big it's creating multiple lines - MySQL

I am exporting an Oracle table that has 165 rows into a delimited text file,
and then importing it into a MySQL table with the LOAD DATA INFILE command.
The problem is that a few of the rows are very long, so they end up split across separate lines in the text file,
and that breaks the import.
My text file is pipe (|) separated and enclosed with double quotes (").
It is on a Windows server.
Any help will be much appreciated.

There is nothing wrong with the Oracle export;
the problem is with Notepad.
I opened the file in a different editor and every row is on a single line.
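For reference, a LOAD DATA INFILE statement matching that format (pipe-separated, enclosed with double quotes, Windows line endings) would look roughly like this; the file and table names here are only placeholders:
LOAD DATA INFILE 'oracle_export.txt'
INTO TABLE imported_table
FIELDS TERMINATED BY '|' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n';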

Related

SSIS - Exporting data with commas to a csv file

I am trying to export a list of fields to a csv file from a database.
It keeps putting all the data into one column and doesn't separate it. When checking the preview it seems to be okay, but on export it's not working. I am currently trying the following settings. Any help would be appreciated.
SSIS settings
Excel file output issue
Actually it seems to work; Excel is just too dumb to recognize it.
Mark the whole table, then go to Data -> Text to Columns
and configure the wizard (Delimited, semicolon as separator):
Now you have separated the rows and cells:

MySQL LOAD DATA INFILE Query running successfully but no output

I am running an SQL statement to read in a CSV file (in this case only one column, with a heading) and import it into my database, where further manipulation of the data will occur through subsequent SQL statements.
However, I seem to be unable to load the CSV file into my DB, both directly in MySQL Workbench and from my PHP website locally, as well as on another Mac in my network through the PHP website.
The interesting thing is that the query appears to run successfully, as I get no errors on any of the platforms or computers, but no rows are affected.
I have done a lot of digging trying to solve the problem. Here is my current SQL code, and I will then talk through what I have tried.
LOAD DATA INFILE '/Users/Josh/Desktop/testcsv.csv'
INTO TABLE joinTest
FIELDS
TERMINATED BY ','
ENCLOSED BY '"'
LINES
TERMINATED BY '\r\n'
IGNORE 1 LINES
(interestName);
So this is me trying it in MySQL Workbench. In PHP I have an uploader and a variable which stores the location of the tmp file. This works, as I have echoed it out and it all looks fine.
I've tried running it as
LOAD DATA INFILE
But it still doesn't affect any rows (it runs successfully). I've also changed the LINES TERMINATED BY to just \n, but it still will not affect any rows.
I can't understand why it is not affecting any rows, as my CSV file is readable by all and should be in the correct format (created in Excel and saved in CSV format).
Does anyone know what the potential problem could be?
If any more info is required I will respond with it ASAP. Thanks.
Right, so I discovered that Mac uses different line endings to Unix and Windows. I opened the CSV in Sublime Text 3 and discovered there was an option to change the line endings in the View options.
I set this to Unix, saved the file, and the terminator of \n worked. Unfortunately Sublime Text doesn't show line endings as visible characters, so this was purely by chance.
I hope this helps anyone else who runs into this issue: make sure the line endings of the CSV match the line endings you are specifying in your LOAD DATA query.
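In other words, once the file has Unix line endings, the terminator in the query has to match them; a sketch of the same statement as above with \n:
LOAD DATA INFILE '/Users/Josh/Desktop/testcsv.csv'
INTO TABLE joinTest
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(interestName);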

MySQL Import text file with one word per line

I have a text file with one word per line. I haven't been able to get it to import into a MySQL table with a field called name.
What options do I need to use for importing?
My text file is like:
random
name
one
two
three
You should use LOAD DATA INFILE
LOAD DATA INFILE 'data.txt' INTO TABLE my_table(name)
EDIT:
It seems like you cannot specify both a column list ((name) in this case) and the FIELDS/LINES clauses at the same time - really weird.
The most straightforward way would be to use mysqlimport
with the parameter --lines-terminated-by="\n"
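For what it's worth, LOAD DATA INFILE does accept both as long as the column list comes after the FIELDS/LINES clauses (as in the Workbench example earlier on this page); a minimal sketch with the same table and column names:
LOAD DATA INFILE 'data.txt'
INTO TABLE my_table
LINES TERMINATED BY '\n'
(name);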

loading very large text file into MySQL table

I have a very large text file of 6.2 GB. I have loaded this into a table in MySQL using
load data local infile 'sample.txt'
into table table1 fields terminated by ','
(It took about 20 mins.) But the load was not successful, and the data is not readable and not meaningful. I had a similar but smaller file in CSV format and could successfully load that into a SQL table. I am unable to open the large text file. How should I convert the text to CSV without opening it? Or can anyone share an idea for successfully loading the text file into a MySQL table?
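One thing worth double-checking, as in the line-endings answer above, is whether the statement spells out terminators that actually match the file; a sketch, assuming the file really is comma-separated with Unix line endings:
load data local infile 'sample.txt'
into table table1
fields terminated by ','
lines terminated by '\n';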

job results to a txt file in sql server

I have set up a job within SSMS to run every day, where it executes queries against two tables and stores the results in .csv text files in a folder. It works fine, but I would like the files to be comma-delimited text files; it is storing them with spaces between the columns, and this creates an unnecessary increase in size.
I went to Tools/Options etc. and set the SQL query text file format to comma delimited, and it works fine when I run a manual query and choose to save the results, but not within the actual job. Any suggestions?
In fact, if I could get the results from both queries stored into one text file only, that would be even better.
I think you may want to use one of the following:
SSIS - send the output to a file in CSV format
BCP
SQLCMD