I'm actually having a problem generating a CSV file from a SELECT statement that outputs a lot of rows (close to 10M). I need to export the result of this statement to a CSV file, and since I only have a Citrix VM, I get disconnected every 2 hours, which doesn't give me enough time to execute my query.
My thought was to start the query and use UTL_FILE to generate a CSV file on our server. But now I am facing security issues: I only have read access and cannot create procedures (see this Ask Tom article: https://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:9537857800346182134).
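For reference, the kind of PL/SQL I had in mind is something like this anonymous block (just a sketch: 'EXPORT_DIR' is a directory object I would need access to, and the query and columns are placeholders for my real ones):

DECLARE
  f UTL_FILE.FILE_TYPE;
BEGIN
  -- open the target file on the database server for writing
  f := UTL_FILE.FOPEN('EXPORT_DIR', 'result.csv', 'w');
  FOR r IN (SELECT col1, col2 FROM big_table) LOOP
    UTL_FILE.PUT_LINE(f, r.col1 || ',' || r.col2);
  END LOOP;
  UTL_FILE.FCLOSE(f);
END;
/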
I want to know if you guys know how I could generate a CSV file on my server by just executing a query and letting it run.
Thanks a lot!
I have a .sql file from Oracle which contains create table/index statements and a lot of insert statements (around 1M inserts).
I can manually modify the create table/index part (it's not too much), but the insert statements use Oracle functions like TO_DATE.
I know MySQL has a similar function, STR_TO_DATE, but its format parameters are different.
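For example (made-up table and column, just to show the difference in the format masks), an Oracle insert like

INSERT INTO orders (id, created) VALUES (1, TO_DATE('2015-01-31', 'YYYY-MM-DD'));

would have to become this in MySQL:

INSERT INTO orders (id, created) VALUES (1, STR_TO_DATE('2015-01-31', '%Y-%m-%d'));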
I can connect to MySQL, but the .sql file is the only thing I got from Oracle.
Is there any way I can import this Oracle .sql file into MySQL?
Thanks.
Although the above job can be done by manually editing the script appropriately, there are products available which can be of use. Refer to the link for more information on one such product.
P.S. I am not affiliated in any way with the product.
Since you mention an insert script, I think you will basically be inserting data. For that you can use any ETL tool, for example an open-source one like Pentaho Data Integration; it is pretty simple to use. Just search YouTube for "table to table transformation between different database connections" to learn how. You need to be able to connect to both the MySQL and the Oracle database, otherwise this won't help. You do have to create all the table structures manually in the target database; the data itself can then just be loaded with the ETL tool. That way there is no need to edit every single insert line, which would be very painful if there are more than a hundred of them.
I know how to import a text file into a MySQL database by using the command
LOAD DATA LOCAL INFILE '/home/admin/Desktop/data.txt' INTO TABLE data
The above command will write the records of the file "data.txt" into the MySQL database table. My question is that I want to erase the records from the .txt file once they are stored in the database.
For example: if there are 10 records and at the current point in time 4 of them have been written into the database table, I want those 4 records to be erased from data.txt at the same time. (In a way, the text file acts as a "queue".) How can I accomplish this? Can Java code be written? Or should a scripting language be used?
Automating this is not too difficult, but it is also not trivial. You'll need something (a program, a script, ...) that can:
Read the records from the original file,
Check if they were inserted and, if they were not, copy them to another file,
Rename or delete the original file, and rename the new file to replace the original one.
There might be better ways of achieving what you want to do, but that's not something I can comment on without knowing your goal.
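For example, here is a rough Java sketch of those three steps, since you mentioned Java. It assumes (my guesses, not from your question) that the table is called data, that the first tab-separated field of each line is a unique key column id, and that the connection details are placeholders:

import java.io.BufferedWriter;
import java.nio.file.*;
import java.sql.*;

public class QueueFileCleaner {
    public static void main(String[] args) throws Exception {
        Path original = Paths.get("/home/admin/Desktop/data.txt");
        Path pending = Paths.get("/home/admin/Desktop/data.txt.pending");

        try (Connection conn = DriverManager.getConnection(
                 "jdbc:mysql://localhost/mydb", "user", "password");
             PreparedStatement check = conn.prepareStatement(
                 "SELECT COUNT(*) FROM data WHERE id = ?");
             BufferedWriter out = Files.newBufferedWriter(pending)) {

            for (String line : Files.readAllLines(original)) {
                // assumes the first tab-separated field is the unique key
                String id = line.split("\t")[0];
                check.setString(1, id);
                try (ResultSet rs = check.executeQuery()) {
                    rs.next();
                    if (rs.getInt(1) == 0) {
                        // not in the table yet, so keep it in the "queue"
                        out.write(line);
                        out.newLine();
                    }
                }
            }
        }
        // replace the original file with the one holding only the unprocessed rows
        Files.move(pending, original, StandardCopyOption.REPLACE_EXISTING);
    }
}

Be careful if LOAD DATA can run at the same time as this cleanup; you may need to pause the loads (or lock the file) while rewriting it.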
I'm trying to extract data from a cube using MDX. When I run the query in SSMS I get 500K rows (the same result I get when I use Excel to connect to the cube); however, when I put the query into an SSIS package and execute it, I get only 100K rows. The package executes just fine (completes correctly) and doesn't show any errors or warnings, so I'm not sure why I'm not getting the same number of rows :(.
Thanks for the help! :)
Well, I still don't know what was causing this issue, but I was able to pull all of the data by using a linked server query instead of a direct MDX query against the cube.
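In case it helps someone: by "linked server query" I mean a linked server pointing at the SSAS instance, queried through OPENQUERY, roughly like this (SSAS_LINKED, the measure and the cube name are placeholders for your own setup):

SELECT *
FROM OPENQUERY(SSAS_LINKED,
    'SELECT [Measures].[Sales Amount] ON COLUMNS,
            [Product].[Category].MEMBERS ON ROWS
     FROM [MyCube]');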
If you dump it to a RAW file, SSIS dumps the whole extract. Then you can import from the RAW file. It's not ideal, but it solves the issue.
I'm trying to back up some of my data stored in a big table with a
SELECT ... INTO OUTFILE
statement.
The output file is on a network hard drive, so if the network connection breaks during the dump (it takes about a minute) I'm left with a partial file on the network drive, and I'd like to mark such a file as "wrong".
Is there a SQL command that I can issue inside my MySQL stored procedure that lets me rename such a file?
Thank you very much
Best
cghersi
You can't rename a file from within MySQL, but you could alternate between two files for the dump and rotate them only when the operation was successful.
Example:
You dump to 'a.csv'; if the operation succeeds you use 'b.csv' for the next dump, otherwise you use 'a.csv' again. And so on...
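A minimal sketch of that idea (my own illustration, not tested against your setup): it keeps the name of the next file in a one-row helper table, and builds the statement dynamically because INTO OUTFILE does not accept a variable for the file name. big_table and the path are placeholders.

CREATE TABLE IF NOT EXISTS dump_state (next_file VARCHAR(10) NOT NULL);
INSERT INTO dump_state VALUES ('a.csv');

DELIMITER //
CREATE PROCEDURE rotating_dump()
BEGIN
  DECLARE target VARCHAR(10);
  SELECT next_file INTO target FROM dump_state LIMIT 1;

  -- INTO OUTFILE needs a literal file name, so prepare the statement dynamically
  SET @sql = CONCAT(
    "SELECT * FROM big_table INTO OUTFILE '/mnt/share/", target, "'",
    " FIELDS TERMINATED BY ',' LINES TERMINATED BY '\\n'");
  PREPARE stmt FROM @sql;
  EXECUTE stmt;
  DEALLOCATE PREPARE stmt;

  -- only rotate after the dump finished without error; if the connection
  -- broke, this line is never reached and the same file is reused next time
  UPDATE dump_state SET next_file = IF(target = 'a.csv', 'b.csv', 'a.csv');
END//
DELIMITER ;

One caveat: INTO OUTFILE refuses to overwrite an existing file, so whatever picks up the finished dump has to delete it before that name comes around again.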
I'm running SQL Server 2008 on Windows Server 2008, and I have a stored proc that outputs some information about a product entity, with a product ID as the input.
It outputs a record representing the product information, followed by a second table full of orders.
I'm wondering if there is any way to call the stored proc and write the orders data to a CSV file from the command shell?
The other alternative is to try this using a custom-written application and a data reader, but I don't really want to go down this route.
You should be able to use the SQLCMD command-line utility to do this. It's a complex tool; the BOL entry can be found here, and if necessary a bit of googling should turn up the odd tutorial that goes over the basics.
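For example, something along these lines (server, database, proc name and parameter are placeholders; -s sets the column separator, -W trims trailing spaces, -o writes the output file, and -h -1 would suppress the header rows if you don't want them):

sqlcmd -S myserver -d MyDatabase -E -Q "SET NOCOUNT ON; EXEC dbo.GetProductOrders @ProductId = 42" -s "," -W -o orders.csv

Note that since the proc returns two result sets, both will end up in the file one after the other, so you may want a variant of the proc that returns only the orders.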