I have a table called tableB in Netezza that has 28 million records, and I want to export it to a text file so that I can import the text file into a MySQL server. When I run the command below, the SQL client hangs. I am using SquirrelSQL.
CREATE EXTERNAL TABLE '/Users/blah/A.txt'
USING(DELIM '\t' REMOTESOURCE 'JDBC')
AS
SELECT * FROM tableB;
I am not sure if this is supposed to be the case.
Well, I'm not sure if you are running Squirrel on a Windows machine, but if you are, you need to use backslashes in the path, and you might need to escape them as well. Below is an example I use in Squirrel running on a Windows 7 laptop:
CREATE EXTERNAL TABLE 'C:\\Users\\ValuedCustomer\\customer dim dump.csv'
USING ( DELIMITER ',' Y2BASE 2000 ENCODING 'internal' REMOTESOURCE 'JDBC' ESCAPECHAR '\' ) AS
SELECT CUSTOMER_FIRST_NAME, CUSTOMER_LASTNAME, CUSTOMER_ADDRESS, CUSTOMER_CITY, CUSTOMER_STATE
FROM DIM_CUSTOMER
You can find a little more info here on my blog
http://nztips.com/2012/07/returning-and-saving-large-result-sets-locally/
I have to import some data from a CSV file into a table of a database on my Aruba server.
I use the following query:
LOAD DATA LOCAL INFILE 'test.csv' INTO TABLE dailycoppergg
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\r\n'
(
ddmmyy,
lmedollton,
changedolleuro,
euroton,
lmesterton,
delnotiz,
girm,
sgm
)
I tested this query on another Aruba server and it worked correctly, but here I get the following error:
#1148 - The used command is not allowed with this version of MariaDB (message translated from Italian)
How can I modify my query to import csv file data into dailycoppergg table? Can you help me, please? Thanks!
The query is fine, but the MySQL client (mysql) disables LOCAL INFILE by default. You need to run it as mysql --local-infile ..., and then the same query should work.
The error message is a legacy one, and it's confusing.
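If you also control the server, here is a minimal sketch of checking and enabling the server-side setting (assumes you have a privilege level that allows SET GLOBAL):

-- Check whether the server allows LOAD DATA LOCAL INFILE
SHOW GLOBAL VARIABLES LIKE 'local_infile';

-- Enable it on the running server if it is OFF
SET GLOBAL local_infile = 1;

Then reconnect with the client-side option enabled as well, e.g. mysql --local-infile=1 -u youruser -p yourdb.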
Since you're using phpMyAdmin, I highly recommend you just use the Import tab instead of manually entering the import query in the SQL tab. phpMyAdmin can easily import CSV files and I don't see any advantage to entering the query manually.
In MySQL Workbench, add the line below in the Advanced tab of the connection settings, then test the connection and close.
OPT_LOCAL_INFILE=1
I've been tasked with converting FoxPro databases to MySQL, and I need to know how to export the structure/indexes of a FoxPro database to Excel. Is it possible to export that kind of information from FoxPro?
I know there are tools out there that do this kind of conversion for you, but that has been rejected due to our budget. We were hoping to create a specialized conversion script that will automatically convert all our containers and DBFs.
Thank you in advance.
If you look at the download area at Leafe.com there are various free tools to migrate data from VFP into MySQL.
There is a data upload program, and a couple of tools to create MySQL CREATE TABLE scripts from the currently selected alias in Visual FoxPro.
Alternatively, if you want to pursue the Excel route manually, then ...
If you have a table MYTABLE.DBF with the following structure:
Structure for table:     C:\TEMP\MYTABLE.DBF
Number of data records:  0
Date of last update:     01/05/2014
Code Page:               1252

Field  Field Name  Type               Width  Dec  Index  Collate  Nulls  Next  Step
    1  FIRSTNAME   Character             20                        No
    2  LASTNAME    Character             20                        No
    3  AGE         Numeric                3                        No
    4  ID          Integer (AutoInc)      4       Asc    Machine   No        1     1
** Total **                              48
Then you can dump the structure to another DBF via the VFP Command Window like this:
cd \temp
use mytable
copy structure extended to mytablestruct.dbf
You can then open the table that contains structure info and dump it to XLS format:
use mytablestruct
copy to struct.xls type xl5
With regard to indexes, you would have to code a small routine like this:
* Collect the index tag info for the current table into a cursor
Create Cursor indexinfo (idxname C(254), idxtype C(254), idxkey C(254), ;
    idxfilter C(254), idxorder C(254), idxcoll C(254))
Use mytable In 0
Select mytable
lnTags = ATagInfo(laTemp)  && fills laTemp with one row per index tag
For i = 1 To lnTags
    Insert Into indexinfo Values (laTemp[i, 1], laTemp[i, 2], laTemp[i, 3], ;
        laTemp[i, 4], laTemp[i, 5], laTemp[i, 6])
EndFor
Select indexinfo
Copy To indexinfo.xls Type XL5
You can do this from FoxPro, and there is no need to export the info to Excel; FoxPro is capable of recreating your databases/tables/indexes in MySQL and uploading the records.
I have developed a tool that can upload any FoxPro table to MySQL, just using FoxPro commands.
Check gendbc.prg in the tools folder and adapt it to your needs.
You will have to do some field type conversions for MySQL. Also if you are going to upload your data, there are some caveats with dates/datetimes:
Replace empty VFP date fields with '0000-00-00' in MySQL, and '0000-00-00 00:00:00' for empty datetimes.
Some useful functions are AFIELDS() and ATAGINFO().
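To make the field type conversion concrete, here is a rough sketch (my own illustration, not part of the original answer) of how the MYTABLE.DBF structure shown earlier might map to MySQL; the exact type choices are assumptions:

-- Hypothetical MySQL equivalent of MYTABLE.DBF above; the mapping assumed here
-- is Character -> VARCHAR, Numeric(3) -> DECIMAL(3,0),
-- Integer (AutoInc) -> INT AUTO_INCREMENT
CREATE TABLE mytable (
    firstname VARCHAR(20),
    lastname  VARCHAR(20),
    age       DECIMAL(3,0),
    id        INT NOT NULL AUTO_INCREMENT,
    PRIMARY KEY (id)
);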
All good points... Additionally, with VFP you can use the menu at "Tools" --> "Wizards" --> "Upsizing". You will need to make a connection to the database, and it will walk you through most of the stuff.
You can upsize an entire database or just individual tables during the wizard process.
For my database class the teacher assigned us to use Oracle SQL to design an application. Because I have more experience with MySQL, he said I could use it instead.
I want to make my assignment look as similar to his example as possible. What his example consists of is one file, run.sql, that looks like this:
#start //this runs start.sql which creates the tables
DESC table_name; //do this for all tables
#insert //this runs insert.sql that creates dummy data
SELECT * FROM table_name; //do this for all tables
#query //this runs query.sql that runs our sample queries
#drop //this kills all the data
Is there a way to do something similar?
Namely, a way to write a query that calls external queries and outputs all data to an output.txt file?
Use 'source' to input the *.sql files.
Use 'create procedure' to generate the 'drop' step.
Use "into outfile '/file/path'" on your select to write the results out. Note that INTO OUTFILE cannot append to or overwrite an existing file; the target file must not already exist.
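As an illustration of the 'create procedure' idea for the drop step, here is a minimal sketch (drop_all and table_name are placeholders, not names from the assignment):

-- Sketch of a cleanup procedure; 'table_name' stands in for your tables
DROP PROCEDURE IF EXISTS drop_all;
DELIMITER //
CREATE PROCEDURE drop_all()
BEGIN
    DROP TABLE IF EXISTS table_name;
END //
DELIMITER ;

CALL drop_all();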
The source command for the mysql command-line client could do the job here:
source start.sql;
DESC table_name;
You can get more commands with help.
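Putting it together, a run.sql for the mysql client could look something like this (a sketch: start.sql, insert.sql, and query.sql follow the professor's example, drop.sql is an assumed name, and tee/notee are the client commands that copy all output to a file):

-- run.sql: from the mysql client, execute with: source run.sql
tee output.txt
-- create the tables
source start.sql
DESC table_name;
-- load the dummy data
source insert.sql
SELECT * FROM table_name;
-- run the sample queries
source query.sql
-- kill all the data
source drop.sql
notee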
In PowerShell, how do I execute my MySQL script so that the results are piped into a CSV file? The results of this script are just a small set of columns that I would like copied into a CSV file.
I can have it go directly to the shell by doing:
mysql> source myscript.sql
And I have tried various little things like:
mysql> source myscript.sql > mysql.out
mysql> source myscript.sql > mysql.csv
in infinite variation, and I just get errors. My db connection is alright because I can do basic table queries from the command line, etc. I haven't been able to find a solution on the web so far either...
Any help would be really appreciated!
You seem not to be running PowerShell but the mysql command-line tool (perhaps you started it in a PowerShell console, though).
Note also that the mysql command-line tool cannot export directly to CSV.
However, to redirect the output to a file just run
mysql mydb < myscript.sql >mysql.out
or e.g.
echo select * from mytable | mysql mydb >mysql.out
(and whatever arguments to mysql you need, like username, hostname)
Are you looking for SELECT ... INTO OUTFILE? dev.mysql.com/doc/refman/5.1/en/select.html – Pekka
Yep. SELECT ... INTO OUTFILE worked! But to make sure you get column names you also need to do something like:

SELECT 'a', 'b', 'c'  -- header row: the column names as string literals
UNION ALL
SELECT a, b, c
FROM actual
INTO OUTFILE '/path/to/mysql.csv'  -- path is a placeholder; the file must not exist yet
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n';
I need to use text files as a data source in SSRS. I tried accessing this with an 'OLEDB provider for Microsoft directory services' connection, but I could not. The query is given below.
Also, let me know how to query the data.
I know this thread is old, but as it came up in my search results this may help other people.
There are two 'sort of' workarounds for this. See the following:
http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=130650
So basically you should use OLEDB as the data source, then in the connection string type:
Provider=Microsoft.Jet.OLEDB.4.0;Data Source=xxxx;Extended Properties="text;HDR=No;FMT=Delimited"
Then make sure your file is saved in .txt format, with comma delimiters. Where I've put xxxx you need to put the FOLDER directory - so C:\Temp - don't go down to the individual file level, just the folder it's in.
In the query you write for the dataset, you specify the file name as though it were a table - essentially your folder is your database, and the files in it are tables.
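The dataset query might then look something like this (a sketch: MyFile.txt is a placeholder, the dot in the file name is replaced with # per the Jet text driver's naming convention, and with HDR=No the columns come through as F1, F2, ...):

SELECT F1, F2, F3
FROM [MyFile#txt];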
Thanks
I have had great success creating linked servers in SQL to link to disparate text files for creating SSRS reports. Below is sample SQL to link to your txt files:
EXEC master.dbo.sp_addlinkedserver @server = N'', @srvproduct = N'', @provider = N'Microsoft.Jet.OLEDB.4.0', @datasrc = N'', @provstr = N'text'
EXEC master.dbo.sp_addlinkedsrvlogin @rmtsrvname = N'YourLinkedServerName', @useself = N'False', @locallogin = NULL, @rmtuser = NULL, @rmtpassword = NULL
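Once the linked server is created, you can query a text file in the linked folder using a four-part name (a sketch: YourLinkedServerName and MyFile.txt are placeholders, and the Jet convention again replaces the dot in the file name with #):

SELECT *
FROM YourLinkedServerName...[MyFile#txt];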
I simply used the BULK INSERT command to load the flat file into a temporary table for my SSRS dataset, like this:
CREATE TABLE #FlatFile
(
Field1 int,
Field2 varchar(10),
Field3 varchar(15),
Field4 varchar(20),
Field5 varchar(50)
)
BEGIN TRY
BULK INSERT #FlatFile
FROM 'C:\My_Path\My_File.txt'
WITH
(
FIELDTERMINATOR ='\t', -- TAB delimited
ROWTERMINATOR ='\n', -- or '0x0a' (whatever works)
FIRSTROW = 2, -- has 1 header row
ERRORFILE = 'C:\My_Path\My_Error_File.txt',
TABLOCK
);
END TRY
BEGIN CATCH
-- do nothing (prevent the query from aborting on errors...)
END CATCH
SELECT * FROM #FlatFile
I don't think you can. See Data Sources Supported by Reporting Services: in that table, your only chance would be a "Generic ODBC data source"; however, a text file is not ODBC compliant AFAIK. No types, no structure, etc.
Why not just display the text files? It seems a bit strange to query text files to bloat them into formatted HTML...
I'm not of the mind that you can, but a workaround for this, if your text files are CSVs or the like, is to create an SSIS package which brings that data into a table in SQL Server, which you can then query like there's no tomorrow. SSIS does Flat File Sources with ease.
You can even automate this by right-clicking the database in SSMS and choosing Tasks -> Import Data. Walk through the wizard, and you can then save off the package at the end.