Extracting Structure/Indexes from a FoxPro Database to Excel

I've been tasked with converting FoxPro databases to MySQL, and I need to know how to export the structure/indexes of a FoxPro database to Excel. Is it possible to export that type of information from FoxPro?
I know there are tools out there that do this kind of conversion for you, but that option has been rejected due to our budget. We were hoping to create a specialized conversion script that will automatically convert all our containers and DBFs.
Thank you in advance.

If you look at the download area at Leafe.com, there are various free tools to migrate data from VFP into MySQL.
There is a data upload program, and a couple of tools to create MySQL CREATE TABLE scripts from the currently selected alias in Visual FoxPro.
Alternatively if you want to pursue the Excel route manually then ...
If you have a table MYTABLE.DBF with the following structure:
Structure for table: C:\TEMP\MYTABLE.DBF
Number of data records: 0
Date of last update: 01/05/2014
Code Page: 1252
Field  Field Name  Type               Width  Dec  Index  Collate  Nulls  Next  Step
    1  FIRSTNAME   Character             20              No
    2  LASTNAME    Character             20              No
    3  AGE         Numeric                3              No
    4  ID          Integer (AutoInc)      4       Asc    Machine  No        1     1
** Total **                              48
Then you can dump the structure to another DBF via the VFP Command Window like this:
cd \temp
use mytable
copy structure extended to mytablestruct.dbf
You can then open the table that contains structure info and dump it to XLS format:
use mytablestruct
copy to struct.xls type xl5
Opened in Excel, that gives one row per field of the original table, with columns for each field's name, type, width, decimals, and null setting.
With regard to indexes you would have to code a small routine like this:
Create Cursor indexinfo (idxname C(254), idxtype C(254), idxkey C(254), ;
    idxfilter C(254), idxorder C(254), idxcoll C(254))
Use mytable In 0
Select mytable
lnTags = ATagInfo(laTemp)
For i = 1 To lnTags
    Insert Into indexinfo Values (laTemp[i, 1], laTemp[i, 2], laTemp[i, 3], ;
        laTemp[i, 4], laTemp[i, 5], laTemp[i, 6])
EndFor
Select indexinfo
Copy To indexinfo.xls Type xl5
Opening the resultant indexinfo.xls shows one row per index tag, with its name, type, key expression, filter, order and collation.
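On the MySQL side, each row collected in indexinfo would typically become an index on the migrated table. A minimal sketch for the MYTABLE example above, assuming the only real tag is the ascending one on ID (the secondary index on LASTNAME is purely illustrative):
-- the AutoInc ID tag maps naturally onto an auto-increment primary key
ALTER TABLE mytable MODIFY id INT NOT NULL AUTO_INCREMENT, ADD PRIMARY KEY (id);
-- any ordinary tag would become a secondary index named after the tag
CREATE INDEX lastname ON mytable (lastname);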

You can do it from FoxPro, and there is no need to export the info to Excel; FoxPro is capable of recreating your databases/tables/indexes in MySQL and uploading the records.
I have developed a tool that can upload any FoxPro table to MySQL, just using FoxPro commands.
Check gendbc.prg in the tools folder and adapt it to your needs.
You will have to do some field type conversions for MySQL. Also, if you are going to upload your data, there are some caveats with dates/datetimes: replace empty VFP date fields with '0000-00-00' in MySQL, and '0000-00-00 00:00:00' for empty datetimes.
Some useful commands are AFIELDS() and ATAGINFO().
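As a rough idea of where those conversions end up, here is a hedged sketch of MySQL DDL for the MYTABLE example from the first answer. The hired and modified columns are invented here purely to show the empty date/datetime handling, and zero dates require a sql_mode that does not include NO_ZERO_DATE:
CREATE TABLE mytable (
    firstname VARCHAR(20) NOT NULL DEFAULT '',                  -- VFP Character(20)
    lastname  VARCHAR(20) NOT NULL DEFAULT '',                  -- VFP Character(20)
    age       DECIMAL(3,0) NULL,                                -- VFP Numeric(3)
    id        INT NOT NULL AUTO_INCREMENT PRIMARY KEY,          -- VFP Integer (AutoInc)
    hired     DATE     NOT NULL DEFAULT '0000-00-00',           -- empty VFP Date
    modified  DATETIME NOT NULL DEFAULT '0000-00-00 00:00:00'   -- empty VFP DateTime
);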

All good points... Additionally, with VFP you can do this from the menu at "Tools" --> "Wizards" --> "Upsizing". You will need to make a connection to the database, and the wizard will walk you through most of the process.
You can upsize an entire database, or just individual tables during the wizard process.

Related

SQL Import Wizard errors on importing a psv file

I am trying to import a psv (pipe-delimited csv) into a Microsoft SQL Server 2008 R2 Express database table.
There are only two fields in the psv, and each field has more than 1000 characters.
In the import wizard I double-checked the mapping and set the option to Ignore on fail/truncate, and as usual I get an error:
Error 0xc02020a1: Data Flow Task 1: Data conversion failed. The data
conversion for column "Comm" returned status value 4 and status text
"Text was truncated or one or more characters had no match in the
target code page.". (SQL Server Import and Export Wizard)
UPDATE:
Following @Marc's suggestion, though very reluctantly, I spent 3 hours or so to finally get SQL Server 2014 installed on my computer, hoping to import the psv. As expected, the same error shows up again.
I really cannot understand why a company like Microsoft did not do thorough QAT on their products?!
After being tortured by Microsoft for the whole morning, I finally got this task done. For future readers, you can follow the steps below to import a csv/psv data source into your SQL Server:
1) Import the CSV/PSV into an Access database. Note that it must be saved as the mdb type (yes, the type from the 20th century); you might want to read my story here: how to import psv data into Microsoft Access
2) In your SQL Server (mine is 2014), start the Import Wizard and select the data source type (Access) and the file. Why do you have to use the mdb type of Access database? Because there is no option in SQL 2014 for the accdb type.
3) DO NOT forget to select the right Destination (yes, even though you started the wizard by right-clicking the destination database and choosing Import); you want to select the last option, SQL Server Native Client 11.0, which will bring up the SQL 2014 server and the database.
Now the import completes as expected.
Thanks to the great design logic in this SQL (2014? No, essentially no change compared to 2008), such a humble expectation and requirement cost me 4-5 hours to complete.
Alternatively, you can use BULK INSERT to import any flat file:
if (object_id('dbo.usecase1') is not null)
    drop table dbo.usecase1
go
create table dbo.usecase1
(
    Descr nvarchar(2000) null,
    Comm  nvarchar(2000) null
)
go
bulk insert dbo.usecase1
from 'C:\tmp\usecase0.psv'
with (
    FIELDTERMINATOR = '|',   -- pipe delimiter for the .psv source
    ROWTERMINATOR = '\n'
)
go
BULK INSERT (Transact-SQL)

Pull NULLs as NaNs into Matlab from MySQL

In Matlab, I'm pulling in data from a MySQL database using a statement similar to:
SELECT PrimaryKeyVar, Var1, MyDate, Var2, Var3 FROM MyDatabase.MyTable ORDER BY PrimaryKeyVar DESC LIMIT 4
Among the 4 values returned are some NULLs. Unfortunately, these are imported into Matlab as 'null' rather than NaN (in other words, Matlab treats the MySQL NULLs as strings). Is there a way to import the NULLs as NaNs?
I was thinking of including a statement like ...IF(MyDate IS NULL, "????", MyDate) AS MyDate... (where the "????" would hold some kind of identifier for the NULLs) but I'm not sure if that can work.
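For example, a rough sketch of that idea against the query above ('NaN' here is only a placeholder string, which would still reach Matlab as text):
SELECT PrimaryKeyVar,
       IF(Var1 IS NULL, 'NaN', Var1)     AS Var1,
       IF(MyDate IS NULL, 'NaN', MyDate) AS MyDate,
       IF(Var2 IS NULL, 'NaN', Var2)     AS Var2,
       IF(Var3 IS NULL, 'NaN', Var3)     AS Var3
FROM MyDatabase.MyTable
ORDER BY PrimaryKeyVar DESC LIMIT 4;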
You can control the behaviour of Database Toolbox on null data via preferences. Open the MATLAB Preferences dialog via File->Preferences, and navigate in the left panel to the section for Database Toolbox. Specify the behaviour you'd like in the section Null Data Handling.
Alternatively, you can control the same preferences programmatically with the command setdbprefs. You may need to set the values of the preferences NullNumberRead, NullNumberWrite, NullStringRead and NullStringWrite. Type doc setdbprefs for more information.

Most effective way to push data from a SQL Server database into a Greenplum database?

Greenplum Database version:
PostgreSQL 8.2.15 (Greenplum Database 4.2.3.0 build 1)
SQL Server Database version:
Microsoft SQL Server 2008 R2 (SP1)
Our current approach:
1) Export each table to a flat file from SQL Server
2) Load the data into Greenplum with pgAdmin III using PSQL Console's psql.exe utility
Benefits...
Speed: OK, but is there anything faster? We load millions of rows of data in minutes
Automation: OK, we call this utility from an SSIS package using a Shell script in VB
Pitfalls...
Reliability: ETL is dependent on the file server to hold the flat files
Security: Lots of potentially sensitive data on the file server
Error handling: It's a problem. psql.exe never raises an error that we can catch even if it does error out and loads no data or a partial file
What else we have tried...
.Net Providers\Odbc Data Provider: We have configured a System DSN using DataDirect 6.0 Greenplum Wire Protocol. Good performance for a DELETE. Dog awful slow for an INSERT.
For reference, this is the aforementioned VB script in SSIS...
Public Sub Main()
    Dim v_shell
    Dim v_psql As String
    ' Inner quotes must be doubled to survive the VB string literal
    v_psql = """C:\Program Files\pgAdmin III\1.10\psql.exe"" -d ""MyGPDatabase"" -h ""MyGPHost"" -p ""5432"" -U ""MyServiceAccount"" -f \\MyFileLocation\SSIS_load\sql_files\load_MyTable.sql"
    v_shell = Shell(v_psql, AppWinStyle.NormalFocus, True)
End Sub
This is the contents of the "load_MyTable.sql" file...
\copy MyTable from '\\MyFileLocation\SSIS_load\txt_files\MyTable.txt' with delimiter as ';' csv header quote as '"'
If you're getting your data load done in minutes, then the current method is probably good enough. However, if you find yourself having to load larger volumes of data (terabyte scale, for instance), the usual preferred method for bulk-loading into Greenplum is via gpfdist and corresponding EXTERNAL TABLE definitions. gpload is a decent wrapper that provides abstraction over much of this process and is driven by YAML control files.
The general idea is that gpfdist instance(s) are spun up at the location(s) where your data is staged, preferably as CSV text files, and then the EXTERNAL TABLE definition within Greenplum is made aware of the URIs for the gpfdist instances. From the admin guide, a sample definition of such an external table could look like this:
CREATE READABLE EXTERNAL TABLE students (
name varchar(20), address varchar(30), age int)
LOCATION ('gpfdist://<host>:<portNum>/file/path/')
FORMAT 'CUSTOM' (formatter=fixedwidth_in,
name=20, address=30, age=4,
preserve_blanks='on',null='NULL');
The above example expects to read fixed-width text files whose fields, from left to right, are a 20-character string, a 30-character string, and a 4-character integer. To actually load this data into a staging table inside GP:
CREATE TABLE staging_table AS SELECT * FROM students;
For large volumes of data, this should be the most efficient method since all segment hosts are engaged in the parallel load. Do keep in mind that the simplistic approach above will probably result in a randomly distributed table, which may not be desirable. You'd have to customize your table definitions to specify a distribution key.
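For instance, a hedged sketch of that customization, assuming the students external table above and using name as the distribution key purely for illustration:
-- explicit staging table with a distribution key, instead of the CTAS above
CREATE TABLE staging_table (
    name    varchar(20),
    address varchar(30),
    age     int
) DISTRIBUTED BY (name);
-- every segment pulls from the gpfdist instance(s) in parallel
INSERT INTO staging_table SELECT * FROM students;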

Share 1 table between 2 different types of databases

The problem that I have is that I want to synchronize one table between two different databases.
Database 1 is on an XP server running MySQL.
Database 2 is on a Novell server running Clarion.
Is it possible to share one table, users, between the two databases?
So that when data is put into database 1, it automatically synchronizes with database 2, and the users table is the same in both databases?
Thanks in advance!
Diederik,
Your question isn't very clear in that we don't know if you have access to the source code or can only operate on a database level.
You didn't mention clearly if you're using Clarion to drive those databases. I'm assuming you are, since you tagged your post with it.
Also, you didn't mention which file format you're using at the Novell server. I'm assuming you are using the TopSpeed file format, so here is a bit of information about it: most programmers think it is the "native" file format for Clarion for Windows. It is not. Clarion for Windows doesn't have such a thing as a native file format; it employs a totally driver-driven approach. Clarion Professional Developer (a DOS IDE) had a native file format, which was the Clarion .DAT format. Clarion for Windows can use whatever file format offers a driver or ODBC driver, including the old .DAT.
If you have access to the source code, then it is a pretty straightforward situation. In Clarion you can easily have different buffers pointing to different tables.
  PROGRAM

  MAP
  END

szConnMySQL  CSTRING( 256 )

users_mysql  FILE, DRIVER( 'ODBC' ), OWNER( szConnMySQL ), NAME( 'users' )
RECORD         RECORD
id               LONG
name             STRING( 20 )
               END
             END

users_tps    FILE, DRIVER( 'TopSpeed' ), NAME( 'users' )
RECORD         RECORD
name             STRING( 20 )
id               LONG
               END
             END

  CODE
  szConnMySQL = 'Driver={{MySQL ODBC 3.51 Driver};' & |
                'Server=myServerAddress;Database=myDataBase;User=myUsername;' & |
                'Password=myPassword;Option=3;'

  OPEN( users_mysql, 42h )
  OPEN( users_tps, 42h )

  users_mysql.id = 1
  users_mysql.name = 'GUSTAVO PINSARD'
  ADD( users_mysql )
  IF NOT ERRORCODE()
    users_tps.RECORD :=: users_mysql.RECORD
    ADD( users_tps )
  ELSE
    ! Do your thing
  END

  CLOSE( users_mysql )
  CLOSE( users_tps )
If you don't have access to the source code, then you'll have to write a MySQL stored procedure to update the remote file. The problem is that the remote file, being a TopSpeed file, would not be directly accessible from the MySQL server, since MySQL doesn't know anything about it.
One solution to overcome this is by using the TopSpeed ODBC driver at the MySQL server, and having the MySQL SP access the ODBC driver. I consider the TopSpeed ODBC driver a must have, because it allows for a strategy to escape such situations, and promote a better integration.
Details on the MySQL SP are out of the scope of this post, also because I don't know MySQL SPs to that level.
Regards

Text file as data source in SSRS

I need to use text files as a data source in SSRS. I tried accessing them with an 'OLEDB provider for Microsoft directory services' connection, but I could not. The query is given below.
Also, let me know how to query the data.
I know this thread is old, but as it came up in my search results this may help other people.
There are two 'sort of' workarounds for this. See the following:
http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=130650
So basically you should use OLEDB as the data source, then in the connection string type:
Provider=Microsoft.Jet.OLEDB.4.0;Data Source=xxxx;Extended Properties="text;HDR=No;FMT=Delimited"
Then make sure your file is saved in .txt format, with comma delimiters. Where I've put xxxx you need to put the FOLDER directory - so C:\Temp - don't go down to the individual file level, just the folder it's in.
In the query you write for the dataset, you specify the file name as though it were a table - essentially your folder is your database, and the files in it are tables.
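So, assuming a file called MyFile.txt saved in that folder (the name here is just an example), the dataset query might be something like:
SELECT * FROM MyFile.txt
If Jet complains about the name, the bracketed form [MyFile.txt] or MyFile#txt is worth trying.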
Thanks
I have had great success creating linked servers in SQL to link to disparate text files for creating SSRS reports. Below is sample SQL to link to your txt files:
EXEC master.dbo.sp_addlinkedserver @server = N'YourLinkedServerName', @srvproduct = N'',
    @provider = N'Microsoft.Jet.OLEDB.4.0',
    @datasrc = N'',    -- folder that holds the text files
    @provstr = N'text'
EXEC master.dbo.sp_addlinkedsrvlogin @rmtsrvname = N'YourLinkedServerName', @useself = N'False',
    @locallogin = NULL, @rmtuser = NULL, @rmtpassword = NULL
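Once the linked server is set up, a report query can read a file through it using the four-part name. A rough sketch, where my_data.txt is a hypothetical file in the linked folder (with the Jet text provider, the dot in the file name is written as #):
SELECT * FROM YourLinkedServerName...[my_data#txt]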
I simply used the BULK INSERT command to load the flat file into a temporary table in SSRS, like this:
CREATE TABLE #FlatFile
(
Field1 int,
Field2 varchar(10),
Field3 varchar(15),
Field4 varchar(20),
Field5 varchar(50)
)
BEGIN TRY
BULK INSERT #FlatFile
FROM 'C:\My_Path\My_File.txt'
WITH
(
FIELDTERMINATOR ='\t', -- TAB delimited
ROWTERMINATOR ='\n', -- or '0x0a' (whatever works)
FIRSTROW = 2, -- has 1 header row
ERRORFILE = 'C:\My_Path\My_Error_File.txt',
TABLOCK
);
END TRY
BEGIN CATCH
-- do nothing (prevent the query from aborting on errors...)
END CATCH
SELECT * FROM #FlatFile
I don't think you can.
See Data Sources Supported by Reporting Services. In that table, your only chance would be "Generic ODBC data source"; however, a text file is not ODBC-compliant AFAIK - no types, no structure, etc.
Why not just display the text files? It seems a bit strange to query text files to bloat them into formatted HTML...
I'm not of the mind that you can, but a workaround for this, if your text files are CSVs or the like, is to create an SSIS package which brings that data into a table in SQL Server, which you can then query like there's no tomorrow. SSIS does Flat File Sources with ease.
You can even automate this by right-clicking the database in SSMS and choosing Tasks->Import Data. Walk through the wizard, and you can save off the package at the end.