Reading Paradox Database files - MySQL

I'm working with a client who has an existing system, built on what is apparently a Paradox database. I've got the database, in the form of a zip file containing .DB, .MB and .PX files, one for each table.
I need to take some of this data and import it into a Web application that's using MySQL. Does anybody have a way for me to extract this data that doesn't involve installing Paradox?
If not, does Paradox export in some readable format? Either as SQL or something that can be parsed reasonably easily? The person in charge of this system for my client is a volunteer (they're a non-profit), so I'd like to go to him with a solution - because last time I asked for the data, I got this, which is clearly no good.

The Wikipedia article about Paradox lists two other tools that might be of interest, both under the GPL:
pxlib: Library to read and write Paradox databases
pxtools: convert a Paradox-database into a SQL-database
And if you have Delphi and want to write a converter yourself (which would need the BDE to work) you can take a look at this article or at the source code of ConvertCodeLib on this web site. Both make use of TClientDataset, which can write a CDS (binary format) or an XML file.

Both the Paradox for DOS and Paradox for Windows platforms will export data tables in Delimited Text, Fixed-Length Text, and Lotus 1-2-3 formats. The older Paradox for DOS also writes Lotus Symphony, while the slightly less antique Paradox for Windows does a passable Excel 5.
However, someone will have to sit down and export the tables one by one, or write a script to do it. Of course, you'd need to have Paradox installed to write the script.
-Al.

MS has instructions for using the MS Jet driver to read data from files produced by Paradox 3-5. That can act as (at least) an ODBC driver, so you can use it to read a Paradox file from just about anything that knows how to use ODBC.
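For example, here is a minimal VBScript/ADO sketch against that driver. The driver name and DriverID=538 are the commonly published values for the Microsoft Paradox ODBC driver, and the directory, table, and field below are placeholders, so treat this as an assumption-laden starting point rather than a tested recipe:
' Read a Paradox table over ODBC; each .DB file in DefaultDir appears as a table
Set cn = CreateObject("ADODB.Connection")
cn.Open "Driver={Microsoft Paradox Driver (*.db )};DriverID=538;" & _
        "Fil=Paradox 5.X;DefaultDir=C:\data\;Dbq=C:\data\;"
Set rs = cn.Execute("SELECT * FROM [Messages]")   ' Messages.DB (placeholder name)
Do Until rs.EOF
    WScript.Echo rs.Fields(0).Value   ' dump the first column of each record
    rs.MoveNext
Loop
rs.Close
cn.Close
From there the rows can go anywhere ADO can reach, including a CSV writer or a second connection pointed at MySQL.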

You have a few options:
Get your hands on the original Paradox software, and use it to export the database into CSV format. Unfortunately, Borland no longer sells it and the most recent version doesn't run well on Windows XP or above.
Access the database using either a Paradox or dBase/xBase ODBC driver. Paradox and xBase are very similar, so you may be able to extract the data using drivers meant for either of them. You may be able to get a Paradox ODBC driver somewhere on firebirdsql.org.
Use Borland Delphi to write a program which will export the data you need. As someone else mentioned, you can get a free version called Turbo Explorer. You will also have to install the BDE separately, as it doesn't come with Turbo Explorer.

I've been working on a gigantic data migration from Paradox to MySQL. My general approach has been to export CSV files from Paradox and then import the CSV files from the MySQL command line. However, this approach breaks down when there are M (memo) fields in Paradox, because that data doesn't get pulled into the CSV file as expected.
Here's my long-winded process for getting Paradox data into MySQL; hopefully it helps somebody!
Open the Paradox file in Paradox and export it to a dBase (.dbf) file. This exports the memo data into dBase's blob format.
Open the .dbf file in Paradox. It may be necessary to convert double-format fields to long integer or number before opening the file in DBF Viewer (step 3); the double format doesn't appear to survive the trip. Save the file.
Use this program to open up the dbase file and then export to Excel: http://dbfviewer.org/
Export -> XLS-File … this opens it in Excel
Now we need to create a macro, because Excel has no native way to enclose CSV fields in quotes (or anything else). I've pasted the macro below, but here are the reference sites that I found; one site had better instructions but corrupted text:
http://www.mrexcel.com/forum/showthread.php?320531-export-as-csv-file-enclosed-quotes
http://www.markinns.com/articles/full/export_excel_csvs_with_double_quotes/
In Excel, replace all " characters with ' using Find & Replace; any stray " in the records will mess up the quoting.
In Excel, press Alt-F11 to open the VBA editor.
Insert -> Module
Create this macro to save CSV files with every field enclosed in double quotes:
Sub CSVFile()
    ' Export the current selection (or the whole used range) to a CSV file
    ' with every field enclosed in double quotes.
    Dim SrcRg As Range
    Dim CurrRow As Range
    Dim CurrCell As Range
    Dim CurrTextStr As String
    Dim ListSep As String
    Dim FName As Variant
    FName = Application.GetSaveAsFilename("", "CSV File (*.csv), *.csv")
    If FName <> False Then
        ' Use the system list separator (a comma in most locales)
        ListSep = Application.International(xlListSeparator)
        If Selection.Cells.Count > 1 Then
            Set SrcRg = Selection
        Else
            Set SrcRg = ActiveSheet.UsedRange
        End If
        Open FName For Output As #1
        For Each CurrRow In SrcRg.Rows
            CurrTextStr = ""
            ' Build one output line, quoting every cell value
            For Each CurrCell In CurrRow.Cells
                CurrTextStr = CurrTextStr & """" & CurrCell.Value & """" & ListSep
            Next
            ' Strip any trailing separators
            While Right(CurrTextStr, 1) = ListSep
                CurrTextStr = Left(CurrTextStr, Len(CurrTextStr) - 1)
            Wend
            Print #1, CurrTextStr
        Next
        Close #1
    End If
End Sub
Then Run -> Run Macro
Set up the target MySQL schema with TEXT fields where we want the memo blobs to go.
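For instance, a hypothetical two-column target matching the LOAD DATA statement below (table and column names are placeholders):
CREATE TABLE table_name (
    column1 TEXT,   -- memo data lands here
    column2 TEXT
);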
In MySQL command line here's an example of how to do the import:
LOAD DATA LOCAL INFILE 'C:/data.csv'
INTO TABLE table_name
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
(column1, column2)
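If the exported CSV kept a header row, standard LOAD DATA syntax can skip it; this is the same statement with IGNORE 1 LINES added before the column list:
LOAD DATA LOCAL INFILE 'C:/data.csv'
INTO TABLE table_name
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES
(column1, column2)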

Paradox is a native format for the Borland Database Engine, which is included with various Delphi programming products. Ownership has changed hands at least once recently, but at one point there were free "Express" versions of Delphi available that would let you write a simple program to export this stuff. If a free version is no longer available, the lowest available SKU should include BDE functionality.

Using MS Access 2007 you can import Paradox 7 and below using the BDE distribution included with the free Paradox Database Editor program (google it). Use a connection such as:
DoCmd.TransferDatabase acImport, "ODBC Database", _
"Paradox 3.X;HDR=NO;IMEX=2;ACCDB=YES;DATABASE=C:\apache\Archive;TABLE=Messages#db", _
acTable, DailyArchiveName, "MyDatabase" ' use acTable when importing a table

Related

MySQL import - CSV - file refuses to be properly imported

I'm trying to import the following file into a MySQL Db:
https://drive.google.com/drive/folders/1WbRdNgqVre3wN4DpJZ-08jtGkJtCDJNQ?usp=sharing
Using the "data import wizard" on MySql Workbench, for some reason I'm getting "218\223 lines imported successfully", whereas the file contains close to 100K.
I tried looking for special characters around lines 210-230, and even removing them all, but the same thing still happens.
The file is a CSV of Microsoft Bing's geo locations, used in Microsoft Advertising campaigns, downloaded from Microsoft's website (using an ad account there).
I've been googling, reading, StackOverflowing, playing with the file and different import options...
I tried cutting the file into small bits, and the newly created file was completely corrupt somehow...
The encoding seems to be UTF-8, with all line breaks "\n". I tried changing them all to "\r\n" using Notepad++, but the same thing still happens.
File opens normally in Excel, looks normal, passes CSVlint.io...
The only weird thing is that the file has quotes around some of the values but not the rest (e.g. line 219). Yes, I know that sounds like it would be the problem, but I removed that line, and all the other lines with quotes, and it still happens... I also tried loading with ENCLOSED BY '"'; see below.
I also tried using SQL statements to import:
LOAD DATA LOCAL INFILE 'c:\\Users\\Gilad\\Downloads\\GeoLocations.csv'
INTO TABLE aw_geo_map_bmsl
FIELDS TERMINATED BY ','
(tried also with: ENCLOSED BY '"')
LINES TERMINATED BY '/n'
IGNORE 1 ROWS;
(had to add OPT_LOCAL_INFILE=1 to the connection on Advanced for MySQL Workbench to be allowed access to local files on my computer)
This gives 0 rows affected.
Help?
Epilogue: In the end I just gave up on all these import wizards and did it the old "make your SQL statements from Excel" way.
I imported the CSV data into Excel. Watch out: in this case I found I needed to use Excel's data import wizard (which worked perfectly) in order to change the encoding to UTF-8; Excel 2010 had guessed a Windows codepage, which was wrong.
After processing the data a bit to my liking, I used the following Excel code:
=CONCATENATE("INSERT INTO aw_geo_map_bmsl (`Location Id`,Name,`Canonical Name`,`Location Type`,Status,`Adwords Location Id`)
VALUES (",
A2,
",""",B2,"""",
",""",C2,"""",
",""",D2,"""",
",""",E2,"""",
",",F2,");")
to generate an INSERT statement for every line. I then copied the formula column, pasted values only, pasted the result into an editor, removed the extra quotes Excel adds, and ran it in MySQL Workbench, which executes it line by line (takes some time, but you can see the progress).
Saved me hours of unsuccessfully playing around with "automatic tools" which fail for unknown reasons and don't give proper logs out of the box.
Warning: do NOT do this with unsanitized data, as it's vulnerable to SQL injection. In this case the data came from Microsoft, so I know it's fine.

Importing data from CSV files into CrateDB

I have created a table in Crate 0.38.x with columns having integer, string and timestamp data types. I want to load data into this table from delimited text files. Is there a utility to do a bulk import? Sorry, but I could not find one in the documentation or on GitHub.
To do bulk imports from a file, use the COPY FROM statement (see https://crate.io/docs/stable/sql/reference/copy_from.html). It only supports JSON-formatted files, though, so you'll probably need to convert the text files first.
Not sure if there are any plans to add support for other formats, but if you create a github issue requesting the feature you'll get feedback once it has been implemented.
There are also docs available on how to migrate from MySQL and MongoDB.
I quickly imported data from MySQL into Crate 0.40 by installing Ruby on Rails on the same server as the MySQL DB and then using the Mysql2JSON gem (see the Mysql2xxx part).
Crate requires a JSON file with one record per line. So you have to edit the gem's output, replacing the array brackets and the commas between records with newlines, in order to get a format like this:
{"id": 1, "quote": "Don't panic"}
{"id": 2, "quote": "Would it save you a lot of time if I just gave up and went mad now?"}
After exporting the MySQL data to JSON with the Mysql2Json gem, you have to upload the file to the Crate server and run this in the Crate console:
COPY table_name FROM 'file:///tmp/import_data/quotes.json'
Read this:
https://crate.io/docs/crate/reference/en/latest/general/dml.html#import-and-export
Just make sure that you have created the table, with its schema, beforehand; then use COPY FROM to import the dataset from JSON or CSV.
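A minimal sketch of that order of operations, reusing the quotes example above (the STRING type matches the Crate 0.x docs; adjust names and types to your data):
CREATE TABLE quotes (
    id INTEGER,
    quote STRING
);
COPY quotes FROM 'file:///tmp/import_data/quotes.json';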

Read CSV File By Line in ASP

I am trying to read a CSV file with VBScript, and it is causing huge problems because it isn't recognizing line breaks. What I have right now is:
Const ForReading = 1   ' not predefined in VBScript
Set objFSO = CreateObject("Scripting.FileSystemObject")
Set objFile = objFSO.OpenTextFile(Server.MapPath("my_csv_file.csv"), ForReading)
Do Until objFile.AtEndOfStream
    strLine = objFile.ReadLine
    arrFields = Split(strLine, ",")
    ' LOOP_STUFF_HERE
Loop
The CSV file has several lines, but it is being read as one long line, and the last item of each line is being combined with the first item of the next line, because there is no comma after the last field (the file is created by a client of mine in Excel and then sent to me). My workaround is to open it in a text editor, manually add a comma to the end of each line, and then remove the line breaks. That won't work in the long run, because we are setting up an automated system.
Basically, I need to be able to Split the lines on a line break (I've tried Split(strLine, "\n") but that doesn't seem to work) and then, once they are split by line break, split them by comma. It'd be a multidimensional array, in other words.
I can't find out how to get VBScript to recognize the line breaks though. Any ideas? Thanks for your help.
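For what it's worth, VBScript doesn't interpret "\n" as an escape sequence; the built-in constants vbCrLf and vbLf are the usual way to split on line breaks. A minimal sketch of the split-by-line-then-by-comma approach, assuming the same file as above and that Excel wrote CRLF line endings:
Const ForReading = 1
Set objFSO = CreateObject("Scripting.FileSystemObject")
Set objFile = objFSO.OpenTextFile(Server.MapPath("my_csv_file.csv"), ForReading)
strAll = objFile.ReadAll   ' slurp the whole file
objFile.Close
arrLines = Split(strAll, vbCrLf)   ' use vbLf instead for Unix line endings
For Each strLine In arrLines
    If Len(strLine) > 0 Then
        arrFields = Split(strLine, ",")
        ' process arrFields here
    End If
Next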
Reading a CSV with the FileSystemObject? Don't reinvent the wheel; there are proven ways to do exactly what you want in ASP.
The easiest way is to use the OLEDB Jet or OLEDB ACE driver to read the file.
Basically, you need to create an ADODB.Connection object with a specific connection string to get all the data in the CSV as rows; then you can fetch all the data as an array using the GetRows method, or use the Recordset object directly.
Related posts on using this functionality:
ASP.NET (Ace) When reading a CSV file using a DataReader and the OLEDB Jet data provider, how can I control column data types?
ASP.NET (Ace) Microsoft.ACE.OLEDB.12.0 CSV ConnectionString
ASP-Classic (Jet) Reading csv file in classic asp. Problem: column values are truncated up to 300 characters
The connection string is almost the same for the Jet and ACE drivers (and it's the important part).
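As a concrete (untested) sketch of that connection string in classic ASP with the Jet text driver: note that the Data Source is the folder, and the file name becomes the table name; the HDR setting and the path are assumptions to adjust:
Set cn = Server.CreateObject("ADODB.Connection")
cn.Open "Provider=Microsoft.Jet.OLEDB.4.0;" & _
        "Data Source=" & Server.MapPath(".") & ";" & _
        "Extended Properties=""text;HDR=Yes;FMT=Delimited"";"
Set rs = cn.Execute("SELECT * FROM [my_csv_file.csv]")
arrData = rs.GetRows()   ' 2-D array indexed as arrData(column, row)
rs.Close
cn.Close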

Entering data from an excel spreadsheet into database?

I'm looking for a clever way to extract 500-plus rows of data from an Excel spreadsheet and enter it into my database.
The spreadsheet simply has two columns, a name and a rating.
My table 'tbl_foot_teams' is set out as
id | name | rating
Quite simply, I need to get the two columns from the spreadsheet into the database fields name and rating.
Is there any efficient way to achieve this?
Individually, it will take me a ridiculous amount of time!
Thanks
Save the Excel file as CSV and use the LOAD DATA INFILE command to import the data.
Your Excel file has no id field, so make the id field in the table AUTO_INCREMENT and use a command like this:
LOAD DATA INFILE 'file_name.csv' INTO TABLE tbl_foot_teams
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\r\n'
-- IGNORE 1 LINES -- if csv file has column headers
(name, rating)
SET id = NULL; -- this will set unique value for each row
Also, have a look at GUI Data Import tool (Excel or CSV format) in dbForge Studio for MySQL.
In phpMyAdmin you have an Import from Excel option.
If you don't have one, you may have Import from CSV, so just convert the spreadsheet to CSV.
If you have neither, you can write a PHP function that opens the text file, explodes it by rows, and then explodes each row by values.
If we are talking about 500 rows, you can easily create a new column on the spreadsheet with a formula that concatenates your values into an INSERT statement. Something like:
=concat( "insert into tbl_foot_teams ( name , rating) values ( " , $b8 , " ...
Then copy the calculated formula text results and paste them into your database; a complete version of the formula is sketched below.
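A complete version of that formula might look like this (assuming, purely for illustration, that the name sits in column B and the rating in column C of row 8; note that a name containing an apostrophe would still break the generated SQL):
=CONCATENATE("insert into tbl_foot_teams (name, rating) values ('", $B8, "', ", $C8, ");")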
You don't specify what database you're using, but an easy way to do this with MySQL would be to export the spreadsheet as a csv file and then import to MySQL with mysqlimport.
This is described in a comment on this MySQL page, from user Philippe Jausions:
If you are one of the many people trying to import a CSV file into MySQL using mysqlimport under the MS-Windows command/DOS prompt, try the following:
mysqlimport --fields-optionally-enclosed-by=""" --fields-terminated-by=, --lines-terminated-by="\r\n" --user=YOUR_USERNAME --password YOUR_DATABASE YOUR_TABLE.csv
Between quotes " and backslashes \ it can really give you a hard time finding the proper combination under Windows...
I usually run this command from the folder containing the YOUR_TABLE.csv file.
If you have a header in your .csv file with the names of columns or other "junk" in it, just add --ignore-lines=X to skip the first X lines (e.g. --ignore-lines=1 to skip 1 line).
If your fields are (optionally) enclosed by double quotes which are themselves doubled inside a value (i.e. a doubled double-quote "" = 1 double-quote "), then also use --fields-escaped-by=\ (the default) and NOT --fields-escaped-by="""
Working from the Excel end, you can use ADO, for example Excel VBA: writing to mysql database
Dim cn As ADODB.Connection
Dim strFile As String, strCon As String, strSQL As String
''Not the best way to get the name, just convenient for notes
strFile = Workbooks(1).FullName
strCon = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" & strFile _
    & ";Extended Properties=""Excel 8.0;HDR=Yes;IMEX=1"";"
Set cn = CreateObject("ADODB.Connection")
''The connection must be opened before it can execute anything
cn.Open strCon
''For this to work, you must create a DSN and use its name in place of
''DSNName; however, you can also use the full connection string
strSQL = "INSERT INTO [ODBC;DSN=DSNName;].NameOfMySQLTable " _
    & "SELECT AnyField As NameOfMySQLField FROM [Sheet1$];"
cn.Execute strSQL
After converting to a CSV file, you can import into any database that supports importing from CSV files (MySQL, PostgreSQL, etc.) but you would have to do this from the command-line:
Importing CSV files in PostgreSQL
Import CSV to Oracle table
MySQL 5.1 LOAD DATA INFILE syntax
Import CSV File into MSSQL
You can use a programming language such as Python with a spreadsheet-reading library (e.g. xlrd or openpyxl) and another library for interfacing with your SQL database.
You connect to the database, load the Excel file, and loop through each row, extracting whichever column data you want. Then you execute an INSERT statement for that data.
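A minimal sketch of that loop, assuming openpyxl and mysql-connector-python, and an invented file name and column layout:
import openpyxl
import mysql.connector

# Load the workbook and take the first sheet (file name is illustrative)
wb = openpyxl.load_workbook("teams.xlsx")
ws = wb.active

conn = mysql.connector.connect(user="user", password="pass", database="mydb")
cur = conn.cursor()

# Skip the header row; read the two columns and insert them row by row
for name, rating in ws.iter_rows(min_row=2, max_col=2, values_only=True):
    cur.execute(
        "INSERT INTO tbl_foot_teams (name, rating) VALUES (%s, %s)",
        (name, rating),
    )

conn.commit()
cur.close()
conn.close()
Parameterized queries also sidestep the quoting and injection issues mentioned elsewhere on this page.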
If you have an installation of phpMyAdmin on a server, you can use the Import from CSV option though you would first have to re-save your Excel spreadsheet as a CSV (Comma-Separated Values) file.

Access97 VBA Export to CSV format issue

I have an Access 97 database and I am trying to write some code to export to a CSV file (I am new to VBA).
I have this working; however, one field I am exporting is a currency, so it contains, for example, £3,456.00. When I export to CSV I get exactly that, but I need it to be just the number, i.e. 3456.00.
On a similar issue: I have the date as dd/mm/yyyy, and I wonder if there is a way in VBA to convert it to yyyy-mm-dd?
Please bear in mind any solution has to be simple, due to my limited knowledge!
Sorry about the delay; seemingly easy things took longer. I assume from your:
DoCmd.TransferText acExportDelim, "olly_csv", "olly aorder export", "\\10.0.0.38\nw_upload\aorders.csv"
that you have an export specification "olly_csv" that determines how to export the resultset of the SELECT query "olly aorder export" to the file "aorders.csv" in the destination folder "\\10.0.0.38\nw_upload".
The easy way to export the CURRENCY field(s) as a plain Double/Float/Single number and the DATE field(s) with a format of your choice (dd/mm/yyyy) would be to request just that in the export specification. I found no way to do that in Access 2000 (as far as I can see, there are limited ways to pick date formats, but the Import Wizard's features for dealing with column types are not implemented by the Export Wizard).
The Docs about "TransferText" (sorry, Access 2003) state:
SpecificationName Optional Variant. A string expression that's the name of an import or export specification you've created and saved in the current database. For a fixed-width text file, you must either specify an argument or use a schema.ini file, which must be stored in the same folder as the imported, linked, or exported text file. To create a schema file, you can use the text import/export wizard to create the file. For delimited text files and Microsoft Word mail merge data files, you can leave this argument blank to select the default import/export specifications.
Now there are two schools of Microsoft Docs philology. The optimists will read that as: if you don't pass an export specification and have a suitable schema.ini file, then the export process will adhere to the specs in the file. The pessimists will say: Microsoft never agreed to fulfill your pipe dreams; if you don't specify an argument for a non-fixed-width file, the TransferText command will use some obscure default export specification (please pay a consultant to seek it out and change it).
Let's be optimistic!
So create a schema.ini file with a section for "aorders.csv". For my tests I used this table (column types translated from the German UI):
Table: OlliesOrders
Name      Type           Size
OrderId   Long Integer   4
Amount    Currency       8
DateDue   Date/Time      8
For that table the schema.ini section looks like:
[aorders.csv]
ColNameHeader=True
CharacterSet=1252
Format=Delimited(;)
DateTimeFormat=dd/mm/yyyy
Col1=OrderId Integer
Col2=Amount Float
Col3=DateDue Date
You'll have to adapt this example to your fields. Do you want column headers? Is the Windows codepage OK? What about field separators? I had to use ; (German locale); you may need "Format=CSVDelimited". Look here for some background. Then call
DoCmd.TransferText acExportDelim, , "olly aorder export", "\\10.0.0.38\nw_upload\aorders.csv"
and check if optimists rule.
For pessimists:
Create a new query on the table to export (from). Change the query type to Execute ("Ausführung" in the German UI) and edit the SQL until it looks like:
SELECT OlliesOrders.* INTO [aorders.csv] IN 'M:\trials\23forum\SOTrials\txt' [TEXT;] FROM OlliesOrders;
or, adapted to your setup:
SELECT YourFieldsList INTO [aorders.csv] IN '\\10.0.0.38\nw_upload' [TEXT;] FROM YourTable;
and execute it (from the query window or a macro/module Sub). My result:
"OrderId";"Amount";"DateDue"
1;1411,09;29/04/2011
2;123,45;13/04/2011
ADDED: Evidence for my claim that you can't specify types in the Export Wizard (screenshots of the Export and Import wizard dialogs omitted).