How can I import .DMP files into SQLiteStudio - mysql

I need to import some .DMP files into SQLiteStudio. I have tried many things but can't figure out how to do it. Any help is appreciated.

You cannot import MySQL dump files into an SQLite database directly. You first need to convert them into a format SQLite understands. There are several tools available online for doing so - just search for "convert mysql dump into sqlite" in Google or another search engine. One example of such a tool: https://github.com/dumblob/mysql2sqlite
Once you have an SQLite-compatible file, open SQLiteStudio, create an empty database, right-click on it, pick "Execute SQL from file", and select the SQLite-compatible file you produced earlier.
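To give a feel for what such a converter does, here is a minimal Python sketch that strips a few MySQL-specific constructs from a CREATE TABLE statement. This is only an illustration of the idea; the real mysql2sqlite script handles far more cases (indexes, data types, INSERT batching, etc.), and the example DDL is hypothetical.

```python
import re

def mysql_to_sqlite(sql: str) -> str:
    """Very rough sketch: strip a few MySQL-isms so SQLite can parse the DDL.
    Real converters like mysql2sqlite handle many more cases than this."""
    sql = sql.replace("`", '"')                       # backtick quoting -> double quotes
    sql = re.sub(r"\bAUTO_INCREMENT\b", "", sql)      # SQLite auto-increments INTEGER PRIMARY KEY
    sql = re.sub(r"\s*ENGINE=\w+", "", sql)           # drop storage-engine clause
    sql = re.sub(r"\s*DEFAULT CHARSET=\w+", "", sql)  # drop charset clause
    return sql

ddl = 'CREATE TABLE `users` (`id` INT AUTO_INCREMENT, `name` TEXT) ENGINE=InnoDB DEFAULT CHARSET=utf8;'
print(mysql_to_sqlite(ddl))
```

The output of the converted statement can then be run against an empty database via "Execute SQL from file" as described above.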

Related

Convert csv Import in datagrip to SQL script

I am using the Import Data Tool from DataGrip, and I would like to know whether it is possible, after loading a CSV file, to get the SQL script that was used to do the import, like
LOAD DATA INFILE...
or how I can write the script myself by matching the DataGrip CSV import parameters to the SQL LOAD DATA parameters like FIELDS TERMINATED BY, LINES TERMINATED BY, ...
For a better understanding: I am facing the same problem as here, and would like to know if it's possible to do the same with DataGrip instead of HeidiSQL:
https://stackoverflow.com/a/3635318/8280536
At the moment DataGrip doesn't support the mentioned feature.
I filed a feature request based on your description.
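Until such a feature exists, the LOAD DATA statement can be assembled by hand from the CSV import settings. A small Python sketch of that mapping follows; the parameter names mirror a typical CSV import dialog rather than DataGrip's exact UI, and the table/path are illustrative.

```python
def build_load_data(table, path, field_sep=",", quote='"',
                    line_end="\\n", skip_header=True):
    """Assemble a MySQL LOAD DATA INFILE statement from CSV-style import
    options. Option names here are illustrative, not DataGrip's exact UI."""
    stmt = (
        f"LOAD DATA INFILE '{path}'\n"
        f"INTO TABLE {table}\n"
        f"FIELDS TERMINATED BY '{field_sep}' OPTIONALLY ENCLOSED BY '{quote}'\n"
        f"LINES TERMINATED BY '{line_end}'"
    )
    if skip_header:
        stmt += "\nIGNORE 1 LINES"  # skip the CSV header row
    return stmt

print(build_load_data("customers", "/tmp/customers.csv"))
```

Each CSV dialog option (separator, quote character, line ending, header row) maps onto one clause of the statement, which is the correspondence the question asks about.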

MySQL CSV import: characters turn into '?'

I am new to MySQL and have been facing this trouble for many days; please pardon my technical terminology. After researching a lot and failing, I decided to post this question. I am using PHP and MySQL to create a customer web application. The data from the user comes in an Excel file which goes into the application.
The problem is that this Excel file has special characters, and all of them get converted to '?' when I import it into MySQL using the phpMyAdmin GUI import option.
What I have tried so far:
Converting it into CSV and then uploading it via the phpMyAdmin GUI import option. I have read blogs and posts saying to upload this file in the UTF-8 character set, but this is not working; it still gives me '?'. My database character set is also UTF-8.
Uploading the Excel file directly via the phpMyAdmin GUI import option (character set UTF-8); still not working.
Converting the Excel spreadsheet into an OpenDocument spreadsheet and uploading it via the phpMyAdmin GUI import option. This worked and the characters were intact, but when I use aggregate functions on the database it does not give me the desired result (basically it increases my trouble going further).
If someone can help me solve this problem with the CSV import, I would really appreciate it, because moving forward the user will be uploading the CSV and I can easily import the data into MySQL with the 'LOAD DATA INFILE' command via PHP.
Example of a special character: '²' (squared symbol).
I think you should check that UTF-8 is used consistently at every layer: in phpMyAdmin, in the data source (the file itself), in the connection, in the database/table charset, and in the PHP code.
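The '?' is the classic symptom of text being pushed through an encoding that cannot represent the character: any layer that forces a narrower charset silently substitutes '?'. A short Python illustration of the mechanism (not of phpMyAdmin itself):

```python
# '²' exists in UTF-8 and Latin-1, but not in plain ASCII. When any layer
# (file, connection, or column charset) forces a narrower encoding,
# unencodable characters get replaced with '?'.
text = "Area: 5 m²"

ascii_bytes = text.encode("ascii", errors="replace")
print(ascii_bytes)  # the '²' has turned into a '?'

# Keeping UTF-8 end to end preserves the character intact:
utf8_bytes = text.encode("utf-8")
print(utf8_bytes.decode("utf-8"))
```

This is why fixing the database charset alone is not enough - the file encoding and the connection charset must all agree.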

MySQL Workbench 6.1 - Error importing recordset

I'm going to be getting a new computer soon and I don't want to lose all of the data I have entered in my tables, so I decided to test out the feature that allows you to export and import CSV files. I exported a table successfully (the data was transferred to Microsoft Excel as a CSV file), but when I opened the file in Microsoft Excel, added a few rows, and tried to import it back into MySQL Workbench, I got the following error:
"Error importing recordset
error calling Python module function
SQLIDEUtils.importRecordsetDataFromFile"
I've searched all over for info on this, but can't find any solutions. Does anyone know what I'm doing wrong?
In Workbench, open a MySQL connection and then navigate to [Server] --> [Data Export]. There are several backup options here, including saving the data as an individual file or folder. Choose the databases you want to export, and then click [Start Export].
If you ever prefer using Excel for editing and such, then use the MySQL for Excel plugin to access MySQL databases from within Excel. However, I don't think you need it here.
To export your MySQL data, use mysqldump, which will create all the schema for you - for example, `mysqldump -u username -p database_name > backup.sql`.
Excel probably added some formatting to your file and now MySQL can't understand it. The best way to find out is to compare the files before and after the change.
That error indicates a format problem. If the file is small enough, try opening it in WordPad (or the Mac equivalent) and see if there's any difference in the formatting. It could be that the delimiting got a little messed up (this can happen especially with end-of-row markers in MySQL, I've noticed; it can also happen in Mac-to-PC handoffs). If all else fails, you could try exporting in a different format (maybe TSV) and see if that makes a difference when you add new rows.
Another reason can be the line endings used. Depending on the system and editor used to work with the CSV file, the line endings might get changed. For me, MySQL expected Unix line endings, but in my editor the line ending had been set to Mac OS 9 since I was using a Mac.
Changing it to Unix line endings worked.
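Normalizing line endings can also be scripted instead of done by hand in an editor. A minimal Python sketch of the conversion (the sample bytes are illustrative):

```python
def normalize_line_endings(data: bytes) -> bytes:
    """Convert Windows (CRLF) and classic Mac OS 9 (CR) line endings to
    Unix (LF). Order matters: replace CRLF first so its CR is not
    converted twice."""
    return data.replace(b"\r\n", b"\n").replace(b"\r", b"\n")

mac_csv = b"id,name\r1,Alice\r2,Bob\r"  # Mac OS 9 style endings
print(normalize_line_endings(mac_csv))
```

Running the exported CSV through something like this before importing avoids the editor round-trip entirely.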
I found that it might be due to a wrong encoding of the input file.
Using Notepad++, for example (or another similar editor), you need to change the file encoding to UTF-8.

Easiest way to continually import data to MySQL from a dbf file on my local computer

I have a problem that has been annoying me for quite some time now and a few days ago I started googling for a solution, but I haven't really gotten anything to work. I've read a little about something called SSIS, but I'm not sure it does what I'm looking for or if there is something else I should research in order to accomplish my goal. This is my problem:
My accounting program produces and updates a .dbf file with information about all vouchers and places it in a folder on my local computer. Our MySQL database must continually be updated with this information. So this is what I do twice a day:
Open the .dbf file in Excel
Save it as a .csv
Close Excel
Open the file in Notepad++
Convert the encoding to UTF-8
Save
Log in to MySQL
Go to the right table
Upload the .csv
Replace the old data with the new
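The Notepad++ steps (open, convert to UTF-8, save) are the easiest part to automate. A minimal Python sketch, assuming the CSV Excel produces is Windows-1252 encoded - adjust the source encoding if yours differs:

```python
def reencode_to_utf8(data: bytes, source_encoding: str = "cp1252") -> bytes:
    """Decode bytes from the assumed source encoding and re-encode as UTF-8.
    'cp1252' (Windows-1252) is a guess at what Excel emits; change it
    if your exports use a different encoding."""
    return data.decode(source_encoding).encode("utf-8")

# cp1252 bytes for "Müller" - the kind of content Excel typically writes
excel_csv = b"name\nM\xfcller\n"
print(reencode_to_utf8(excel_csv).decode("utf-8"))
```

A scheduled script doing this conversion, followed by a LOAD DATA statement, would replace most of the manual routine above.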
As this takes quite a bit of time, I feel that there must be better ways to do this. It would be great if I could have this scheduled to be done automatically or if there is some kind of an SQL query that could do this, because then I could use PHP to make a website that I could enter and have the query run when I press a button or something.
So my question is: What is the most simple way to continually get the info from the .dbf file into my SQL server?
There is a way to do your job on a schedule with DBF Commander Pro's command-line interface. Use the following command in a *.BAT file:
dbfcommander.exe -edb <dbf_file_name> <server_table_name> <connection_string>
After that, create a schedule for this BAT file using the Windows Task Scheduler.
The only remaining issue is that you need to clear the destination table in the MySQL database before the export process.
To try the export process in the app GUI, click 'File -> Export to DBMS'. In the window that appears, click the Build button to build the connection string: select the MS OLE DB Provider for MySQL Server, then choose your server from the list, provide a login and password, select a database, and click OK:
In the 'Export to DBMS' window, select the destination table you want to import the source DBF file into, then click Export. The command line you need can be found at the bottom of the window.
More info on importing and exporting DBF files to a database can be found here. Detailed command-line usage is here.
As you mention doing it in PHP - what is stopping you from doing it there?
You could create one connection using the VFP OLE DB provider to open the path location of the table, then open and read the table. Then have a second connection to your MySQL database open and ready to push the data there.
Then, for each row read from the VFP OLE DB connection result set, do whatever special cleansing you need to.
Then, query the MySQL connection to see whether the row is an existing entry or not, decide whether an insert or an update is necessary, and send the data accordingly.
Continue for the rest of the records from the VFP result set.
No need to open in Excel, save to CSV format, load yet another tool, etc...
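In outline, the read-check-upsert loop described above might look like the following sketch. It is written in Python with an in-memory SQLite database standing in for the MySQL connection, and the DBF side faked with a plain list, since reading a real .dbf needs an extra driver; the table and columns are hypothetical.

```python
import sqlite3

# Stand-in for rows read from the DBF / VFP OLE DB connection
dbf_rows = [(1, "Voucher A", 100.0), (2, "Voucher B", 250.0)]

conn = sqlite3.connect(":memory:")  # stand-in for the MySQL connection
conn.execute("CREATE TABLE vouchers (id INTEGER PRIMARY KEY, name TEXT, amount REAL)")

for row_id, name, amount in dbf_rows:
    # Check whether the entry already exists, then insert or update
    exists = conn.execute("SELECT 1 FROM vouchers WHERE id = ?", (row_id,)).fetchone()
    if exists:
        conn.execute("UPDATE vouchers SET name = ?, amount = ? WHERE id = ?",
                     (name, amount, row_id))
    else:
        conn.execute("INSERT INTO vouchers (id, name, amount) VALUES (?, ?, ?)",
                     (row_id, name, amount))
conn.commit()

print(conn.execute("SELECT COUNT(*) FROM vouchers").fetchone()[0])
```

Because it upserts row by row, re-running it after the accounting program updates the .dbf refreshes existing vouchers instead of requiring the table to be cleared first.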

Import MS Access to CSV on Mac programmatically

The Official Cohort Default Rates for Schools site has a link on the left to “Download Entire List.” That downloads an Access database file (.accdb). I'd like to have it in CSV (.csv) format.
This answer provides a Windows solution to import Access to MySQL, but ideally, I'd like to have a Unix command-line program, e.g., accdb2csv input.accdb output.csv. Is there anything like that? If not, how do I code that?
Here are some other links I've found:
http://jackcess.sourceforge.net/
https://github.com/akaihola/mdb2django
https://github.com/karlbennett/export-accessdb/blob/5b492778439c85f15d5c859a27094514f7aba8ee/src/main/java/org/youthnet/export/Smasher.java
https://github.com/Tomvb62/DBConvert/blob/dc67a3d835a9708320d29b8040ddc5cde7e7fa39/src/dbengine/export/MSAccess.java
I just released an access2csv program based on Jackcess. Code is at https://github.com/AccelerationNet/access2csv, and a binary is available at https://github.com/AccelerationNet/access2csv/releases.
You could try right-clicking and renaming the file from Aaron.accdb to Aaron.zip and then unzipping it - the Office 2007/2010 document formats (.docx, .xlsx, .pptx) are effectively zipped XML files, which you can then parse with Excel, XSL, etc.
Note, however, that .accdb files are binary Jet/ACE databases rather than zipped XML, so this trick does not apply to them; use a tool like the access2csv program above instead.