Explore database contents from .sql file - mysql

I inherited the maintenance of a small web forum. Near as I can tell, it is powered by a MySQL database on the backend (the frontend is all PHP).
I need to extract some of the data (which also involves searching for the data I need to extract), but I don't want to touch the production database. I exported a database backup, which produced a several-hundred-megabyte .sql file.
What's the best way to mine these data? I can see several options:
grep through the .sql script in text mode, trying to extract the relevant data
Load it up in sqlite3 (I tried doing this, but it barfed on some of the statements in the script and didn't produce any tables. I have no database experience whatsoever though, so I haven't ruled it out as a dead end just yet).
Install MySQL on my home box, create a database, and execute the .sql script to recreate the data. Then just attach some database explorer tool.
Find some (Linux) app which can understand the .sql file natively (seems unlikely after a bit of Googling).
Any pointers to which of these options (or one I haven't thought of yet) would be the most productive?

I would say any of these options could work, but for data mining you definitely want to load the dump into a new database so you can start querying the data and building reports on it. I would load it up on your home box. No need to have it remote.
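As a minimal sketch of that workflow, assuming MySQL is installed locally and the dump file is called forum_backup.sql (the file and database names here are just placeholders):

    # create an empty database and load the dump into it
    mysql -u root -p -e "CREATE DATABASE forum_archive"
    mysql -u root -p forum_archive < forum_backup.sql
    # then start exploring, e.g. list the tables
    mysql -u root -p forum_archive -e "SHOW TABLES"

From there you can point any MySQL client or GUI explorer at the forum_archive database.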

Related

exporting mysql table with bulk data in phpmyadmin

I have a MySQL table with a large amount of data. I need to export this table, with all its data, to another database. But when I try to export the table as an SQL file from phpMyAdmin, it shows this error:
The webpage at https://i.p.adress/domains/databases/phpMyAdmin/export.php might be temporarily down or it may have moved permanently to a new web address.
I tried exporting as CSV also, but the same error happens.
Does it happen because my table contains a large amount of data? Is there any other way to export this table with all its data?
I have around 1,346,641 records.
Does export work with a smaller database? Does it process for some time before showing that error or is it displayed as soon as you confirm your export options? I don't recall having seen that error message relating to large exports, but I may be remembering incorrectly. Any hints in your webserver or PHP error logs? Which phpMyAdmin version are you using?
Regarding large exports:
Because phpMyAdmin runs as a PHP script on your server, and also has to send the exported file to you as a download, a number of limitations are forced on it. Web servers are usually configured to keep PHP programs from running for very long, and a long download (or a long time spent processing the export) can hit those limits. Additionally, memory and other resources are often limited in a similar manner. Therefore, it's usually better to use some other means of exporting large databases.
The command-line utility mysqldump, provided with the MySQL server package, is the definitive standard. If you have command line/shell access, it's best to use mysqldump to export the .sql file(s) and then copy those through any normal file-transfer protocol (FTP, SCP, SSH, etc).
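For example (the user, server, database, and table names below are placeholders):

    # on the server: dump one table to a .sql file
    mysqldump -u dbuser -p mydatabase mytable > mytable_dump.sql
    # then, from your own machine, pull the file down over SCP
    scp dbuser@yourserver:mytable_dump.sql .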
That being said, phpMyAdmin has several enhancements and tweaks that can make this possible.
Use $cfg['SaveDir'] to enable writing the exported file to disk on the server, which you can then copy through any normal file-transfer protocol.
If you encounter timeouts or resource restrictions, you can edit the PHP configuration directives (the linked documentation refers to imports but the same restrictions apply to exports).
Experiment with the export compression setting; in particular, using an uncompressed export format (exporting to SQL directly rather than to a zipped archive) can work around some memory restrictions.

Sending .csv files to a database: MariaDB

I will preface this by saying I am very new to databases. I am working on a project for my undergraduate research that requires various sensor data to be sent from a Raspberry Pi, via the internet, to a database. I am using MariaDB at the moment, but am open to other options.
The background: Currently all sensor data is being saved in csv files on the RPi. There will be automation to send data at given intervals to the database.
The question: Am I able to add the file itself to a database? For our application, a CSV file is the most logical data storage format, and we simply want the database to be a way for us to retrieve data remotely, since the system will be installed miles away from where we work.
I have read about "LOAD DATA INFILE" on this website, but am unsure how it applies to this database. Would JSON be at all applicable for this? I am willing to learn if it makes the process more streamlined.
Thank you!
If 'sending data to the database' means that, by one means or another, additional or replacement CSV files are saved on disk, in a location accessible to a MariaDB client program, then you can load these into the database using the "mysql" command-line client and an appropriate script of SQL commands. That script very likely will make use of the LOAD DATA LOCAL INFILE command.
The "mysql" program may be launched in a variety of ways: 1) spawned by the process that receives the uploaded file; 2) launched by a cron job (Task Scheduler on Windows) that runs periodically to check for new or changed CSV files; of 3) launched by a daemon that continually monitors the disk for new or changed CSV files.
A CSV is typically human readable. I would work with that first before worrying about using JSON. Unless the CSVs are huge, you could probably open them up in a simple text editor to read their contents to get an idea of what the data looks like.
I'm not sure of your environment (feel free to elaborate), but you could just use whatever web services you have to read in the CSV directly and inject the data into your database.
You say that data is being sent using automation. How is it communicating to your web service?
What is your web service? (Is it php?)
Where is the database being hosted? (Is it in the same webservice?)

recovering a mysql database using a copy of the datafiles

The primary HDD of my computer died yesterday. I got a new one and restored it, but when I went to restore my MySQL databases I realized I had not done a proper backup in a while.
Nevertheless, I do have the original database files from my previous installation, as the data files were on the second HDD.
My question is, can I restore/create a new DB in the new machine using only the files from the previous installation?
Thanks,
Ignacio
Yes, you should be able to do this.
I'm a bit unclear: when you say "I do have the original database files", do you mean as a database? If so, go to phpMyAdmin for the surviving database and choose Export. Export the database as a text file. Open the file with your favorite text editor and change the name of the database to match the name you are importing into.
Create the empty database where you want the new one to be, if it doesn't already exist. Now go to phpMyAdmin for that database and import your text file. This should recreate all the tables and their data in the new location.
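For reference, the command-line equivalent of that Export/Import round trip (placeholder names; this assumes the surviving database is still reachable by a running MySQL server) is roughly:

    # dump the surviving database, then load it under the new name
    mysqldump -u dbuser -p old_db > old_db.sql
    mysql -u dbuser -p -e "CREATE DATABASE new_db"
    mysql -u dbuser -p new_db < old_db.sql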
If instead you only have the files that were the source of the data, then you need to use whatever tool was used to create the database in the first place. For example, if the data is in the form of an Excel spreadsheet, you would use MySQL for Excel.
Hope this helps!
(after your comment that you have file backups of WordPress site)
In that case, you may not be so lucky. If you used a tool to do your backup, it may have backed up the database, but in general the database is stored separately from your site's files. So if you just copied the files yourself, you won't have the database. But do look for any file with a .sql suffix, which would be a database backup.
Next, contact your hosting provider and see if they do periodic database backups. If so, you can recover from them.
For the future, see if your host provides automated backups (including the database). And read https://codex.wordpress.org/WordPress_Backups.

publishing Filemaker (database software) data on Wordpress website

We would like to be able to publish FileMaker data on our WordPress website. The website is up and running and the FileMaker database is set up. We do not need a live connection between the two systems, so we chose to export the FM data to .csv so we can import it into the MySQL database on the server, and from there we would like to display it on the website.
Now are my questions, since this kind of development is new to us:
Can I set up an automated import to the MySQL database from a source like Dropbox or something? For example, can we make the MySQL database import and overwrite the existing database every 24 hours from a .csv file located somewhere? (We need this automated overwrite option because the FM data changes often and we need up-to-date info on the website.)
How can we display the data from the mysql database on the WP frontend?
I've been looking into this myself and couldn't find any clear answers or guides. Can you guys point me in the right direction?
(btw, I know there are table plugins I can use for WP, but they do not fulfill our needs, and I think it's exciting to do it all by ourselves with help from this great community)
Update 01
I've successfully connected FM with my MySQL db using ODBC and can now select tables from the MySQL db in FM's relational graph.
I was wondering how I can write the data from my existing FM file to the MySQL db using ODBC. Can anybody help me with this?
I would like to display the data in some MySQL tables so I can fetch them using php on my website.
Thanks!
It is possible to write directly into (and read from) a remote MySQL database from FileMaker via ODBC.
You need a MySQL account that allows remote access; some hosting providers do not allow this.
On the local box the ODBC driver needs to be installed. On Windows you can use the open-source Connector/ODBC (http://dev.mysql.com/downloads/connector/odbc/); on Mac it works better with the Actual Technologies drivers (http://www.actualtech.com/de/product_opensourcedatabases.php).
Set up an ODBC system DSN (not a user DSN). Be sure to use the 32-bit ODBC manager on Windows.
Now you can create the external data source within FileMaker and read and write into MySQL tables.
Once you have made the connection to the MySQL database and you can see the shadow tables, you can write to the fields directly via FileMaker layouts. It's as simple as that.
Once a layout contains the fields from the MySQL database, you can move through records and find data, all as if the data were native to your FM database. Of course, for more automated processing, you can create scripts, relationships, etc. and manipulate/synchronise data. Be warned, though: the connection speed can limit complex relationships and large databases. I would advise 'baby steps'.

MySQL database manipulation program for Windows? Like MS Access or MS SQL?

Is there any program (preferably official) for Windows that can be used to manipulate MySQL data dumps?
For example, easily importing a MySQL text dump and creating the database for all kinds of manipulation (you know, common data operations such as select, update, insert, delete, export to CSV, etc.) via a GUI. Much like what you can do with MS Excel and MS Access.
The only tool I know is phpMyAdmin, which requires a local web server environment; that might be a little too much for what I need.
I thought http://dev.mysql.com/downloads/mysql/ was what I needed, and installed it only to find out that it's not.
Do any such tools exist? I ask because these MySQL dumps are for my users, who know nothing about SQL or anything technical. This is for them, not me. After they download the SQL I provide, they ask me: "How can I open it?"
I tried to provide them with CSV, but CSV generated by this approach: http://www.kavoir.com/2010/11/mysql-export-table-to-csv-text-files-for-excel.html would contain stuff like \" if the original data contains ". When you open the CSV in Excel, \" is all over the place.
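(The \" problem comes from MySQL's default backslash escaping; Excel instead expects embedded quotes to be doubled. One possible workaround, sketched here with placeholder database, table, and path names, is to set ESCAPED BY to the quote character so enclosed quotes come out as "". Note that INTO OUTFILE writes the file on the database server and requires the FILE privilege.)

    # export an Excel-friendly CSV: embedded quotes are doubled ("")
    # rather than backslash-escaped (\")
    mysql -u dbuser -p mydatabase <<'SQL'
    SELECT * FROM mytable
    INTO OUTFILE '/tmp/mytable.csv'
    FIELDS TERMINATED BY ',' ENCLOSED BY '"' ESCAPED BY '"'
    LINES TERMINATED BY '\r\n';
    SQL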
http://www.webyog.com/en/
I used to use SQLyog at my last job. It's a pretty decent GUI tool for interacting with MySQL, either local or remote. It costs $99 at the cheapest, but you can try it free for 30 days; if you like it and it makes life easier, it could well be worth the $99.
Running a local server is actually pretty easy. I use XAMPP, which was really easy to install and came set up and ready to use with phpMyAdmin. It's also really easy to shut the whole server (or just parts of it) down when it's not in use to conserve system resources.