I have a CSV file that has over 1 million rows. The file is roughly 37 MB, and phpMyAdmin says uploads can be up to 300 MB. I am trying to import this large file into a phpMyAdmin/MySQL database. It loads the file, then brings me to "[phpMyAdminroot]/import.php" with a blank page. Nothing happens after this, and no table is created.
I assume this is due to the huge number of rows.
I guess my question is: how can I get these rows into a new table in the database?
Edit 1
I am unable to access the database remotely; I can only reach it through the hosting service's phpMyAdmin.
I would recommend using MySQL Workbench; I've had a similar problem, and Workbench is able to import bigger files. As to why it doesn't work: I've also seen phpMyAdmin stop when the execution time runs too long. Maybe that's what is happening here and it's just not throwing an error?
I wrote an application called Excel2MySQL. It supports the Excel maximum sheet size, which is 1,048,576 rows and 16,384 columns. I am currently working on adding support for CSV import as well, so that it can handle many millions of records. Currently it imports directly from Excel and, more importantly, it will accurately import your data and handle table creation and field optimization.
Most modern hosting providers allow you to add your local IP address to an "allow access" list, which you can usually do yourself through something like cPanel. Then you can access your MySQL database over the standard TCP/IP port 3306. Keep in mind that your remote IP address may change from time to time, so you will occasionally have to add the new address to the access list. Here's one way to find your IP address. Here is my host provider's documentation on remote MySQL connections.
Check carefully with your MySQL database service provider. Once you've added your IP to the whitelist, you can access your MySQL server with other great tools like HeidiSQL, etc.
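Once you do have an external connection (or can get the file onto the server), LOAD DATA is usually far quicker and more reliable for a million-row CSV than phpMyAdmin's upload form. A minimal sketch, with made-up table, column and file names, run from a client that permits LOCAL INFILE (HeidiSQL, MySQL Workbench, the mysql CLI):

    -- Hypothetical target table for the CSV columns.
    CREATE TABLE my_import (
        id   INT NOT NULL,
        col1 VARCHAR(255),
        col2 VARCHAR(255)
    );

    -- Stream the CSV from the client machine into the new table.
    LOAD DATA LOCAL INFILE 'C:/data/myfile.csv'
    INTO TABLE my_import
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\n'
    IGNORE 1 LINES;  -- skip the header row, if the file has one

Note that LOCAL INFILE has to be enabled on both the server and the client, so check that first if the statement is rejected.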
Related
I have an Access database that is used by a VB6 application, and the whole thing is shared between two computers over a local network, one running Win 8 and the other Win 7. There is no internet involved in any way, nor should there be; that is in fact a requirement.
Sorry in advance; I have tried researching on the net, but time is really short and there is a lot of confusing material online.
I am creating a WPF app connected to a MySQL DB.
Now I have copied the Access file and imported the contents of the DB into MySQL.
Things are a real mess in the imported DB, so I am fixing it.
What I am confused about is how I am going to make it work there. Do I:
go and install MySQL there and do the whole process manually, repeating all the steps and changes I have made;
make a document that contains the code/script for all the changes I have made and run the data through it (and is there even a way to apply that as a whole in a single go); or
connect both databases together? I don't even know if this is possible.
Yes. In place of a simple "file share" of the Access file, you are now going to run some kind of SQL server system, in this case MySQL, but it could be PostgreSQL or any kind of "server" database.
That instance of the SQL server thus has to be installed and set up, and you must ensure that the "box" running MySQL also allows external connections (by default the computer's firewall settings often prevent this).
At that point, 2 or 10 different computers on that same network can simply connect to the SQL server. The code is of course going to be VERY similar. You almost certainly used the OleDb provider with Access. However, you can use the ODBC provider, or even the provider from MySQL. Those providers mean you change the connection object, data reader object, etc. However, the "base" .NET types such as DataRow, DataTable, or DataSet can remain as before (so you only change the provider). If you have a lot of code based on OleDb, then you could well consider continuing to use that OleDb provider code in .NET, in which case you only change the connection strings to point to MySQL.
If you don't have a lot of code, then by all means adopt the MySQL provider for .NET. But as noted, the smallest amount of change would be to continue using an OLE DB provider for MySQL, which would mean the least amount of code to be changed.
As for the MS Access data migration? Well, it's not clear what tools you are using or how you are doing that now. But once you transfer the data to the MySQL server (assuming you have installed and set up MySQL to run on one computer), it is a simple matter to point the .NET connection(s) in your code to MySQL instead of Access. As a result, most if not all of your code logic for working with the tables can remain as before; but as noted, you have to swap out the provider parts in .NET.
Now, if you're REALLY lucky and the .NET code used the ODBC provider, then all you have to do is change your connection strings. Of course, "some" SQL syntax in your code may have to be tweaked, since Oracle, MS SQL Server, PostgreSQL, and MySQL all have features and syntax that differ; this is especially true for date/time calculations, DateDiff(), etc. But the general SQL you have in your .NET code should continue to run mostly unchanged against MySQL data tables.
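To give one concrete (hypothetical) example of the kind of tweak meant here, an Access/VB-style DateDiff expression would typically be rewritten with MySQL's DATEDIFF or TIMESTAMPDIFF:

    -- Access / VB-style expression in a query:
    --   DateDiff("d", [OrderDate], [ShipDate])
    -- Rough MySQL equivalent (table and column names are made up):
    SELECT DATEDIFF(ShipDate, OrderDate) AS days_to_ship
    FROM Orders;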
As for how to migrate the data? I think a really good tool is, of course, MS Access itself. What you do is get MySQL up and running, then use MS Access to open that database and add linked tables from MS Access to the MySQL tables.
At that point, you can run append queries from Access to move/send the data to MySQL. It really depends on how many tables, and how many related tables, are in that database. The more complex the database and the greater the number of related tables in Access, the more challenging it is to move such data up to MySQL.
Transferring Excel, or a small or even a big table, is a breeze (again, use MS Access and link to the tables on the SQL server). However, things can become messy if you have, say, 25 tables that are all related, many with cascade deletes and enforced parent-to-child relationships. So the more tables, and especially the more related data tables, the more work such a data migration task will become.
I think MS Access is a really good tool here, since if you set up a connection to MySQL, you can execute a TransferDatabase command in Access to send one table up to MySQL, and all the columns and data types for those columns will be created for you automatically. So not only can Access transfer the data, but, more valuably, it has the ability to create the target tables on MySQL for you, and that will save you a large amount of time building and setting up the tables on MySQL.
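As a rough sketch of the append-query step, it is just an INSERT ... SELECT run from the Access side against the linked MySQL table. All names here are hypothetical; tblOrders is the local Access table and tblOrders_mysql the ODBC-linked table:

    -- Append query in Access SQL: push local rows into the linked MySQL table.
    INSERT INTO tblOrders_mysql (OrderID, CustomerID, OrderDate)
    SELECT OrderID, CustomerID, OrderDate
    FROM tblOrders;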
I have a MySQL table with a large amount of data. I need to export this table, with all its data, to another database. But when I try to export the table as an SQL file from phpMyAdmin, it shows this error:
The webpage at https://i.p.adress/domains/databases/phpMyAdmin/export.php might be temporarily down or it may have moved permanently to a new web address.
I tried exporting as CSV also, but the same error happens.
Does it happen because my table contains a large amount of data? Is there any other way to export this table with all the data?
I have around 1346641 records.
Does export work with a smaller database? Does it process for some time before showing that error or is it displayed as soon as you confirm your export options? I don't recall having seen that error message relating to large exports, but I may be remembering incorrectly. Any hints in your webserver or PHP error logs? Which phpMyAdmin version are you using?
Regarding large exports:
Because phpMyAdmin runs as a PHP script on your server and has to send the exported file to you as a download, there are a number of limitations forced on it. Web servers are usually configured to keep PHP programs from running for very long, and a long download (or a long time spent processing the export) can be affected. Additionally, memory and other resources are often limited in a similar manner. Therefore, it's usually better to use some other means of exporting large databases.
The command-line utility mysqldump, provided with the MySQL server package, is the definitive standard. If you have command line/shell access, it's best to use mysqldump to export the .sql file(s) and then copy those through any normal file-transfer protocol (FTP, SCP, SSH, etc).
That being said, phpMyAdmin has several enhancements and tweaks that can make this possible.
Use $cfg['SaveDir'] to enable writing the exported file to disk on the server, which you can then copy through any normal file transfer protocol.
If you encounter timeouts or resource restrictions, you can edit the PHP configuration directives (the linked documentation refers to imports but the same restrictions apply to exports).
Experiment with the export compression setting; in particular, using an uncompressed export format (exporting to SQL directly rather than to a zipped archive) can work around some memory restrictions.
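If you have no shell access at all but your account happens to have the FILE privilege, one further fallback (separate from the phpMyAdmin options above, and only a sketch with made-up names) is to have the server itself write the rows out with SELECT ... INTO OUTFILE, then download the resulting file:

    -- Writes to a path on the *database server*, which must be writable
    -- by the MySQL daemon and not already exist; secure_file_priv may
    -- restrict which directory is allowed.
    SELECT *
    FROM big_table
    INTO OUTFILE '/tmp/big_table_export.csv'
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\n';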
We would like to be able to publish FileMaker data on our WordPress website. The website is up and running and the FileMaker database is set up. We do not need a live connection between the two systems, so we chose to export the FM data to .csv so we can import it into the MySQL database on the server, and from there we would like to display it on the website.
Now, here are my questions, since this kind of development is new to us:
Can I set up an automated import into the MySQL database from a source like Dropbox or something? For example, can we make the MySQL database import and overwrite the existing data every 24 hours from a .csv file located somewhere? (We need this automated overwrite option because the FM data changes often and we need up-to-date info on the website.)
How can we display the data from the MySQL database on the WP frontend?
I've been looking into this myself and couldn't find any clear answers or guides. Can you guys point me in the right direction?
(Btw, I know there are table plugins I can use for WP, but they do not fulfill our needs, and I think it's exciting to do it all by ourselves with help from this great community.)
Update 01
I've successfully connected FM with my MySQL db using ODBC and can now select tables from the MySQL db in FM's relational graph.
I was wondering how I can write the data from my existing FM file to the MySQL db using ODBC. Can anybody help me with this?
I would like to display the data in some MySQL tables so I can fetch them using php on my website.
Thanks!
It is possible to write directly into (and read from) a remote MySQL database from FileMaker via ODBC.
You need a MySQL account that allows remote access. There are providers where this is not allowed.
On the local box, the ODBC driver needs to be installed. On Windows you can use the open-source version (http://dev.mysql.com/downloads/connector/odbc/); on Mac it works better with the Actual Tech drivers (http://www.actualtech.com/de/product_opensourcedatabases.php).
An ODBC system DSN (not a user DSN) is set up. Be sure to use the 32-bit ODBC manager on Windows.
Now you can create the external data source within FileMaker and read and write into MySQL tables.
Once you have made the connection to the MySQL database, and you can see the shadow tables, you can write to the fields directly via FileMaker layouts. It's as simple as that.
Once the layout contains the fields from the MySQL database, you can move through records and find things just as if the data were native to your FM database. Of course, for more automated processing, you can create scripts, relationships, etc. and manipulate/synchronise data. Be warned, though: the connection speed can be a limit with complex relationships and large databases. I would advise 'baby steps'.
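Regarding the scheduled overwrite asked about in the question: the ODBC route above gives you a live write path from FileMaker, but if you stay with the 24-hour CSV approach, the MySQL side of such a refresh could look roughly like this. Table name and server path are hypothetical; the CSV would first be copied onto the database server by a cron job or similar, since LOAD DATA cannot run inside a MySQL scheduled EVENT:

    -- Replace yesterday's data wholesale, then reload from the fresh CSV.
    TRUNCATE TABLE fm_export;

    LOAD DATA INFILE '/var/imports/fm_export.csv'
    INTO TABLE fm_export
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\n'
    IGNORE 1 LINES;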
I have databases on my local system and I have also put a database on the web server, so when I update the data in my local database I then have to replace or add that data in the web database.
But the problem is that I am frequently changing some specific records in the database for testing purposes.
So I want some mechanism that can be used to export specific records to an SQL file with insert statements.
Suppose I have made changes in table tbl1 and added 10 records to it.
Right now I am manually adding the records or replacing the whole table in the web database.
So is there any mechanism in MySQL or in Workbench with which I can export specific records?
Any help with that would be appreciated.
The only automatic solution is to use replication, but that is probably not a good solution for your scenario. So what remains is some manual process. Here are some ideas:
Write a script that writes specific records into a dump file, then use a different script to load this dump file into your target server.
If you frequently change the same records, you could create a script with insert statements that you edit for each new value and run against both your local and your remote (web) server.
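A sketch of what that second idea could look like in practice (table and column names are made up): a small .sql file of insert statements that you edit by hand and then run against both servers. Using ON DUPLICATE KEY UPDATE makes it safe to run repeatedly:

    -- sync_tbl1.sql: re-runnable against both the local and the web database.
    INSERT INTO tbl1 (id, name, price)
    VALUES
        (101, 'Widget A', 9.99),
        (102, 'Widget B', 14.50)
    ON DUPLICATE KEY UPDATE
        name  = VALUES(name),
        price = VALUES(price);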
I've got a 90MB .csv extract I need to push into an Access 2007 DB.
AS A TEST - If I do a straight import of the file with default settings into a new table I end up with a DB of 134MB.
WHAT I NEED TO DO - The .csv extract contains a couple of columns I need to process before pushing them into the Access DB. To accomplish this I am using SSIS (from a SQL Server 2008 install) with a couple of derived columns to hold the processed values, which then all get pushed into an existing Access table (which has no rows at the start of the process) via an ADO.NET Connection using the following connection string "Data Source=C:\Import\InTheGarden.accdb;Provider=Microsoft.ACE.OLEDB.12.0;". (Connection String from Connection in Connection Manager in SSIS)
When I process the data using SSIS, I end up with a file of 1.16 GB, which when compacted comes down to approximately 180 MB!!!
Two things:
Firstly, I don't understand what is causing the bloat and how I might get around it. I've read I don't know how many articles today on "Access 2007 bloat" and can't seem to find anything that explains what exactly is going on :(
Secondly, whilst I can import the .csv file above, I have another text file of 154 MB which needs importing, and given that the smaller file caused the DB to bloat to 1.16 GB on import, I'm not holding out much hope of this bigger file not exceeding the 2 GB limit!
I would normally use SQL Server as a back end but there are reasons beyond my control as to why this can't happen:(
Happy to provide further information; Access is not something I use a lot, so there may be key information I've missed!! :(
Thx!
Why not use a staging table in Access to do the import, process the data, and then insert into the real table? You'd use a separate temp database for that so as to avoid the bloat of importing twice. I've done dozens of imports into Access and this has always been the way I do it when I need to process the data before the final append.
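A minimal sketch of that staging approach in Access SQL (table and column names are invented; the staging table would live in the separate temp database, linked into the main one): the raw import fills StagingImport, and a single append query then does the clean-up on the way into the real table.

    -- Append from the staging table into the real table, processing the
    -- two awkward columns inline instead of in SSIS derived columns.
    INSERT INTO tblGarden (PlantName, PlantedOn)
    SELECT Trim(RawPlantName), CDate(RawPlantedOn)
    FROM StagingImport;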
Try breaking the CSV into several smaller files and running Access' Compact Database command after importing each one.