BigDump - UNEXPECTED: Can't set file pointer behind the end of file - mysql

While trying to start uploading a 3.9 GB SQL file via BigDump I get the error
UNEXPECTED: Can't set file pointer behind the end of file
The dump was exported from phpMyAdmin and the file is not corrupted. What is the problem? What are other ways to import such a big database?

BigDump uses an INSERT INTO table VALUES (...) kind of method.
This is a very slow way of inserting!
Use
LOAD DATA INFILE 'c:/filename.csv' INTO TABLE table1
instead. Note the use of forward slashes even on Windows.
See: http://dev.mysql.com/doc/refman/5.1/en/load-data.html
This is the fastest way possible to insert data into a MySQL table.
It will only work if the input file is on the same server as the MySQL server though.
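Spelled out with the usual options, the statement might look like the sketch below; the delimiters and the header-skipping line are assumptions, so adjust them to match how your CSV was actually exported:
LOAD DATA INFILE 'c:/filename.csv'
INTO TABLE table1
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;   -- skips a header row, if the file has one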

I got a similar error: I can't seek in the .sql file.
The reason for this error is that BigDump tries to set the file pointer to the end of the .sql file to find out its size (using the fseek() and ftell() functions). Since fseek() fails on files over 2 GB (2 GB is the limit of a signed 32-bit file offset), you get this error. The solution is to split your SQL file into chunks of 1.5-2 GB...

Related

import sql file as replace with phpmyadmin

How can I import an SQL file (yes, SQL, not CSV) with phpMyAdmin so that it replaces or updates the data while importing?
I did not find an option for that. I also created another temporary database where I imported the SQL file in question (having only INSERT lines, only data, no structure), and then went to export, looking for a suitable option like INSERT ... SELECT ... ON DUPLICATE KEY UPDATE, but did not find one or anything else that would help in this situation.
So how can I achieve that? If not with phpMyAdmin, is there a program that transforms an "insert" SQL file into "update on duplicate", or even from "insert" to "delete", after which I could re-import with the original file?
How I came to this, in case it helps with the above or someone has better solutions to the previous steps:
I have a semi-large (1 GB) DB file to import, which I then divided into multiple smaller files to get it imported. One of them is the structure SQL dump and the rest are the data. When still trying to get the large file through, adjusting timeout settings through .htaccess or phpMyAdmin import options did not help; I always got the timeout anyway. Since those did not work, I found a program by Janos Rusiczki (https://rusiczki.net/2007/01/24/sql-dump-file-splitter/) to split the SQL file into smaller ones (good program, thanks Janos!). It also separated the structure from the data.
However, after 8 successful imports I got a timeout again, after phpMyAdmin had already imported part of the file. Thus I ended up in the current situation. I know I can always delete everything and start over with even smaller partial files, but.. I am sure there is a better way to do this. There has to be a way to replace the data on import, or some other approach as described above.
Thanks for any help! :)
Cribbing from INSERT ... ON DUPLICATE KEY (do nothing), you can use a regular expression to make every INSERT into an INSERT IGNORE in your sql file and it will pass over all the entries that have already been imported.
Note that this will also ignore other kinds of errors, but errors other than timeouts don't seem likely in this context.
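For instance (using a hypothetical table name), the search-and-replace turns each dumped line like the first one below into the second, so rows whose primary or unique key already exists are silently skipped instead of aborting the import:
INSERT INTO orders VALUES (1, 'foo'), (2, 'bar');
INSERT IGNORE INTO orders VALUES (1, 'foo'), (2, 'bar');
If you want the incoming rows to overwrite the existing ones rather than be skipped, the same kind of replacement can turn INSERT INTO into REPLACE INTO, which deletes the conflicting row and inserts the new one.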

sqlite load data infile Syntax Error

I'm working on importing a very large CSV file into SQLite. My understanding is that LOAD DATA INFILE is my best bet. I've created a table for it to reside in, and am attempting to execute the following query
LOAD DATA LOCAL INFILE 'F:/Downloads/NielsonReport.csv'
INTO TABLE neilson;
IGNORE 1 LINES
but, I get the following error:
Error while executing SQL query on database 'test': near "LOAD": syntax error
I seem to be getting an error along these lines regardless of what I'm trying to execute.
I feel like I'm missing something very basic, and would appreciate any help resolving this problem (I've been referencing this page for information so far)
When you are using SQLite, it would be a good idea to reference the SQLite documentation instead; LOAD DATA INFILE is MySQL syntax, which is why SQLite reports a syntax error at "LOAD".
Anyway, SQLite itself does not have a CSV import statement, but the sqlite3 command-line shell allows you to import CSV files with the .import command.
Use the .import command like this, switching the shell into CSV mode first:
.mode csv
.import '/Users/haseeb/Desktop/names_data.txt' Names
Recent versions of the shell also accept a --skip 1 option on .import to ignore a header row.

Load data local infile problems in Filemaker

I've got a script to load a text file into an Amazon RDS MySQL database. It needs to handle a variable number of columns from the text file and store them as JSON. Here's an example with 5 columns that get stored as JSON:
LOAD DATA LOCAL INFILE '/Applications/FileMaker Pro 14/containers/imports/load1.txt' INTO TABLE jdataholder (rowdatetime, @v1, @v2, @v3, @v4, @v5) SET loadno = 1, formatno = 1, jdata = JSON_OBJECT('Site', @v1, 'Nutrients', @v2, 'Dissolved_O2', @v3, 'Turbitidy', @v4, 'Nitrogen', @v5);
local_infile is 'ON' on the server. The query works in Sequel Pro. When I try to run it in FileMaker Pro 14's Execute SQL script step (running on OS X 10.12), it doesn't work. I know the connection to the server is working because I can run other queries that don't use the LOAD DATA LOCAL INFILE statement.
The error message I get says:
ODBC ERROR: [Actual][MySQL] Load data local infile forbidden
From other answers on SO and elsewhere it seems like the client also needs to have local_infile enabled. This would explain why it works in one client and not the other. I've tried to do this, but the instructions I've seen all use the terminal. I don't think FileMaker has anything like that: you just enter SQL into a query editor and send it to a remote database. I'm not sure how, or even if, you can change the configuration of the client.
Does anyone know how to enable this in FileMaker? Or is there something else I can do to make this work?
Could I avoid this if I ran the LOAD DATA LOCAL INFILE query from a stored procedure? That was my original plan, but the documentation says that LOAD DATA INFILE has to use literal strings, not variables, so I couldn't think of a way to handle a variable number of columns.
Thanks.
It's possible that FileMaker's Execute SQL step does not support this. I would suggest using a plugin that can run terminal commands, then use that plugin to perform the action through the terminal from FileMaker. There's a free plugin with a function for that, called BaseElements. Below is a link to the documentation on this specific function.
https://baseelementsplugin.zendesk.com/hc/en-us/articles/205350547-BE-ExecuteShellCommand
If you are trying to insert data into a MySQL table, a better way is to use FileMaker ESS (External SQL Sources), which allows you to work with MySQL, Oracle and other supported databases inside FileMaker. This includes being able to import data into the ESS (MySQL) table. You can see a PDF document on ESS below:
https://www.filemaker.com/downloads/documentation/techbrief_intro_ess.pdf

Rename file within MySQL Stored Procedure

I'm trying to back up some of the data stored in a big table with a
SELECT ... INTO OUTFILE
statement.
The output file is on a network hard drive, so if the network connection breaks during the dump (it takes a minute, more or less) I end up with a partial file on the network drive, and I'd like to mark such a file as "wrong".
Is there an SQL command I can issue inside my MySQL stored procedure that lets me rename such a file?
Thank you very much
Best
cghersi
You can't rename a file from within MySQL, but you could use two files for the dump and rotate them, switching only when the operation was successful.
Example:
You dump to 'a.csv'; if the operation was successful, use 'b.csv' for the next dump, otherwise use 'a.csv' again. And so on...
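A minimal sketch of that idea as a stored procedure (the table name big_table and the output paths are assumptions); because the file name in SELECT ... INTO OUTFILE has to be a literal, the statement is built as a string and run through PREPARE:
DELIMITER //
CREATE PROCEDURE dump_rotated(IN use_b BOOLEAN)
BEGIN
  -- Build the statement with the chosen target file, then execute it.
  SET @sql = CONCAT(
    'SELECT * FROM big_table INTO OUTFILE ',
    QUOTE(IF(use_b, '/mnt/nas/dump_b.csv', '/mnt/nas/dump_a.csv')),
    " FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'");
  PREPARE stmt FROM @sql;
  EXECUTE stmt;
  DEALLOCATE PREPARE stmt;
END //
DELIMITER ;
Note that INTO OUTFILE refuses to write over an existing file, so a leftover partial file still has to be deleted outside of MySQL before that name can be reused.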

Upload mysql database in chunks

I am trying to upload a 32 MB MySQL database into a pre-existing database, but phpMyAdmin on my shared hosting has a 10 MB limit... I have tried zipping it up, but when the server unzips the database, the uncompressed file is too large for the server to handle.
Is it possible to split the database up and upload it by pasting it in parts as an SQL query? I assume I would need each chunk to have something at the start of it which says
"Import this data into the pre-existing tables in the database"
What would this be?
At the moment there are a few hundred lines saying things like "CREATE" and "INSERT INTO".
You might try connecting to the database remotely with MySQL Workbench or the command-line mysql client. If you can do that, you can run:
source c:/path/to/your/file.sql
and you won't be constrained by phpMyAdmin's upload size restrictions. Most shared hosting I've seen allows it. If not, you may just need to grant permissions for the user@host in phpMyAdmin (or whatever the interface is).
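If you do have to adjust the account yourself, the grant would look roughly like the hypothetical sketch below (user name, password, database and client IP are all placeholders, and on shared hosting this is usually done through the control panel's remote-access page instead):
-- Hypothetical example (MySQL 5.x-style syntax): let the account connect
-- from your own IP so the mysql client or Workbench can reach the server.
GRANT ALL PRIVILEGES ON mydb.* TO 'appuser'@'203.0.113.7' IDENTIFIED BY 'secret';
FLUSH PRIVILEGES;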
The dump file created by mysqldump is just a set of SQL statements that will rebuild your tables.
To load it in chunks I'd recommend either dumping it out in sets of tables and loading them one by one, or, if required, splitting the existing file by hand; the dump should be roughly in this (pseudo) format:
-- set things up ready for loading (the SET statements at the top of the dump)
CREATE TABLE t1 (...);
INSERT INTO t1 VALUES (...);
INSERT INTO t1 VALUES (...);
CREATE TABLE t2 (...);
INSERT INTO t2 VALUES (...);
INSERT INTO t2 VALUES (...);
-- finalise stuff after loading (the SET statements at the bottom of the dump)
You can manually split the file up by keeping the commands at the start and finish and just choosing blocks for individual tables by looking for their CREATE TABLE statements.
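Since your tables already exist, each pasted chunk only needs an INSERT block plus a short header and footer. A hypothetical sketch is below; the option values are assumptions, so copy whatever SET statements actually appear at the top and bottom of your own dump:
-- start of chunk: mirror the settings from the top of the original dump
SET NAMES utf8;
SET FOREIGN_KEY_CHECKS = 0;   -- lets chunks be loaded in any order

INSERT INTO t1 VALUES (...);  -- the block of INSERTs copied from the dump
INSERT INTO t1 VALUES (...);

-- end of chunk: restore the settings from the bottom of the dump
SET FOREIGN_KEY_CHECKS = 1;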