How to insert bulk records - MySQL

I am working on API development using AWS API Gateway and Lambda.
I am using the Serverless MySQL package https://www.npmjs.com/package/serverless-mysql for the MySQL connection and operations,
but I am unable to insert multiple records. If I pass an array of records to the insert query, it only inserts a single record.
Please suggest how I can insert multiple records without using a loop.
values = [
  ["229", 25, "objective", ["49"], "2019-07-24 08:59:39", "2019-07-24 08:59:39"],
  ["229", 26, "descriptive", ["Yes i have long term illness that limits my daily activities. Test..."], "2019-07-24 08:59:39", "2019-07-24 08:59:39"]
];
var sql = 'INSERT INTO `answers` (`user_id`, `question_id`, `question_type`, `answer`, `created_at`, `updated_at`) VALUES (?)';
await connection.query(sql, values);

I haven't used this package before, but just going through the documentation, it doesn't seem to provide any additional capability for batch inserts. So I think you still need to compose the query the way you normally would for a MySQL batch insert:
INSERT INTO table_name (field1,field2,field3) VALUES(1,2,3),(4,5,6),(7,8,9);
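Since serverless-mysql passes your SQL and values through the underlying mysql package, one way to compose that multi-row statement without building the string by hand may be the nested-array expansion that package supports (VALUES ? with the rows wrapped in one extra array). A sketch, hedged because I haven't run it against this package, using the table and columns from the question; note that each row has to be a flat array of column values, so the answer field is passed as a plain string here:
const sql = 'INSERT INTO `answers` (`user_id`, `question_id`, `question_type`, `answer`, `created_at`, `updated_at`) VALUES ?';

// One inner array per row; a nested array inside a row gets expanded as its own
// parenthesised group, so the answer value is flattened to a string before building the rows.
const values = [
  ['229', 25, 'objective', '49', '2019-07-24 08:59:39', '2019-07-24 08:59:39'],
  ['229', 26, 'descriptive', 'Yes i have long term illness that limits my daily activities. Test...', '2019-07-24 08:59:39', '2019-07-24 08:59:39']
];

// `VALUES ?` (no parentheses) plus [values] makes the escaper emit (...), (...), one group per row.
await connection.query(sql, [values]);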

Batch mode is not available in this package. So if you want to avoid a loop, one option is to compose the query yourself, as shown in the other answer:
INSERT INTO table_name (field1,field2,field3) VALUES(1,2,3),(4,5,6);
But a better way is to create a separate Lambda function that inserts the values passed to it sequentially. That will give you more flexibility over how the values are inserted.
https://docs.aws.amazon.com/cli/latest/reference/glue/create-user-defined-function.html

Related

Insert multiple VALUES into table

I am trying to insert multiple VALUES into a table using the Fat-Free Framework's SQL mapper.
Docs
The problem is the documentation only shows how to do that for one VALUE:
$db->exec('INSERT INTO mytable VALUES(?,?)',array(1=>5,2=>'Jim'))
As I have a lot of records and need to speed it up, I wanted to add multiple
VALUES, as in VALUES(?,?),(?,?),(?,?);
But how does the array have to look then?
Background: I am trying to speed up the import this way because I parse big 100k+ CSV files and import them.
The syntax to do that is:
$db->exec("INSERT INTO `table` (`col1`,`col2`) VALUES ('val1','val2'), ('val1','val2'), ('val1', 'val2')");
You definitely want to use prepared statements. I recommend first generating the placeholder string
VALUES (:q1, :q2), (:q3, :q4), (:q5, :q6)
and then generating the bindings
[
':q1' => $data['val1'],
':q2' => $data['val2'],
':q3' => $data['val3'],
':q4' => $data['val4'],
//...
],
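To build both the placeholder string and the bindings from your parsed CSV rows, something along these lines should work (a sketch; $rows and the table/column names are assumptions, and for 100k+ rows you will want to chunk the rows into batches of a few hundred per statement):
// Sketch: $rows is assumed to be an array of rows, each an array with 'col1' and 'col2'.
$placeholders = [];
$bindings = [];
$i = 1;
foreach ($rows as $row) {
    $placeholders[] = "(:q{$i}, :q" . ($i + 1) . ")";
    $bindings[":q{$i}"]        = $row['col1'];
    $bindings[':q' . ($i + 1)] = $row['col2'];
    $i += 2;
}
$db->exec(
    'INSERT INTO `mytable` (`col1`, `col2`) VALUES ' . implode(', ', $placeholders),
    $bindings
);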

Performance of batched statements using INSERT ... SET vs INSERT ... VALUES

I recently wrote a simple Java program that processed some data and inserted it in a MyISAM table. About 35000 rows had to be inserted. I wrote the INSERT statement using INSERT ... SET syntax and executed it for all rows with PreparedStatement.executeBatch(). So:
String sql = "INSERT INTO my_table"
+ " SET "
+ " my_column_1 = ? "
+ " my_column_2 = ? "
...
+ " my_column_n = ? ";
try(PreparedStatement pst = con.prepareStatement(sql)){
    for(Object o : someCollection){
        pst.setInt(1, ...);
        pst.setInt(2, ...);
        ...
        pst.setInt(n, ...);
        pst.addBatch();
    }
    pst.executeBatch();
}
I tried inserting all rows in a single batch and in batches of 1000, but in all cases the execution was VERY slow (about 1 minute per 1000 rows). After some tinkering I found that changing the syntax to INSERT ... VALUES improved the speed dramatically, 100x at the very least (I didn't measure it accurately).
String sql = "INSERT INTO my_table (my_column_1, my_column_2, ... , my_column_n)"
+ " VALUES (?, ?, ... , ?)";
What's going on here? Can it be that the JDBC driver cannot rewrite the batches when using INSERT ... SET? I didn't find any documentation about this. I am creating my connections with options rewriteBatchedStatements=true&useServerPrepStmts=false.
I first noticed this problem when accessing a database in another host. That is, I have used the INSERT ... SET approach before without any noticeable performance issue in applications that were executing in the same host as the database. So I guess the problem may be that many more statements are sent over the network with INSERT ... SET than with INSERT ... VALUES.
If you examine the INSERT ... SET syntax, you'll see it's only meant for inserting a single row. INSERT ... VALUES is meant for inserting multiple rows at one time.
In other words - even though you set rewriteBatchedStatements=true, the JDBC driver can't optimize the SET variation like it can with the VALUES variation because SET is not built for the batch case you have. Use VALUES to compress N inserts into one.
Bonus tip - If you use ON DUPLICATE KEY UPDATE, the JDBC driver currently can't rewrite those statements either. (edit: This statement is false - my mistake.)
There's an option you can set to verify all of this for yourself (I think it's 'profileSQL').
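If it helps, a connection URL carrying those options might look like the sketch below (host, schema, and credentials are placeholders, not from the question):
String url = "jdbc:mysql://dbhost:3306/mydb"
        + "?rewriteBatchedStatements=true"   // lets the driver collapse the VALUES batch into one statement
        + "&useServerPrepStmts=false"
        + "&profileSQL=true";                // logs the statements actually sent, so you can verify the rewrite
try (Connection con = DriverManager.getConnection(url, "user", "password")) {
    // prepare the INSERT ... VALUES (?, ?, ..., ?) statement and addBatch()/executeBatch() as above
}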

Substitute for INSERT query for more than 200,000 records in a MySQL db table

I have to insert more than 200,000 records in one go into a MySQL db table, and the INSERT query is causing a performance issue. What could be a substitute for this?
Below is the code I am using
$xml = simplexml_load_file("247electrical.xml");
foreach($xml->merchant as $merchant){
define('API', 'PS');
require_once('constants.inc.php');
require_once('classes/class.ClientFactory.php');
$oClient = ClientFactory::getClient(API_USERNAME, API_PASSWORD, API_USER_TYPE); $merchattrs=$merchant->attributes();
$aParams100 = array('iMerchantId' => array($merchattrs->id)); $merchantinfo= $oClient->call('getMerchant', $aParams100);
//Get Products
foreach($xml->merchant->prod as $product){
$attrs=$product->attributes();
//Insert Products into DB
mysql_query('INSERT INTO productstemp (merchant_id, merchant_name, aw_product_id, merchant_product_id, product_name, description, category_id, merchant_category, aw_deep_link, aw_image_url, search_price, delivery_cost, merchant_image_url, aw_thumb_url, brand_name, delivery_time, display_price, in_stock, merchant_thumb_url, model_number, pre_order, stock_quantity, store_price, valid_from, valid_to, web_offer, merchantimage, cleancompany) VALUES("'.$merchattrs->id.'","'.$merchattrs->name.'","'.$attrs->id.'"," ","'.$product->text->name.'","'.$product->text->desc.'","'.$product->cat->awCatId.'","'.$product->cat->mCat.'","'.$product->uri->awTrack.'","'.$product->uri->awImage.'","'.$product->price->buynow.'","'.$product->price->delivery.'","'.$product->uri->mImage.'","'.$product->uri->awThumb.'","'.$product->brand->brandName.'","'.$product->delTime.'","'.$product->price->buynow.'","'.$attrs->in_stock.'","'.$product->uri->mThumb.'","'.$product->modelNumber.'","'.$attrs->pre_order.'","'.$attrs->stock_quantity.'","'.$product->price->store.'","'.$product->valFrom.'","'.$product->valTo.'","'.$attrs->web_offer.'","'.$merchantinfo->oMerchant->sLogoUrl.'","247electrical" ) ')
or die(mysql_error());
}
}
Thanks
I don't think that the INSERT queries per se are the problem. 200,000 inserts aren't that much for MySQL after all.
First, I guess reading the file is slow. SimpleXML is convenient, but for large files it results in a huge memory overhead. Think about a streaming XML reader like PHP's XMLReader, as in the sketch below.
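A rough sketch of that (element names taken from your code above; untested):
$reader = new XMLReader();
$reader->open('247electrical.xml');
$doc = new DOMDocument();
while ($reader->read()) {
    if ($reader->nodeType === XMLReader::ELEMENT && $reader->name === 'prod') {
        // Expand only the current <prod> node instead of keeping the whole file in memory.
        $product = simplexml_import_dom($doc->importNode($reader->expand(), true));
        // ... collect this product's values for the bulk INSERT described below ...
    }
}
$reader->close();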
You are sending individual statements to the MySQL server, which is way slower than sending one huge statement. Also, your insert statements should be wrapped in a transaction. What happens if you have processed and inserted 10,000 records and then your script dies, the MySQL server dies, etc.? How do you safely start the script again without manual work (clearing the table, looking up which records were already processed, etc.)?
Apart from that, one single INSERT statement with many VALUES should be way faster. I would make your PHP script output the query so it looks in the end like this:
INSERT INTO table(field_1, field_2, field_3)
VALUES('foo 1', 'bar 1', 'baz 1'),
      ('foo 2', 'bar 2', 'baz 2'),
      ...
And then import that file via:
$ mysql ... credentials options etc ... < output.sql
If that's still too slow… buying more hardware might help, too.

Perl DBI insert multiple rows using mysql native multiple insert ability

Has anyone seen a DBI-type module for Perl which capitalizes, easily, on MySQL's multi-insert syntax
insert into TBL (col1, col2, col3) values (1,2,3),(4,5,6),...?
I've not yet found an interface which allows me to do that. The only thing I HAVE found is looping through my array. This method seems a lot less optimal than throwing everything into a single statement and letting MySQL handle it. I've not found any documentation out there (i.e., via Google) which sheds light on this, short of rolling my own code to do it.
TIA
There are two approaches. You can insert (?, ?, ?) a number of times based on the size of the array. The text manipulation would be something like:
my $sql_values = join( ', ', ('(?, ?, ?)') x scalar(@array) );
Then flatten the array for calling execute(). I would avoid this way because of the thorny string and array manipulation that needs to be done.
The other way is to begin a transaction, then run a single insert statement multiple times.
my $sql = 'INSERT INTO tbl (col1, col2, col3) VALUES (?, ?, ?)';
$dbh->{AutoCommit} = 0;
my $sth = $dbh->prepare_cached( $sql );
$sth->execute( @$_ ) for @array;
$sth->finish;
$dbh->{AutoCommit} = 1;
This is a bit slower than the first method, but it still avoids reparsing the statement. It also avoids the subtle manipulations of the first solution, while still being atomic and allowing disk I/O to be optimized.
If DBD::mysql supported DBI's execute_for_fetch (see DBI's execute_array and execute_for_fetch), this would be the typical usage scenario: you have multiple rows of inserts/updates/deletes available now and want to send them in one go (or in batches). I've no idea if the mysql client libs support sending multiple rows of bound parameters in one go, but most other database client libs do and can take advantage of DBI's execute_array/execute_for_fetch. Unfortunately, few DBDs actually implement execute_array/execute_for_fetch and instead rely on DBI implementing it one row at a time.
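For reference, the DBI-level call looks roughly like this (a sketch; @col1, @col2, @col3 are assumed to hold one array per column, and the caveat above about DBD::mysql sending them row by row still applies):
my $sth = $dbh->prepare('INSERT INTO tbl (col1, col2, col3) VALUES (?, ?, ?)');
my @tuple_status;
$sth->execute_array(
    { ArrayTupleStatus => \@tuple_status },
    \@col1, \@col2, \@col3,
);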
Jim,
Frezik has it. That is probably the most optimal:
my $sth = $dbh->prepare( 'INSERT INTO tbl (col1, col2, col3) VALUES (?, ?, ?)' );
foreach (@array) { $sth->execute( @{$_} ); }
$sth->finish;

bulk insert in zend framework [duplicate]

Possible Duplicate:
How do I add more than one row with Zend_Db?
everyone,
I need to do a bulk insert in Zend Framework. For plain SQL I have the query below; I want to do the same in Zend Framework.
INSERT INTO `dbname`.`tablename` (`id`, `user_id`, `screen_name`, `screen_name_server`) VALUES (NULL, '15', 'test', 'test'), (NULL, '15', 'test', 'test');
Thanks.
There's no way to do this, as Marcin states.
If you'd like to do some messing around with the Zend Framework, you could try to alter the insert method.
You could try to make it so that the insert method can take an array of arrays for the data. Then you could use the arrays of data to build the bulk insert.
For example,
$data1 = array( ); //data for first insert
$data2 = array( ); //data for 2nd insert
//a zend_db_table object
$dbTable->insert( array( $data1, $data2 ) );
You would have to edit the insert method a bit to detect multiple data inserts and then build the insert accordingly. Unfortunately I haven't looked into how the code is built or I would just put it up here for you to use.
There is no insert method in Zend_Db that would insert multiple rows. What you can do, though, is use the query method of Zend_Db_Adapter and pass it your own INSERT SQL, along the lines of the sketch below.
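A sketch of that approach (using the table and columns from the question; $db is assumed to be your Zend_Db_Adapter instance):
$rows = array(
    array(null, '15', 'test', 'test'),
    array(null, '15', 'test', 'test'),
);

// One "(?, ?, ?, ?)" group per row, then flatten the rows for positional binding.
$placeholders = implode(', ', array_fill(0, count($rows), '(?, ?, ?, ?)'));
$values = call_user_func_array('array_merge', $rows);

$db->query(
    'INSERT INTO `tablename` (`id`, `user_id`, `screen_name`, `screen_name_server`) VALUES ' . $placeholders,
    $values
);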