Insert multiple VALUES into table - fat-free-framework

I'm trying to insert multiple VALUES into a table using the Fat-Free Framework's SQL mapper.
The problem is that the docs only show an example for a single row:
$db->exec('INSERT INTO mytable VALUES(?,?)',array(1=>5,2=>'Jim'))
As I have a lot of records and need to speed things up, I wanted to add multiple
VALUES groups, as in VALUES (?,?),(?,?),(?,?).
But what does the parameter array have to look like then?
Background: I'm trying to speed up the import this way because I parse big CSV files (100k+ rows) and import them.

The syntax to do that is:
$db->exec("INSERT INTO `table` (`col1`,`col2`) VALUES ('val1','val2'), ('val1','val2'), ('val1', 'val2')");
You definitely want to use prepared statements. I recommend first generating the placeholder string,
VALUES (:q1, :q2), (:q3, :q4), (:q5, :q6)
and then generating the bindings:
[
':q1' => $data['val1'],
':q2' => $data['val2'],
':q3' => $data['val3'],
':q4' => $data['val4'],
//...
],
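For the 100k-row CSV case you can build the placeholder string and the bindings in one pass. Here is a minimal sketch using the question's positional ? style (F3's positional parameters are 1-indexed, as in the docs example); $rows and the mytable columns are assumptions, and with files that big you would also insert in chunks of a few hundred rows to stay under MySQL's max_allowed_packet:
$rows = [
    [5, 'Jim'],
    [6, 'Ann'],
];
$placeholders = [];
$bindings = [];
$i = 1; // F3 positional parameters are 1-indexed
foreach ($rows as $row) {
    $slots = [];
    foreach ($row as $value) {
        $slots[] = '?';
        $bindings[$i++] = $value;
    }
    $placeholders[] = '(' . implode(',', $slots) . ')';
}
$sql = 'INSERT INTO mytable (id, name) VALUES ' . implode(',', $placeholders);
$db->exec($sql, $bindings);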

Related

Pass a 'null' value to a MySQL database in node.js

I'm trying to pass data from a web scraper to a MySQL database. I have a lot of variables that need to be entered into the database at a time, and below is a snippet of the code I'm using (where the "etc." is, there are a bunch more variables):
con.query("INSERT INTO Paper_2 (referenceCodeSubject, referenceCode, subject, etc.) VALUES ('"+referenceCodeSubject+"','"+referenceCode+"','"+subject+"', etc.)")
The columns in the database have types INT, VARCHAR and CHAR.
My issue is that when I scrape, not all of the variables get assigned values; they remain null, and I cannot pass this null through as NULL to MySQL. It would also be quite complicated to sort out the different cases of when to pass what, due to the large number of variables.
I'm hoping there's a simple way of doing this, as the only solutions I've seen so far are to omit the value from the query (which is hard because I would then need to decide which values to omit), to pass a string of "NULL", or to pass a value of 0. Is there any other way of doing this?
Better to use the built-in escaping feature, which also protects you from SQL injection attacks. With the mysql driver, a single ? after VALUES expands a nested array into one row group per inner array, and a JavaScript null is escaped to SQL NULL:
conn.query(
  'INSERT INTO Paper_2 (referenceCodeSubject, referenceCode, subject) VALUES ?',
  [[
    ['refCodeSubject1', 'refCode1', 'subject1'],
    ['refCodeSubject2', 'refCode2', null] // null is written as NULL
  ]],
  (error, results, fields) => {
    // ...
  }
)
If a bound value can sometimes be a valid string and sometimes undefined, use an OR operator when building the values array to handle both cases with shorthand code:
let nameValue; // may still be undefined after scraping
let sql = 'INSERT INTO user (name) VALUES (?)';
let sqlValues = [nameValue || null]; // undefined falls back to null

How to insert bulk records

I am working on API development using AWS API Gateway and Lambda.
I am using the Serverless MySQL package (https://www.npmjs.com/package/serverless-mysql) for the MySQL connection and operations,
but I am unable to insert multiple records: if I pass an array of records to the insert query, it only inserts a single record.
Please suggest how I would insert multiple records without using a loop.
values=[
[
"229",
25,
"objective",
[
"49"
],
"2019-07-24 08:59:39",
"2019-07-24 08:59:39"
],
[
"229",
26,
"descriptive",
[
"Yes i have long term illness that limits my daily activities. Test..."
],
"2019-07-24 08:59:39",
"2019-07-24 08:59:39"
]
];
var sql = 'INSERT INTO `answers` (`user_id`, `question_id`, `question_type`, `answer`, `created_at`, `updated_at`) VALUES (?)';
await connection.query(sql, values);
I haven't used this package before, but going through the documentation it doesn't seem to provide any extra capability for batch inserts. So I think you still need to compose the query yourself, as you normally would for a MySQL batch insert:
INSERT INTO table_name (field1,field2,field3) VALUES(1,2,3),(4,5,6),(7,8,9);
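A minimal sketch of composing that statement in Node for the data above, assuming connection is the serverless-mysql instance from the question; the nested answer arrays are serialized with JSON.stringify here, since a bare array is not a valid MySQL value:
const rows = values.map(row =>
  row.map(v => (Array.isArray(v) ? JSON.stringify(v) : v))
);
const placeholders = rows.map(() => '(?,?,?,?,?,?)').join(','); // one group per row
const params = rows.flat(); // flattened to match the placeholders left to right
const sql = 'INSERT INTO `answers` (`user_id`, `question_id`, `question_type`, `answer`, `created_at`, `updated_at`) VALUES ' + placeholders;
await connection.query(sql, params);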
Batch mode is not available in this package. So if you want to avoid a loop, one option is to compose the query as shown in the other answer:
INSERT INTO table_name (field1,field2,field3) VALUES (1,2,3),(4,5,6);
But a better way is to create a separate Lambda function that inserts the values passed to it sequentially. It will give you more flexibility in how you insert values.
https://docs.aws.amazon.com/cli/latest/reference/glue/create-user-defined-function.html

Insert JSON into multiple tables on Database in Mule

I am trying to insert the contents of a JSON payload into a MySQL database using Mule ESB. The JSON looks like:
{
  "id": 106636,
  "client_id": 9999,
  "comments": "Credit",
  "salesman_name": "Salvador Dali",
  "cart_items": [
    { "citem_id": 1066819, "quantity": 3 },
    { "citem_id": 1066820, "quantity": 10 }
  ]
}
In Mule I want to insert all of the data in one step, like:
Insert INTO order_header(id,client_id,comments,salesman_name)
Insert INTO order_detail(id,citem_id,quantity)
Insert INTO order_detail(id,citem_id,quantity)
Currently I have come this far in Mule:
[screenshot of the MuleSoft flow]
Use the Bulk execute operation of the Database connector.
With it you can insert into multiple tables.
For example, the Query text would contain one statement per insert:
INSERT INTO order_header (id, client_id, comments, salesman_name) VALUES (payload.id, payload.client_id, payload.comments, payload.salesman_name);
INSERT INTO order_detail (id, citem_id, quantity) VALUES (payload.id, payload.cart_items[0].citem_id, payload.cart_items[0].quantity); etc.
There is an excellent article here, http://www.dotnetfunda.com/articles/show/2078/parse-json-keys-to-insert-records-into-postgresql-database-using-mule,
that should be of help. You may need to modify it for your case: write the order_header data first, then use a collection splitter for the order_detail rows, and wrap the whole thing in a transaction.
OK. Since you have already converted the JSON into an object in the flow, you can refer to individual values with their object reference, like obj.id, obj.client_id, etc.
Get a database connector next.
Configure your MySQL database in "Connector Configuration".
Operation: Choose "Bulk execute"
In "Query text" : Write multiple INSERT queries and pass appropriate values from Object (converted from JSON). Remember to separate multiple queries with semicolon (;) in Query text.
That's it !! Let me know if you face any issue. Hope it works for you..

How to get database SQL values from an ActiveRecord object?

My original problem is that I need to insert a lot of records into the DB, so to speed things up I want to use mysqlimport, which takes a file of row values and loads them into a specified table. So suppose I have the model Book: I can't simply use book.attributes.values, as one of the fields is a hash that is serialized to the DB (using serialize), so I need to know what format this hash will be stored in in the DB. Same for the time and date fields. Any help?
How about using SQL insert statements instead of serialization?
book = Book.new(title: 'Much Ado About Nothing', author: 'William Shakespeare')
sql = book.class.arel_table.create_insert
          .tap { |im| im.insert(book.send(
              :arel_attributes_with_values_for_create,
              book.attribute_names)) }
          .to_sql
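A sketch of how you might use this for the bulk case, assuming books holds the unsaved Book records; note that mysqlimport expects raw row data, so a file of generated statements like this would be fed to the mysql client instead:
File.open('books.sql', 'w') do |f|
  books.each do |book|
    insert_sql = book.class.arel_table.create_insert
                     .tap { |im| im.insert(book.send(
                         :arel_attributes_with_values_for_create,
                         book.attribute_names)) }
                     .to_sql
    f.puts(insert_sql + ';')
  end
end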

MySQL function using Rails/ActiveRecord

I have a problem using Rails/ActiveRecord.
I want to insert a record using a MySQL function, for example GeomFromText('POINT(1 1)').
Using ActiveRecord normally, these function calls are quoted automatically, and I don't want these values quoted.
Model.create(geo: "GeomFromText('POINT(1 1)')")
this ActiveRecord statement will generate following SQL
INSERT INTO `Model` (`geo`) VALUES ('GeomFromText(\'POINT(1 1)\')')
It may be easy to use raw SQL, but I want to use ActiveRecord because my model defines several callbacks on this table.
How can I use a MySQL function in an ActiveRecord statement?
Summary
You can't by design; this behavior is important for preventing SQL injection attacks. You would need to explicitly execute raw SQL in conjunction with ActiveRecord.
Details
As you saw, the SQL statement gets interpolated as a string by design, which doesn't do what you want (Rails ~> 4.0):
> Country.create(name: 'LOWER("CANADA")')
=> SQL (0.3ms) INSERT INTO `Country` (`Name`) VALUES ('LOWER(\"CANADA\")')
Nor can you use the same tricks that would work for the .where method:
> Country.create(["name = LOWER(:country)", { country: 'CANADA' }])
=> ArgumentError: When assigning attributes, you must pass a hash as an argument.
You would need to execute arbitrary SQL first to get the proper value, then make another SQL call via ActiveRecord to achieve your callback:
> Country.create( name: ActiveRecord::Base.connection.execute(%q{ SELECT LOWER('CANADA') }).first[0] )
=> (0.3ms) SELECT LOWER('CANADA')
=> SQL (0.3ms) INSERT INTO `Country` (`Name`) VALUES ('canada')
That said, it's probably cleaner to re-implement the SQL function at the application layer instead of the DB layer (unless you've got a really complex SQL function).
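For the LOWER example above, the application-layer version is as simple as this sketch, with Ruby's String#downcase standing in for the SQL function:
> Country.create(name: 'CANADA'.downcase)
=> SQL (0.3ms) INSERT INTO `Country` (`Name`) VALUES ('canada')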