Searching for multiple values in 1 query - mysql

I have a table with two fields, Roll no and name, and a list (of n values) of roll numbers for which I have to look up the corresponding names.
Can this be done using just one query in SQL or HQL?

SELECT name FROM [table] WHERE id IN ([list of ids])
where [list of ids] is for example 2,3,5,7.

Use the IN operator and separate your Roll no's by a comma.
SELECT name
FROM yourtable
WHERE [Roll no] IN (1, 2, 3, 4, etc)

You can use the IN statement as shown above.
There are a couple of minor issues with this. It can perform poorly if the number of values in the clause gets too large.
The second issue is that in many development environments you end up needing to dynamically create the query with a variable number of items (or a variable number of placeholders if using parameterised queries). While not difficult, it does make your code look messier and means you don't have a nice neat piece of SQL that you can copy out and use for testing.
Here are a couple of examples (using PHP).
Here the IN clause is built dynamically into the SQL. Assuming the roll numbers can only be integers, intval() is applied to each member of the array so that no non-integer values end up in the SQL.
<?php
$list_of_roll_no = array(1,2,3,4,5,6,7,8,9);
$sql = "SELECT FROM some_table WHERE `Roll no` IN (".implode(", ", array_map ('intval', $list_of_roll_no)).")";
?>
Using mysqli bound parameters is a bit messier. This is because bind_param expects a variable number of parameters: the second parameter onward are the values to be bound, and it expects them to be passed by reference. So the foreach here is used to generate an array of references:
<?php
$list_of_roll_no = array(1, 2, 3, 4, 5, 6, 7, 8, 9);
if ($stmt = $mysqli->prepare("SELECT name FROM some_table WHERE `Roll no` IN (".implode(",", array_fill(0, count($list_of_roll_no), '?')).")"))
{
    $bind_arguments = [];
    $bind_arguments[] = str_repeat("i", count($list_of_roll_no));
    foreach ($list_of_roll_no as $list_of_roll_no_key => $list_of_roll_no_value)
    {
        $bind_arguments[] = &$list_of_roll_no[$list_of_roll_no_key]; # bind a reference to the array element, not to the temporary $list_of_roll_no_value
    }
    call_user_func_array(array($stmt, 'bind_param'), $bind_arguments);
    $stmt->execute();
}
?>
Another solution is to push all the values into another table; it can be a temporary table. Then you use an INNER JOIN between your table and the temporary table to find the matching values. Depending on what you already have in place, this is quite easy to do (e.g. I have a PHP class to insert multiple records easily; I just keep passing records to it and the class batches them up and inserts them occasionally to avoid hitting the database repeatedly).
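For example, here is a minimal sketch of that approach using mysqli; the temporary table name tmp_roll_no is made up, and the some_table / `Roll no` / name names are just carried over from the examples above:
<?php
$list_of_roll_no = array(1, 2, 3, 4, 5, 6, 7, 8, 9);

// The temporary table only exists for the lifetime of this connection.
$mysqli->query("CREATE TEMPORARY TABLE tmp_roll_no (roll_no INT NOT NULL PRIMARY KEY)");

// Insert all the values in one multi-row statement to avoid repeated round trips.
$mysqli->query("INSERT INTO tmp_roll_no (roll_no) VALUES (" . implode("),(", array_map('intval', $list_of_roll_no)) . ")");

// Join against the temporary table instead of building a long IN list.
$result = $mysqli->query("SELECT st.name FROM some_table st INNER JOIN tmp_roll_no t ON t.roll_no = st.`Roll no`");
?>
The join also gives the optimizer a chance to use an index on `Roll no`, which is part of why this tends to hold up better than a very long IN list.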

How to mass update in Eloquent given an array of keys and values efficiently

I have a very large (millions of rows) database table where I need to add some missing data from a 3rd party to every row.
The data source has a 'reference key' which is my only way to map to the correct item in the table.
Each row needs one number updated.
I can loop through the 3rd-party data source and perform an Eloquent update on each row using a unique identifier, but in my tests this is very slow:
Orders
id, reference_key, new_value
int, string, double(8,2)
foreach ($xml as $row) {
    Order::where('reference_key', $reference_key)
        ->update(['new_value' => (float) $row->new_value]);
}
Is there a more efficient way I can do this?
Eloquent is very powerful for managing complicated relationships, joins, eager-loaded models and so on, but this abstraction has a performance cost. Each model has to be created, filled and saved, and it is packed with tons of features you don't need for this precise use case.
When editing thousands or even millions of records, it is highly inefficient to use Eloquent models. Instead you can use either the Laravel query builder or a raw SQL statement.
I would recommend this approach:
foreach ($xml as $row) {
    // Use a fresh builder on each iteration so the where() clauses don't accumulate.
    DB::table('orders')
        ->where('reference_key', $reference_key)
        ->update(['new_value' => (float) $row->new_value]);
}
But you can also do something like this:
foreach ($xml as $row) {
    DB::statement('UPDATE orders SET new_value = ? WHERE reference_key = ?',
        [(float) $row->new_value, $reference_key]);
}
It will cut down your execution time significantly but the loop over millions of XML lines will still take a long time.
I would use an UPDATE statement to do it all at once, like this:
UPDATE OrderTable
INNER JOIN table_to_fill ON OrderTable.refkey = table_to_fill.refkey
SET OrderTable.value = table_to_fill.value
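The missing piece is getting the XML data into table_to_fill in the first place. A rough sketch, assuming a Laravel setup as in the question and that table_to_fill has refkey and value columns matching the statement above (the reference_key element name on each XML row is also an assumption):
$rows = [];
foreach ($xml as $row) {
    $rows[] = [
        'refkey' => (string) $row->reference_key,
        'value'  => (float) $row->new_value,
    ];
    // Insert in chunks so no single statement gets too large.
    if (count($rows) === 1000) {
        DB::table('table_to_fill')->insert($rows);
        $rows = [];
    }
}
if ($rows) {
    DB::table('table_to_fill')->insert($rows);
}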

Store results of expensive function calls in a MySQL table

Let's suppose I have a set of integers of a variable length. I apply a function on this set of integers and I obtain a result.
myFunction(setOfIntegers) => myResult
Let's suppose a call to myFunction is very expensive and I would like to somehow store the results of these function calls.
In my application I am already using MySQL and what I was thinking was to somehow create a table with the setOfIntegers as a PK and myResult as an additional field.
I was thinking that I could do this by transforming the setOfIntegers to a string before storing it in the DB.
Can this be done in any other way? Or would there be a better way to store results of such function calls in order to avoid calling them a 2nd time with the same set of integers?
I don't know about Java, but Perl has my $str = join(',', @array) and PHP has $str = implode(',', $array). Then the string $str could be used as the PRIMARY KEY (assuming it is not too long). And the result would go in the other column.
Your app code (in Java) would need to first do an implode and SELECT to see if the function has already been evaluated for the given array. If not, then perform the function and end by INSERTing a new row.
If this will be multi-threaded, you could use INSERT IGNORE to deal with dups. (There are other solutions, too.)
Another note: If your set-of-integers is ordered, then what I describe is 'complete'. If it is unordered, then sort it before imploding. This will provide a canonical representation.
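As a concrete illustration, here is a rough PHP/mysqli sketch of the whole pattern (the asker is on Java, so this only shows the shape of it; the table name function_cache and its set_key / result columns are made-up names, and myFunction stands for the expensive call):
<?php
function cachedResult(mysqli $db, array $setOfIntegers)
{
    sort($setOfIntegers);                 // canonical order for unordered sets
    $key = implode(',', $setOfIntegers);  // canonical string key

    // Has this set already been evaluated?
    $stmt = $db->prepare("SELECT result FROM function_cache WHERE set_key = ?");
    $stmt->bind_param('s', $key);
    $stmt->execute();
    $stmt->bind_result($result);
    $found = $stmt->fetch();
    $stmt->close();
    if ($found) {
        return $result;
    }

    // Not cached: compute it and store it. INSERT IGNORE copes with another
    // thread having inserted the same key in the meantime.
    $result = myFunction($setOfIntegers);
    $stmt = $db->prepare("INSERT IGNORE INTO function_cache (set_key, result) VALUES (?, ?)");
    $stmt->bind_param('sd', $key, $result);
    $stmt->execute();
    $stmt->close();

    return $result;
}
?>
In Java the same shape applies: sort, join, SELECT, and if nothing is found, compute and INSERT IGNORE through JDBC.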
If the function can be implemented in MySQL directly, I would suggest using Views.
https://www.mysqltutorial.org/mysql-views-tutorial.aspx/

Using fetchrow_hashref to store data

I am trying to take information out of a MySQL database, which I will then manipulate in perl:
use strict;
use DBI;
my $dbh_m= DBI->connect("dbi:mysql:Populationdb","root","LisaUni")
or die("Error: $DBI::errstr");
my $Genotype = 'Genotype'.1;
#The idea here is eventually I will ask the database how many Genotypes there are, and then loop it round to complete the following for each Genotype:
my $sql =qq(SELECT TransNo, gene.Gene FROM gene JOIN genotypegene ON gene.Gene = genotypegene.Gene WHERE Genotype like '$Genotype');
my $sth = $dbh_m->prepare($sql);
$sth->execute;
my $transvalues = $sth->fetchrow_hashref;
my %hash = %$transvalues;
$sth->finish();
$dbh_m->disconnect();
my $key;
my $value;
while (($key, $value) = each(%hash)) {
    print $key.", ".$value."\n";
}
This code doesn't produce any errors, but %hash only ends up holding a single row taken from the database (I got the idea of writing it this way from this website). If I type:
while (my $transvalues = $sth->fetchrow_hashref) {
    print "Gene: $transvalues->{Gene}\n";
    print "Trans: $transvalues->{TransNo}\n";
}
Then it does print off all the rows, but I need all this information to be available once I've closed the connection to the database.
I also have a related question: in my MySQL database the row consists of e.g 'Gene1'(Gene) '4'(TransNo). Once I have taken this data out of the database as I am doing above, will the TransNo still know which Gene it is associated with? Or do I need to create some kind of hash of hash structure for that?
You are calling the "wrong" function
fetchrow_hashref returns one row as a hashref; you should wrap its use inside a loop, ending when fetchrow_hashref returns undef.
It seems like you are looking for fetchall_hashref, which will give you all of the returned rows as a hash, with the first parameter specifying which field to use as the key.
$hash_ref = $sth->fetchall_hashref($key_field);
Each row will be inserted into $hash_ref as an inner hashref, using $key_field as the key under which you can find that row in $hash_ref.
What does the documentation say?
The fetchall_hashref method can be used to fetch all the data to be returned from a prepared and executed statement handle.
It returns a reference to a hash containing a key for each distinct value of the $key_field column that was fetched.
For each key the corresponding value is a reference to a hash containing all the selected columns and their values, as returned by fetchrow_hashref().
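Applied to the query in the question, a rough sketch might look like this (keying on Gene; because the rows are copied into the hashref, the data remains usable after finish and disconnect, and each TransNo stays attached to its Gene):
my $transvalues = $sth->fetchall_hashref('Gene');   # one entry per distinct Gene
$sth->finish();
$dbh_m->disconnect();

# Each value is itself a hashref of that row's columns.
for my $gene (keys %$transvalues) {
    print $gene.", ".$transvalues->{$gene}{TransNo}."\n";
}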
Documentation links
DBI - search.cpan.org #fetchrow_hashref
DBI - search.cpan.org #fetchall_hashref

mysql insert nested select from other db truncates double values

I have a table in one database, call this database x. I have another database, call it y. I want to copy data from x.some_table to y.some_table. I don't want to do an exact copy of the table, because some columns don't make sense in database y. I use the following query:
INSERT INTO y.some_table (a_field) SELECT a_field FROM x.some_table;
a_field in both tables is defined as DOUBLE(17,0). If I run this:
USE y;
SELECT a_field FROM x.some_table;
Then I get output with the full values, with no floating-point truncation. However, after the insertion using the first query I showed, I get nothing but whole numbers in y's some_table.a_field; the fractional parts are truncated.
What am I doing wrong? Thanks.
Are you sure that the column is defined as DOUBLE(17,0) in both tables? Doesn't that specify 17 total digits with 0 after the decimal point? If so, your SELECT from table x should also show 0 decimal places. If it's defined differently in x, say DOUBLE(17,6), and you are trying to insert it into DOUBLE(17,0), then I believe the decimals will be truncated.
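One quick way to check is to compare the actual column definitions; as a sketch (the (17,2) below is only an example scale, not something taken from the question):
SHOW CREATE TABLE x.some_table;
SHOW CREATE TABLE y.some_table;

-- If y.some_table.a_field really has 0 decimal places, widen it before re-running the INSERT ... SELECT:
ALTER TABLE y.some_table MODIFY a_field DOUBLE(17,2);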
I'm not sure what is causing the truncation, but you can make sure that you have set the floating-point type properly. If you think your table definition is OK, you can create a script to test it.
For example, in PHP you could do something like this:
$sql = "SELECT your_select_field FROM your_table";
$result = mysql_query($sql);
while ($row = mysql_fetch_assoc($result)) {
    $sql_ins = "INSERT INTO your_insert_table SET your_field = '".$row['your_select_field']."' ";
    $res_ins = mysql_query($sql_ins);
}

Why does my INSERT sometimes fail with "no such field"?

I've been using the following snippet in development for years. Now all of a sudden I get a DB Error: no such field warning.
$process = "process";
$create = $connection->query
(
"INSERT INTO summery (process) VALUES($process)"
);
if (DB::isError($create)) die($create->getMessage($create));
but it's fine if I use numerics
$process = "12345";
$create = $connection->query
(
"INSERT INTO summery (process) VALUES($process)"
);
if (DB::isError($create)) die($create->getMessage($create));
or write the value directly into the expression
$create = $connection->query
(
"INSERT INTO summery (process) VALUES('process')"
);
if (DB::isError($create)) die($create->getMessage($create));
I'm really confused ... any suggestions?
It's always better to use prepared queries and parameter placeholders. Like this in Perl DBI:
my $process=1234;
my $ins_process = $dbh->prepare("INSERT INTO summary (process) values(?)");
$ins_process->execute($process);
For best performance, prepare all your often-used queries right after opening the database connection. Many database engines will store them on the server during the session, much like small temporary stored procedures.
It's also very good for security. Writing the value into an insert string yourself means that you must write the correct escape code in each SQL statement. Using a prepare-and-execute style means that only one place (execute) needs to know about escaping, if escaping is even necessary.
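Since the question is using PEAR DB rather than Perl DBI, the equivalent there is, roughly as a sketch, to pass the value separately and let the placeholder handle quoting (this assumes $connection is the PEAR DB handle from the question):
$process = "process";
$create = $connection->query(
    "INSERT INTO summery (process) VALUES (?)",   // placeholder instead of interpolated value
    array($process)
);
if (DB::isError($create)) die($create->getMessage());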
Ditto what Zan Lynx said about placeholders. But you may still be wondering why your code failed.
It appears that you forgot a crucial detail from the previous code that worked for you for years: quotes.
This (tested) code works fine:
my $thing = 'abcde';
my $sth = $dbh->prepare("INSERT INTO table1 (id,field1)
VALUES (3,'$thing')");
$sth->execute;
But this next code (lacking the quotation marks in the VALUES field, just as your first example does) produces the error you report, because VALUES (3,$thing) resolves to VALUES (3,abcde), causing your SQL server to look for a field called abcde, and there is no field by that name.
my $thing = 'abcde';
my $sth = $dbh->prepare("INSERT INTO table1 (id,field1)
VALUES (3,$thing)");
$sth->execute;
All of this assumes that your first example is not a direct quote of the code that actually failed, and therefore not quite what you intended. It resolves to:
"INSERT INTO summery (process) VALUES(process)"
which, as mentioned above, causes your SQL server to read the item in the VALUES set as another field name. As given, this actually runs on MySQL without complaint and fills the field called 'process' with NULL, because that is what the field called 'process' contained when MySQL looked there for a value as it created the new record.
I do use this style for quick throw-away hacks involving known, secure data (e.g. a value supplied within the program itself). But for anything involving data that comes from outside the program, or that might possibly contain characters other than [0-9a-zA-Z], it will save you grief to use placeholders.