Combining multiple queries into a single query - mysql

I have code written in an ORM syntax. It reads blog comment data from a file and inserts it into the blogs and comments tables. I'd like to convert this ORM code back to MySQL, because I need to combine as many queries as possible into a single query, and that optimization wouldn't be easy in the ORM language. The reason I need this optimization is that I'm working with a remote server, so the fewer the queries, the better. I wrote the code below in MySQL pseudo-code, because I've somewhat forgotten MySQL.
This is the comments file that contains all the comments for all the blogs. The `from blog url` column tells me which blog each comment belongs to.
comment text           from blog url
------------------------------------------
first comment text     first-blog-url
second comment text    first-blog-url
third comment text     first-blog-url
fourth comment text    blog-2-url
fifth comment text     blog-2-url
sixth comment text     3rd-blog-url
This is the ORM code that I use to process the file. (at the very bottom, I added the description of the tables).
//I read a comment from the comments file, `comment text` and `from blog url`

//does a blog exist that has 'link' that matches 'from blog url'?
$blog = SELECT FROM blogs where 'link' has value 'first-blog-url'

//if it doesn't exist, create it
if($blog == null){
    $blog = INSERT INTO blogs a new record and set 'link' to 'first-blog-url'
}

//then read the id of the (existing or just-created) blog row
$blog_id = $blog->getId();

//then use the $blog_id to insert the comment into the 'comments' table.
//does this comment text already exist for this blog id?
$comment = SELECT FROM comments where 'commenttext' has value 'whatever comment text' and 'blogid' has value $blog_id

//if it doesn't exist, create it
if($comment == null){
    $comment = INSERT INTO comments a new record and set 'commenttext' to 'the comment text' and 'blogid' to $blog_id
}

$comment_id = $comment->getId();
So my question: is it possible to write this in a single mysql query?
I found a similar question here but it doesn't fully solve my problem, and I'm not sure if it's the most efficient way to do it.
The 2 tables are blogs and comments where each row in comments has a field blogid that links it to the right blog in blogs. So it's basically a 1:many relationship where each blog row can be linked to many comment rows. They look like this:
blogs:
id    link              other fields
--------------------------------------------
1     first-blog-url
2     blog-2-url
3     3rd-blog-url

comments:
id    commenttext    blogid
-----------------------------
1     random         1
2     comment        1
3     goes           1
4     here           2
5     any            2
6     thing          3
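Roughly, the two tables could be created like this (a sketch for reference only; the column types are assumptions, not part of the original schema):

CREATE TABLE blogs (
    id INT AUTO_INCREMENT PRIMARY KEY,   -- blog id referenced by comments.blogid
    link VARCHAR(255) NOT NULL           -- the blog url
    -- plus the other fields
);

CREATE TABLE comments (
    id INT AUTO_INCREMENT PRIMARY KEY,
    commenttext TEXT NOT NULL,
    blogid INT NOT NULL                  -- links the comment to its blog
);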

You can use this technique to insert a row only if it does not already exist:
INSERT INTO blogs (link)
SELECT 'first-blog-url'
FROM dual
WHERE NOT EXISTS
    ( SELECT 1
      FROM blogs
      WHERE link = 'first-blog-url'
    );
As you can see, the SELECT clause returns a single row with the data to be inserted only when that row does not yet exist in the database. You can test it at SQLFIDDLE.
To insert into the comments table you can use the same technique. You can get the blog id for the second query with LAST_INSERT_ID() if the insert actually happened (if not, you need an extra query to look it up).
This is only a starting point; perhaps you can reduce your four queries to three. Any comments are welcome to help figure out the final solution.
As you know, MySQL doesn't have a MERGE statement, and I don't think REPLACE matches your requirements.
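As a rough sketch of how far this can be pushed (assuming you add a UNIQUE index on blogs.link and one on comments (blogid, commenttext), which the original schema does not have), the whole read-or-create logic could collapse into two statements per comment:

-- create the blog row only if the link is new (relies on the assumed UNIQUE key)
INSERT IGNORE INTO blogs (link) VALUES ('first-blog-url');

-- insert the comment, looking up the blog id by link in the same statement
INSERT IGNORE INTO comments (commenttext, blogid)
SELECT 'first comment text', b.id
FROM blogs b
WHERE b.link = 'first-blog-url';

Because the second statement looks up the blog id by link, LAST_INSERT_ID() is not needed at all, and INSERT IGNORE lets the unique keys silently skip rows that already exist.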

Related

Updating JSON in SQLite with JSON1

The SQLite JSON1 extension has some really neat capabilities. However, I have not been able to figure out how I can update or insert individual JSON attribute values.
Here is an example
CREATE TABLE keywords
(
id INTEGER PRIMARY KEY,
lang INTEGER NOT NULL,
kwd TEXT NOT NULL,
locs TEXT NOT NULL DEFAULT '{}'
);
CREATE INDEX kwd ON keywords(lang,kwd);
I am using this table to store keyword searches and record, in the locs object, the locations from which each search was initiated. A sample entry in this database table would be like the one shown below
id:1,lang:1,kwd:'stackoverflow',locs:'{"1":1,"2":1,"5":1}'
The location object attributes here are indices to the actual locations stored elsewhere.
Now imagine the following scenarios
A search for stackoverflow is initiated from location index "2". In this case I simply want to increment the value at that index so that after the operation the corresponding row reads
id:1,lang:1,kwd:'stackoverflow',locs:'{"1":1,"2":2,"5":1}'
A search for stackoverflow is initiated from a previously unknown location index "7" in which case the corresponding row after the update would have to read
id:1,lang:1,kwd:'stackoverflow',locs:'{"1":1,"2":1,"5":1,"7":1}'
It is not clear to me that this can in fact be done. I tried something along the lines of
UPDATE keywords json_set(locs,'$.2','2') WHERE kwd = 'stackoverflow';
which gave the error message error near json_set. I'd be most obliged to anyone who might be able to tell me how/whether this should/can be done.
It is not necessary to create such complicated SQL with subqueries to do this.
The SQL below would solve your needs.
UPDATE keywords
SET locs = json_set(locs,'$.7', IFNULL(json_extract(locs, '$.7'), 0) + 1)
WHERE kwd = 'stackoverflow';
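To sanity-check the result you can read the value back afterwards (a quick verification query, not part of the original answer):

SELECT json_extract(locs, '$.7') AS searches_from_location_7
FROM keywords
WHERE kwd = 'stackoverflow';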
I know this is old, but it's one of the first links that comes up when searching, so it deserves a better solution.
I could have just deleted this question but given that the SQLite JSON1 extension appears to be relatively poorly understood I felt it would be more useful to provide an answer here for the benefit of others. What I have set out to do here is possible but the SQL syntax is rather more convoluted.
UPDATE keywords SET locs =
    (SELECT json_set(json(keywords.locs), '$.N',
        IFNULL(
            (SELECT json_extract(keywords.locs, '$.N') FROM keywords WHERE id = '1'),
            0)
        + 1)
     FROM keywords WHERE id = '1')
WHERE id = '1';
(with N replaced by the location index, e.g. 2 or 7) will accomplish both of the updates I described in my original question above. Given how complicated this looks, a few explanations are in order:
The UPDATE keywords part does the actual updating, but it needs to know what value to write
The SELECT json_set part is where we establish the value to be updated
If the relevant key does not exist in the first place we do not want to do a + 1 on a null value, so we do an IFNULL test
The WHERE id = bits ensure that we target the right row
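For instance, the second scenario from the question (location index 7 on the row with id 1) would read:

UPDATE keywords SET locs =
    (SELECT json_set(json(keywords.locs), '$.7',
        IFNULL(
            (SELECT json_extract(keywords.locs, '$.7') FROM keywords WHERE id = '1'),
            0)
        + 1)
     FROM keywords WHERE id = '1')
WHERE id = '1';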
Having now worked with JSON1 in SQLite for a while, I have a tip to share with others going down the same road. It is easy to waste your time writing extremely convoluted and hard-to-maintain SQL in an effort to perform in-place JSON manipulation. Consider using SQLite temporary tables (CREATE TEMP TABLE ...) to store intermediate results and writing a sequence of SQL statements instead. This makes the code a whole lot easier to understand and to maintain.
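A hypothetical sketch of that temp-table approach for the increment case (location index 7, table and column names as in the question):

-- stage the current count (0 if the key is missing) in a scratch table
CREATE TEMP TABLE loc_counts AS
SELECT id, IFNULL(json_extract(locs, '$.7'), 0) AS cnt
FROM keywords
WHERE kwd = 'stackoverflow';

-- write the incremented value back into the JSON column
UPDATE keywords
SET locs = json_set(locs, '$.7',
    (SELECT cnt FROM loc_counts WHERE loc_counts.id = keywords.id) + 1)
WHERE kwd = 'stackoverflow';

DROP TABLE loc_counts;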

Join two INSERT INTOs

I'm not sure how to articulate this correctly so here it goes.
I'm designing a small webapp, and in my database I have tables for users, commits, and userCommits. It's pretty minimal: users has an auto-increment ID column, and commits has a similar ID column. userCommits is what links them together, as you can see in the following figure.
What I've done is INSERT INTO commits (...) VALUES (...), but the problem with that is: how do I create the link in userCommits so that userCommits.commitID will equal the correct commit? I feel like I shouldn't just query the table to find the latest one and use its ID. Is there a way to join the IDs somehow?
I know that I can do a query like this to list all the commits for a user via their email:
SELECT
commits.id,
commits.cName,
users.email
FROM
commits
INNER JOIN userCommits ON commits.id = userCommits.commitID
INNER JOIN users ON userCommits.userID = users.id
WHERE users.email = "someonecool#hotmail.com"
But what I'm now trying to do is to insert into the database when the user creates a commit.
You perform the insert and select the inserted id as part of the same command for one table, then do the same for the other table.
Once your app has knowledge of both ids, it can insert into the userCommits table.
See Get the new record primary key ID from mysql insert query?
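A minimal sketch of that flow in plain MySQL (column and table names come from the question; the literal values and the user id are made up for illustration):

INSERT INTO commits (cName, cLocation, cStartDate, cEndDate)
VALUES ('My first commit', 'Somewhere', '2014-01-01', '2014-01-05');

-- grab the auto-increment id generated by the insert above
SET @new_commit_id = LAST_INSERT_ID();

INSERT INTO userCommits (userID, commitID)
VALUES (1, @new_commit_id);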
So I didn't mention I was using CodeIgniter. My bad. However, the solutions provided would work. I just want to post what is actually used, for other people that may encounter this.
As I learned, in MySQL each client gets a session of its own. This means that when you call $this->db->insert_id() or SELECT LAST_INSERT_ID(), it will return YOUR last inserted ID, not the server's last inserted ID.
$commitData = array(
    'cName'      => $this->input->post('commitName'),
    'cLocation'  => $this->input->post('location'),
    'cStartDate' => $this->input->post('startDate'),
    'cEndDate'   => $this->input->post('endDate')
);
$this->db->insert('commits', $commitData);

$userData = array(
    'commitID' => $this->db->insert_id(),
    'userID'   => $user->id
);
$this->db->insert('userCommits', $userData);
There is obviously going to be security built around this but this is just the basic nuts and bolts.

Delete duplicate rows, do not preserve one row

I need a query that goes through each entry in a database, checks if a single value is duplicated elsewhere in the database, and if it is - deletes both entries (or all, if more than two).
Problem is the entries are URLs, up to 255 characters, with no way of identifying the row. Some existing answers on Stack Overflow do not work for me due to performance limitations, or they use uniqueid which obviously won't work when dealing with a string.
Long Version:
I have two databases containing URLs (and only URLs). One database has around 3,000 urls and the other around 1,000.
However, a large majority of the 1,000 urls were taken from the 3,000 url database. I need to merge the 1,000 into the 3,000 as new entries only.
For this, I made a third database with combined URLs from both tables, about 4,000 entries. I need to find all duplicate entries in this database and delete them (Both of them, without leaving either).
I have followed the queries from a few examples on this site, but whenever I try to delete both entries it ends up deleting all the entries, or giving SQL errors.
Alternatively:
I have two databases, each containing one of the separate URL sets. I need to check each row from one database against the other to find any that aren't duplicates, and then add those to a third database.
Since you were looking for a SQL solution, here is one. Let's assume that your table has a single column for simplicity's sake; this will of course work for any number of fields:
CREATE TABLE `allkindsofvalues` (
`value` int(11) NOT NULL
) ENGINE=InnoDB DEFAULT CHARSET=latin1;
The following series of queries will accomplish what you are looking for:
CREATE TABLE allkindsofvalues_temp LIKE allkindsofvalues;

INSERT INTO allkindsofvalues_temp
SELECT * FROM allkindsofvalues akv1
WHERE (SELECT COUNT(*) FROM allkindsofvalues akv2 WHERE akv1.value = akv2.value) = 1;

DROP TABLE allkindsofvalues;
RENAME TABLE allkindsofvalues_temp TO allkindsofvalues;
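Applied to the merged URL table from the question, the same pattern might look like this (the table and column names here are assumptions, since the question doesn't give them):

CREATE TABLE combined_urls_temp LIKE combined_urls;

-- keep only the URLs that appear exactly once
INSERT INTO combined_urls_temp
SELECT * FROM combined_urls u1
WHERE (SELECT COUNT(*) FROM combined_urls u2 WHERE u1.url = u2.url) = 1;

DROP TABLE combined_urls;
RENAME TABLE combined_urls_temp TO combined_urls;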
The OP wrote:
I've got my own PHP solution which is pretty hacky, but works.
I went with a PHP script to accomplish this, as I'm more familiar with PHP than MySQL.
This generates a simple list of urls that exist only in the target database, not in both. If you have more than 7,000 entries to parse this may take a while, and you will need to copy/paste the results into a text file or expand the script to store them back into a database.
I'm just doing it manually to save time.
Note: Uses MeekroDB
<pre>
<?php
require('meekrodb.2.1.class.php');

DB::$user = 'root';
DB::$password = '';
DB::$dbName = 'testdb';

$all = DB::query('SELECT * FROM old_urls LIMIT 7000');

foreach ($all as $row) {
    $test = DB::query('SELECT url FROM new_urls WHERE url=%s', $row['url']);
    if (!is_array($test)) {
        echo $row['url'] . "\n";
    } else {
        if (count($test) == 0) {
            echo $row['url'] . "\n";
        }
    }
}
?>
</pre>

Many UPDATE WHERE in one SQL

I have a database with very bad architecture and no one wants to fix anything, so I have to work with what I have. The problem is shown in the screenshot.
I need to update users' connections and also some other fields... The solution I see is SET ... WHERE AND SET ... WHERE AND SET ... WHERE (and yes, at this point I'm not even sure that would work). So maybe there is a more common way to solve this problem?
As long as your values are for different records, I think you will need to make individual updates.
You could also consider something like this:
UPDATE categories
SET display_order = CASE id
WHEN 1 THEN 3
WHEN 2 THEN 4
WHEN 3 THEN 5
END,
title = CASE id
WHEN 1 THEN 'New Title 1'
WHEN 2 THEN 'New Title 2'
WHEN 3 THEN 'New Title 3'
END
WHERE id IN (1,2,3)
Is there a special reason why you need to update all the fields in one single query? Why not try writing individual update queries and pushing them to the database as a batch?
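A minimal sketch of that batch idea, reusing the categories table from the CASE example above (the statements are wrapped in one transaction so they commit together; many client libraries also let you send them as a single multi-statement call):

START TRANSACTION;
UPDATE categories SET display_order = 3, title = 'New Title 1' WHERE id = 1;
UPDATE categories SET display_order = 4, title = 'New Title 2' WHERE id = 2;
UPDATE categories SET display_order = 5, title = 'New Title 3' WHERE id = 3;
COMMIT;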

Update multiple mysql rows with 1 query?

I am porting a client's DB to a new one with different post titles and row IDs, but he wants to keep the hits from the old website.
He has over 500 articles in the new DB, and updating one is not an issue with this query:
UPDATE blog_posts
SET hits=8523 WHERE title LIKE '%slim charger%' AND category = 2
But how would I go about doing this for all 500 articles with one query? I already have an export query from the old DB with post title and hits, so we can find the new ones more easily:
INSERT INTO `news_items` (`title`, `hits`) VALUES
('Slim charger- your new friend', 8523 )...
The only reference between both tables is the product-name word within the title; everything else is different (id, full title, ...).
Make a tmp table old_posts for the old data, then:
UPDATE new_posts
LEFT JOIN old_posts ON new_posts.title = old_posts.title
SET new_posts.hits = old_posts.hits;
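A rough sketch of what that temp-table step might look like (the column definitions are assumptions; the sample row comes from the export in the question):

CREATE TEMPORARY TABLE old_posts (
    title VARCHAR(255),
    hits INT
);

INSERT INTO old_posts (title, hits) VALUES
('Slim charger- your new friend', 8523);
-- ... the rest of the exported rows ...

Note that joining on exact title equality only works where the titles really match; since the question says only the product-name word is shared, the ON clause may need to become a LIKE match against that keyword instead.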
Unfortunately, that's not how it works; you will have to write a script/program that does a loop.

articles CURSOR;
selection articlesTable%ROWTYPE;
WHILE (FETCH(cursor INTO selection)%hasNext)
    INSERT INTO newTable selection;
END WHILE

How you bridge it is up to you, but that's the basic pseudo code/PL/SQL.
The APIs for selecting from one DB and putting into another vary by DBMS, so you will need a common intermediate format. Basically, take the record from the first DB, stick it into a struct in the programming language of your choice, and perform an insert using those struct values through the APIs for the other DBMS.
I'm not 100% sure that you can update multiple records at once, but I think what you want to do is use a loop in combination with the update query.
However, if you have 2 tables with absolutely no relationship or common identifiers between them, you are kind of in a hard place. The hard place in this instance would mean you have to do them all manually :(
The last possible thing that could save you is that the IDs might be different but the rows might still be in the same order. If that is the case, you can still loop through the old table and update the new table as I described above.
You can build a procedure that'll do it for you:
DELIMITER //
CREATE PROCEDURE insert_news_items()
BEGIN
    DECLARE done BOOLEAN DEFAULT FALSE;
    DECLARE v_title VARCHAR(255);
    DECLARE v_hits INT;
    DECLARE news_items_cur CURSOR FOR
        SELECT title, hits
        FROM blog_posts
        WHERE title LIKE '%slim charger%' AND category = 2;
    DECLARE CONTINUE HANDLER FOR NOT FOUND SET done = TRUE;

    OPEN news_items_cur;
    read_loop: LOOP
        FETCH news_items_cur INTO v_title, v_hits;
        IF done THEN
            LEAVE read_loop;
        END IF;
        INSERT INTO `news_items` (`title`, `hits`) VALUES (v_title, v_hits);
    END LOOP;
    CLOSE news_items_cur;
END //
DELIMITER ;
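Once created, the procedure is run with a single call:

CALL insert_news_items();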