Getting the last database entry with SQL is very slow - MySQL

My table has two columns: time and work. It is very, very large and still growing.
I want to select the last entry:
$sql = "SELECT time, work FROM device ORDER BY time DESC LIMIT 1"
It takes 1.5 seconds to respond. How can I speed this up? I repeat the query 20 times, and I can't wait 20 seconds.
Greetings!

Use MAX:
SELECT *
FROM device
WHERE time = (SELECT MAX(time) FROM device)
Also add an index on the time column.
I was wondering why you want to repeat this 20 times. If you are working at the application level, you could store the result in a variable so you don't execute the query again.
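A minimal sketch of both suggestions together: an index on time, and caching the result in a variable at the application level instead of re-running the query. SQLite stands in for MySQL here for portability, and the row contents are invented for illustration.

```python
import sqlite3

# Table layout follows the question: two columns, time and work.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE device (time INTEGER, work TEXT)")
conn.executemany("INSERT INTO device VALUES (?, ?)",
                 [(i, f"job-{i}") for i in range(10_000)])

# With an index on time, MAX(time) is read from the index tail
# instead of scanning (and sorting) the whole table.
conn.execute("CREATE INDEX idx_device_time ON device (time)")

row = conn.execute(
    "SELECT time, work FROM device "
    "WHERE time = (SELECT MAX(time) FROM device)"
).fetchone()

# Cache the result instead of executing the query 20 times.
last_entry = row
```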

Related

MySQL GROUP BY taking 8 seconds or more on a simple query

My table is below (it has 1.2 million rows).
My query, below, takes 8 seconds or more, even when I remove the WHERE clause:
select `online_user_log`.`id` AS `id`,
       `online_user_log`.`abone_no` AS `abone_no`,
       `online_user_log`.`kategori` AS `kategori`,
       `online_user_log`.`islem_detay` AS `islem_detay`,
       `online_user_log`.`date` AS `date`
from `online_user_log`
where `online_user_log`.`kategori` = 'Kurma'
   or `online_user_log`.`kategori` = 'Kapama'
group by `online_user_log`.`abone_no`;
I have an index on the abone_no column.
phpMyAdmin reports the query time as 0.005 seconds, but in my PHP application (and in phpMyAdmin itself) it takes 8 seconds or more to retrieve the data.
Thanks in advance
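One common approach for this query shape (a sketch, not an answer from the thread): rewrite the OR as IN, and add a composite index on (kategori, abone_no) so both the filter and the GROUP BY can use the same index. SQLite stands in for MySQL, the data is invented, and COUNT(*) is used here only to keep the grouped query deterministic.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE online_user_log "
             "(id INTEGER PRIMARY KEY, abone_no TEXT, kategori TEXT)")
# 1800 invented rows: 50 subscribers, three categories each.
conn.executemany(
    "INSERT INTO online_user_log (abone_no, kategori) VALUES (?, ?)",
    [(f"A{i % 50}", k) for i in range(600)
     for k in ("Kurma", "Kapama", "Diger")])

# Composite index: the IN filter narrows on kategori, then rows are
# already ordered by abone_no for the GROUP BY.
conn.execute("CREATE INDEX idx_kat_abone "
             "ON online_user_log (kategori, abone_no)")

rows = conn.execute(
    "SELECT abone_no, COUNT(*) FROM online_user_log "
    "WHERE kategori IN ('Kurma', 'Kapama') "
    "GROUP BY abone_no"
).fetchall()
```

Note that if the query is genuinely fast (0.005 s) but the data takes 8 seconds to arrive, the time is likely spent transferring or rendering the result set, not executing the query.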

MySQL WHERE ... ORDER BY taking too long in a large database

I have a large database.
There are 22 million rows for now, but there will be billions of rows in the future.
I have attached example rows; the timestamp is in milliseconds.
What I want to do is get the last row of every second.
SELECT *
FROM btcusdt
WHERE 1502942432000 <= timestamp
AND timestamp < 1502942433000
ORDER BY tradeid DESC
LIMIT 1
The query above works, but the WHERE condition takes too long because it scans all the rows in the table. It shouldn't need to scan everything, because the time values are already sequential: as soon as a row no longer fits the WHERE condition, the scan could stop.
Any suggestions on how I can speed this up?
db example
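The usual fix for this query shape is a composite index on (timestamp, tradeid): the range condition then becomes an index seek rather than a full scan, and the ORDER BY ... LIMIT 1 stops after reading a single index entry. A sketch with SQLite standing in for MySQL and invented trade data:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE btcusdt "
             "(tradeid INTEGER, timestamp INTEGER, price REAL)")
# Invented trades, 10 ms apart, starting at the question's timestamp.
conn.executemany(
    "INSERT INTO btcusdt VALUES (?, ?, ?)",
    [(i, 1502942432000 + i * 10, 4000.0 + i) for i in range(200)])

# Composite index: seek to the start of the second, then the highest
# tradeid inside the range is one index entry away.
conn.execute("CREATE INDEX idx_ts_trade ON btcusdt (timestamp, tradeid)")

last = conn.execute(
    "SELECT * FROM btcusdt "
    "WHERE 1502942432000 <= timestamp AND timestamp < 1502942433000 "
    "ORDER BY tradeid DESC LIMIT 1"
).fetchone()
```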

How to optimize an id-and-limit query for huge data in MySQL?

I have a billion rows in a MySQL table, and I want to query the table on an indexed field, say timestamp.
I want to query the last 7 days of data, which can be roughly 1,000,000 rows, and I am querying based on the last id fetched plus a limit of 500.
This query works fine while I am processing up to 5,000,000 rows of data (about 10,000 queries), but when I increase the number of queries to, say, 50,000, I see performance degrade over time. A query used to take 5-10 ms at the start, but after running for a long time it degrades to 2 s. How can I optimize this?
I earlier tried a naive LIMIT/OFFSET solution, which gave highly unoptimized results, so I tried to optimize it by saving the last id and adding it to every query. But performance still degraded over time when I kept fetching page after page for 3-4 hours.
Java: using Hibernate and slicing
Date date = new Date();
Date timestamp = new DateTime(date).minusDays(7).toDate();
while (true) {
    Integer rowLimit = 500;
    Sort.Order sortingOrder = Sort.Order.asc("timestamp");
    Sort sort = Sort.by(sortingOrder);
    Pageable pageable = PageRequest.of(0, rowLimit, sort);
    long queryStartTime = System.currentTimeMillis();
    entityDataSlice = repository.findAllByTimestampAfterAndIdGreaterThan(
            timestamp, lastId, pageable
    );
    long queryEndTime = System.currentTimeMillis();
    if (!entityDataSlice.hasNext()) {
        break;
    }
}
MySQL:
select *
from table
where timestamp >= "some_time"
  and id >= <some_id>
order by timestamp
limit 500
I expected the optimization to hold, but performance degraded over time.
I expected up to 100 ms per query; it is actually up to 2-3 seconds, and likely to degrade further to 5-10 seconds.
Please provide SHOW CREATE TABLE. Meanwhile, if you have INDEX(timestamp), you don't need the AND id .... In fact, it may get in the way of optimizing the ORDER BY.
So, if your query is this:
select *
from table
where timestamp >= "some_time"
order by timestamp
limit 500
and you have INDEX(timestamp), then it is well optimized, and it will not slow down (aside from caching issues).
If that is just a simplified version of the 'real' query, then all bets are off.
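The answer's point can be sketched as keyset pagination: with INDEX(timestamp), each page seeks directly to the cursor position and reads 500 rows, so page N costs the same as page 1. SQLite stands in for MySQL, the timestamps are invented and assumed unique (with duplicate timestamps you would page on the (timestamp, id) pair instead):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, timestamp INTEGER)")
conn.executemany("INSERT INTO t (timestamp) VALUES (?)",
                 [(1000 + i,) for i in range(2000)])
conn.execute("CREATE INDEX idx_t_ts ON t (timestamp)")

seen = 0
last_ts = 1499  # "some_time": the assumed start of the 7-day window
while True:
    page = conn.execute(
        "SELECT id, timestamp FROM t "
        "WHERE timestamp > ? ORDER BY timestamp LIMIT 500",
        (last_ts,)
    ).fetchall()
    if not page:
        break
    seen += len(page)
    last_ts = page[-1][1]  # advance the cursor; no OFFSET, no re-scan
```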

MySQL query takes longer with more rows even though it runs with LIMIT

I have this problem.
To lower the loading time, I want to get only 30 items from my table. It works, but it takes longer to load when there are many items in the table.
But why? I only want to get 30 items, not everything, and I set the offset when going to the next page.
Here is my SELECT query
SELECT *
FROM bank
WHERE withdraw = 0
ORDER BY CAST(item_price AS DECIMAL(10,2)) DESC
LIMIT 30 OFFSET (+page)
Can someone explain to me how to make it faster?
That +page is the offset, computed with JavaScript; it works.
I changed the query a little, but it doesn't help.
SELECT *
FROM bank
WHERE withdraw = 0
ORDER BY item_price DESC
LIMIT 30 OFFSET (+page)
If I change it to this query, it takes the same time, but on only 350 rows. Why?
SELECT *
FROM bank
WHERE withdraw = 0
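Part of the answer here is how OFFSET works: the server must still read, sort, and discard every skipped row, so later pages cost more than earlier ones. A seek-based (keyset) page instead filters on the last item_price of the previous page. A sketch, with SQLite standing in for MySQL, invented table contents, and ties on item_price ignored for simplicity:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE bank "
             "(id INTEGER PRIMARY KEY, withdraw INTEGER, item_price REAL)")
conn.executemany("INSERT INTO bank (withdraw, item_price) VALUES (?, ?)",
                 [(i % 2, float(i)) for i in range(1000)])
# Composite index so the WHERE filter and the ORDER BY both use it;
# this also avoids the CAST, which would defeat the index entirely.
conn.execute("CREATE INDEX idx_bank ON bank (withdraw, item_price)")

# Page 1: plain LIMIT.
page1 = conn.execute(
    "SELECT id, item_price FROM bank WHERE withdraw = 0 "
    "ORDER BY item_price DESC LIMIT 30"
).fetchall()

# Page 2: seek past the last price seen instead of OFFSET 30.
last_price = page1[-1][1]
page2 = conn.execute(
    "SELECT id, item_price FROM bank WHERE withdraw = 0 "
    "AND item_price < ? ORDER BY item_price DESC LIMIT 30",
    (last_price,)
).fetchall()
```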

How to pull random N records from within a range of rows and keep iterating?

I have a MySQL database with records 1-100 that I'm displaying in a feed.
When the page first loads, I'd like to load a random 10-record window and then iterate from there, say:
80-90, 90-100, 1-10, 10-20, ... all the way back to 80, and then stop, so that all records get shown.
The next time the page loads, I'd like it to start at another random window of 10:
40-50, 50-60, ... all the way back to 40.
You can use the LIMIT and OFFSET clauses offered by MySQL.
First:
SELECT * FROM YOUR_TABLE LIMIT 10;
From the next run onward, do the following:
SELECT * FROM YOUR_TABLE LIMIT 10 OFFSET 10;
For that you have to track how many times the user has visited the page, and load the next window from there. With the help of ROW_NUMBER() you can assign a serial number.
You can create a temp table so that if the user has visited once you start from 80, if twice then from 40, etc.
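The rotation described above comes down to offset arithmetic: pick a random starting block, then step through all ten 10-row windows, wrapping around at 100. A sketch; the function name and page size are illustrative only, and each value in the result would be plugged into OFFSET:

```python
import random

TOTAL, PAGE = 100, 10  # 100 records, windows of 10 (from the question)

def window_sequence(start_block=None):
    """Return the OFFSET for each page, starting at a random block
    and wrapping so every record is shown exactly once."""
    blocks = TOTAL // PAGE
    if start_block is None:
        start_block = random.randrange(blocks)
    return [((start_block + k) % blocks) * PAGE for k in range(blocks)]

# e.g. window_sequence(8) -> offsets 80, 90, 0, 10, ..., 70, matching
# the "80-90, 90-100, 1-10, ... back to 80" order in the question.
```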