MySQL query with more rows takes longer to load even though it runs with LIMIT - mysql

I have this problem.
To lower the loading time, I want to get only 30 items from my table. It works, but it takes longer to load when there are many items in the table.
But why? I only want to get 30 items, not everything, and set the offset when going to the next page.
Here is my SELECT query
SELECT *
FROM bank
WHERE withdraw = 0
ORDER BY CAST(item_price AS DECIMAL(10,2)) DESC
LIMIT 30 OFFSET (+page)
Can someone explain to me how to make this faster?
That +page is the offset, filled in from JavaScript; it works.
I changed the query a little bit, but it doesn't help.
SELECT *
FROM bank
WHERE withdraw = 0
ORDER BY item_price DESC
LIMIT 30 OFFSET (+page)
EDIT:
If I change it to the query below, it takes the same time to load, but returns 350 rows? Why?
SELECT *
FROM bank
WHERE withdraw = 0
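One likely reason: MySQL still has to find and sort every row matching withdraw = 0 before LIMIT 30 OFFSET ... is applied, and ORDER BY CAST(item_price AS DECIMAL(10,2)) cannot use a plain index on item_price. A minimal sketch of an index that at least covers the filter (the index name is made up, and whether it also helps the sort depends on how item_price is stored):
CREATE INDEX idx_bank_withdraw_price ON bank (withdraw, item_price);
If item_price is a numeric column, ORDER BY item_price DESC can then read rows in index order and stop after OFFSET + 30 rows; if it is stored as text, the CAST forces a sort of all matching rows regardless.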

Related

Efficient SQL Query to calculate portion of a row in half hourly time series that has occurred

I have a table that looks like this:
id  slot              total
1   2022-12-01T12:00  100
2   2022-12-01T12:30  150
3   2022-12-01T13:00  200
There's an index on slot already. The table has ~100mil rows (and a bunch more columns not shown here)
I want to sum the total up to the current moment in time (EDIT: WASN'T CLEAR INITIALLY, I WILL PROVIDE A LOWER SLOT BOUND, SO THE SUM WILL BE OVER SOME NUMBER OF DAYS/WEEKS, NOT OVER FULL TABLE). Let's say the time is currently 2022-12-01T12:45. If I run select * from my_table where slot < CURRENT_TIMESTAMP(),
then I get back records 1 and 2.
However, in my data, the records represent forecasted sales within a time slot. I want to find the forecasts as of 2022-12-01T12:45, and so I want to find the proportion of the half hour slot of record 2 that has elapsed, and return that proportion of the total.
As of 2022-12-01T12:45 (assuming minute granularity), 50% of row 2 has elapsed, so I would expect the total to return as 150 / 2 = 75.
My current query works, but is slow. What are some ways I can optimise this, or other approaches I can take?
Also, how can we extend this solution to be generalised to any interval frequency? Maybe tomorrow we change our forecasting model and the data comes in sporadically. The hardcoded 30 would not work in that case.
select sum(fraction * total) as t
from (
    select total,
           least(
               timestampdiff(
                   minute,
                   slot,
                   current_timestamp()
               ),
               30
           ) / 30 as fraction
    from my_table
    where slot <= current_timestamp()
) as sub
Consider computing your sum first, then removing the last element's partial total. In order to keep the last element's total, I'd prefer applying window functions instead of aggregation, and limiting the output to the last row.
SET @current_time = CURRENT_TIMESTAMP();

WITH cte AS (
    SELECT slot,
           SUM(total) OVER(ORDER BY slot) AS total,
           total AS rowtotal
    FROM my_table
    WHERE slot < @current_time
    ORDER BY slot DESC
    LIMIT 1
)
SELECT slot,
       total - (30 - TIMESTAMPDIFF(MINUTE, slot, @current_time)) / 30 * rowtotal AS total
FROM cte
Note 1: Adding an index on the slot field is likely to boost this query's performance.
Note 2: If your query runs over millions of rows, the current timestamp may change while the query is executing. You can store it in a variable before the query runs (as done above), or in another CTE.
Create a BTREE index on the slot column, as it has high selectivity.
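For reference, a minimal sketch of that index statement, using the my_table name from the question and a made-up index name:
CREATE INDEX idx_my_table_slot ON my_table (slot) USING BTREE;
USING BTREE is optional here, since BTREE is already the default index type for InnoDB tables.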

Cutting SELECT query time in MySQL

I'm using CodeIgniter 2 and, in my database model, I have a query that joins two tables and filters rows based upon distance from a given geolocation.
SELECT users.id,
(3959 * acos(cos(radians(42.327612)) *
cos(radians(last_seen.lat)) * cos(radians(last_seen.lon) -
radians(-77.661591)) + sin(radians(42.327612)) *
sin(radians(last_seen.lat)))) AS distance
FROM users
JOIN last_seen ON users.id = last_seen.seen_id
WHERE users.age >= 18 AND users.age <= 30
HAVING distance < 50
I'm not sure if it's the distance calculation that is making this query take especially long. I do have over 300,000 rows in my users table, and the same number in my last_seen table. I'm sure that plays a role.
But, the age column in the users table is indexed along with the id column.
The lat and lon columns in the last_seen table are also indexed.
Does anyone have ideas as to why this query takes so long and how I can improve it?
UPDATE
It turns out that this query actually runs pretty quickly. When I execute this query in PHPMyAdmin, it takes 0.56 seconds. Not too bad. But, when I try to execute this query with a third party SQL client like SequelPro, it takes at least 20 seconds and all of the other apps on my mac slow down. When the query is executed by loading the script via jQuery's load() method, it takes around the same amount of time.
Upon viewing my network tab in Google Chrome's developer tools, it seems that the reason it's taking so long to load is what's called TTFB, or Time To First Byte. It's taking forever.
To make this query faster you need to limit the number of rows using an index before actually calculating the distance on each and every one of them. To do so, you can limit the rows from last_seen based on their lat/lon and a rough bounding box for the desired distance.
The idea is that a position at the same latitude as the reference point is within 50 miles only if its longitude falls within a certain distance of the reference longitude, and vice versa.
For a 50-mile distance, RefLat ± 1 and RefLon ± 1 would be a good start to limit the rows before actually calculating the precise distance.
last_seen.lat BETWEEN 42.327612 - 1 AND 42.327612 + 1
AND last_seen.lon BETWEEN -77.661591 - 1 AND -77.661591 + 1
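Putting that rough bounding box into the original query might look like the sketch below; the constants and column names come from the question, while the assumption that an index on last_seen (lat, lon) exists to serve the BETWEEN filters is mine:
SELECT users.id,
       (3959 * ACOS(COS(RADIANS(42.327612)) * COS(RADIANS(last_seen.lat)) *
        COS(RADIANS(last_seen.lon) - RADIANS(-77.661591)) +
        SIN(RADIANS(42.327612)) * SIN(RADIANS(last_seen.lat)))) AS distance
FROM users
JOIN last_seen ON users.id = last_seen.seen_id
WHERE users.age BETWEEN 18 AND 30
  AND last_seen.lat BETWEEN 42.327612 - 1 AND 42.327612 + 1
  AND last_seen.lon BETWEEN -77.661591 - 1 AND -77.661591 + 1
HAVING distance < 50;
This way the exact distance formula only runs on rows that already fall inside the ±1 degree box.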
For this query:
SELECT users.id, (3959 * acos(cos(radians(42.327612)) * cos(radians(last_seen.lat)) * cos(radians(last_seen.lon) - radians(-77.661591)) + sin(radians(42.327612)) * sin(radians(last_seen.lat)))) AS distance
FROM users JOIN
last_seen
ON users.id = last_seen.seen_id
WHERE users.age >= 18 AND users.age <= 30
HAVING distance < 50;
The best index is users(age, id) and last_seen(seen_id). Unfortunately, the distance calculations are going to take a while, because they have to be calculated for every row. You might want to consider a GIS extension to MySQL to help with this type of query.
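A sketch of the suggested indexes, with index names made up for illustration:
CREATE INDEX idx_users_age_id ON users (age, id);
CREATE INDEX idx_last_seen_seen_id ON last_seen (seen_id);
With users(age, id) the age filter and the join key can both be read from the index, and last_seen(seen_id) supports the join; the distance expression itself still has to be evaluated for every joined row.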

query optimization with multiple sub-queries

I want to retrieve the number of meters in this month minus the number of meters in the previous month, with the meter values subtracted according to their respective codes, and then the whole thing summed up.
There are about 8000 records, but when I try to take 5 records it takes 2:53 sec, and 100 records takes 1 min 57 sec.
That really matters.
I have a query like this:
SELECT code hvCode,
       IFNULL( (SELECT meter
                FROM bmrpt
                WHERE waktu_foto LIKE '2014-05%'
                GROUP BY code HAVING code = hvCode), 0 )
     - IFNULL( (SELECT meter
                FROM bmrpt
                WHERE waktu_foto LIKE '2014-04%'
                GROUP BY code HAVING code = hvCode), 0 ) hasil
FROM bmrpt
GROUP BY code;
Does anybody have an idea how to change the query so it is optimized?
Here is the sqlfiddle: http://www.sqlfiddle.com/#!2/495c0/1
Best regards
Though your question is unclear, try the subquery below, as far as I understand what you want:
SELECT (SELECT COALESCE(SUM(`meter`), 0) FROM `table` WHERE code = 'hvCode' AND MONTH(`date_column`) = 5)
     - (SELECT COALESCE(SUM(`meter`), 0) FROM `table` WHERE code = 'hvCode' AND MONTH(`date_column`) = 4) AS hasil;
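For reference, the two correlated lookups could also be collapsed into a single pass over bmrpt with conditional aggregation. This is only a sketch: it uses the column names from the question and assumes the intent is to sum the meter values per code and month, whereas the original subqueries return a single indeterminate meter value per group:
SELECT code AS hvCode,
       SUM(CASE WHEN waktu_foto LIKE '2014-05%' THEN meter ELSE 0 END)
     - SUM(CASE WHEN waktu_foto LIKE '2014-04%' THEN meter ELSE 0 END) AS hasil
FROM bmrpt
GROUP BY code;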

mysql compare 3 columns order on overall

I have a game where a user can save their name and score in a database.
Columns
Name, Tries, Percentage correct, Time taken, Image set
3 of these columns are based on a score
Tries, Percentage correct, Time taken,
Currently I have the table displayed with time ascending.
$sql = "SELECT * FROM Score ORDER BY time ASC LIMIT 5";
Is there a way to compare the scores?
This is an example of 2 scores
1st row: 46, 52%, 02:36
2nd row: 38, 63%, 02:47
Is there a way that I can compare, on average, which of those should be on top based on all 3 scores?
Row one had more tries and a lower percentage correct, but a faster time.
Row two had fewer tries and a higher percentage correct, but a slower time.
In theory: tries ASC, percent DESC, time ASC.
If I change the ORDER BY to:
$sql = "SELECT * FROM Score ORDER BY 'time ASC', 'tries ASC', 'percent DESC' LIMIT 5";
will it mess the rows up, or will it display them in an order based on all 3?
The image (not shown here) uses time ASC.
The minimum number of tries is 12, which will give 100%.
Somehow I need to compare Tries with Time.
Can I divide time / tries and then order by the result?
Is that right?
$sql = "SELECT * FROM Score ORDER BY time / tries ASC LIMIT 5";
If you want to divide 2 columns and then order by that result, use:
$sql = "SELECT * FROM Score ORDER BY time / tries ASC LIMIT 18446744073709551615";
Then, if you wish, you can display that result in your table.
For example:
<td>".round($data[4] / $data[2],2)." Seconds</td>
/ will divide time by tries, and wrapping it with round() rounds the result.
In this case I would recommend round(??? ,2), which will output something like 3.53 Seconds.
If you want an average score then use time + tries - percent.
Lowest possible tries = 12.
Time could still be high even with 12 tries.
Getting 12 tries will give 100%; that is a high value, so it's best to subtract it.
12 tries + 1m 30s - 100%
12 tries + 10h 30m 30s - 100%
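A rough sketch of that combined ordering, assuming time is stored as a TIME column and tries and percent as numbers (the schema isn't shown, so these are assumptions); TIME_TO_SEC() converts the time so all three values can be added in one expression:
SELECT * FROM Score ORDER BY (tries + TIME_TO_SEC(time) - percent) ASC LIMIT 5;
The weighting is arbitrary: seconds, tries and percentage points are treated as equal units, so you would probably want to scale each term before relying on this ranking.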
This is too long for a comment.
"Is there away that i can compare on average which of those should be top based on all 3 scores?"
Yes. But first you have to figure out what the method is. Then you can implement it in a query.
Second, your query as written is:
SELECT *
FROM Score
ORDER BY 'time ASC', 'tries ASC', 'percent DESC'
LIMIT 5;
This will do nothing, because it is ordering by three constants. Drop the single quotes, and only use them for string constants:
SELECT *
FROM Score
ORDER BY time ASC, tries ASC, percent DESC
LIMIT 5;
In practice, this would be very much like:
ORDER BY time ASC
Unless a lot of people have exactly the same time on two rows, the additional ordering criteria will not be used.

Getting the last database entry with sql is very slow

My database has two columns: time and work. It is very, very large and still growing.
I want to select the last entry:
$sql = "SELECT time, work FROM device ORDER BY time DESC LIMIT 1"
It takes 1.5 seconds to respond. How can I speed this up? Because I repeat it 20 times,
I can't wait 20 seconds.
Greetings!
Use MAX:
SELECT *
FROM device
WHERE time = (SELECT MAX(time) FROM device)
Also add an index on the time column.
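A sketch of that index, with a made-up index name and the device table from the question:
CREATE INDEX idx_device_time ON device (time);
With an index on time, both the original ORDER BY time DESC LIMIT 1 query and the MAX(time) subquery can be answered by reading just the end of the index instead of scanning the whole table.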
I was wondering why you want to repeat this 20 times. If you are working at the application level, maybe you can store the result in a variable so you won't execute the query again.