I want to query all records in my table for a CSV export. Normally limiting the results to 20-25 is fine with pagination, but in this case I want to get rid of the LIMIT call that is automatically added to my query.
How can I avoid the automatic LIMIT call?
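If this is an ActiveRecord relation where a pagination gem adds the LIMIT (an assumption; the question doesn't name the stack), one option is to strip the limit and offset from the relation before exporting. A minimal sketch, using a hypothetical Record model:

require "csv"

# Hypothetical paginated relation, e.g. as built by a pagination gem.
paginated = Record.order(:created_at).limit(25).offset(0)

# Drop the LIMIT/OFFSET so the CSV export sees every row.
all_rows = paginated.except(:limit, :offset)   # or: paginated.unscope(:limit, :offset)

csv = CSV.generate do |out|
  out << Record.column_names
  # find_each loads rows in batches, keeping memory flat for large tables.
  all_rows.find_each { |record| out << record.attributes.values }
end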
Related
I have a table that holds almost a thousand records. Using the DataTables plugin I can limit the number of records shown on the screen, but the query against the database takes too much time. I want to limit the database query based on the number of records shown by DataTables.
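If the backend is Rails (an assumption here; the question doesn't say), DataTables' server-side processing mode sends start and length parameters with every request, and those map directly onto OFFSET and LIMIT, so only one page's worth of rows is ever queried. A rough sketch with a hypothetical Record model:

# Controller action for DataTables server-side processing (sketch, not a drop-in).
def index
  start  = params.fetch(:start, 0).to_i    # first row of the requested page
  length = params.fetch(:length, 25).to_i  # page size chosen in DataTables
  page   = Record.order(:id).offset(start).limit(length)
  render json: {
    draw: params[:draw].to_i,
    data: page,
    recordsTotal: Record.count,
    recordsFiltered: Record.count
  }
end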
I am trying to use the ActiveRecord group command in a Rails REST API; my database is MySQL.
When I query without group I get correct data, but when I use group on the same query I get a strange data collection. I am using group to decrease the query time, because the original query takes a lot of time to retrieve data from the database.
Here is my original query:
Records.owned_by(User.find_by_email(params[:user].to_s).id).where(device_id: params[:did]).includes(:record_students, :record_employees, :record_admins, :record_others)
But when I use group to increase the efficiency, the returned data set is not valid.
Here is my new query with group:
Records.owned_by(User.find_by_email(params[:user].to_s).id).where(device_id: params[:did]).includes(:record_students, :record_employees, :record_admins, :record_others).group("date(created_at)")
Any idea what is wrong? Thanks.
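For context on why the result looks strange (this is general ActiveRecord/MySQL behaviour, not a claim about this particular schema): group is an aggregation, not a performance hint, so the relation above collapses to at most one row per calendar day, and, depending on the SQL mode, MySQL either rejects the nonaggregated columns or picks an arbitrary row for each group. If the intent were one aggregate value per day, the usual pattern looks more like this sketch:

# group collapses rows: this returns a Hash keyed by day, one entry per day,
# rather than every record with its associations.
Records.owned_by(User.find_by_email(params[:user].to_s).id)
       .where(device_id: params[:did])
       .group("date(created_at)")
       .count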
How do I format a query with multiple ORDER BY columns? The data I'm working with has a date column and a time column, and I want to order by both of them. I know how to do this query in regular SQL, but I can't make it work in SoQL. Here is what I've tried:
This works ('date DESC') but isn't what I'm trying to do:
http://data.sfgov.org/resource/tmnf-yvry.json?$order=date+DESC
This fails ('date DESC, time DESC') with a 403 error:
http://data.sfgov.org/resource/tmnf-yvry.json?$order=date+DESC%2Ctime+ASC
This fails ("'date DESC, time DESC'") with a 403 error:
http://data.sfgov.org/resource/tmnf-yvry.json?$order=%27date+DESC%2C+time+DESC%27
Currently, sorting on multiple columns at the same time is something you unfortunately can't do with the SODA API. It'll respond with a "query.execution.queryTooComplex" error like you're seeing.
However, this is something that'll be fixed in the future as we migrate datasets to our new backend. Details on this process and how to tell when/if a dataset has been migrated will be available soon.
Note: You also need to use the $order parameter in your query, not just order. I'll edit your URLs above to match.
I have made a processing application which has to process loads of data from a table, based on a query like this:
SELECT * FROM table_name WHERE column_name = '';
My concern is this: suppose the processing application stops suddenly. On restart, it will start processing from the first row of the table, which I don't want. Is there any keyword in MySQL by which I can get the row number of the table, so that it can be inserted into a separate table each time? Then, on startup, the application could go to the table where the row number has been stored and start processing from that row.
Any help will be welcomed.
Thanks in advance.
You can use LIMIT to skip the first n rows of your query result; you only have to keep track of your progress somewhere, and the next time your program starts, read this information back and continue from there.
SELECT * FROM table WHERE column = '' LIMIT 0,18446744073709551615
The 0 means fetch from the beginning of the result set; to skip the first n rows, just put that number there instead.
18446744073709551615 is the maximum possible number of rows (2^64 - 1), in case you want to fetch all remaining rows in a single query (the MySQL documentation suggests using this number for exactly that purpose).
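A minimal sketch of that idea in Ruby with the mysql2 gem (the question doesn't say what language the processing application is written in, so the connection details, the id ordering column, and the progress.txt file are all assumptions): the current offset is persisted after each row and read back on startup.

require "mysql2"

client = Mysql2::Client.new(host: "localhost", username: "app", database: "mydb")

# Read the last saved position, or start from the beginning on the first run.
offset = File.exist?("progress.txt") ? File.read("progress.txt").to_i : 0

# Resume from the stored offset; the huge second value means "to the end of the result set".
# An explicit ORDER BY keeps the offset stable across restarts.
rows = client.query(
  "SELECT * FROM table_name WHERE column_name = '' " \
  "ORDER BY id LIMIT #{offset}, 18446744073709551615"
)

rows.each do |row|
  # ... process the row ...
  offset += 1
  File.write("progress.txt", offset.to_s)  # persist progress so a crash can resume here
end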
I have a query that fetches 100 rows of events ordered by timestamp. I want to ignore the top 2 entries in the result set. The problem is that there is no criterion to match them on (I simply want to ignore the first 2 rows).
I am using a pager (Drupal), which shows 10 events per page. If I filter them out after fetching the 10 rows, I lose 2 entries (the first page contains only 8 entries). How do I solve this problem?
If you are using Views, you can just set the offset to 2, which will ignore the first two records.
Use LIMIT:
LIMIT 2,98
LIMIT 2,100
Add that to your SQL command; I think it should work.
You can't use offsets with pager_query(), which I assume you're using here. Maybe you need to reconsider how you're querying: run a query for the first two records, and then in your pager SQL use a WHERE condition to exclude the IDs of those first two results.