I already asked a question here.
This question is a step further on the same problem.
I have Delphi code that works perfectly for calling the report.
But now I want to show a MessageBox before opening the .rpt file.
I tried querying separately for the record count and then deciding whether to show the MessageBox. But this approach has a worst case: one particular report's query takes 3 minutes to execute on its own, and then querying again while opening the .rpt takes another 30 seconds to load (the second query probably takes less time because some of the data is already in a buffer/temp area, etc.).
qPODPhy.Close;
qPODPhy.SQL.Clear;
qPODPhy.SQL.Text := 'select * from ViewName';
qPODPhy.Open;
if qPODPhy.RecordCount < 1 then
  MessageBox('No data To Display...')
else
begin
  crRep.SomeProperties := Initialization
  .
  .
  .
  crRep.SQLQuery := qPODPhy.SQL.Text;
  crRep.Action := 1;
end;
My question is:
How can I show a MessageBox if no records are going to be present in a particular view's output?
OR
Is there a way to open the dataset of the .rpt file in Delphi code, just check the record count, and make the decision? In short, is there some property of the Crystal Reports component that can do this?
You can do a SELECT COUNT(*) separately; that is much faster.
Or maybe select only one record: SELECT TOP 1 ....
And, as RBA suggested, you can try putting that SELECT COUNT in a stored procedure for even more speed.
Just experiment with these methods to see if/when you've gained enough speed.
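For example, here is a minimal sketch of the count-first approach, reusing the qPODPhy query and ViewName from the question (ShowMessage stands in for whatever MessageBox wrapper is being used):

qPODPhy.Close;
qPODPhy.SQL.Text := 'select count(*) as cnt from ViewName';
qPODPhy.Open;
if qPODPhy.FieldByName('cnt').AsInteger < 1 then
  ShowMessage('No data To Display...')
else
begin
  // initialize crRep as before, then hand it the full query and run the report
  crRep.SQLQuery := 'select * from ViewName';
  crRep.Action := 1;
end;

A COUNT(*) lets the database skip materializing and transferring all the rows, so it is usually much cheaper than opening the full view twice.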
Are you pushing the data? You can probably use the ReadRecords method of the ReportDocument and check the Rows.Count property. If the report is retrieving the data itself, then you can use the NoData event of the ReportDocument.
Related
I have a stored procedure in MySQL that returns two result sets, and I need to show them in Delphi, but I haven't found out how to step through each result.
Here is how it appears in DBForge when I execute it; I want the same in Delphi: show Query1 and Query2 in a TTabControl.
How do I go through these result sets and get the name of each one, like Query1, Query2?
You don't say what DB interface lib you're using, like FireDAC, Zeos, or something else.
You're going to issue something like a dbxyz.ExecSQL() call and check for some results.
It sounds like you're expecting to get back multiple records in the result set. Using a TTabControl, you'd simply create a list of tabs like "Result-1", "Result-2", and so on, depending on how many results you get back. (They're results, not queries.) You could select the first one by default.
When another tab is clicked, use the control's TabIndex property to select the corresponding result from the result set, then format the data in that result and display it, in whatever layout you're using, below the tabs.
You've given so little detail that it's impossible to show code for anything other than the tab control's OnChange handler, which uses the TabIndex property to find the desired result set (see the sketch below). But this is the overall approach I'd take with the TTabControl.
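As an illustration only (the names TabControl1, FResultSets, and ShowResultInGrid are hypothetical, and how you cache each result set depends on your data-access library), the OnChange handler could be as small as:

procedure TForm1.TabControl1Change(Sender: TObject);
begin
  // Each tab corresponds to one cached result set;
  // TabIndex tells us which one to display below the tabs.
  ShowResultInGrid(FResultSets[TabControl1.TabIndex]);
end;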
I solved the problem with this code:
var
  tab: TTabItem;
  stringGrid: TStringGrid;
  I, J: Integer;
repeat
  tab := TTabItem.Create(nil);
  tab.Parent := TabControl1;
  tab.Text := query.Fields.Fields[0].FieldName; // the table name
  stringGrid := TStringGrid.Create(Self);
  stringGrid.Parent := tab;
  stringGrid.Align := TAlignLayout.Client;
  stringGrid.RowCount := query.RecordCount; // size the grid before filling Cells
  for I := 1 to query.FieldCount - 1 do
  begin
    stringGrid.AddObject(TStringColumn.Create(stringGrid));
    stringGrid.Columns[I - 1].Header := query.Fields.Fields[I].FieldName;
    query.First;
    for J := 0 to query.RecordCount - 1 do
    begin
      stringGrid.Cells[I - 1, J] := query.Fields.Fields[I].AsString;
      query.Next;
    end;
  end;
until not query.OpenNext;
I am totally clueless about how to get the following kind of result from the same table in MySQL.
Required result:
The raw data is as shown in the image below.
mc_id and op_id can vary. For example, if mc_id is 4 and op_id is 10, then it has to loop through each vouid and extract done_on_date, and then loop again for the same mc_id 4 and op_id 10 and extract the done_on_date that falls after the first extracted done_on_date. We refer to this second extracted done_on_date as next_done_on_date, just to distinguish it. Continue accordingly until the end of the table. I hope I am clear enough now.
The idea is basically to see when a particular op_id was carried out for the machine with the given mc_id. The first time the operation is done is referred to as done_on_date, and when the same operation is carried out for the same machine the next time, we refer to it as next_done_on_date, although inside the database table it is actually stored as done_on_date.
Let me know if anything still needs clarifying.
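Not an answer, just a sketch of one way this pairing could be expressed, assuming MySQL 8.0 or later (for the LEAD() window function); the table name machine_operations and the query component qHistory are hypothetical stand-ins:

// Hypothetical sketch: pair each done_on_date with the next one
// for the same mc_id/op_id combination.
qHistory.SQL.Text :=
  'SELECT mc_id, op_id, vouid, done_on_date, ' +
  '       LEAD(done_on_date) OVER (PARTITION BY mc_id, op_id ORDER BY done_on_date) ' +
  '       AS next_done_on_date ' +
  'FROM machine_operations ' +   // hypothetical table name
  'ORDER BY mc_id, op_id, done_on_date';
qHistory.Open;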
I have a collection with 10 million rows and no index. In this case the system should read the whole collection, right?
When I use an explain statement, it shows:
db.employees.find({hundreds2:{$lt:1}},{}).explain();
"nscannedObjects" : 10000000,
"n" : 105,
"millis" : 6027
It works fine.
But I am using Java to process the query. Here is the code:
whereQuery = new BasicDBObject();
whereQuery.put("hundreds2",new BasicDBObject("$lt", rangeQuery));
timer.start();
setupMongoDBReadQuery(arrForRange[posOfArr]);
cursor = coll.find(whereQuery);
timer.stop();
this.selectTime= timer.elapsed();
timer.start();
while (cursor.hasNext())
{
numberOfRows++;
cursor.next();
}
timer.stop();
this.fetchTime= timer.elapsed();
this.totalOfSelAndFetch=this.selectTime+this.fetchTime;
But after running the test, I got this information:
selTime=2 fetchTime=6350 numRows105 TotalTime6352
selTime=0 fetchTime=6290 numRows471 TotalTime6290
selTime=0 fetchTime=6365 numRows922 TotalTime6365
Why is the fetch time so much greater than the select time? As far as I know, the while loop is just reading the data. Why does it take so long, and how can MongoDB select the matching rows in 0 or 2 milliseconds?
I did the same experiment in MySQL with similar code, and the results are:
selTime=6302 fetchTime=1 numRows105 TotalTime6303
selTime=6318 fetchTime=1 numRows471 TotalTime6319
selTime=6387 fetchTime=2 numRows922 TotalTime6389
MongoDB uses lazy evaluation with cursors. That means in many cases when you start a MongoDB query which returns a cursor, the query doesn't get executed yet.
The actual selection happens when you start requesting data from the cursor.
The main reason is that this allows you to call methods like sort(by), limit(n) or skip(n) on the cursor which can often be processed much more efficiently on the database before selecting any data.
So what you measure with the "fetch time" is actually also part of the selection.
When you want to force the query to execute without fetching any data yet, you can call explain() on the cursor. The database can't measure the execution time without actually performing the query. However, in actual real-world use I would recommend not doing this, and instead using cursors the way they were intended.
I'm using Gerrit REST API to query all changes whose status is "merged". My query is
https://android-review.googlesource.com/changes/?q=status:merged&n=2
where "n=2" limits the size of query results to 2. So I got a JSON object like:
Of course there are more results. According to the REST documentation:
If the n query parameter is supplied and additional changes exist that match the query beyond the end, the last change object has a _more_changes: true JSON field set. Callers can resume a query with the N query parameter, supplying the last change’s _sortkey field as the value.
So I added the query parameter N, supplying the _sortkey of the last change, 100309. The new query is:
https://android-review.googlesource.com/changes/?q=status:merged&n=2&N=002e4203000187d5
With this new query, I was hoping to get the next 2 results, since I provided the _sortkey as a cursor into my previous search results.
However, it's really weird: this new query returns exactly the same results as the previous one, instead of the next 2 as I expected. It seems like providing "N=002e4203000187d5" has no effect at all.
Does anybody know why using _sortkey to resume my query doesn't work?
I chatted with one of the developers at Google, and he confirmed that _sortkey has been removed from the newer versions of Gerrit they are running at android-review and gerrit-review. The N= parameter is no longer valid. The documentation will be updated to reflect this.
The alternative is to use &S=x to skip x results, which I tested and works well.
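For example, here is a minimal Delphi sketch of paging with the S offset, assuming System.Net.HttpClient and System.JSON from Delphi XE8 or later (note that Gerrit prefixes JSON responses with ")]}'" against XSSI, so the first line is stripped before parsing):

uses
  System.SysUtils, System.Net.HttpClient, System.JSON;

procedure FetchAllMergedChanges;
var
  Client: THTTPClient;
  Offset: Integer;
  Body: string;
  Changes: TJSONArray;
  HasMore: Boolean;
begin
  Client := THTTPClient.Create;
  try
    Offset := 0;
    repeat
      Body := Client.Get(Format(
        'https://android-review.googlesource.com/changes/?q=status:merged&n=2&S=%d',
        [Offset])).ContentAsString;
      Delete(Body, 1, Pos(#10, Body)); // drop the ")]}'" anti-XSSI prefix line
      Changes := TJSONObject.ParseJSONValue(Body) as TJSONArray;
      try
        // ... process the change objects in Changes here ...
        // the last element carries "_more_changes": true while more pages exist
        HasMore := (Changes.Count > 0) and
          (TJSONObject(Changes.Items[Changes.Count - 1]).GetValue('_more_changes') <> nil);
        Inc(Offset, Changes.Count);
      finally
        Changes.Free;
      end;
    until not HasMore;
  finally
    Client.Free;
  end;
end;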
sortkey is deprecated in Gerrit v2.9 -
see the (Gerrit) ReleaseNotes-2.9.txt, under REST API - Changes:
[[sortkey-deprecation]]
Results returned by the [query changes] endpoint are now paginated using offsets instead of sortkeys.
The sortkey and sortkey_prev parameters on the endpoint are deprecated.
The results are now paginated using the --limit (-n) option to limit the number of results, and the -S option to set the start point.
Queries with sortkeys are still supported against old index versions, to enable online reindexing while clients have an older JS version.
See also here -
PSA: Removing the "sortkey" field from the gerrit-on-borg query interface:
...
Our solution is to kill the sortkey field and its related search operators (sortkey_before, sortkey_after, and resume_sortkey).
There are two ways you can achieve similar functionality.
Add "&S=" to your query to skip a fixed number of results.
(Note that this redoes the search so new results may have jumped ahead and
you might process the same change twice.
This is true of the resume_sortkey implementation as well,
so your code should already be able to handle this.)
Use the before/after operators.
Instead of taking the sortkey field from the last returned change and
using it in a resume_sortkey operator, you take the updated field from
the last returned change and use it in a before operator.
(This has slightly different semantics than the sortkey field, which
uses the change number as a tiebreaker when changes have similar updated times.)
...
The 1,500-page Access 97 Bible (don't laugh!) that I've been given by my boss to solve his problem doesn't solve my problem of how to solve his problem, because it contains no VBA code.
Let me first make clear that I've made attempts to solve this without (much) coding, and that I've already coded quite a bit in VBA, so I'm basically familiar with most things, including recordsets, queries, etc., but I'm running into MS Access's limits on how to build a report from data held in VBA variables. I'm also versatile in most programming languages, but this is not a language problem; it's a "how to / what's possible" problem.
My problem right now is that dragging the query fields into the Detail section and placing them into cells in columns, setting Left and Top with VBA code, moves them all right, but each cell ends up on a new page. Unfortunately, there are multiple data items in each cell, which won't fit the options available in the Create Report wizard.
So my question is simply this: can someone point me to working examples of code that creates text fields at any coordinate I please, on a paper size of my choice, places them, and fills them with VBA string variables?
Edit: The above is not an option, as I understand it would prevent the client from getting an .mde database. What remains, then, is simply to ask for some sound advice on how to get several rows, GROUPed BY weekday and machine (see below), into a recordset or similar for each cell. I guess the best way is to count the number of columns in the table (machines in the SQL result), create 5 rows of dummy controls for them, then go through the result rows and place the data in the relevant controls. But if you have ideas for doing this better and faster, write them as answers.
Sorry for this; I knew there was something I wasn't understanding. Basically, I thought Access supported creating reports dynamically via VBA, i.e. "generating pages with data" rather than "preparing a flow of controls connected to data sources". But Access requires that you create an ample number of dummy, unlinked controls manually, then either fill or hide them, and that's how they become "dynamic".
This is for Access 2003 on a remote server accessing local and remote ODBC SQL database tables, if relevant. The goal is to make a week schedule of n columns (n = the number of machines at a certain plant) by 5 rows (weekdays Mon-Fri), and put one or more recordset rows (= scheduled activities for that day on that machine) in each of the n-by-5 table cells.
If you detect venting frustration in this post I can only ask your forgiveness and hope for your understanding.
So, there are many techniques for this:
Ex. 1) Using dynamic SQL:
'Create a function to build the SQL query
Function MakeMySQLReport(Parameters)
    Dim strSql As String
    Dim strMyVar As String   ' expected to hold a field or expression name
    strSql = vbNullString
    strSql = "Select " & strMyVar & " as MyFieldVar1, * from myTable where Fieldx = " & Parameters
    MyReport.RecordSource = strSql
End Function
Ex. 2) Create a function that returns your strings:
Function MyString1() As String
    MyString1 = "ABC"
End Function
And in your report, select the textbox that will receive the value and type =MyString1() as its control source.
I hope this helps. Need more examples?
Solution:
Create many objects manually (grr!)
Name them systematically
Put them in a control array (get all Me.Controls, sift out the ones you're interested in, and put them in an indexed array)
Go through the array and change their properties