Codeigniter - Combining Get and then Update on same query - mysql

I am using the code below to fetch records matching a condition, and then to update only those same records.
$this->db->where('Parameter1', 'TRUE');
$query = $this->db->get('Messages');
$this->db->where('Parameter1', 'TRUE');
$this->db->set('Parameter1', 'FALSE');
$this->db->update('Messages');
This works, but building the same where() clause twice seems like a waste of server power. Is it possible to make the get() call not reset the query, or to reuse the previous result set in the update?

I doubt this is something you really need to worry about resource-wise, and you can't reuse the WHERE clause within the actual SQL query anyway. But if you'd like, you can refactor into slightly cleaner code.
$unread = array('Parameter1' => 'TRUE');
$read = array('Parameter1' => 'FALSE');
$query = $this->db->get_where('Messages', $unread);
$this->db->update('Messages', $read, $unread);
Note:
In your code you're fetching every row where Parameter1 is TRUE, and then flipping every row where Parameter1 is TRUE to FALSE. Any rows that arrive between the two statements get marked FALSE without ever having been fetched. That is almost certainly not desirable, but perhaps it is a problem you take care of somewhere else in your real application.
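If that gap matters, one way to close it (my suggestion, not part of the original answer) is to update only the rows you actually fetched, keyed by primary key; this assumes the table has an id column:
$query = $this->db->get_where('Messages', $unread);
$ids = array_column($query->result_array(), 'id'); // assumes an `id` primary key
if (!empty($ids)) {
    // Only the rows we just read are flipped; rows arriving in between are untouched.
    $this->db->where_in('id', $ids);
    $this->db->update('Messages', $read);
}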

Related

Is it good to filter data in controller or use sql query in model?

What is the best approach for searching? What difference does it make if I filter all the data in the controller versus using a WHERE query in the model? Please share your opinion.
It depends on the complexity of your query. You can measure the processing time by putting timing flags in your code between each step.
Then run a speed test along these lines:
print time_flag_1
results_sql_processing = *complex query*
print time_flag_2
raw_results_script_processing = *dump query*
print time_flag_3
results_script_processing = *process the raw data*
print time_flag_4
and make sure results_script_processing == results_sql_processing.
You can also try different dataset sizes (LIMIT 100, 500, 1000) and see how the difference between the two approaches evolves.
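As a concrete illustration, here is a minimal sketch of that test in plain PHP with PDO; the orders table, its columns, and the connection details are all hypothetical:
// Hypothetical connection and schema, just to illustrate the timing flags.
$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');

// Flag 1 -> 2: let SQL do the aggregation.
$t0 = microtime(true);
$resultsSql = $pdo->query(
    'SELECT customer_id, SUM(total) AS spent FROM orders GROUP BY customer_id'
)->fetchAll(PDO::FETCH_ASSOC);
$t1 = microtime(true);

// Flag 2 -> 3: dump the raw rows instead.
$raw = $pdo->query('SELECT customer_id, total FROM orders')->fetchAll(PDO::FETCH_ASSOC);
$t2 = microtime(true);

// Flag 3 -> 4: do the same aggregation in the script.
$resultsScript = [];
foreach ($raw as $row) {
    $resultsScript[$row['customer_id']] =
        ($resultsScript[$row['customer_id']] ?? 0) + $row['total'];
}
$t3 = microtime(true);

printf("sql: %.4fs  dump: %.4fs  script: %.4fs\n", $t1 - $t0, $t2 - $t1, $t3 - $t2);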
Also, I would recommend taking a look at the query builders you will find in many frameworks (I use Laravel's query builder).
They are usually a very good compromise when the query isn't too complex (no data aggregation, complex concatenation, ...): you can still use joins, unions and many filters with them.
But if you need a super complex pivot table, for instance, just build a strong SQL query and fire it from your code!
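For reference, here is a small sketch of what that looks like with Laravel's query builder; the table and column names are purely illustrative:
use Illuminate\Support\Facades\DB;

$rows = DB::table('orders')
    ->join('customers', 'customers.id', '=', 'orders.customer_id')
    ->where('orders.total', '>', 100)
    ->select('customers.name', 'orders.total')
    ->get();
The builder composes into a single SQL statement, so the filtering still happens in the database.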

After issuing a DbQuery with an insert, how can I get the generated id, so I can add it back to the DAO?

It's more than just insert, really. If I already have a partially loaded DAO, how can I load the rest of it?
What I'm going to do is run a select query and then use BeanCopy. I'd rather have the result set mapper set the properties on the DAO directly.
Ok, let me try to answer this. For all generated values (like auto-generated IDs) you can use the following flow:
q = DbEntitySql.insert(foo).query();
// ... or any other way to get DbQuery
q.setGeneratedColumns("ID");
q.executeUpdate();
DbOomUtil.populateGeneratedKeys(dao, q);
Basically, for each query/DAO you need to specify the fields that are auto-generated. Currently there is no annotation for doing so - we are trying to keep the number of annotations as small as possible. We are working on making this more automatic.
Now, for populating the DAO: I would not use BeanCopy - simply load a new DAO instance and ditch the old one. So after you execute the full select query you will have the full DAO loaded, and can just continue with it.
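For comparison, plain JDBC exposes the same idea through Statement.RETURN_GENERATED_KEYS; a minimal sketch, where the foo table, its name column, and the dao.setId setter are hypothetical:
// Assumes java.sql.* imports and an open Connection conn.
PreparedStatement ps = conn.prepareStatement(
        "INSERT INTO foo (name) VALUES (?)",
        Statement.RETURN_GENERATED_KEYS);
ps.setString(1, "bar");
ps.executeUpdate();
try (ResultSet keys = ps.getGeneratedKeys()) {
    if (keys.next()) {
        dao.setId(keys.getLong(1)); // hand the auto-generated ID back to the DAO
    }
}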

Why mongoDB takes less time for Select than Fetch time?

I have a collection with 10 million rows and no index, so in this case the system should have to read the whole collection, right?
When I use an explain statement it shows:
db.employees.find({hundreds2:{$lt:1}},{}).explain();
"nscannedObjects" : 10000000,
"n" : 105,
"millis" : 6027
It works fine.
But I am using Java to process the query. Here is the code:
whereQuery = new BasicDBObject();
whereQuery.put("hundreds2", new BasicDBObject("$lt", rangeQuery));

timer.start();
setupMongoDBReadQuery(arrForRange[posOfArr]);
cursor = coll.find(whereQuery);
timer.stop();
this.selectTime = timer.elapsed();

timer.start();
while (cursor.hasNext()) {
    numberOfRows++;
    cursor.next();
}
timer.stop();
this.fetchTime = timer.elapsed();
this.totalOfSelAndFetch = this.selectTime + this.fetchTime;
But the test produced these results:
selTime=2 fetchTime=6350 numRows105 TotalTime6352
selTime=0 fetchTime=6290 numRows471 TotalTime6290
selTime=0 fetchTime=6365 numRows922 TotalTime6365
Why is the fetch time so much more than the select time? As far as I know, the while loop is just iterating over the data. Why does it take so long, and how does MongoDB manage to select the matching rows in 0 or 2 milliseconds?
I ran the same experiment in MySQL with similar code, and the results are:
selTime=6302 fetchTime=1 numRows105 TotalTime6303
selTime=6318 fetchTime=1 numRows471 TotalTime6319
selTime=6387 fetchTime=2 numRows922 TotalTime6389
MongoDB uses lazy evaluation with cursors. That means in many cases when you start a MongoDB query which returns a cursor, the query doesn't get executed yet.
The actual selection happens when you start requesting data from the cursor.
The main reason is that this allows you to call methods like sort(by), limit(n) or skip(n) on the cursor which can often be processed much more efficiently on the database before selecting any data.
So what you measure with the "fetch time" is actually also part of the selection.
If you want to force the query to execute without fetching any data yet, you can call explain() on the cursor; the database can't measure the execution time without actually performing the query. In real-world use, however, I would recommend not doing this and instead using cursors the way they were intended.
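To see this with the Java driver code from the question, force the round trip before the fetch loop with a single hasNext() call; the cost then shifts from the fetch phase into the select phase (identifiers as in the question):
timer.start();
cursor = coll.find(whereQuery); // builds the cursor only; no server round trip yet
cursor.hasNext();               // forces the query to actually execute now
timer.stop();
this.selectTime = timer.elapsed(); // now includes the real collection scan
With that one extra line the numbers should look much more like the MySQL results, because the expensive scan happens inside the timed select phase.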

how to use async.series in middle of for loop

I am trying to find a way to ensure my database query runs as part of the for loop shown below. I don't know if async.series is the best way, or if fibers might work. Rather than paste loads of code, here it is in pseudocode:
for length of array
    newArray = array.split
    databaseQuery(select **** from *** where x = newArray[0] and y = newArray[1])
My problem is that because Node is asynchronous, it simply repeats the for loop's split while the query is running and only then gets the result of the query, meaning I only get the last result returned.
Is there a way to ensure the database query is executed on each pass of the for loop? I've used nested queries and callbacks in the past, but I can't figure out the best way to call the query each time through the loop.
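One common pattern for this is async.eachSeries, which starts the next iteration only after the current query's callback fires. A sketch under assumptions: db.query takes (sql, params, callback) as in node-mysql, and each array item is a comma-separated pair:
var async = require('async');

async.eachSeries(array, function (item, done) {
    var parts = item.split(','); // the "array.split" step from the pseudocode
    db.query(
        'SELECT * FROM some_table WHERE x = ? AND y = ?', // illustrative table name
        [parts[0], parts[1]],
        function (err, rows) {
            if (err) return done(err);
            console.log(rows); // this iteration's result, in order
            done();            // only now does the next iteration begin
        }
    );
}, function (err) {
    if (err) console.error('one of the queries failed:', err);
});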

nested sql queries in rails

I have the following query
@initial_matches = Listing.find_by_sql(["SELECT * FROM listings WHERE industry = ?", current_user.industry])
Is there a way I can run another SQL query on the result of the above query using an each..do block? I want to run Geokit calculations to eliminate certain listings that are outside of a specified distance...
Your question is slightly confusing. Do you want to use each..do (Ruby) to do the filtering, or do you want to use a SQL query? Here is how you can let Ruby do the filtering:
refined_list = @initial_matches.map { |listing|
  listing.out_of_bounds? ? nil : listing
}.compact
If you wanted to use SQL you could simply add additional SQL (maybe a sub-select) to your Listing.find_by_sql call.
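For example, a sketch where the extra condition is purely illustrative:
@initial_matches = Listing.find_by_sql(
  ["SELECT * FROM listings WHERE industry = ? AND created_at > ?",
   current_user.industry, 30.days.ago])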
If you want to do as you say in your comment:
WHERE location1.distance_from(location2, :units => :miles)
you are mixing Ruby (location1.distance_from(location2, :units => :miles)) and SQL (WHERE X > 50). This is difficult, but not impossible.
However, if you have to do the distance calculation in Ruby anyway, why not do the filtering there as well? So, in the spirit of my first example:
listing2 = some_location_to_filter_by
@refined_list = @initial_matches.map { |listing|
  listing.distance_from(listing2) > 50 ? nil : listing
}.compact
This will iterate over all listings, keeping only those that are no further than 50 from some predetermined listing.
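As an aside, the map-then-compact idiom can be written more directly with reject, which reads as "drop the ones that fail the test":
@refined_list = @initial_matches.reject { |listing| listing.distance_from(listing2) > 50 }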
EDIT: If this logic is done in the controller you need to assign to @refined_list instead of refined_list, since only controller instance variables (as opposed to local ones) are accessible in the view.
In short, no. After the initial query you are not left with a relational table or view; you are left with an array of ActiveRecord objects. So any processing after the initial query has to be done with Ruby and ActiveRecord, not SQL.