I have a view where the user selects fields to make an advanced search. The fields are:
**name** - **age** - **location** - **isPaid** - **date_subscription** - **date_expiration**
The user might choose one column or a combination of several. I'm not sure whether I should use if statements to detect which columns are selected and then run a query that depends on them, because that way I would have to write out every valid combination of conditions. Is there another way to execute such queries?
I was writing this query, but I stopped because I realized how long it would be:
SELECT * FROM internetclientdetails icd INNER JOIN internetclient ic ON ic.id = icd.icid WHERE
(icd.date_sub <= ".$start_dateSubsc." AND icd.date_sub >= ".$end_dateSubsc.")
OR
(icd.date_exp <= ".$start_dateExp." AND icd.date_exp >= ".$end_dateExp.")
OR
(
(icd.date_sub <= ".$start_dateSubsc." AND icd.date_sub >= ".$end_dateSubsc.")
AND
(icd.date_exp <= ".$start_dateExp." AND icd.date_exp >= ".$end_dateExp.")
)
)
OR
.
.
.
but this is too long to write out, since I still have four more fields to combine with OR and AND operators.
Generally you use a query-building library, which can construct valid SQL from the input the user selects.
There are a number of them: for example, the Doctrine database abstraction layer, Medoo, and others.
In Medoo, for example, a rather complex query such as:
SELECT account,user_name FROM table
WHERE user_id IN (2,123,234,54) OR
email IN ('foo@bar.com','cat@dog.com','admin@medoo.in')
will be written in PHP as:
$database->select("table", ["account", "user_name"], [
    "OR" => [
        "user_id" => [2, 123, 234, 54],
        "email" => ["foo@bar.com", "cat@dog.com", "admin@medoo.in"]
    ]
]);
So all you have to do when using Medoo is pass it the correct input from the form.
As regards your question about how the user selects different columns, you can use something like this:
$mapping=array("start_dateSubsc"=>"date_sub", "end_dateSubsc"=>"date_sub",...);
where you list all the possible fields the user can enter on the web page, mapped to the real database table column names.
Then you do something like this, when you process the page:
$wherequery["OR"] = array();
foreach ($mapping as $userfield => $dbfield)
{
    if (array_key_exists($userfield, $_REQUEST))
        $wherequery["OR"][$dbfield] = $_REQUEST[$userfield];
}
$database->select("table", ["your", "columns"], $wherequery);
This will work for fields that must be equal (=) to what the user entered, and where any one of the fields may match.
You would have a bit more to do with fields that can be a range, and would process them separately, as well as fields combined with "AND", but that depends on the range and possibilities of your actual fields.
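For the range fields, the same mapping idea extends naturally; a minimal sketch, assuming Medoo's `column[<>]` between-operator syntax, with hypothetical form field names and a hardcoded `$input` array standing in for `$_REQUEST`:

```php
<?php
// Simulated form input; in practice this would come from $_REQUEST.
$input = [
    "name"            => "john",
    "start_dateSubsc" => "2015-01-01",
    "end_dateSubsc"   => "2015-06-30",
];

// Equality fields: form field name => database column.
$equalityMap = ["name" => "name", "age" => "age", "location" => "location", "isPaid" => "isPaid"];

// Range fields: database column => [start form field, end form field].
$rangeMap = [
    "date_sub" => ["start_dateSubsc", "end_dateSubsc"],
    "date_exp" => ["start_dateExp",   "end_dateExp"],
];

$where = ["OR" => []];

foreach ($equalityMap as $userField => $dbField) {
    if (array_key_exists($userField, $input)) {
        $where["OR"][$dbField] = $input[$userField];
    }
}

// Medoo expresses BETWEEN as "column[<>]" => [min, max]; only add the
// condition when both ends of the range were submitted.
foreach ($rangeMap as $dbField => [$startField, $endField]) {
    if (isset($input[$startField], $input[$endField])) {
        $where["OR"][$dbField . "[<>]"] = [$input[$startField], $input[$endField]];
    }
}

print_r($where);
```

The resulting `$where` array can then be passed straight to `$database->select()`; fields the user left empty simply never appear in it.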
I have a problem: checking whether any values in an array match any values in a JSON column which contains a named array.
Suppose I have an array [25,36,45,52] and the JSON column is {"values": [25,24,15]}.
I want to check whether any values in the array match any of the values in the JSON column, in MySQL (under XAMPP). Please suggest a good way of doing this.
I have 4 tables:
user
profile
skill
jobs
user table (id, userid)
jobs table (id, user_id, skill_id)
skill table (id, job_id)
profile table (id, user_id)
Now I want to search for all jobs that match some, or at least one, of the skills.
I have tried this, but it returns all jobs without filtering by skills:
$jobs = Job::with(['user', 'profile'])->with(['skills' => function ($query) {
    $query->whereJsonContains('skills->skills', [35]);
}])->where('jobs.is_completed', 0)->get();
Please help me.
You can use a where clause easily; for example, to get the rows that match skills 35 and 54:
$users = DB::table('table')
-> whereJsonContains('skills->skills', [35,54])
->get();
For more details about querying JSON columns, check the official docs:
https://laravel.com/docs/5.8/queries#json-where-clauses
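If you ever need the same "any overlap" check on the application side after fetching, the underlying logic is just a JSON decode plus an array intersection; a minimal plain-PHP sketch, with the values hardcoded from the question for illustration:

```php
<?php
// The candidate skill ids, and a raw JSON column value as fetched from MySQL.
$needles    = [25, 36, 45, 52];
$jsonColumn = '{"values": [25, 24, 15]}';

$decoded = json_decode($jsonColumn, true);

// array_intersect keeps the values present in both arrays; any non-empty
// result means at least one id matches.
$overlap = array_intersect($needles, $decoded["values"] ?? []);
$matches = count($overlap) > 0;

var_dump($matches); // bool(true) here, since 25 appears in both arrays
```

This is equivalent in spirit to what `whereJsonContains` checks inside the database, but done in PHP; doing it in the query is still preferable when the table is large.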
I have a model called List, which has a column called item_ids. item_ids is a JSON column (MySQL), and the column contains an array of UUIDs, each referring to one item.
Now, when someone creates a new list, I need to check whether there is an existing list with the same set of UUIDs, and I want to do this search in the query itself for faster response, using ActiveRecord querying as much as possible.
How do I achieve this?
item_ids = ["11E85378-CFE8-39F8-89DC-7086913CFD4B", "11E85354-304C-0664-9E81-0A281BE2CA42"]
v = List.new(item_ids: item_ids)
v.save!
Now, how do I check whether a list exists whose item ids exactly match those in the query? The following won't work:
list_count = List.where(item_ids: item_ids).count
Edit 1
List.where("JSON_CONTAINS(item_ids, ?) ", item_ids.to_json).count
This statement works, but it counts a list even if only one of the items matches. I'm looking for an exact match on the full set of items.
Edit 2
List.where("JSON_CONTAINS( item_ids, ?) and JSON_LENGTH(item_ids) = ?", item_ids.to_json, item_ids.size).count
Looks like this is working
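That combination works because `JSON_CONTAINS` alone also matches supersets; pinning the length with `JSON_LENGTH` rules those out. The same "exact set, order ignored" logic can be sketched in plain PHP (shown here only to illustrate the two conditions the query combines; the UUIDs are the ones from the question):

```php
<?php
// Stand-ins for the stored JSON column and the queried set of ids.
$stored = ["11E85378-CFE8-39F8-89DC-7086913CFD4B", "11E85354-304C-0664-9E81-0A281BE2CA42"];
$query  = ["11E85354-304C-0664-9E81-0A281BE2CA42", "11E85378-CFE8-39F8-89DC-7086913CFD4B"];

// JSON_CONTAINS(item_ids, ?): every queried id must be present in the stored set.
$containsAll = count(array_diff($query, $stored)) === 0;

// JSON_LENGTH(item_ids) = ?: the stored set must have no extra ids.
$sameLength = count($stored) === count($query);

$exactMatch = $containsAll && $sameLength;
var_dump($exactMatch); // bool(true): same ids, different order
```

Note this assumes neither array contains duplicate UUIDs; with duplicates the length check alone would not be sufficient.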
You can implement a has-many relation between lists and items and then query it like this:
List.includes(:items).where('items.id IN (?)', item_ids)
To implement has_many relation:
http://guides.rubyonrails.org/association_basics.html#the-has-many-through-association
I'm trying to set up the ability to get some numbers from my Sphinx indexes, but I'm not sure how to get the info I want.
I have a MySQL db with articles, a Sphinx index set up for that db, and full-text search, all working. What I want is to get some numbers:
How many times a search text (keyword, or key phrase) appears over all articles for all time (more likely limited to "articles from a time interval from X to Y")
Same as the previous, but for how many times 2 keywords or keyphrases (so "x AND y") appear in the same articles
I was doing something similar to the first one manually, using a .bat file I made:
indexer ind_core -c c:\%SOME_PATH%\development.sphinx.conf --buildstops stats.txt 10000 --buildfreqs
which generated a txt file with all the repeated keywords and how often they appear; at the early development stage this helped me form a list of keywords I'm interested in. Now I'm trying to do the same, but for a finite list of predetermined keywords, integrated into my Rails project so I can build charts in the future.
I tried running some queries like
@testing = Article.search 'Keyword1 AND Keyword2', :ranker => :wordcount
but I'm not sure how it works and how to process the result, as well as if that's what I'm looking for.
Another approach I tried was manual mysql queries such as
SELECT id,title,WEIGHT() AS w FROM ind_core WHERE MATCH('#title keyword1 | keyword2') OPTION ranker=expr('sum(hit_count)');
but I'm not sure how to process the results from here either (or how to actually integrate it into my existing Rails project), and it's limited to 20 rows per query (which I think I can change somewhere in the settings?). But at least, looking at the MySQL results, what I'm interested in is hit_count over all articles (or all articles from a set timeframe).
Any ideas on how to do this?
UPDATE:
The current way I found was to add
@testing = Article.search params[:search], :without => {:is_active => false}, :ranker => :bm25
to the controller, with some conditions (so it doesn't bug out on a nil search). :is_active is my soft-delete flag; I don't want to search deleted entries, so don't mind it. And in the view I simply displayed
<%= @testing.total_entries %>
which, if I understand it correctly, shows me the number of matches Sphinx found (so pretty much what I was looking for).
So, to figure out the number of hits per document, you're pretty much on the right track, it's just a matter of getting it into Ruby/Thinking Sphinx.
To get the raw Sphinx results (if you don't need the ActiveRecord objects):
search = Article.search "foo",
:ranker => "expr('SUM(hit_count)')",
:select => "*, weight()",
:middleware => ThinkingSphinx::Middlewares::RAW_ONLY
… this will return an array of hashes, and you can use the weight() string key for the hit count, and the sphinx_internal_id string key for the model's primary key (id is Sphinx's own primary key, which isn't so useful).
Or, if you want to use the ActiveRecord objects, Thinking Sphinx has the ability to wrap each search result in a helper object which passes appropriate methods through to the underlying model instances, but lets weight respond with the values from Sphinx:
search = Article.search "foo",
:ranker => "expr('SUM(hit_count)')",
:select => "*, weight()"; ""
search.context[:panes] << ThinkingSphinx::Panes::WeightPane
search.each do |article|
puts article.weight
end
Keep in mind that panes must be added before the search is evaluated, so if you're testing this in a Rails console, you'll want to avoid letting the console inspect the search variable (which I usually do by adding ; "" at the end of the initial search call).
In both of these cases, as you've noted, the search results are paginated - you can use the :page option to determine which page of results you want, and :per_page to determine the number of records returned in each request. There is a standard limit of 1000 results overall, but that can be altered using the max_matches setting.
Now, if you want the number of times the keywords appear across all Sphinx records, then the best way to do that while also taking advantage of Thinking Sphinx's search options, is to get the raw results of an aggregate SUM - similar to the first option above.
search = Article.search "foo",
:ranker => "expr('SUM(hit_count)')",
:select => "SUM(weight()) AS count",
:middleware => ThinkingSphinx::Middlewares::RAW_ONLY
search.first["count"]
I have a design problem with SQL request:
I need to return data looking like:
listChannels:
  - idChannel
    name
    listItems:
      - data
      - data
  - idChannel
    name
    listItems:
      - data
      - data
The solution I have now is to send a first request:
"SELECT * FROM Channel WHERE idUser = ..."
and then, in the loop fetching the result, send for each row another request to fill the nested list:
"SELECT data FROM Item WHERE idChannel = ..."
This is going to kill the app, and it's obviously not the way to go.
I know how to use the JOIN keyword, but that's not exactly what I want, as it would return one row per data item, repeating all of the channel's information on each row.
How do I solve this common problem in a clean and efficient way?
The "SQL" way of doing this produces a table with columns idchannel, channelname, and the columns from item.
select c.idchannel, c.channelname, i.data
from channel c join
     item i
     on c.idchannel = i.idchannel
order by c.idchannel, i.data;
Remember that a SQL query returns a result set in the form of a table. That means that all the rows have the same columns. If you want a list of columns, then you can do an aggregation and put the items in a list:
select c.idchannel, c.channelname, group_concat(i.data) as items
from channel c join
     item i
     on c.idchannel = i.idchannel
group by c.idchannel, c.channelname;
The above uses MySQL syntax, but most databases support similar functionality.
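On the application side, the usual alternative is a single JOIN query followed by grouping the flat rows into the nested shape in code. A minimal PHP sketch; the `$rows` array is hardcoded here to stand in for the rows fetched from the first JOIN query above, with illustrative values:

```php
<?php
// Stand-in for the flat rows returned by the JOIN query.
$rows = [
    ["idchannel" => 1, "channelname" => "news",  "data" => "a"],
    ["idchannel" => 1, "channelname" => "news",  "data" => "b"],
    ["idchannel" => 2, "channelname" => "music", "data" => "c"],
];

$channels = [];
foreach ($rows as $row) {
    $id = $row["idchannel"];
    // First time we see this channel: create its entry with an empty item list.
    if (!isset($channels[$id])) {
        $channels[$id] = ["name" => $row["channelname"], "listItems" => []];
    }
    // Every row contributes one item to its channel's nested list.
    $channels[$id]["listItems"][] = $row["data"];
}

print_r($channels);
```

This keeps it to one round trip to the database (unlike the query-per-channel loop) while still producing the nested listChannels/listItems structure the question asks for.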
SQL is designed for accessing two-dimensional data tables. (There are more possibilities, but they are very complex and may not be standardized.)
So the best way to solve your problem is to use multiple requests. Please also consider using transactions, if possible.
I have not been able to find any information on this. I could do it on its own, but I feel keeping it in the query might be the best option, if that's possible.
Basically I want to add a top-level "statistics" portion to a query, so when I get the results I will see something like:
num_rows = 900
distinct_col = 9
results = array()
This way I can loop over the results normally, and then pull out the information that I only need once outside of the loop. Is this possible?
EDIT:
I am not looking for the normal MySQL statistics like num_rows, exactly. Rather, in a case where, say, you limit the results to ten, num_rows would return 10, but you want the total number of results, so 900. In most cases I would just use another query to look up the count alone; however, combining it all into one query seems logically faster to me. There is also more than just num_rows that I may need: say the rows are all products and each has a specific category; I would need to count the number of categories the items fall under. Looping over the raw results when there is only one value for those columns is silliness.
EDIT 2:
To clarify further: I need to get counts on some columns, and maybe a min/max result on a join. Having them returned on every row would work, but the same exact value uselessly repeated on every row, when it's only needed once, does not seem logical. I am no MySQL expert and am mainly just trying to make sure I come up with the most logical and fastest method to get the required data.
Here is a PHP return example:
array(
    [num_rows] => 900,
    [categories] => 9,
    [min_price] => 400,
    [max_price] => 900,
    [results] => array(
        [0] => // row array
        [1] => // row array
    )
);
MySQL returns its row count before you "fetch" the results, so having custom results added there may be sufficient.
I don't know why you need that, but it's very easy to get.
Assuming you are using safeMysql (though you can use whatever method to get the data into an array):
$results = $db->getAll("SELECT * FROM t");
$num_rows = count($results);
$num_cols = count($results[0]);
That's all.
I am mainly just trying to make sure I come up with the most logical and fastest method to get the required data.
Yes, you are.
Nothing wrong with getting aggregated data with every loop.
As for the count beyond LIMITs: when you need it, you can use MySQL's SQL_CALC_FOUND_ROWS / FOUND_ROWS() feature.
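Continuing the fetch-into-an-array approach, the extra statistics from the question can all be computed in one pass over the fetched rows; a minimal sketch, with a hardcoded `$results` array standing in for `$db->getAll(...)` and illustrative column names:

```php
<?php
// Stand-in for $results = $db->getAll("SELECT * FROM products");
$results = [
    ["category" => "a", "price" => 400],
    ["category" => "b", "price" => 900],
    ["category" => "a", "price" => 650],
];

$prices = array_column($results, "price");

// The desired top-level statistics, followed by the untouched row data.
$stats = [
    "num_rows"   => count($results),
    "categories" => count(array_unique(array_column($results, "category"))),
    "min_price"  => min($prices),
    "max_price"  => max($prices),
    "results"    => $results,
];

print_r($stats);
```

This matches the array shape sketched in the question, computed client-side; when the row set is truncated by a LIMIT, the total count still has to come from the database (e.g. a separate COUNT(*) query or FOUND_ROWS()).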