Rails dynamically add condition to mysql query - mysql

How can I add conditions to a SQL query dynamically?
For example, if I have one element, it looks like this:
query = ['one_element']
User.where('name LIKE ?', "%#{query[0]}%")
but if there is more than one:
User.where('name LIKE ? AND name LIKE ? AND ...', "%#{query[0]}%", "%#{query[1]}%", ...)
I'm using MySQL.
My main goal is to split the search query if it contains more than two words and search by each word separately in one SQL query:
not where(name: 'john dou') but, in effect, where(name: 'john') AND where(name: 'dou').

If you have a version of MySQL that supports RLIKE or REGEXP_LIKE:
User.where("RLIKE(name, :name_query)", name_query: query.join("|"))
Otherwise, you'll have to build the query manually with ActiveRecord's or method:
# In the User model
scope :name_like, ->(name_part) {
  return self.all if name_part.blank?
  where("name LIKE :name_query", name_query: "%#{name_part}%")
}

# Combine one name_like scope per search term with OR
# (note: shift mutates the names array that is passed in)
scope :names_like, ->(names) {
  relation = User.name_like(names.shift)
  names.each { |name| relation = relation.or(name_like(name)) }
  relation
}
Then you can pass it an array of any name partials you want:
query = ["john", "dou"]
User.names_like(query)
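Since the original goal was to split a multi-word search string, the raw query from the question can be split and fed straight in; a quick sketch (the commented SQL is approximate):
# "john dou" is the raw search string from the question
User.names_like("john dou".split)
# roughly: SELECT `users`.* FROM `users` WHERE (name LIKE '%john%' OR name LIKE '%dou%')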

First split the string by its separator, e.g. query.split(' '); this will give you an array of words. Then you can use it like below in Rails:
User.where(name: ['John', 'dou'])
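For reference, passing an array to where produces an IN clause, i.e. exact matches on either word rather than LIKE matches; a quick sketch of the resulting SQL:
query = 'john dou'.split(' ')   # => ["john", "dou"]
User.where(name: query)
# roughly: SELECT `users`.* FROM `users` WHERE `users`.`name` IN ('john', 'dou')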

Related

Laravel Filter JSON Data with SQL LIKE Operator

Code
// text to search
$details = "Successfully";

ActivityLog::with('getCauserDetails')
    ->when($details ?? false, function ($q) use ($details) {
        $q->whereJsonContains('properties->activity', $details);
    })
    ->get()
    ->toArray();
Table Structure
id - int
name - varchar
properties - json
user_id - int
JSON Data
{
  "ip": "192.168.0.1",
  "platform": "Windows",
  "activity": "Successfully logout"
}
{
  "ip": "192.168.0.1",
  "device": "WebKit",
  "browser": "Chrome",
  "platform": "Windows",
  "activity": "Successfully logged in"
}
Question: The code above successfully searches for the exact value inside the JSON data, but I need to match on part of the sentence. For example, for the value "Successfully logout", if I search with just "Successfully", the data is not filtered. Does anyone know how to filter with the SQL LIKE operator instead of whereJsonContains, so that filtering with "Successfully" also returns the data without the full sentence?
I didn't find an appropriate Laravel method, but your query in SQL would look like this. It seems you have no choice but to use a raw query:
SELECT *
FROM table_name
WHERE JSON_EXTRACT(properties, "$.activity") LIKE '%Successfully logout%'
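If it helps, the raw condition can be dropped into the existing builder chain with whereRaw; a sketch, keeping the $details variable from the question and using a bound parameter for the search term:
ActivityLog::with('getCauserDetails')
    ->when($details ?? false, function ($q) use ($details) {
        // partial match against the extracted JSON value instead of the exact whereJsonContains match
        $q->whereRaw('JSON_EXTRACT(properties, "$.activity") LIKE ?', ["%{$details}%"]);
    })
    ->get()
    ->toArray();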

Returning MySQL data as an OBJECT rather than an ARRAY (Knex)

Is there a way to get the output of a MySQL query to list rows in the following structure
{
1:{voo:bar,doo:dar},
2:{voo:mar,doo:har}
}
as opposed to
[
{id:1,voo:bar,doo:dar},
{id:2,voo:mar,doo:har}
]
which I then have to loop through to create the desired object?
I should add that within each row I am also concatenating results to form an object, and from what I've experimented with, you can't nest a GROUP_CONCAT inside another GROUP_CONCAT. My current query is as follows:
knex('table')
  .select(
    'table.id',
    'table.name',
    knex.raw(
      `CONCAT("{", GROUP_CONCAT(DISTINCT
        '"', table.voo, '"', ':', '"', table.doo, '"'),
      "}") AS object`
    )
  )
  .groupBy('table.id')
Could GROUP BY be leveraged in any way to achieve this? Generally I'm inexperienced at SQL and don't know what's possible and what's not.
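For what it's worth, the "loop through to create the desired object" step mentioned above usually collapses into a single reduce over the returned rows; a sketch, assuming the query above resolves to the usual array of row objects:
knex('table')
  .select(/* same columns and raw CONCAT/GROUP_CONCAT as above */)
  .groupBy('table.id')
  .then((rows) => {
    // key the rows by id instead of leaving them in an array
    const byId = rows.reduce((acc, { id, ...rest }) => {
      acc[id] = rest;
      return acc;
    }, {});
    return byId;
  });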

Logstash - Substring from CSV column

I want to import a lot of information from a CSV file into Elasticsearch.
My issue is that I don't know how to use an equivalent of substring to select part of a CSV column.
In my case I have a date field (YYYYMMDD) and I want to get (YYYY-MM-DD).
I use the mutate filter's gsub like this:
filter
{
  mutate
  {
    gsub => ["date", "[0123456789][0123456789][0123456789][0123456789][0123456789][0123456789][0123456789][0123456789]", "[0123456789][0123456789][0123456789][0123456789]-[0123456789][0123456789]-[0123456789][0123456789]"]
  }
}
But my result is wrong.
I can identify my string, but I don't know how to extract part of it.
My target is to have something like:
gsub => ["date", "[0123456789][0123456789][0123456789][0123456789][0123456789][0123456789][0123456789][0123456789]", "%{date}(0..3)-%{date}(4..5)-%{date}(6..7)"]
where %{date}(0..3) would select the first four characters of the CSV date column.
You can use the ruby plugin to do the conversion. As you say, you will have a date field, so we can use it directly in Ruby:
filter {
  ruby {
    code => "
      date = Time.strptime(event['date'], '%Y%m%d')
      event['date_new'] = date.strftime('%Y-%m-%d')
    "
  }
}
The date_new field will then be in the format you want.
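Note that Logstash 5.x and later removed the hash-style event access used above; assuming the field is still called date, the same filter with the current event API would be:
filter {
  ruby {
    code => "
      date = Time.strptime(event.get('date'), '%Y%m%d')
      event.set('date_new', date.strftime('%Y-%m-%d'))
    "
  }
}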
First, you can use a regexp range to match a sequence, so rather than [0123456789], you can do [0-9]. If you know there will be 4 numbers, you can do [0-9]{4}.
Second, you want to "capture" parts of your input string and reorder them in the output. For that, you need capture groups:
([0-9]{4})([0-9]{2})([0-9]{2})
where parens define the groups. Then you can reference those on the right side of your gsub:
\1-\2-\3
\1 is the first capture group, etc.
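Put together, the question's mutate filter might then look like this (a sketch, keeping the date field name from the question):
filter {
  mutate {
    # capture year, month and day, then reassemble them with dashes
    gsub => ["date", "([0-9]{4})([0-9]{2})([0-9]{2})", "\1-\2-\3"]
  }
}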
You might also consider getting these three fields when you do the grok{}, and then putting them together again later (perhaps with add_field).

MySQL Query and POSIXct in Loop

I have the following MySQL query using RMySQL. All the database parameters are set beforehand and the query works well. Is there a way to put it in a loop to get multiple zoo objects from more than one dpname? Thanks!
dpname.df<-"%Name%"
paste.query.df<-paste("select handle from db.connect where dpname like '",dpname.df,"'",sep='')
handle.df<-dbGetQuery(dbLT,paste.query.df)
paste.query2.df<-paste("select time,value from db.data where handle='",handle.df,"' and time between '",x,"' and '",y,"'",sep='')
df <- dbGetQuery(dbLT,paste.query2.df)
df$time<-as.POSIXct(df$time,format="%Y-%m-%d %H:%M:%S")
df.zoo<-zoo(df[,-1],df[,1])
I tried to set up a function to use with mapply:
query <- function(x, y, dpname.df)
{
  paste.query.df <- paste("select handle from db.connect where dpname like '", dpname.df, "'", sep = '')
  handle.df <- dbGetQuery(dbLT, paste.query.df)
  paste.query2.df <- paste("select time,value from db.data where handle='", handle.df, "' and time between '", x, "' and '", y, "'", sep = '')
  dbGetQuery(dbLT, paste.query2.df)
}
which I can run with mapply(query, x, y, dpname.df).
But I can't get a separate output for each query. Is it possible to supply another list with the output names? Then I could also put the zoo and POSIXct steps in my function. Thanks!
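Not an official answer, but one way to get a named result per dpname is to stop mapply from simplifying and to name the resulting list; a sketch, assuming query() returns a data frame as above and dpname.df is a vector of patterns:
library(zoo)

# one call of query() per dpname; SIMPLIFY = FALSE keeps a list with one
# element per call instead of collapsing the results
results <- mapply(query, x, y, dpname.df, SIMPLIFY = FALSE)
names(results) <- dpname.df

# apply the POSIXct/zoo conversion from the original code to each result
zoo.list <- lapply(results, function(df) {
  df$time <- as.POSIXct(df$time, format = "%Y-%m-%d %H:%M:%S")
  zoo(df[, -1], df[, 1])
})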

DBIx::Class Temporary column

I am using DBIx::Class and I have a query like this:
$groups = $c->model('DB::Project')->search(
    { "sessions.user_id" => $c->user->id, done_yn => 'y' },
    {
        select => [ "name", "id", \'SUM(UNIX_TIMESTAMP(end_time)-UNIX_TIMESTAMP(start_time)) as total_time' ],
        join   => 'sessions',
    }
);
I'd like to be able to get the value of SUM(UNIX_TIMESTAMP(end_time)-UNIX_TIMESTAMP(start_time)), but because this is not a real column in the table, referencing total_time for a DBIx::Class::Row object doesn't seem to work. Does anyone know how I can get these temporary columns? Thanks!
The select docs describe perfectly how to achieve what you're trying to accomplish.
It's also recommended to avoid literal SQL when possible; here you can use { sum => \'UNIX_TIMESTAMP(end_time)-UNIX_TIMESTAMP(start_time)' } instead.
The 'as' in the literal SQL isn't enough to give the column a name on the DBIx::Class side; you have to use either the as search attribute or, better, the columns shortcut instead of select + as.
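A sketch of how that could look, using the columns shortcut to name the virtual column and get_column to read it back (names follow the original query; adjust to your schema):
my $groups = $c->model('DB::Project')->search(
    { 'sessions.user_id' => $c->user->id, done_yn => 'y' },
    {
        join    => 'sessions',
        # the columns shortcut replaces select + as; the hash key is the
        # DBIx::Class-side name, -as sets the SQL-side alias
        columns => [
            'name',
            'id',
            { total_time => { sum => \'UNIX_TIMESTAMP(end_time) - UNIX_TIMESTAMP(start_time)', -as => 'total_time' } },
        ],
    }
);

while ( my $row = $groups->next ) {
    # virtual columns don't get accessor methods, so read them with get_column
    my $total_time = $row->get_column('total_time');
}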