How to put % in a string - MySQL

I have tried using '%%' and '%', but neither works in the scenario below.
The output of query is:
UPPER('field') LIKE UPPER('string')
But I want the output as:
UPPER('field') LIKE UPPER('%string%')

You're trying to combine a Lookup and a Transform in the same query, but you can only do one at a time, because a Lookup automatically adds parentheses around lhs_params and rhs_params.
When you tried return 'UPPER(%s) LIKE UPPER(%%%s%%)' % (lhs, rhs), params, that is what caused the unsupported-character error.
You need to construct your bilateral Transform to convert both sides to uppercase separately from your lookup function.
What you really want is the Transform from the docs combined with __contains, which replaces your LIKE Lookup:
from django.db.models import Transform

class UpperCase(Transform):
    lookup_name = 'upper'
    function = 'UPPER'
    bilateral = True
After registering the transform with models.CharField.register_lookup(UpperCase), your query would be:
YourObject.objects.filter(yourvalue__upper__contains="something")

You can override the process_rhs method and modify the value there:
from django.db.models import Lookup, Value

class MyCustomLookup(Lookup):
    lookup_name = 'custom_lookuptest'

    def as_sql(self, compiler, connection):
        lhs, lhs_params = self.process_lhs(compiler, connection)
        rhs, rhs_params = self.process_rhs(compiler, connection)
        params = lhs_params + rhs_params
        return 'UPPER(%s) LIKE UPPER(%s)' % (lhs, rhs), params

    def process_rhs(self, compiler, connection):
        value = '%%%s%%' % self.rhs
        if self.bilateral_transforms:
            if self.rhs_is_direct_value():
                value = Value(value, output_field=self.lhs.output_field)
            value = self.apply_bilateral_transforms(value)
            value = value.resolve_expression(compiler.query)
        if hasattr(value, 'get_compiler'):
            value = value.get_compiler(connection=connection)
        if hasattr(value, 'as_sql'):
            sql, params = compiler.compile(value)
            return '(' + sql + ')', params
        if hasattr(value, '_as_sql'):
            sql, params = value._as_sql(connection=connection)
            return '(' + sql + ')', params
        else:
            return self.get_db_prep_lookup(value, connection)
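The key point, independent of Django, is that the % wildcards belong in the bound parameter value, not in the SQL text. A minimal sketch of this with Python's stdlib sqlite3 (the table and column names are made up for illustration):

```python
import sqlite3

# In-memory database with a throwaway table (names are illustrative only).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE items (name TEXT)")
con.executemany("INSERT INTO items VALUES (?)", [("FooString",), ("Other",)])

needle = "string"
# The % wildcards are embedded in the *parameter*, so the SQL text stays a
# plain placeholder and no escaping of % in the query string is needed.
rows = con.execute(
    "SELECT name FROM items WHERE UPPER(name) LIKE UPPER(?)",
    ("%" + needle + "%",),
).fetchall()
print(rows)  # [('FooString',)]
```

This is exactly what the process_rhs override above achieves: it wraps self.rhs in wildcards before the value is bound as a parameter.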

C# Dapper MS Access query LIKE parameters

I have a list of parameters, numericPartList: for example 029%, 035%.
As noted in the code, it works with a single parameter, but not with more.
public static List<tLicencie> GetDepartement(List<string> numericPartList)
{
    ConnectionStringSettings connex = ConfigurationManager.ConnectionStrings["MaConnection"];
    string connString = connex.ProviderName + connex.ConnectionString;
    using (OleDbConnection con = new OleDbConnection(connString))
    {
        string sql;
        if (numericPartList.Count < 2)
        {
            sql = "select * from tLicencie where NOCLUB like #numericPartList "; // Works!
        }
        else
        {
            sql = "select * from tLicencie where NOCLUB like #numericPartList "; // Does not work
        }
        return (List<tLicencie>)con.Query<tLicencie>(sql, new { numericPartList });
    }
}
I get an error message :
Syntax error (comma) in expression 'NOCLUB like (# numericPartList1, # numericPartList2)'.
How to solve this problem?
A solution for the moment: I added DapperExtensions, then
var pga = new PredicateGroup { Operator = GroupOperator.Or, Predicates = new List<IPredicate>() };
pga.Predicates.Add(Predicates.Field<tLicencie>(f => f.NOCLUB, Operator.Like, "029%"));
pga.Predicates.Add(Predicates.Field<tLicencie>(f => f.NOCLUB, Operator.Like, "035%"));
IEnumerable<tLicencie> list = con.GetList<tLicencie>(pga);
The dapper list expansion is only intended for use with in, i.e. it would expand:
where foo in #x
to
where foo in (#x0, #x1, #x2)
(or some other variants, depending on the scenario).
It cannot be used with like in this way, as it would give invalid SQL; you need to compose your SQL and parameters in a more manual fashion, perhaps using DynamicParameters (which works more like a dictionary), i.e. loop in such a way as to construct
where (foo like #x0 or foo like #x1 or foo like #x2)
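The manual composition described above can be sketched with Python's stdlib sqlite3 (the table and column names mirror the question but the data is invented): each pattern gets its own placeholder, and the clauses are OR'd together.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE tLicencie (NOCLUB TEXT)")
con.executemany("INSERT INTO tLicencie VALUES (?)",
                [("029001",), ("035002",), ("101003",)])

patterns = ["029%", "035%"]
# Build "(NOCLUB LIKE ? OR NOCLUB LIKE ?)" with one placeholder per pattern,
# then bind the pattern list as the parameters.
clause = " OR ".join("NOCLUB LIKE ?" for _ in patterns)
sql = "SELECT NOCLUB FROM tLicencie WHERE (%s)" % clause
rows = con.execute(sql, patterns).fetchall()
print(rows)  # [('029001',), ('035002',)]
```

The same loop-and-join shape carries over to Dapper with DynamicParameters: one named parameter per pattern, joined with or.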

How to use filter_by with not-equals in SQLAlchemy?

I have a function, defined as below, to query a database table:
def query_from_DB(obj, **filter):
    DBSession = sessionmaker(bind=engine)
    session = DBSession()
    res = session.query(obj).filter_by(**filter)
    session.close()
    return [x for x in res]
I query the table as below:
query_from_DB(Router, sp_id="sp-10.1.10.149", connectivity="NO")
The above returns the response from the DB correctly, but when I make a query using
query_from_DB(Router, sp_id!="sp-10.1.10.149", connectivity="NO")
I get an error:
SyntaxError: non-keyword arg after keyword arg
What could be the possible changes I can make to get the result?
I don't believe you can use != on a keyword argument.
You can either use connectivity="YES", or use SQLAlchemy's filter function, which accepts == and != expressions; you then won't be able to use keyword args, and instead pass a SQL expression like so:
res = session.query(obj).filter(obj.connectivity != "NO")
This question may be helpful...
flask sqlalchemy querying a column with not equals
Did you simply try res = session.query(obj).filter(obj.connectivity != "NO")? (filter_by only accepts keyword equality tests, so a not-equals comparison has to go through filter.)
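For reference, the SQL that such a != filter roughly boils down to can be checked directly; a stdlib sqlite3 sketch with an invented router table standing in for the Router model:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE router (sp_id TEXT, connectivity TEXT)")
con.executemany("INSERT INTO router VALUES (?, ?)", [
    ("sp-10.1.10.149", "NO"),
    ("sp-10.1.10.150", "NO"),
    ("sp-10.1.10.151", "YES"),
])

# filter(Router.sp_id != "sp-10.1.10.149", Router.connectivity == "NO")
# renders to a WHERE clause of this shape:
rows = con.execute(
    "SELECT sp_id FROM router WHERE sp_id != ? AND connectivity = ?",
    ("sp-10.1.10.149", "NO"),
).fetchall()
print(rows)  # [('sp-10.1.10.150',)]
```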

Custom ARQ function not working with fuseki endpoint

I successfully implemented SPARQL queries with a custom ARQ function, using the following custom function code:
public class LevenshteinFilter extends FunctionBase2 {
    public LevenshteinFilter() { super(); }

    public NodeValue exec(NodeValue value1, NodeValue value2) {
        LevenshteinDistance LD = new LevenshteinDistance();
        int i = LD.apply(value1.asString(), value2.asString());
        return NodeValue.makeInteger(i);
    }
}
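As an aside, the edit distance this function delegates to (commons-text's LevenshteinDistance) is the classic dynamic-programming algorithm; a pure-Python sketch of what the filter compares:

```python
def levenshtein(a: str, b: str) -> int:
    """Edit distance: minimum number of insertions, deletions and
    substitutions needed to turn a into b."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        cur = [i]
        for j, cb in enumerate(b, start=1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution
        prev = cur
    return prev[-1]

print(levenshtein("kitten", "sitting"))  # 3
```

So the FILTER in the query keeps only labels within a distance of 4 from the search string.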
It works fine when I query against a Model loaded from a Turtle file, like this:
InputStream input = QueryProcessor.class.getClassLoader().getResourceAsStream("full.ttl");
model = ModelFactory.createMemModelMaker().createModel("default");
model.read(input,null,"TURTLE"); // null base URI, since model URIs are absolute
input.close();
with the query being sent like this :
String functionUri = "http://www.example1.org/LevenshteinFunction";
FunctionRegistry.get().put(functionUri , LevenshteinFilter.class);
String s = "whatever you want";
String sparql = prefixes+" SELECT DISTINCT ?l WHERE { ?x rdfs:label ?l . " + "FILTER(fct:LevenshteinFunction(?l, \"" + s + "\") < 4) }";
Query query = QueryFactory.create(sparql);
QueryExecution qexec = QueryExecutionFactory.create(query, model);
ResultSet rs = qexec.execSelect();
However, if I use a working Fuseki endpoint for the same dataset (full.ttl), like this:
fusekiUrl = "http://localhost:3030/ds/query";
sending the query with QueryExecutionFactory.sparqlService(fusekiUrl, query) instead of QueryExecutionFactory.create(query, model):
String functionUri = "http://www.example1.org/LevenshteinFunction";
FunctionRegistry.get().put(functionUri , LevenshteinFilter.class);
String s = "whatever you want";
String sparql = prefixes+" SELECT DISTINCT ?l WHERE { ?x rdfs:label ?l . " + "FILTER(fct:LevenshteinFunction(?l, \"" + s + "\") < 4) }";
Query query = QueryFactory.create(sparql);
QueryExecution qexec = QueryExecutionFactory.sparqlService(fusekiUrl,query);
ResultSet rs = qexec.execSelect();
Then I don't get any results back. In both cases I printed out the FunctionRegistry and they contain exactly the same entries, in particular:
key=http://www.example1.org/LevenshteinFunction
value: org.apache.jena.sparql.function.FunctionFactoryAuto@5a45133e
Any clue?
Thanks
Coming back to this question, which I finally solved. There were several issues, one being the fact that (obviously!) the remote endpoint and the client run on different JVMs.
To get things working, do the following (for a trivial MyFilter custom function, i.e. strlen):
1) Deploy the jar with the custom function classes on the Fuseki server
2) Modify the Fuseki config:
add [] ja:loadClass "my.functions.package.MyFilter"
where MyFilter implementation is :
import org.apache.jena.sparql.expr.NodeValue;
import org.apache.jena.sparql.function.FunctionBase1;

public class MyFilter extends FunctionBase1 {
    public MyFilter() { super(); }

    public NodeValue exec(NodeValue value1) {
        int d = value1.asString().length();
        return NodeValue.makeInteger(d);
    }
}
3) Add the following prefix to the context:
PREFIX f: <java:my.functions.package.>
Note that "my.functions.package." is the package of the MyFilter class, not the class itself. This means that you never call a class method in SPARQL queries, only a class that implements org.apache.jena.sparql.function.FunctionBaseX, where X is the number of arguments of your filter function.
4) Write (for instance) the query like this:
SELECT DISTINCT ?l
WHERE { ?x skos:prefLabel ?l .
FILTER (f:MyFilter(?l) < 20)
}
EDIT: step 2) is not necessary

Spark UDF returns a length of field instead of length of value

Consider the code below
object SparkUDFApp {
  def main(args: Array[String]) {
    val df = ctx.read.json(".../example.json")
    df.registerTempTable("example")
    val fn = (_: String).length // % 10
    ctx.udf.register("len10", fn)
    val res0 = ctx sql "SELECT len10('id') FROM example LIMIT 1" map { _ getInt 0 } collect
    println(res0.head)
  }
}
JSON example
{"id":529799371026485248,"text":"Example"}
The code should return the length of the field's value from the JSON (e.g. the 'id' value is 18 characters long). But instead of returning 18 it returns 2, which I suppose is the length of the string 'id'.
So my question is: how do I rewrite the UDF to fix it?
The problem is that you are passing the string 'id' as a literal to your UDF, so it is interpreted as a string instead of a column (notice that it has 2 letters; this is why it returns that number). To solve this, just change the way you formulate the SQL query.
E.g.
val res0 = ctx sql "SELECT len10(id) FROM example LIMIT 1" map {_ getInt 0} collect
// Or alternatively
val len10 = udf((word: String) => word.length)
df.select(len10(df("id")).as("length")).show()
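The literal-vs-column distinction is plain SQL behaviour, not Spark-specific; it can be reproduced with Python's stdlib sqlite3:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE example (id TEXT)")
con.execute("INSERT INTO example VALUES ('529799371026485248')")

# 'id' in single quotes is a 2-character string literal...
lit = con.execute("SELECT length('id') FROM example").fetchone()[0]
# ...while bare id refers to the column, whose value has 18 digits.
col = con.execute("SELECT length(id) FROM example").fetchone()[0]
print(lit, col)  # 2 18
```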

Grails JSON converter only returns strings, not numbers (integers or doubles)

I have a database table that stores the structure of a simple table containing user data. The user can define how the table is set up, so I store values together with their types; everything is a string in the stored table. I need to pass this as JSON to the browser client, which runs some JavaScript on the JSON. I want numeric values and string values in the JSON object, but the Grails JSON converter only spits out strings.
My Service has:
def testMap() {
    // my results from a query
    List query = [["name": "price", "value": "4.23", "type": "double"],
                  ["name": "title", "value": "box", "type": "string"]]
    Map results = [:]
    query.each { row ->
        if (row.type == "double") {
            results << [(row.name): row.value]
        } else {
            // what do I do here?
            results << [(row.name): row.value]
        }
    }
    return results
}
My Controller has...
def showMap() {
    render touchSourceSystemService.testMap() as JSON
}
The results are...
{"price":"4.23","title":"box"}
But I need the price to be numeric, not a string, like this.
{"price":4.23,"title":"box"}
I had this same issue. The only way I could get it to work was like this...
Add this to top of your Service Class:
import groovy.json.*
And in your Service method, do this:
def testMap() {
    // my results from a query
    List query = [[name: "price", value: "4.23", type: "double"],
                  [name: "title", value: "box", type: "string"]]
    def slrp = new JsonSlurper()
    def results = [:]
    query.each { row ->
        if (row.type == "double") {
            // results << ["'${row.name}'": "${row.value}"]
            results << slrp.parseText('{"' + row.name + '":' + row.value + '}')
        } else {
            results << slrp.parseText('{"' + row.name + '":"' + row.value + '"}')
            // results << ["'${row.name}'": "'${row.value}'"]
        }
    }
    return results
}
Your controller should now return this:
{"price":4.23,"title":"box"}
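The underlying fix, whichever Groovy mechanism you use, is to coerce each value to its declared type before serialising. A stdlib Python sketch of the same idea, using the question's rows:

```python
import json

# Query rows as in the question: every value arrives as a string plus a type tag.
query = [
    {"name": "price", "value": "4.23", "type": "double"},
    {"name": "title", "value": "box", "type": "string"},
]

# Map each declared type to a real conversion before serialising.
coerce = {"double": float, "integer": int, "string": str}
results = {row["name"]: coerce[row["type"]](row["value"]) for row in query}

print(json.dumps(results))  # {"price": 4.23, "title": "box"}
```

Once the values are real numbers rather than strings, any JSON serialiser emits them unquoted.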