Feathers.js mongoose querying

I'm very new to Feathers.js. How would you accomplish querying for a nested object like this?
{
...,
obj: {
foo: 1,
bar: 1
},
...
}
The following doesn't seem to work:
/some-doc?obj['foo']['$eq']=1
Also, how would you tackle a query like checking the size of an array?
/some-doc?someArray['length']['$gt']=0
I've been trying to send a param like
checkArray=true
and process it in a before:find hook, but no luck. Is this the right approach?
Thank you,

In general, most queries supported by Mongoose and MongoDB will work for your Feathers service. MongoDB queries nested fields using dot notation, so it would be:
/some-doc?obj.foo=1
Exact length matches can be done with the $size operator, but $size does not support range comparisons like $gt. To check whether an array has at least a certain length, you can instead use dot notation to test whether the entry at that index exists (see this answer):
/some-doc?someArray.0[$exists]=true
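To make the dot-notation mapping concrete, here is a small sketch of a hypothetical helper (not part of Feathers or Mongoose; the name is made up) that flattens a nested query object into the flat, dot-notation keys MongoDB expects, leaving operator objects such as { $exists: true } intact:

```javascript
// Hypothetical helper, not part of Feathers or Mongoose: flattens a
// nested query object into MongoDB's dot-notation form. Objects whose
// keys start with "$" are treated as operators and kept as-is.
function toDotNotation(query, prefix = '', out = {}) {
  for (const [key, value] of Object.entries(query)) {
    const path = prefix ? `${prefix}.${key}` : key;
    const isOperator = value !== null && typeof value === 'object' &&
      Object.keys(value).some(k => k.startsWith('$'));
    if (value !== null && typeof value === 'object' &&
        !Array.isArray(value) && !isOperator) {
      toDotNotation(value, path, out); // recurse into plain nested objects
    } else {
      out[path] = value; // leaf value or operator object
    }
  }
  return out;
}
```

With this, toDotNotation({ obj: { foo: 1 } }) yields { 'obj.foo': 1 }, which is the shape the query string /some-doc?obj.foo=1 decodes to.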

Related

How can I select all possible JSON data with arrow syntax/JSON_EXTRACT in SQL?

I need to be able to access all the available JSON data, the problem is that a lot of it is nested.
I currently have this query.
SELECT * FROM `system_log` WHERE entry->"$[0]" LIKE "%search_term%";
Instead of entry->"$[0]", I need something like entry->"$*".
I think the arrow syntax is shorthand for JSON_EXTRACT, which I think means a solution for JSON_EXTRACT would also work for the arrow syntax.
{
" Name": {
"after": "Shop",
"before": "Supermarket"
}
}
This is an example of my JSON data and as you can see there are multiple levels to it meaning that entry->"$[0]" won't catch it.
I'm on MySQL version 8.0.19.
What I've tried so far is entry->"$[0]" and then appending another [0], but this solution does not seem very dynamic as the JSON data could get deeper and deeper.
JSON_SEARCH() won't work for the search you describe, because JSON_SEARCH() only searches for full string matches, not wildcards.
If you truly cannot predict the structure of your JSON, and you just want to find if the pattern '%search_term%' appears anywhere, then just treat the whole JSON document as a string, and use LIKE:
SELECT * FROM `system_log` WHERE entry LIKE "%search_term%";
If you have more specific search requirements, then you'll have to come up with a way to predict the path to the value you're searching for in your JSON document. That's something I cannot help you with, because I don't know your usage of JSON.
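If you do need structured matching rather than a whole-document LIKE, one option is to do it application-side: parse the JSON and walk it recursively. A minimal illustrative sketch in JavaScript (this runs in your application, not in MySQL, and the function name is made up):

```javascript
// Illustrative, application-side sketch (not a MySQL feature): collect
// the paths of all string values anywhere in a parsed JSON document
// that contain a search term, regardless of nesting depth.
function findPaths(value, term, path = '$', hits = []) {
  if (typeof value === 'string') {
    if (value.includes(term)) hits.push(path); // leaf string match
  } else if (Array.isArray(value)) {
    value.forEach((v, i) => findPaths(v, term, `${path}[${i}]`, hits));
  } else if (value !== null && typeof value === 'object') {
    for (const [k, v] of Object.entries(value)) {
      findPaths(v, term, `${path}."${k}"`, hits);
    }
  }
  return hits;
}
```

For the sample document above, findPaths(doc, 'Shop') returns the path of the "after" value, however deeply the data is nested.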

How to return an array of JSON objects rather than a collection of rows using jOOQ & PostgreSQL

Having read this post suggesting that it's sometimes a good trade-off to use JSON operators to return JSON directly from the database, I'm exploring this idea using PostgreSQL and jOOQ.
I'm able to write SQL queries returning a JSON array of JSON objects rather than rows using the following pattern:
select jsonb_pretty(array_to_json(array_agg(row_to_json(r)))::jsonb)
from (
select [subquery]
) as r;
However, I have failed to translate this pattern to jOOQ.
Any help on how to translate a collection of rows (with fields either of "usual" SQL types or already mapped as json(b)) using jOOQ would be appreciated.
SQL Server FOR JSON semantics
That's precisely what the SQL Server FOR JSON clause does, which jOOQ supports and which it can emulate for you on other dialects as well:
ctx.select(T.A, T.B)
.from(T)
.forJSON().path()
.fetch();
PostgreSQL native functionality
If you prefer using native functions directly, you will have to resort to plain SQL templating for now, as some of these functions aren't yet supported by jOOQ, including:
JSONB_PRETTY (no plans of supporting it yet)
ARRAY_TO_JSON (https://github.com/jOOQ/jOOQ/issues/12841)
ROW_TO_JSON (https://github.com/jOOQ/jOOQ/issues/10685)
It seems quite simple to write a utility that does precisely this:
public static ResultQuery<Record1<JSONB>> json(Select<?> subquery) {
    // ctx is a jOOQ DSLContext assumed to be available in scope
    return ctx
        .select(field(
            "jsonb_pretty(array_to_json(array_agg(row_to_json(r)))::jsonb)",
            JSONB.class
        ))
        .from(subquery.asTable("r"));
}
And now, you can execute any query in your desired form:
JSONB result = ctx.fetchValue(json(select(T.A, T.B).from(T)));
Converting between PG arrays and JSON arrays
A note on performance: it seems that you're converting between data types rather often. Specifically, I'd suggest you avoid aggregating into a PostgreSQL array and then turning that into a JSON array; use JSONB_AGG() directly instead. I haven't tested this, but the intermediate array data structure seems unnecessary.

Supporting JSON paths in BigQuery

I was wondering if BigQuery has any additional support for JSON paths, since this seems like such a common way to work with nested data in BigQuery. A few years ago the answer seemed to be: What JsonPath expressions are supported in BigQuery?, i.e., "Use a UDF".
However, a path into an array, such as:
'$..Job'
is such a common operation, given BigQuery's repeated fields, that about 70% of the time I've tried to use BigQuery's JSON_EXTRACT, I've run into the limitation of having to iterate down an array.
Is this ability supported in BigQuery yet, or are there plans to support it, without having to use a UDF? While something like the following works:
CREATE TEMPORARY FUNCTION CUSTOM_JSON_EXTRACT(json STRING, json_path STRING)
RETURNS STRING
LANGUAGE js AS """
try { var parsed = JSON.parse(json);
return JSON.stringify(jsonPath(parsed, json_path));
} catch (e) { return null }
"""
OPTIONS (
library="gs://xx-bq/jsonpath-0.8.0.js"
);
SELECT CUSTOM_JSON_EXTRACT(to_json_string(Occupation), '$..Job'), to_json_string(MovieInfo), json_extract(MovieInfo, '$.Platform') FROM `xx-163219.bqtesting.xx` LIMIT 1000
It ends up taking anywhere between 4-6x longer than a plain JSON_EXTRACT call (2s vs. about 10s). Or is there something I'm missing about what you're able to do with JSON objects in BigQuery?
Currently, BigQuery's support for JSONPath is limited to $, ., and [], where the latter can be either a child operator or a subscript (array) operator.
Other syntax elements from JSONPath are not supported, but for future reference, there's a public feature request to support complete JSONPath syntax.
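For intuition, that supported subset is small enough to sketch in a few lines. Here is a hypothetical JavaScript resolver (made up for illustration, not a BigQuery API) for just $, . child access, ['name'] child access, and [0] subscripts against a parsed object; like the built-in JSON_EXTRACT, it rejects anything fancier, such as the recursive descent in $..Job:

```javascript
// Hypothetical resolver for the limited JSONPath subset described
// above: "$", "." child access, ['name'] child access, and [0] array
// subscripts. Unsupported syntax (e.g. $..Job) yields undefined.
function extract(obj, path) {
  if (!path.startsWith('$')) return undefined;
  // Sticky regex: each token must start exactly where the last ended.
  const re = /\.([A-Za-z_][A-Za-z0-9_]*)|\[\s*'([^']*)'\s*\]|\[\s*"([^"]*)"\s*\]|\[\s*(\d+)\s*\]/y;
  re.lastIndex = 1; // skip the leading "$"
  let pos = 1;
  const tokens = [];
  let m;
  while ((m = re.exec(path)) !== null) {
    tokens.push(m[1] ?? m[2] ?? m[3] ?? Number(m[4]));
    pos = re.lastIndex;
  }
  if (pos !== path.length) return undefined; // unsupported syntax
  // Walk the parsed object one token at a time.
  return tokens.reduce((cur, t) => (cur == null ? undefined : cur[t]), obj);
}
```

Everything beyond this subset is what currently pushes you into the UDF approach shown in the question.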

What is the most elegant way to stream the results of an SQL query out as JSON?

Using the Play framework with Anorm, I would like to write a Controller method that simply returns the results of an SQL query as JSON. I don't want to bother converting to objects. Also, ideally this code should stream out the JSON as the SQL ResultSet is processed rather than processing the entire SQL result before returning anything to the client.
select colA, colB from mytable
JSON Response
[{"colA": "ValueA", "colB": 33}, {"colA": "ValueA2", "colB": 34}, ...]
I would like to express this in Scala code as elegantly and concisely as possible, but the examples I'm finding seem to have a lot of boilerplate (redundant column name definitions). I'm surprised there isn't some kind of SqlResult-to-JsValue conversion in Play or Anorm already.
I realize you may need to define Writes[] or an Enumeratee implementation to achieve this, but once the conversion code is defined, I'd like the code for each method to be nearly as simple as this:
val columns = List("colA", "colB")
db.withConnection { implicit c =>
Ok(Json.toJson(SQL"select #$columns from mytable"))
}
I'm not clear on the best way to define column names just once and pass it to the SQL query as well as JSON conversion code. Maybe if I create some kind of implicit ColumnNames type, the JSON conversion code could access it in the previous example?
Or maybe define my own kind of SqlToJsonAction to achieve even simpler code and have common control over JSON responses?
def getJson = SqlToJsonAction(List("colA", "colB")) { columns =>
SQL"select #$columns from mytable"
}
The only related StackOverflow question I found was: Convert from MySQL query result or LIST to JSON, which didn't have helpful answers.
I'm a Java developer just learning Scala, so I still have a lot to learn about the language, but I've read through the Anorm, Iteratee/Enumeratee, and Writes docs and numerous blogs on Anorm, and am having trouble figuring out how to set up the necessary helper code so that I can compose my JSON methods this way.
Also, I'm unclear on which approaches allow streaming out the response, and which will iterate the entire SQL ResultSet before responding with anything to the client. According to Anorm's Streaming Results documentation, only methods such as fold/foldWhile/withResult and Iteratees stream. Are these the techniques I should use?
Bonus:
In some cases, I'll probably want to map a SQL column name to a different JSON field name. Is there a slick way to do this as well?
Something like this (no idea if this Scala syntax is possible):
def getJson = SqlToJsonAction("colA" -> "jsonColA", "colB", "colC" -> "jsonColC") { columns =>
SQL"select #$columns from mytable"
}

Accessing nested list items in an interpolated string using dot notation in Scala

I'm trying to pass a value via JSON that I am having trouble accessing. We have a data structure (that was obviously not built by me otherwise I would likely understand it) that looks something like this when sent to the browser:
{Foo(Bar(List(Baz(List(G3),w))),G3,None)}
This is sent via a JSON write method, but the originating Scala line looks like:
val hint = Some(s"{$question}") where $question is of type Foo.
I've tried using dot notation to access the list items in ways that I thought would work:
val hint = Some(s"{$question.Bar.Baz})"
val hint = Some(s"{$question.Bar(0).Baz(0)"})
It's the deepest G3 I wanted to strip out and send, but instead the JSON object comes through looking like:
{Foo(Bar(List(Baz(List(G3),w))),G3,None)}.Bar.Baz or
{Foo(Bar(List(Baz(List(G3),w))),G3,None)}.Bar(0).Baz(0)
I must be fundamentally missing something about the data structures involved here.
I think you're just using the wrong syntax. The $ needs to come before the {}, and the {} is necessary for any expression more complicated than a bare variable name:
s"${question.bar(0).baz(0)}"