I am trying to use Newtonsoft Json.NET in C#. The following is part of a JSON file that I need to retrieve data out of:
{
  "video": {
    "local_recording_device": {
      "codecs": null
    },
    "preferred_string": "___PREFERRED___",
    "streams": {
      "99176901": {
        "id": "99176901",
        "name": "PTZ Camera",
        "site": "someone",
        "email": "someone@awebsite.com",
        "codec": "VP8 HD1 (720p)",
        "local": true,
        "screen": false,
        "fit_to_window": true,
        "stay_on_top": false,
        "layout": 0,
        "native_width": 1280,
        "native_height": 720,
        "window_width": 456,
        "window_height": 254,
        "preferred": false,
        "local_recording": false,
        "device_id": "MJPEG Camera",
        "normalized_device_id": "MJPEGCamera",
        "shared_window_id": "MJPEG Camera",
        "enable": true,
        "big_location": "2",
        "x": 347,
        "y": 737,
        "window_id": "197302",
        "camera_id": null
      },
      "3091494011": {
        "id": "3091494011",
        "name": "Logitech Webcam C930e",
        "site": "Joe Smith",
        "email": "joe@awebsite.com",
        "codec": "VP8 Medium (CIF)",
        "local": false,
        "screen": false,
        "fit_to_window": true,
        "stay_on_top": false,
        "layout": 0,
        "native_width": 352,
        "native_height": 288,
        "window_width": 864,
        "window_height": 702,
        "preferred": true,
        "local_recording": false,
        "enable": true,
        "big_location": "1",
        "x": 204,
        "y": 0,
        "window_id": "197296",
        "camera_id": null
      },
      "3798287599": {
        "id": "3798287599",
        "name": "Drive Camera",
        "site": "ASiteName",
        "email": "asitesame@awebsite.com",
        "codec": "VP8 HD1 (720p)",
        "local": true,
        "screen": false,
        "fit_to_window": true,
        "stay_on_top": false,
        "layout": 0,
        "native_width": 1280,
        "native_height": 720,
        "window_width": 456,
        "window_height": 254,
        "preferred": true,
        "local_recording": false,
        "device_id": "Logitech Webcam C930e",
        "normalized_device_id": "LogitechWebcamC930e",
        "shared_window_id": "Logitech Webcam C930e",
        "enable": true,
        "big_location": "3",
        "x": 814,
        "y": 737,
        "window_id": "262822",
        "camera_id": null
      }
    }
  }
}
So, inside the JSON data there is "video", which contains "streams". Inside "streams" there can be any number of different streams, keyed by stream id (the long numbers), and those ids can change at any time; in my example there are three. I need to search through all the streams in "streams" and see whether any of them has an "email" that matches a particular email address (each stream has an "email"). If an email matches my supplied address, I need to check that stream's "enable" to see whether it is true or false.
Any help in leading me in the right direction is appreciated. I have not worked with JSON data before.
You can use LINQ to JSON and SelectTokens to do the required query:
string json = GetJson();
var obj = JObject.Parse(json);
var testEmail = "someone@awebsite.com";
// "video.streams.*" selects every stream object regardless of its numeric key
var streamQuery = obj.SelectTokens("video.streams.*").Where(s => (string)s["email"] == testEmail);
var firstEnabled = streamQuery.Select(s => (bool?)s["enable"]).FirstOrDefault();
The query returns a nullable bool: true or false giving the "enable" state of the first stream with the desired email, or null if there is no stream for that email address.
Note that this returns the enabled state of the first stream matching the given email address. If you want to know if any are enabled, do:
var anyEnabled = streamQuery.Any(s => (bool)s["enable"]);
I am using React to fetch some data from an API endpoint and then display it.
I want to sort it by date and time before displaying it.
The data looks like this when it's fetched:
{
  "2021-03-09T07:47:24.897Z[UTC]": "Something happened",
  "2021-03-09T07:48:12.256Z[UTC]": "Test event",
  "2021-03-09T08:04:49.484Z[UTC]": "Warning",
  "2021-03-09T07:08:15.714Z[UTC]": "Test event 2",
  "2021-03-09T07:47:24.736Z[UTC]": "Something bad happened 2"
}
I cannot change this JSON structure. I need to sort it by date and time and display it in the format YYYY-MM-DD h:mm:ss.
My function to do this looks like this:
formatDate(obj) {
  return Object.keys(obj).sort((a,b) => moment(a.obj).format('YYYY-MM-DD h:mm:ss') - moment(b.obj).format('YYYY-MM-DD h:mm:ss'))
}
Then I do the following to the fetched json object:
console.log(this.formatDate(json));
Doing so returns the following:
0: "2021-03-09T07:47:24.897Z[UTC]"
1: "2021-03-09T07:48:12.256Z[UTC]"
2: "2021-03-09T08:04:49.484Z[UTC]"
3: "2021-03-09T07:08:15.714Z[UTC]"
4: "2021-03-09T07:47:24.736Z[UTC]"
The returned dates are not sorted. How do I make sure they are returned sorted?
There is no obj property in your JSON, so a.obj is undefined.
There is no need to apply format(), since that turns the value into a string; we can just compare the moment objects directly.
The "[UTC]" suffix inside the key is not a standard date format, which leads to this warning:
Deprecation warning: value provided is not in a recognized RFC2822 or ISO format. moment construction falls back to js Date(), which is not reliable across all browsers and versions. Non RFC2822/ISO date formats are discouraged. Please refer to http://momentjs.com/guides/#/warnings/js-date/ for more info.
You may try the following snippet, which fixes the above points:
$(function () {
  let json = {
    "2021-03-09T07:47:24.897Z[UTC]": "Something happened",
    "2021-03-09T07:48:12.256Z[UTC]": "Test event",
    "2021-03-09T08:04:49.484Z[UTC]": "Warning",
    "2021-03-09T07:08:15.714Z[UTC]": "Test event 2",
    "2021-03-09T07:47:24.736Z[UTC]": "Something bad happened 2"
  };
  console.log("Sorted result:");
  console.log(formatDate(json));
});

// better named sortDates
function formatDate(obj) {
  // strip the non-standard "[UTC]" suffix, then compare the moment objects
  return Object.keys(obj).sort((a, b) => moment(a.replace("[UTC]", "")) - moment(b.replace("[UTC]", "")));
}
<script src="https://cdnjs.cloudflare.com/ajax/libs/moment.js/2.29.1/moment.min.js"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.3.1/jquery.min.js"></script>
It's unclear from your function whether you want to format the dates or sort them as your question asks, but you can compare the datetime strings directly for the sorting, i.e. dateA.localeCompare(dateB), since ISO 8601 timestamps sort correctly as plain strings.
Object.fromEntries(
  Object.entries(data).sort(([keyA], [keyB]) => keyA.localeCompare(keyB))
);
Convert the object into an array of key-value pairs, sort the array by the keys, then convert the array of key-value pairs back into an object (modern JavaScript engines preserve insertion order for non-numeric string keys, so the rebuilt object keeps the sorted order).
If you need to do any format conversions then you should do this via a map operation, i.e. you map an array of keys from one format to another.
To convert to a UTC time:
moment.utc(key.replace("[UTC]", "")).format('YYYY-MM-DD h:mm:ss')
const data = {
  "2021-03-09T07:47:24.897Z[UTC]": "Something happened",
  "2021-03-09T07:48:12.256Z[UTC]": "Test event",
  "2021-03-09T08:04:49.484Z[UTC]": "Warning",
  "2021-03-09T07:08:15.714Z[UTC]": "Test event 2",
  "2021-03-09T07:47:24.736Z[UTC]": "Something bad happened 2"
};

const sortedData = Object.fromEntries(
  Object.entries(data).sort(([keyA], [keyB]) => keyA.localeCompare(keyB))
);
console.log(sortedData);

const mappedData = Object.fromEntries(
  Object.entries(sortedData).map(([key, value]) => [
    moment.parseZone(key.replace("[UTC]", "")).format('YYYY-MM-DD h:mm:ss'),
    value
  ])
);
console.log(mappedData);
<script src="https://cdnjs.cloudflare.com/ajax/libs/moment.js/2.29.1/moment.min.js"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.3.1/jquery.min.js"></script>
I know JSON messages are just keys and values, but is there a Wireshark plugin or tool available for filtering JSON messages within a pcap?
For example, let's say I have the below 4 JSON messages within a pcap:
{"name":"john","salary":50000,"email":"john#stackoverflow.com"}
{"name":"vj","salary":55000,"email":"vj#stackoverflow.com"}
{"name":"rj","salary":65000,"email":"rj#stackoverflow.com"}
{"name":"rambo","salary":66000,"email":"rambo#stackoverflow.com"}
I want to filter out all the people whose salary is less than 65000.
Pseudocode:
func filter_packet_based_on_rule(json_packet)
{
    // in this case the rule is salary < 65000, but it could be more complex
    if json_packet matches any of the rules
        return true
    else
        return false
}

main()
{
    while true
        json_packet = read_packet_from_pcap()
        if ! json_packet { break; }
        if (filter_packet_based_on_rule(json_packet))
            print json_packet
}
I believe you are looking for something like this - https://github.com/ranjithum/Json-Analyzer/
Although there is no code there to parse the pcap (which shouldn't be hard using libpcap), you could write your own filtering rules and filter out the necessary packets.
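If a one-off script is acceptable, here is a minimal sketch of the same filtering in Node.js. It assumes the JSON payloads have already been extracted from the pcap into a newline-delimited file (the messages.ndjson name and the extraction step, e.g. via tshark, are assumptions):

const fs = require('fs');

// read one JSON message per line, skipping blank lines
const lines = fs.readFileSync('messages.ndjson', 'utf8')
  .split('\n')
  .filter(line => line.trim().length > 0);

// keep only the messages matching the rule (salary < 65000)
const matches = lines
  .map(line => JSON.parse(line))
  .filter(msg => msg.salary < 65000);

matches.forEach(msg => console.log(JSON.stringify(msg)));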
I'm trying to set up a rule in an Azure Stream Analytics job using reference data and an input stream coming from an event hub.
This is my reference data JSON packet in BLOB storage:
{
  "ruleId": 1234,
  "Tag": "TAG1",
  "metricName": "velocity",
  "alertName": "velocity over 500",
  "operator": "AVGGREATEROREQUAL",
  "value": 500
}
And here is the transformation query in the stream analytics job:
WITH
transformedInput AS
(
SELECT
metric = GetArrayElement(DeviceInputStream.data,0),
masterTag = rules.Tag,
ruleId = rules.ruleId,
alertName = rules.alertName,
ruleOperator = rules.operator,
ruleValue = rules.value
FROM
DeviceInputStream
timestamp by EventProcessedUtcTime
JOIN
rules
ON DeviceInputStream.masterTag = rules.Tag
)
--rule output--
SELECT
System.Timestamp as time,
transformedInput.Tag as Tag,
transformedInput.ruleId as ruleId,
transformedInput.alertName as alert,
AVG(metric.velocity) as avg
INTO
alertruleblob
FROM
transformedInput
GROUP BY
transformedInput.masterTag,
transformedInput.ruleId,
transformedInput.alertName,
ruleOperator,
ruleValue,
TumblingWindow(second, 6)
HAVING
ruleOperator = 'AVGGREATEROREQUAL' AND avg(metric.velocity) >= ruleValue
This is not yielding any results. However, when I run a test with sample input and reference data I get the expected results, so this doesn't seem to be working with the streaming data. My use case is: if the average velocity is greater than 500 over a 6-second window, store that result in another blob storage. The value of velocity has been greater than 500 for some time, but I'm not getting any results.
What am I doing wrong?
This was working all along. I just had to specify the input path of the reference blob in the reference input path of the Stream Analytics job, including the file name. I was basically referencing only the blob container without the actual file, so when I changed the path pattern to include the file name (e.g. "filename.json"), I got the results. It was a simple mistake.
What's the proper Loopback filter format to query a value in a MySQL (5.7) JSON field? For example, how would we perform this query using a Loopback REST filter?
QUERY
SELECT name, info->"$.id", info->"$.team"
FROM employee
WHERE JSON_EXTRACT(info, "$.team") = "A"
ORDER BY info->"$.id";
SAMPLE DATA
+--------------------------+---------+
| info                     | name    |
+--------------------------+---------+
| {"id": "1", "team": "A"} | "Sam"   |
| {"id": "2", "team": "A"} | "Julie" |
| {"id": "3", "team": "B"} | "Rob"   |
| {"id": "4", "team": "B"} | "Cindy" |
+--------------------------+---------+
UNSUCCESSFUL ATTEMPTS
/employee?filter[where][info->$.team]=A
/employee?filter[where][info.team]=A
/employee?filter[where][info][team]=A
Since this question is the first to pop up from a Google search, and the Loopback MySQL connector still does not allow querying JSON fields, I feel a proper answer should be given for future readers.
The workaround is to add the feature directly to the connector yourself, until the Loopback maintainers settle on a decision about how to really handle it.
Here's our take; put this inside your /connectors folder:
import * as connector from "loopback-connector-mysql";

var g = require('strong-globalize')();
var SqlConnector = require('loopback-connector').SqlConnector;
var ParameterizedSQL = SqlConnector.ParameterizedSQL;

const _buildExpression = connector.MySQL.prototype.buildExpression;

connector.MySQL.prototype.buildExpression = function (columnName, operator, operatorValue, propertyDefinition) {
    if (operator === 'json') {
        operatorValue = JSON.parse(operatorValue);
        const keys = Object.keys(operatorValue);
        if (keys.length > 1) {
            g.warn('{{MySQL}} {{json}} syntax can only receive one key, received ' + keys.length);
        }
        const jsonPointer = "$." + keys[0];
        const value = operatorValue[keys[0]];
        const column = `JSON_EXTRACT(${columnName}, "${jsonPointer}")`;
        if (value && value.constructor === Object) {
            // this includes another operator, apply it on the built column
            const operator = Object.keys(value)[0];
            return _buildExpression.apply(this, [column, operator, value[operator], propertyDefinition]);
        }
        const clause = `${column} = ?`;
        return new ParameterizedSQL(clause, [value]);
    } else {
        return _buildExpression.apply(this, [columnName, operator, operatorValue, propertyDefinition]);
    }
};

export default connector;
Then point to this file in your database config (connector: 'connectors/mysql-json'), or require it into the connector if that doesn't work (the docs say we can define a path, but we could not get it working...).
This is pretty simple: we overwrite the buildExpression function to be able to insert a new operator, json. This makes it usable anywhere you'd use other operators such as gt, lt, nin, etc.
We went a step further and allow operators to be passed inside your json operator, so you can leverage them too.
Here's an example query filter:
{"where":
{
"jsonProperty":{"json":{"id":1}}
}
}
// Results in
// WHERE JSON_EXTRACT('jsonProperty', '$.id') = 1
{"where":
{
"jsonProperty":{"json":{"id":{"gt":1}}}
}
}
// Results in
// WHERE JSON_EXTRACT(`jsonProperty`, '$.id') > 1
We simply prepend the key of the object passed to json with $. for ease of use (not sure if it's the best approach, but it works for us). You could write any JSON path as the key; just omit the $.
I don't think this is possible at the moment; I believe the connector does not support JSON_EXTRACT.
https://github.com/strongloop/loopback-connector-mysql
I think that to execute that kind of query you have to use raw queries and write custom code against the actual connector.
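For instance, here is a minimal sketch of that raw-query route (the mysqlDs datasource name is an assumption; connector.execute is the low-level query method exposed by the SQL connectors):

// assumed datasource name; adjust to match your datasources.json
const ds = app.dataSources.mysqlDs;

const sql = 'SELECT name, info->"$.id" AS id, info->"$.team" AS team ' +
            'FROM employee WHERE JSON_EXTRACT(info, "$.team") = ? ' +
            'ORDER BY info->"$.id"';

// parameters are passed separately so the connector can escape them
ds.connector.execute(sql, ['A'], (err, rows) => {
  if (err) throw err;
  console.log(rows);
});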
I see a lot of references to "compressed JSON" when it comes to different serialization formats. What exactly is it? Is it just gzipped JSON or something else?
Compressed JSON removes the key/value pairing of JSON's encoding and stores keys and values in separate parallel arrays:
// uncompressed
JSON = {
    data : [
        { field1 : 'data1', field2 : 'data2', field3 : 'data3' },
        { field1 : 'data4', field2 : 'data5', field3 : 'data6' },
        .....
    ]
};

//compressed
JSON = {
    data : [ 'data1','data2','data3','data4','data5','data6' ],
    keys : [ 'field1', 'field2', 'field3' ]
};
I found this method of usage here (http://www.nwhite.net/?p=242); the content from the link follows:
I rarely find myself in a place where I am writing JavaScript applications that use AJAX in its pure form. I have long abandoned the 'X' and replaced it with 'J' (JSON). When working with JavaScript, it just makes sense to return JSON: a smaller footprint, easier parsing and a simpler structure are all advantages I have gained since using JSON.
In a recent project I found myself unhappy with the large size of my result sets. The data I was returning was tabular, in the form of objects for each row. I was returning a result set of 50 rows with 19 fields each. What I realized is that if I augmented my result set I could get a form of compression.
I merged all my values into a single array and stored all my fields in a separate array. Returning a key/value pair for each result cost me 8800 bytes (8.6 KB). Ripping the fields out and putting them in a separate array cost me 186 bytes. Total savings: 8.4 KB.
Now I have a much more compressed JSON file, but the structure is different and harder to work with. So I implemented a solution in MooTools to make the decompression transparent.
Request.JSON.extend({
    options : {
        inflate : []
    }
});

Request.JSON.implement({
    success : function(text){
        this.response.json = JSON.decode(text, this.options.secure);
        if(this.options.inflate.length){
            this.options.inflate.each(function(rule){
                // store the expanded rows under rule.store if given, else replace rule.data in place
                var key = $defined(rule.store) ? rule.store : rule.data;
                this.response.json[key] = this.expandData(this.response.json[rule.data], this.response.json[rule.keys]);
            }, this);
        }
        this.onSuccess(this.response.json, text);
    },
    expandData : function(data, keys){
        var arr = [];
        var len = data.length, klen = keys.length;
        var start = 0, stop = klen;
        // <= so the final row is not dropped
        while(stop <= len){
            arr.push(data.slice(start, stop).associate(keys));
            start = stop;
            stop += klen;
        }
        return arr;
    }
});
Request.JSON now has an inflate option. You can inflate multiple segments of your JSON object if you so desire.
Usage:
new Request.JSON({
    url : 'url',
    inflate : [{ 'keys' : 'fields', 'data' : 'data' }],
    onComplete : function(json){}
});
Pass as many inflate objects as you like in the inflate option array. Each has an optional property called 'store'; if set, the inflated data set will be stored under that key instead.
The 'keys' and 'data' values expect strings that match a location in the root of your JSON object.
Based on Paniyar's answer, we can convert a list of objects into the "compressed" JSON format using C# like this:
var JsonString = serializer.Serialize(
    new
    {
        cols = new[] { "field1", "field2", "field3" },
        items = data.Select(x => new object[] { x.field1, x.field2, x.field3 })
    });
I used an array of object because the fields can be int, bool, string...
More reduction:
If a field is repeated very often and is a string type, you can compress a little more by adding a distinct list of that field's values... for instance, fields like job position or city are excellent candidates for this, as shown in the sketch below. You add a distinct list of the items and, in each row, replace the value with a reference number into that list. That will make your JSON lighter.
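As an illustration, here is a minimal sketch of that distinct-list idea in JavaScript (the field and variable names are made up):

// rows with a frequently repeated string field
const rows = [
  { name: 'Sam',   city: 'Paris'  },
  { name: 'Julie', city: 'London' },
  { name: 'Rob',   city: 'Paris'  }
];

// build the distinct list once...
const cities = [...new Set(rows.map(r => r.city))];

// ...then replace each repeated value with its index into that list
const compressed = {
  cities,
  data: rows.map(r => [r.name, cities.indexOf(r.city)])
};
// { cities: ['Paris', 'London'], data: [['Sam', 0], ['Julie', 1], ['Rob', 0]] }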
Compressed:
[["KeyA", "KeyB", "KeyC", "KeyD", "KeyE", "KeyF"],
["ValA1", "ValB1", "ValC1", "ValD1", "ValE1", "ValF1"],
["ValA2", "ValB2", "ValC2", "ValD2", "ValE2", "ValF2"],
["ValA3", "ValB3", "ValC3", "ValD3", "ValE3", "ValF3"],
["ValA4", "ValB4", "ValC4", "ValD4", "ValE4", "ValF4"]]
Uncompressed:
[{KeyA: "ValA1", KeyB: "ValB1", KeyC: "ValC1", KeyD: "ValD1", KeyE: "ValE1", KeyF: "ValF1"},
{KeyA: "ValA2", KeyB: "ValB2", KeyC: "ValC2", KeyD: "ValD2", KeyE: "ValE2", KeyF: "ValF2"},
{KeyA: "ValA3", KeyB: "ValB3", KeyC: "ValC3", KeyD: "ValD3", KeyE: "ValE3", KeyF: "ValF3"},
{KeyA: "ValA4", KeyB: "ValB4", KeyC: "ValC4", KeyD: "ValD4", KeyE: "ValE4", KeyF: "ValF4"}]
The most likely answer is that it really is just gzipped JSON. There is no other standard meaning to this phrase.
Re-organizing a homogeneous array of JSON objects into a pair of arrays is a very useful technique for making the payload smaller and speeding up encoding and decoding, but it is not commonly called "compressed JSON". I haven't run across it in open source or in any open API, but we use this technique internally and call it "jsontable".