I have this JSON array in a file:
var jsonfromfile = [
[Date.UTC(2004, 1, 3), 19.3],
[Date.UTC(2004, 1, 10), 12.7],
[Date.UTC(2004, 1, 17), 3.6],
[Date.UTC(2004, 1, 24), 19.1],
[Date.UTC(2004, 1, 31), 12.1],
[Date.UTC(2004, 2, 7), 11.3],
[Date.UTC(2004, 2, 28), 9.3],
[Date.UTC(2004, 3, 6), 14.3],
[Date.UTC(2004, 3, 13), 5.8],
[Date.UTC(2004, 3, 20), 8.6],
[Date.UTC(2004, 3, 27), 19.9],
[Date.UTC(2004, 4, 3), 14.2],
[Date.UTC(2004, 4, 10), 12.8],
[Date.UTC(2004, 4, 17), 10.6],
[Date.UTC(2004, 4, 24), 8.4],
[Date.UTC(2004, 5, 1), 19.8],
[Date.UTC(2004, 5, 8), 13.8]
];
which I was using as dummy data while taking my first steps with these charts: http://www.highcharts.com/products/highstock.
Now I want to use dynamic data with the charts, so I have a controller which returns key-value data:
public virtual JsonResult GetData(int type)
{
    Dictionary<string, decimal> data = getData(type);
    return Json(data.ToArray(), JsonRequestBehavior.AllowGet);
}
and I call that controller with jQuery AJAX:
var jsonFirstTry = {
    data: []
};
$.ajax({
    url: actionUrl,
    dataType: 'json',
    cache: false,
    data: { type: type },
    success: function (items) {
        var jsonSecondTry = "[";
        $.each(items, function (itemNo, item) {
            jsonFirstTry.data.push(item.Key, item.Value);
            jsonSecondTry += "[" + item.Key + "," + item.Value + "],";
        });
        jsonSecondTry = jsonSecondTry.substring(0, jsonSecondTry.length - 1);
        jsonSecondTry += "];";
        //...
    }
});
I tried to reproduce the data from the js file (jsonfromfile) with jsonFirstTry and jsonSecondTry, but couldn't get the data into exactly the same shape.
Here is how the data loaded from the js file looks in the debugger:
Here is how the data from my first try looks:
Here is the data from the second try (but it is just a string, so it is not valid data for the chart...):
So I need to generate the same JSON as in the first image. Any thoughts on how I can do that?
Your initial example (var jsonfromfile = [...) is not JSON. It is a JavaScript array of arrays, written with array literal syntax.
JSON is a string representing a serialized data structure, using a subset of JavaScript's object literal syntax. JSON cannot contain method calls or method definitions.
Thus, attempting to provide, in JSON format, what you used as sample data will not work. You need to provide real JSON and transform it as needed (calling Date.UTC() on portions of it) when it is received.
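A minimal sketch of that client-side transformation, assuming the controller's key-value pairs arrive as objects of the form { Key, Value } and that Key is an ISO date string such as "2004-02-03" (adjust the parsing if your keys use another format):

```javascript
// Build a real array of [millisecondTimestamp, value] pairs -- the shape
// Highstock expects -- instead of concatenating a string.
// Assumes each item looks like { Key: "2004-02-03", Value: 19.3 }.
function toSeriesData(items) {
    return items.map(function (item) {
        var d = new Date(item.Key); // date-only ISO strings parse as UTC
        return [
            Date.UTC(d.getUTCFullYear(), d.getUTCMonth(), d.getUTCDate()),
            parseFloat(item.Value)
        ];
    });
}
```

In the $.ajax success handler you would then assign toSeriesData(items) to the chart's series data, rather than building an array of loose values or a string by hand.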
Related
I am getting unquoted JSON from a trusted 3rd party that looks like this:
{id: 2, name: Test Testerson, course_progress: 0, last_activity_date: null}, {id: 3, name: Poghos Adamyan, course_progress: 0, last_activity_date: null}
What is the best way, using Dart, to format this into valid JSON for use?
If you're absolutely certain that your response is that verbatim string and that you haven't already decoded it, then you would have to parse it manually and hope that there will never be ambiguity. If the data always follows a strict format and if all fields are always in a specific order, you could write a regular expression to parse it. For example:
void main() {
  var s =
      '{id: 2, name: Test Testerson, course_progress: 0, last_activity_date: null}, {id: 3, name: Poghos Adamyan, course_progress: 0, last_activity_date: null}';
  var re = RegExp(
    r'\{'
    r'id: (?<id>\d+), '
    r'name: (?<name>[^,]+), '
    r'course_progress: (?<progress>\d+), '
    r'last_activity_date: (?<last>[^}]+)'
    r'\}',
  );
  var matches = re.allMatches(s);
  var items = <Map<String, dynamic>>[
    for (var match in matches)
      <String, dynamic>{
        'id': int.parse(match.namedGroup('id')!),
        'name': match.namedGroup('name')!,
        'course_progress': int.parse(match.namedGroup('progress')!),
        'last_activity': DateTime.tryParse(match.namedGroup('last')!),
      }
  ];
  items.forEach(print);
}
I have a List consisting of multiple Maps (dictionaries); the values of the map entries don't matter.
var list = [{'a': 1, 'b': 1, 'c': 7},
            {'J': 8, 'b': 2, 'e': 2},
            {'l': 1, 'b': 3, 'r': 4},
            {'u': 9, 'b': 7}];
Note that I don't know how many maps the list will have. It could be 0 or it could be 1000 (I read it from a JSON file).
I want to intersect their keys so the output would be like this:
var res = 'b';
I've done it in python by using this method:
res = set.intersection(*map(set, list))
The following would do the trick. It folds the list, intersecting the maps' key sets one by one. (Note that list.first throws on an empty list, so guard for that case first, since your list may contain zero maps.)
final list = [
  {'a': 1, 'b': 1, 'c': 7},
  {'J': 8, 'b': 2, 'e': 2},
  {'l': 1, 'b': 3, 'r': 4},
  {'u': 9, 'b': 7}
];
final res = list.fold<Set<String>>(
  list.first.keys.toSet(),
  (result, map) => result.intersection(map.keys.toSet()),
);
print(res); // Prints: {b}
As per the example from the documentation:
var pairs = [[1, 2], [3, 4]];
var flattened = pairs.expand((pair) => pair).toList();
print(flattened); // => [1, 2, 3, 4];
var input = [1, 2, 3];
var duplicated = input.expand((i) => [i, i]).toList();
print(duplicated); // => [1, 1, 2, 2, 3, 3]
It looks like it flattens an iterable that contains nested iterables, but how it does so is the question.
What it basically does is iterate over the iterable, calling the argument function on each element; once the iteration is over, it concatenates the iterables returned by the argument function and returns the result of that concatenation, which is itself an iterable.
That was a summary of how it works; let's walk through it using the example from the documentation itself:
var pairs = [[1, 2], [3, 4]];
var flattened = pairs.expand((pair) => pair).toList();
print(flattened); // => [1, 2, 3, 4];
Here we have an iterable pairs, and we call the expand() method on it. The expand() method iterates over pairs, calling the argument function, (pair) => pair, once per element.
Note that the signature of expand() is Iterable<T> expand<T>(Iterable<T> toElements(E element)), which shows that it takes a function as its argument: a function that takes an element of the source iterable's element type E and returns an iterable. E.g. in (pair) => pair, pair is of type List<int>.
So far we have established that expand() iterates over an iterable, calling the argument function on each element. The argument function takes an argument of the same type as the iterable's elements and returns an iterable.
Lastly, once the iteration over the iterable (pairs) is over, expand() concatenates the iterables returned by the argument function, [1, 2] + [3, 4] = [1, 2, 3, 4], and returns the result of the concatenation, which is an iterable: [1, 2, 3, 4].
It's basically just a loop within a loop: it iterates into each inner iterable, visits each of its elements, and yields them all as a single, stretched-out iterable.
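For readers coming from JavaScript, Array.prototype.flatMap does the same map-then-flatten-one-level operation as Dart's expand(); a quick comparison sketch:

```javascript
// JavaScript analogue of Dart's expand(): map each element to an array,
// then concatenate the results one level deep.
var pairs = [[1, 2], [3, 4]];
var flattened = pairs.flatMap(function (pair) { return pair; });
console.log(flattened); // [1, 2, 3, 4]

var input = [1, 2, 3];
var duplicated = input.flatMap(function (i) { return [i, i]; });
console.log(duplicated); // [1, 1, 2, 2, 3, 3]
```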
I can't find the source code for expand, but in my darq package you can see the same concept in action in the selectMany method (selectMany is just expand with an additional index passed to the selector). For how Dart's expand works, ignore all the parts that deal with the index.
extension SelectManyExtension<T> on Iterable<T> {
  /// Maps elements in an iterable to collections and then flattens those
  /// collections into a single iterable.
  ///
  /// During iteration, the [selector] function is provided each value in the iterable
  /// along with the index of the value in the iteration. The
  /// returned collection of that function is then iterated over, and each
  /// value in that iteration is provided as the next element of the
  /// resulting iterable. The result is all of the collections flattened so that
  /// their values become elements in a single iterable.
  ///
  /// Example:
  ///
  ///     void main() {
  ///       final list = ['abc', 'de', 'f', 'ghij'];
  ///       final result = list.selectMany((s, i) => s.iterable);
  ///
  ///       // Result: ['a', 'b', 'c', 'd', 'e', 'f', 'g', 'h', 'i', 'j']
  ///     }
  Iterable<TResult> selectMany<TResult>(
      Iterable<TResult> Function(T element, int index) selector) sync* {
    var index = 0;
    for (var v in this) {
      yield* selector(v, index++);
    }
  }
}
var list = [[1, 2, 3], [4, 5], [6]];
var flattened = list.selectMany((inner, idx) => inner);
// flattened = [1, 2, 3, 4, 5, 6]
I get this object
{
"138.68.226.120:26969": 1,
"178.128.50.37:26969": 1,
"207.180.218.133:26969": 1,
"66.42.67.157:26969": 1,
"140.82.14.193:26969": 1,
"51.15.39.62:26969": 1,
"144.217.91.232:26969": 1,
"144.217.81.95:26969": 1,
"68.183.105.143:26969": 1,
"192.99.246.177:26969": 1,
"167.99.98.151:26969": 1,
"59.79.71.205:26969": 1
}
When I use jq '."59.79.71.205:26969"' it gives me the value only. Is there a way to get the key-value pair from the object into an object like this example:
{
"59.79.71.205:26969": 1
}
The answer is in the Object Construction section of the manual.
jq '{"59.79.71.205:26969"}'
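If the goal is to split every key-value pair into its own single-pair object, rather than extracting one hard-coded key, jq's to_entries can do it generically. A sketch with a small stand-in object:

```shell
# to_entries turns {k: v, ...} into [{"key": k, "value": v}, ...];
# map({(.key): .value}) then rebuilds each entry as its own object.
echo '{"a": 1, "b": 2}' | jq -c 'to_entries | map({(.key): .value})'
# -> [{"a":1},{"b":2}]
```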
const splittedObject = Object.keys( // get all the keys
    yourObject
).map((key) => { // then for each key, turn the key into an object with the key-value pair
    return {
        [key]: yourObject[key] // assign the value to the key and voila
    };
});
Now splittedObject is an array of those single-key objects. It's easier to demonstrate with this snippet:
const yourObject = { "138.68.226.120:26969": 1, "178.128.50.37:26969": 1, "207.180.218.133:26969": 1, "66.42.67.157:26969": 1, "140.82.14.193:26969": 1, "51.15.39.62:26969": 1, "144.217.91.232:26969": 1, "144.217.81.95:26969": 1, "68.183.105.143:26969": 1, "192.99.246.177:26969": 1, "167.99.98.151:26969": 1, "59.79.71.205:26969": 1 };
const splittedObject = Object.keys( // get all the keys
    yourObject
).map((key) => { // then for each key, turn the key into an object with the key-value pair
    return {
        [key]: yourObject[key] // assign the value to the key and voila
    };
});
console.log(splittedObject);
By the way, can I ask why you need to do this?
I have a metadata object in the form
{
filename: "hugearray.json",
author: "amenadiel",
date: "2014-07-11",
introduction: "A huge ass array I want to send to the browser"
}
That hugearray.json is a text file in my folder which contains, as its name implies, an array of potentially infinite elements.
[
[14, 17, 25, 38, 49],
[14, 41, 54, 57, 58],
[29, 33, 39, 53, 59],
...
[03, 14, 18, 34, 37],
[03, 07, 14, 29, 33],
[05, 16, 19, 30, 49]
]
What I want to achieve is to output to the browser an object which is the original object, with the extra key 'content' which is the huge array
{
filename: "hugearray.json",
author: "amenadiel",
date: "2014-07-11",
introduction: "A huge ass array I want to send to the browser",
content: [
[14, 17, 25, 38, 49],
...
[05, 16, 19, 30, 49]
]
}
But since I don't know the array size, I don't want to store the whole thing in memory before outputting, so I thought of using streams. I can stream the array fine with
var readStream = fs.createReadStream("hugearray.json");
readStream.on('open', function () {
    readStream.pipe(res);
});
And of course I can send the metadata object to the res with
res.json(metadata);
And I've tried deconstructing metadata: writing each key-value pair, leaving a content key open, then piping the file into the result, then closing the curly brace. It doesn't seem to work:
{
filename: "hugearray.json",
author: "amenadiel",
date: "2014-07-11",
introduction: "A huge ass array I want to send to the browser",
content:
}[
[14, 17, 25, 38, 49],
[14, 41, 54, 57, 58],
[29, 33, 39, 53, 59],
...
[03, 14, 18, 34, 37],
[03, 07, 14, 29, 33],
[05, 16, 19, 30, 49]
]
I guess I need to wrap the stream in my metadata content key instead of trying to output json and stream into the result. Any ideas?
Well, my question went unnoticed but made me win the Tumbleweed badge. It's something.
I kept investigating and came up with a solution. I was hoping for a one-liner, but this one works too, and so far I've been able to output several MBs to the browser without a noticeable performance hit in my node process.
This is the method I used
app.get('/node/arraystream', function (req, res) {
    var readStream = fs.createReadStream("../../temp/bigarray.json");
    var myObject = {
        filename: "hugearray.json",
        author: "amenadiel",
        date: "2014-07-11",
        introduction: "A huge ass array I want to send to the browser"
    };
    readStream.on('open', function () {
        console.log('readStream open');
        var myObjectstr = JSON.stringify(myObject);
        // drop the closing brace and splice in the "content" key
        res.write(myObjectstr.substring(0, myObjectstr.length - 1) + ',"content":');
    });
    readStream.on('error', function (err) {
        console.log('readStream error', err);
        throw err;
    });
    readStream.on('close', function () {
        console.log('readStream closed');
        readStream.destroy();
        res.write('}');
        res.end();
    });
    readStream.on('data', function (data) {
        console.log('readStream received data', data.length);
        res.write(data); // data is already a Buffer, so it can be written directly
    });
});
Basically, instead of turning my object into a stream, I turned my array into a buffer.