I am trying to create JSON in Ruby from data coming from a SQL Server table, based on a query. I've worked with Ruby quite a bit and with JSON a little, but never with the two together.
This is a sample of the JSON I'm trying to create.
Even help with just creating the JSON with the nested arrays and the root element would be helpful.
{
"aaSequences": [
{
"authorIds": [
"ent_fdfdfdfdf_one"
],
"aminoAcids": "aminoAcids_data",
"name": "bbbbb-22",
"schemaId": "ls_jgjgjg",
"registryId": "src_fgfgfgf",
"namingStrategy": "NEW_IDS"
},
{
"authorIds": [
"ent_fdfdfdfdf_two"
],
"aminoAcids": "aminoAcids_data",
"name": "bbbbb-22",
"schemaId": "ls_jgjgjg",
"registryId": "src_fgfgfgf",
"namingStrategy": "NEW_IDS"
}
]
}
Generate JSON from a Ruby hash object
To generate JSON, first start with a hash (similar to a Python dict) in Ruby:
my_hash = {:foo => 1, :bar => 2, :baz => 3}
Make sure you require the json library as well:
require 'json'
Then you can simply convert the hash object to a JSON string
my_hash.to_json # => "{\"foo\":1,\"bar\":2,\"baz\":3}"
You can nest arrays into your hash as well
my_hash_2 = {:foo => [1, 2, 3, 4], :bar => ['a', 'b', 'c', 'd']}
I'll let you try that one on your own, but Ruby will handle the nested structure just fine for you.
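Putting this together for the question at the top: the whole payload is just a hash with one key, "aaSequences", whose value is an array of hashes built from the query rows, and a single `JSON.pretty_generate` (or `to_json`) call serializes it. A minimal sketch, with hypothetical row hashes standing in for the real SQL Server results:

```ruby
require 'json'

# Hypothetical rows, as you might get back from a SQL Server query
# (the column names here are made up for illustration).
rows = [
  { "author_id" => "ent_fdfdfdfdf_one", "amino_acids" => "aminoAcids_data", "name" => "bbbbb-22" },
  { "author_id" => "ent_fdfdfdfdf_two", "amino_acids" => "aminoAcids_data", "name" => "bbbbb-22" }
]

payload = {
  "aaSequences" => rows.map do |row|
    {
      "authorIds"      => [row["author_id"]],
      "aminoAcids"     => row["amino_acids"],
      "name"           => row["name"],
      "schemaId"       => "ls_jgjgjg",
      "registryId"     => "src_fgfgfgf",
      "namingStrategy" => "NEW_IDS"
    }
  end
}

puts JSON.pretty_generate(payload)
```

With real data you'd replace `rows` with whatever your database client returns, typically one hash per row.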
I've noticed that some APIs use a format of sending a stripped down version of their data via a JSON array like the following:
[
"joe",
[
5,
2,
"yellow"
]
]
And store a set of keys like the following:
[
"name",
["some_data", [
"favorite_number",
"least_favorite_number",
"car_color"
]]
]
To turn the data from a bunch of bare values into a readable set of data, like the following:
{
"name": "joe",
"some_data": {
"favorite_number": 5,
"least_favorite_number": 2,
"car_color": "yellow"
}
}
I was wondering how this could be done? I'd prefer it to be in Python, but I'm fine with writing my own libraries.
After grasping at more straws than I could fit in my mouth, I've figured it out. JSON schema is what I'm supposed to be using!
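Whichever library ends up doing the work, the reshaping itself is a small recursive zip of the key template against the packed values: a plain key takes the value at the same index, and a `[name, sub_keys]` entry recurses into the nested array. A sketch in Ruby (the question asked for Python, but the logic translates directly):

```ruby
# Recursively pair a key template with a packed value array.
# A plain key maps to the value at the same index; an entry of the
# form [name, sub_keys] maps to a nested object built the same way.
def unpack(keys, values)
  keys.each_with_index.each_with_object({}) do |(key, i), out|
    if key.is_a?(Array)
      name, sub_keys = key
      out[name] = unpack(sub_keys, values[i])
    else
      out[key] = values[i]
    end
  end
end

keys = ["name", ["some_data", ["favorite_number", "least_favorite_number", "car_color"]]]
data = ["joe", [5, 2, "yellow"]]

unpack(keys, data)
# => {"name"=>"joe", "some_data"=>{"favorite_number"=>5, "least_favorite_number"=>2, "car_color"=>"yellow"}}
```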
I have JSON data shown below. I am using Python to encode a list, a dictionary and another list into JSON. The final JSON data will look like so:
{
"0": [3, 3, 3],
"1": {
"0": [0, 8, 9],
"1": [1, 2, 3, 4, 10, 11],
"2": [4]
},
"2": [1, 1, 1, 1]
}
My aim is to write some type of Scala function to extract the JSON data in a way that allows:
"0": [3, 3, 3] to be a List(3,3,3)
{"0":[0,8,9], ...} to be a HashMap[Int,List[Int]]
"2": [1, 1, 1, 1] to be a List(1,1,1,1)
Note that the lengths of the original Python list and dictionary will vary, but the keys "0", "1", "2" will always be there, representing the list, the dictionary and the list, in that order.
I am quite new to Scala and struggling with how to do it without external libraries; for now I am trying to use spray-json, since I am using a newer version of Scala (which has no built-in JSON parser).
That doesn't look like valid JSON to me, which means any of the JSON parsers you might use won't work. Is that structure fixed? You may want to convert it instead to something that's valid JSON, e.g.:
{
"list" : [ 1,1,1],
"someotherObject" : {
"0" : [1,2,3]
},
"anotherList" : [9,8,7]
}
Then you could use Argonaut (for example) and define a decoder, which tells it how to map that JSON to the object types you specify. See http://argonaut.io/doc/codec/
I'm putting tsung logs into Elasticsearch (ES) so that I can filter, visualize and compare results using Kibana.
I'm using Logstash and its JSON parsing filter to push tsung logs in JSON format to ES.
Tsung logs are a bit complicated (IMO), with arrays nested inside arrays, multi-line events, and several fields sharing the same name, such as "value" in my example below.
I would like to transform this event:
{
"stats":[
{"timestamp": 1317413861, "samples": [
{"name": "users", "value": 0, "max": 1},
{"name": "users_count", "value": 1, "total": 1},
{"name": "finish_users_count", "value": 1, "total": 1}]}]}
into this:
{"timestamp": 1317413861},{"users_value":0},{"users_max":1},{"users_count_value":1},{"users_count_total":1},{"finish_users_count_value":1},{"finish_users_count_total":1}
Since the entire tsung log file is forwarded to Logstash at the end of a performance test campaign, I'm thinking about using a regex to remove the carriage returns and the unneeded stats and samples arrays before sending the event to Logstash, in order to simplify things a little.
And then, I would use those kind of JSON filter options:
add_field => {"%{name}_value" => "%{value}"}
add_field => {"%{name}_max" => "%{max}"}
add_field => {"%{name}_total" => "%{total}"}
But how should I handle the fact that there are several "value" fields in one event, for instance? What is the best thing to do?
Thanks for your help.
Feels like the ruby{} filter would be needed here. Loop across the entries in the 'samples' field, and construct your own fields based on the name/value/total/max.
There are examples of this type of behavior elsewhere on SO.
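To illustrate what such a ruby{} filter would do, here is the reshaping as plain Ruby; inside Logstash you would read and write fields through the filter's event API rather than a local hash, so this only shows the loop itself:

```ruby
require 'json'

# The tsung event from the question, as a JSON string.
raw = '{"stats":[{"timestamp":1317413861,"samples":[' \
      '{"name":"users","value":0,"max":1},' \
      '{"name":"users_count","value":1,"total":1},' \
      '{"name":"finish_users_count","value":1,"total":1}]}]}'

flat = {}
JSON.parse(raw)["stats"].each do |stat|
  flat["timestamp"] = stat["timestamp"]
  stat["samples"].each do |sample|
    prefix = sample["name"]
    # Every field except "name" becomes a "<name>_<field>" entry.
    sample.each do |field, value|
      next if field == "name"
      flat["#{prefix}_#{field}"] = value  # e.g. "users_value" => 0
    end
  end
end
# flat => {"timestamp"=>1317413861, "users_value"=>0, "users_max"=>1, ...}
```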
I'm going to develop a push server (HTML5 WebSocket/polling), and in order to reduce the size of the packets (which are in JSON format) I want to send something like this:
[["id", "username", "password"], [1, "afshin", "123"], [2, "barak", "meme"]]
Instead of the plain JSON format:
[{"id": 1, "username": "afshin", "password": "123"}, {"id": 2, "username": "barak", "password": "meme"}]
Essentially, I want to avoid sending the contract's property names in each object.
So I want to know: is there any library for doing this (or something like it)? I have C# on the server and JavaScript on the clients.
JSON DB or RJSON should be exactly what you're looking for. You'll most likely have to implement the serializers/deserializers yourself (though RJSON is already implemented in JS).
As for compressing pure JSON, I think you could bypass the "keys are needed" rule by wrapping all your data in a single object entry:
{"data" : [["id", "username", "password"], [1, "afshin", "123"], [2, "barak", "meme"]]}
So, setting aside all the arguments against manual compression, this would be one solution:
var input = [{"id": 1, "username": "afshin", "password": "123"}, {"id": 2, "username": "barak", "password": "meme"}];
var keys = {}
input.map ( function (e) { Object.keys(e).map( function (k) { keys[k] = 1; })});
var output = [ Object.keys(keys) ] .concat( input.map( function (e) {
return Object.keys(keys).map( function (k) { return e[k]; } );
} ) );
console.log(output);
and Node.js produces:
[ [ 'id', 'username', 'password' ],
[ 1, 'afshin', '123' ],
[ 2, 'barak', 'meme' ] ]
I don't really know whether this works in every browser, etc.
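The snippet above builds the packed form; decoding it on the other end is just a zip of the shared key row against each data row. A sketch of the same idea in Ruby, to match the other examples on this page:

```ruby
packed = [["id", "username", "password"],
          [1, "afshin", "123"],
          [2, "barak", "meme"]]

# The first element is the shared key row; zip it against each data row
# to rebuild the original array of objects.
keys, *rows = packed
unpacked = rows.map { |row| keys.zip(row).to_h }
# => [{"id"=>1, "username"=>"afshin", "password"=>"123"},
#     {"id"=>2, "username"=>"barak", "password"=>"meme"}]
```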
By removing the name from each name-value pair, you'd be breaking a JSON syntax rule; effectively, it wouldn't be JSON. You might also cause problems for client-side JSON deserialization. Could you consider shortening your names instead:
[{"id": 1, "u": "afshin", "p": "123"}, {"id": 2, "u": "barak", "p": "meme"}]
This JSON document is about the same size as the one you propose above.
The JSON data I'm retrieving is shown below. Notice that the format is not right in terms of objects and key-value pairs. Why is this so? I am retrieving this JSON data from a dataset using the Jayrock RPC service for ASP.NET 3.5.
{
"Table":{
"columns":[
"i_member_id",
"i_group_id",
"u_name",
"u_tel",
"u_email",
"u_password",
"d_timestamp",
"b_activated"
],
"rows":[
[
1,
0,
"kevin",
"1231234",
"kevin@creaworld.com.sg",
"123",
"2011-01-05T09:51:36.8730000+08:00",
true
],
[
2,
0,
"kevin2",
"asdads",
"kevin2@creaworld.com.sg",
"123123",
"2011-01-05T10:01:46.1530000+08:00",
true
]
]
}
}
Here is a link to a JSON formatter you can use to inspect the structure:
http://jsonformatter.curiousconcept.com/
Alternatively, you can use a third-party DLL, Json.NET; it will be useful for this.
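For what it's worth, the columns/rows layout that Jayrock returns is straightforward to reshape into ordinary objects by zipping the columns against each row. A sketch of the idea in Ruby for illustration; on this stack you'd do the equivalent in C# with Json.NET or in client-side JavaScript:

```ruby
require 'json'

# Abbreviated version of the Jayrock response above.
raw = '{"Table":{"columns":["i_member_id","u_name","u_email"],' \
      '"rows":[[1,"kevin","kevin@creaworld.com.sg"],' \
      '[2,"kevin2","kevin2@creaworld.com.sg"]]}}'

table = JSON.parse(raw)["Table"]
# Pair the shared column names with each row's values.
records = table["rows"].map { |row| table["columns"].zip(row).to_h }
# => [{"i_member_id"=>1, "u_name"=>"kevin", "u_email"=>"kevin@creaworld.com.sg"}, ...]
```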