sum up json values [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 4 years ago.
I'm trying to sum up all the values in the JSON whose keys contain the string "Example", but I have no idea how to do it.
I'm using the "Newtonsoft.Json" framework.
{
  "Example1344": 13,
  "Example925": 16,
  "Example454Example": 24,
  "Nothing": 51,
  "Other9235": 45
}
So the result would be 53.

Something along these lines might work for you:
Imports Newtonsoft.Json.Linq

Module Module1
    Sub Main()
        Dim json As String = $"{{
            ""Example1344"": 13,
            ""Example925"": 16,
            ""Example454Example"": 24,
            ""Nothing"": 51,
            ""Other9235"": 45
        }}"
        Dim model As JObject = JObject.Parse(json)

        ' Sum the values of every property whose name contains "Example".
        Dim value As Integer = model.Properties() _
            .Where(Function(m) m.Name.Contains("Example")) _
            .Sum(Function(m) CType(m.Value, Integer))

        Console.WriteLine($"Sum: {value}")
        Console.WriteLine("PRESS ANY KEY TO EXIT")
        Console.ReadKey(True)
    End Sub
End Module
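With the sample document above, the matching properties are 13 + 16 + 24, so the program should print:
Sum: 53
PRESS ANY KEY TO EXIT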

Related

How to deserialize csv based on line format [closed]

Closed. This question needs debugging details. It is not currently accepting answers.
Closed 9 months ago.
I have a csv without headers that can have lines in the following three formats:
char,int,int,string,int
char,int,string
char
The first character defines the format and can be one of the values (A, B, C) respectively. Does anyone know a way to deserialize it based on the line format?
Just keep it simple. You can always parse it manually.
use std::io::{self, BufRead, Error, ErrorKind};

pub enum CsvLine {
    A(i32, i32, String, i32),
    B(i32, String),
    C,
}

pub fn read_lines<R: BufRead>(reader: &mut R) -> io::Result<Vec<CsvLine>> {
    let mut lines = Vec::new();
    for line in reader.lines() {
        let line = line?;
        let trimmed = line.trim();
        if trimmed.is_empty() {
            continue;
        }
        // Split line by commas
        let items: Vec<&str> = trimmed.split(',').collect();
        match items[0] {
            "A" => {
                lines.push(CsvLine::A(
                    items[1].parse::<i32>().map_err(|e| Error::new(ErrorKind::Other, e))?,
                    items[2].parse::<i32>().map_err(|e| Error::new(ErrorKind::Other, e))?,
                    items[3].to_string(),
                    items[4].parse::<i32>().map_err(|e| Error::new(ErrorKind::Other, e))?,
                ));
            }
            "B" => {
                lines.push(CsvLine::B(
                    items[1].parse::<i32>().map_err(|e| Error::new(ErrorKind::Other, e))?,
                    items[2].to_string(),
                ));
            }
            "C" => lines.push(CsvLine::C),
            x => panic!("Unexpected string {:?} in first column!", x),
        }
    }
    Ok(lines)
}
Calling this function would look something like this:
use std::fs::File;
use std::io::BufReader;

let file = File::open("path/to/data.csv").unwrap();
let mut reader = BufReader::new(file);
let lines: Vec<CsvLine> = read_lines(&mut reader).unwrap();
But you may want to keep in mind that I didn't bother to handle a couple edge cases. It may panic if there are not enough items to satisfy the requirements and it makes no attempt to parse more complex strings. For example, "\"quoted strings\"" and "\"string, with, commas\"" would likely cause issues.
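One way to close the "not enough items" gap is to validate the field count before indexing, so a short line becomes an io::Error instead of a panic. This is just a sketch on top of the answer above, and the expect_len helper name is mine, not part of the original:
// Hypothetical helper: reject lines with too few fields up front.
fn expect_len(items: &[&str], len: usize) -> io::Result<()> {
    if items.len() < len {
        return Err(Error::new(
            ErrorKind::InvalidData,
            format!("expected at least {} fields, got {}", len, items.len()),
        ));
    }
    Ok(())
}
Inside the match you would then call, for example, expect_len(&items, 5)? before building CsvLine::A. Quoted fields and embedded commas are still not handled; for those, a dedicated CSV parser such as the csv crate is the usual route.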

Naming convention recommendation for using mysql in nodejs [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 3 years ago.
Please give me your guidance.
I'm using Node with a MySQL database (without using an ORM), and I use the snake_case naming convention for MySQL.
My question is: in Node, should I use snake_case or camelCase?
For example, in models/movie.js:
snake_case:
return db.execute("SELECT title, daily_rental_rate FROM movies");
the result sent to the client:
{
  title: 'abc',
  daily_rental_rate: 20
}
camelCase:
return db.execute("SELECT title, daily_rental_rate AS dailyRentalRate FROM movies");
the result sent to the client:
{
  title: 'abc',
  dailyRentalRate: 20
}
thank you so much /\
There is no fixed convention for JSON casing; however, most APIs tend to use camelCase for properties, including Google (see their JSON style guide).
You can also map object properties within JavaScript; you don't have to do this manually in your queries. This allows you to be relatively flexible with your casing, even changing to kebab-case or snake_case later on if you wish. This example uses the lodash library to convert the key case.
const _ = require("lodash");

function objectToCamelCase(obj) {
  return _.mapKeys(obj, (v, k) => _.camelCase(k));
}

// Assumes db.execute returns an array of plain row objects.
let rows = db.execute("SELECT title, daily_rental_rate FROM movies");
console.log("Result (snake_case): ", rows);

rows = rows.map(objectToCamelCase);
console.log("Result (camelCase):", rows);
The results might look like so:
Result (snake_case):
[
  {
    "title": "The Shawshank Redemption",
    "daily_rental_rate": "€2.99"
  },
  {
    "title": "Ferris Bueller's Day Off",
    "daily_rental_rate": "€2.99"
  }
]
Result (camelCase):
[
  {
    "title": "The Shawshank Redemption",
    "dailyRentalRate": "€2.99"
  },
  {
    "title": "Ferris Bueller's Day Off",
    "dailyRentalRate": "€2.99"
  }
]
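If you would rather not pull in lodash just for this, a hand-rolled version is small. This is only a sketch, not part of the original answer, and it only handles simple snake_case keys:
// Convert snake_case keys to camelCase without any dependency.
function toCamelCase(key) {
  return key.replace(/_([a-z])/g, (_match, c) => c.toUpperCase());
}

function objectToCamelCase(obj) {
  return Object.fromEntries(
    Object.entries(obj).map(([k, v]) => [toCamelCase(k), v])
  );
}

// objectToCamelCase({ daily_rental_rate: 20 }) -> { dailyRentalRate: 20 }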

How can I convert a JSON file to a html page in Ruby on Rails? [closed]

Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 5 years ago.
I am writing a website which takes, as a URL GET parameter, a link to a JSON file and converts it to a good-looking HTML page (i.e. not a table). Part of this requires a recursive function to parse the JSON file. I would like the keys to look like HTML headers, each value in an array to be on a separate line, and strings to have a bold key with the value on the same line. Most importantly, I would like the page to indent the content of the JSON file as you would expect a JSON file to be indented.
I think this recursive function does a good job of converting the JSON file to a pretty HTML page.
def parse(hash, iteration = 0)
  iteration += 1
  output = ""
  hash.each do |key, value|
    if value.is_a?(Hash)
      output += "<div class='entry' style='margin-left:#{iteration}em'> <span style='font-size:#{250 - iteration*20}%'>#{key}: </span><br>"
      output += parse(value, iteration)
      output += "</div>"
    elsif value.is_a?(Array)
      output += "<div class='entry' style='margin-left:#{iteration}em'> <span style='font-size:#{250 - iteration*20}%'>#{key}: </span><br>"
      value.each do |item|
        if item.is_a?(String)
          output += "<div style='margin-left:#{iteration}em'>#{item} </div>"
        else
          output += parse(item, iteration - 1)
        end
      end
      output += "</div>"
    else
      output += "<div class='entry' style='margin-left:#{iteration}em'> <span style='font-weight: bold'>#{key}: </span>#{value}</div>"
    end
  end
  return output
end
Here is how it works. You pass it a hash, which you get by converting the JSON file:
jsonFile = JSON.load(open(params["url"]))
@output = parse(jsonFile)
The function checks each value in the hash and does one of three things:
If it is an array, it outputs each array element on a new line.
If it is a string, it outputs the key and the value on the same line.
If it is another hash, it calls the same function recursively.
After this, all you have to do is output the @output variable in the view file:
<%= @output.html_safe %>
The iteration parameter keeps track of how many times the function has been called so that the output can be indented correctly.
I hope this helps anybody else who is trying to do the same thing.
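For context, the controller wiring might look something like this. This is only a minimal sketch with a hypothetical controller name; it assumes the parse helper above is available to the controller and uses URI.open from open-uri (the modern replacement for Kernel#open on URLs):
require "open-uri"
require "json"

class JsonPagesController < ApplicationController
  # Hypothetical action: fetch the JSON from the ?url= parameter,
  # turn it into a hash, and build the HTML with the parse helper above.
  def show
    json_file = JSON.load(URI.open(params["url"]))
    @output = parse(json_file)
  end
end
The view then just renders @output.html_safe as shown above.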

R - Serialize R models as JSON [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Closed 6 years ago.
Is there some good R package that can convert prediction models and other complex objects to and from JSON? I have a linear regression model from this example:
eruption.lm = lm(eruptions ~ waiting, data=faithful)
newdata = data.frame(waiting=80)
predict(eruption.lm, newdata)
I would like to serialize the eruption.lm model as JSON, store it somewhere or send it to some external system, and later deserialize it and do the prediction.
I have tried with jsonlite R package:
library(jsonlite)
json <- serializeJSON(eruption.lm)
lin.model <- unserializeJSON(json)
predict(lin.model, newdata)
However, jsonlite cannot handle complex objects - deserialized model returns an error in prediction:
Error in eval(expr, envir, enclos) : could not find function "list"
Is there some better package that can serialize/deserialize such objects?
You just need to help it remember the environment for terms:
attr(lin.model$terms, ".Environment") <- .GlobalEnv
predict(lin.model, newdata)
## 1
## 4.17622
I'd file this as an enhancement request over at http://github.com/jeroenooms/jsonlite/issues
Alternatively, you can use native R binary serialization:
saveRDS(lin.model, "lin.model.rds")
predict(readRDS("lin.model.rds"), newdata)
## 1
## 4.17622
unless you absolutely need a text serialization method, in which case you can do:
saveRDS(lin.model, file="lin.model.txt", ascii=TRUE)
predict(readRDS("lin.model.txt"), newdata)
## 1
## 4.17622
The ascii=TRUE option produces a text (hex) representation of the object:
1f8b 0800 0000 0000 0003 ed5d c992 1cb9
91bd e797 cc1c 9806 381c db51 36a6 c35c
e61f 4a64 5153 3645 b255 2cb6 749a 6f1f
5fb0 bcc8 ca62 4b1a 33f5 25da 8c6d 8848
04fc f9f6 b004 10f5 870b 5d62 afa9 964b
4cb1 71b8 d456 2f91 2e99 8afc f421 5e5b
e510 73ef 9770 0d35 17aa 3d5f 6290 5fe3
850a c59c 2ef9 f2f5 e1cb e3f7 4bd4 27c6
bd18 2fff f69f 5f5f 1f5f 3e3e fef2 faef
f36e bdfc f5e1 e9f5 e9eb 9f2f 94d9 4554
1112 ae39 84dc 63d7 2287 de7a b2bb a975
... (lots more)
that can be stored in places where binary blobs can't.
If you need a readable text serialization method, filing the above suggested enhancement request is probably the way to go.

how to parse a multidimensional array? [closed]

Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 9 years ago.
I want the keys and values of the following multidimensional array under the key followersperdate:
{"version":"1.2","username":"LGUS","url":"http://t.co/ospRRVboge","avatar":"http://pbs.twimg.com/profile_images/378800000016800237/3787f02f0e0e10a0a19d9b508abd6ce2_normal.png","followers_current":38775,"date_updated":"2013-11-15","follow_days":"774","started_followers":544,"growth_since":38231,"average_growth":"49","tomorrow":"38824","next_month":"40245","followers_yesterday":37232,"rank":"14934","followersperdate":{"date2013-11-15":38775,"date2013-11-05":37232,"date2013-11-04":37126,"date2013-10-26":36203,"date2013-10-10":34384,"date2013-10-02":33353,"date2013-09-18":30870},"last_update":1384679796}
I cleaned it up for you. The object you want has 7 sub-objects, which you can parse with whatever language you are using.
Check this web page to learn more, and choose your language at the bottom of the site:
http://www.json.org/
If you are using JavaScript, check this: http://www.w3schools.com/json/json_intro.asp
{
  "version": "1.2",
  "username": "LGUS",
  "url": "http://t.co/ospRRVboge",
  "avatar": "http://pbs.twimg.com/profile_images/378800000016800237/3787f02f0e0e10a0a19d9b508abd6ce2_normal.png",
  "followers_current": 38775,
  "date_updated": "2013-11-15",
  "follow_days": "774",
  "started_followers": 544,
  "growth_since": 38231,
  "average_growth": "49",
  "tomorrow": "38824",
  "next_month": "40245",
  "followers_yesterday": 37232,
  "rank": "14934",
  "followersperdate": {
    "date2013-11-15": 38775,
    "date2013-11-05": 37232,
    "date2013-11-04": 37126,
    "date2013-10-26": 36203,
    "date2013-10-10": 34384,
    "date2013-10-02": 33353,
    "date2013-09-18": 30870
  },
  "last_update": 1384679796
}
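For example, in JavaScript (a minimal sketch; raw here stands for the JSON string posted in the question), getting each date key and follower count out of followersperdate looks like this:
// Parse the JSON string and walk the followersperdate sub-object.
const stats = JSON.parse(raw); // `raw` is the JSON string from the question
for (const [date, followers] of Object.entries(stats.followersperdate)) {
  console.log(date, followers); // e.g. "date2013-11-15" 38775
}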