Manipulate data from an Angular JSON response

I have a standard JSON response in an Angular controller, which returns data.
I am trying to get specific parts of that data, and manipulate it and use the manipulated version within the code.
Currently I have:
$http.get('/json/file.json').success(function(data) {
    $scope.results = data;
});
In the JSON, I have data such as this:
"hotels":[
{
"region": "Indian Ocean"
}
]
In my code, I am using ng-repeat to call "hotel in results.hotels" and using "hotel.region".
How do I grab hotel.region from the data, replace the space between the words with a '_', and make it all lower case, so I end up with "indian_ocean"? And how would I then use this within my ng-repeat?
Many thanks.

data.hotels[0].region.replace(" ", "_").toLowerCase()

Just iterate over the hotels and overwrite the property:
$scope.results.hotels.forEach(function (hotel) {
    hotel.region = hotel.region.replace(" ", "_").toLowerCase();
});

I figured it out, in a way that can be used more generally.
Create a new filter...
app.filter('removeSpaces', function () {
    return function (text) {
        var str = text.replace(/\s+/g, '_');
        return str.toLowerCase();
    };
});
Then this can be used site-wide by calling "{{hotel.region | removeSpaces}}".
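If you also need the transformed value inside controller code (for example, to build a CSS class or an element id), the same filter can be invoked through Angular's $filter service. A minimal sketch, assuming the filter above is registered on the same app module (the controller name and scope property are only examples):
app.controller('HotelsCtrl', function ($scope, $http, $filter) {
    $http.get('/json/file.json').success(function (data) {
        $scope.results = data;
        // e.g. "Indian Ocean" -> "indian_ocean", using the removeSpaces filter above
        $scope.firstRegionSlug = $filter('removeSpaces')(data.hotels[0].region);
    });
});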
Thanks to the people who did respond and for their help.

Related

Returning a specific JSON object using Express

So let's say I've got a massive JSON file, and the general structure is roughly like so:
{
"apples": { complex object },
"oranges": { complex object },
"grapes": { complex object }
}
Is there some way to specifically target an object to return while using Express? As in, say, if someone made a simple GET request to my server, it'd return specifically the given object(s). I know the syntax and concept are completely wrong in this instance, but for lack of a better way to say it, something like...
let testData = 'testdata.json';
app.get('/thing', (req, res) => {
    res.json(testData.oranges);
});
I know you can return the entire file, but that adds a good amount of loading time in this instance, and is impractical in this particular case.
Or, alternatively - would it be better to have node parse the JSON file and split it into an apples.json, oranges.json, etc files to use? Trying to understand A, the best practice for doing something like this, and B, the most effective way to translate this into a practical application for a medium sized project.
Any thoughts or advice along this line - even if it's a library recommendation - would be greatly appreciated.
It should work if you make a POST request carrying the payload of the specific 'thing', and then return an object based on that thing. Example:
let testData = {
    "apples": { complex object },
    "oranges": { complex object },
    "grapes": { complex object }
};
app.post('/route', (req, res) => {
    const thing = req.body.thing;
    res.json(testData[thing]);
});
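Note that req.body is only populated when a body-parsing middleware is registered. A minimal sketch, assuming Express 4.16+ where express.json() is bundled (older versions need the separate body-parser package):
const express = require('express');
const app = express();
// parse JSON request bodies so req.body.thing is available in the handler
app.use(express.json());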
This is a GET request for some data, since the JSON file is essentially being used as a key/value store to query for the desired response data.
Assuming the query parameter that specifies the desired key of the object to return is called part, the following example would work:
const testData = require('./testdata.json');
app.get('/thing', (req, res) => res.json(testData[req.query.part]));
Querying for /thing?part=apples would return testData.apples in the response.
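One detail worth guarding against: if req.query.part names a key that isn't in the file, testData[req.query.part] is undefined and the response body will be empty. A small sketch of a guarded version (the error payload is just an example):
app.get('/thing', (req, res) => {
    const part = testData[req.query.part];
    if (part === undefined) {
        return res.status(404).json({ error: 'unknown part' });
    }
    res.json(part);
});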

Adding Attributes while parsing CSV file in D3

I am trying to parse data in d3 using the csv function. I am attempting to give each datapoint a new attribute (Region) during processing. Outside the CSV function I defined a function that is supposed to take the datapoint, check to see if the state is Alabama, and if so, assign the Region attribute to a string of either "North" or "South".
var parseRegion = function(d) {
    if (d.State === "Alabama") {
        return "South";
    } else {
        return "North";
    }
};
However, when I run the code, every datapoint is assigned a "Region" attribute that is assigned to the function itself. In other words, it is assigned the actual code, rather than the return values. What am I doing wrong??
d3.csv("data.csv").get(function(error,data){
if (error) throw error;
data.forEach(function(d){
d.Deaths = +d.Deaths;
d.Population = +d.Population;
d.Year = parseDate(d.Year);
d.Region = parseRegion;
});
Thanks for any help you can provide. Eventually I will add additional states besides Alabama of course.
Your problem is that you're not calling the parseRegion function that you define.
So you need
d.Region = parseRegion(d);
More generally d3.csv provides a way to parse the data without the use of forEach. You can do the following:
d3.csv("data.csv")
.row(function(d) {
//Code to parse data row by row goes here
})
.get(function(error,data){
//Data is now the whole parsed dataset
});
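Putting the two together for this case, the per-row parsing can move into the row accessor. A sketch, assuming parseDate and parseRegion are defined as above; note that the accessor must return the transformed row:
d3.csv("data.csv")
    .row(function(d) {
        d.Deaths = +d.Deaths;
        d.Population = +d.Population;
        d.Year = parseDate(d.Year);
        d.Region = parseRegion(d); // call the function rather than assigning it
        return d; // the row accessor must return the parsed row
    })
    .get(function(error, data) {
        if (error) throw error;
        // data is now the fully parsed dataset
    });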

WinJS: Save observable objects without backingData etc. to JSON

What is the best way to save observable objects to JSON without _backingData etc.?
For example instead of:
[{"_backingData":{"name":"Test","hourlyRate":""},"name":"Test","hourlyRate":"","id":-1,"number":"","backingData":{"name":"Test","hourlyRate":""}}]
This should be saved:
[{"name":"Test","hourlyRate":"","id":-1,"number":""}]
This is my code to save the data:
var data = JSON.stringify(value.concat());
Windows.Storage.ApplicationData.current.localFolder.createFileAsync("customer.json", Windows.Storage.CreationCollisionOption.replaceExisting)
    .then(function (file) {
        return Windows.Storage.FileIO.writeTextAsync(file, data);
    });
value is a WinJS.Binding.List().
Is there a simple solution to solve this "problem"?
Use WinJS.Binding.unwrap on the individual elements in the array.
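A minimal sketch of how that might fit into the saving code above, assuming each element of the WinJS.Binding.List is an observable proxy:
// Unwrap each observable element so only the plain data is serialized,
// without the _backingData bookkeeping fields
var plain = value.map(function (item) {
    return WinJS.Binding.unwrap(item);
});
var data = JSON.stringify(plain);
Windows.Storage.ApplicationData.current.localFolder.createFileAsync("customer.json", Windows.Storage.CreationCollisionOption.replaceExisting)
    .then(function (file) {
        return Windows.Storage.FileIO.writeTextAsync(file, data);
    });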

How to send the data in JSON structure

I have a REST service to which I am sending JSON data as ["1","2","3"] (a list of strings), which works fine in the Firefox REST client plugin. But when the data is sent from the application, the structure is {"0":"1","1":"2","2":"3"}, and I am not able to pass the data. How do I convert {"0":"1","1":"2","2":"3"} to ["1","2","3"] so that I can send the data through the application? Any help would be greatly appreciated.
If the format of the JSON is { "index": "value" }, which is what I'm seeing in {"0":"1","1":"2","2":"3"}, then you can take advantage of that information and do this:
var myObj = {"0":"1","1":"2","2":"3"};
var convertToList = function(object) {
var i = 0;
var list = [];
while(object.hasOwnProperty(i)) { // check if value exists for index i
list.push(object[i]); // add value into list
i++; // increment index
}
return list;
};
var result = convertToList(myObj); // result: ["1", "2", "3"]
See fiddle: http://jsfiddle.net/amyamy86/NzudC/
Use a fake index to "iterate" through the list. Keep in mind that this won't work if there is a break in the indices; it can't be this: {"0":"1","2":"3"}
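An alternative sketch that doesn't rely on a counter, so it also survives gaps in the indices (it assumes the keys are numeric strings and should come out in ascending numeric order):
var convertToList = function (object) {
    return Object.keys(object)
        .sort(function (a, b) { return a - b; }) // numeric, not lexicographic, order
        .map(function (key) { return object[key]; });
};
convertToList({"0": "1", "1": "2", "2": "3"}); // ["1", "2", "3"]
convertToList({"0": "1", "2": "3"});           // ["1", "3"], even with the gap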
You need to parse the JSON back into a JavaScript object. There are parsing tools in the later iterations of Dojo, as one of the other contributors already pointed out; however, most browsers support JSON.parse(), which is defined in ECMA-262 5th Edition (the specification that JS is based on). Its usage is:
var str = your_incoming_json_string,
    // here is the line ...
    obj = JSON.parse(str);
// DEBUG: pump it out to the console to see what it looks like
obj.forEach(function(entry) {
    console.log(entry);
});
For the browsers that don't support JSON.parse() you can implement it using json2.js, but since you are actually using Dojo, dojo.fromJson() is the way to go. Dojo takes care of browser independence for you.
var str = your_incoming_json_string,
    // here is the line ...
    obj = dojo.fromJson(str);
// DEBUG: pump it out to the console to see what it looks like
obj.forEach(function(entry) {
    console.log(entry);
});
If you're using an AMD version of Dojo then you will need to go back to the Dojo documentation and look at dojo/_base/json examples on the dojo.fromJson page.

Backbone multiple collections fetch from a single big JSON file

I would like to know if there is a better way to create multiple collections fetched from a single big JSON file. I have a JSON file that looks like this.
{
    "Languages": [...],
    "ProductTypes": [...],
    "Menus": [...],
    "Submenus": [...],
    "SampleOne": [...],
    "SampleTwo": [...],
    "SampleMore": [...]
}
I am using the url/fetch to create each collection for each node of the JSON above.
var source = 'data/sample.json';
Languages.url = source;
Languages.fetch();
ProductTypes.url = source;
ProductTypes.fetch();
Menus.url = source;
Menus.fetch();
Submenus.url = source;
Submenus.fetch();
SampleOne.url = source;
SampleOne.fetch();
SampleTwo.url = source;
SampleTwo.fetch();
SampleMore.url = source;
SampleMore.fetch();
Any better solution for this?
Backbone is great for when your application fits the mold it provides. But don't be afraid to go around it when it makes sense for your application. It's a very small library. Making repetitive and duplicate GET requests just to fit Backbone's mold is probably prohibitively inefficient. Check out jQuery.getJSON or your favorite basic AJAX library, paired with some basic metaprogramming as follows:
// Put your real collection constructors here. Just examples.
var collections = {
    Languages: Backbone.Collection.extend(),
    ProductTypes: Backbone.Collection.extend(),
    Menus: Backbone.Collection.extend()
};

function fetch() {
    // $.getJSON takes the success callback directly as an argument
    $.getJSON("/url/to/your/big.json", function (response) {
        for (var name in collections) {
            // Grab the list of raw JSON objects by name out of the response,
            // pass it to your collection's constructor,
            // and store a reference to your now-populated collection instance
            // in your collection lookup object
            collections[name] = new collections[name](response[name]);
        }
    });
}
fetch();
fetch();
Once you've called fetch() and the async callback has completed, you can do things like collections.Menus.at(0) to get at the loaded model instances.
Your current approach, in addition to being pretty long, risks retrieving the large file multiple times (browser caching won't always work here, especially if the first request hasn't completed by the time you make the next one).
I think the easiest option here is to go with straight jQuery, rather than Backbone, then use .reset() on your collections:
$.get('data/sample.json', function(data) {
    Languages.reset(data['Languages']);
    ProductTypes.reset(data['ProductTypes']);
    // etc
});
If you wanted to cut down on the redundant code, you can put your collections into a namespace like app and then do something like this (though it might be a bit too clever to be legible):
app.Languages = new LanguageCollection();
// etc
$.get('data/sample.json', function(data) {
    _(['Languages', 'ProductTypes', ... ]).each(function(collection) {
        app[collection].reset(data[collection]);
    });
});
I think you can solve your need and still stay within the Backbone paradigm. An elegant solution, to me, is to create a Model that fetches the big JSON and uses it to reset all the Collections in its change event:
var App = Backbone.Model.extend({
    url: "http://myserver.com/data/sample.json",
    initialize: function( opts ){
        this.languages = new Languages();
        this.productTypes = new ProductTypes();
        // ...
        this.on( "change", this.fetchCollections, this );
    },
    fetchCollections: function(){
        this.languages.reset( this.get( "Languages" ) );
        this.productTypes.reset( this.get( "ProductTypes" ) );
        // ...
    }
});
var myApp = new App();
myApp.fetch();
You have access to all your collections through:
myApp.languages
myApp.productTypes
...
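Since fetchCollections uses reset, any view can wait for the data simply by listening for the reset event on the collection it cares about. A small sketch (the view name and render body are only placeholders):
var LanguageListView = Backbone.View.extend({
    initialize: function () {
        // re-render once the big JSON has arrived and the collection is reset
        this.listenTo(myApp.languages, "reset", this.render);
    },
    render: function () {
        // render myApp.languages here
        return this;
    }
});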
You can easily do this with a parse method. Set up a model and create an attribute for each collection. There's nothing saying your model attribute has to be a single piece of data and can't be a collection.
When you run your fetch it will pass the entire response to a parse method, which you can override by creating a parse function in your model. Something like:
parse: function(response) {
    var myResponse = {};
    _.each(response, function(value, key) {
        myResponse[key] = new Backbone.Collection(value);
    });
    return myResponse;
}
You could also create new collections at a global level or into some other namespace if you'd rather not have them contained in a model, but that's up to you.
To get them from the model later you'd just have to do something like:
model.get('Languages');
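For completeness, a sketch of how that parse override might sit in a model pointed at the same file (the model name is only an example):
var AppData = Backbone.Model.extend({
    url: 'data/sample.json',
    parse: function (response) {
        var myResponse = {};
        _.each(response, function (value, key) {
            myResponse[key] = new Backbone.Collection(value);
        });
        return myResponse;
    }
});
var appData = new AppData();
appData.fetch();
// later: appData.get('Languages'), appData.get('Menus').at(0), etc.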
backbone-relational provides a solution within backbone (without using jQuery.getJSON) which might make sense if you're already using it. Short answer at https://stackoverflow.com/a/11095675/70987 which I'd be happy to elaborate on if needed.