I have a use case where a Model has a JSON attribute used for arbitrary configurations: configuration: DS.attr().
I have a configurationService (injected in every component/route/controller/…) with computed properties for easily retrieving these configurations throughout the application.
Since the JSON configurations are quite large and have variable depth, I can't have a computed property for every single one.
Unfortunately, since the service is a singleton, computed properties don't detect changes in nested JSON keys (a.k.a. deep watching).
Is there a way to force deep watching of a JSON attribute?
Example:
services/configuration.js:
// These below can also be computed.alias, same effect
configuration: Ember.computed('account.configuration', function() {
  // account is set at the application's route:
  // set(this, 'configurationService.account', account);
  return this.get('account.configuration');
}),

profile: Ember.computed('configuration.profile', function() {
  return this.get('configuration.profile');
}),
any/given/component.js:
configurationService: Ember.inject.service('configuration'),
…
// These won't detect changes
randomConfig: Ember.computed.alias('configurationService.profile.foo.bar.randomConfig')
Considering the configuration object is {configuration: {profile: {foo: {bar: {randomConfig: false}}}}}: if I somehow change randomConfig to true, the change won't be detected.
Note: I considered https://github.com/lytics/ember-data-model-fragments but discarded it as too verbose; the issue is that our configuration object can become quite large, deep, dynamic and unpredictable.
I tried computed.alias as well with no success.
Any hints or alternatives would be appreciated :)
UPDATE:
I tried an object Transform (as suggested in Slack) and it doesn't observe deeply either: https://gist.github.com/benoror/272f0ae893f80276ac1553ae048e6b20#file-object-js
You need to transform the plain JS object into an Ember.Object, and do the same recursively for its child objects:
import Ember from 'ember';
import DS from 'ember-data';

// Recursively wrap plain arrays/objects in Ember.A / Ember.Object so that
// nested keys become observable.
function emberify(obj) {
  if (obj instanceof Array) {
    return Ember.A(obj.map(i => emberify(i)));
  } else if (obj instanceof Object) {
    return Ember.Object.create(Object.keys(obj)
      .reduce((hash, key) => {
        hash[key] = emberify(obj[key]);
        return hash; // the accumulator must be returned from reduce
      }, {}));
  }
  // primitives (strings, numbers, booleans, null) are returned as-is
  return obj;
}

export default DS.Transform.extend({
  deserialize(value) {
    return emberify(value);
  }
});
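For completeness, the model from the question would then opt into this transform by name; a minimal sketch, assuming the file above lives at app/transforms/object.js (matching the gist's file name):

// models/account.js (model name assumed from the account.configuration path above)
import DS from 'ember-data';

export default DS.Model.extend({
  // 'object' resolves to app/transforms/object.js, so configuration is
  // deserialized into nested Ember.Objects and its keys become observable
  configuration: DS.attr('object')
});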
Related
I have a model:
export default Model.extend({
title: attr('string'),
attributes: attr('jsonb')
});
Where attributes is a custom JSON field stored as jsonb in Postgres.
Let's say:
{
"name":"Bob",
"city":""
}
So I can easily manipulate attributes from the template:
<form.element .. @property="attributes.city" /> or model.set('attributes.city', 'city name')
Problem: hasDirtyAttributes does not change, because technically it is still the same old object. But when I copy the object, e.g.
JSON.parse(JSON.stringify(this.get('attributes'))), hasDirtyAttributes works as expected.
So how can I write a Mixin for the Model, or some other workaround, that marks hasDirtyAttributes as true when any property of attributes changes? I will update the whole object, so it doesn't matter which property actually changed (see the sketch after the list of add-ons below).
Same problem: https://discuss.emberjs.com/t/hasdirtyattributes-do-not-work-with-nested-attributes-json-api/15592
Existing solutions don't work for me at all:
ember-dirtier
ember-data-relationship-tracker
ember-data-model-fragments (a lot of changes under the hood, and it broke my app)
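Based on the JSON.parse(JSON.stringify(...)) observation above, the simplest workaround to sketch is to re-assign a copy of the whole object after mutating a nested key, so Ember Data sees a new object reference. This is only an illustration; the method name markAttributesDirty and the import style are assumptions:

// models/some-model.js (sketch)
import Model, { attr } from '@ember-data/model';

export default Model.extend({
  title: attr('string'),
  attributes: attr('jsonb'),

  // Call this after something like this.set('attributes.city', 'city name');
  // replacing the object reference is what flips hasDirtyAttributes.
  markAttributesDirty() {
    this.set('attributes', Object.assign({}, this.get('attributes')));
  }
});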
Update:
An imperfect idea that may help describe better what I want to achieve:
Let's say we add an observer to every field of the object:
export default Model.extend({
  init: function() {
    this._super(...arguments);
    this.set('_attributes', Object.assign({}, this.get('attributes'))); // copy the original
    Object.keys(this.get('attributes')).forEach((item) => {
      this.addObserver('attributes.' + item, this, this.objectObserver);
    });
  }
  ...
})
And the observer:
objectObserver: function(model, field) {
  let privateField = '_' + field;
  if (model.get(privateField) != model.get(field)) { // compare with the last saved state
    model.set(privateField, model.get(field));
    model.set('attributes', Object.assign({}, model.get('attributes')));
  }
}
It works, but when I change one field of the object, copying the object causes objectObserver to fire again for every field. So by changing one key I end up marking every observed field as dirty.
Future Ember development will reduce the use of event listeners and two-way binding; Glimmer components only support one-way data flow. So to stay friendly with future versions of Ember, using one-way data flow is a good approach here too. In my case, since I use ember-bootstrap, the solution looks like this:
<form.element @controlType="textarea" @onChange={{action 'attributeChange'}} />
where the attributeChange action does all the work.
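A sketch of what that attributeChange action could look like, following the copy-and-reassign idea from the question (the controller name and the city key are assumptions, not code from the original post):

// controllers/edit.js (sketch)
import Controller from '@ember/controller';

export default Controller.extend({
  actions: {
    attributeChange(value) {
      const model = this.get('model');
      // Build a new object with the changed key, then re-assign it so the
      // record gets a new object reference and hasDirtyAttributes becomes true.
      model.set('attributes', Object.assign({}, model.get('attributes'), { city: value }));
    }
  }
});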
The new Glimmer / Octane style is based on modifiers and looks like this:
{{!-- templates/components/child.hbs --}}
<button type="button" {{on "click" (fn @onClick 'Hello, moon!')}}>
Change value
</button>
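For context, in that style the parent owns the state and passes the handler down; a minimal sketch with assumed names, along the lines of the Octane guides:

{{!-- templates/components/parent.hbs --}}
<Child @onClick={{this.updateValue}} />
<p>{{this.value}}</p>

// components/parent.js
import Component from '@glimmer/component';
import { tracked } from '@glimmer/tracking';
import { action } from '@ember/object';

export default class ParentComponent extends Component {
  @tracked value = 'Hello, world!';

  @action
  updateValue(newValue) {
    // One-way data flow: the child never mutates the value, it only calls up.
    this.value = newValue;
  }
}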
I have read multiple answers to this kind of issue, and each one has its own response.
In my case none of them help, as my interfaces simply don't map the JSON the way I want. I have tried multiple approaches, from working with a root object to nested interfaces, but here I am, asking what the best approach is to deal with this kind of JSON object on the front end, and how to map this particular one (a forkJoin). I also wanted to ask: what are the real benefits of using interfaces/classes/maps besides the IntelliSense? Does it have to do with data propagation?
The json structure in question:
{
Title: "",
Year: "",
Rated: "",
Released: "",
Runtime: "",
…}
Simple as it is. But back in my service I call it with a forkJoin:
getMovies(name: string, year?: string): Observable<any> {
let shortPlot = this.http.get(
"https://www.omdbapi.com/?t=" +
name +
"&plot=short&y=" +
year +
"&apikey=[my key]"
);
let fullPlot = this.http.get(
"https://www.omdbapi.com/?t=" + name + "&plot=full&apikey=[my key]"
);
return forkJoin([shortPlot, fullPlot]);
}
The subscription in the component:
getMovie() {
  this.spinner = true;
  this.movieService
    .getMovies(this.name.value)
    .subscribe(
      (dataList: any) => {
        this.movies = Array.of(dataList[0]);
        this.spinner = false;
        let error: any = this.movies.map(error => error.Error);
        if (error[0]) {
          this.notfound = error[0];
          this.error = true;
        } else {
          this.error = false;
          this.movieRate = this.movies.map(rating => rating.imdbRating.toString());
        }
      },
      error => console.log(error)
    );
}
And in the HTML I render the data like this:
<div *ngFor="let m of movies">
<h5 class="mt-0">{{m.Title}}, {{m.Year}}</h5>
</div>
So as you can see I am not working with an interface, and I should be. Can anyone sort me out?
Thank you
EDIT: the log after subscribe:
Let's break it down.
What are the real benefits of using interfaces/classes/maps besides the IntelliSense?
Using interfaces and classes will not just give you IntelliSense but will also provide static type safety for your code. To see why this is important, let's say you have an interface with the following structure:
export interface Demo {
field: string;
}
// in some other file 1
demo.field.substring(1, 2);
// in some other file 2
demo.field.length;
You are using this interface in many places in your code. Now, for some reason, you find out that the property should be a number, not a string. Here TypeScript will give you all the errors at compile time:
export interface Demo {
field: number;
}
// in some other file 1
demo.field.substring(1, 2); // error
// in some other file 2
demo.field.length; // error
Also, TypeScript transpiles to JavaScript files, and since JavaScript is an interpreted language, your code will not be checked until the JavaScript runtime actually executes the problematic line; with TypeScript you get the errors at compile time.
You can get away with using any everywhere, but then you will be missing out on static typing.
With interfaces and classes, you also get OOP features, such as inheritance etc.
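For example, interfaces can extend other interfaces and classes can implement them (names here are made up, continuing the Demo naming above):

export interface BaseDemo {
  id: number;
}

export interface ExtendedDemo extends BaseDemo {
  field: string;
}

// a class can implement the interface and add behaviour
export class DemoModel implements ExtendedDemo {
  constructor(public id: number, public field: string) {}
}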
Does it have to do with data propagation?
Your front end is never aware of what type of data will be received from the API, so it is the developer's responsibility to map the received data to some interface.
Again, as mentioned above, if the back end somehow changes the type of some field in the returned JSON, it will again be caught at compile time.
In the case of forkJoin, which combines the output of two requests, you can have two different types.
Demo
export interface Demo1 {
field1: string;
}
export interface Demo2 {
field2: number;
}
// in the service layer
getData(): Observable<[Demo1, Demo2]> {
  const res1 = this.http.get<Demo1>(...);
  const res2 = this.http.get<Demo2>(...);
  return forkJoin([res1, res2]);
}

// in the component
this.dataService.getData().subscribe(res => {
  // you get type safety and IntelliSense for res here
  console.log(res[0].field1);
});
I am not working with an interface and I should be.
Yes, you should use interfaces; if you are not using the features of TypeScript, then what's the point of using it? :)
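Applied to the OMDb service in the question, that could look roughly like this; the interface fields come from the JSON and component code shown above, and the generic http.get<T>() call assumes Angular's HttpClient:

export interface Movie {
  Title: string;
  Year: string;
  Rated: string;
  Released: string;
  Runtime: string;
  imdbRating?: string;
  Error?: string; // OMDb sets this when nothing is found
}

getMovies(name: string, year?: string): Observable<[Movie, Movie]> {
  const shortPlot = this.http.get<Movie>(
    `https://www.omdbapi.com/?t=${name}&plot=short&y=${year}&apikey=[my key]`
  );
  const fullPlot = this.http.get<Movie>(
    `https://www.omdbapi.com/?t=${name}&plot=full&apikey=[my key]`
  );
  return forkJoin([shortPlot, fullPlot]);
}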
I have loaded multiple models in the same scene and want to persist a slice viewer state and restore it later. It works with a single model, but it isn't working with multiple models.
Unfortunately, at the moment you can only restoreState for one model. But I have brought this up with our engineers and they have added it as an enhancement request.
A solution for saving and restoring the viewer state with multiple models is to change the way seedUrn works in ViewerState.js.
The issue is that seedUrn, as a simple string, cannot identify models unambiguously. The solution changes it to an object containing the model's urn and a unique key (set as a loadOption when the model is loaded). When ViewerState needs to find a model, it searches for both the urn and the key; as long as the key is unique, this handles even multiple identical models (the urn may not strictly be needed, but I won't address that here).
This is the code that changes the two methods on ViewerState, relating to generating and comparing seedUrn:
NOP_VIEWER.viewerState.getSeedUrn = function (model) {
model = model || viewer.model;
if (model === null) {
return {
urn: "",
uniqueKey: undefined
};
} else {
return {
urn: model.getSeedUrn(),
uniqueKey: model.myData.loadOptions.uniqueKey
};
}
};
NOP_VIEWER.viewerState.getVisibleModel = function (seedUrn) {
const visibleModels = viewer.getVisibleModels();
for (let i = 0; i < visibleModels.length; ++i) {
const modelSeedUrn = this.getSeedUrn(visibleModels[i]);
if (modelSeedUrn.urn === seedUrn.urn && modelSeedUrn.uniqueKey === seedUrn.uniqueKey) {
return visibleModels[i];
}
}
};
When loading a model into the viewer, pass the uniqueKey as a loadOption:
viewer.loadDocumentNode(obj.doc, obj.geometry, {
...
uniqueKey: 'a unique identifier',
...
})
If you need to persist the state in a database, for example, make sure you also save the unique key for the model, since it needs to be loaded with the same key contained in the state JSON.
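In other words, saving and restoring ends up looking something like this (a sketch; getState/restoreState are the standard viewer calls, and the overrides above make the seedUrn comparison multi-model aware):

// capture the state and persist it together with the keys used at load time
const state = NOP_VIEWER.getState();
const payload = JSON.stringify({ state, uniqueKeys: ['a unique identifier'] });
// ... store `payload` in your database ...

// later: reload each model with the same uniqueKey loadOption, then
NOP_VIEWER.restoreState(state);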
Note that this solution does not handle cutplanes, which still rely on viewer.model. Solving that would require overriding the getState and restoreState functions entirely, which could quickly become obsolete in newer versions of the Viewer.
So I am coming from a C# background, where I can do things in a dynamic and reflective way, and I am trying to apply that to a TypeScript class I am writing.
Some background: I am converting an application to a web app, and the backend developer doesn't want to change the backend at all to accommodate JSON properly. So he is going to send me back JSON that looks like this:
{
Columns: [
{
"ColumnName": "ClientPK",
"Label": "Client",
"DataType": "int",
"Length": 0,
"AllowNull": true,
"Format": "",
"IsReadOnly": true,
"IsDateOnly": null
}
],
Rows:[
0
]
}
I am looking to write an Angular class that extends Response and has a special method called JsonMinimal, which understands this data and returns an object for me.
import { Response } from "@angular/http";
export class ServerSource
{
SourceName: string;
MoreItems: boolean;
Error: string;
ExtendedProperties: ExtendedProperty[];
Columns: Column[];
}
export class ServerSourceResponse extends Response
{
JsonMinimal() : any
{
return null; //Something that will be a blank any type that when returned I can perform `object as FinalObject` syntax
}
}
I know StackOverflow isn't for asking for complete solutions, so I am only asking for one example that takes this sample data and creates a dynamic response that TypeScript isn't going to yell at me for. I don't know what to do here; this developer has thousands of server-side methods, and all of them return strings, in the form of JSON or XML output. I am basically looking for a way to take his column data, combine it with the corresponding row data, and then have a bigger object that holds a bunch of these combined objects.
A usage case, after that data has been mapped to a basic object, would be something like this.
Example:
var data = result.JsonMinimal() as LoginResponse; // maps to this object correctly if all the data is there in a base object
var pk = data.ClientPK.Value;
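For illustration, one way JsonMinimal could combine the metadata with the values is to zip each column with the value at the same position in Rows. This is only a sketch against the sample payload above; treating Rows as the values of a single record, in column order, is an assumption:

JsonMinimal(): any {
  const body = this.json(); // Response.json() from @angular/http
  const result: any = {};
  body.Columns.forEach((col: any, i: number) => {
    // e.g. { ClientPK: { Value: 0 } }, so callers can do data.ClientPK.Value
    result[col.ColumnName] = { Value: body.Rows[i] };
  });
  return result;
}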
I'm not exactly sure I understand, but you may want to try a simple approach first. Angular's http get method returns an observable that can automatically map the response to an object or an array of objects. It is also powerful enough to perform some custom mapping/transformation. You may want to look at that first.
Here is an example:
getProducts(): Observable<IProduct[]> {
return this._http.get(this._productUrl)
.map((response: Response) => <IProduct[]> response.json())
.do(data => console.log('All: ' + JSON.stringify(data)))
.catch(this.handleError);
}
Here I'm mapping a json response to an array of Product objects I've defined with an IProduct interface. Since this is just a "lambda" type function, I could add any amount of code here to transform data.
I would like to know if there is a better way to create multiple collections from a single big JSON file. I have a JSON file that looks like this:
{
"Languages": [...],
"ProductTypes": [...],
"Menus": [...],
"Submenus": [...],
"SampleOne": [...],
"SampleTwo": [...],
"SampleMore": [...]
}
I am using url/fetch to create each collection for each node of the JSON above.
var source = 'data/sample.json';
Languages.url = source;
Languages.fetch();
ProductTypes.url = source;
ProductTypes.fetch();
Menus.url = source;
Menus.fetch();
Submenus.url = source;
Submenus.fetch();
SampleOne.url = source;
SampleOne.fetch();
SampleTwo.url = source;
SampleTwo.fetch();
SampleMore.url = source;
SampleMore.fetch();
Any better solution for this?
Backbone is great for when your application fits the mold it provides. But don't be afraid to go around it when it makes sense for your application. It's a very small library. Making repetitive and duplicate GET requests just to fit Backbone's mold is probably prohibitively inefficient. Check out jQuery.getJSON or your favorite basic AJAX library, paired with some basic metaprogramming, as follows:
//Put your real collection constructors here. Just examples.
var collections = {
Languages: Backbone.Collection.extend(),
ProductTypes: Backbone.Collection.extend(),
Menus: Backbone.Collection.extend()
};
function fetch() {
  $.getJSON("/url/to/your/big.json", function (response) {
    for (var name in collections) {
      //Grab the list of raw json objects by name out of the response,
      //pass it to your collection's constructor,
      //and store a reference to your now-populated collection instance
      //in your collection lookup object
      collections[name] = new collections[name](response[name]);
    }
  });
}
fetch();
Once you've called fetch() and the async callback has completed, you can do things like collections.Menus.at(0) to get at the loaded model instances.
Your current approach, in addition to being pretty long, risks retrieving the large file multiple times (browser caching won't always work here, especially if the first request hasn't completed by the time you make the next one).
I think the easiest option here is to go with straight jQuery, rather than Backbone, then use .reset() on your collections:
$.get('data/sample.json', function(data) {
Languages.reset(data['Languages']);
ProductTypes.reset(data['ProductTypes']);
// etc
});
If you wanted to cut down on the redundant code, you can put your collections into a namespace like app and then do something like this (though it might be a bit too clever to be legible):
app.Languages = new LanguageCollection();
// etc
$.get('data/sample.json', function(data) {
_(['Languages', 'ProductTypes', ... ]).each(function(collection) {
app[collection].reset(data[collection]);
})
});
I think you can solve your need and still stay within the Backbone paradigm. An elegant solution that works for me is to create a Model that fetches the big JSON and uses it to populate all the Collections in its change event:
var App = Backbone.Model.extend({
url: "http://myserver.com/data/sample.json",
initialize: function( opts ){
this.languages = new Languages();
this.productTypes = new ProductTypes();
// ...
this.on( "change", this.fetchCollections, this );
},
fetchCollections: function(){
this.languages.reset( this.get( "Languages" ) );
this.productTypes.reset( this.get( "ProductTypes" ) );
// ...
}
});
var myApp = new App();
myApp.fetch();
You have access to all your collections through:
myApp.languages
myApp.productTypes
...
You can easily do this with a parse method. Set up a model and create an attribute for each collection. There's nothing saying your model attribute has to be a single piece of data and can't be a collection.
When you run your fetch, the entire response will be handed to a parse method, which you can override by creating a parse function in your model. Something like:
parse: function(response) {
  var myResponse = {};
  // the response here is the whole big JSON object, so iterate its top-level keys
  _.each(response, function(value, key) {
    myResponse[key] = new Backbone.Collection(value);
  });
  return myResponse;
}
You could also create new collections at a global level or into some other namespace if you'd rather not have them contained in a model, but that's up to you.
To get them from the model later you'd just have to do something like:
model.get('Languages');
backbone-relational provides a solution within Backbone (without using jQuery.getJSON), which might make sense if you're already using it. There's a short answer at https://stackoverflow.com/a/11095675/70987 which I'd be happy to elaborate on if needed.