I'm trying to load data received from a web service into a Sencha Touch 2 store.
The data is nested JSON; however, it contains multiple data arrays.
I am working with Sencha Touch 2.3.1, roughly equivalent to Ext JS 4.2. I don't have that much experience with Sencha yet, but I'm getting there. I decided to go for MVC, so I'd like the answers to stay as close to that as possible :).
This is the example JSON I am using:
[
    {
        "DataCollection": {
            "DataArrayOne": [
                {
                    "Name": "John Smith",
                    "Age": "19"
                },
                {
                    "Name": "Bart Smith",
                    "Age": "16"
                }
            ],
            "DataArrayTwo": [
                {
                    "Date": "20110601",
                    "Product": "Apple",
                    "Descr": "",
                    "Remark": ""
                },
                {
                    "Date": "20110601",
                    "Product": "Orange",
                    "Descr": "",
                    "Remark": ""
                },
                {
                    "Date": "20110601",
                    "Product": "Pear",
                    "Descr": "",
                    "Remark": ""
                }
            ],
            "DataArrayThree": [
                {
                    "SomeTotalCost": "400,50",
                    "IntrestPercentage": "3"
                }
            ]
        }
    }
]
I get all of this JSON through a single call. I don't want to cause any unnecessary traffic, so I hope to reuse this one response.
I want to be able to use each data array on its own.
The data gets sent to the store through its proxy:
Ext.define("MyApp.store.myDataObjects", {
extend: "Ext.data.Store",
config: {
model: "MyApp.model.myDataObject",
proxy: {
reader: {
type: "json",
rootProperty: "DataCollection"
},
type: "ajax",
api: {
read: "https://localhost/Service.svc/json"
},
limitParam: false,
startParam: false,
pageParam: false,
extraParams: {
id: "",
token: "",
filter: ""
},
writer: {
encodeRequest: true,
type: "json"
}
}
}
});
I am a bit stuck on the model here. I tried using mappings, which would look like this:
config: {
    fields: [
        {
            name: "IntrestPercentage",
            mapping: "Calculation.IntrestPercentage",
            type: "string"
        }
    ]
}
I tried associations as well, but to no avail.
According to the Google Chrome console, no objects containing data are created; I get only one object with all values null.
My end goal is to be able to show each data array in a separate table: a table for DataArrayOne, a table for DataArrayTwo, and so on. The data itself isn't linked; these are only details that have to be shown on a view.
John Smith isn't related to the apples, as in he didn't buy them. The apples are just there as an item to be shown.
The possible solutions I've seen, but haven't understood because they are outdated, are:
Child stores: you have a master store that receives the data, and you then split the data into other stores according to the root property. I have no idea how to do this, however, and I'm not sure whether it will work at all.
Associations, in case I was doing them wrong. I don't think they are needed, because the data isn't linked together, even though it is all part of "DataCollection".
Could someone please post an example of how to deal with this unusual(?) kind of nested JSON, or any other solution that would let me use the three data arrays at will?
Thanks in advance
The best approach would be to load the complete data with a separate Ext.Ajax.request and then call store.loadData in the success callback. For example:
var data = Ext.decode(response.responseText);
store1.loadData(data[0].DataCollection.DataArrayOne);
store2.loadData(data[0].DataCollection.DataArrayTwo);
store3.loadData(data[0].DataCollection.DataArrayThree);
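Roughly, the surrounding request could look like the following minimal sketch. The three stores are assumed to already exist, and the URL and extra parameters are taken from the question's proxy config:

Ext.Ajax.request({
    url: "https://localhost/Service.svc/json", // read URL from the question's proxy
    params: { id: "", token: "", filter: "" },
    success: function(response) {
        var data = Ext.decode(response.responseText);
        // Each array feeds its own store, so each can back its own table/view
        store1.loadData(data[0].DataCollection.DataArrayOne);
        store2.loadData(data[0].DataCollection.DataArrayTwo);
        store3.loadData(data[0].DataCollection.DataArrayThree);
    },
    failure: function(response) {
        console.log("Request failed with status " + response.status);
    }
});

Because the data arrays are unrelated, keeping them in separate stores also keeps each view's binding simple.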
I am using a REST API, sending a GET request, and getting the following JSON structure as a result:
{
    "Id": "Sample Id",
    "Attributes": {
        "ReadOnly": false
    },
    "Children": [
        {
            "Id": "Sample Id1",
            "Attributes": {
                "ReadOnly": false
            },
            "Children": [
                {
                    "Id": "Sample Id2",
                    "Attributes": {
                        "ReadOnly": false
                    }
                }
            ]
        },
        {
            "Id": "Sample Name2",
            "Attributes": {
                "ReadOnly": false
            },
            "Children": [
                {
                    "Id": "Sample Id2",
                    "Attributes": {
                        "ReadOnly": false
                    }
                }
            ]
        }
    ]
}
It is basically a file-system structure, so it is possible to have N objects (Id, Attributes{}, Children[]) at the root as well as at any other level of the structure.
To explain it a little better: the root node has its attributes and an array of N children, each of which has its own attributes and another array of N children, and so on...
What would be the correct way to handle this situation?
I have created a flat interface structure that looks basically like this:
export interface Hana {
    Id: string,
    Attributes: {
        ReadOnly: string
    }
}
I have also created a service and a component as follows:
Service
getHanaStructure(): Observable<Hana[]> {
    const hanaStructs = this.http.get<Hana[]>(this.apiUrl);
    this.messageService.add('HanaService: fetched struct');
    return hanaStructs;
}
Component
hanaStructures$: Observable<Hana[]>;

getHanaStructure(): void {
    this.hanaStructures$ = this.hanaService.getHanaStructure().pipe(map(data => _.toArray(data)));
}
In order to show the data, my HTML template looks like this:
<ul *ngIf="hanaStructures$ | async as hanaStructures else noData">
<li *ngFor="let hana of hanaStructures">
{{hana}}
</li>
</ul>
<ng-template #noData>No Data Available</ng-template>
The first problem is that I don't know how to access the information by its key; I can only list the values. When I try something like {{ hana.Id }} instead of just {{ hana }} I get:
"Property 'Id' does not exist on type 'Hana'"
The second issue is that I can only manage to list the first-level data. I don't know how to access the Children of the Children of the Children...
I am sure that the API is returning everything I need, but unfortunately I don't know how to solve the problem.
Thanks,
Filipe
At first I would define the "Children" property in your Hana interface as a list of "Hana" interfaces (this is essentially a recursive type). Afterwards I would create a separate component that gets displayed as you loop through the first layer of children. This component should contain a loop of its own to render deeper nested children recursively.
Hope this was understandable and helps you. =)
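A minimal sketch of that idea follows; the component name, selector and the boolean type for ReadOnly are illustrative assumptions, not taken from the question:

import { Component, Input } from '@angular/core';

export interface Hana {
    Id: string;
    Attributes: {
        ReadOnly: boolean;
    };
    Children?: Hana[]; // recursive: each node may contain further Hana nodes
}

@Component({
    selector: 'app-hana-node', // hypothetical selector
    template: `
        <ul>
            <li *ngFor="let node of nodes">
                {{ node.Id }}
                <!-- the component renders itself for the next level of children -->
                <app-hana-node *ngIf="node.Children" [nodes]="node.Children"></app-hana-node>
            </li>
        </ul>
    `
})
export class HanaNodeComponent {
    @Input() nodes: Hana[] = [];
}

The top-level template then only needs to render <app-hana-node [nodes]="hanaStructures"></app-hana-node> once, and the recursion handles every deeper level.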
We have a heavily nested JSON document containing server metrics. The document contains more than 1000 fields, some of which are completely irrelevant to us for analytic purposes, so I would like to remove them before indexing the document in Elastic.
However, I am unable to find the correct filter to use, as the fields I want to remove have common names in multiple different objects within the document.
The source document looks like this (reduced in size for brevity):
[
    {
        "server": {
            "is_master": true,
            "name": "MYServer",
            "id": 2111
        },
        "metrics": {
            "Server": {
                "time": {
                    "boundary": {},
                    "type": "TEXT",
                    "display_name": "Time",
                    "value": "2018-11-01 14:57:52"
                }
            },
            "Mem_OldGen": {
                "used": {
                    "boundary": {},
                    "display_name": "Used(mb)",
                    "value": 687
                },
                "committed": {
                    "boundary": {},
                    "display_name": "Committed(mb)",
                    "value": 7116
                },
                "cpu_count": {
                    "boundary": {},
                    "display_name": "Cores",
                    "value": 4
                }
            }
        }
    }
]
The data is loaded into Logstash using the http_poller input plugin and needs to be processed before being sent to Elastic for indexing.
I am trying to remove the fields that are not relevant for us to track for analytical purposes; these include the "display_name" and "boundary" fields from each JSON object in the different metrics.
I have tried using the mutate filter to remove the fields, but because they exist in so many different objects it requires too many coded paths to be added to the Logstash config.
I have also looked at the ruby filter, which seems promising as it can look at the event, but I am unable to get it to crawl the entire JSON document or, more importantly, actually remove the fields.
Here is what I was trying as a test:
filter {
    split {
        field => "message"
    }
    ruby {
        code => '
            event.get("[metrics][Mem_OldGen][used]").to_hash.keys.each { |k|
                logger.info("field is:", k)
                if k.include?("display_name")
                    event.remove(k)
                end
                if k.include?("boundary")
                    event.remove(k)
                end
            }
        '
    }
}
It first splits the input at the message level to create one event per server, then tries to remove the fields from a specific metric.
Any help would be greatly appreciated.
If I get your point, you want to keep just the value key.
So, considering the response hash:
response = {
    "server": {
        "is_master": true,
        "name": "MYServer",
        "id": 2111
    },
    "metrics": {
    ...
You could do:
response[:metrics].transform_values { |hh| hh.transform_values { |h| h.delete_if { |k,v| k != :value } } }
#=> {:server=>{:is_master=>true, :name=>"MYServer", :id=>2111}, :metrics=>{:Server=>{:time=>{:value=>"2018-11-01 14:57:52"}}, :Mem_OldGen=>{:used=>{:value=>687}, :committed=>{:value=>7116}, :cpu_count=>{:value=>4}}}}
I've been working with DataTables and I'm able to load the DataTable using getJSON with strongly typed classes, etc., and it works just great. Until I hit one snag.
There are times I want to populate a DataTable with data that "I don't know about", but I always know that it will be one row of data; it is simply a JSON string with dynamic content.
Now, with DataTables you can simply populate the table with aaData and aaCol by assigning a JSON string to them, but my JSON string contains both a column and its data, e.g.:
First_name: bob, and so on.
A column on DataTables would be populated with sTitle: Column1, etc., and assigned to aaCol.
Does anyone know of a plugin that parses a JSON string into aaCol and aaData for use with DataTables?
I believe you can solve your problem using this approach:
$(document).ready(function() {
    $('#example').DataTable({
        "processing": true,
        "serverSide": true,
        "ajax": "scripts/objects.php",
        "columns": [
            { "data": "first_name" },
            { "data": "last_name" },
            { "data": "position" },
            { "data": "office" },
            { "data": "start_date" },
            { "data": "salary" }
        ]
    });
});
In the example above, DataTables uses server-side processing, and the ajax call returns an object like this:
{
    "draw": 1,
    "recordsTotal": 57,
    "recordsFiltered": 57,
    "data": [
        {
            "first_name": "Airi",
            "last_name": "Satou",
            "position": "Accountant",
            "office": "Tokyo",
            "start_date": "28th Nov 08",
            "salary": "$162,700"
        }, ...
You can also set the column name using the "name" property inside the specification of each column.
You can see the full example at the following link. If you need more assistance, I can put together some code of my own later today =)
You can check this JsFiddle to understand how to set the column names.
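For the dynamic case in the original question (one row whose keys are not known in advance), a rough sketch could build the columns from the object's keys before initialising the table. The jsonString value here is just an illustrative stand-in for whatever the server returns:

var jsonString = '{"First_name":"bob","Last_name":"smith"}'; // illustrative single-row payload
var rowJson = JSON.parse(jsonString);

// Build one column definition per key, using the key as both the title and the data source
var columns = Object.keys(rowJson).map(function (key) {
    return { title: key, data: key };
});

$('#example').DataTable({
    data: [rowJson], // the single row of data
    columns: columns
});

This avoids hard-coding the column list, since the table definition is derived from whatever keys the JSON happens to contain.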
In the app we're developing, we create all the JSON on the server side using dynamically generated configs (JSON objects). We use these for stores (and other things, like GUIs), with a dynamically generated list of data fields.
With a JSON like this:
{
    "proxy": {
        "type": "rest",
        "url": "/feature/163",
        "timeout": 600000
    },
    "baseParams": {
        "node": "163"
    },
    "fields": [
        { "name": "id", "type": "int" },
        { "name": "iconCls", "type": "auto" },
        { "name": "text", "type": "string" },
        { "name": "name", "type": "auto" }
    ],
    "xtype": "jsonstore",
    "autoLoad": true,
    "autoDestroy": true
}, ...
Ext will gently create an "implicit model" that I'll be able to work with: load it into forms, save it, delete it, etc.
What I want is to specify through a JSON config not just the fields, but the model itself. Is this possible?
Something like:
{
    model: {
        name: 'MiClass',
        extends: 'Ext.data.Model',
        "proxy": {
            "type": "rest",
            "url": "/feature/163",
            "timeout": 600000
        },
        etc... }
    "autoLoad": true,
    "autoDestroy": true
}, ...
That way I would be able to create the whole JSON on the server without having to glue things together using JS statements on the client side.
Best regards,
I don't see why not. The syntax to create a model class is similar to that of stores and components:
Ext.define('MyApp.model.MyClass', {
    extend: 'Ext.data.Model',
    fields: [..]
});
So if you take this apart, you could call Ext.define(className, config), where className is a string and config is a JSON object, both of which are generated on the server.
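As a minimal sketch of that idea, assuming a hypothetical endpoint (/model-config/163, not from the question) that returns the class name and its config as JSON:

Ext.Ajax.request({
    url: '/model-config/163', // hypothetical endpoint returning the model definition
    success: function (response) {
        var cfg = Ext.decode(response.responseText);
        // cfg is assumed to look like:
        // { name: 'MiClass', config: { extend: 'Ext.data.Model', fields: [...], proxy: {...} } }
        Ext.define(cfg.name, cfg.config);
        // The model is now registered and can be referenced by stores created afterwards
    }
});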
There's no way to achieve exactly what I want.
The only way you can do it is by defining the fields of the Ext.data.Store and having it generate the implicit model from the fields configuration.
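In practice that means handing the server-generated config (with its fields array) straight to the store, something like this sketch, where serverConfig stands for the decoded JSON shown earlier:

// serverConfig is the decoded JSON config from the server: proxy, fields, autoLoad, etc.
var store = Ext.create('Ext.data.Store', serverConfig);
// Ext builds the implicit model from serverConfig.fields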
Is there a way to fetch data from Oracle UCM's JSON option and use it in FullCalendar? I created a service to return calendar data from UCM and want to display the events using FullCalendar.
Here is an example feed that I get back:
"ResultSets": {
{
"SQLLMCal": {
"fields": [
{ "name": "SINGLE_ELEMENT" },
{ "name": "Start" },
{ "name": "SCHOOL_TYPE" },
{ "name": "SCHOOL_TYPE_ID" },
{ "name": "Title" },
{ "name": "ENTRY_SIDE_GROUP" }
],
"rows": [
[
"141",
"3/17/11 12:00 AM",
"Elementary",
"3",
"Big Burger",
"1"
]
]
}
}
}
Thanks, Ken
You need to parse your own data using a custom events function:
http://arshaw.com/fullcalendar/docs/event_data/events_function/
Okay, great example of "problem exists between keyboard and chair"! I just had to stare at the documentation and the answer was right there.
In my code, I have the following:
events: function(callback) {
By definition, this MUST be
events: function(start, end, callback) {
Once I added the start and end parameters, everything worked perfectly.
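For reference, a sketch of what the full events function might look like against the feed above; the UCM service URL and the exact field-to-event mapping are assumptions based on the sample ResultSets:

events: function (start, end, callback) {
    // Hypothetical URL of the UCM service that returns the ResultSets JSON shown above
    $.getJSON('/cs/idcplg?IdcService=GET_CALENDAR_JSON', function (data) {
        var rs = data.ResultSets.SQLLMCal;
        // Map column names to their index in each row using the "fields" array
        var col = {};
        $.each(rs.fields, function (i, f) { col[f.name] = i; });
        // Turn each row into a FullCalendar event object
        var events = $.map(rs.rows, function (row) {
            return {
                title: row[col.Title],
                start: new Date(row[col.Start])
            };
        });
        callback(events);
    });
}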
Thank you Adam for pointing me in the right direction!
Ken