How to display CZML polygons in real time - CesiumJS

My code queries SQL at regular intervals for new values (time, polygon height) and expresses them as CZML. When the CZML whose interval covers 1:00-2:00 is replaced by a CZML file whose interval covers 2:00-3:00, the error "DeveloperError: Normalized result is not a number" is thrown.
How do I connect the 1:00-2:00 CZML file to the 2:00-3:00 CZML file without errors?
This is my CZML file:
const exresult = [
  {
    id: "document",
    name: "CZML Custom Properties",
    version: "1.0",
    clock: {
      interval: TotalDate2,
      currentTime: date_re1,
      multiplier: 300,
    },
  },
  {
    id: "water_extrudedheight",
    name: "An object with custom properties",
    properties: {
      constant_property: true,
      population_sampled: {
        number: [date_re1, result6[0], date_re2, result6[1]],
      },
    },
  },
  {
    id: "colorado",
    name: "Colorado",
    polygon: {
      positions: {
        cartographicDegrees: [
          128.818153, 37.699167, 0,
          128.189289, 37.698488, 0,
          128.818186, 37.696908, 0,
          128.814783, 37.697982, 0,
        ],
      },
      material: {
        solidColor: {
          color: {
            rgba: [0, 255, 0, 150],
          },
        },
      },
      height: 208.652,
      extrudedHeight: 208.652,
    },
  },
];
The first CZML file plays without errors, even when it repeats.
However, when the interval time and the sample times inside population_sampled's number array change, the error "DeveloperError: Normalized result is not a number" occurs.
I suspected the interval time was the problem, so I set the interval to 24 hours and varied only the number values; this confirmed that the same error appears whenever the interval time and the time span of the number samples do not match.
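That error generally means Cesium tried to interpolate the sampled property at a time with no surrounding samples. A minimal sketch of one way to avoid it (the function name, ISO timestamps, and sample values below are illustrative, not from the original code): generate each hour's CZML so the sample times exactly cover the clock interval, and repeat the boundary value so consecutive files join seamlessly.

```javascript
// Sketch: build one hour's CZML so the sampled "number" values exactly
// span the clock interval. startIso/endIso/startValue/endValue are
// placeholder parameters standing in for the SQL-derived values.
function buildHourCzml(startIso, endIso, startValue, endValue) {
  return [
    {
      id: "document",
      name: "CZML Custom Properties",
      version: "1.0",
      clock: {
        interval: `${startIso}/${endIso}`,
        currentTime: startIso,
        multiplier: 300,
      },
    },
    {
      id: "water_extrudedheight",
      name: "An object with custom properties",
      properties: {
        constant_property: true,
        population_sampled: {
          // The sample times cover the whole clock interval, and the
          // first sample of the next hour repeats the last sample of
          // this hour, so interpolation never runs off the end.
          number: [startIso, startValue, endIso, endValue],
        },
      },
    },
  ];
}

const hour1 = buildHourCzml("2023-01-01T01:00:00Z", "2023-01-01T02:00:00Z", 208.652, 210.1);
const hour2 = buildHourCzml("2023-01-01T02:00:00Z", "2023-01-01T03:00:00Z", 210.1, 211.4);
```

The key invariant is that hour2's first sample time and value equal hour1's last, so loading the 2:00-3:00 file never leaves a gap at the boundary.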

Related

How do you destructure objects nested in an array

I understand how destructuring works: {item} = data pulls item out of an object, and for an object nested in an array we do [{item}] = data. I have tried to destructure my data with no luck, so I thought I'd come on here and ask for help.
From the data below, how do you destructure the keys nested in payload?
const events = {
  data: [
    {
      payload: {
        care_recipient_id: "df50cac5-293c-490d-a06c-ee26796f850d",
        caregiver_id: "220d9432-b5ed-4c81-8709-434899d2cd1b",
        consumed_volume_ml: 230,
        event_type: "fluid_intake_observation",
        fluid: "caffeinated",
        id: "00114a9f-00dc-4f39-a6ac-af1b7e0543e7",
        observed: false,
        timestamp: "2019-04-26T07:08:21.758Z",
        visit_id: "5cc23bf0-8b66-f8a8-4339-688e1d43e11a",
      },
      payload: {
        care_recipient_id: "df50cac5-293c-490d-a06c-ee26796f850d",
        caregiver_id: "5c9090ab-7d5e-4a72-8bf7-197190ad4c98",
        event_type: "task_completed",
        id: "006139b8-a387-4529-9280-2d798c500aeb",
        task_definition_description: "Assist with oral hygiene",
        task_definition_id: "1bf3b81d-40b0-4539-ba96-9ea12ad6110b",
        task_instance_id:
          "dHxmMjU2YmFlYS1jODEyLTRjZWMtOTUxNC0wYzc5YjNjZmQwMzN8MjAxOS0wNS0xMlQwNzowMDowMC4wMDBafE1PUk5JTkc=",
        task_schedule_id: "f256baea-c812-4cec-9514-0c79b3cfd033",
        task_schedule_note: "Please assist me to brush my teeth",
        timestamp: "2019-05-12T07:23:12.789Z",
        visit_id: "5cd753f0-8b66-f8a8-4591-3f78ca3f9c45",
      },
      payload: {
        care_recipient_id: "df50cac5-293c-490d-a06c-ee26796f850d",
        caregiver_id: "5c9090ab-7d5e-4a72-8bf7-197190ad4c98",
        event_type: "task_completed",
        id: "0099ecb2-07bb-4b93-bd56-be485d62f22c",
        task_definition_description: "Ensure home is clean and tidy",
        task_definition_id: "9ac88364-79c5-4f1d-9767-5e65f16a0711",
        task_instance_id:
          "dHw2ZGRhZGVkMC1lZjk0LTQ1N2ItYjViMi01NDVhM2JkM2Q0YzF8MjAxOS0wNS0wM1QwNzowMDowMC4wMDBafE1PUk5JTkc=",
        task_schedule_id: "6ddaded0-ef94-457b-b5b2-545a3bd3d4c1",
        task_schedule_note: "Empty the bins if required.",
        timestamp: "2019-05-03T07:24:10.276Z",
        visit_id: "5ccb7670-8b66-f8a8-48ca-1c06125a9c4c",
      },
    },
  ],
};
I tried to do this
const {
  data: [
    {
      payload: { event_type, task_definition_description, timestamp },
    },
  ],
} = events;
console.log(event_type);
I got an error, I'm probably using the wrong syntax. Does anyone have a better solution please.
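One thing to note: as posted, the data object repeats the payload key, so in JavaScript the later entries silently overwrite the earlier ones; normally each event would be its own element of the data array. Assuming that shape, a destructuring pattern only picks out a fixed position, so to pull the same keys from every element you can destructure inside a map callback. A sketch with trimmed-down sample data:

```javascript
// Trimmed-down sample: each event is its own element of data,
// each with its own payload object.
const events = {
  data: [
    { payload: { event_type: "fluid_intake_observation", timestamp: "2019-04-26T07:08:21.758Z" } },
    { payload: { event_type: "task_completed", task_definition_description: "Assist with oral hygiene", timestamp: "2019-05-12T07:23:12.789Z" } },
  ],
};

// Destructure each element's payload in the map callback parameter.
const summaries = events.data.map(
  ({ payload: { event_type, task_definition_description, timestamp } }) => ({
    event_type,
    task_definition_description,
    timestamp,
  })
);

console.log(summaries[0].event_type); // "fluid_intake_observation"
```

The pattern in the original attempt only ever matches the first array element; mapping applies the same destructuring to every element.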

DataTables pagination issues: working with Laravel's pagination model and JSON response

I am trying to build a datatable for a webpage that will eventually hold hundreds of thousands of entries. Because pagination is a must-have, I was given local API calls that implement Laravel's paginate method. As a result, the server's JSON response has the following format:
{
  "current_page": 1,
  "data": [{ ... }],
  "first_page_url": "http:\/\/127.0.0.1:8000\/api\/test_history\/fullObj?page=1",
  "from": 1,
  "last_page": 134,
  "last_page_url": "http:\/\/127.0.0.1:8000\/api\/test_history\/fullObj?page=134",
  "links": [{ ... }],
  "next_page_url": "http:\/\/127.0.0.1:8000\/api\/test_history\/fullObj?page=2",
  "path": "http:\/\/127.0.0.1:8000\/api\/test_history\/fullObj",
  "per_page": 15,
  "prev_page_url": null,
  "to": 15,
  "total": 2000
}
QUESTION
How could DataTables' paging feature implement Laravel's or any pagination model?
What changes in the table initialization, or in general, can be made in order for the page buttons to properly call the appropriate batch of entries?
There should be a way to utilize the above model's next_page_url, first_page_url, etc. properties.
Given the predicted vast number of entries, fetching the entire dataset is out of the question, so, unless I am mistaken, DataTables' default (client-side) paging will not do; it only controls what is rendered on the front end.
ISSUES
Even though the above JSON response contains all the necessary information for a table pagination, it seems to be conflicting with whatever data the datatable is expecting. So, the following issues surfaced within the webpage:
The total number of pages within the pagination section was not showing
Clicking on any page number, or the next/previous buttons within the pagination section loaded the same entries
The searching and sorting features have stopped filtering the table results
After consulting with documentation, I found that the server's JSON response did not follow DataTables standards, and I had to manipulate some of the data. Here follows the jQuery used to initialize the datatable:
$(document).ready(function() {
  $('#test_table').DataTable({
    "order": [[ 8, "desc" ]],
    "scrollX": true,
    "paging": true,
    "lengthMenu": [[ 5, 15, 25, 100, -1 ], [ 5, 15, 25, 100, "All" ]],
    "pageLength": 15,
    "processing": true,
    "serverSide": true,
    "ajax": function(data, callback, settings) {
      $.get("http://127.0.0.1:8000/api/test_history/fullObj?page=1", {
        limit: data.length,
        offset: data.start
      },
      function(json) {
        callback({
          recordsTotal: json.total,
          recordsFiltered: json.total,
          data: json.data
        });
      });
    },
    "columns": [
      { "data": "id" },
      { "data": "uid" },
      { "data": "dev_type.type" },
      { "data": "registers.id" },
      { "data": "measurements.id" },
      { "data": "created_at" },
      { "data": "updated_at" }
    ]
  });
});
Alternatively:
$(document).ready(function() {
  $('#test_table').DataTable({
    "order": [[ 8, "desc" ]],
    "scrollX": true,
    "paging": true,
    "lengthMenu": [[ 5, 15, 25, 100, -1 ], [ 5, 15, 25, 100, "All" ]],
    "pageLength": 15,
    "processing": true,
    "serverSide": true,
    "ajax": {
      "url": "http://127.0.0.1:8000/api/test_history/fullObj?page=1",
      "dataSrc": function(json) {
        json.recordsTotal = json.total;
        json.recordsFiltered = json.total;
        return json.data;
      }
    },
    "columns": [
      { "data": "id" },
      { "data": "uid" },
      { "data": "dev_type.type" },
      { "data": "registers.id" },
      { "data": "measurements.id" },
      { "data": "created_at" },
      { "data": "updated_at" }
    ]
  });
});
These response manipulations fixed the missing page count and the incorrect number of page buttons. The remaining issues, however, persist.
METHODS
Apart from scouring the DataTables' documentation and their website's forum, and combing through the internet for possible solutions, I have tried the following:
Attempting to access an HTML class or ID to create a click event, either by adding the attribute or by creating a handler for the event, possibly not getting it right
Iterating across all numbered buttons, through their DataTables-specific attributes, to dynamically load new entries on click
I have yet to stumble upon a groundbreaking answer, or think of one. Despite these shortcomings, I do believe that the answer is usually simpler than any complex thinking, and in my case it could be related to the data the table is expecting from the server; the JSON's format, or its properties.
Any suggestion would be greatly appreciated.
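With serverSide: true, DataTables sends start and length on every draw, so one approach is to translate them into Laravel's 1-based page parameter instead of hard-coding page=1. A sketch, under the assumption that the API honours page (the search/sort parameters shown are hypothetical and would have to be implemented server-side for filtering and sorting to work again):

```javascript
// Laravel pages are 1-based; DataTables' start is a 0-based row offset.
function toLaravelPage(start, length) {
  return Math.floor(start / length) + 1;
}

// DataTables options -- pass to $('#test_table').DataTable(options).
var options = {
  processing: true,
  serverSide: true,
  pageLength: 15,
  ajax: function(data, callback) {
    $.get("http://127.0.0.1:8000/api/test_history/fullObj", {
      page: toLaravelPage(data.start, data.length),
      // Hypothetical parameters -- the API must implement these
      // server-side for searching/sorting to work again:
      search: data.search.value,
      sort_by: data.columns[data.order[0].column].data,
      sort_dir: data.order[0].dir
    }, function(json) {
      callback({
        draw: data.draw, // echo the draw counter back to DataTables
        recordsTotal: json.total,
        recordsFiltered: json.total,
        data: json.data
      });
    });
  },
  columns: [ /* same column definitions as above */ ]
};
```

This explains why clicking page buttons loaded the same entries: the URL always requested page 1 regardless of data.start.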

Structuring Normalized JSON response in Redux store and mapping to React component props

Just recently, our team began structuring our JSON payload in a normalized fashion. I am most used to working with nested data in React components and even in the reducer, but I see the benefits here (fewer connected components re-rendering, simplified reducer code, and easier tests) and I am excited to start using this approach. I do, however, have some confusion with the state shape after my first try.
Let's start with the shape of the payload -
{
  "data": {
    "advisors": {
      "allIds": [2],
      "byId": {
        "2": {
          "active": true,
          "avatar_url": null,
          "country": "US",
          "email": "demo#gmail.com",
          "first_name": "George Michael",
          "full_name": "George Michael Bluth",
          "id": 2,
          "last_name": "Bluth",
          "time_zone": "US/Central"
        }
      }
    },
    "opportunities": {
      "allIds": ["100-3"],
      "byId": {
        "100-3": {
          "created": "Fri, 29 Sep 2017 20:00:40 GMT",
          "program_id": 3,
          "prospect_id": 100
        }
      }
    },
    "programs": {
      "allIds": [3],
      "byId": {
        "3": {
          "abbr": "CAP",
          "end_date": null,
          "funnel_id": 2,
          "id": 3,
          "launch_date": "Sat, 11 Mar 2017 00:00:00 GMT",
          "name": "Certificate in Astral Projection",
          "period_end": null,
          "period_start": null,
          "program_level_abbr": "NCC",
          "school_id": 2,
          "virtual": false
        }
      }
    },
    "prospects": {
      "allIds": [2],
      "byId": {
        "2": {
          "advisor_id": 3,
          "contact_attempt_count": 0,
          "contact_success_count": 0,
          "do_not_call": false,
          "do_not_email": false,
          "do_not_mail": false,
          "email": "adavis.est#hotmail.com",
          "first_name": "Antonio",
          "id": 2,
          "inactive": false,
          "last_name": "Davis",
          "phone": {
            "area_code": "800",
            "extension": "70444",
            "number": "3575792"
          },
          "priority": 10.0,
          "referred_by_prospect_id": null,
          "third_party": false
        }
      }
    }
  },
  "pagination": {
    "page_number": 1,
    "total": 251
  }
}
The normalized payload is structured so that advisors, opportunities, programs, and prospects are siblings and not ancestors. They're all nested one level inside of "data".
Then, in my "prospects" reducer I initialize the prospects state as an object with the following keys: fetching, failure, and entities. The first two are UI data and entities will house the response (advisors, opportunities, programs, and prospects).
const initialState = {
  fetching: false,
  failure: false,
  entities: null,
};

function prospects(state = initialState, action) {
  switch (action.type) {
    case constants.prospects.REQUEST_PROSPECTS:
      return { ...state, fetching: true };
    case constants.prospects.RECEIVE_PROSPECTS:
      return Object.assign({}, state, {
        fetching: false,
        entities: action.data,
      });
    case constants.prospects.REQUEST_PROSPECTS_FAILURE:
      return { ...state, fetching: false, failure: true };
    default:
      return state;
  }
}
And now for the red flag that brought me here - my props and internal component state seem oddly structured. I mapStateToProps like so:
const mapStateToProps = state => ({
  prospects: state.prospects,
});
This has resulted in me accessing advisors, opportunities, programs, and prospects like this:
this.props.fetching
this.props.failure
this.props.prospects.entities.advisors.allIds.length
this.props.prospects.entities.opportunities.allIds.length
this.props.prospects.entities.programs.allIds.length
this.props.prospects.entities.prospects.allIds.length
My understanding is that with a normalized approach, entities are typically housed under this.props.entities and UI data under this.props.ui. Is the problem that I am getting all this data back from my prospects action and reducer rather than from separate actions and reducers? I want to reduce the accessor chain in my components, because it has become very error prone and hard to read. Would it be better to query for each entity with separate XHRs and reducers?
I know there are a lot of good resources on this approach, including videos from DA. But I haven't found an answer to all of these questions in combination. Thanks!
Summary
I'm suggesting that you refactor your state to look like:
{
  network: {
    loading: false,
    failure: false
  },
  advisors: { allIds, byId },
  opportunities: { allIds, byId },
  programs: { allIds, byId },
  prospects: { allIds, byId },
}
To do this, you'll want a reducer for each key in the state. Each reducer will handle its portion of the normalized payload and otherwise ignore actions.
Reducers
Network.js:
function network(state = { loading: false, failure: false }, action) {
  switch (action.type) {
    case constants.REQUEST_PAYLOAD:
      return { ...state, loading: true };
    case constants.RECEIVE_PAYLOAD:
      return { ...state, loading: false, failure: false };
    case constants.REQUEST_PAYLOAD_FAILURE:
      return { ...state, loading: false, failure: true };
    default:
      return state;
  }
}
Prospects.js:
function prospects(state = { allIds: [], byId: {} }, action) {
  switch (action.type) {
    case constants.RECEIVE_PAYLOAD:
      // Depending on your use case, you may need to merge the existing
      // allIds and byId with the action's. This would allow you to
      // issue the request multiple times and add to the store instead
      // of overwriting it each time.
      return { ...state, ...action.data.prospects };
    default:
      return state;
  }
}
Repeat the prospects reducer for each other section of the state.
Note
I'm assuming your payload comes back in that fashion from a single API call, and that you're not stitching that together from separate calls for each sibling (advisors, opportunities, programs, and prospects).
Details
In order to store your payload in the store, I would recommend writing separate reducers that each handle a different part of the state returned by your API call.
For prospects, you should only store the prospects portion of the payload and throw out the rest.
So instead of...
case constants.prospects.RECEIVE_PROSPECTS:
  return Object.assign({}, state, {
    fetching: false,
    entities: action.data,
  });
You should do...
case constants.prospects.RECEIVE_PROSPECTS:
  return {
    ...state,
    fetching: false,
    entities: action.data.prospects,
  };
Then have a similar reducer for each of the other types of data returned by your API call. Each of these reducers will process the exact same actions. They'll only handle the portion of the payload that they care about, though.
Finally, in your mapStateToProps, state.prospects will only contain the prospect data.
As a side note--assuming I'm correct about the payload being delivered by a single API--I would rename your action constants to REQUEST_PAYLOAD, RECEIVE_PAYLOAD and REQUEST_PAYLOAD_FAILURE, or something equally generic.
One more suggestion: you can move your fetching and failure logic into a NetworkReducer that only has the job of managing success/failure/loading for the API request. That way, each of your other reducers only has to handle the RECEIVE case and can just ignore other actions.
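To tie this together, the per-slice reducers can be composed into a single root reducer. A self-contained sketch (with a hand-rolled combine and bare string action types for illustration; in the app you would use Redux's combineReducers and your constants module):

```javascript
// Network slice: only tracks loading/failure for the API request.
function network(state = { loading: false, failure: false }, action) {
  switch (action.type) {
    case "REQUEST_PAYLOAD": return { ...state, loading: true };
    case "RECEIVE_PAYLOAD": return { loading: false, failure: false };
    case "REQUEST_PAYLOAD_FAILURE": return { loading: false, failure: true };
    default: return state;
  }
}

// Entity slices all look the same, so generate them from the payload key.
function makeEntityReducer(key) {
  return function (state = { allIds: [], byId: {} }, action) {
    switch (action.type) {
      case "RECEIVE_PAYLOAD":
        // Spreading undefined is a no-op, so slices absent from the
        // payload keep their previous state.
        return { ...state, ...action.data[key] };
      default:
        return state;
    }
  };
}

const reducers = {
  network,
  advisors: makeEntityReducer("advisors"),
  opportunities: makeEntityReducer("opportunities"),
  programs: makeEntityReducer("programs"),
  prospects: makeEntityReducer("prospects"),
};

// Minimal stand-in for Redux's combineReducers.
function rootReducer(state = {}, action) {
  const next = {};
  for (const key of Object.keys(reducers)) {
    next[key] = reducers[key](state[key], action);
  }
  return next;
}
```

With this shape, mapStateToProps can pick exactly the slice a component needs (e.g. state.prospects), shortening the accessor chains the question complains about.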

Google Slides API deleteObject() when object not exists

I want to update an existing object/image in a Google Slide. This works as long as the object exists:
var requests = [
  {
    "deleteObject": {
      "objectId": 'image01'
    }
  },
  {
    "createImage": {
      "url": imageUrl,
      "objectId": 'image01',
      "elementProperties": {
        "pageObjectId": pageId,
        "size": {
          "width": { "magnitude": 250, "unit": "PT" },
          "height": { "magnitude": 250, "unit": "PT" }
        },
        "transform": {
          "scaleX": 1,
          "scaleY": 1,
          "translateX": 200,
          "translateY": 100,
          "unit": "PT"
        }
      }
    }
  }
];
var response = Slides.Presentations.batchUpdate({'requests': requests}, presentationId);
However, if a user previously deleted the object in the presentation, it is not re-created.
The following error message appears:
Invalid requests[0].deleteObject: The object (image01) could not be
found.
How can I query whether an object exists in the presentation?
How about retrieving an object list using slides.presentations.get? To confirm whether objects exist, use slides/pageElements/objectId for the fields parameter of slides.presentations.get. From the returned object list you can tell whether the object exists.
Sample script :
var response = Slides.Presentations.get(presentationId);
response.slides.forEach(function(e1, i1) {
  e1.pageElements.forEach(function(e2) {
    Logger.log("Page %s, objectId %s", i1 + 1, e2.objectId);
  });
});
Result :
Page 1.0, objectId ###
Page 2.0, objectId ###
Page 3.0, objectId ###
If this was not useful for you, I'm sorry.
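Building on that object list, one way to make the update idempotent is to add the deleteObject request only when the objectId is actually present. A sketch (upsertImage is a hypothetical wrapper name; the Slides.* calls are the same ones used in the question):

```javascript
// Pure helper: does any slide contain the given objectId?
function objectExists(presentation, objectId) {
  return presentation.slides.some(function (slide) {
    return (slide.pageElements || []).some(function (el) {
      return el.objectId === objectId;
    });
  });
}

// Apps Script usage: delete only if present, then (re)create the image.
function upsertImage(presentationId, objectId, createImageRequest) {
  var presentation = Slides.Presentations.get(presentationId);
  var requests = [];
  if (objectExists(presentation, objectId)) {
    requests.push({ deleteObject: { objectId: objectId } });
  }
  requests.push(createImageRequest);
  return Slides.Presentations.batchUpdate({ requests: requests }, presentationId);
}
```

This way the batchUpdate never fails with "The object (image01) could not be found", whether or not a user has deleted the image beforehand.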
Edit :
If you want to search for a value in the whole JSON, you can use the following simple script. When value2 is included in sampledata, ~JSON.stringify(sampledata).indexOf('value2') is truthy. In this sample, ok is logged because value2 is included in sampledata.
But it's a bit of a stretch. If you know the complete structure of the JSON, comparing the value by its key is better.
var sampledata = {key1: "value1", key2: "value2"};
if (~JSON.stringify(sampledata).indexOf('value2')) {
  Logger.log("ok");
}

c3.js generate a stacked bar from JSON payload

I am attempting to generate a stacked bar chart with c3 from a JSON payload (code below). However, when I group the data, the bars overlay instead of stacking. If I use the columns structure, I get the intended behavior, but this means I'd have different code generated for a stacked bar chart versus my other visuals (e.g. a time-series chart).
var chart = c3.generate({
  data: {
    x: "x-axis",
    json: [
      { "x-axis": "0", "data1": 30 },
      { "x-axis": "0", "data2": 40 }
    ],
    keys: {
      x: "x-axis",
      value: ["data1", "data2"]
    },
    groups: [
      ['data1', 'data2']
    ],
    type: 'bar'
  }
});
Here is a fiddle: http://jsfiddle.net/cjrobinson/ozf4fzcb/
It's weird that they overplot each other in your example; I'd report that as a bug to c3.
If you don't want to use the columns[] format, you could do it as below; it would still need some data wrangling, though:
var chart = c3.generate({
  data: {
    x: "x-axis",
    json: [
      { "x-axis": "0", "data1": 30, "data2": 40 },
      { "x-axis": "1", "data1": 20, "data2": 60 }
    ],
    // etc etc
    keys: {
      x: "x-axis",
      value: ["data1", "data2"]
    },
    groups: [
      ['data1', 'data2']
    ],
    type: 'bar'
  }
});
http://jsfiddle.net/dhgujwy7/1/
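For the data wrangling, rows that share the same x value can be merged into a single object before handing them to c3, so the same JSON-driven code path works for stacked bars. A sketch (mergeByKey is an illustrative helper, not a c3 API):

```javascript
// Merge rows sharing the same value for `key` into one object each,
// preserving first-seen order of the key values.
function mergeByKey(rows, key) {
  var byKey = {};
  var order = [];
  rows.forEach(function (row) {
    var k = row[key];
    if (!(k in byKey)) {
      byKey[k] = {};
      order.push(k);
    }
    Object.assign(byKey[k], row);
  });
  return order.map(function (k) { return byKey[k]; });
}

// The question's shape: one series value per row.
var rows = [
  { "x-axis": "0", data1: 30 },
  { "x-axis": "0", data2: 40 }
];
var merged = mergeByKey(rows, "x-axis");
// merged: [{ "x-axis": "0", data1: 30, data2: 40 }]
```

The merged array can then be passed straight to data.json in c3.generate, in the one-object-per-x shape that stacks correctly.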