How can I access the models array when I only have the information "Bridgestone" or "Continental"? I think it should work with Object.keys() and find(), but none of my attempts worked. I think the trick is to get the key, and with that key you can iterate over the models with forEach().
json_structure = {
  "tyres": [
    {
      "manufacture": "Bridgestone",
      "models": [
        "Potenza",
        "Turanza"
      ]
    },
    {
      "manufacture": "Continental",
      "models": [
        "Allseasonconta",
        "Winter Contact"
      ]
    }
  ]
}
Just use the find() method to find a manufacture by its name, and then get the models array out of it:
const jsonStructure = {
  "tyres": [
    {
      "manufacture": "Bridgestone",
      "models": [
        "Potenza",
        "Turanza"
      ]
    },
    {
      "manufacture": "Continental",
      "models": [
        "Allseasonconta",
        "Winter Contact"
      ]
    }
  ]
}

const getModelsByManufactureName = data => name => {
  const manufacture = data.tyres.find(val => val.manufacture === name)
  if (!manufacture) return manufacture
  return manufacture.models
}
console.log(getModelsByManufactureName(jsonStructure)('Bridgestone'))
Hope it helps :)
There is a way to do this without an explicit loop, but knowledge of the overall structure is still necessary:
json_structure.tyres.filter(o => o.manufacture == "Continental")[0].models
So the first step is to get to the "tyres" portion, which is an array, and then filter by "manufacture".
Then you can look at the first record (assuming "Continental" is unique) and read its "models" property.
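If the manufacturer might be missing, a slightly safer variant (just a sketch, using find and optional chaining) avoids the TypeError that filter(...)[0].models would throw when nothing matches:

// Returns undefined instead of throwing when the manufacturer is not found
const models = json_structure.tyres.find(o => o.manufacture === "Continental")?.models;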
Another more explicit way to do this would be similar to what you were proposing:
let model = {};
json_structure.tyres.forEach(function(o) {
  if (o.manufacture == "Continental") {
    model = o.models;
    return;
  }
});
After the loop, model contains the models array for "Continental".
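If early exit matters (a return inside forEach only ends the current callback; it does not stop the loop), a for...of loop with break is a possible alternative (a sketch):

let models = [];
for (const entry of json_structure.tyres) {
  if (entry.manufacture === "Continental") {
    models = entry.models;
    break; // stop scanning once the manufacturer is found
  }
}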
I use a WebSocket to communicate with a server in my Flutter app. Let's say I receive a JSON object through the WebSocket:
{
  "action": "getProduct",
  "cbackid": 1521474231306,
  "datas": {
    "product": {
      "Actif": 1,
      "AfficheQte": 0,
      "Article": "6"
    },
    "result": "success"
  },
  "deviceID": "4340a8fdc126bb59"
}
I have no idea what the content of datas will be until I read the action, and even then, it's not guaranteed to be the same every time. One example of a changing action/datas is when the product doesn't exist.
I can parse it in a Map<String, Object>, but then, how do I access what's inside the Object?
What's the correct way to read this data?
Not sure what the question is about, but you can check the type of the values and then continue accordingly
if (json['action'] == 'getProduct') {
  var datas = json['datas'];
  if (datas is List) {
    var items = datas as List;
    for (var item in items) {
      print('list item: $item');
    }
  } else if (datas is Map) {
    var items = datas as Map;
    for (var key in items.keys) {
      print('map item: $key, ${items[key]}');
    }
  } else if (datas is String) {
    print('datas: $datas');
  } // ... similar for all other possible types like `int`, `double`, `bool`, ...
}
You can also make that recursive, to check whether list or map values are String, ...
What is the proper way of storing/handling repeating events in the Redux store?
Problem: Let's say that we have a backend API that generates repeating events through complicated business logic. Some of the events might have the same ID. Let's say the generated output looks like this:
[
  {
    "id": 1,
    "title": "Weekly meeting",
    "all_day": true,
    "starts_at": "2017-09-12",
    "ends_at": "2017-09-12"
  },
  {
    "id": 3,
    "title": "Daily meeting1",
    "all_day": false,
    "starts_at": "2017-09-12",
    "ends_at": "2017-09-12"
  },
  {
    "id": 3,
    "title": "Daily meeting1",
    "all_day": false,
    "starts_at": "2017-09-13",
    "ends_at": "2017-09-13"
  },
  {
    "id": 3,
    "title": "Daily meeting1",
    "all_day": false,
    "starts_at": "2017-09-14",
    "ends_at": "2017-09-14"
  }
]
A possible solution would be to generate a unique ID by adding an extra uid property composed like this: id + '#' + starts_at. This way we could identify each occurrence uniquely. (I'm using this right now.)
Example:
[
  {
    "id": 1,
    "uid": "1#2017-09-12",
    "title": "Weekly meeting",
    "all_day": true,
    "starts_at": "2017-09-12",
    "ends_at": "2017-09-12"
  }
]
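For reference, a minimal sketch of how such a uid could be composed from the raw events (the helper name is made up):

// Tag each occurrence with a composed uid of the form "<id>#<starts_at>"
const withUid = events =>
  events.map(event => ({ ...event, uid: `${event.id}#${event.starts_at}` }));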
I'm wondering whether there is some other way, maybe more elegant than a composed unique id?
There is a possible pitfall with your current solution. What will happen if the id and starts_at of two events are the same? Is that a possible scenario in your domain?
Because of that I usually use this nice lib in such cases. It produces really short unique ids, which have some nice properties, like being guaranteed not to collide, being unpredictable, and so on.
Also ask yourself if you actually need unique ids in your case. It looks like your back-end has no way to distinguish the events anyway, so why bother? The Redux store will happily keep your events even without a uid.
Maybe not much of an improvement (if at all), but just using JSON.stringify to check for duplicates could make unique ids obsolete.
const existingEvents = [
  {
    "id": 3,
    "title": "Daily meeting1",
    "all_day": false,
    "starts_at": "2017-09-14",
    "ends_at": "2017-09-14"
  }
];

const duplicate = {
  "id": 3,
  "title": "Daily meeting1",
  "all_day": false,
  "starts_at": "2017-09-14",
  "ends_at": "2017-09-14"
};

const eventIsDuplicate = (existingEvents, newEvent) => {
  const duplicate =
    existingEvents.find(event => JSON.stringify(event) == JSON.stringify(newEvent));
  return typeof duplicate != 'undefined';
};
console.log(eventIsDuplicate(existingEvents, duplicate)); // true
I guess this would only be preferable to your existing solution if, for some reason, you'd want to keep all the uniqueness logic on the client side. (Note that comparing JSON.stringify output only works reliably when duplicate events serialize their properties in the same order.)
As far as I understand the examples you've given, it seems like the server is sending a particular event whenever the details of the event change.
If that is so, and you want to track the changes to events, your state shape might be an object per event with all the fields that hold the current data, plus a history property: an array of all previous (or the n most recent) event objects and the timestamps at which they were received. This is how your reducers would look, storing only the five most recent changes for each event. I'm expecting the action to have a payload property with your standard event property and a timestamp property, which can easily be produced in the action creator (a small sketch of such an action creator follows the reducers below).
const event = (state = { history: [] }, action) => {
  switch (action.type) {
    case 'EVENT_FETCHED':
      return ({
        ...action.payload.event,
        history: [...state.history, action.payload].slice(-5),
      });
    default:
      return state;
  }
};

const events = (state = { byID: {}, IDs: [] }, action) => {
  const id = action.payload.event.ID;
  switch (action.type) {
    case 'EVENT_FETCHED':
      return id in state.byID
        ? {
            ...state,
            byID: { ...state.byID, [id]: event(state.byID[id], action) },
          }
        : {
            byID: { ...state.byID, [id]: event(undefined, action) },
            IDs: [...state.IDs, id], // keep the previously seen IDs as well
          };
    default:
      return state;
  }
};
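A minimal action creator matching that payload shape might look like this (a sketch; the action type and field names follow the reducers above):

// Wraps a fetched event together with the timestamp the reducers expect
const eventFetched = event => ({
  type: 'EVENT_FETCHED',
  payload: { event, timestamp: Date.now() },
});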
Doing this, you don't need any unique ID. Please let me know if I have misunderstood your problem.
Edit: This is a slight extension of the pattern in the Redux documentation, to store previous events.
In the end, this is what I've implemented (for demonstration purposes only; unrelated code is omitted):
eventRoot.js:
import { combineReducers } from 'redux'
import ranges from './events'
import ids from './ids'
import params from './params'
import total from './total'

export default resource =>
  combineReducers({
    ids: ids(resource),
    ranges: ranges(resource),
    params: params(resource)
  })
events.js:
import { GET_EVENTS_SUCCESS } from '#/state/types/data'
export default resource => (previousState = {}, { type, payload, requestPayload, meta }) => {
  if (!meta || meta.resource !== resource) {
    return previousState
  }
  switch (type) {
    case GET_EVENTS_SUCCESS:
      const newState = Object.assign({}, previousState)
      payload.data[resource].forEach(record => {
        // ISO 8601 time interval string -
        // http://en.wikipedia.org/wiki/ISO_8601#Time_intervals
        const range = record.start + '/' + record.end
        if (newState[record.id]) {
          if (!newState[record.id].includes(range)) {
            // Don't mutate previous state, object assign is only a shallow copy
            // Create new array with added id
            newState[record.id] = [...newState[record.id], range]
          }
        } else {
          newState[record.id] = [range]
        }
      })
      return newState
    default:
      return previousState
  }
}
There is also a data reducer, but it's wired up in the parent reducer due to a generic implementation that is re-used for common list responses. Event data is updated and the start/end properties are removed, as they are composed into a range (an ISO 8601 time interval string). The range can later be used by moment.range or split by '/' to get the start/end data back. I've opted for an array of range strings to simplify checking for existing ranges, as they might grow large; I think primitive string comparison (indexOf or ES6 includes) would be faster than looping over a complex structure in such cases.
data.js (stripped down version):
import { END } from '#/state/types/fetch'
import { GET_EVENTS } from '#/state/types/data'
const cacheDuration = 10 * 60 * 1000 // ten minutes
const addRecords = (newRecords = [], oldRecords, isEvent) => {
  // prepare new records and timestamp them
  const newRecordsById = newRecords.reduce((prev, record) => {
    if (isEvent) {
      const { start, end, ...rest } = record
      prev[record.id] = rest
    } else {
      prev[record.id] = record
    }
    return prev
  }, {})
  const now = new Date()
  const newRecordsFetchedAt = newRecords.reduce((prev, record) => {
    prev[record.id] = now
    return prev
  }, {})
  // remove outdated old records
  const latestValidDate = new Date()
  latestValidDate.setTime(latestValidDate.getTime() - cacheDuration)
  const oldValidRecordIds = oldRecords.fetchedAt
    ? Object.keys(oldRecords.fetchedAt).filter(id => oldRecords.fetchedAt[id] > latestValidDate)
    : []
  const oldValidRecords = oldValidRecordIds.reduce((prev, id) => {
    prev[id] = oldRecords[id]
    return prev
  }, {})
  const oldValidRecordsFetchedAt = oldValidRecordIds.reduce((prev, id) => {
    prev[id] = oldRecords.fetchedAt[id]
    return prev
  }, {})
  // combine old records and new records
  const records = {
    ...oldValidRecords,
    ...newRecordsById
  }
  Object.defineProperty(records, 'fetchedAt', {
    value: {
      ...oldValidRecordsFetchedAt,
      ...newRecordsFetchedAt
    }
  }) // non enumerable by default
  return records
}

const initialState = {}
Object.defineProperty(initialState, 'fetchedAt', { value: {} }) // non enumerable by default

export default resource => (previousState = initialState, { payload, meta }) => {
  if (!meta || meta.resource !== resource) {
    return previousState
  }
  if (!meta.fetchResponse || meta.fetchStatus !== END) {
    return previousState
  }
  switch (meta.fetchResponse) {
    case GET_EVENTS:
      return addRecords(payload.data[resource], previousState, true)
    default:
      return previousState
  }
}
This can then be used by a calendar component with an event selector:
const convertDateTimeToDate = (datetime, timeZoneName) => {
  const m = moment.tz(datetime, timeZoneName)
  return new Date(m.year(), m.month(), m.date(), m.hour(), m.minute(), 0)
}

const compileEvents = (state, filter) => {
  const eventsRanges = state.events.list.ranges
  const events = []
  state.events.list.ids.forEach(id => {
    if (eventsRanges[id]) {
      eventsRanges[id].forEach(range => {
        const [start, end] = range.split('/').map(d => convertDateTimeToDate(d))
        // You can add a conditional push, filtered by start/end limits
        events.push(
          Object.assign({}, state.events.data[id], {
            start: start,
            end: end
          })
        )
      })
    }
  })
  return events
}
And here is how the data structure looks in the Redux dev tools:
Each time the events are fetched, their data is updated (if there is a change) and references are added. Here is a screenshot of the Redux diff after fetching a new events range:
Hope this helps somebody. I'll just add that this still isn't battle tested, but is more a proof of concept that works.
[EDIT] Btw, I'll probably move some of this logic to the backend, as then there will be no need to split/join/delete properties.
I am working with a dataset that cannot be modified on the server side. So I am trying to setup the local data model on the client in a way that I can easily traverse through the model when updating parts of the data.
Therefore I am trying to create a multi-leveled Map from multi-leveled Maps including Lists, that themselves include Maps, etc. (see schematics at the end of this post).
What I am trying to get is a Map containing other Maps, with the key of the included Map being the value of the object (again please see schematics at the end of this post).
I got it to work on the first level:
const firstLevel = data.toMap().mapKeys((key, value) => value.get('value'));
See it in action here: https://jsfiddle.net/9f0djcb0/4/
But there is a maximum of 3 levels of nested data and I can't get my head around how to get the transformation done. Any help appreciated!
The schematic datasets:
// This is what I got
const dataset = [
  {
    field: 'lorem',
    value: 'ipsum',
    more: [
      {
        field: 'lorem_lvl1',
        value: 'ispum_lvl1',
        more: [
          {
            field: 'lorem_lvl2',
            value: 'ispum_lvl2',
            more: [
              {
                field: 'lorem_lvl3',
                value: 'ispum_lvl3',
              }
            ]
          }
        ]
      }
    ]
  },
  {
    field: 'glorem',
    value: 'blipsum'
  },
  {
    field: 'halorem',
    value: 'halipsum'
  }
];
This is where I want to go:
// This is what I want
const dataset_wanted = {
  ipsum: {
    field: 'lorem',
    value: 'ipsum',
    more: {
      lorem_lvl1: {
        field: 'lorem_lvl1',
        value: 'ispum_lvl1',
        more: {
          lorem_lvl2: {
            field: 'lorem_lvl2',
            value: 'ispum_lvl2',
            more: {
              lorem_lvl3: {
                field: 'lorem_lvl3',
                value: 'ispum_lvl3',
              }
            }
          }
        }
      }
    }
  },
  glorem: {
    field: 'glorem',
    value: 'blipsum'
  },
  halorem: {
    field: 'halorem',
    value: 'halipsum'
  }
};
Retrieving nested structures using "getIn" is better.
const data = Immutable.fromJS(dataset[0]);
const firstLevel = data.getIn(['more']);
const twoLevel = firstLevel.getIn([0,'more']);
const threeLevel = twoLevel.getIn([0,'more']);
console.log(firstLevel.toJS(),twoLevel.toJS(),threeLevel.toJS());
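The same lookup could also be done with a single getIn call and a full key path (a sketch; getIn accepts a path across nested Maps and Lists):

// Equivalent to the chained calls above
const threeLevelDirect = data.getIn(['more', 0, 'more', 0, 'more']);
console.log(threeLevelDirect.toJS());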
As for a more generic solution, I re-wrote the previous answer into a recursive approach:
function mapDeep(firstLevel) {
  return firstLevel.map((obj) => {
    if (obj.has('more')) {
      const sec = obj.get('more').toMap().mapKeys((key, value) => value.get('value'));
      const objNext = mapDeep(sec);
      obj = obj.set('more', objNext);
    }
    return obj;
  });
}
The first level still needs to be mapped manually beforehand:
const firstLevel = data.toMap().mapKeys((key, value) => value.get('value'));
const secondLevel = mapDeep(firstLevel);
Again, see it in action: https://jsfiddle.net/9f0djcb0/12/
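If you want to fold the manual first-level step in as well, a small wrapper (a sketch composing the two snippets above) could do it:

// Key any level by its 'value' field, then recurse into 'more'
const keyByValue = level => level.toMap().mapKeys((key, value) => value.get('value'));
const mapAll = data => mapDeep(keyByValue(data));

const result = mapAll(data); // same output as secondLevel above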
This is good enough for me for now. It still feels like this could be solved more cleverly (and more performantly). Cheers :)
So after some time passed I came up with a solution that works for me:
let sec, third, objThird;

// 1st level: simple mapping
const firstLevel = data.toMap().mapKeys((key, value) => value.get('value'));

// 2nd level: walk through updated firstLevel's subobjects and do the mapping again:
const secondLevel = firstLevel.map((obj) => {
  if (obj.has('more')) {
    sec = obj.get('more').toMap().mapKeys((key, value) => value.get('value'));
    // 3rd level: walk through updated secondLevel's subobjects and do the mapping again:
    objThird = sec.map((o) => {
      if (o.has('more')) {
        third = o.get('more').toMap().mapKeys((key, value) => value.get('value'));
        o = o.set('more', third);
      }
      return o;
    });
    obj = obj.set('more', objThird);
  }
  return obj;
});
See it in action here: https://jsfiddle.net/9f0djcb0/7/
This has been working nicely so far, though it is pretty hard-coded. If anyone has a more elegant solution to this, I am happy to learn about it!
My react-redux app is getting a single record in JSON but the record is an array and therefore it looks like this (notice [ ] brackets):
{"person":[{"PersonID":1,"Name":"John Smith","Gender":0}]}
So, the redux store shows it as person->0->{"PersonID":1,"Name":"John Smith","Gender":0}. As such, the state shows that the person object is empty:
Name: this.props.person?this.props.person.Name:'object is empty',
My PersonPage.js includes the details page like this:
<PersonDetail person={this.props.person} />
The details page has this:
import React from 'react';
import classnames from 'classnames';
class PersonDetail extends React.Component {
  state = {
    Name: this.props.person ? this.props.person.Name : '',
    PersonID: this.props.person ? this.props.person.PersonID : null,
    loading: false,
    done: false
  }

  componentWillReceiveProps = (nextProps) => {
    this.setState({
      PersonID: nextProps.person.PersonID,
      Name: nextProps.person.Name
    });
  }
This is my raw Redux state:
people: [
  [
    {
      PersonID: 51,
      Name: 'John Smith',
      Gender: 0
    }
  ]
]
person is an array that contains the object in which the Name key is present, so you need to use an index as well. Write it like this:
this.props.person && this.props.person.length ? this.props.person[0].Name : '';
Check this example:
var data = {
  "person": [
    {
      "PersonID": 1,
      "Name": "John Smith",
      "Gender": 0
    }
  ]
};
console.log('Name: ', data.person[0].Name);
I think that you are supposed to map the person detail for each person's data.
In PersonPage.js, map it as follows:
{
  this.props.person.map((p) => {
    // a key prop helps React track the list items
    return (<PersonDetail key={p.PersonID} person={p} />)
  })
}
If I were you, I would make a util function like this:
const parsePeople = people => {
  if (people instanceof Array) return parsePeople(people.pop())
  return people
}

const people = [
  [{
    PersonID: 51,
    Name: 'John Smith',
    Gender: 0
  }]
]

const person = parsePeople(people)
console.log(person) // -> Object
Using recursion, we check whether people is an instance of Array; if so, we call the function again with people.pop(), which removes and returns the last element of the array.
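If mutating the input is a concern (pop() removes the element from the original array), a non-mutating variant could read the last element instead (just a sketch):

// Same idea, but leaves the original array untouched
const parsePeopleSafe = people =>
  Array.isArray(people) ? parsePeopleSafe(people[people.length - 1]) : people;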
You have an array in your person data... you can only access it without the 0 index by using map...
example:
componentWillReceiveProps = (nextProps) => {
  var PersonID = nextProps.person ? nextProps.person.map(item => { return item.PersonID }) : '';
  var Name = nextProps.person ? nextProps.person.map(item => { return item.Name }) : '';
  this.setState({
    PersonID,
    Name
  });
}
This is assuming you only have one element in the person array.
I fixed it! It was a combination of two of the answers given:
In PersonPage.js, I had to render the PersonDetail component like this:
<PersonDetail
person={this.props.person[0]}
/>
And this is the new MapStatetoProps:
function mapStateToProps(state, props) {
  const { match } = props;
  if (match.params.PersonID) {
    return {
      person: state.people
    }
  }
  // assumed fallback when there is no PersonID param
  return { person: null };
}
Thanks to those who answered. This drove me nuts.
I'm retrieving the following structure from Firebase:
"bills" : {
"1" : { // the customer id
"orders" : {
"-KVMs10xKfNdh_vLLj_k" : [ { // auto generated
"products" : [ {
"amount" : 3,
"name" : "Cappuccino",
"price" : 2.6
} ],
"time" : "00:15:14"
} ]
}
}
}
I'm looking for a way to process this with Aurelia. I've written a value converter that allows my repeat.for to loop over the object keys of orders, sending each order to an order-details component. The problem is that this doesn't pass the key, which I need for deleting a certain order ("-KVMs10xKfNdh_vLLj_k").
Should I loop over each order and add the key as an attribute myself?
Is there a better/faster way?
This answer might be a little late (sorry OP), but for anyone else looking for a solution: you can convert the snapshot to an array that you can iterate in your Aurelia views using a repeat.for, for example.
This is a function that I use in all of my Aurelia + Firebase applications:
export const snapshotToArray = (snapshot) => {
  const returnArr = [];

  snapshot.forEach((childSnapshot) => {
    const item = childSnapshot.val();
    item.uid = childSnapshot.key;
    returnArr.push(item);
  });

  return returnArr;
};
You would use it like this:
firebase.database().ref(`/bills`)
  .once('value')
  .then((snapshot) => {
    const arr = snapshotToArray(snapshot);
  });
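To tie this back to the original question: because each item keeps its Firebase key on uid, a view-model can hold the array for a plain repeat.for and use the uid to delete a specific order. A rough sketch (class, property, and path names are assumptions):

// Hypothetical Aurelia view-model
export class Bills {
  orders = [];

  activate() {
    return firebase.database().ref('/bills/1/orders')
      .once('value')
      .then(snapshot => {
        // each item now carries .uid with the auto-generated key
        this.orders = snapshotToArray(snapshot);
      });
  }

  deleteOrder(uid) {
    // the uid is exactly the key needed to remove the record
    return firebase.database().ref(`/bills/1/orders/${uid}`).remove();
  }
}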