Angular 10 JSON Data Not Working When Adding to MongoDB - json

Hello, I have completed the coding for my fake back end using json-server, and things are working the way I expect for the e-commerce site on the front end. Now I am preparing to use MongoDB for the production back end, using the same db.json file, but trying to import it with Compass does not work. According to the json-server documentation my data is correct.
The error I get when trying to import is "Operations passed in cannot be an array." I also copied my JSON data into a tool that unformats and reformats it with validation, and I get the error "That doesn't appear to be valid JSON. Validation failed with the following error:"
What I don't understand is why this works with Angular 10. All the code I have written so far is based on this db.json file. I also tried adding data into MongoDB manually, see below:
db.collection.insertMany([{
  "products": [
    {
    }
  ]
}])
It takes it, but it never adds my products objects when I view the collection with .find().pretty(). Viewing it in Compass, it just shows products as an array field.
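From the error and what Compass shows, I suspect insertMany wants an array of individual documents rather than one document wrapping the whole array, i.e. something shaped like this (the collection name and the two trimmed sample products are just examples taken from my db.json):

db.products.insertMany([
  { "_id": 1, "name": "Overalls", "price": 250 },
  { "_id": 2, "name": "Red Onepiece", "price": 160 }
])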
Can someone please explain to me what is going on and what I need to do, or point me in the right direction? I'm new to MongoDB and it's not picking up my objects. I have included my db.json file below. As I stated, my entire front end is based on this JSON structure.
db.json
{
"products": [
{
"_id": 1,
"name": "Overalls",
"size": " ",
"description": "Lorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has been the industry's standard dummy text ever since the 1500s, when an unknown printer took a galley of type and scrambled it to make a type specimen book.",
"imageUrl": "http://localhost:4200/assets/overalls.png",
"price": 250
},
{
"_id": 2,
"name": "Red Onepiece",
"size": " ",
"description": "Lorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has been the industry's standard dummy text ever since the 1500s, when an unknown printer took a galley of type and scrambled it to make a type specimen book.",
"imageUrl": "http://localhost:4200/assets/redonepiece-r.png",
"price": 160
},
{
"_id": 3,
"name": "Sport",
"size": " ",
"description": "Lorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has been the industry's standard dummy text ever since the 1500s, when an unknown printer took a galley of type and scrambled it to make a type specimen book.",
"imageUrl": "http://localhost:4200/assets/sportoutfit.png",
"price": 50
},
{
"_id": 4,
"name": "Purple Outfit",
"size": "",
"description": "Lorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has been the industry's standard dummy text ever since the 1500s, when an unknown printer took a galley of type and scrambled it to make a type specimen book.",
"imageUrl": "http://localhost:4200/assets/purpleoutfit4.png",
"price": 100
},
{
"_id": 5,
"name": "Pink To Slick",
"size": "",
"description": "Lorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has been the industry's standard dummy text ever since the 1500s, when an unknown printer took a galley of type and scrambled it to make a type specimen book.",
"imageUrl": "http://localhost:4200/assets/PinkToSlick.png",
"price": 299
},
{
"_id": 8,
"name": "Grey Jumper",
"size": "",
"description": "Lorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has been the industry's standard dummy text ever since the 1500s, when an unknown printer took a galley of type and scrambled it to make a type specimen book.",
"imageUrl": "http://localhost:4200/assets/greyofftheshoulder.png",
"price": 99
},
{
"_id": 9,
"name": "Crop Jacket",
"size": " ",
"description": "Lorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has been the industry's standard dummy text ever since the 1500s, when an unknown printer took a galley of type and scrambled it to make a type specimen book.",
"imageUrl": "http://localhost:4200/assets/slickjacket.png",
"price": 299
},
{
"_id": 10,
"name": "Shear Top",
"size": "",
"description": "Lorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has been the industry's standard dummy text ever since the 1500s, when an unknown printer took a galley of type and scrambled it to make a type specimen book.",
"imageUrl": "http://localhost:4200/assets/sexy.png",
"price": 299
},
{
"_id": 11,
"name": "Blueprint One Piece",
"size": " ",
"description": "Lorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has been the industry's standard dummy text ever since the 1500s, when an unknown printer took a galley of type and scrambled it to make a type specimen book.",
"imageUrl": "http://localhost:4200/assets/Extra-n-Blue.png",
"price": 199
},
{
"_id": 12,
"name": "BabyGirl Jogger",
"size": " ",
"description": "Lorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has been the industry's standard dummy text ever since the 1500s, when an unknown printer took a galley of type and scrambled it to make a type specimen book.",
"imageUrl": "http://localhost:4200/assets/BabyGirl.png",
"price": 99
},
{
"_id": 13,
"name": "Black One Piece",
"size": " ",
"description": "Lorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has been the industry's standard dummy text ever since the 1500s, when an unknown printer took a galley of type and scrambled it to make a type specimen book.",
"imageUrl": "http://localhost:4200/assets/BlacknwhiteOne.png",
"price": 199
}
],
"wishlist": [
{
"id": 25
},
{
"id": 65
},
{
"id": 67
},
{
"id": 70
},
{
"id": 72
},
{
"id": 29
},
{
"id": 35
},
{
"id": 71
},
{
"id": 30
},
{
"id": 27
},
{
"id": 21
},
{
"id": 1
},
{
"id": 2
},
{
"id": 3
},
{
"id": 4
},
{
"id": 6
},
{
"id": 8
}
],
"wishlistitem": [
{
"product": {
"id": 8,
"name": "Grey Jumper",
"size": "",
"description": "Lorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has been the industry's standard dummy text ever since the 1500s, when an unknown printer took a galley of type and scrambled it to make a type specimen book.",
"imageUrl": "http://localhost:4200/assets/greyofftheshoulder.png",
"price": 99
},
"id": 1
}
],
"cart": [
{
"product": {
"id": 2,
"name": "Red Onepiece",
"size": "S",
"description": "Lorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has been the industry's standard dummy text ever since the 1500s, when an unknown printer took a galley of type and scrambled it to make a type specimen book.",
"imageUrl": "http://localhost:4200/assets/redonepiece-r.png",
"price": 160
},
"id": 1
},
{
"product": {
"_id": 8,
"name": "Grey Jumper",
"size": "L",
"description": "Lorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has been the industry's standard dummy text ever since the 1500s, when an unknown printer took a galley of type and scrambled it to make a type specimen book.",
"imageUrl": "http://localhost:4200/assets/greyofftheshoulder.png",
"price": 99
},
"id": 3
}
],
"purses": [
{
"id": 150,
"name": "Black-Louis Vuitton",
"description": "Lorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has been the industry's standard dummy text ever since the 1500s, when an unknown printer took a galley of type and scrambled it to make a type specimen book.",
"imageUrl": "http://localhost:4200/assets/IMG-20211019-WA0003.png",
"price": 250
},
{
"id": 151,
"name": "BlackBlue",
"description": "Lorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has been the industry's standard dummy text ever since the 1500s, when an unknown printer took a galley of type and scrambled it to make a type specimen book.",
"imageUrl": "http://localhost:4200/assets/IMG-20211017-WA0008.png",
"price": 300
},
{
"id": 152,
"name": "White",
"description": "Lorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has been the industry's standard dummy text ever since the 1500s, when an unknown printer took a galley of type and scrambled it to make a type specimen book.",
"imageUrl": "http://localhost:4200/assets/IMG-20211017-WA0013.png",
"price": 300
},
{
"id": 153,
"name": "Brown-Louis Vuitton",
"description": "Lorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has been the industry's standard dummy text ever since the 1500s, when an unknown printer took a galley of type and scrambled it to make a type specimen book.",
"imageUrl": "http://localhost:4200/assets/IMG-20211019-WA0002.png",
"price": 300
},
{
"id": 154,
"name": "Green Small Louis Vuitton",
"description": "Lorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has been the industry's standard dummy text ever since the 1500s, when an unknown printer took a galley of type and scrambled it to make a type specimen book.",
"imageUrl": "http://localhost:4200/assets/IMG-20211019-WA0010.png",
"price": 350
}
],
"shoes": [],
"posts": [
{
"id": 1,
"title": "json-server",
"author": "typicode"
}
],
"comments": [
{
"id": 1,
"body": "some comment",
"postId": 1
}
],
"profile": {
"name": "typicode"
}
}
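For reference, Compass imports one collection at a time and expects either a plain array of documents or newline-delimited documents, not a single object keyed by collection name. One way to prepare this file would be to split each top-level key into its own array file; a minimal Node sketch of that idea (the script and output file names are assumptions):

// split-db.js - sketch only: write each top-level key of db.json to its own
// <key>.json file as a plain array, one file per intended collection
const fs = require('fs');

const db = JSON.parse(fs.readFileSync('db.json', 'utf8'));

for (const [name, value] of Object.entries(db)) {
  // a collection is an array of documents, so wrap single objects like "profile"
  const docs = Array.isArray(value) ? value : [value];
  fs.writeFileSync(`${name}.json`, JSON.stringify(docs, null, 2));
}

Each resulting file (products.json, wishlist.json, and so on) could then be imported into its own collection through Compass, or with mongoimport --jsonArray.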

Related

How to load nested data from a JSON file in React

I have a simple JSON file like this:
[{
"id": "1",
"title": "How to become a best sale marketer in a month!",
"Summary": "Lorm voluptatem reecto, quos amet hic aliquid!",
"dateAdd": "May 9, 2021",
"body": "Lorem ipsum dolor sit amet",
"child": [{
"base": "Lorem ipsum dolor sit ametLorem ipsum dolor sit amet"
}]
},
{
"id": "2",
"title": "SEO trend to look for the best in 2020",
"Summary": "Lorem ipsum",
"dateAdd": "May 9, 2020",
"body": "Lorem ipsum dolor sit amet",
"child": [{
"base": "Lorem ipsum dolor sit ametLorem ipsum dolor sit amet"
}]
}]
This is my blog component that uses the JSON data:
import React from "react";
import { Link } from "react-router-dom";
import data from "../../layout/data.json";
const _Blog = (props: any) => {
return (
<>
<div className="col-md-8">
{data.map((postDetail, index) => {
return (
<article className="blog-post-item">
<div className="post-thumb">
<img
src="assets/images/blog/news-1.jpg"
alt=""
className="img-fluid"
/>
</div>
<div className="post-item mt-4">
<div className="post-meta">
<span className="post-date">
<i className="fa fa-calendar-alt mr-2"></i>{postDetail.dateAdd}
</span>
</div>
<h2 className="post-title">
<a href="blog-single.html">
{postDetail.title}
</a>
</h2>
<div className="post-content">
<p>
{postDetail.body}
</p>
<h5 key={index}>
<Link to={`/blog-detail/${index + 1}`}>More</Link>
</h5>
</div>
</div>
</article>
);
})}
</div>
</>
);
};
export const Blog = _Blog;
It works correctly, but I want to use the data from the child object, so I thought I should use it like this:
{postDetail.child.base}, but it won't load. I'm totally lost at this point. How does {postDetail.title} or {postDetail.body} work but not the child part?
You can't access postDetail.child.base, probably because postDetail.child is an array.
Try accessing it like postDetail.child[0].base.
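If every child entry is needed rather than just the first, mapping over the array works too; a minimal sketch using the field names from the sample data:

{postDetail.child.map((item, i) => (
  <p key={i}>{item.base}</p>
))}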
Hope that helps. Leave a comment if your issue is different and I'll edit my response.
(This should have been a comment, but I don't have enough reputation.)

Parsing a set of JSON objects with jq

I have a big block of JSON I'm trying to parse that looks basically like:
{
"order": [
"hash1",
"hash2"
],
"posts": {
"hash4": {
"id": "hash4",
"message": "lorem ipsem"
},
"hash5": {
"id": "hash5",
"message": "dolor sit amet"
},
"hash6": {
"id": "hash6",
"message": "consectetur adipiscing elit"
}
}
}
The way I've been handling this so far is to just grep for messages
$ grep 'message' jq_dat.json
"message": "lorem ipsem"
"message": "dolor sit amet"
"message": "consectetur adipiscing elit"
This works for my current purposes, but I'd like to know how to get the same effect with jq, i.e.
$ jq .posts.<something>.message < jq_dat.json
"lorem ipsem"
"dolor sit amet"
"consectetur adipiscing elit"
I've tried using [] and {} in place of something, but those both spit back compile errors.
You just have one dot too many:
jq '.posts[].message' < jq_dat.json
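If the surrounding quotes in the output are unwanted, jq's -r flag prints the raw strings:
$ jq -r '.posts[].message' < jq_dat.json
lorem ipsem
dolor sit amet
consectetur adipiscing elit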

How can I add a child node to an existing json file through node js?

I am using ExpressJS to write my application and jsonfile (https://www.npmjs.com/package/jsonfile) to handle JSON files. I have the following JSON file:
{
"news": [
{
"id": "1",
"title": "News 1 heading",
"description": "Lorem ipsum dolor sit amet upidatat non proident, sunt in culpa qui officia deserunt mollit anim id est laborum.",
"dateposted": "00188292929"
},
{
"id": "2",
"title": "News 2 heading",
"description": "Lorem ipsum dolor sit amet",
"dateposted": "00188292929"
}
]
}
Now, I want to add another set of news under the "news" node, so that my final json looks like this:
{
"news": [
{
"id": "1",
"title": "News 1 heading",
"description": "Lorem ipsum dolor sit amet upidatat non proident, sunt in culpa qui officia deserunt mollit anim id est laborum.",
"dateposted": "00188292929"
},
{
"id": "2",
"title": "News 2 heading",
"description": "Lorem ipsum dolor sit amet",
"dateposted": "00188292929"
},
{
"id": "3",
"title": "News 3 heading",
"description": "Lorem ipsum dolor sit amet",
"dateposted": "00188292929"
}
]
}
There is an append flag in jsonfile, but it appends at the end of the file rather than under a given node. How can I append the data under an existing node? Do I need to stringify the JSON, add the data, and JSON-ify it again, or is there a more direct way?
Thanks.
You can use push to append an object to an existing array node. The code would look like this:
var json={
"news": [
{
"id": "1",
"title": "News 1 heading",
"description": "Lorem ipsum dolor sit amet upidatat non proident, sunt in culpa qui officia deserunt mollit anim id est laborum.",
"dateposted": "00188292929"
},
{
"id": "2",
"title": "News 2 heading",
"description": "Lorem ipsum dolor sit amet",
"dateposted": "00188292929"
}
]
};
json.news.push({
"id": "3",
"title": "News 3 heading",
"description": "Lorem ipsum dolor sit amet",
"dateposted": "00188292929"
});
console.log(json);
Jsonfile's append option refers to opening the file in append mode; in that mode you can only add to the end of the file.
You will need to re-write the entire file using the normal writeFile options, effectively overwriting the original file.
You can see in the jsonfile code on line 91 (it is a short, single-file node module) that it simply passes the append flag through to fs.writeFile. I'm not entirely sure when you would use this, in all honesty, but I'm assuming it's for when you want to output a bunch of documents and then append some JSON at the bottom of each.
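Putting the two answers together, a minimal read-modify-write sketch with jsonfile might look like this (the file path and the new entry are placeholders):

var jsonfile = require('jsonfile');
var file = 'news.json'; // placeholder path

jsonfile.readFile(file, function (err, obj) {
  if (err) throw err;
  // append the new entry under the existing "news" node
  obj.news.push({
    "id": "3",
    "title": "News 3 heading",
    "description": "Lorem ipsum dolor sit amet",
    "dateposted": "00188292929"
  });
  // write the whole updated object back, overwriting the original file
  jsonfile.writeFile(file, obj, { spaces: 2 }, function (err) {
    if (err) throw err;
  });
});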

Facebook Messenger bot, returning multiple messages or payloads

I'm looking to return multiple responses to a user. For instance this might be an image and a text block, or a text block and a list.
So far I've not been able to find a way of doing this; everything I try either results in one of the payloads not displaying, or fails completely.
Here's an example of an attempt at displaying a text block and a list:
{
speech:"myMessage",
displayText:"myMessage",
data:{
facebook:{
"attachment": {
"type": "template",
"payload": {
"template_type": "list",
"top_element_style": "compact",
"elements": [
{
"title": "£10",
"image_url": "http://example.com/example.jpg",
"subtitle": "An amazing t-shirt"
},
{
"title": "£30",
"image_url": "http://example.com/example.jpg",
"subtitle": "Another amazing t-shirt"
},
{
"title": "£40",
"image_url": "http://example.com/example.jpg",
"subtitle": "An amazing t-shirt"
}
]
}
}
}
},
contextOut:[],
source:"webhook"
}
Any ideas on where I'm going wrong?
Each message is separate, but you can send a batch request to the graph API to dispatch all the messages with a single API call:
https://developers.facebook.com/docs/graph-api/making-multiple-requests/
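For illustration, that batch endpoint takes a batch parameter holding an array of requests, each of which is one Send API call; a rough sketch for a text message plus the list template (the recipient id is a placeholder and the exact body encoding should be checked against the linked docs):

var batch = [
  {
    method: 'POST',
    relative_url: 'me/messages',
    body: 'recipient={"id":"USER_ID"}&message={"text":"myMessage"}'
  },
  {
    method: 'POST',
    relative_url: 'me/messages',
    // the attachment here would be the list template payload from the question
    body: 'recipient={"id":"USER_ID"}&message={"attachment":...}'
  }
];
// POST the JSON-encoded array as the `batch` parameter to https://graph.facebook.com,
// together with the page access token.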

How to build a complex JSON document from three tables in MongoDB using Pentaho

I have three tables in one database.
These tables are linked by foreign keys: Table1 is the master of Table2, and Table2 is the master of Table3.
I want to get the data values and transform them into a MongoDB document like this:
{
"_id" : ObjectId("cf3977abf592d19962ff7982"),
"T1Column1" : "Lorem Ipsum",
"T1Column2" : ISODate("2015-11-27T16:04:24.000Z"),
"Table2" : [
{
"T2Column1" : NumberLong(1),
"T2Column2" : "Lorem Ipsum",
"Table3" : [
{
"T3Column1" : "Lorem Ipsum",
"T3Column2" : "Lorem Ipsum"
},
{
"T3Column1" : "Lorem Ipsum",
"T3Column2" : "Lorem Ipsum"
}
]
},
{
"T2Column1" : NumberLong(2),
"T2Column2" : "Lorem Ipsum",
"Table3" : [
{
"T3Column1" : "Lorem Ipsum1",
"T3Column2" : "Lorem Ipsum"
},
{
"T3Column1" : "Lorem Ipsum2",
"T3Column2" : "Lorem Ipsum"
}
]
}
]
}
I already tried to use "Mongo document path" in the MongoDB Output step, but it doesn't seem possible to use "upsert" for subdocuments, as we can see in the MongoDB Output documentation.
How can I do this using Pentaho Data Integration (PDI)?
Try adding the modifier option "$addToSet".
To insert into MongoDB using the PDI MongoDB Output step, the trick is in the 'Mongo Document Path' column. Put brackets [] at the end of the field path where you want the array (see data.labels[].id in the screenshot below), and use $set as the modifier operation.
You can also use a pointer if you want the data at a specific array position: use [n] at the end of the field path (see tags[0], tags[1], tags[2] in the screenshot below). Note that array indexing begins at 0.
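Applied to the three-table structure in the question, the 'Mongo document path' entries might look roughly like this (an untested sketch; whether the step accepts two levels of nested array paths like this should be verified):

T1Column1  ->  T1Column1
T1Column2  ->  T1Column2
T2Column1  ->  Table2[].T2Column1
T2Column2  ->  Table2[].T2Column2
T3Column1  ->  Table2[].Table3[].T3Column1
T3Column2  ->  Table2[].Table3[].T3Column2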