How to updateItem for a Map dataType in DynamoDB - JSON

I am trying to use the Map data type in DynamoDB to insert my JSON object. The JSON I am getting from an external API is fairly long and has a nested array of objects in it. (I am using Node.js.)
{
    "a": "foo",
    "b": "foo1",
    "c": [{
        "boo": 10,
        "boo1": 15
    }, {
        "boo": 19,
        "boo1": 45
    }, {
        "boo": 11,
        "boo1": 25
    }]
}
From the research I have done so far, it looks like I have to specify types for every single element in the JSON I am trying to insert/update. That makes things harder, since in my case the JSON could contain anything.
If anyone has experienced the same issue and knows a solution, please let me know.

You need to specify the exact type for every value only if you use the low-level DynamoDB API.
If you use the AWS SDK's DocumentClient instead, things are much easier and you can use plain JSON notation directly.
I haven't used the Node.js SDK (my experience is with the Python SDK), but judging from the examples, this is true for Node.js too.
Check this one:
var AWS = require("aws-sdk");

AWS.config.update({
    region: "us-west-2",
    endpoint: "http://localhost:8000"
});

var docClient = new AWS.DynamoDB.DocumentClient();

var table = "Movies";
var year = 2015;
var title = "The Big New Movie";

var params = {
    TableName: table,
    Item: {
        "year": year,
        "title": title,
        "info": {
            "plot": "Something happens."
        }
    }
};

console.log("Adding a new item...");
docClient.put(params, function(err, data) {
    if (err) {
        console.error("Unable to add item. Error JSON:", JSON.stringify(err, null, 2));
    } else {
        console.log("Added item:", JSON.stringify(data, null, 2));
    }
});
Here the Item inside the params passed to the docClient.put call is just a plain JavaScript object.
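Since the question asks about updateItem specifically: the DocumentClient's update call accepts the same plain-object notation inside ExpressionAttributeValues, so a nested map/list needs no type annotations either. A minimal sketch, where the table name, key, and the "info" attribute name are assumptions carried over from the example above:

```javascript
// Sketch: update a Map attribute ("info") with plain JSON via the DocumentClient.
// Table and key names are assumptions; no per-value DynamoDB types are needed.
var params = {
    TableName: "Movies",
    Key: { "year": 2015, "title": "The Big New Movie" },
    UpdateExpression: "set info = :info",
    ExpressionAttributeValues: {
        ":info": {
            "a": "foo",
            "b": "foo1",
            "c": [{ "boo": 10, "boo1": 15 }, { "boo": 19, "boo1": 45 }]
        }
    },
    ReturnValues: "UPDATED_NEW"
};

// docClient.update(params, callback) would send this as-is; the DocumentClient
// marshals the nested plain object into DynamoDB Map/List types automatically.
console.log(params.ExpressionAttributeValues[":info"].c.length);
```

Passing these params to docClient.update works exactly like the put example: the DocumentClient does the type-mapping for you.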

It's necessary for DynamoDB to know what type of data you are inserting/updating, so DynamoDB (like any database) only allows the data type that is specific to that attribute. If you are looking to parse the JSON (with the Jackson API) into a specific type and insert it, below is the code:
JsonParser parser = new JsonFactory().createParser(
        new File("target/resources/json_file.json"));
JsonNode rootNode = new ObjectMapper().readTree(parser);

Iterator<JsonNode> iter = rootNode.iterator();
ObjectNode currentNode;
while (iter.hasNext()) {
    currentNode = (ObjectNode) iter.next();
    table.putItem(new Item()
            .withPrimaryKey("a", currentNode.path("a").asText(),
                            "b", currentNode.path("b").asText())
            .withJSON("c", currentNode.path("c").toString()));
}

Related

How to find the length of the JSON in flutter

I'm using an API as a backend which returns a JSON file. I need to use this JSON data for a dropdown button in Flutter. The JSON data that I received is here.
[
    {
        "store": "AMAZON"
    },
    {
        "store": "FLIPKART"
    },
    {
        "store": "WALMART"
    },
    {
        "store": "ALIBABA"
    }
]
The length of the JSON may vary from time to time.
In Flutter I decode this JSON with these lines:
stores = json.decode(response.body);
print(stores[0]['store']);
I need to know the length of the JSON to use that data in a dropdown button. If there is some other way to use the JSON directly in a dropdown, please suggest that too.
You can get the length of the decoded JSON like this:
stores = json.decode(response.body);
final length = stores.length;
And build the list of items to use in a dropdown widget:
List<String> items = [];
stores.forEach((s) => items.add(s["store"]));

new DropdownButton<String>(
    items: items.map((String value) {
        return new DropdownMenuItem<String>(
            value: value,
            child: new Text(value),
        );
    }).toList(),
    onChanged: (_) {},
)
If your decoded JSON is a map (a single JSON object) rather than a list, you can get its length by casting it to a map:
stores = json.decode(response.body);
(stores as Map<String, dynamic>).length
Let me know whether this works.
After decoding your JSON into stores, your stores is a list of JSON objects, so you can simply get its length like this:
stores = json.decode(response.body);
int len = stores.length;
print('length = $len');

How to insert a new object at the first index of a JSON array in Flutter

I'm developing an app using Flutter that fetches data from the internet as a JSON list array. The newest item needs to show first, so when there is a new object I need to insert it at index 0. How can I do this?
This is my function that calls the API:
Future<List<Hall>> fetchHall() async {
    String token = await read();
    final String url = 'listhall';
    String Fullurl = Serveurl + url;
    var response = await http.get(Fullurl,
        headers: {
            HttpHeaders.contentTypeHeader: "application/json",
            HttpHeaders.authorizationHeader: "Bearer $token"
        });
    print('Token :$token');
    if (response.statusCode == 200) {
        List<Hall> list = [json.decode(response.body)];
        list.insert(0, Hall());
        return list.map((m) => new Hall.fromjson(json.decode(response.body))).toList();
    } else {
        print(response.statusCode);
        throw Exception('Failed to load data from Server.');
    }
}
My JSON:
[
    {
        "id": 42,
        "image_path": "https://fathomless-brushlands-95996.herokuapp.com/Imaga_halls/1583971227.jpg",
        "hall_details": "*********"
    },
    {
        "id": 52,
        "image_path": "https://fathomless-brushlands-95996.herokuapp.com/Imaga_halls/1584390666.jpg",
        "hall_details": " Could anyone please help me with this. I am stuck here. Have been trying different methods but none working. Thank you"
    },
    {
        "id": 62,
        "image_path": "https://fathomless-brushlands-95996.herokuapp.com/Imaga_halls/1584453580.jpg",
        "hall_details": "Could anyone please help me with this. I am stuck here. Have been trying different methods but none working. Thank you."
    }
]
The operation is fine, but it's necessary to convert the JSON data into Dart objects first. Try the following:
final data = json.decode(response.body);
List<Hall> list = List<Hall>.from(data.map((rawHall) => Hall.fromjson(rawHall)));
list.insert(0, Hall());
print(list);

How to post into a MongoDB collection using Postman

I am trying to insert data into a MongoDB collection using Postman. How would I approach this? Hardcoding the data in JSON format works, but I want to be able to insert it from Postman in JSON format.
This is the code that lets me insert directly, using the POST function in Postman, with no input:
public async void Insert([FromBody] string value)
{
    var client = new MongoClient();
    var dbs = client.GetDatabase("test");
    var collection = dbs.GetCollection<BsonDocument>("restaurants");

    BsonArray dataFields = new BsonArray {
        new BsonDocument { { "ID", ObjectId.GenerateNewId() }, { "_id", "" } }
    };

    var document = new BsonDocument
    {
        { "borough", value },
        { "cuisine", "Korean" },
        { "name", "Bellaa" },
        { "restaurant_id", "143155" }
    };

    await collection.InsertOneAsync(document);
}
You can send it as raw data: in Postman, set the body type to raw and the content type to application/json.
This comes from the docs.
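One detail worth noting for this particular action: because the parameter is bound as [FromBody] string value, the default JSON input formatter expects the raw body to be a JSON string literal (quotes included), not a JSON object. A quick sketch of what that raw body looks like (the value "Queens" is just an illustration):

```javascript
// For [FromBody] string value, the raw JSON body is a quoted string literal,
// i.e. "Queens" rather than { "value": "Queens" }.
var body = JSON.stringify("Queens");
console.log(body); // prints "Queens" including the surrounding quotes

// Round-tripping shows the server-side binder recovers the plain string:
console.log(JSON.parse(body));
```

So in Postman you would paste "Queens" (with the quotes) into the raw body field.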

Reformat API data to Ember friendly array of objects | Ember data with unconventional endpoint

This is a question about molding some API data to fit some needs - I've heard it called "munging." The heart of it is really re-formatting some JSON, but it would be ideal to do it the Ember Data way...
I'm getting this data in an Ember.js setting - but it shouldn't really matter whether it comes via ajax, ic-ajax, fetch, etc... I'm getting some data:
...
model: function() {
    var libraryData = ajax({
        url: endPoint,
        type: 'GET',
        dataType: 'jsonp'
    });
    // or most likely the ember-data way
    // this.store.findAll(...
    console.log(libraryData);
    return libraryData;
}
...
The URL is getting me something like this:
var widgetResults = {
    "settings": {
        "amazonchoice": null,
        "show": {
            "showCovers": null,
            "showAuthors": null
        },
        "style": null,
        "domain": "www.librarything.com",
        "textsnippets": {
            "by": "by",
            "Tagged": "Tagged",
            "readreview": "read review",
            "stars": "stars"
        }
    },
    "books": {
        "116429012": {
            "book_id": "116429012",
            "title": "The Book of Three (The Chronicles of Prydain Book 1)",
            "author_lf": "Alexander, Lloyd",
            "author_fl": "Lloyd Alexander",
            // ...
The promise that is actually returned is slightly different.
My goal is to get at those books and iterate over them - but instead I get the error: The value that #each loops over must be an Array. You passed {settings: [object Object], books: [object Object]} - which makes sense.
In an ideal API the endpoint would be http://site.com/api/v2/books
and retrieve the data in this format:
[
    {
        "book_id": "116428944",
        "title": "The Phantom Tollbooth",
        "author_lf": "Juster, Norton",
        "author_fl": "Norton Juster",
        ...
    },
    {
        "book_id": "116428944",
        "title": "The Phantom Tollbooth",
        "author_lf": "Juster, Norton",
        "author_fl": "Norton Juster",
        ...
    },
    {
        ... etc.
I would expect to just drill down with dot notation, or to use some findAll() - but I'm just shooting in the dark. LibraryThing in particular is almost done with their new API - but they suggest that I should be able to loop through this data and reformat it in an Ember-friendly way. I have looped through it and returned an array in this codepen - but haven't had luck porting it... something about the returned promise is mysterious to me.
How should I go about this? Am I pointed in the wrong direction?
I've tried using the RESTAdapter - but didn't have much luck dealing with more unconventional endpoints.
Custom adapters / serializers?
This article just appeared: "Fit any backend into Ember with custom adapters and serializers"
Full URL with endpoint in question
model (just title to test)
import DS from 'ember-data';

export default DS.Model.extend({
    title: DS.attr('string')
});
route ( per #Artych )
export default Ember.Route.extend({
    model() {
        $.ajax({
            url: endPoint,
            type: 'GET',
            dataType: 'jsonp'
        }).then((widgetResults) => {
            // modify payload to RESTAdapter
            var booksObj = widgetResults.books;
            var booksArray = Object.keys(booksObj).map((element) => {
                var book = booksObj[element];
                book.id = book.book_id;
                delete book.book_id;
                return book;
            });
            console.log(booksArray);
            this.store.pushPayload({books: booksArray});
        });
        return this.store.peekAll('book');
    }
});
template
{{#each model as |book|}}
    <article>
        <h1>{{book.title}}</h1>
    </article>
{{/each}}
There is a straightforward solution: process your payload in the model() hook.
Define a book model.
Process your payload in the model() hook:
model() {
    $.ajax({
        url: endPoint,
        type: 'GET',
        dataType: 'jsonp'
    }).then((widgetResults) => {
        // modify payload to RESTAdapter
        var booksObj = widgetResults.books;
        var booksArray = Object.keys(booksObj).map((element) => {
            var book = booksObj[element];
            book.id = book.book_id;
            delete book.book_id;
            return book;
        });
        this.store.pushPayload({books: booksArray});
    });
    return this.store.peekAll('book');
}
Iterate over the model in your controller or template as usual.
Working jsbin:
ember 1.13
ember 2.0
You want a custom serializer to translate the data from that format into JSON-API. JSON-API is an extremely well thought-out structure, so well in fact that ember-data has adopted it as the default format used internally. Some of the benefits are that it defines a structure for objects themselves, separating attributes from relationships; a means for embedding or including associated resources; defines a place for errors and other metadata.
In short, for whatever you're trying to do, JSON-API probably has already done a lot of the decision-making for you. And, by subclassing from DS.JSONSerializer, you'll be mapping right into the format that ember-data needs.
To do this, you create a custom serializer using ember generate serializer books:
// app/serializers/book.js
import DS from 'ember-data';

export default DS.JSONSerializer.extend({
    normalizeResponse(store, primaryModelClass, payload, id, requestType) {
        // payload will contain your example object
        // You should return a JSON-API document
        const doc = {};
        // ...
        return doc;
    }
});
For your example data, the output of the normalization should look something like this:
{
    "data": [
        {
            "type": "books",
            "id": 116429012,
            "attributes": {
                "title": "The Book of Three (The Chronicles of Prydain Book 1)",
                "author_lf": "Alexander, Lloyd",
                "author_fl": "Lloyd Alexander"
            }
        },
        {
            "type": "books",
            "id": 1234,
            "attributes": {
            }
        }
    ],
    "meta": {
        "settings": {
            "amazonchoice": null,
            "show": {
                "showCovers": null,
                "showAuthors": null
            },
            "style": null,
            "domain": "www.librarything.com",
            "textsnippets": {
                "by": "by",
                "Tagged": "Tagged",
                "readreview": "read review",
                "stars": "stars"
            }
        }
    }
}
Then do:
this.get('store').findAll('books').then((books) => {
    const meta = books.get('meta');
    console.log(meta.settings.domain);
    books.forEach((book) => {
        console.log(book.get('title'));
    });
});
Code is not tested, but hopefully it gets you started.
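As one possible way to fill in that normalizeResponse body, the keyed books object from the payload can be mapped into the JSON-API data array like this (field names are taken from the example payload above; this is a sketch, not a tested serializer):

```javascript
// Sketch: turn the widget payload's keyed "books" object into a JSON-API document.
// The payload shape mirrors the example above; everything but book_id becomes an attribute.
var widgetResults = {
    settings: { domain: "www.librarything.com" },
    books: {
        "116429012": {
            book_id: "116429012",
            title: "The Book of Three (The Chronicles of Prydain Book 1)",
            author_lf: "Alexander, Lloyd",
            author_fl: "Lloyd Alexander"
        }
    }
};

var data = Object.keys(widgetResults.books).map(function (key) {
    var book = widgetResults.books[key];
    var attributes = {};
    Object.keys(book).forEach(function (field) {
        if (field !== "book_id") { attributes[field] = book[field]; }
    });
    // book_id becomes the resource id; the rest becomes attributes.
    return { type: "books", id: book.book_id, attributes: attributes };
});

var doc = { data: data, meta: { settings: widgetResults.settings } };
console.log(doc.data[0].id);
```

Inside the serializer you would build doc the same way from payload and return it.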
Define settings and book models. Arrange for the API to respond to the endpoint /books returning data in the format:
{
    settings: { ... },
    books: [
        {
            id: xxx,
            ...
        }
    ]
}
Retrieve the data in the model hook with this.store.findAll('book').
Iterate over the books in your template with {{#each model as |book|}}.

Transform a file's full content in Node.js

I am building a website with Node.js which asks for a data file to be uploaded; I then have to check and (if needed) transform the content of this file.
The source file is a JSON or XML configuration file, and I just need to ensure its content is well formatted for the rest of the application.
I am wondering what the best way to check the whole file's content would be.
I usually manipulate files with streams, but I am not sure they allow me to do what I want...
The source file has a format similar to this:
{
    "parameters": [{
        "name": "name",
        "settings": {
            "key": "value"
        }
    }],
    "data": [{
        "id": "1",
        "label": "label 1"
    }, {
        "id": "2",
        "label": "label 2"
    }]
}
What I need to do is parse the file's content and check whether the format is good;
otherwise, transform the file into a well-formatted one:
// Read the file content
var parameters = [],
    data = [],
    p = parameters.length,
    d = data.length;

// Loop on the parameters, and check the format
while (p--) {
    var parameter = parameters[p];
    if ("name" in parameter && typeof parameter.name == "string") {
        // Add several rules
        parameters.push(parameter);
    }
}
// Do a similar control for "data".
// Then save the well-formatted parameters and data into a file
The thing is that the uploaded file might be very large...
Can I do this with Transform streams? Because I need to check the full content of the file as an object!
How can I be sure a stream transformer won't give me a chunk containing just part of the data, for instance?
I'd first try something like this:
var fs = require('fs');

try {
    var inputFile = require('./config.json');
} catch (e) {
    console.log(e.message); // Do proper error handling.
}

// Loop on the parameters, and check the format
if (!('parameters' in inputFile)) {
    console.log("Got a problem here!");
}

var parameters = [];
var p = inputFile['parameters'].length;
while (p--) {
    var parameter = inputFile['parameters'][p];
    if ('name' in parameter && typeof parameter.name == 'string') {
        // Add several rules
        parameters.push(parameter);
    }
}

// Do a similar control for "data".
var data = inputFile['data'];
// More code needed here...

// Then save the well-formatted parameters and data into a file
fs.writeFileSync('./data.json', JSON.stringify({parameters: parameters, data: data}, null, 4), 'utf-8');
If you are dealing with mammoth files that cannot fit into memory, you've got a HUGELY more difficult task ahead of you. In general, you cannot guarantee that a partial read will give you enough of the JSON to parse anything out of (e.g. {"data": ["<FOUR PETABYTE STRING>"]}).
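For the in-memory case, the per-collection checks can be factored into a single reusable pass. A minimal sketch, assuming the rule set is "parameters need a string name, data entries need a string id and label" (the rules themselves are placeholders for your real ones):

```javascript
// Sketch: filter a parsed config object down to its well-formatted entries.
// The validation rules shown here are placeholder assumptions.
function normalizeConfig(input) {
    var parameters = (input.parameters || []).filter(function (p) {
        return p && typeof p.name === "string";
    });
    var data = (input.data || []).filter(function (d) {
        return d && typeof d.id === "string" && typeof d.label === "string";
    });
    return { parameters: parameters, data: data };
}

// Entries missing required fields are dropped:
var result = normalizeConfig({
    parameters: [{ name: "name", settings: { key: "value" } }, { settings: {} }],
    data: [{ id: "1", label: "label 1" }, { id: "2" }]
});
console.log(result.parameters.length, result.data.length);
```

The earlier write then collapses to fs.writeFileSync('./data.json', JSON.stringify(normalizeConfig(inputFile), null, 4)).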