I am getting an "invalid type (struct)" error while interacting with a Solidity smart contract

Here is the error message:
Uncaught (in promise) Error: invalid type (argument="type", value="Proof", code=INVALID_ARGUMENT, version=abi/5.7.0)
Here is the Solidity function:
function createOffer(
    address _requestorAddress,
    address[] memory _requestedNftAddress,
    uint[] memory _requestedNftId,
    address[] memory _offeredNftAddress,
    uint[] memory _offeredNftId,
    Proof[] memory _proofsRequested,
    Proof[] memory _proofsOffered
)
Here is the struct:
struct Proof {
    address nftAddress;
    bytes32[] proof;
}
Here is the TypeScript code:
const tx = await contractWithSigner.createOffer(
  address,
  [counter_party_nfts_addresses[0]],
  [counter_party_nft_ids[0]],
  [my_nft_addresses[0]],
  [my_nft_ids[0]],
  [
    [
      counter_party_nfts_addresses[0],
      getMerkelProof(counter_party_nfts_addresses).proof,
    ],
  ],
  [[my_nft_addresses[0], getMerkelProof(my_nft_addresses).proof]]
);
and here is the getMerkelProof function:
const getMerkelProof = (addresses: string[]) => {
  const leaf = keccak256(addresses[0]);
  const leaves = addresses.map((x: any) => keccak256(x));
  const tree = new MerkleTree(leaves, keccak256, { sortPairs: true });
  const proof = tree.getProof(leaf, 0);
  const root = tree.getRoot().toString("hex");
  return {
    proof,
    root,
  };
};
The struct takes two fields, the nftAddress first and the Merkle proof second, i.e.
[[address, proof]]
but I am unable to find the correct way to pass it.

The
Proof[] memory _proofsRequested,
Proof[] memory _proofsOffered
argument types are arrays of Proof structs, but you are passing arrays of arrays:
[
  [
    counter_party_nfts_addresses[0],
    getMerkelProof(counter_party_nfts_addresses).proof,
  ],
],
[[my_nft_addresses[0], getMerkelProof(my_nft_addresses).proof]]
Instead, convert each inner array to an object whose keys match the struct's field names:
[
  {
    nftAddress: counter_party_nfts_addresses[0],
    proof: getMerkelProof(counter_party_nfts_addresses).proof,
  },
],
[
  {
    nftAddress: my_nft_addresses[0],
    proof: getMerkelProof(my_nft_addresses).proof,
  },
]
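Putting it together, the full call would look something like this (a sketch: ethers encodes a struct from a plain object whose keys match the struct's field names; note too that each bytes32 proof element must be a 0x-prefixed hex string, so you would likely want merkletreejs's tree.getHexProof(leaf) rather than tree.getProof, which returns {position, data} objects):
// Sketch of the corrected call. Assumes getMerkelProof is adjusted to return
// hex-string proofs (tree.getHexProof(leaf)), since ethers expects bytes32[]
// elements as 0x-prefixed hex strings.
const tx = await contractWithSigner.createOffer(
  address,
  [counter_party_nfts_addresses[0]],
  [counter_party_nft_ids[0]],
  [my_nft_addresses[0]],
  [my_nft_ids[0]],
  [
    {
      nftAddress: counter_party_nfts_addresses[0],
      proof: getMerkelProof(counter_party_nfts_addresses).proof,
    },
  ],
  [
    {
      nftAddress: my_nft_addresses[0],
      proof: getMerkelProof(my_nft_addresses).proof,
    },
  ]
);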

How to pass arguments to a contract's constructor for initializing character images and attributes

I have a contract with a constructor like this:
constructor(
    string[] memory characterNames,
    string[] memory characterImageURIs,
    uint256[] memory characterHp,
    uint256[] memory characterAttackDmg,
    string memory bossName,
    string memory bossImageURI,
    uint256 bossHp,
    uint256 bossAttackDamage
) ERC721("Heroes", "HERO") {
    for (uint256 i = 0; i < characterNames.length; i += 1) {
        defaultCharacters.push(
            CharacterAttributes({
                characterIndex: i,
                name: characterNames[i],
                imageURI: characterImageURIs[i],
                hp: characterHp[i],
                maxHp: characterHp[i],
                attackDamage: characterAttackDmg[i]
            })
        );
        CharacterAttributes memory c = defaultCharacters[i];
        console.log(
            "Done initializing %s w/ HP %s, img %s",
            c.name,
            c.hp,
            c.imageURI
        );
    }
    bigBoss = BigBoss({
        name: bossName,
        imageURI: bossImageURI,
        hp: bossHp,
        maxHp: bossHp,
        attackDamage: bossAttackDamage
    });
    console.log(
        "Done initializing boss %s w/ HP %s, img %s",
        bigBoss.name,
        bigBoss.hp,
        bigBoss.imageURI
    );
    _tokenIds.increment();
}
Question
For the character names, HP, and attack values, how should I type them: as JSON, as arrays, or something else?
For the imageURIs, how can I pass them to the constructor, given that I'm using Pinata IPFS to host my images?
Please help me with this if you can.
1- The types for the constructor arguments are already defined:
string[] = array of strings
string = string
uint (uint256) = 256-bit unsigned integer, ranging from 0 to 2²⁵⁶ − 1
2- Constructor arguments are the parameters you pass when you create the contract, in order to initialize it. So you should already have them in hand.
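For example, deploying with Hardhat and ethers might look like this (a sketch; the contract name and all values are illustrative, not taken from your project):
// Minimal deployment sketch (assumes Hardhat + ethers; names and values are illustrative).
import { ethers } from "hardhat";

async function main() {
  // "MyGame" is a hypothetical contract name; replace it with your own
  const Game = await ethers.getContractFactory("MyGame");
  const game = await Game.deploy(
    ["Warrior", "Mage", "Archer"],   // characterNames: string[]
    [
      "ipfs://<cid>/warrior.jpeg",   // characterImageURIs: Pinata gives you the CID after upload
      "ipfs://<cid>/mage.jpeg",
      "ipfs://<cid>/archer.jpeg",
    ],
    [100, 200, 300],                 // characterHp: uint256[]
    [25, 50, 75],                    // characterAttackDmg: uint256[]
    "Big Boss",                      // bossName
    "ipfs://<cid>/boss.jpeg",        // bossImageURI
    10000,                           // bossHp
    50                               // bossAttackDamage
  );
  await game.deployed();
  console.log("Deployed to:", game.address);
}

main().catch(console.error);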
You store the metadata in IPFS; an example would look like this:
{
  "name": "name",
  "description": "description",
  "image": "ipfs://cid/name.jpeg"
}
So when you work on the front-end, you make a request to the IPFS URI to get the metadata:
// make a request with the axios library
// this will store the above JSON data inside `metaData.data`
const metaData = await axios.get(tokenUri)
const data = metaData.data
From the metadata, you get the image URI as data.image.

Casting fails when deserializing JSON

Please take into account that this question is about TypeScript, not vanilla JavaScript.
I am trying to deserialize a very simple JSON string into a TypeScript object and then cast it to the correct type.
After casting at const obj = <FooClass>JSON.parse(s), I would expect the typeof operator to return FooClass. Why does the operator still return object?
Why does the cast fail here? How can I deserialize and still have access to somefunc?
Example code:
class FooClass {
  public baz = 0
  public somefunc() {
    return this.baz * 2
  }
}
const jsonData = {
  baz: 1234,
}
test('deserialize example', () => {
  const s = JSON.stringify(jsonData)
  const obj = <FooClass>JSON.parse(s) // Cast here
  console.log(typeof obj) // typeof still returns object
  console.log(obj)
  console.log(obj.somefunc())
})
Output:
console.log
  object
    at Object.<anonymous> (tests/deserialize.test.ts:15:11)
console.log
  { baz: 1234 }
    at Object.<anonymous> (tests/deserialize.test.ts:17:11)
TypeError: obj.somefunc is not a function
In TypeScript you can cast any (the return type of JSON.parse) to anything. The responsibility of ensuring that the cast is "correct", and that the casted value indeed matches the type it is being cast to, is yours.
Casting only tells the type checker how to treat that value from the point of the cast onward.
Turning that object into an instance of your class is also your responsibility. You could, however, do something like this:
type Foo = {
  baz: number
}
class FooClass {
  public baz: number = 0
  constructor(input: Foo) {
    this.baz = input.baz
  }
  public somefunc() {
    return this.baz * 2
  }
}
const rawFoo = JSON.parse(s) as Foo
const fooClassInstance = new FooClass(rawFoo)
// ready to be used as an instance of FooClass
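Note also that typeof would print "object" even for a genuine FooClass instance: typeof only distinguishes primitives, functions, and objects, and never returns a class name. To check the class at runtime, use instanceof (a quick illustration):
// typeof never yields a class name; instanceof checks the prototype chain.
const real = new FooClass({ baz: 1 })
console.log(typeof real)                 // "object"
console.log(real instanceof FooClass)    // true

const parsed = JSON.parse('{"baz":1234}') as Foo
console.log(parsed instanceof FooClass)  // false; it is a plain object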
For completion, I copy here a solution that I find both efficient and clean:
test('casting fails example', () => {
  const s = JSON.stringify(jsonData)
  const obj = Object.assign(new FooClass(), JSON.parse(s))
  console.log(typeof obj)
  console.log(obj)
  console.log(obj.somefunc())
})
This could later be improved using generics:
function deserializeJSON<T>(c: { new (): T }, s: string): T {
  return Object.assign(new c(), JSON.parse(s))
}
const obj = deserializeJSON(FooClass, s)

TypeScript: parse raw data into interface

In TypeScript (specifically React with hooks), I'm trying to parse some URL hash data from an OAuth callback and utilize it in my components.
I'm able to parse my data by calling window.location.hash
const hash = window.location.hash.substr(1);
const oauthData = hash.split('&')
  .map(v => v.split('='))
  .reduce((pre, [key, value]) => (
    key == 'scope' ? {...pre, [key]: value.split('+')} : {...pre, [key]: value}
  ), {});
which gives me an object like this:
{
  "access_token": "eyJhbGciOiJIUzI1NiJ9.eyJhdWQiOiIyMkJCWVkiLCJzdWIiOiI1TkZCTFgiLCJpc3MiOiJGaXRiaXQiLCJ0eXAiOiJhY2Nlc3NfdG9rZW4iLCJzY29wZXMiOiJyc29jIHJhY3QgcnNldCBybG9jIHJ3ZWkgcmhyIHJudXQgcnBybyByc2xlIiwiZXhwIjoxNTc4NTQ3NzkxLCJpYXQiOjE1NzgyMDQzOTF9.qLl0L5DthFu3NxeLodotPsPljYMWgw1AvKj2_i6zilU",
  "user_id": "5NFBLX",
  "scope": [
    "heartrate",
    "nutrition",
    "location",
    "sleep",
    "activity",
    "weight",
    "social",
    "profile",
    "settings"
  ],
  "token_type": "Bearer",
  "expires_in": "343400"
}
Awesome! Now I want to pass all this information into my component, and this is where things get a little haywire: I can't figure out how to get this data into my component without breaking type-safety.
My component is built like this
export interface IOAuthProps {
  accessToken: string
  userID: string
  scope: string[]
  expiresIn: number
}
const OAuthFun: React.FC<IOAuthProps> = (props) => {
  const [ac] = useState(props.accessToken)
  return (
    <div>
      access token = {ac}
    </div>
  )
}
export default OAuthFun;
I've tried these permutations of what seem like the same thing (I'll omit the additional properties for brevity):
Nonworking example: can't even index oauthData because it is of type {}
<OAuthFun accessToken={oauthData['access_token'] as string}/>
Since I couldn't even index the raw json object as a dictionary, I figured I needed to create some type safety on the object getting constructed:
const oauthData = hash.split('&')
  .map(v => v.split('='))
  .reduce((pre, [key, value]) => (
    key == 'scope' ? {...pre, [key]: value.split('+')} : {...pre, [key]: value}
  ), {access_token: String, user_id: String, scope: [], expires_in: Number});
However, this breaks the expression inside my reduce call: No overload matches this call. Which leads me to believe that I need some more concise manner of parsing the raw data, but I'm really unsure how to do that.
I imagine I could cast the raw data directly to the interface, but the raw data uses underscore_casing rather than camelCasing for its names. Plus, changing the casing would just side-step the problem without addressing it, instead of learning how to normalize the data.
What is the correct approach to get raw data into the interface directly?
Based on the comments, I was able to piece together this solution.
import React from 'react';
export interface IOAuthProps {
  accessToken: string
  userID: string
  scope: string[]
  expiresIn: number
}
export function ParseOAuthProperties(rawHashProperties: string): IOAuthProps {
  const rawData = rawHashProperties.substr(1)
    .split('&')
    .map(v => v.split('='))
    .reduce((pre, [key, value]) => (
      {...pre, [key]: value}
    ), {access_token: "", user_id: "", scope: "", expires_in: ""});
  const normalizedData: IOAuthProps = {
    accessToken: rawData.access_token,
    userID: rawData.user_id,
    scope: rawData.scope.split('+'),
    expiresIn: Number(rawData.expires_in),
  }
  return normalizedData;
}
const OAuthFun: React.FC<IOAuthProps> = (props) => {
  return (
    <div>
      <div>access token = {props.accessToken}</div>
      <div>user id = {props.userID}</div>
      <div>scope = {props.scope}</div>
      <div>expires in = {props.expiresIn}</div>
    </div>
  )
}
export default OAuthFun;
Now I can take my method, which encapsulates the normalization and returns the interface, and use it from my parent component:
import React from 'react';
import OAuthFun, {ParseOAuthProperties, IOAuthProps} from './OAuthFun'
const App: React.FC = () => {
  const props: IOAuthProps = ParseOAuthProperties(window.location.hash)
  return (
    <div className="App">
      {/* Note, you can pass the interface wholesale with the spread operator */}
      <OAuthFun {...props} />
    </div>
  );
}
export default App;
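As an aside, the built-in URLSearchParams can do the splitting and percent-decoding of the hash fragment for you (a sketch; note that URLSearchParams decodes '+' as a space, so the scope is split on spaces here rather than on '+'):
// Alternative parsing sketch using the built-in URLSearchParams.
export function parseOAuthHash(hash: string): IOAuthProps {
  const params = new URLSearchParams(hash.substring(1));
  return {
    accessToken: params.get('access_token') ?? '',
    userID: params.get('user_id') ?? '',
    // '+' decodes to a space under URLSearchParams, hence split(' ')
    scope: (params.get('scope') ?? '').split(' ').filter(Boolean),
    expiresIn: Number(params.get('expires_in') ?? 0),
  };
}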

Why does the value of a mapping(bytes32 => uint8) seem to change between transaction calls?

The contract code:
pragma solidity ^0.4.10;

contract Test {
    mapping (bytes32 => uint8) private dict;

    function Test() {}

    function Set(bytes32 key, uint8 val) returns (uint8) {
        dict[key] = val;
        return dict[key];
    }

    function Get(bytes32 key) returns (uint8) {
        return dict[key];
    }
}
and I run on testrpc:
contract_file = 'test/test.sol'
contract_name = ':Test'
Solc = require('solc')
Web3 = require('web3')
web3 = new Web3(new Web3.providers.HttpProvider("http://localhost:8545"));
source_code = fs.readFileSync(contract_file).toString()
compiledContract = Solc.compile(source_code)
abi = compiledContract.contracts[contract_name].interface
bytecode = compiledContract.contracts[contract_name].bytecode;
ContractClass = web3.eth.contract(JSON.parse(abi))
contract_init_data = {
  data: bytecode,
  from: web3.eth.accounts[0],
  gas: 1000000,
}
deployed_contract = ContractClass.new(contract_init_data)
deployed_contract.Set.call("akey", 5)
deployed_contract.Get.call("akey")
Bizarrely, this is the output I get in the node terminal:
> deployed_contract.Set.call("akey", 5)
{ [String: '5'] s: 1, e: 0, c: [ 5 ] }
> deployed_contract.Get.call("akey")
{ [String: '0'] s: 1, e: 0, c: [ 0 ] }
This result is the outcome of a long debugging session... what is going on? It seems distinctly like something is broken here, but I followed a tutorial that did something very similar, and that seemed to work...
also:
> Solc.version()
'0.4.11+commit.68ef5810.Emscripten.clang'
Try deployed_contract.Set("akey", 5) without .call, because .call on your setter method
executes a message call transaction, which is directly executed in the VM of the node, but never mined into the blockchain.
(from the web3 doc)
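In other words (a sketch in the same web3 0.x style as the snippet above):
// .call() runs the method in the node's VM and throws the state change away:
deployed_contract.Set.call("akey", 5)   // returns 5, but writes nothing

// Sending a real transaction mutates storage; it returns a tx hash, not the value:
var txHash = deployed_contract.Set("akey", 5, { from: web3.eth.accounts[0] })

// Once the transaction is mined, a read-only call sees the new value:
deployed_contract.Get.call("akey")      // now returns 5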
The value of the mapping does not change; 0 is simply the default value that a Solidity mapping returns for any key that has never been set.
By the way, try using the online compiler; you will quickly see whether the problem is in the contract or in the way you interact with it via web3.

Node JS: Make a flat json from a tree json

I was writing a node.js script to combine all the JSON files in a directory and store the result as a new JSON file. I managed to do the job to a great extent, but it has a few flaws.
A.json
[
  {
    "id": "addEmoticon1",
    "description": "Message to greet the user.",
    "defaultMessage": "Hello, {name}!"
  },
  {
    "id": "addPhoto1",
    "description": "How are youu.",
    "defaultMessage": "How are you??"
  }
]
B.json
[
  {
    "id": "close1",
    "description": "Close it.",
    "defaultMessage": "Close!"
  }
]
What I finally need is:
result.json
{
  "addEmoticon1": "Hello, {name}!",
  "addPhoto1": "How are you??",
  "close1": "Close!"
}
I wrote a node.js script:
var fs = require('fs');

function readFiles(dirname, onFileContent, onError) {
  fs.readdir(dirname, function(err, filenames) {
    if (err) {
      onError(err);
      return;
    }
    filenames.forEach(function(filename) {
      fs.readFile(dirname + filename, 'utf-8', function(err, content) {
        if (err) {
          onError(err);
          return;
        }
        onFileContent(filename, content);
      });
    });
  });
}

var data = {};
readFiles('C:/node/test/', function(filename, content) {
  data[filename] = content;
  var lines = content.split('\n');
  lines.forEach(function(line) {
    var parts = line.split('"');
    if (parts[1] == 'id') {
      fs.appendFile('result.json', parts[3] + ': ', function(err) {});
    }
    if (parts[1] == 'defaultMessage') {
      fs.appendFile('result.json', parts[3] + ',\n', function(err) {});
    }
  });
}, function(err) {
  throw err;
});
It extracts the 'id' and 'defaultMessage' but is not able to append correctly.
What I get:
result.json
addEmoticon1: addPhoto1: Hello, {name}!,
close1: How are you??,
Close!,
This output is different every time I run my script.
Aim 1: Surround items in double quotes
Aim 2: Add curly braces at the top and at the end
Aim 3: No comma at the end of the last element
Aim 4: Same output every time I run my script
I'll start with the finished solution...
There's a big explanation at the end of this answer. Let's try to think big-picture for a little bit first, though.
readdirp('.')
  .fmap(filter(match(/\.json$/)))
  .fmap(map(readfilep))
  .fmap(map(fmap(JSON.parse)))
  .fmap(concatp)
  .fmap(flatten)
  .fmap(reduce(createMap)({}))
  .fmap(data=> JSON.stringify(data, null, '\t'))
  .fmap(writefilep(resolve(__dirname, 'result.json')))
  .then(filename=> console.log('wrote results to %s', filename), err=> console.error(err));
Console output
wrote results to /path/to/result.json
result.json (I added a c.json with some data to show that this works with more than 2 files)
{
  "addEmoticon1": "Hello, {name}!",
  "addPhoto1": "How are you??",
  "close1": "Close!",
  "somethingelse": "Something!"
}
Implementation
I made Promise-based interfaces for readdir, readFile, and writeFile.
import {readdir, readFile, writeFile} from 'fs';

const readdirp = dir=>
  new Promise((pass,fail)=>
    readdir(dir, (err, filenames) =>
      err ? fail(err) : pass(mapResolve (dir) (filenames))));

const readfilep = path=>
  new Promise((pass,fail)=>
    readFile(path, 'utf8', (err,data)=>
      err ? fail(err) : pass(data)));

const writefilep = path=> data=>
  new Promise((pass,fail)=>
    writeFile(path, data, err=>
      err ? fail(err) : pass(path)));
In order to map functions to our Promises, we needed an fmap utility. Notice how we take care to bubble errors up.
Promise.prototype.fmap = function fmap(f) {
  return new Promise((pass,fail) =>
    this.then(x=> pass(f(x)), fail));
};
And here's the rest of the utilities
const fmap = f=> x=> x.fmap(f);
const mapResolve = dir=> map(x=>resolve(dir,x));
const map = f=> xs=> xs.map(x=> f(x));
const filter = f=> xs=> xs.filter(x=> f(x));
const match = re=> s=> re.test(s);
const concatp = xs=> Promise.all(xs);
const reduce = f=> y=> xs=> xs.reduce((y,x)=> f(y)(x), y);
const flatten = reduce(y=> x=> y.concat(Array.isArray(x) ? flatten (x) : x)) ([]);
Lastly, the one custom function that does your work
const createMap = map=> ({id, defaultMessage})=>
  Object.assign(map, {[id]: defaultMessage});
And here's c.json
[
  {
    "id": "somethingelse",
    "description": "something",
    "defaultMessage": "Something!"
  }
]
"Why so many little functions ?"
Well despite what you may think, you have a pretty big problem. And big problems are solved by combining several small solutions. The most prominent advantage of this code is that each function has a very distinct purpose and it will always produce the same results for the same inputs. This means each function can be used other places in your program. Another advantage is that smaller functions are easier to read, reason with, and debug.
Compare all of this to the other answers given here; @BlazeSahlen's in particular. That's over 60 lines of code that's basically only usable to solve this one particular problem. And it doesn't even filter out non-JSON files. So the next time you need to create a sequence of actions on reading/writing files, you'll have to rewrite most of those 60 lines each time. It creates lots of duplicated code and hard-to-find bugs because of exhausting boilerplate. And all that manual error-handling... wow, just kill me now. And they thought callback hell was bad? Haha, they just created yet another circle of hell all on their own.
All the code together...
Functions appear (roughly) in the order they are used
import {readdir, readFile, writeFile} from 'fs';
import {resolve} from 'path';
// logp : Promise<Value> -> Void
const logp = p=> p.then(x=> console.log(x), x=> console.error(x));

// fmap : Promise<a> -> (a->b) -> Promise<b>
Promise.prototype.fmap = function fmap(f) {
  return new Promise((pass,fail) =>
    this.then(x=> pass(f(x)), fail));
};

// fmap : (a->b) -> F<a> -> F<b>
const fmap = f=> x=> x.fmap(f);

// readdirp : String -> Promise<Array<String>>
const readdirp = dir=>
  new Promise((pass,fail)=>
    readdir(dir, (err, filenames) =>
      err ? fail(err) : pass(mapResolve (dir) (filenames))));

// mapResolve : String -> Array<String> -> Array<String>
const mapResolve = dir=> map(x=> resolve(dir,x));

// map : (a->b) -> Array<a> -> Array<b>
const map = f=> xs=> xs.map(x=> f(x));

// filter : (Value -> Boolean) -> Array<Value> -> Array<Value>
const filter = f=> xs=> xs.filter(x=> f(x));

// match : RegExp -> String -> Boolean
const match = re=> s=> re.test(s);

// readfilep : String -> Promise<String>
const readfilep = path=>
  new Promise((pass,fail)=>
    readFile(path, 'utf8', (err,data)=>
      err ? fail(err) : pass(data)));

// concatp : Array<Promise<Value>> -> Promise<Array<Value>>
const concatp = xs=> Promise.all(xs);

// reduce : (b->a->b) -> b -> Array<a> -> b
const reduce = f=> y=> xs=> xs.reduce((y,x)=> f(y)(x), y);

// flatten : Array<Array<Value>> -> Array<Value>
const flatten = reduce(y=> x=> y.concat(Array.isArray(x) ? flatten (x) : x)) ([]);

// writefilep : String -> Value -> Promise<String>
const writefilep = path=> data=>
  new Promise((pass,fail)=>
    writeFile(path, data, err=>
      err ? fail(err) : pass(path)));

// -----------------------------------------------------------------------------

// createMap : Object -> Object -> Object
const createMap = map=> ({id, defaultMessage})=>
  Object.assign(map, {[id]: defaultMessage});
// do it!
readdirp('.')
  .fmap(filter(match(/\.json$/)))
  .fmap(map(readfilep))
  .fmap(map(fmap(JSON.parse)))
  .fmap(concatp)
  .fmap(flatten)
  .fmap(reduce(createMap)({}))
  .fmap(data=> JSON.stringify(data, null, '\t'))
  .fmap(writefilep(resolve(__dirname, 'result.json')))
  .then(filename=> console.log('wrote results to %s', filename), err=> console.error(err));
Still having trouble following along?
It's not easy to see how these things work at first. This is a particularly squirrely problem because the data gets nested very quickly. Thankfully that doesn't mean our code has to be a big nested mess just to solve the problem ! Notice the code stays nice and flat even when we're dealing with things like a Promise of an Array of Promises of JSON...
// Here we are reading directory '.'
// We will get a Promise<Array<String>>
// Let's say the files are 'a.json', 'b.json', 'c.json', and 'run.js'
// Promise will look like this:
// Promise<['a.json', 'b.json', 'c.json', 'run.js']>
readdirp('.')
// Now we're going to strip out any non-JSON files
// Promise<['a.json', 'b.json', 'c.json']>
.fmap(filter(match(/\.json$/)))
// call `readfilep` on each of the files
// We will get Promise<Array<Promise<JSON>>>
// Don't freak out, it's not that bad!
// Promise<[Promise<JSON>, Promise<JSON>, Promise<JSON>]>
.fmap(map(readfilep))
// for each file's Promise, we want to parse the data as JSON
// JSON.parse returns an object, so the structure will be the same
// except JSON will be an object!
// Promise<[Promise<Object>, Promise<Object>, Promise<Object>]>
.fmap(map(fmap(JSON.parse)))
// Now we can start collapsing some of the structure
// `concatp` will convert Array<Promise<Value>> to Promise<Array<Value>>
// We will get
// Promise<[Object, Object, Object]>
// Remember, we have 3 Objects; one for each parsed JSON file
.fmap(concatp)
// Your particular JSON structures are Arrays, which are also Objects
// so that means `concatp` will actually return Promise<[Array, Array, Array]>
// but we'd like to flatten that
// that way each parsed JSON file gets mushed into a single data set
// after flatten, we will have
// Promise<Array<Object>>
.fmap(flatten)
// Here's where it all comes together
// now that we have a single Promise of an Array containing all of your objects ...
// We can simply reduce the array and create the mapping of key:values that you wish
// `createMap` is custom tailored for the mapping you need
// we initialize the `reduce` with an empty object, {}
// after it runs, we will have Promise<Object>
// where Object is your result
.fmap(reduce(createMap)({}))
// It's all downhill from here
// We currently have Promise<Object>
// but before we write that to a file, we need to convert it to JSON
// JSON.stringify(data, null, '\t') will pretty print the JSON using tab to indent
// After this, we will have Promise<JSON>
.fmap(data=> JSON.stringify(data, null, '\t'))
// Now that we have a JSON, we can easily write this to a file
// We'll use `writefilep` to write the result to `result.json` in the current working directory
// I wrote `writefilep` to pass the filename on success
// so when this finishes, we will have
// Promise<Path>
// You could have it return Promise<Void> like writeFile sends void to the callback. up to you.
.fmap(writefilep(resolve(__dirname, 'result.json')))
// the grand finale
// alert the user that everything is done (or if an error occurred)
// Remember `.then` is like a fork in the road:
// the code will go to the left function on success, and the right on failure
// Here, we're using a generic function to say we wrote the file out
// If a failure happens, we write that to console.error
.then(filename=> console.log('wrote results to %s', filename), err=>console.error(err));
All done!
Assuming files is a list of arrays: [a, b, ...];
var res = {};
files.reduce((a, b) => a.concat(b), []).forEach(o => res[o.id] = o.defaultMessage);
But you don't need to read all the files at once.
Just add this code to the onFileContent callback:
JSON.parse(fileContent).forEach(o => res[o.id] = o.defaultMessage);
Also, you should add a final callback to your readFiles.
And in this callback:
fs.writeFile('result.json', JSON.stringify(res));
So, the final solution for you:
var fs = require('fs');

function task(dir, it, cb) {
  fs.readdir(dir, (err, names) => {
    if (err) return cb([err]);
    var errors = [], c = names.length;
    names.forEach(name => {
      fs.readFile(dir + name, 'utf-8', (err, data) => {
        if (err) {
          errors.push(err);
        } else {
          try {
            it(JSON.parse(data)); // We got the file's data!
          } catch(e) {
            errors.push('Invalid json in ' + name + ': ' + e.message);
          }
        }
        if (!--c) cb(errors); // Decrement on every file, so we always finish
      });
    });
  });
}
var res = {};
task('C:/node/test/', (data) => data.forEach(o => res[o.id] = o.defaultMessage), (errors) => {
  // Some files can be wrong
  errors.forEach(err => console.error(err));
  // But we write the received data anyway
  fs.writeFile('C:/node/test/result.json', JSON.stringify(res), (err) => {
    if (err) console.error(err);
    else console.log('Task finished. See result.json');
  })
});
This should do it, once you have your JSON in variables a and b:
var a = [
  {
    "id": "addEmoticon1",
    "description": "Message to greet the user.",
    "defaultMessage": "Hello, {name}!"
  },
  {
    "id": "addPhoto1",
    "description": "How are youu.",
    "defaultMessage": "How are you??"
  }
];

var b = [
  {
    "id": "close1",
    "description": "Close it.",
    "defaultMessage": "Close!"
  }
];

var c = a.concat(b);
var res = {}; // use an object, not an array, since the keys are strings
for (var i = 0; i < c.length; i++) {
  res[c[i].id] = c[i].defaultMessage;
}
console.log(res);
Here's my solution:
function readFiles(dirname, onFileContent, onError) {
  fs.readdir(dirname, function(err, filenames) {
    /**
     * We'll store the parsed JSON data in this array
     * @type {Array}
     */
    var fileContent = [];
    if (err) {
      onError(err);
    } else {
      filenames.forEach(function(filename) {
        // Reading the file (synchronously) and storing the parsed JSON output (parsing from string to JSON object)
        var jsonObject = JSON.parse(fs.readFileSync(dirname + filename, 'utf-8'));
        // Pushing the parsed JSON output into array
        fileContent.push(jsonObject);
      });
      // Calling the callback
      onFileContent(fileContent);
    }
  });
}
readFiles('./files/', function(fileContent) {
  /**
   * We'll store the final output object here
   * @type {Object}
   */
  var output = {};
  // Loop over the JSON objects
  fileContent.forEach(function(each) {
    // Looping within each object
    for (var index in each) {
      // Copying the `id` as key and the `defaultMessage` as value and storing in output object
      output[each[index].id] = each[index].defaultMessage;
    }
  });
  // Writing the file (synchronously) after converting the JSON object back to string
  fs.writeFileSync('result.json', JSON.stringify(output));
}, function(err) {
  throw err;
});
A notable difference is that I've not used the asynchronous readFile and writeFile functions, as they'd needlessly complicate the example. This example is meant to showcase the use of JSON.parse and JSON.stringify to do what the OP wants.
UPDATE:
var fs = require('fs');

function readFiles(dirname, onEachFilename, onComplete) {
  fs.readdir(dirname, function(err, filenames) {
    if (err) {
      throw err;
    } else {
      // Prepending the dirname to each filename
      filenames.forEach(function(each, index, array) {
        array[index] = dirname + each;
      });
      // Calling async.map which accepts these parameters:
      // filenames <-------- array of filenames
      // onEachFilename <--- function which will be applied on each filename
      // onComplete <------- function to call when all elements of the filenames array have been processed
      require('async').map(filenames, onEachFilename, onComplete);
    }
  });
}

readFiles('./files/', function(item, callback) {
  // Read the file asynchronously
  fs.readFile(item, function(err, data) {
    if (err) {
      callback(err);
    } else {
      callback(null, JSON.parse(data));
    }
  });
}, function(err, results) {
  /**
   * We'll store the final output object here
   * @type {Object}
   */
  var output = {};
  if (err) {
    throw err;
  } else {
    // Loop over the JSON objects
    results.forEach(function(each) {
      // Looping within each object
      for (var index in each) {
        // Copying the `id` as key and the `defaultMessage` as value and storing in output object
        output[each[index].id] = each[index].defaultMessage;
      }
    });
    // Writing the file (synchronously) after converting the JSON object back to string
    fs.writeFileSync('result.json', JSON.stringify(output));
  }
});
This is a simple asynchronous implementation of the same, using readFile. For more information, see async.map.
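For what it's worth, on modern Node the whole task collapses to a few lines with fs/promises and async/await. A minimal sketch, assuming every *.json file in the directory is an array of {id, defaultMessage} objects:
const { readdir, readFile, writeFile } = require('fs/promises');
const { join } = require('path');

async function flattenJsonDir(dir) {
  // Read the directory and keep only the JSON inputs (skip our own output file)
  const names = (await readdir(dir)).filter(name => name.endsWith('.json') && name !== 'result.json');
  const result = {};
  for (const name of names) {
    // Parse each file and fold its entries into the result map
    const entries = JSON.parse(await readFile(join(dir, name), 'utf-8'));
    for (const { id, defaultMessage } of entries) {
      result[id] = defaultMessage;
    }
  }
  // Serialize once at the end: valid JSON, deterministic output, no trailing commas
  await writeFile(join(dir, 'result.json'), JSON.stringify(result, null, 2));
}

flattenJsonDir('./files/').catch(console.error);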