I am running a private Ethereum network set up with https://aws.amazon.com/blockchain/templates/
The entire setup is done and everything looks correct on AWS. Now I am trying to create accounts and then retrieve all of them. For that, I am using the methods below.
Web3Service.js
var Web3 = require('web3');
var web3 = new Web3(new Web3.providers.HttpProvider(process.env.NETWORK_URL));
exports.getAccounts = function () {
return web3.eth.getAccounts();
};
exports.createAccount = function () {
return web3.eth.accounts.create();
};
app.js
var newAccount = await web3Service.createAccount();
console.log('newAccount ', newAccount);
var accounts = await web3Service.getAccounts();
console.log('accounts ', accounts);
I am not facing any errors at all, but the response of web3Service.getAccounts() is always an empty [] array.
I have verified the Ethereum setup; all nodes are working perfectly.
You can find the entire codebase here: blockchain-node Sample entire codebase
web3.eth.accounts.create will provide you with the Ethereum address and the private key. In order to make new accounts available to a node, you have to store the new account information in the node's keystore.
When you call create, you will get an object like this (from the docs):
web3.eth.accounts.create();
> {
address: "0xb8CE9ab6943e0eCED004cDe8e3bBed6568B2Fa01",
privateKey: "0x348ce564d427a3311b6536bbcff9390d69395b06ed6c486954e971d960fe8709",
signTransaction: function(tx){...},
sign: function(data){...},
encrypt: function(password){...}
}
Use the encrypt function to generate the encrypted keystore. This is what needs to be stored with the node in order to be retrievable through web3.eth.getAccounts. The location is going to vary depending on node client, OS, and if you override the keystore location when starting the node (for example, the default Geth location on Linux is ~/.ethereum/keystore).
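For example, a minimal sketch of persisting a newly created account into a Geth keystore directory might look like this (the password and keystore path are assumptions; adjust them for your node and OS):
const fs = require('fs');
const path = require('path');

const account = web3.eth.accounts.create();
// encrypt() returns a V3 keystore JSON object for the account's private key
const keystore = account.encrypt('a-strong-password');

// Geth names keystore files like UTC--<timestamp>--<address without 0x>
const fileName = `UTC--${new Date().toISOString().replace(/:/g, '-')}--${account.address.slice(2).toLowerCase()}`;
const keystoreDir = path.join(process.env.HOME, '.ethereum', 'keystore'); // assumed default Geth location on Linux

fs.writeFileSync(path.join(keystoreDir, fileName), JSON.stringify(keystore));
// Once the node picks up the file, the address should show up in web3.eth.getAccounts()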
After struggling, I found the solution:
Web3Service.js
/**
*
* Accounts Functions
*/
exports.createAccount = function () {
/* *
* Creates an account locally only.
* It will not be returned by a web3.eth.getAccounts() call.
*/
return web3.eth.accounts.create();
};
exports.createPersonalAccount = function (password) {
/* *
* Creates an account in the node's keystore.
* It will be returned by web3.eth.getAccounts().
*/
return web3.eth.personal.newAccount(password);
};
app.js
var personalAccount = await web3Service.createPersonalAccount('123456789');
console.log('personalAccount ', personalAccount);
var accounts = await web3Service.getAccounts();
console.log('accounts ', accounts);
Updated source: Working Source Code
There is no need to explicitly do anything with the keystore.
Start Geth with the --rpcapi db,eth,net,web3,personal flag (newer Geth releases use --http.api instead). It is necessary; otherwise, the personal API calls will fail with an error.
Related
I'm looking into a way of sharing data via Google Apps Script's Cache Service from one web app to another.
Users load the first web page and fill out their information. Once it is submitted, a function runs on this data and stores it via the cache.
CacheService.getUserCache().put('FirstName','David')
CacheService.getUserCache().put('Surname','Armstrong')
The console log reports that these two elements have been saved to the cache.
However, in the second web app, when the cache is read the console log returns null:
var cache = CacheService.getUserCache().get('Firstname');
var cache2 = CacheService.getUserCache().get('Surname');
console.log(cache)
console.log(cache2)
Any ideas?
A possible solution would be to implement a service that synchronizes the cache between web apps.
This can be achieved by creating a web app that, via POST, lets the individual web apps push their UserCache into the ScriptCache of a "cache synchronizer".
The operation would be very simple:
From the web app that we want to synchronize, we check whether we have a cache for the user.
If it exists, we send it to the server so that it stores it.
If it does not exist, we check whether the server has stored the user's cache.
Here is a sketch of how it could work.
CacheSync.gs
const cacheService = CacheService.getScriptCache()
const CACHE_SAVED_RES = ContentService
.createTextOutput(JSON.stringify({ "msg": "Cache saved" }))
.setMimeType(ContentService.MimeType.JSON)
const doPost = (e) => {
const { user, cache } = JSON.parse(e.postData.contents)
const localCache = cacheService.get(user)
if (!localCache) {
/* If no local data, we save it */
cacheService.put(user, JSON.stringify(cache))
return CACHE_SAVED_RES
} else {
/* If there is data, send it back (it is already a JSON string, hence the double JSON.parse on the client) */
return ContentService
.createTextOutput(JSON.stringify(localCache))
.setMimeType(ContentService.MimeType.JSON)
}
}
ExampleWebApp.gs
const SYNC_SERVICE = "<SYNC_SERVICE_URL>"
const CACHE_TO_SYNC = ["firstName", "lastName"]
const cacheService = CacheService.getUserCache()
const syncCache = () => {
const cache = cacheService.getAll(CACHE_TO_SYNC)
const options = {
method: "POST",
payload: JSON.stringify({
user: Session.getActiveUser().getEmail(),
cache
})
}
if (Object.keys(cache).length === 0) {
/* If no cache try to fetch it from the cache service */
const res = UrlFetchApp.fetch(SYNC_SERVICE, options)
const parsedResponse = JSON.parse(JSON.parse(res.toString()))
Object.keys(parsedResponse).forEach((k)=>{
console.log(k, parsedResponse[k])
cacheService.put(k, parsedResponse[k])
})
} else {
/* If cache send it to the sync service */
const res = UrlFetchApp.fetch(SYNC_SERVICE, options)
console.log(res.toString())
}
}
const createCache = () => {
cacheService.put('firstName', "Super")
cacheService.put('lastName', "Seagull")
}
const clearCache = () => {
cacheService.removeAll(CACHE_TO_SYNC)
}
Additional information
The synchronization service must be deployed with ANYONE access. You can control access via an API key (see the sketch after these notes).
This is just an example, and is not fully functional, you should adapt it to your needs.
The syncCache function of the web App is reusable, and would be the function you should use in all Web Apps.
There is a disadvantage when retrieving the cache, since you must provide the necessary keys, which forces you to write them manually (ex CACHE_TO_SYNC).
It could be considered to replace ScriptCache with ScriptProperties.
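Following up on the access-control note above, a hypothetical API key check at the top of doPost could look like this (API_KEY and the apiKey payload field are placeholders, not part of the original sketch):
const API_KEY = '<API_KEY>';

const doPost = (e) => {
  const { user, cache, apiKey } = JSON.parse(e.postData.contents)
  if (apiKey !== API_KEY) {
    // Reject callers that do not present the shared key
    return ContentService
      .createTextOutput(JSON.stringify({ "msg": "Unauthorized" }))
      .setMimeType(ContentService.MimeType.JSON)
  }
  // ...continue with the cache logic from CacheSync.gs above
}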
Documentation
Cache
Properties
Session
The doc says:
Gets the cache instance scoped to the current user and script.
As it is scoped to the script, accessing from another script is not possible. This is also the case with PropertiesService:
Properties cannot be shared between scripts.
To share data, you can use a common file that both scripts can access, such as a Drive text file or a spreadsheet; a sketch follows below.
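A minimal sketch of the spreadsheet approach, assuming both web apps have edit access to the same spreadsheet (SHARED_SPREADSHEET_ID is a placeholder):
const SHARED_SPREADSHEET_ID = '<SPREADSHEET_ID>';

function writeShared(key, value) {
  // Append a key/value row to the first sheet of the shared spreadsheet
  const sheet = SpreadsheetApp.openById(SHARED_SPREADSHEET_ID).getSheets()[0];
  sheet.appendRow([key, value]);
}

function readShared(key) {
  // Scan the rows for the key and return its value, or null if it is missing
  const sheet = SpreadsheetApp.openById(SHARED_SPREADSHEET_ID).getSheets()[0];
  const rows = sheet.getDataRange().getValues();
  const match = rows.find(function (row) { return row[0] === key; });
  return match ? match[1] : null;
}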
How can I interact with smart contracts and send transactions with Web3.js using a local private key? The private key is either hardcoded or comes from an environment (.env) file.
This is needed for Node.js server-side interaction or batch jobs with Ethereum/Polygon/Binance Smart Chain smart contracts.
You may encounter, for example, the error:
Error: The method eth_sendTransaction does not exist/is not available
Ethereum node providers like Infura, QuikNode and others require you to sign outgoing transactions locally before you broadcast them through their node.
Web3.js does not have this function built in. You need to use the @truffle/hdwallet-provider package as a middleware for your Ethereum provider.
Example in TypeScript:
const Web3 = require('web3');
const HDWalletProvider = require("@truffle/hdwallet-provider");
import { abi } from "../../build/contracts/AnythingTruffleCompiled.json";
//
// Project secrets are hardcoded here
// - do not do this in real life
//
// No 0x prefix
const myPrivateKeyHex = "123123123";
const infuraProjectId = "123123123";
const provider = new Web3.providers.HttpProvider(`https://mainnet.infura.io/v3/${infuraProjectId}`);
// Create web3.js middleware that signs transactions locally
const localKeyProvider = new HDWalletProvider({
privateKeys: [myPrivateKeyHex],
providerOrUrl: provider,
});
const web3 = new Web3(localKeyProvider);
const myAccount = web3.eth.accounts.privateKeyToAccount(myPrivateKeyHex);
// Interact with existing, already deployed, smart contract on Ethereum mainnet
const address = '0x123123123123123123';
const myContract = new web3.eth.Contract(abi as any, address);
// Some example calls how to read data from the smart contract
const currentDuration = await myContract.methods.stakingTime().call();
const currentAmount = await myContract.methods.stakingAmount().call();
console.log('Transaction signer account is', myAccount.address, ', smart contract is', address);
console.log('Starting transaction now');
// Approve this balance to be used for the token swap
const receipt = await myContract.methods.myMethod(1, 2).send({ from: myAccount.address });
console.log('TX receipt', receipt);
You also need to avoid committing your private key to any GitHub repository. The dotenv package is a simple, low-barrier solution for secrets management.
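For instance, a minimal sketch with dotenv, assuming a .env file in the project root that defines PRIVATE_KEY (the variable name is an assumption):
// .env contains: PRIVATE_KEY=123123123
require('dotenv').config();
const myPrivateKeyHex = process.env.PRIVATE_KEY;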
You can achieve what you want by using ethers.js instead of web3 with no other package needed.
First import the library:
Node.js:
npm install --save ethers
const { ethers } = require("ethers");
Web browser:
<script src="https://cdn.ethers.io/lib/ethers-5.2.umd.min.js"
type="application/javascript"></script>
Define the provider. One way is using the provider URL like this:
const provider = new ethers.providers.JsonRpcProvider(rpcProvider);
Then, in order to interact with the contract without asking for authorization, we will create a wallet using the private key and the provider like this:
const signer = new ethers.Wallet(privateKey,provider)
Now, you can create the contract with the address, ABI, and the signer we created in the previous step:
const contract = new ethers.Contract(contractAddress,ABI, signer);
Now, you can interact with the contract directly. For example, getting the balance of a token:
const tokenBalance = await contract.balanceOf(await signer.getAddress(), tokenId);
Don't forget to store the private key in a safe place and never hardcode it in a web page.
Further reading: Provider Signer
There is a better and simpler way to sign and execute a smart contract function. Here the example function is addBonus.
First of all we'll create the smart contract instance:
const createInstance = () => {
const bscProvider = new Web3.providers.HttpProvider(
config.get('bscRpcURL'),
);
const web3BSC = new Web3(bscProvider);
const transactionContractInstance = new web3BSC.eth.Contract(
transactionSmartContractABI,
transactionSmartContractAddress,
);
return { web3BSC, transactionContractInstance };
};
Now we'll create a new function to sign and execute our addBonus function:
const updateSmartContract = async (/* parameters you need */) => {
try {
const contractInstance = createInstance();
// need to calculate gas fees for the addBonus
const gasFees =
await contractInstance.transactionContractInstance.methods
.addBonus(
// all the parameters
)
.estimateGas({ from: publicAddress_of_your_desired_wallet });
const tx = {
// this is the address responsible for this transaction (the same wallet used for gas estimation above)
from: publicAddress_of_your_desired_wallet,
// target address, this could be a smart contract address
to: transactionSmartContractAddress,
// gas fees for the transaction
gas: gasFees,
// this encodes the ABI of the method and the arguments
data: await contractInstance.transactionContractInstance.methods
.addBonus(
// all the parameters
)
.encodeABI(),
};
// sign the transaction with a private key. It'll return messageHash, v, r, s, rawTransaction, transactionHash
const signPromise =
await contractInstance.web3BSC.eth.accounts.signTransaction(
tx,
config.get('WALLET_PRIVATE_KEY'),
);
// the rawTransaction here is already serialized so you don't need to serialize it again
// Send the signed txn
const sendTxn =
await contractInstance.web3BSC.eth.sendSignedTransaction(
signPromise.rawTransaction,
);
return Promise.resolve(sendTxn);
} catch(error) {
throw error;
}
}
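A hypothetical call site could then look like this (the arguments depend on your addBonus signature):
updateSmartContract(/* ...addBonus arguments... */)
  .then((receipt) => console.log('Mined in block', receipt.blockNumber))
  .catch(console.error);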
I would like to create user data (name, email, phone number) in Firestore. This should be triggered when an authenticated user is created.
In functions/src/index.ts:
// Sends email to user after signup
export { welcomeEmail } from './send_email';
// Saves user after signup
export { createUserDoc } from './save_user';
In functions/src/save_user.ts:
// Firebase Config
import * as functions from "firebase-functions";
import * as firebase from "firebase-admin";
import {MD5} from "crypto-js";
export const createUserDoc = functions.auth.user().onCreate(event => {
const firebaseUser = event.data;
// Use gravatar as default if photoUrl isn't specified in user data
let fileEnding = "jpg";
let photoURL = `https://www.gravatar.com/avatar/${MD5(firebaseUser.email).toString().toLowerCase()}.jpg?s=1024&d=robohash`;
if (firebaseUser.photoURL) {
fileEnding = firebaseUser.photoURL.substr(firebaseUser.photoURL.lastIndexOf(".") + 1);
photoURL = firebaseUser.photoURL;
}
const fileName = `users/${firebaseUser.uid}/profile.${fileEnding}`;
const profilePhotoStorageOpts = {
destination: fileName,
metadata: {
contentType: `image/${fileEnding}`
}
};
const user = {
name: firebaseUser.displayName || "No Name",
email: firebaseUser.email,
photoUrl: `gs://${firebase.storage().bucket().name}/${fileName}`
};
return Promise.all([
firebase.storage().bucket().upload(photoURL, profilePhotoStorageOpts),
firebase.firestore().collection("users").doc(firebaseUser.uid).set(user)
]);
});
The goal was that, for each created account, I would find a corresponding user document in Firestore and a profile image in Cloud Storage.
Instead, I'm getting:
Property 'data' does not exist on type 'UserRecord'.ts(2339)
'Promise' only refers to a type, but is being used as a value here. Do you need to change your target library? Try changing the lib compiler option to es2015 or later.ts(2585)
Help would be appreciated. Thanks
As you will see in the documentation for the onCreate method, the first parameter of the handler function is a UserRecord which does not have a data property.
So the first error you get is normal.
In your case, if you want, for example, to get the user's photoURL, you should use event.photoURL (since event is of type UserRecord). Similarly, you would use event.uid to get the user's uid.
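Putting the first fix together, a minimal sketch of the corrected handler could look like this, assuming the firebase-functions v1 auth trigger API (the Storage/Gravatar logic from the question is elided):
export const createUserDoc = functions.auth.user().onCreate((user) => {
  // `user` is a UserRecord, so read its fields directly
  const uid = user.uid;
  const email = user.email;
  const photoURL = user.photoURL; // may be undefined; fall back to Gravatar as in the question
  // ...build the profile photo upload and the Firestore document exactly as before...
  return firebase.firestore().collection("users").doc(uid).set({ email });
});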
For the second error, you may have a look at https://stackoverflow.com/a/43122423/3371862 or How to resolve 'Build:'Promise' only refers to a type, but is being used as a value here.'
In Google Compute Engine I can use an instance template to create a new VM from the template. This works fine using the GCE console, and it works fine using the API, too (URL parameter "sourceInstanceTemplate").
How can I create a new GCE-VM from an instance template using googleapis/nodejs-compute (the Node.js GCE SDK)?
google-auth-library-nodejs can be used for accessing the GCE instances.insert API directly.
The following example is adapted from https://github.com/google/google-auth-library-nodejs and works fine if executed within GCE (in particular, in a Google Cloud Function).
const zone = 'some-zone';
const name = 'a-name';
const sourceInstanceTemplate = `some-template-name`;
createVM(zone, name, sourceInstanceTemplate)
.then(console.log)
.catch(console.error);
async function createVM(zone, vmName, templateName) {
const {auth} = require('google-auth-library');
const client = await auth.getClient({
scopes: 'https://www.googleapis.com/auth/cloud-platform'
});
const projectId = await auth.getDefaultProjectId(); // newer google-auth-library versions use auth.getProjectId()
const sourceInstanceTemplate = `projects/${projectId}/global/instanceTemplates/${templateName}`;
const url = `https://www.googleapis.com/compute/v1/projects/${projectId}/zones/${zone}/instances?sourceInstanceTemplate=${sourceInstanceTemplate}`;
return await client.request({
url: url,
method: 'post',
data: {name: vmName}
});
}
I can't find the solution in the documentation for the node client. Hopefully my alternate solution helps someone.
const exec = require('child-process-promise').exec;
var create_vm = (zone, vmname, templatename) => {
const cmd = `gcloud compute instances create ${vmname} ` +
`--zone=${zone} ` +
`--source-instance-template=${templatename} `;
return exec(cmd);
};
create_vm('us-central1-c', 'my-instance', 'whatever')
.then(console.log)
.catch(console.error);
You can customize this as far as gcloud lets you; the docs/options for creating an instance are here. Extra flags can be appended as sketched below.
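For example, a hypothetical variant that passes extra flags through to gcloud (the --machine-type and --labels values are only illustrations):
var create_vm_custom = (zone, vmname, templatename, extraFlags = '') => {
  const cmd = `gcloud compute instances create ${vmname} ` +
    `--zone=${zone} ` +
    `--source-instance-template=${templatename} ` +
    extraFlags; // e.g. '--machine-type=n1-standard-2 --labels=env=dev'
  return exec(cmd);
};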
Hi, I am trying to run Mocha and Chai tests for my Node.js router, which saves a user in a MySQL database and then returns the same data back.
The problem I am facing is that I do not want to save the information in the database when I run the tests locally, and when I use a continuous integration service like Travis CI the tests fail because there is no database connection. I would like to know how I can test the save logic without actually writing to the database.
Basically, I want a fake (virtual) database to save to, or to fake the save operation entirely.
I read that Sinon.js can help, but I am not sure how to use it.
Here is my code
var expect = require('chai').expect;
var faker = require('faker');
const request = require('supertest');
const should = require('should');
const sinon = require('sinon');
const helper = require('../../models/user');
const app = require('../../app'); // path to the Express app is assumed
describe('POST /saveUser',()=>{
it('should save a new user',(done)=>{
var fake =
request(app)
.post('/saveUser')
.send({
Owner : faker.random.number(),
firstname : faker.name.firstName(),
lastname : faker.name.lastName(),
email:faker.internet.email(),
password : faker.random.number(),
token : faker.random.uuid()
})
.expect(200)
.expect((res)=>{
expect(res.body.firstname).to.be.a("string");
expect(res.body.lastname).to.be.a("string");
expect(res.body.Owner).to.be.a("number");
})
.end(done);
});
});
This is the router
router.post('/saveUser',(req,res,next)=>{
saveUser(req.body).then((result)=>{
return res.send(req.body);
}).catch((e)=>{
return res.send('All info not saved');
});
});
And here is the model
saveUser = (userinfo) => new Promise((resolve,reject)=>{
db.query('INSERT INTO user SET ?',userinfo,function(error,results,fields){
if(error){
reject();
}else{
resolve(userinfo);
}
})
});
What you are describing is a stub. With sinon you can stub methods and call fake methods instead like this:
sinon.stub(/* module that implements saveUser */, 'saveUser').callsFake(() => Promise.resolve());
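A minimal sketch of how this could fit into the test file from the question, assuming the model actually exports saveUser (as written, saveUser is assigned without module.exports, so it would need to be exported first) and that the router calls it through that same module:
describe('POST /saveUser', () => {
  let saveUserStub;

  beforeEach(() => {
    // helper is the ../../models/user module already required at the top of the test file.
    // Replace the real DB call with a fake that resolves with the same user info.
    saveUserStub = sinon.stub(helper, 'saveUser').callsFake((userinfo) => Promise.resolve(userinfo));
  });

  afterEach(() => {
    // Restore the original implementation after each test
    saveUserStub.restore();
  });

  // ...the supertest request from the question goes here; nothing is written
  // to MySQL because the stub intercepts the call.
});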