Geth "invalid argument 0: json: cannot unmarshal non-string into Go value of type common.Address" error - ethereum

I run the following request against Geth:
POST http://localhost:8545
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "eth_sign",
  "params": [
    {
      "from": "0xDc89096c0E279933a7C99e13f1474A0a84320207",
      "gas": "0x55555",
      "maxFeePerGas": "0x1234",
      "maxPriorityFeePerGas": "0x1234",
      "input": "",
      "nonce": "0x0",
      "to": "0xCdde93eEDC6c911D33E7fe41DC05208E0630a4c0",
      "value": "0x1234"
    }
  ]
}
but get the following error:
{
  "jsonrpc": "2.0",
  "id": 1,
  "error": {
    "code": -32602,
    "message": "invalid argument 0: json: cannot unmarshal non-string into Go value of type common.Address"
  }
}
Geth Version: Geth/v1.10.26-stable-e5eb32ac/windows-amd64/go1.18.5
What can be the reason?

eth_sign is meant for signing arbitrary messages - not transactions - and takes 2 params:
The address of the signer. It assumes that the signer has an unlocked account on the node (i.e. the node knows the private key from which this address is derived).
The message to sign
Based on the params that you're passing, it seems that you want to sign a transaction instead. For that, there's the eth_signTransaction method.
Again, it succeeds only if the node knows the private key to the from address, and the account holding the private key is unlocked on the node.
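For comparison, a correct eth_sign call would pass just those two params, e.g. "params": ["0xDc89096c0E279933a7C99e13f1474A0a84320207", "0xdeadbeef"]. And here is a minimal sketch of your request rewritten for eth_signTransaction - the same transaction object, only the method changes (note "input" is given as "0x", since Geth expects hex fields to be 0x-prefixed):
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "eth_signTransaction",
  "params": [
    {
      "from": "0xDc89096c0E279933a7C99e13f1474A0a84320207",
      "gas": "0x55555",
      "maxFeePerGas": "0x1234",
      "maxPriorityFeePerGas": "0x1234",
      "input": "0x",
      "nonce": "0x0",
      "to": "0xCdde93eEDC6c911D33E7fe41DC05208E0630a4c0",
      "value": "0x1234"
    }
  ]
}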

Related

Step Functions: Passing in JSON to RequestBody to API Gateway via POST method

So I have a JSON that is received by another API Gateway that is invoked.
I want to pass this JSON to another Task that invokes another API Gateway. I tried to include it in the RequestBody via the $ identifier, but it literally sends the "$" instead of the JSON. Attempting to add ResultPath or InputPath on this Task throws an error.
{
  "ApiEndpoint": "fasdffasd.execute-api.us-east-1.amazonaws.com",
  "Method": "POST",
  "Headers": {
    "Content-Type": [
      "application/json"
    ]
  },
  "Stage": "uat",
  "Path": "/v1/order/create",
  "RequestBody": {
    "Payload": "$" <--- the JSON received by this Task from the previous Task
  },
  "AuthType": "IAM_ROLE"
}
The issue: checking CloudWatch Logs, I can see that literally the dollar sign is sent as a string. I expected a JSON object.
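One thing worth noting: in Step Functions, a value inside Parameters is only evaluated as a JSONPath when the key name itself ends in .$; otherwise it is passed through as a literal string. A minimal sketch of the RequestBody using that convention (the rest of the Task state unchanged):
  "RequestBody": {
    "Payload.$": "$"
  },
With the key renamed to "Payload.$", the state machine substitutes the Task's entire input for $ instead of sending the literal dollar sign.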

FluxMonitor locally: FROM address in transaction is wrong

I'm trying to run decentralized-model locally. I've managed to deploy:
Link contract
AggregatorProxy
FluxAggregator
Consumer contract
Oracle node (offchain)
External adapters (coingecko + coinapi)
I'm mainly struggling with the last piece, which is creating a Job that uses the FluxMonitor initiator.
I've created the following job, where "0x5379A65A620aEb405C5C5338bA1767AcB48d6750" is the address of the FluxAggregator contract:
{
  "initiators": [
    {
      "type": "fluxmonitor",
      "params": {
        "address": "0x5379A65A620aEb405C5C5338bA1767AcB48d6750",
        "requestData": {
          "data": {
            "from": "ETH",
            "to": "USD"
          }
        },
        "feeds": [
          {
            "bridge": "coinapi_cl_ea"
          },
          {
            "bridge": "coingecko_cl_ea"
          }
        ],
        "threshold": 1,
        "absoluteThreshold": 1,
        "precision": 8,
        "pollTimer": {
          "period": "15m0s"
        },
        "idleTimer": {
          "duration": "1h0m0s"
        }
      }
    }
  ],
  "tasks": [
    {
      "type": "NoOp"
    }
  ]
}
Unfortunately, it doesn't work; it makes my local Ganache fail with this error: "Error: The nonce generation function failed, or the private key was invalid"
I've put my Ganache in debug mode in order to log requests to the blockchain, and noticed the following call:
eth_call
{
  "jsonrpc": "2.0",
  "id": 28,
  "method": "eth_call",
  "params": [
    {
      "data": "0xfeaf968c",
      "from": "0x0000000000000000000000000000000000000000",
      "to": "0x5379a65a620aeb405c5c5338ba1767acb48d6750"
    },
    "latest"
  ]
}
The signature of the function is correct:
"latestRoundData()": "feaf968c"
However, what seems weird is that the from address is "0x0". Any idea why my Oracle node doesn't use its key to sign the transaction?
Thanks a lot.
The problem comes from Ganache. In fact, I wrote a Truffle script which:
calls "latestRoundData()" populating the "FROM" with a valid address
calls "latestRoundData()" populating the "FROM" with a 0x0 address
Then I ran the script 2 times:
Connecting to Ganache-cli --> 1st call is successful while the 2nd call fails
Connecting to Kovan testnet --> both calls are successful
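For reference, a sketch of what such a script could look like (hypothetical file, run with truffle exec; web3 is injected by Truffle):
// compare-from.js: call latestRoundData() with a valid and a zero FROM address
module.exports = async function (callback) {
  try {
    const aggregator = "0x5379A65A620aEb405C5C5338bA1767AcB48d6750";
    const data = "0xfeaf968c"; // selector for latestRoundData()
    const [valid] = await web3.eth.getAccounts();
    // 1st call: valid FROM address - succeeds on both Ganache and Kovan
    console.log(await web3.eth.call({ from: valid, to: aggregator, data }));
    // 2nd call: zero FROM address - fails on ganache-cli only
    const zero = "0x0000000000000000000000000000000000000000";
    console.log(await web3.eth.call({ from: zero, to: aggregator, data }));
    callback();
  } catch (err) {
    callback(err);
  }
};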
I've just opened an issue for ganache-cli team: https://github.com/trufflesuite/ganache-cli/issues/840

Jenkins Pipeline Error while trying to send JSON Payload after post success

I'm trying to send a custom JSON payload from my pipeline on Jenkins after the last successful stage, like this:
post {
  success {
    script {
      def payload = """
      {
        "type": "AdaptiveCard",
        "body": [
          {
            "type": "TextBlock",
            "size": "Medium",
            "weight": "Bolder",
            "text": "SonarQube report from Jenkins Pipeline"
          },
          {
            "type": "TextBlock",
            "text": "Code was analyzed was successfully.",
            "wrap": true,
            "color": "Good",
            "weight": "Bolder"
          }
        ],
        "$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
        "version": "1.3"
      }"""
      httpRequest httpMode: 'POST',
        acceptType: 'APPLICATION_JSON',
        contentType: 'APPLICATION_JSON',
        url: "URL",
        requestBody: payload
    }
  }
}
But I get an error:
Error when executing success post condition:
groovy.lang.MissingPropertyException: No such property: schema for class: groovy.lang.Binding
at groovy.lang.Binding.getVariable(Binding.java:63)
I'm using the HTTP Request plugin available for Jenkins and the format of the JSON payload is correct for MS Teams.
The issue is actually a Groovy syntax error. You can easily check this in something like https://groovy-playground.appspot.com/ by adding your def payload = ... statement.
There are multiple ways to get multiline strings in groovy:
triple single quoted string
triple double quoted string
slashy string
dollar slashy string
Apart from the triple single quoted string, these also have a secondary property, which is interpolation.
Notice how in the initial JSON payload, there's a "$schema" key? Using a triple double quoted string makes Groovy want to find a schema variable and use its value to construct that payload variable.
You have two separate solutions:
Use triple single quoted string - just update """ to '''
Escape the variable - just update "$schema" to "\$schema" (making $ a literal $ instead of it being used as an interpolation prefix)
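A minimal sketch of both fixes, with the payload trimmed to the relevant key:
// Fix 1: triple single quoted string - no interpolation, "$schema" stays literal
def payload = '''
{
  "$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
  "version": "1.3"
}'''
// Fix 2: keep the triple double quoted string but escape the dollar sign
def payload2 = """
{
  "\$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
  "version": "1.3"
}"""
Both produce the same literal JSON text, so the httpRequest call itself does not need to change.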

Replace and add json within another json

I have a Main.json file.
{
  "swagger": "2.0",
  "paths": {
    "/agents/delta": {
      "get": {
        "description": "lorem ipsum doram",
        "operationId": "getagentdelta",
        "summary": "GetAgentDelta",
        "tags": [
          "Agents"
        ],
        "parameters": [
          {
            "name": "since",
            "in": "query",
            "description": "Format - date-time (as date-time in RFC3339). The time from which you need changes from. You should use the format emitted by Date's toJSON method (for example, 2017-04-23T18:25:43.511Z). If a timestamp older than a week is passed, a business rule violation will be thrown which will require the client to change the from date. As a best-practice, for a subsequent call to this method, send the timestamp when you <b>started</b> the previous delta call (instead of when you completed processing the response or the max of the lastUpdateOn timestamps of the returned records). This will ensure that you do not miss any changes that occurred while you are processing the response from this method",
            "required": true,
            "type": "string"
          }
        ]
      }
    }
  }
}
And I have a smaller JSON file.
{
  "name": "Authorization",
  "description": "This parameter represents the Authorization token obtained from the OKTA Authorization server. It is the Bearer token provided to authorize the consumer. Usage Authorization : Bearer token",
  "in": "header",
  "required": true,
  "type": "string"
}
Now I need to add the contents of the smaller JSON file into the Main.json file, in the parameters array.
I tried the below command:
cat test.json | jq --argfile sub Sub.json '.paths./agents/delta.get.parameters[ ] += $sub.{}' > test1.json
But I get the below error:
jq: error: syntax error, unexpected '{', expecting FORMAT or QQSTRING_START (Unix shell quoting issues?) at <top-level>, line 1:
.paths += $sub.{}
jq: 1 compile error
cat: write error: Broken pipe
I tried this command.
cat test.json | jq '.paths./agents/delta.get.parameters[ ] | = (.+ [{ "name": "Authorization", "description": "This parameter represents the Authorization token obtained from the OKTA Authorization server. It is the Bearer token provided to authorize the consumer. Usage Authorization : Bearer token", "in": "header", "required": true, "type": "string" }] )' > test1.json
And I get no error and no output either. How do I get around this?
I would have to add the contents of the smaller JSON file directly first. Then, at a later stage, I'd search whether it already had name: Authorization and its other parameters, and remove and replace the whole name: Authorization piece with the actual contents of the smaller JSON, under each path that starts with '/xx/yyy'.
Edited to add:
For the last part of the question, I could not use the walk function, since I have jq 1.5, and since I am using the bash task within Azure DevOps, I can't update the jq installation with the walk function.
Meanwhile, I found something similar to a wildcard in jq, and was wondering why I can't use it this way:
jq --slurpfile newval auth.json '.paths | .. | objects | .get.parameters += $newval' test.json > test1.json
Can anyone please point out the issue in the above command? It did not work, and I am not sure why.
You want --slurpfile, and you need to quote the /agents/delta part of the path:
$ jq --slurpfile newval insert.json '.paths."/agents/delta".get.parameters += $newval' main.json
{
  "swagger": "2.0",
  "paths": {
    "/agents/delta": {
      "get": {
        "description": "lorem ipsum doram",
        "operationId": "getagentdelta",
        "summary": "GetAgentDelta",
        "tags": [
          "Agents"
        ],
        "parameters": [
          {
            "name": "since",
            "in": "query",
            "description": "Format - date-time (as date-time in RFC3339). The time from which you need changes from. You should use the format emitted by Date's toJSON method (for example, 2017-04-23T18:25:43.511Z). If a timestamp older than a week is passed, a business rule violation will be thrown which will require the client to change the from date. As a best-practice, for a subsequent call to this method, send the timestamp when you <b>started</b> the previous delta call (instead of when you completed processing the response or the max of the lastUpdateOn timestamps of the returned records). This will ensure that you do not miss any changes that occurred while you are processing the response from this method",
            "required": true,
            "type": "string"
          },
          {
            "name": "Authorization",
            "description": "This parameter represents the Authorization token obtained from the OKTA Authorization server. It is the Bearer token provided to authorize the consumer. Usage Authorization : Bearer token",
            "in": "header",
            "required": true,
            "type": "string"
          }
        ]
      }
    }
  }
}
And here's one that first removes any existing Authorization objects from the parameters before inserting the new one into every parameters array, and doesn't depend on the exact path:
jq --slurpfile newval add.json '.paths |= walk(
  if type == "object" and has("parameters") then
    .parameters |= map(select(.name != "Authorization")) + $newval
  else
    .
  end)' main.json
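Note that walk only became a builtin in jq 1.6. On jq 1.5 (the version mentioned above), you can prepend its standard definition to the filter instead of updating jq:
jq --slurpfile newval add.json '
def walk(f): . as $in
  | if type == "object" then
      reduce keys[] as $key ({}; . + {($key): ($in[$key] | walk(f))}) | f
    elif type == "array" then map(walk(f)) | f
    else f
    end;
.paths |= walk(
  if type == "object" and has("parameters") then
    .parameters |= map(select(.name != "Authorization")) + $newval
  else
    .
  end)' main.json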

RabbitMQ REST HTTP: Send integer values in JSON payload

I am trying to send a message directly to an Exchange using the REST endpoint:
/api/exchanges/vhost/name/publish
The sample payload I am using is:
{
  "properties": {
    "timestamp": 1536959503,
    "message_id": "100",
    "correlation_id": " ",
    "priority": 0,
    "delivery_mode": 2,
    "headers": {
      "counter": 0,
      "content-type": "application/xml",
      "correlation-id": " ",
      "message-id": 100,
      "message-type": "message1",
      "status": "P"
    },
    "content_type": "application/xml"
  },
  "routing_key": "p.ee.pp.rr",
  "payload": "sample",
  "payload_encoding": "string"
}
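For context, a sketch of how such a payload could be posted to the management API with curl (assuming the default guest credentials, the default vhost "/" encoded as %2F, and a hypothetical exchange name):
curl -u guest:guest -H "content-type: application/json" \
  -X POST http://localhost:15672/api/exchanges/%2F/my-exchange/publish \
  -d @payload.json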
In this, the only numerical values in the headers are message-id and counter.
When I receive the message from the queue into a headers map of type <String, Object>, the data type for the numerical values is Long.
My code casts message-id to long and counter to Integer. I get a class cast exception when doing the latter.
I am interested in knowing:
Who assigns the Long data type to the numerical values? At which stage does it get assigned (in the flow from the REST endpoint to the queue)?
Is Long the default type when reading numbers from JSON?
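Whichever layer ends up choosing the type, a defensive pattern on the consuming side is to cast numeric headers through Number rather than a concrete class; a sketch, assuming a plain Map<String, Object> of headers as described above:
import java.util.Map;

final class HeaderValues {
    // Number covers Integer, Long, BigDecimal, etc., so these casts work
    // regardless of which concrete type the JSON/AMQP layer chose.
    static int counterOf(Map<String, Object> headers) {
        return ((Number) headers.get("counter")).intValue();
    }

    static long messageIdOf(Map<String, Object> headers) {
        return ((Number) headers.get("message-id")).longValue();
    }
}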