Can't find minting transaction performed in Truffle - ethereum

When running an ERC721 mint transaction against Matic's Mumbai testnet, I get back a transaction hash.
However, I'm not able to view this transaction on the explorer. Did it actually happen?
truffle(develop)> instance.mint(accounts[0], 'https://ipfs.io/ipfs/Qmd9MCGtdVz2miNumBHDbvj8bigSgTwnr4SbyH6DNnpWdt?filename=1-PUG.json')
{ tx: '0x5eb16f6627ce7c827e33b58d8171c0be33992155abdf0db43f688677c39d5616',
receipt: {
transactionHash: '0x5eb16f6627ce7c827e33b58d8171c0be33992155abdf0db43f688677c39d5616',
transactionIndex: 0,
blockHash: '0x33d49e2bebc505241d03d3818ab0bb78446e915d317724b0636399c2b629a903',
blockNumber: 3,
from: '0x53496208528877b81d48e58fc3c669bb905ecc77',
to: '0x242cf7ed99e0aef8980678d0bd034808d22c77fd',
gasUsed: 23140,
cumulativeGasUsed: 23140,
contractAddress: null,
logs: [],
status: true,
logsBloom: '0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000',
rawLogs: [] }, logs: [] }
Page not found: https://explorer-mumbai.maticvigil.com/search?q=0x5eb16f6627ce7c827e33b58d8171c0be33992155abdf0db43f688677c39d5616
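For reference, if the goal is to hit Mumbai rather than the local truffle(develop) chain, Truffle needs a dedicated network entry and the console has to be started with --network mumbai. A minimal sketch of such an entry, assuming @truffle/hdwallet-provider and a public Mumbai RPC endpoint (mnemonic handling and endpoint are illustrative, not taken from the question):

// truffle-config.js (sketch) - a "mumbai" network entry; values are placeholders
const HDWalletProvider = require('@truffle/hdwallet-provider');
const mnemonic = process.env.MNEMONIC; // assumption: mnemonic supplied via an env var

module.exports = {
  networks: {
    mumbai: {
      provider: () => new HDWalletProvider(mnemonic, 'https://rpc-mumbai.maticvigil.com'),
      network_id: 80001,   // Mumbai chain id
      confirmations: 2,
      skipDryRun: true
    }
  }
};

The console would then be started with truffle console --network mumbai instead of truffle develop, so the mint is actually broadcast to Mumbai and becomes visible on the explorer.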


Why is Klarna returning a 204 no content response when authorizing a purchase?

I'm following Klarna's direct integration documentation with the Klarna JavaScript SDK. I successfully create a payment session, and the Klarna widget loads without issues as well.
However, I get a 204 No Content response when authorizing the purchase, instead of an authorization token or error code. Without the token I can't complete the purchase. This occurs with Klarna's example request, which I've been using below. The authorize call also doesn't appear in the merchant portal logs, while the create session call does.
Klarna.Payments.authorize({
  payment_method_category: 'pay_over_time'
}, {
  purchase_country: "GB",
  purchase_currency: "GBP",
  locale: "en-GB",
  billing_address: {
    given_name: "John",
    family_name: "Doe",
    email: "john@doe.com",
    title: "Mr",
    street_address: "13 New Burlington St",
    street_address2: "Apt 214",
    postal_code: "W13 3BG",
    city: "London",
    region: "",
    phone: "01895808221",
    country: "GB"
  }
}, function (res) {
  console.debug(res);
})
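For context, the session setup that precedes the authorize call follows Klarna's documented init/load flow. A minimal sketch of that flow (the container id and the way the client_token is obtained are assumptions, not taken from the question):

// Sketch of the Klarna Payments flow that precedes authorize.
Klarna.Payments.init({
  client_token: clientTokenFromCreateSession // assumption: token returned by the create-session call
});

Klarna.Payments.load({
  container: '#klarna-payments-container',   // assumption: id of the widget container element
  payment_method_category: 'pay_over_time'
}, function (res) {
  console.debug('load result:', res);
  // authorize is only expected to succeed after load reports show_form: true
});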

FluxMonitor locally: FROM address in transaction is wrong

I'm trying to run decentralized-model locally. I've managed to deploy:
Link contract
AggregatorProxy
FluxAggregator
Consumer contract
Oracle node (offchain)
External adapters (coingecko + coinapi)
I'm mainly struggling with the last piece, which is creating a Job that uses the FluxMonitor initiator.
I've created the following job, where "0x5379A65A620aEb405C5C5338bA1767AcB48d6750" is the address of the FluxAggregator contract:
{
  "initiators": [
    {
      "type": "fluxmonitor",
      "params": {
        "address": "0x5379A65A620aEb405C5C5338bA1767AcB48d6750",
        "requestData": {
          "data": {
            "from": "ETH",
            "to": "USD"
          }
        },
        "feeds": [
          {
            "bridge": "coinapi_cl_ea"
          },
          {
            "bridge": "coingecko_cl_ea"
          }
        ],
        "threshold": 1,
        "absoluteThreshold": 1,
        "precision": 8,
        "pollTimer": {
          "period": "15m0s"
        },
        "idleTimer": {
          "duration": "1h0m0s"
        }
      }
    }
  ],
  "tasks": [
    {
      "type": "NoOp"
    }
  ]
}
Unfortunately, it doesn't work; it makes my local Ganache fail with this error: "Error: The nonce generation function failed, or the private key was invalid".
I've put my Ganache in debug mode in order to log requests to the blockchain and noticed the following eth_call:
{
  "jsonrpc": "2.0",
  "id": 28,
  "method": "eth_call",
  "params": [
    {
      "data": "0xfeaf968c",
      "from": "0x0000000000000000000000000000000000000000",
      "to": "0x5379a65a620aeb405c5c5338ba1767acb48d6750"
    },
    "latest"
  ]
}
The signature of the function is correct:
"latestRoundData()": "feaf968c"
However, what seems weird is that the from address is 0x0. Any idea why my Oracle node doesn't use its key to sign the transaction?
Thanks a lot.
The problem comes from Ganache. In fact, I wrote a Truffle script (sketched below) which:
calls "latestRoundData()" populating the "FROM" with a valid address
calls "latestRoundData()" populating the "FROM" with a 0x0 address
Then I ran the script twice:
Connecting to Ganache-cli --> 1st call is successful while the 2nd call fails
Connecting to Kovan testnet --> both calls are successful
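A sketch of that comparison as a Truffle exec script (the aggregator address and selector come from the job spec above; the file name and everything else are illustrative):

// compare_from.js - run with: truffle exec compare_from.js --network <network>
const AGGREGATOR = "0x5379A65A620aEb405C5C5338bA1767AcB48d6750";
const LATEST_ROUND_DATA = "0xfeaf968c"; // selector for latestRoundData()

module.exports = async function (callback) {
  try {
    const accounts = await web3.eth.getAccounts();

    // 1) eth_call with a valid FROM address
    const withValidFrom = await web3.eth.call({
      to: AGGREGATOR,
      data: LATEST_ROUND_DATA,
      from: accounts[0],
    });
    console.log("valid from:", withValidFrom);

    // 2) eth_call with the zero FROM address (what the Oracle node sends)
    const withZeroFrom = await web3.eth.call({
      to: AGGREGATOR,
      data: LATEST_ROUND_DATA,
      from: "0x0000000000000000000000000000000000000000",
    });
    console.log("zero from:", withZeroFrom);
  } catch (err) {
    console.error(err);
  }
  callback();
};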
I've just opened an issue for ganache-cli team: https://github.com/trufflesuite/ganache-cli/issues/840

AWS step function: how to pass InputPath to OutputPath unchanged in Fargate task

I have an AWS Step Functions state machine defined using the Serverless Step Functions plugin, with three steps (FirstStep -> Worker -> EndStep -> Done):
stepFunctions:
  stateMachines:
    MyStateMachine:
      name: "MyStateMachine"
      definition:
        StartAt: FirstStep
        States:
          FirstStep:
            Type: Task
            Resource:
              Fn::GetAtt: [ FirstStep, Arn ]
            InputPath: $
            OutputPath: $
            Next: Worker
          Worker:
            Type: Task
            Resource: arn:aws:states:::ecs:runTask.sync
            InputPath: $
            OutputPath: $
            Parameters:
              Cluster: "#{EcsCluster}"
              TaskDefinition: "#{EcsTaskDefinition}"
              LaunchType: FARGATE
              Overrides:
                ContainerOverrides:
                  - Name: container-worker
                    Environment:
                      - Name: ENV_VAR_1
                        'Value.$': $.ENV_VAR_1
                      - Name: ENV_VAR_2
                        'Value.$': $.ENV_VAR_2
            Next: EndStep
          EndStep:
            Type: Task
            Resource:
              Fn::GetAtt: [ EndStep, Arn ]
            InputPath: $
            OutputPath: $
            Next: Done
          Done:
            Type: Succeed
I would like to propagate the input unchanged from the Worker step (Fargate) to EndStep, but when I inspect the step input of EndStep in the AWS Management Console, I see that the data associated with the Fargate task is passed instead:
{
"Attachments": [...],
"Attributes": [],
"AvailabilityZone": "...",
"ClusterArn": "...",
"Connectivity": "CONNECTED",
"ConnectivityAt": 1619602512349,
"Containers": [...],
"Cpu": "1024",
"CreatedAt": 1619602508374,
"DesiredStatus": "STOPPED",
"ExecutionStoppedAt": 1619602543623,
"Group": "...",
"InferenceAccelerators": [],
"LastStatus": "STOPPED",
"LaunchType": "FARGATE",
"Memory": "3072",
"Overrides": {
"ContainerOverrides": [
{
"Command": [],
"Environment": [
{
"Name": "ENV_VAR_1",
"Value": "..."
},
{
"Name": "ENV_VAR_2",
"Value": "..."
}
],
"EnvironmentFiles": [],
"Name": "container-worker",
"ResourceRequirements": []
}
],
"InferenceAcceleratorOverrides": []
},
"PlatformVersion": "1.4.0",
"PullStartedAt": 1619602522806,
"PullStoppedAt": 1619602527294,
"StartedAt": 1619602527802,
"StartedBy": "AWS Step Functions",
"StopCode": "EssentialContainerExited",
"StoppedAt": 1619602567040,
"StoppedReason": "Essential container in task exited",
"StoppingAt": 1619602553655,
"Tags": [],
"TaskArn": "...",
"TaskDefinitionArn": "...",
"Version": 5
}
Basically, if the initial input is
{
  "ENV_VAR_1": "env1",
  "ENV_VAR_2": "env2",
  "otherStuff": {
    "k1": "v1",
    "k2": "v2"
  }
}
I want it to be passed as is to FirstStep, Worker and EndStep inputs without changes.
Is this possible?
Given that you invoke the step function with an object (let's call that A), then a task's...
...InputPath specifies what part of A is handed to your task
...ResultPath specifies where in A to put the result of the task
...OutputPath specifies what part of A to hand over to the next state
Source: https://docs.aws.amazon.com/step-functions/latest/dg/input-output-example.html
So you are currently overwriting all content in A with the result of your Worker state (implicitly). If you want to discard the result of your Worker state, you have to specify:
ResultPath: null
Source: https://docs.aws.amazon.com/step-functions/latest/dg/input-output-resultpath.html#input-output-resultpath-null
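Applied to the serverless.yml above, the Worker state would look something like the following sketch (only the relevant lines are shown; whether your tooling serializes a YAML literal null correctly is worth verifying):

          Worker:
            Type: Task
            Resource: arn:aws:states:::ecs:runTask.sync
            InputPath: $
            ResultPath: null   # discard the Fargate task description returned by runTask.sync
            OutputPath: $
            Next: EndStep
            # Parameters block unchanged from the original definition

With ResultPath set to null, the state's input object is passed through to EndStep unchanged.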

Google Cloud Create Function failed with cache miss on docker_layer_cache

I am trying to create my first Cloud Function, pointing to a branch of my GitHub repo mirrored in a Google Cloud Source Repository. However, the function fails to deploy and yields the following error:
Deployment failure:
Build failed: {"cacheStats": [{"status": "MISS", "hash": "c6a7ea692f4ddc9f7088c33d4c6e337420dcdc1452daa09efb5054117e10be57", "type": "docker_layer_cache", "level": "global"}, {"status": "MISS", "hash": "c6a7ea692f4ddc9f7088c33d4c6e337420dcdc1452daa09efb5054117e10be57", "type": "docker_layer_cache", "level": "project"}]}
Here is more from the error log:
protoPayload: {
#type: "type.googleapis.com/google.cloud.audit.AuditLog"
authenticationInfo: {
principalEmail: "paul.devito@output.com"
}
methodName: "google.cloud.functions.v1.CloudFunctionsService.CreateFunction"
resourceName: "projects/second-casing-278016/locations/us-west2/functions/RetentionPredictionFunction-TESTBRANCH"
serviceName: "cloudfunctions.googleapis.com"
status: {
code: 3
message: "Build failed: {"cacheStats": [{"status": "MISS", "hash": "c6a7ea692f4ddc9f7088c33d4c6e337420dcdc1452daa09efb5054117e10be57", "type": "docker_layer_cache", "level": "global"}, {"status": "MISS", "hash": "c6a7ea692f4ddc9f7088c33d4c6e337420dcdc1452daa09efb5054117e10be57", "type": "docker_layer_cache", "level": "project"}]}"
}
}
receiveTimestamp: "2020-07-28T20:37:01.643381187Z"
And from the create function request log:
{
insertId: "fjgtmse38h4w"
logName:
operation: {
first: true
id: "operations/c2Vjb25kLWNhc2luZy0yNzgwMTYvdXMtd2VzdDIvUmV0ZW50aW9uUHJlZGljdGlvbkZ1bmN0aW9uLVRFU1RCUkFOQ0gvX3ZfNWZtU2RITHM"
producer: "cloudfunctions.googleapis.com"
}
protoPayload: {
#type: "type.googleapis.com/google.cloud.audit.AuditLog"
authenticationInfo: {…}
authorizationInfo: [1]
methodName: "google.cloud.functions.v1.CloudFunctionsService.CreateFunction"
request: {
#type: "type.googleapis.com/google.cloud.functions.v1.CreateFunctionRequest"
function: {
availableMemoryMb: 1024
entryPoint: "daily_retention_prediction_procedure"
eventTrigger: {…}
ingressSettings: "ALLOW_ALL"
labels: {
deployment-tool: "console-cloud"
}
maxInstances: 1
name:
runtime: "python37"
serviceAccountEmail:
sourceRepository: {
url:
timeout: "540s"
}
location:
}
requestMetadata: {…}
resourceLocation: {…}
resourceName:
serviceName: "cloudfunctions.googleapis.com"
}
receiveTimestamp: "2020-07-28T20:34:08.301085522Z"
resource: {
labels: {…}
type: "cloud_function"
}
severity: "NOTICE"
timestamp: "2020-07-28T20:34:07.489Z"
}
The only other instance of this error I can find online is a Google outage, but the status page shows all systems operational. What can I do to debug this?
Solved!
After checking the Python code example for the Pub/Sub trigger, I realized I was missing the required arguments on the entry function:
def hello_pubsub(event, context):
...
https://github.com/GoogleCloudPlatform/python-docs-samples/blob/master/functions/helloworld/main.py
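For reference, a minimal sketch of a Pub/Sub-triggered entry point with both required parameters (the function name matches the entryPoint shown in the log above; the body is purely illustrative):

import base64

def daily_retention_prediction_procedure(event, context):
    """Background Cloud Function triggered by Pub/Sub.

    Both parameters are required by the runtime even if unused:
    event carries the Pub/Sub message, context carries event metadata.
    """
    data = event.get("data")
    message = base64.b64decode(data).decode("utf-8") if data else ""
    print(f"Received message: {message}")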

Creating google cloud build triggers using gcloud alpha commands

I am attempting to create a Google Cloud Build trigger for a GitHub repository using gcloud alpha builds triggers create github. I have created a BuildTrigger JSON file containing my trigger config. My end goal is to create a trigger for pull requests in GitHub repositories, which is a small change to the JSON file.
My json file:
committrigger.json
{
  "description": "Commit to master branch",
  "name": "Master-Commit",
  "tags": ["test-flask-server"],
  "github": {
    "owner": "zamerman",
    "name": "github_zamerman_test-flask-server",
    "push": {
      "branch": "master"
    }
  },
  "disabled": false,
  "filename": "/cloudbuild.yaml"
}
My command to create the trigger: gcloud alpha builds triggers create github --trigger-config committrigger.json
The error I am getting:
ERROR: (gcloud.alpha.builds.triggers.create.github) FAILED_PRECONDITION: Repository mapping does not
exist. Please visit https://console.cloud.google.com/cloud-build/triggers/connect?project=429437829619
to connect a repository to your project
Help or advice of any kind would be welcome. Thank you.
gcloud alpha builds triggers create github --trigger-config committrigger.json --verbosity=debug:
DEBUG: Running [gcloud.alpha.builds.triggers.create.github] with arguments: [--trigger-config: "committrigger.json", --verbosity: "debug"]
DEBUG: (gcloud.alpha.builds.triggers.create.github) FAILED_PRECONDITION: Repository mapping does not exist. Please visit https://console.cloud.google.com/cloud-build/triggers/connect?project=429437829619 to connect a repository to your project
Traceback (most recent call last):
File "/usr/lib/google-cloud-sdk/lib/googlecloudsdk/calliope/cli.py", line 983, in Execute
resources = calliope_command.Run(cli=self, args=args)
File "/usr/lib/google-cloud-sdk/lib/googlecloudsdk/calliope/backend.py", line 784, in Run
resources = command_instance.Run(args)
File "/usr/lib/google-cloud-sdk/lib/surface/builds/triggers/create/github.py", line 169, in Run
buildTrigger=trigger, projectId=project))
File "/usr/lib/google-cloud-sdk/lib/googlecloudsdk/third_party/apis/cloudbuild/v1/cloudbuild_v1_client.py", line 353, in Create
config, request, global_params=global_params)
File "/usr/bin/../lib/google-cloud-sdk/lib/third_party/apitools/base/py/base_api.py", line 731, in _RunMethod
return self.ProcessHttpResponse(method_config, http_response, request)
File "/usr/bin/../lib/google-cloud-sdk/lib/third_party/apitools/base/py/base_api.py", line 737, in ProcessHttpResponse
self.__ProcessHttpResponse(method_config, http_response, request))
File "/usr/bin/../lib/google-cloud-sdk/lib/third_party/apitools/base/py/base_api.py", line 604, in __ProcessHttpResponse
http_response, method_config=method_config, request=request)
HttpBadRequestError: HttpError accessing <https://cloudbuild.googleapis.com/v1/projects/gcplabzamerman/triggers?alt=json>: response: <{'status': '400', 'content-length': '263', 'x-xss-protection': '0', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'vary': 'Origin, X-Origin, Referer', 'server': 'ESF', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Wed, 25 Sep 2019 22:45:52 GMT', 'x-frame-options': 'SAMEORIGIN', 'alt-svc': 'quic=":443"; ma=2592000; v="46,43",h3-Q046=":443"; ma=2592000,h3-Q043=":443"; ma=2592000', 'content-type': 'application/json; charset=UTF-8'}>, content <{
"error": {
"code": 400,
"message": "Repository mapping does not exist. Please visit https://console.cloud.google.com/cloud-build/triggers/connect?project=429437829619 to connect a repository to your project",
"status": "FAILED_PRECONDITION"
}
}
>
ERROR: (gcloud.alpha.builds.triggers.create.github) FAILED_PRECONDITION: Repository mapping does not exist. Please visit https://console.cloud.google.com/cloud-build/triggers/connect?project=429437829619 to connect a repository to your project
Their GitHub integration is not working at the moment. You can instead use the triggerTemplate key to specify the Cloud Source Repository that is connected to your GitHub repo.
Update your committrigger.json to something along the lines of:
{
  "description": "Commit to master branch",
  "name": "Master-Commit",
  "tags": ["test-flask-server"],
  "triggerTemplate": {
    "projectId": "your project id",
    "repoName": "github_zamerman_github_zamerman_test-flask-server",
    "dir": "./",
    "branchName": "master"
  },
  "disabled": false,
  "filename": "/cloudbuild.yaml"
}
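As for the pull-request goal mentioned at the top of the question: once the repository is connected, the github block of a BuildTrigger accepts a pullRequest key in place of push. A hedged sketch (field names per the public BuildTrigger API; the branch value is a regex and the trigger name is illustrative):

{
  "description": "Pull request to master",
  "name": "Master-PR",
  "tags": ["test-flask-server"],
  "github": {
    "owner": "zamerman",
    "name": "github_zamerman_test-flask-server",
    "pullRequest": {
      "branch": "^master$"
    }
  },
  "disabled": false,
  "filename": "/cloudbuild.yaml"
}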