I am trying to create my first Cloud Function, pointing at a branch of my GitHub repo that is mirrored to a Google Cloud Source Repository. However, the function fails to deploy and yields the following error:
Deployment failure:
Build failed: {"cacheStats": [{"status": "MISS", "hash": "c6a7ea692f4ddc9f7088c33d4c6e337420dcdc1452daa09efb5054117e10be57", "type": "docker_layer_cache", "level": "global"}, {"status": "MISS", "hash": "c6a7ea692f4ddc9f7088c33d4c6e337420dcdc1452daa09efb5054117e10be57", "type": "docker_layer_cache", "level": "project"}]}
Here is more from the error log:
protoPayload: {
#type: "type.googleapis.com/google.cloud.audit.AuditLog"
authenticationInfo: {
principalEmail: "paul.devito#output.com"
}
methodName: "google.cloud.functions.v1.CloudFunctionsService.CreateFunction"
resourceName: "projects/second-casing-278016/locations/us-west2/functions/RetentionPredictionFunction-TESTBRANCH"
serviceName: "cloudfunctions.googleapis.com"
status: {
code: 3
message: "Build failed: {"cacheStats": [{"status": "MISS", "hash": "c6a7ea692f4ddc9f7088c33d4c6e337420dcdc1452daa09efb5054117e10be57", "type": "docker_layer_cache", "level": "global"}, {"status": "MISS", "hash": "c6a7ea692f4ddc9f7088c33d4c6e337420dcdc1452daa09efb5054117e10be57", "type": "docker_layer_cache", "level": "project"}]}"
}
}
receiveTimestamp: "2020-07-28T20:37:01.643381187Z"
And from the create function request log:
{
insertId: "fjgtmse38h4w"
logName:
operation: {
first: true
id: "operations/c2Vjb25kLWNhc2luZy0yNzgwMTYvdXMtd2VzdDIvUmV0ZW50aW9uUHJlZGljdGlvbkZ1bmN0aW9uLVRFU1RCUkFOQ0gvX3ZfNWZtU2RITHM"
producer: "cloudfunctions.googleapis.com"
}
protoPayload: {
#type: "type.googleapis.com/google.cloud.audit.AuditLog"
authenticationInfo: {…}
authorizationInfo: [1]
methodName: "google.cloud.functions.v1.CloudFunctionsService.CreateFunction"
request: {
#type: "type.googleapis.com/google.cloud.functions.v1.CreateFunctionRequest"
function: {
availableMemoryMb: 1024
entryPoint: "daily_retention_prediction_procedure"
eventTrigger: {…}
ingressSettings: "ALLOW_ALL"
labels: {
deployment-tool: "console-cloud"
}
maxInstances: 1
name:
runtime: "python37"
serviceAccountEmail:
sourceRepository: {
url:
timeout: "540s"
}
location:
}
requestMetadata: {…}
resourceLocation: {…}
resourceName:
serviceName: "cloudfunctions.googleapis.com"
}
receiveTimestamp: "2020-07-28T20:34:08.301085522Z"
resource: {
labels: {…}
type: "cloud_function"
}
severity: "NOTICE"
timestamp: "2020-07-28T20:34:07.489Z"
}
The only other instance of this error I can find online is a Google outage, but the status page shows all systems operational. What can I do to debug this?
Solved!
After checking the Python code example for the Pub/Sub trigger, I realized I was missing the required parameters on the entry-point function:
def hello_pubsub(event, context):
...
https://github.com/GoogleCloudPlatform/python-docs-samples/blob/master/functions/helloworld/main.py
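For reference, a minimal sketch of the fixed entry point for this function; only the (event, context) signature comes from the sample, the docstring and body here are placeholders:

def daily_retention_prediction_procedure(event, context):
    """Background Cloud Function triggered by a Pub/Sub message.

    Args:
        event (dict): the Pub/Sub message payload.
        context (google.cloud.functions.Context): metadata for the event.
    """
    # ... run the daily retention prediction here
    pass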
I used this code to convert JSON data from the YARN REST API into Prometheus metrics:
https://github.com/prometheus-community/json_exporter.
However, it printed errors:
level=error ts=2021-07-08T06:31:03.712Z caller=collector.go:83 msg="Failed to extract value for metric" path={.capacity} err="capacity is not found" metric="Desc{fqName: "queues_capacity", help: "information on queues", constLabels: {}, variableLabels: [type]}"
I was wondering whether something is wrong in my YAML configuration (for example, in how I handle the nested JSON) or whether the problem is in the code itself.
My YAML config is:
metrics:
- name: queues
  path: "{ .scheduler.schedulerInfo.queues.queue }"
  help: information on queues
  type: object
  labels:
    type: '{.type}'
  values:
    capacity: '{.capacity}'
And part of the JSON file is:
{
  "scheduler": {
    "schedulerInfo": {
      "type": "capacityScheduler",
      "capacity": 100,
      "usedCapacity": 1.0526316,
      "maxCapacity": 100,
      "queueName": "root",
      "queues": {
        "queue": [
          {
            "type": "capacitySchedulerLeafQueueInfo",
            "capacity": 10,
            "usedCapacity": 10.526316,
            "maxCapacity": 100,
Use [*] to get all the objects first, so it should be:
path: "{ .scheduler.schedulerInfo.queues.queue[*] }"
According to this: https://kubernetes.io/docs/reference/kubectl/jsonpath/
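Applied to the config from the question, that gives (a sketch; only the path line changes):

metrics:
- name: queues
  path: "{ .scheduler.schedulerInfo.queues.queue[*] }"
  help: information on queues
  type: object
  labels:
    type: '{.type}'
  values:
    capacity: '{.capacity}'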
I have an AWS Step Functions state machine defined using this Serverless plugin, with three steps (FirstStep -> Worker -> EndStep -> Done):
stepFunctions:
  stateMachines:
    MyStateMachine:
      name: "MyStateMachine"
      definition:
        StartAt: FirstStep
        States:
          FirstStep:
            Type: Task
            Resource:
              Fn::GetAtt: [ FirstStep, Arn ]
            InputPath: $
            OutputPath: $
            Next: Worker
          Worker:
            Type: Task
            Resource: arn:aws:states:::ecs:runTask.sync
            InputPath: $
            OutputPath: $
            Parameters:
              Cluster: "#{EcsCluster}"
              TaskDefinition: "#{EcsTaskDefinition}"
              LaunchType: FARGATE
              Overrides:
                ContainerOverrides:
                  - Name: container-worker
                    Environment:
                      - Name: ENV_VAR_1
                        'Value.$': $.ENV_VAR_1
                      - Name: ENV_VAR_2
                        'Value.$': $.ENV_VAR_2
            Next: EndStep
          EndStep:
            Type: Task
            Resource:
              Fn::GetAtt: [ EndStep, Arn ]
            InputPath: $
            OutputPath: $
            Next: Done
          Done:
            Type: Succeed
I would like to propagate the input unchanged from the Worker step (Fargate) to EndStep, but when I inspect the step input of EndStep in the AWS Management Console, I see that the data associated with the Fargate task is passed instead:
{
"Attachments": [...],
"Attributes": [],
"AvailabilityZone": "...",
"ClusterArn": "...",
"Connectivity": "CONNECTED",
"ConnectivityAt": 1619602512349,
"Containers": [...],
"Cpu": "1024",
"CreatedAt": 1619602508374,
"DesiredStatus": "STOPPED",
"ExecutionStoppedAt": 1619602543623,
"Group": "...",
"InferenceAccelerators": [],
"LastStatus": "STOPPED",
"LaunchType": "FARGATE",
"Memory": "3072",
"Overrides": {
"ContainerOverrides": [
{
"Command": [],
"Environment": [
{
"Name": "ENV_VAR_1",
"Value": "..."
},
{
"Name": "ENV_VAR_2",
"Value": "..."
}
],
"EnvironmentFiles": [],
"Name": "container-worker",
"ResourceRequirements": []
}
],
"InferenceAcceleratorOverrides": []
},
"PlatformVersion": "1.4.0",
"PullStartedAt": 1619602522806,
"PullStoppedAt": 1619602527294,
"StartedAt": 1619602527802,
"StartedBy": "AWS Step Functions",
"StopCode": "EssentialContainerExited",
"StoppedAt": 1619602567040,
"StoppedReason": "Essential container in task exited",
"StoppingAt": 1619602553655,
"Tags": [],
"TaskArn": "...",
"TaskDefinitionArn": "...",
"Version": 5
}
Basically, if the initial input is
{
  "ENV_VAR_1": "env1",
  "ENV_VAR_2": "env2",
  "otherStuff": {
    "k1": "v1",
    "k2": "v2"
  }
}
I want it to be passed as-is to the FirstStep, Worker, and EndStep inputs without changes.
Is this possible?
Given that you invoke the step function with an object (let's call that A), then a task's...
...InputPath specifies what part of A is handed to your task
...ResultPath specifies where in A to put the result of the task
...OutputPath specifies what part of A to hand over to the next state
Source: https://docs.aws.amazon.com/step-functions/latest/dg/input-output-example.html
So you are currently overwriting all content in A with the result of your Worker state (implicitly). If you want to discard the result of your Worker state, you have to specify:
ResultPath: null
Source: https://docs.aws.amazon.com/step-functions/latest/dg/input-output-resultpath.html#input-output-resultpath-null
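Applied to the Serverless definition from the question, the Worker state would then look roughly like this (a sketch; only the ResultPath line is new, everything else is unchanged):

Worker:
  Type: Task
  Resource: arn:aws:states:::ecs:runTask.sync
  InputPath: $
  # Discard the Fargate task description returned by runTask.sync so the
  # original input is passed on to EndStep unchanged.
  ResultPath: null
  OutputPath: $
  Parameters:
    Cluster: "#{EcsCluster}"
    TaskDefinition: "#{EcsTaskDefinition}"
    LaunchType: FARGATE
    Overrides:
      ContainerOverrides:
        - Name: container-worker
          Environment:
            - Name: ENV_VAR_1
              'Value.$': $.ENV_VAR_1
            - Name: ENV_VAR_2
              'Value.$': $.ENV_VAR_2
  Next: EndStep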
This is my log4j2.json file, placed inside the resources folder of my Spring project. I don't understand how I can log to both the console and a file at the same time. The console output is working, and the .html file is created, but nothing is ever written to it.
{
  "configuration": {
    "name": "Default",
    "appenders": {
      "Console": {
        "name": "STDOUT",
        "PatternLayout": {
          "pattern": "%d [%t] %-5p %c - %m%n"
        }
      },
      "RollingFile": {
        "name": "File",
        "fileName": "C:/Users/Tiziano/Desktop/LogFile/info-app.html",
        "filePattern": "C:/myppInfo-%d{MM-dd-yy-HH-mm-ss}-%i.html.gz",
        "HTMLLayout": {
          "charset": "UTF-8",
          "title": "Info Logs",
          "locationInfo": "true"
        },
        "Policies": {
          "SizeBasedTriggeringPolicy": {
            "size": "10 MB"
          }
        },
        "DefaultRolloverStrategy": {
          "max": "10"
        }
      }
    },
    "loggers": {
      "logger": {
        "level": "info",
        "appender-ref": {
          "ref": "File"
        }
      },
      "root": {
        "level": "info",
        "AppenderRef": {
          "ref": "STDOUT"
        }
      }
    }
  }
}
The error I'm getting in the Spring console:
2021-03-02 11:25:51,447 main ERROR Loggers cannot be configured without a name: arg[2](null)
2021-03-02 11:25:51,451 main ERROR Unable to invoke factory method in class org.apache.logging.log4j.core.config.LoggerConfig for element logger: org.apache.logging.log4j.core.config.ConfigurationException: Arguments given for element logger are invalid org.apache.logging.log4j.core.config.ConfigurationException: Arguments given for element logger are invalid
at org.apache.logging.log4j.core.config.plugins.util.PluginBuilder.generateParameters(PluginBuilder.java:280)
at org.apache.logging.log4j.core.config.plugins.util.PluginBuilder.build(PluginBuilder.java:135)
at org.apache.logging.log4j.core.config.AbstractConfiguration.createPluginObject(AbstractConfiguration.java:1002)
at org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:942)
at org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:934)
at org.apache.logging.log4j.core.config.AbstractConfiguration.doConfigure(AbstractConfiguration.java:552)
at org.apache.logging.log4j.core.config.AbstractConfiguration.initialize(AbstractConfiguration.java:241)
at org.apache.logging.log4j.core.config.AbstractConfiguration.start(AbstractConfiguration.java:288)
at org.apache.logging.log4j.core.LoggerContext.setConfiguration(LoggerContext.java:618)
at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:691)
at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:708)
at org.apache.logging.log4j.core.LoggerContext.start(LoggerContext.java:263)
at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:153)
at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:45)
at org.apache.logging.log4j.LogManager.getContext(LogManager.java:194)
at org.apache.commons.logging.LogAdapter$Log4jLog.<clinit>(LogAdapter.java:155)
at org.apache.commons.logging.LogAdapter$Log4jAdapter.createLog(LogAdapter.java:122)
at org.apache.commons.logging.LogAdapter.createLog(LogAdapter.java:89)
at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:67)
at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:59)
at org.springframework.boot.SpringApplication.<clinit>(SpringApplication.java:203)
at com.example.myapp.MyappApplication.main(MyappApplication.java:10)
2021-03-02 11:25:51,458 main ERROR Null object returned for logger in loggers.
2021-03-02 11:25:52,019 main ERROR Loggers cannot be configured without a name: arg[2](null)
2021-03-02 11:25:52,020 main ERROR Unable to invoke factory method in class org.apache.logging.log4j.core.config.LoggerConfig for element logger: org.apache.logging.log4j.core.config.ConfigurationException: Arguments given for element logger are invalid org.apache.logging.log4j.core.config.ConfigurationException: Arguments given for element logger are invalid
at org.apache.logging.log4j.core.config.plugins.util.PluginBuilder.generateParameters(PluginBuilder.java:280)
at org.apache.logging.log4j.core.config.plugins.util.PluginBuilder.build(PluginBuilder.java:135)
at org.apache.logging.log4j.core.config.AbstractConfiguration.createPluginObject(AbstractConfiguration.java:1002)
at org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:942)
at org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:934)
at org.apache.logging.log4j.core.config.AbstractConfiguration.doConfigure(AbstractConfiguration.java:552)
at org.apache.logging.log4j.core.config.AbstractConfiguration.initialize(AbstractConfiguration.java:241)
at org.apache.logging.log4j.core.config.AbstractConfiguration.start(AbstractConfiguration.java:288)
at org.apache.logging.log4j.core.LoggerContext.setConfiguration(LoggerContext.java:618)
at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:691)
at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:708)
at org.springframework.boot.logging.log4j2.Log4J2LoggingSystem.reinitialize(Log4J2LoggingSystem.java:207)
at org.springframework.boot.logging.AbstractLoggingSystem.initializeWithConventions(AbstractLoggingSystem.java:73)
at org.springframework.boot.logging.AbstractLoggingSystem.initialize(AbstractLoggingSystem.java:60)
at org.springframework.boot.logging.log4j2.Log4J2LoggingSystem.initialize(Log4J2LoggingSystem.java:163)
at org.springframework.boot.context.logging.LoggingApplicationListener.initializeSystem(LoggingApplicationListener.java:312)
at org.springframework.boot.context.logging.LoggingApplicationListener.initialize(LoggingApplicationListener.java:281)
at org.springframework.boot.context.logging.LoggingApplicationListener.onApplicationEnvironmentPreparedEvent(LoggingApplicationListener.java:239)
at org.springframework.boot.context.logging.LoggingApplicationListener.onApplicationEvent(LoggingApplicationListener.java:216)
at org.springframework.context.event.SimpleApplicationEventMulticaster.doInvokeListener(SimpleApplicationEventMulticaster.java:176)
at org.springframework.context.event.SimpleApplicationEventMulticaster.invokeListener(SimpleApplicationEventMulticaster.java:169)
at org.springframework.context.event.SimpleApplicationEventMulticaster.multicastEvent(SimpleApplicationEventMulticaster.java:143)
at org.springframework.context.event.SimpleApplicationEventMulticaster.multicastEvent(SimpleApplicationEventMulticaster.java:131)
at org.springframework.boot.context.event.EventPublishingRunListener.environmentPrepared(EventPublishingRunListener.java:82)
at org.springframework.boot.SpringApplicationRunListeners.lambda$environmentPrepared$2(SpringApplicationRunListeners.java:63)
at java.util.ArrayList.forEach(ArrayList.java:1257)
at org.springframework.boot.SpringApplicationRunListeners.doWithListeners(SpringApplicationRunListeners.java:117)
at org.springframework.boot.SpringApplicationRunListeners.doWithListeners(SpringApplicationRunListeners.java:111)
at org.springframework.boot.SpringApplicationRunListeners.environmentPrepared(SpringApplicationRunListeners.java:62)
at org.springframework.boot.SpringApplication.prepareEnvironment(SpringApplication.java:362)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:320)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1311)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1300)
at com.example.myapp.MyappApplication.main(MyappApplication.java:10)
2021-03-02 11:25:52,021 main ERROR Null object returned for logger in loggers.
The console output is fine, as you can see; what doesn't work is the output to the external file.
2021-03-02 11:25:52,197 [main] INFO com.example.myapp.MyappApplication - No active profile set, falling back to default profiles: default
2021-03-02 11:25:53,153 [main] INFO org.springframework.data.repository.config.RepositoryConfigurationDelegate - Bootstrapping Spring Data JPA repositories in DEFAULT mode.
2021-03-02 11:25:53,249 [main] INFO org.springframework.data.repository.config.RepositoryConfigurationDelegate - Finished Spring Data repository scanning in 81 ms. Found 1 JPA repository interfaces.
2021-03-02 11:25:54,493 [main] INFO org.springframework.boot.web.embedded.tomcat.TomcatWebServer - Tomcat initialized with port(s): 8080 (http)
2021-03-02 11:25:54,507 [main] INFO org.apache.coyote.http11.Http11NioProtocol - Initializing ProtocolHandler ["http-nio-8080"]
It looks like you may have a typo: it should be AppenderRef instead of appender-ref. See the manual for examples: https://logging.apache.org/log4j/2.x/manual/configuration.html#Configuration_with_JSON.
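For illustration, here is a sketch of the loggers section from the question with that fix applied. Note that the error log also complains that the named logger has no name, so a name would need to be added as well; the package name used here is taken from the application's package in the stack trace and is only an example:

"loggers": {
  "logger": {
    "name": "com.example.myapp",
    "level": "info",
    "AppenderRef": {
      "ref": "File"
    }
  },
  "root": {
    "level": "info",
    "AppenderRef": {
      "ref": "STDOUT"
    }
  }
}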
We are using IoT Agent version 1.14.0 from Docker Hub.
We have set the service and service path as follows:
fiware-service:testiotagent
fiware-servicepath:/
Device registration payload:
{
  "devices": [
    {
      "device_id": "Motion-10",
      "entity_name": "urn:ngsi-ld:SENSOR:Motion-10",
      "entity_type": "SENSOR",
      "transport": "MQTT",
      "attributes": [
        { "object_id": "s", "name": "state", "type": "Text" },
        { "object_id": "l", "name": "luminosity", "type": "Integer",
          "metadata": { "unitCode": { "type": "Text", "value": "CAL" } }
        }
      ]
    }
  ]
}
As per iotagent-node-lib version 2.12.0, IoT Agent JSON version 1.14.0 should support metadata in device-provisioned attributes, but we are still facing this issue.
When we try to provision the above device, we get the error below:
{
  "name": "WRONG_SYNTAX",
  "message": "Wrong syntax in request: Errors found validating request."
}
I found that iotagent-node-lib has the schema used to validate the device registration payload:
https://github.com/telefonicaid/iotagent-node-lib/blob/master/lib/templates/createDevice.json
In this JSON schema there is no metadata schema mentioned for the attributes.
I have followed the steps below for metadata at the entity level:
I removed the metadata from the IoT Agent provisioning.
Then I updated the entity 'urn:ngsi-ld:SENSOR:Motion-10' as below:
{
  "id": "urn:ngsi-ld:SENSOR:Motion-10",
  "type": "SENSOR",
  "luminosity": {
    "type": "Integer",
    "value": "0",
    "metadata": { "unitCode": { "type": "Text", "value": "CAL" } }
  }
}
Then I tried to send a measurement; the metadata got overridden and I got back empty metadata:
{
  "id": "urn:ngsi-ld:SENSOR:Motion-10",
  "type": "SENSOR",
  "luminosity": {
    "type": "Integer",
    "value": "15",
    "metadata": {}
  }
}
Is this due to the fix given for issue 1788 in fiware-orion (https://github.com/telefonicaid/fiware-orion/issues?q=1788)?
Some quick confirmation and help from FIWARE experts to overcome this issue would be very much appreciated.
The template used to validate a provisioning request currently does not accept the metadata attribute. There is an outstanding PR for this. At the moment you would be better off defining the entities with the metadata in a config.js file instead.
e.g.:
iotAgentConfig = {
    contextBroker: {
        host: '192.168.1.1',
        port: '1026',
        ngsiVersion: 'v2'
    },
    server: {
        port: 4041
    },
    types: {
        'WeatherStation': {
            commands: [],
            type: 'WeatherStation',
            lazy: [],
            active: [
                {
                    object_id: 'p',
                    name: 'pressure',
                    type: 'Hgmm'
                },
                {
                    object_id: 'h',
                    name: 'humidity',
                    type: 'Percentage',
                    entity_name: 'Higro2000',
                    entity_type: 'Higrometer',
                    metadata: {
                        unitCode: {
                            type: "Text",
                            value: "Hgmm"
                        }
                    }
                }
            ]
        },
        ....etc
I'm new to OpenAPI and I need some help creating a basic Swagger file for PayPal's payment API so we can create a payment from our platform. Note: OAuth is already configured.
Below is a basic Swagger file, but I don't know where to add the payment request information (i.e., intent, payer, transactions, etc.):
{
  "swagger": "2.0",
  "info": {
    "description": "this is a payment request to through PayPal",
    "title": "Swagger PayPal Payment",
    "version": "1.0.0"
  },
  "host": "api.sandbox.paypal.com",
  "basePath": "/v1/payments",
  "schemes": [ "https" ],
  "paths": {
    "/payment": {
      "post": {
        "summary": "Creates a payment",
        "description": "Creates a payment request to Paypal",
        "parameters": {
        },
        //"intent": "sale",
        //"payer":
        //{
        //  "payment_method": "paypal"
        //},
        //"transactions": [
        //  {
        //    "amount": {
        //      "total": "9.00",
        //      "currency": "EUR"
        //    }
        //  }
        //],
        "responses": {
          "200": {
            "description": "OK"
          }
        }
      }
    }
  }
}
Testing the file in the Swagger Editor, I get an "OBJECT_ADDITIONAL_PROPERTIES" error on transactions, payer, and intent.
The JSON payload is defined as a body parameter (a parameter with in: body), and this parameter needs a schema that defines the JSON object's properties.
You would typically define object schemas in the global definitions section and reference them using $ref.
Here is the YAML version for readability. To convert it to JSON, paste it into http://editor.swagger.io and use File > Download JSON.
swagger: "2.0"
info:
description: this is a payment request to through PayPal
title: Swagger PayPal Payment
version: "1.0.0"
host: api.sandbox.paypal.com
basePath: /v1/payments
schemes: [ https ]
paths:
/payment:
post:
summary: Creates a payment
description: Creates a payment request to Paypal
parameters:
- in: body
name: payment
required: true
schema:
$ref: "#/definitions/Payment" # <--------
responses:
"200":
description: OK
definitions:
# Request body object
Payment:
type: object
properties:
intent:
type: string
payer:
$ref: "#/definitions/Payer"
transactions:
type: array
items:
$ref: "#/definitions/Transaction"
Payer:
type: object
properties:
payment_method:
type: string
example: paypal
Transaction:
type: object
properties:
... # TODO
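If it helps, the remaining TODO could be sketched from the commented-out payload in the question; the property types and examples below are assumptions based on that sample:

  Transaction:
    type: object
    properties:
      amount:
        type: object
        properties:
          total:
            type: string
            example: "9.00"
          currency:
            type: string
            example: EUR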