I successfully trained and deployed a Tensorflow Recommender model on Vertex AI.
Everything is online, and to get predictions in the notebook I do:
loaded = tf.saved_model.load(path)
scores, titles = loaded(["doctor"])
That returns:
Recommendations: [b'Nelly & Monsieur Arnaud (1995)'
b'Three Lives and Only One Death (1996)' b'Critical Care (1997)']
That is, the payload (the input to the neural network) must be ["doctor"].
Then I generate the JSON payload (the error is here):
!echo {"\""instances"\"" : [{"\""input_1"\"" : {["\""doctor"\""]}}]} > instances0.json
And submit to the endpoint:
!curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json" \
https://us-west1-aiplatform.googleapis.com/v1/projects/my_project/locations/us-west1/endpoints/123456789:predict \
-d @instances0.json > results.json
... as seen here: https://colab.research.google.com/github/GoogleCloudPlatform/vertex-ai-samples/blob/master/notebooks/community/vertex_endpoints/tf_hub_obj_detection/deploy_tfhub_object_detection_on_vertex_endpoints.ipynb#scrollTo=35348dd21acd
However, when I use this payload, I get error 400:
code: 400
message: "Invalid JSON payload received. Expected an object key or }. s" : [{"input_1" : {["doctor"]}}]} ^"
status: "INVALID_ARGUMENT"
This doesn't work either:
!echo {"inputs": {"input_1": ["doctor"]}} > instances0.json
Even though it validates with JSONLint, it does not return the proper prediction.
In another Stack Overflow question it is suggested to remove the " \ " characters in the payload, but this didn't work either.
Running:
!saved_model_cli show --dir /home/jupyter/model --all
I get:
MetaGraphDef with tag-set: 'serve' contains the following SignatureDefs:

signature_def['__saved_model_init_op']:
  The given SavedModel SignatureDef contains the following input(s):
  The given SavedModel SignatureDef contains the following output(s):
    outputs['__saved_model_init_op'] tensor_info:
        dtype: DT_INVALID
        shape: unknown_rank
        name: NoOp
  Method name is:

signature_def['serving_default']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['input_1'] tensor_info:
        dtype: DT_STRING
        shape: (-1)
        name: serving_default_input_1:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['output_1'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 10)
        name: StatefulPartitionedCall_1:0
    outputs['output_2'] tensor_info:
        dtype: DT_STRING
        shape: (-1, 10)
        name: StatefulPartitionedCall_1:1
  Method name is: tensorflow/serving/predict

Concrete Functions:
  Function Name: '__call__'
    Option #1
      Callable with:
        Argument #1
          input_1: TensorSpec(shape=(None,), dtype=tf.string, name='input_1')
        Argument #2
          DType: NoneType
          Value: None
        Argument #3
          DType: bool
          Value: True
    Option #2
      Callable with:
        Argument #1
          queries: TensorSpec(shape=(None,), dtype=tf.string, name='queries')
        Argument #2
          DType: NoneType
          Value: None
        Argument #3
          DType: bool
          Value: True
    Option #3
      Callable with:
        Argument #1
          input_1: TensorSpec(shape=(None,), dtype=tf.string, name='input_1')
        Argument #2
          DType: NoneType
          Value: None
        Argument #3
          DType: bool
          Value: False
    Option #4
      Callable with:
        Argument #1
          queries: TensorSpec(shape=(None,), dtype=tf.string, name='queries')
        Argument #2
          DType: NoneType
          Value: None
        Argument #3
          DType: bool
          Value: False

  Function Name: '_default_save_signature'
    Option #1
      Callable with:
        Argument #1
          input_1: TensorSpec(shape=(None,), dtype=tf.string, name='input_1')

  Function Name: 'call_and_return_all_conditional_losses'
    Option #1
      Callable with:
        Argument #1
          input_1: TensorSpec(shape=(None,), dtype=tf.string, name='input_1')
        Argument #2
          DType: NoneType
          Value: None
        Argument #3
          DType: bool
          Value: False
    Option #2
      Callable with:
        Argument #1
          queries: TensorSpec(shape=(None,), dtype=tf.string, name='queries')
        Argument #2
          DType: NoneType
          Value: None
        Argument #3
          DType: bool
          Value: True
    Option #3
      Callable with:
        Argument #1
          queries: TensorSpec(shape=(None,), dtype=tf.string, name='queries')
        Argument #2
          DType: NoneType
          Value: None
        Argument #3
          DType: bool
          Value: False
    Option #4
      Callable with:
        Argument #1
          input_1: TensorSpec(shape=(None,), dtype=tf.string, name='input_1')
        Argument #2
          DType: NoneType
          Value: None
        Argument #3
          DType: bool
          Value: True
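Per this signature, input_1 is a 1-D DT_STRING tensor. A quick local sanity check (a sketch, reusing the path loaded above) confirms what the serving signature expects before involving the endpoint:

import tensorflow as tf

loaded = tf.saved_model.load(path)  # same SavedModel path as above
infer = loaded.signatures["serving_default"]
print(infer.structured_input_signature)  # shows input_1: tf.string, shape (None,)
print(infer(input_1=tf.constant(["doctor"]))["output_2"])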
The point is: I'm passing an array, and I'm not sure whether it must be in b64 format.
This Python code works, but returns a different result than expected:
import tensorflow as tf
import base64
from google.protobuf import json_format
from google.protobuf.struct_pb2 import Value
import numpy as np
from google.cloud import aiplatform
import os
vertex_model = tf.saved_model.load("gs://bucket/model")
serving_input = list(
    vertex_model.signatures["serving_default"].structured_input_signature[1].keys()
)[0]
print("Serving input :", serving_input)
aip_endpoint_name = (
    f"projects/my-project/locations/us-west1/endpoints/12345567"
)
endpoint = aiplatform.Endpoint(aip_endpoint_name)
def encode_input(input):
    return base64.b64encode(np.array(input)).decode("utf-8")
instances_list = [{serving_input: {"b64": encode_input(np.array(["doctor"]))}}]
instances = [json_format.ParseDict(s, Value()) for s in instances_list]
results = endpoint.predict(instances=instances)
print(results.predictions[0]["output_2"])
['8 1/2 (1963)', 'Sword in the Stone, The (1963)', 'Much Ado About Nothing (1993)', 'Jumanji (1995)', 'As Good As It Gets (1997)', 'Age of Innocence, The (1993)', 'Double vie de Véronique, La (Double Life of Veronique, The) (1991)', 'Piano, The (1993)', 'Eat Drink Man Woman (1994)', 'Bullets Over Broadway (1994)']
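I suspect the difference comes from what actually gets base64-encoded: b64encode applied to a numpy array encodes the array's raw dtype buffer rather than the text, so the server decodes a different query string. A quick check (a sketch):

import base64
import numpy as np

print(base64.b64encode(np.array(["doctor"])))  # base64 of the raw dtype('<U6') buffer
print(base64.b64encode(b"doctor"))             # b'ZG9jdG9y', the text itself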
Any ideas on how to fix/encode the payload?
I solved the issue. The problem was base64 encoding. This is working perfectly:
def encode_64(input):
    message = input
    message_bytes = message.encode('ascii')
    base64_bytes = base64.b64encode(message_bytes)
    base64_message = base64_bytes.decode('ascii')
    return base64_message
instances_list = [{serving_input: {"b64": encode_64("doctor")}}]
instances = [json_format.ParseDict(s, Value()) for s in instances_list]
results = endpoint.predict(instances=instances)
print(results.predictions[0]["output_2"][:3])
['Nelly & Monsieur Arnaud (1995)', 'Three Lives and Only One Death (1996)', 'Critical Care (1997)']
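For completeness, the same fix expressed as the raw JSON payload for the curl call above (a sketch; "ZG9jdG9y" is simply the base64 of "doctor"):

import base64
import json

payload = {"instances": [{"input_1": {"b64": base64.b64encode(b"doctor").decode("ascii")}}]}
with open("instances0.json", "w") as f:
    json.dump(payload, f)
# instances0.json now contains {"instances": [{"input_1": {"b64": "ZG9jdG9y"}}]}
# and can be sent with: curl ... -d @instances0.json

Writing the file with json.dump also sidesteps the shell-quoting problems that broke the echo approach.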
As far as I understand, using:
dependencies:
  cake:
    - eggs
    - flour
in a JSON schema validation (I actually parse the schema in YAML, but it should not really matter) should forbid the presence of a cake without an entry for eggs and flour. So this should be rejected:
receip:
  cake: crepes
while this should be accepted:
receip:
  eggs: 3 eggs, given by farmer
  flour: 500g
  cake: crepes
Unfortunately, both cases are accepted in my case. Any idea what I did wrong? I also tried to add another level of required, as proposed in jsonSchema attribute conditionally required, but the problem is the same.
MWE
In case it helps, here are all the files that I'm using.
debug_schema.yml:
$schema: https://json-schema.org/draft/2020-12/schema
type: object
properties:
  receip:
    type: object
    properties:
      eggs:
        type: string
      flour:
        type: string
      cake:
        type: string
    dependencies:
      cake:
        - eggs
        - flour
b.yml:
receip:
  ## I'd like to forbid a cake without eggs and flour
  ## Should not validate until I uncomment:
  # eggs: 3 eggs, given by farmer
  # flour: 500g
  cake: crepes
validate_yaml.py:
#!/usr/bin/env python3
import yaml
from jsonschema import validate, ValidationError
import argparse
# Instantiate the parser
parser = argparse.ArgumentParser(description='This tool not only checks if the YAML file is correct in terms'
                                 + ' of syntax, but it also checks if the structure of the YAML file follows'
                                 + ' the expectations of this project (e.g. that it contains a list of articles,'
                                 + ' with each article having a title etc…).')
# Optional argument
parser.add_argument('--schema', type=str, default="schema.yml",
                    help='Path to the schema')
parser.add_argument('--yaml', type=str, default="science_zoo.yml",
                    help='Path to the YAML file to verify')
args = parser.parse_args()
class YAMLNotUniqueKey(Exception):
    pass

# By default, PyYAML does not raise an error when two keys are identical, which is an issue here
# see https://github.com/yaml/pyyaml/issues/165#issuecomment-430074049
class UniqueKeyLoader(yaml.SafeLoader):
    def construct_mapping(self, node, deep=False):
        mapping = set()
        for key_node, value_node in node.value:
            key = self.construct_object(key_node, deep=deep)
            if key in mapping:
                raise YAMLNotUniqueKey(f"Duplicate key {key!r} found.")
            mapping.add(key)
        return super().construct_mapping(node, deep)
with open(args.schema, "r") as schema_stream:
    try:
        schema = yaml.load(schema_stream, UniqueKeyLoader)
        with open(args.yaml, "r") as yaml_stream:
            try:
                validate(yaml.load(yaml_stream, UniqueKeyLoader), schema)
                print("The YAML file is correct and follows the schema, congrats! :D")
            except ValidationError as exc:
                print(exc.message)
                print("The problem is located in the JSON at the path {}.".format(exc.json_path))
            except YAMLNotUniqueKey as exc:
                print("ERROR: {}".format(exc))
                print("Rename in the YAML one of the two instances of this key to ensure all keys are unique.")
            except yaml.YAMLError as exc:
                print("Errors when trying to load the YAML file:")
                print(exc)
    except YAMLNotUniqueKey as exc:
        print("ERROR: {}".format(exc))
        print("Rename in the schema one of the two instances of this key to ensure all keys are unique.")
    except yaml.YAMLError as exc:
        print("Errors when trying to load the schema file:")
        print(exc)
Command:
./validate_yaml.py --yaml b.yml --schema debug_schema.yml
Oh, so it seems that this keyword has been deprecated, and split into dependentRequired (see e.g. this example) and dependentSchemas (see e.g. this example). Just using dependentRequired solves the issue:
$schema: https://json-schema.org/draft/2020-12/schema
type: object
properties:
  receip:
    type: object
    properties:
      eggs:
        type: string
      flour:
        type: string
      cake:
        type: string
    dependentRequired:
      cake:
        - eggs
        - flour
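To double-check the fix, a minimal sketch with the jsonschema package (4.x, which understands draft 2020-12; same schema inlined as a Python dict) now rejects the lone cake:

from jsonschema import validate, ValidationError

schema = {
    "$schema": "https://json-schema.org/draft/2020-12/schema",
    "type": "object",
    "properties": {
        "receip": {
            "type": "object",
            "properties": {
                "eggs": {"type": "string"},
                "flour": {"type": "string"},
                "cake": {"type": "string"},
            },
            "dependentRequired": {"cake": ["eggs", "flour"]},
        }
    },
}

try:
    validate({"receip": {"cake": "crepes"}}, schema)
    print("accepted")
except ValidationError as exc:
    print("rejected:", exc.message)  # prints something like "'eggs' is a dependency of 'cake'"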
I have a multiline log that consists of a correct JSON part (one or more lines) followed by a stack trace.
Is it possible to parse the first part of the log as JSON, and for the stack trace make a new label ("stackTrace", for example) and put all the lines after the first part there?
Unfortunately, the JSON part of the logs can contain a varying number of fields, so parsing them with a regex is unlikely to work.
{ "timestamp" : "2022-03-28 14:33:00,000", "logger" : "appLog", "level" : "ERROR", "thread" : "ktor-8080", "url" : "/path","method" : "POST","httpStatusCode" : 400,"callId" : "f7a22bfb1466","errorMessage" : "Unexpected JSON token at offset 184: Encountered an unknown key 'a'. Use 'ignoreUnknownKeys = true' in 'Json {}' builder to ignore unknown keys. JSON input: { \"entityId\" : \"TGT-8c8d950036bf\", \"processCode\" : \"test\", \"tokenType\" : \"SSO_CCOM\", \"ttlMills\" : 600000, \"a\" : \"a\" }" }
com.example.info.core.WebApplicationException: Unexpected JSON token at offset 184: Encountered an unknown key 'a'.
Use 'ignoreUnknownKeys = true' in 'Json {}' builder to ignore unknown keys.
JSON input: {
"entityId" : "TGT-8c8d950036bf",
"processCode" : "test",
"tokenType" : "SSO_CCOM",
"ttlMills" : 600000,
"a" : "a"
}
at com.example.info.signtoken.SignTokenApi$signTokenModule$2$1$1.invokeSuspend(SignTokenApi.kt:94)
at com.example.info.signtoken.SignTokenApi$signTokenModule$2$1$1.invoke(SignTokenApi.kt)
at com.example.info.signtoken.SignTokenApi$signTokenModule$2$1$1.invoke(SignTokenApi.kt)
at io.ktor.util.pipeline.SuspendFunctionGun.loop(SuspendFunctionGun.kt:248)
at io.ktor.util.pipeline.SuspendFunctionGun.proceed(SuspendFunctionGun.kt:116)
at io.ktor.util.pipeline.SuspendFunctionGun.execute(SuspendFunctionGun.kt:136)
at io.ktor.util.pipeline.Pipeline.execute(Pipeline.kt:78)
at io.ktor.routing.Routing.executeResult(Routing.kt:155)
at io.ktor.routing.Routing.interceptor(Routing.kt:39)
at io.ktor.routing.Routing$Feature$install$1.invokeSuspend(Routing.kt:107)
at io.ktor.routing.Routing$Feature$install$1.invoke(Routing.kt)
at io.ktor.routing.Routing$Feature$install$1.invoke(Routing.kt)
UPD: I've made a promtail pipeline like so:
scrape_configs:
  - job_name: Test_AppLog
    static_configs:
      - targets:
          - ${HOSTNAME}
        labels:
          job: INFO-Test_AppLog
          host: ${HOSTNAME}
          __path__: /home/adm_web/app.log
    pipeline_stages:
      - multiline:
          firstline: ^\{\s?\"timestamp\"
          max_lines: 128
          max_wait_time: 1s
      - match:
          selector: '{job="INFO-Test_AppLog"}'
          stages:
            - regex:
                expression: '(?P<log>^\{ ?\"timestamp\".*\}[\s])(?s)(?P<stacktrace>.*)'
            - labels:
                log:
                stacktrace:
            - json:
                expressions:
                  logger: logger
                  url: url
                  method: method
                  statusCode: httpStatusCode
                  sla: sla
                source: log
But in fact, the json stage does not work: the result in Grafana is only two fields, log and stacktrace.
Any help would be appreciated.
If the format is always like this, maybe the easiest way is to analyze the whole log string, find the index of the last "}" symbol, and then split the string at that index + 1; the JSON will be in the first part of the output array.
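In Python terms, a sketch of that idea; json.JSONDecoder.raw_decode reports where the first JSON document ends, which is more robust than searching for a brace (a "}" inside a string value cannot confuse it):

import json

raw_entry = (
    '{ "timestamp" : "2022-03-28 14:33:00,000", "level" : "ERROR" }\n'
    'com.example.info.core.WebApplicationException: Unexpected JSON token\n'
    '\tat com.example.info.signtoken.SignTokenApi...'
)

# raw_decode parses the leading JSON object and returns (object, end_index).
json_part, end = json.JSONDecoder().raw_decode(raw_entry)
stack_trace = raw_entry[end:].lstrip("\n")

print(json_part["level"])            # ERROR
print(stack_trace.splitlines()[0])   # com.example.info.core.WebApplicationException: ...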
I have a data structure that I want to convert to json and preserve the key order.
For example:
%{ x: 1, a: 5} should be converted to "{\"x\": 1, \"a\": 5}"
Poison does it without any problem. But when I upgrade to Jason, it changes to "{\"a\": 5, \"x\": 1}".
So I use Jason.Helpers.json_map to preserve the order, like this:
Jason.Helpers.json_map([x: 1, a: 5])
It creates a fragment with correct order.
However, when I use a variable to do this:
list = [x: 1, a: 5]
Jason.Helpers.json_map(list)
I have an error:
** (Protocol.UndefinedError) protocol Enumerable not implemented for {:list, [line: 15], nil} of type Tuple.
....
QUESTION: How can I pass a pre-calculated list into Jason.Helpers.json_map?
The calculation is complicated, so I don't want to repeat the code just to use json_map; I'd rather use the function that returns the list.
json_map/1 is a macro; from its docs:
Encodes a JSON map from a compile-time keyword.
It is designed for compiling JSON at compile-time, which is why it doesn't work with your runtime variable.
Support for encoding keyword lists was added to the Jason library a year ago, but it looks like it hasn't been pushed to Hex yet. I managed to get it to work by pulling the latest code from GitHub:
defp deps do
  [{:jason, git: "https://github.com/michalmuskala/jason.git"}]
end
Then by creating a struct that implements Jason.Encoder (adapted from this solution by the Jason author):
defmodule OrderedObject do
  defstruct [:value]

  def new(value), do: %__MODULE__{value: value}

  defimpl Jason.Encoder do
    def encode(%{value: value}, opts) do
      Jason.Encode.keyword(value, opts)
    end
  end
end
Now we can encode objects with ordered keys:
iex(1)> Jason.encode!(OrderedObject.new([x: 1, a: 5]))
"{\"x\":1,\"a\":5}"
I don't know if this is part of the public API or just an implementation detail, but it appears you have some control of the order when implementing the Jason.Encoder protocol for a struct.
Let's say you've defined an Ordered struct:
defmodule Ordered do
  @derive {Jason.Encoder, only: [:a, :x]}
  defstruct [:a, :x]
end
If you encode the struct, the "a" key will be before the "x" key:
iex> Jason.encode!(%Ordered{a: 5, x: 1})
"{\"a\":5,\"x\":1}"
Let's reorder the keys we pass in to the :only option:
defmodule Ordered do
  @derive {Jason.Encoder, only: [:x, :a]}
  defstruct [:a, :x]
end
If we now encode the struct, the "x" key will be before the "a" key:
iex> Jason.encode!(%Ordered{a: 5, x: 1})
"{\"x\":1,\"a\":5}"
I am defining a function and trying to print its result; however, I am getting the error message:
temp_of_seawater() missing 1 required positional argument: 'd'
Relevant code:
def temp_of_seawater(l, d):
    temp_of_seawater = ((gradient * d) + temp_surfacelevel_1)
    return temp_of_seawater(d)

print("The temperature at your given latitude and depth is:",
      temp_of_seawater(d), "\u00b0""C")
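Two things are going on here: the local variable temp_of_seawater shadows the function name (so return temp_of_seawater(d) tries to call a float), and the top-level call passes only one argument where the signature requires two, which is what produces the "missing 1 required positional argument" error. A corrected sketch (latitude and depth are hypothetical names; gradient and temp_surfacelevel_1 are assumed to be defined earlier):

def temp_of_seawater(l, d):
    temp = (gradient * d) + temp_surfacelevel_1  # don't shadow the function name
    return temp

print("The temperature at your given latitude and depth is:",
      temp_of_seawater(latitude, depth), "\u00b0C")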
Can the default values of a struct's fields be defined by a function call instead of a raw value?
A default value for a struct field is an expression evaluated at the time of struct definition.
Proof:
# struct.exs
defmodule M do
  defstruct [a: IO.gets("> ")]
end
# ...
$ iex struct.exs
Erlang/OTP 17 [erts-6.0] ...
> hello
Interactive Elixir (0.13.3-dev) - ...
iex(1)> %M{}
%M{a: "hello\n"}
You can define a function that will create a struct and will set some of its fields:
# struct.exs
defmodule M do
defstruct [a: nil]
def new(val) do
%M{a: val}
end
end
# ...
M.new(123)
#=> %M{a: 123}