Error:
send: b'{"specversion": "1.0", "logEntryBatches": [{"entries": [{"data": "{\\"hello\\": \\"oracle\\", \\"as\\": \\"aaa\\"}", "id": "ocid1.test.oc1..jkhjkhh23423fd", "time": "2021-04-01T12:19:28.416000Z"}], "source": "EXAMPLE-source-Value", "type": "remediationLogs", "defaultlogentrytime": "2021-04-01T12:19:28.416000Z"}]}'
reply: 'HTTP/1.1 400 Bad Request\r\n'
header: Date: Fri, 02 Apr 2021 07:39:16 GMT
header: opc-request-id: ER6S6HDVTNWUOKCJ7XXZ/OpcRequestIdExample/770899C2C7CA6ABA11D996CC57E8EE8F
header: Content-Type: application/json
header: Connection: close
header: Content-Length: 79
Traceback (most recent call last):
  File "tool.py", line 45, in <module>
    put_logs_response = loggingingestion_client.put_logs(
  File "/home/ubuntu/.local/lib/python3.8/site-packages/oci/loggingingestion/logging_client.py", line 172, in put_logs
    return self.base_client.call_api(
  File "/home/ubuntu/.local/lib/python3.8/site-packages/oci/base_client.py", line 276, in call_api
    response = self.request(request)
  File "/home/ubuntu/.local/lib/python3.8/site-packages/oci/base_client.py", line 388, in request
    self.raise_service_error(request, response)
  File "/home/ubuntu/.local/lib/python3.8/site-packages/oci/base_client.py", line 553, in raise_service_error
    raise exceptions.ServiceError(
oci.exceptions.ServiceError: {'opc-request-id': 'ER6S6HDVTNWUOKCJ7XXZ/OpcRequestIdExample/770899C2C7CA6ABA11D996CC57E8EE8F', 'code': 'InvalidParameter', 'message': 'Unable to process JSON input', 'status': 400}
I am trying to send JSON data to Oracle logs but am getting the above error. I am using json.dumps(data) to convert the dict to a string. Kindly let me know if any workaround is available for this.
Code:
data = {'hello': 'oracle', "as": "aaa"}
put_logs_response = loggingingestion_client.put_logs(
    log_id="ocid1.log.oc1.iad.<<Log OCID>>",
    put_logs_details=oci.loggingingestion.models.PutLogsDetails(
        specversion="1.0",
        log_entry_batches=[
            oci.loggingingestion.models.LogEntryBatch(
                entries=[
                    oci.loggingingestion.models.LogEntry(
                        data=json.dumps(data),
                        id="ocid1.test.oc1..jkhjkhh23423fd",
                        time=datetime.strptime(
                            "2021-04-01T12:19:28.416Z",
                            "%Y-%m-%dT%H:%M:%S.%fZ"))],
                source="EXAMPLE-source-Value",
                type="Logs",
                defaultlogentrytime=datetime.strptime(
                    "2021-04-01T12:19:28.416Z",
                    "%Y-%m-%dT%H:%M:%S.%fZ"))]),
    timestamp_opc_agent_processing=datetime.strptime(
        "2021-04-01T12:19:28.416Z",
        "%Y-%m-%dT%H:%M:%S.%fZ"),
    opc_agent_version="EXAMPLE-opcAgentVersion-Value",
    opc_request_id="ER6S6HDVTNWUOKCJ7XXZ/OpcRequestIdExample/")
This exception indicates that you have an InvalidParameter in your JSON input.
oci.exceptions.ServiceError: {'opc-request-id': 'ER6S6HDVTNWUOKCJ7XXZ/OpcRequestIdExample/770899C2C7CA6ABA11D996CC57E8EE8F', 'code': 'InvalidParameter', 'message': 'Unable to process JSON input', 'status': 400}
The invalid parameter is your timestamp, 2021-04-01T12:19:28.416Z.
According to Oracle's documentation, you need to use an RFC3339-formatted date-time string with milliseconds precision when creating a LogEntry.
This code snippet is from the oci-python-sdk's log_entry.py; unlike Oracle's documentation, it doesn't mention the milliseconds precision.
@time.setter
def time(self, time):
    """
    Sets the time of this LogEntry.
    Optional. The timestamp associated with the log entry. An RFC3339-formatted date-time string.
    If unspecified, defaults to PutLogsDetails.defaultlogentrytime.

    :param time: The time of this LogEntry.
    :type: datetime
    """
    self._time = time
This code creates a UTC RFC3339-compliant timestamp (note that isoformat() defaults to microsecond precision):
from datetime import datetime
from datetime import timezone

current_utc_time_with_offset = datetime.now(timezone.utc).isoformat()
print(current_utc_time_with_offset)
# output: 2021-04-06T13:00:52.706040+00:00

current_utc_time_with_timezone = datetime.now(timezone.utc).isoformat().replace("+00:00", "Z")
print(current_utc_time_with_timezone)
# output: 2021-04-06T13:09:10.053432Z
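If the service really does require exactly millisecond precision, isoformat() can be told to truncate. This is a small sketch using the timespec argument (available since Python 3.6), not something from the original answer:

from datetime import datetime, timezone

# timespec='milliseconds' keeps exactly three fractional digits.
current_utc_millis = datetime.now(timezone.utc).isoformat(timespec='milliseconds').replace("+00:00", "Z")
print(current_utc_millis)
# output: 2021-04-06T13:09:10.053Z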
This Stack Overflow question is worth a read:
What's the difference between ISO 8601 and RFC 3339 Date Formats?
This article is also useful:
Understanding about RFC 3339 for Datetime and Timezone Formatting in Software Engineering
Your data looks fine. I think the issue is that the time precision is more than milliseconds; it should work fine if you lose the trailing zeros in the time.
Format a datetime into a string with milliseconds
https://docs.oracle.com/en-us/iaas/api/#/en/logging-dataplane/20200831/LogEntry/
Your time is RFC3339-formatted, but its precision is more than milliseconds:
'{"specversion": "1.0", "logEntryBatches": [{"entries": [{"data": "{\"hello\": \"oracle\", \"as\": \"aaa\"}", "id": "ocid1.test.oc1..jkhjkhh23423fd", "time": "2021-04-01T12:19:28.416000Z"}], "source": "EXAMPLE-source-Value", "type": "remediationLogs", "defaultlogentrytime": "2021-04-01T12:19:28.416000Z"}]}'
See https://docs.oracle.com/en-us/iaas/api/#/en/logging-dataplane/20200831/LogEntry/
The timestamp associated with the log entry. An RFC3339-formatted date-time string with milliseconds precision. If unspecified, defaults to PutLogsDetails.defaultlogentrytime.
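If you need to trim an existing datetime (such as the parsed 2021-04-01T12:19:28.416Z) to exactly three fractional digits, here is a minimal sketch; the helper name rfc3339_millis is mine, not from the SDK or the docs:

from datetime import datetime, timezone

def rfc3339_millis(dt):
    # Format an aware datetime as RFC3339 with exactly millisecond precision.
    dt = dt.astimezone(timezone.utc)
    return dt.strftime("%Y-%m-%dT%H:%M:%S.") + "%03dZ" % (dt.microsecond // 1000)

print(rfc3339_millis(datetime(2021, 4, 1, 12, 19, 28, 416000, tzinfo=timezone.utc)))
# 2021-04-01T12:19:28.416Z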
I'm using Robot Framework for API automation. I'm storing the JSON response in a variable, POSTResp.content, i.e. POSTResp.content has the whole response, as given below. Please help me get an attribute's value (for example, the value of referenceId) from the stored content.
Example of JSON response:
{
  "serviceResponseHeader": {
    "responseContext": {
      "responseCode": "MS19",
      "responseDescription": "Success",
      "serviceResponseTimeInGMT": "18 Sep 2018 16:12:43 GMT"
    },
    "requesterContext": {
      "applicationCode": null,
      "applicationSubCode": null,
      "countryCode": null,
      "requesterReferenceNumber": null,
      "requestTimeInGMT": "30 Jun 2015 11:54:49 GMT",
      "requesterUserIdentity": "23483",
      "requesterGroupIdentity": "1620",
      "requesterIpAddress": "",
      "sessionIdentity": "2536kjhfdashfkhfsab",
      "ssoSessionIdentity": "2536kjhfdashfkhfsab",
      "requesterAbbreviatedGroupName": "NEWCOMP"
    },
    "serviceContext": {
      "serviceVersionNumber": "1.0",
      "serviceCode": "30"
    }
  },
  "getProxyDetailResponseBody": {
    "proxyDetails": {
      "proxyType": "",
      "proxyValue": "20140005K",
      "referenceId": "PR18090000847597",
      "transactionId": "18091801657466"
    }
  }
}
I've tried the below ways:
1) ${json}    To JSON    ${POSTResp.content}    true
   log to console    \n the Proxy ID is ${json["proxyValue"]}
Result: Resolving variable '${json["proxyValue"]}' failed: TypeError: string indices must be integers, not str
2) ${json}    Evaluate    json.loads(${POSTResp.content}}    json
   log to console    \n the Proxy ID is ${json["proxyValue"]}
Result: failed: SyntaxError: unexpected EOF while parsing (, line 1)
Issues with your two approaches:
1) The library keyword call passes a true argument (well, truth-like) to the pretty_print parameter:
${json}    To JSON    ${POSTResp.content}    true
Looking at the library's source, in that case the keyword does not return a dict object but a string: a beautified version of the source JSON. That coincides with the error you received.
Remove the "true" argument and it will return a dict.
2) In the Evaluate, surround the variable with triple quotes (Python's literal string):
${json}    Evaluate    json.loads('''${POSTResp.content}''')    json
Without it, the framework just dumped the variable's value in place, which raised a Python syntax error.
By the way, try not to name your variables after language keywords or library names, like ${json} above.
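For reference, a minimal Python sketch of what the fixed Evaluate does under the hood; note that referenceId is nested, so it has to be reached through the enclosing keys (the truncated sample payload is mine):

import json

# json.loads returns a dict, so string keys work; referenceId sits two levels deep.
content = '{"getProxyDetailResponseBody": {"proxyDetails": {"referenceId": "PR18090000847597"}}}'
parsed = json.loads(content)
print(parsed["getProxyDetailResponseBody"]["proxyDetails"]["referenceId"])
# PR18090000847597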
I have a REST request that will return a JSON response with a set of nine keys and their values. Now, the input values for the request are randomized, so I will get different values every time it is run.
Is it possible to create a script assertion that will just validate whether the JSON structure is correct?
Json Response:
{
  "sid": 636811,
  "poss": 122,
  "mis": -150,
  "pres": 253,
  "aea": 0,
  "aa": 12,
  "ua": 7,
  "lar": null,
  "lbr": 1
}
Script Assertion:
def expectedMap = [sid: '', poss: '', mis: '', pres: '', aea: '', aa: '', ua: '', lar: '', lbr: '']
def json = new groovy.json.JsonSlurper().parseText(context.response)
assert json.keySet().sort() == expectedMap.keySet().sort()
I believe the following script assertion I have is failing because it is asserting the key values as well.
log.info expectedMap.keySet().sort()
log.info json.keySet().sort()
Tue Jun 26 14:27:52 BST 2018:INFO:[aa, aea, lar, lbr, mis, poss, pres, sid, ua]
Tue Jun 26 14:27:52 BST 2018:INFO:[aa, aea, lar, lbr, mis, poss, pres, sid, ua]
log.info expectedMap.keySet().sort().getClass()
log.info json.keySet().sort().getClass()
Tue Jun 26 14:17:12 BST 2018:INFO:class java.util.ArrayList
Tue Jun 26 14:17:12 BST 2018:INFO:class java.util.TreeMap$KeySet
You are almost there. Just need to get the keys, sort them and compare.
Change from:
assert expectedMap == json, 'Actual response is not matching with expected data'
To:
assert expectedMap.keySet().sort() == json.keySet().sort() as List, 'Actual response is not matching with expected data'
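For comparison, here is a minimal Python analogue of the fixed assertion (an illustration of the key-set idea, not SoapUI code):

import json

# Validate the structure only: compare sorted key names and ignore the randomized values.
expected_keys = sorted(["sid", "poss", "mis", "pres", "aea", "aa", "ua", "lar", "lbr"])
response_text = '{"sid": 636811, "poss": 122, "mis": -150, "pres": 253, "aea": 0, "aa": 12, "ua": 7, "lar": null, "lbr": 1}'
assert sorted(json.loads(response_text).keys()) == expected_keys, 'Actual response is not matching with expected data'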
It looks like SQLAlchemy might have had a facelift since the time the Airflow tutorial was written: it is not accepting a date in the YYYY-MM-DD format shown in the tutorial at http://pythonhosted.org/airflow/tutorial.html:
$airflow test tutorial print_date 2017-12-30
[2017-12-29 19:10:40,695] {__init__.py:45} INFO - Using executor SequentialExecutor
[2017-12-29 19:10:40,745] {models.py:194} INFO - Filling up the DagBag from /git/airflow/home/dags
Traceback (most recent call last):
  File "/usr/local/bin/airflow", line 4, in <module>
    __import__('pkg_resources').run_script('apache-airflow==1.10.0.dev0+incubating', 'airflow')
  File "/usr/local/lib/python2.7/site-packages/pkg_resources/__init__.py", line 748, in run_script
    self.require(requires)[0].run_script(script_name, ns)
  ..
  File "build/bdist.macosx-10.12-x86_64/egg/sqlalchemy/engine/default.py", line 623, in _init_compiled
  File "build/bdist.macosx-10.12-x86_64/egg/sqlalchemy/sql/type_api.py", line 1074, in process
  File "build/bdist.macosx-10.12-x86_64/egg/sqlalchemy_utc.py", line 31, in process_bind_param
sqlalchemy.exc.StatementError: (exceptions.ValueError) naive datetime is disallowed [SQL: u'SELECT task_instance.try_number AS task_instance_try_number, task_instance.task_id AS task_instance_task_id, task_instance.dag_id AS task_instance_dag_id, task_instance.execution_date AS task_instance_execution_date, task_instance.start_date AS task_instance_start_date, task_instance.end_date AS task_instance_end_date, task_instance.duration AS task_instance_duration, task_instance.state AS task_instance_state, task_instance.max_tries AS task_instance_max_tries, task_instance.hostname AS task_instance_hostname, task_instance.unixname AS task_instance_unixname, task_instance.job_id AS task_instance_job_id, task_instance.pool AS task_instance_pool, task_instance.queue AS task_instance_queue, task_instance.priority_weight AS task_instance_priority_weight, task_instance.operator AS task_instance_operator, task_instance.queued_dttm AS task_instance_queued_dttm, task_instance.pid AS task_instance_pid \nFROM task_instance \nWHERE task_instance.dag_id = ? AND task_instance.task_id = ? AND task_instance.execution_date = ?\n LIMIT ? OFFSET ?'] [parameters: [{}]]
What is the format now required by SQLAlchemy? (It appears to be a matter of a missing timezone, so I'm also looking into that.)
A format like the following is working:
'2017-12-28T12:27:00Z'
where the first portion is the date, then the time after the T, and then the timezone information.
As per Python's official documentation, there are two types of datetime objects: aware and naive.
https://docs.python.org/3/library/datetime.html
Date and time objects may be categorized as “aware” or “naive” depending on whether or not they include timezone information.
See the example below:

from datetime import datetime, timezone

date_aware = datetime.now(timezone.utc)
date_naive = datetime.now()

print(date_aware.tzinfo)  # datetime.timezone.utc
print(date_naive.tzinfo)  # None
https://docs.python.org/3/library/datetime.html#determining-if-an-object-is-aware-or-naive
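A minimal sketch of making a naive datetime aware before it reaches code that rejects naive values (the parse format here is my assumption, matching the working string above):

from datetime import datetime, timezone

# strptime produces a naive datetime even though the string ends in 'Z'...
naive = datetime.strptime("2017-12-28T12:27:00Z", "%Y-%m-%dT%H:%M:%SZ")
print(naive.tzinfo)  # None

# ...so attach the UTC timezone explicitly to get an aware datetime.
aware = naive.replace(tzinfo=timezone.utc)
print(aware.isoformat())  # 2017-12-28T12:27:00+00:00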
I am trying to get JSON using the get_service_graph() call provided by the AWS X-Ray Python SDK in an AWS Lambda function (reference link).
import boto3
from datetime import datetime

def lambda_handler(event, context):
    client = boto3.client('xray')
    response1 = client.get_service_graph(
        StartTime=datetime(2017, 5, 20, 12, 0),
        EndTime=datetime(2017, 5, 20, 18, 0)
    )
    return response1
However, when I passed the StartTime and EndTime parameters, the stack trace reported that the datetime type is not JSON serializable. I even tried the following way:
response1 = client.get_service_graph(
    StartTime="2017-05-20 00:00:00",
    EndTime="2017-05-20 02:00:00"
)
What's weird is that if EndTime is set to "2017-05-20 01:00:00", no error is generated. Otherwise, the same error occurs:
{
  "stackTrace": [
    [
      "/usr/lib64/python2.7/json/__init__.py",
      251,
      "dumps",
      "sort_keys=sort_keys, **kw).encode(obj)"
    ],
    [
      "/usr/lib64/python2.7/json/encoder.py",
      207,
      "encode",
      "chunks = self.iterencode(o, _one_shot=True)"
    ],
    [
      "/usr/lib64/python2.7/json/encoder.py",
      270,
      "iterencode",
      "return _iterencode(o, 0)"
    ],
    [
      "/var/runtime/awslambda/bootstrap.py",
      104,
      "decimal_serializer",
      "raise TypeError(repr(o) + \" is not JSON serializable\")"
    ]
  ],
  "errorType": "TypeError",
  "errorMessage": "datetime.datetime(2017, 5, 20, 1, 53, 13, tzinfo=tzlocal()) is not JSON serializable"
}
I did try using only a date, like datetime(2017, 5, 20). However, if I use two consecutive days as StartTime and EndTime, the runtime complains that the interval can't be more than 6 hours. If I use the same date, it only returns empty JSON. I don't know how to control the granularity of get_service_graph().
I think the Python SDK for AWS X-Ray might be immature, but I'd still like to seek help from someone who has had the same experience. Thanks!
The right way is using datetime(2017, 5, 20), not a string... but can you try using only a date, without a time? At least the AWS docs show an example exactly like yours, but with only yyyy-mm-dd and no time.
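Worth noting: the failing value in the error message (1:53:13, tzinfo=tzlocal()) is neither StartTime nor EndTime, which suggests the encoder is choking on datetime values inside the returned response rather than on the request parameters. A common workaround, sketched here as an assumption rather than a confirmed fix, is to serialize the response yourself before returning it:

import json
from datetime import datetime

# Simulate a boto3-style response dict that contains a datetime value.
response1 = {"EndTime": datetime(2017, 5, 20, 1, 53, 13)}

# default=str converts any non-serializable value (here the datetime) to a string.
print(json.dumps(response1, default=str))
# {"EndTime": "2017-05-20 01:53:13"}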
I put together a primitive JSON body for FCM:
Body = mochijson2:encode([ {<<"operation">>, <<"create">>},{<<"notification_key_name">>, <<"console group">>},{<<"registration_ids">>, [<<"02aa6XXXX3c9b6d">>,<<"APA91bGtaXXXXXXXXXXXXoi4UH8vIdZk1X67A_9izpSFSHV3BXxdIwG">>]}]).
And sent a POST request to create a group, according to the documentation:
httpc:request(post, {Url, [{"Authorization", KeyApi}, {"project_id", ProjectId}], "application/json", Body},[{timeout, 5000}], []).
But I got the error BadJsonFormat:
{ok,{{"HTTP/1.1",400,"Bad Request"},
[{"cache-control","private, max-age=0"},
{"date","Fri, 10 Mar 2017 16:19:37 GMT"},
{"accept-ranges","none"},
{"server","GSE"},
{"vary","Accept-Encoding"},
{"content-length","25"},
{"content-type","application/json; charset=UTF-8"},
{"expires","Fri, 10 Mar 2017 16:19:37 GMT"},
{"x-content-type-options","nosniff"},
{"x-frame-options","SAMEORIGIN"},
{"x-xss-protection","1; mode=block"},
{"alt-svc","quic=\":443\"; ma=2592000; v=\"36,35,34\""}],
"{\"error\":\"BadJsonFormat\"}"}}
But mochijson2:decode(Body) works fine and it looks like properly formed JSON, yet I get the BadJsonFormat error anyway.
What was wrong? How can I fix this?
The function mochijson2:encode doesn't return a string or a binary, but an iolist:
1> Body = mochijson2:encode([ {<<"operation">>, <<"create">>},{<<"notification_key_name">>, <<"console group">>},{<<"registration_ids">>, [<<"02aa6XXXX3c9b6d">>,<<"APA91bGtaXXXXXXXXXXXXoi4UH8vIdZk1X67A_9izpSFSHV3BXxdIwG">>]}]).
[123,
[34,<<"operation">>,34],
58,
[34,<<"create">>,34],
44,
[34,<<"notification_key_name">>,34],
58,
[34,<<"console group">>,34],
44,
[34,<<"registration_ids">>,34],
58,
[91,
[34,<<"02aa6XXXX3c9b6d">>,34],
44,
[34,<<"APA91bGtaXXXXXXXXXXXXoi4UH8vIdZk1X67A_9izpSF"...>>,
34],
93],
125]
There is nothing wrong with that by itself. Using iolists instead of strings or binaries means that you don't have to create an expensive flat data structure that you would just write to a file or a socket and then throw away. Functions like file:write_file and gen_tcp:send handle iolists just as well as strings or binaries.
However, httpc:request doesn't!
Let's test that by starting a server on port 1111 with netcat in a shell:
$ nc -l 1111
And then make a request from the Erlang shell:
3> httpc:request(post, {"http://127.0.0.1:1111", [], "application/json", Body},[{timeout, 5000}], []).
The netcat server shows this output:
POST / HTTP/1.1
content-type: application/json
content-length: 13
te:
host: 127.0.0.1:1111
connection: keep-alive
{"operation":"create",....
Note that the content-length is 13 instead of 159! httpc:request is able to send the iolist, but it uses the function length instead of iolist_size to generate the content-length header, and as a result the server only considers the first 13 bytes of the JSON object, which is not valid JSON by itself.
The solution is to pass iolist_to_binary(Body) to httpc:request instead of just Body.
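A rough Python analogy of the bug (my illustration, not the Erlang internals): counting the top-level elements of a nested structure is not the same as counting its bytes.

# A nested iolist-like structure representing the JSON text {"operation":"create"}.
body = [123, [34, b"operation", 34], 58, [34, b"create", 34], 125]

def iolist_size(x):
    # Recursively sum the byte size, the way Erlang's iolist_size/1 does.
    if isinstance(x, int):
        return 1  # a bare integer stands for a single byte
    if isinstance(x, (bytes, bytearray)):
        return len(x)
    return sum(iolist_size(element) for element in x)

print(len(body))          # 5  -- analogous to the too-small content-length
print(iolist_size(body))  # 22 -- the actual number of bytes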