Azure Message Routing: JSON message in wrong format

I'm working with a Raspberry Pi Zero and Python to send and receive sensor data with Azure IoT. I've already created an endpoint and message routing to the storage container. But when I check the JSON files in the container, I have two problems:
The files include various general data which I don't need.
My message body is in Base64 format.
My message looks like this:
{"EnqueuedTimeUtc":"2021-06-25T13:03:25.7110000Z","Properties":{},"SystemProperties":{"connectionDeviceId":"RaspberryPi","connectionAuthMethod":"{"scope":"device","type":"sas","issuer":"iothub","acceptingIpFilterRule":null}","connectionDeviceGenerationId":"637555519600003402","enqueuedTime":"2021-06-25T13:03:25.7110000Z"},"Body":"eyJ0ZW1wZXJhdHVyZSI6IDI4Ljk1LCAicHJlc3N1cmUiOiA5ODEuMDg2Njk1NDU5MzMyNiwgImh1bWlkaXR5IjogNDYuMjE0ODE3NjkyOTEyODgsICJ0aW1lIjogIjIwMjEtMDYtMjUgMTQ6MDM6MjUuNjMxNzk1In0="}
The body contains my sensor data in Base64 format. I've already read about setting contentType = application/json and contentEncoding = UTF-8 so that Azure can work with proper JSON files. But where do I apply these settings? When I apply them to the routing query, I get the following error:
Routing Query Error (The server didn't understand your query. Check your query syntax and try again)
I just want to get the message body in proper JSON format.
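For reference, the Body field above is just my JSON payload Base64-encoded; decoding it locally (purely as an illustrative check) gives back the sensor reading:
import base64, json

# illustrative only: decode the "Body" field from one routed record locally
body = "eyJ0ZW1wZXJhdHVyZSI6IDI4Ljk1LCAicHJlc3N1cmUiOiA5ODEuMDg2Njk1NDU5MzMyNiwgImh1bWlkaXR5IjogNDYuMjE0ODE3NjkyOTEyODgsICJ0aW1lIjogIjIwMjEtMDYtMjUgMTQ6MDM6MjUuNjMxNzk1In0="
print(json.loads(base64.b64decode(body)))  # the original sensor readings as a dict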
Thank you all for any kind of help! Since this is my first experience with this kind of stuff, I'm a little lost.

Zero clue if this helps, but here is my code for sending data from a Raspberry Pi (Python) to an AWS-hosted Parse Server using base64/JSON. The only reason I use base64 is to send pictures; you should only need plain JSON to send your other data.
import requests
import random, time
import math
import json
import Adafruit_DHT
import base64
from Adafruit_CCS811 import Adafruit_CCS811
from picamera import PiCamera
from time import sleep

DHT_SENSOR = Adafruit_DHT.DHT22
DHT_PIN = 4

ccs = Adafruit_CCS811()
camera = PiCamera()

while True:
    time.sleep(5)

    # capture a still image with the Pi camera
    camera.start_preview()
    sleep(5)
    camera.capture('/home/pi/Desktop/image.jpg')
    camera.stop_preview()

    # base64-encode the image so it can travel inside the JSON payload
    with open('/home/pi/Desktop/image.jpg', 'rb') as binary_file:
        binary_file_data = binary_file.read()
        base64_encoded_data = base64.b64encode(binary_file_data)
        base64_message = base64_encoded_data.decode('utf-8')

    # read the DHT22 and CCS811 sensors
    humidity, temperature = Adafruit_DHT.read_retry(DHT_SENSOR, DHT_PIN)
    ccs.readData()

    parseServer = {
        "temp": temperature,
        "humid": humidity,
        "co2": ccs.geteCO2(),
        "pic": base64_message
    }
    resultJSON = json.dumps(parseServer)

    headers = {
        'X-Parse-Application-Id': 'myappID',
        'Content-Type': 'application/json',
    }
    data = resultJSON
    response = requests.put('http://1.11.111.1111/parse/classes/Gamefuck/TIuRnws3Ag',
                            headers=headers, data=data)
    print(data)

If you're using the Python SDK for Azure IoT, sending the message as UTF-8 encoded JSON is as easy as setting two properties on your message object. There is a good example here.
msg.content_encoding = "utf-8"
msg.content_type = "application/json"
Furthermore, you don't need to change anything in IoT Hub for this. Setting these message properties is also a prerequisite for doing message routing based on the body of the message.
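As a rough sketch (assuming the azure-iot-device package and a device connection string; the connection string and payload here are placeholders), sending a routable UTF-8 JSON message could look like this:
import json
from azure.iot.device import IoTHubDeviceClient, Message

# placeholder connection string; use your device's credentials
client = IoTHubDeviceClient.create_from_connection_string("HostName=...;DeviceId=RaspberryPi;SharedAccessKey=...")

payload = {"temperature": 28.95, "humidity": 46.21}  # example sensor readings
msg = Message(json.dumps(payload))
msg.content_encoding = "utf-8"         # tells IoT Hub how the body is encoded
msg.content_type = "application/json"  # lets routing/storage treat the body as JSON
client.send_message(msg)
client.shutdown()
With these two properties set, the records that land in the storage container carry the body as readable JSON instead of Base64.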

Related

Extract data from Zapier Storage

I was successful in publishing (POST) a JSON file in Zapier and creating a Storage for it. However, I'd like to access the JSON in Zapier Storage using a Python script run locally. I am able to access the storage with Python 3 and see that something is written there, but I cannot access the JSON contents.
import urllib.request
import json
import codecs

reader = codecs.getreader("utf-8")
access_token = "password"

def GetStorage(page_id, access_token):
    url = 'https://hooks.zapier.com/url/'
    response = urllib.request.urlopen(url)
    data = json.load(reader(response))
    return data

a = GetStorage("page_id", access_token)
print(a)
All I get is:
{'attempt': '5a539a49-65eb-44f8-a30e-e171faf7a680',
'id': '1b38d21a-0150-46df-98c1-490a0d04b565',
'request_id': '5a539a49-65eb-44f8-a30e-e171faf7a680',
'status': 'success'}
When in fact I need:
{'Name':'value',
'Address': 'value'
}
Any ideas?
David here, from the Zapier Platform team.
You're close! hooks.zapier.com is the URL we use for incoming webhooks, so we always reply with a 200 and the response body you're seeing.
Instead, use store.zapier.com. You'll also want to make sure to include your secret. A full request URL will look like:
https://store.zapier.com/api/records?secret=test
which will return arbitrary json data:
{
"name": "david",
"job": "programmer"
}
The full docs are available (in JSON) here: https://store.zapier.com/
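For completeness, a minimal sketch of reading the store from Python (the secret value is the placeholder from the URL above; substitute your own):
import requests

# placeholder secret; use the secret configured for your Zapier store
url = "https://store.zapier.com/api/records?secret=test"
response = requests.get(url)
data = response.json()  # the JSON you stored, e.g. {"name": "david", "job": "programmer"}
print(data)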

How to send and receive large numpy arrays (several GBs) using flask

I am creating a micro-service to be used locally. From some input I am generating one large matrix each time. Right now I am using JSON to transfer the data, but it is really slow and has become the bottleneck of my application.
Here is my client side:
headers = {'Content-Type': 'application/json'}
data = {'model': 'model_4',
        'input': "this is my input."}
r = requests.post("http://10.0.1.6:3000/api/getFeatureMatrix",
                  headers=headers, data=json.dumps(data))
answer = json.loads(r.text)
My server is something like:
app = Flask(__name__, static_url_path='', static_folder='public')

@app.route('/api/getFeatureMatrix', methods=['POST'])
def get_feature_matrix():
    arguments = request.get_json()
    # processing ... generating matrix
    return jsonify(matrix=matrix.tolist())
How can I send large matrices ?
In the end I ended up using
np.save(matrix_path, mat)
return send_file(matrix_path+'.npy')
On the client side I save the matrix before loading it.
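A minimal client-side sketch of that approach (endpoint, payload, and file paths are just placeholders mirroring the question) could be:
import json
import numpy as np
import requests

# hypothetical client for the send_file('.npy') approach described above
r = requests.post("http://10.0.1.6:3000/api/getFeatureMatrix",
                  headers={'Content-Type': 'application/json'},
                  data=json.dumps({'model': 'model_4', 'input': "this is my input."}))
with open('matrix.npy', 'wb') as f:
    f.write(r.content)              # save the raw .npy bytes to disk
matrix = np.load('matrix.npy')      # load them back into a numpy array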
I suppose the problem is that the matrix takes time to generate: it's a CPU-bound application.
One solution would be to handle the request asynchronously, meaning that:
The server receives the request and returns 202 ACCEPTED together with a link where the client can check the progress of the matrix creation.
The client polls the returned URL and either gets:
a 200 OK response if the matrix is not yet created, or
a 201 CREATED response, with a link to the resource, once the matrix is finally created.
However, Flask handles one request at a time, so you'll need to use multithreading, multiprocessing, or green threads.
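A minimal sketch of that pattern (the route names, the background thread, the in-memory job store, and the omitted /result endpoint are all assumptions, not part of the original answer):
import threading
import uuid
import numpy as np
from flask import Flask, jsonify, url_for

app = Flask(__name__)
jobs = {}  # job_id -> matrix, or None while still computing

def compute_matrix(job_id):
    # stand-in for the CPU-bound matrix generation
    jobs[job_id] = np.random.rand(1000, 1000)

@app.route('/api/getFeatureMatrix', methods=['POST'])
def start_job():
    job_id = str(uuid.uuid4())
    jobs[job_id] = None
    threading.Thread(target=compute_matrix, args=(job_id,)).start()
    # 202 ACCEPTED plus a link the client can poll
    return jsonify(status_url=url_for('job_status', job_id=job_id)), 202

@app.route('/status/<job_id>')
def job_status(job_id):
    if jobs.get(job_id) is None:
        return jsonify(state='pending'), 200                    # not created yet
    return jsonify(state='done', result_url='/result/' + job_id), 201  # created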
On the client side you could do something like:
with open('binary.file', 'rb') as f:
    file = f.read()

response = requests.post('/endpoint', data=file)
and on the Server side:
import numpy as np
...
@app.route('/endpoint', methods=['POST'])
def endpoint():
    filestr = request.data
    file = np.fromstring(filestr)  # np.frombuffer is the non-deprecated equivalent

JSON encoding error publishing SNS message with Boto 3

I am trying to send a simple JSON message to an Amazon SNS topic in Boto 3. However, I keep getting a _jsonparsefailure in the tag of the message and I only receive the default value. Here is my code:
mess = {'default': 'default', 'this': 'that'}
jmess = json.JSONEncoder().encode(mess)
response = self.boto_client.publish(
    TopicArn=self.TopicArn,
    MessageStructure='json',
    Message=jmess
)
I have also tried json.dumps(), which produces the same result.
mess = {'default': 'default', 'this': 'that'}
jmess = json.dumps(mess)
response = self.boto_client.publish(
    TopicArn=self.TopicArn,
    MessageStructure='json',
    Message=jmess
)
I seem to be following all of the guidelines set by the documentation, and I'm not getting an exception when I run the script. There are SQS queues that subscribe to the topic, and I am pulling the result data straight from the console.
This is how I fixed it:
message = {"record_id": "my_id", "name": "value"}
json_message = json.dumps({"default": json.dumps(message)})
sns_client.publish(TopicArn="topic_arn", Subject="test",
                   MessageStructure="json", Message=json_message)
SNS expects "default" as the key which contains the message to be published.
It turns out the message needs to look like this:
json.dumps({"default": "my default", "sqs": json.dumps({"this": "that"})})
Amazon has horrible documentation in this regard.
You can also remove MessageStructure='json' and send just json.dumps({'this': 'that'}) if you enable raw message delivery on the SQS subscription. This is simply done through the console.
In Boto 3 (I'm using v1.4.7) this is the format:
sns.publish(TopicArn="topic_arn", Message=json.dumps({"this": "that"},ensure_ascii=False))
There isn't any need for the protocol definition (i.e. "default") unless you are delivering different structures per protocol, e.g. JSON for Lambda and HTML for email.

Returning JSON object from Django view to client via AJAX

Basically, I am trying to turn a raw SQL query result into a JSON object and then send it to the client side via AJAX.
Here is my view (I am using Django 1.8.6)
import MySQLdb
from django.db import connection
import json
from django.http import HttpResponse
from django.core import serializers
def test_view(request):
    cursor = connection.cursor()
    cursor.execute("select id, name from okved")
    data = cursor.fetchall
    json_data = serializers.serialize('json', data)
    return HttpResponse(json_data, content_type="application/json")
The respective URLConf
url(r'^test/$', test_view),
JQuery function
var test = function()
{
    $.ajax({
        type: "GET",
        url: "/test",
        dataType: 'json',
        cache: "false",
        data: {},
        success: function(response)
        {
            alert("Test successful");
        }
    });
    return true;
}
I am constantly getting a GET http://127.0.0.1:8000/test/ 500 (INTERNAL SERVER ERROR) error, even though I follow all the recommendations I have come across in previous threads here. I would really appreciate any help on this; I have blown my mind surfing Stack Overflow over it.
First of all, a 500 error is just a status code; Django will give you a stack trace of where the error happens in the function stack, and you should learn to read it to find the failing line.
From your code it sounds like you are trying to use serializers to serialize a raw query result. This won't work, because the raw result is a tuple of tuples, where each sub-tuple is a record returned from your database. Django serializers are only good at serializing objects queried through the ORM. You'd be better off with the json.dumps() method. Here are the last few lines of your views.py:
data = cursor.fetchall()
json_data = json.dumps(data)
return HttpResponse(json_data, content_type="application/json")
Here's the doc for the Django serializer.
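Putting that together, a corrected view might look like this (a sketch, keeping the original raw SQL query):
import json
from django.db import connection
from django.http import HttpResponse

def test_view(request):
    cursor = connection.cursor()
    cursor.execute("select id, name from okved")
    data = cursor.fetchall()               # list of (id, name) tuples
    json_data = json.dumps(list(data))     # plain json.dumps instead of serializers
    return HttpResponse(json_data, content_type="application/json")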

How to JSON format an HTTP error response in webapp2

I am using webapp2 for development in App Engine. What I would like to do is to send a custom JSON-formatted response in case of an error. For example, when the request length is larger than a threshold, respond with HTTP 400 and the response body
{'error':'InvalidMessageLength'}
In webapp2, there is the option to assign error handlers for certain exceptions. For example:
app.error_handlers[400] = handle_error_400
Where handle_error_400 is the following:
def handle_error_400(request, response, exception):
    response.write(exception)
    response.set_status(400)
When webapp2.RequestHandler.abort(400) is executed, the above code is executed.
How is it possible to have different response formats (HTML and JSON) dynamically based on the above setup? That is, how is it possible to call different versions of the handle_error_400 function?
Here is a fully working example that demonstrates how to use the same error handler for all kinds of errors; if your URL starts with /json, then the response will be application/json (use your imagination as to how you could make good use of the request object to decide what kind of response you should provide):
import webapp2
import json

def handle_error(request, response, exception):
    if request.path.startswith('/json'):
        response.headers.add_header('Content-Type', 'application/json')
        result = {
            'status': 'error',
            'status_code': exception.code,
            'error_message': exception.explanation,
        }
        response.write(json.dumps(result))
    else:
        response.write(exception)
    response.set_status(exception.code)

app = webapp2.WSGIApplication()
app.error_handlers[404] = handle_error
app.error_handlers[400] = handle_error
In the above example you can easily test the different behaviours by visiting the following URLs, which will return a 404 (the easiest error to test):
http://localhost:8080/404
http://localhost:8080/json/404
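Tying this back to the original question, a request handler that actually triggers the 400 path for oversized bodies might look roughly like this (the threshold, route, and handler name are assumptions, not from the original answer):
MAX_BODY_BYTES = 1024  # hypothetical threshold from the original question

class IngestHandler(webapp2.RequestHandler):
    def post(self):
        if len(self.request.body) > MAX_BODY_BYTES:
            # routes through app.error_handlers[400] registered above
            self.abort(400)
        self.response.write('ok')

app.router.add((r'/json/ingest', IngestHandler))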