Here's a tricky one.
I have a couple of VMs (Debian and Lubuntu); both boxes run the same code, where I pull some data from MySQL and serve it as a pickled object. Everything is fine there. The problem is that when I pull this info and respond with it via jsonify (Flask joins the game), the result differs between the two boxes... I know, not a code question... or is it? I've been scratching my head for hours now and might have lost some hair in the process; I can't figure out what I'm doing wrong. I'd blame the distro differences (the Python version is the same on both; Lubuntu has a newer SQLAlchemy version, though), but I'm not positive about it, since I pulled a value from both responses by its key (changed the query(...).all() for query(...).first() and added a logger.debug(response.id)).
This might answer some rising questions... I "pickled" the query response to store it in Redis, so I can cache those datasets instead of hitting the DB constantly (I didn't want to bring this up earlier to save some confusion; the reason I do it is to keep the KeyedTuple as it is instead of getting back an ugly string).
# here happens the query
import pickle
from sqlalchemy import Table, MetaData
from sqlalchemy.orm import create_session

try:
    connection = create_session(engine)
    vendor_ = Table('vendor', MetaData(engine), autoload=True)
    result = pickle.dumps(connection.query(vendor_).all())
finally:
    connection.close()
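The Redis part is roughly this; the key name, TTL and client setup here are just illustrative, not my exact code:

import pickle
import redis

cache = redis.StrictRedis(host='localhost', port=6379, db=0)

def cache_vendors(pickled_result):
    # store the pickled resultset under an illustrative key with a 5-minute TTL
    cache.setex('vendors', 300, pickled_result)

def cached_vendors():
    # returns the pickled bytes, or None on a cache miss
    return cache.get('vendors')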
In the view, this is how I pull the info:
def all_vendor():
    response = pickle.loads(controls.vendor.all_vendor(True))
    return jsonify(vendors=response)
The response on the Debian box:
{
  "vendors": [
    {
      "id": 9,
      "name": "TEST",
      "contact": "TEST",
      "phone": "888888",
      "email": "test@mail.test"
    }
  ]
}
The response on the Lubuntu box:
{
  "vendors": [
    [
      9,
      "TEST",
      "TEST",
      "888888",
      "test@mail.test"
    ]
  ]
}
Weird, huh? Has anyone experienced this before? If so, how the * did you fix it?
Still unanswered. Well, it is a library thing. strace didn't help as much as I wanted (not an expert, though), but after splitting up the whole process, what SQLAlchemy retrieves is not the same across distros; I had to check pickle, Flask and Redis before arriving at the assumption that the ORM is the culprit here.
No idea what the main difference between them is; in any case, the simplest solution was to map the whole resultset right before pushing the data to jsonify, as in the sketch below. It is not nice, since it eats more resources, but at least it prevents the mess.
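The mapping looks roughly like this (a sketch; it assumes the rows still expose keys(), as SQLAlchemy's keyed tuples do):

def all_vendor():
    rows = pickle.loads(controls.vendor.all_vendor(True))
    # turn each row into a plain dict so jsonify no longer depends on how
    # the local SQLAlchemy version serializes its row objects
    vendors = [dict(zip(row.keys(), row)) for row in rows]
    return jsonify(vendors=vendors)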
It worked correctly on: Debian latest (unstable), Ubuntu 14.10.
It did not work on: Lubuntu 14.10, Debian stable, CentOS 7.
That's it. I won't dig into this issue any more.
The title sounds quite comprehensive, but my baseline question is quite simple, I guess.
Context
In Azure, I have an IoT hub which I am sending messages to. I use a modified version of one of the samples from the Azure IoT SDK for Python.
Sending works fine. However, instead of a string, I send a JSON structure.
When I watch the events flowing into the IoT hub, using the Cloud shell, it looks like this:
PS /home/marcel> az iot hub monitor-events --hub-name weathertestiothub
This extension 'azure-cli-iot-ext' is deprecated and scheduled for removal. Please remove and add 'azure-iot' instead.
Starting event monitor, use ctrl-c to stop...
{
    "event": {
        "origin": "raspberrypi-zero-wh",
        "payload": "{ \"timestamp\": \"1608643863720\", \"locationDescription\": \"Attic\", \"temperature\": \"21.941\", \"relhumidity\": \"71.602\" }"
    }
}
Issue
The data seems fine, except that the payload looks strange here. BUT the payload is literally what I send from the device, using the SDK sample.
Is this the correct way to do it? In the end, I have a very hard time actually getting the data into the Time Series Insights model, so I guess my structure is to blame.
Question
What is a recommended JSON data structure to send to the IoT hub for later use?
You should add the following two lines to your message in your Python SDK sample:
msg.content_encoding = "utf-8"
msg.content_type = "application/json"
This should resolve your formatting concern.
We've also updated our samples to reflect this: https://github.com/Azure/azure-iot-sdk-python/blob/master/azure-iot-device/samples/sync-samples/send_message.py
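For reference, a minimal sketch of where those two lines fit, based on the linked sample (the connection string and payload are placeholders):

import json
from azure.iot.device import IoTHubDeviceClient, Message

client = IoTHubDeviceClient.create_from_connection_string("<device connection string>")

payload = {"timestamp": "1608643863720", "locationDescription": "Attic",
           "temperature": "21.941", "relhumidity": "71.602"}

msg = Message(json.dumps(payload))
msg.content_encoding = "utf-8"           # how the body is encoded
msg.content_type = "application/json"    # lets downstream services parse the body as JSON

client.send_message(msg)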
I ended up using the tip by @elhorton, but it was not the key change. Nonetheless, the formatting in the Azure Shell monitor now looks much better:
"event": {
"origin": "raspberrypi-zero-wh",
"payload": {
"temperature": 21.543947753906245,
"humidity": 69.22964477539062,
"locationDescription": "Attic"
}
}
The key was:
1. Include the message source time in ISO format:
from datetime import datetime
timestampIso = datetime.now().isoformat()
message.custom_properties["iothub-creation-time-utc"] = timestampIso
2. Use the locationDescription as the Time Series ID Property; see https://learn.microsoft.com/en-us/azure/time-series-insights/how-to-select-tsid (maybe I could also have taken the iothub-connection-device-id, but I did not test that alone specifically).
I guess using "iothub-connection-device-id" would make "raspberrypi-zero-wh" the name of the time series instance. I agree with your choice of "locationDescription" as the TSID, so Attic becomes the time series instance name, and temperature and humidity will be your variables.
I'm trying to build a small app. I found some useful resources which helped me a lot, but when it comes to problem solving, my (still) superficial knowledge hinders me from analyzing the issue correctly.
First of all, I'm running a local database via XAMPP and successfully connected to it based on this article:
I made the function that encapsulates the whole connection/query/disconnection process a method of my React component, and I'm calling it in componentDidMount() like so:
componentDidMount() {
  let current = this;
  this.hasMounted = true;
  this.getClients(function(rows) {
    if (current.hasMounted) {
      current.setState({
        clients: rows
      })
    }
  })
}
The cool thing: it's working. The bad thing: it takes 10 seconds to complete although my table has just 2 rows... Any thoughts on this?
UPDATE:
I resumed working on the app today and it worked perfectly... until it didn't. Might it have to do with repeated restarting/hot reloading through npm? Strangely, though, restarting the PC didn't solve it, which I would have expected to make the difference compared to yesterday...
Problem
Hello, my problem is that I want to use the ssh2-python package to remotely read a bunch of files, but I can't seem to send commands to the remote host machine.
Originally I started with the paramiko package and I did get that to work, but I am dealing with a lot of large memory files (which is why I can't bring them to the local machine) and it is a bit too slow. I am currently running Python 3.6.3 & ssh2-python 0.18.0.post1 and have tried changing versions of ssh2-python, but it didn't help.
Code
import socket
from ssh2.session import Session

host_ip = socket.gethostbyname('hostname')

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.connect((host_ip, 22))

session = Session()
session.handshake(sock)
print(session.userauth_list('username'))
session.userauth_password('username', 'password')

channel = session.open_session()
channel.execute('echo Hello')
Code Prints the Following
0
['publickey', 'gssapi-keyex', 'gssapi-with-mic', 'password']
0
0
Expectation/Thoughts
I expected the code to print Hello, but instead it just printed 0. It also printed 0 after the handshake and after the call to the authentication method, and I have no idea why. It seems like I am in contact with the remote machine, as it did print out which authentications it would accept, but it doesn't appear that I am actually logged in and able to do anything. I would really like to use this package, as from what I read online it is significantly faster than paramiko (alternatives would be good too), but I can't seem to figure out what is going on here.
Please help and thanks in advance!
You may in fact be connected and executing commands, but channel.execute('echo Hello') returns 0, which is the library's return/status code, not the command's output.
If you want to read your response from the server:
channel.execute('echo Hello')
size, data = channel.read()
while size:
    size, dt = channel.read()
    data += dt
print(data.decode())
The API documentation for ssh2-python is rather sparse, but the examples should get you through some of the basics: https://github.com/ParallelSSH/ssh2-python/tree/master/examples
A complete version of the above is in example_echo.py
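If you also want the remote command's exit status (the 0 printed by execute() is not it), the channel can be closed and queried afterwards; a short sketch following the same example:

# after reading the output as above, close the channel and ask for the
# remote command's exit status (separate from execute()'s return value)
channel.close()
channel.wait_closed()
print("exit status:", channel.get_exit_status())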
Does the new version (0.24) of Orion allow fuzzy search (approximate string search) over entity properties?
In addition, I tried to create an entity with an empty string, but although the server returns a 201 code, the entity is not created.
//url to create entity (POST)
http://some.ip:port/v2/entities

//payload:
{
  "type": "Test",
  "id": "Test.1",
  "nombre": ""
}

//response
code 201

//url to list entities (GET)
http://some.ip:port/v2/entities?type=Test

//response
[]
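For completeness, the same two calls as a sketch with the Python requests library (the host/port placeholder is as above and needs to be replaced with your own values):

import requests

base = "http://some.ip:port/v2/entities"   # placeholder host and port, as above

# create the entity
create = requests.post(base, json={"type": "Test", "id": "Test.1", "nombre": ""})
print(create.status_code)                   # 201, even though the entity is not stored

# list entities of that type
print(requests.get(base, params={"type": "Test"}).json())   # returns []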
This case doesn't work in Orion 0.24.0 due to a bug that has recently been solved in the develop branch. The fix will be available in the next version after 0.24.0, either 0.24.1 or 0.25.0 (the number is not yet decided at the moment of writing this), by the end of September 2015.
Regarding fuzzy search, we haven't yet considered that functionality in NGSIv2. If you find it useful or needed, I'd recommend you create a new issue in the Orion repository, explaining the feature request in as much detail as you can, please.
I need to create my own REST API.
I just saw StrongLoop and LoopBack and I thought it would be perfect for my project.
In fact, I was able to get MySQL connected using StrongLoop. However, I had to create something called a "model", and I did. But it was like creating a new model from scratch and using it for persisting on the datasource.
Instead, what I was looking for is to get a REST API directly from my models in the DB.
I mean taking the models from each table in the DB and setting them up as web services.
Is that possible?
I am a newbie with these technologies, although I think it is an interesting question.
Thanks
I'm not sure of a Node tool to do what you're asking, but in other languages / databases you have some choices!
The only one I'm really familiar with is postgrest.
postgrest: You import your data into a Postgres database (similar to MySQL), and it generates a REST API on top of your tables instantly. Bam. Done. I've used this and it was amazingly awesome. You can also deploy it directly on Heroku.
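To give an idea of what the generated API looks like, querying a table is just an HTTP GET against the table name; a quick sketch (host, port, table and column names are placeholders):

import requests

# postgrest exposes each table at /<table_name>; column selection and filters
# go in the query string
resp = requests.get("http://localhost:3000/clients", params={"select": "id,name"})
print(resp.json())   # list of rows as JSON objects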
StrongLoop actually has a "discovery" tool for just this purpose! Read up on that page, but here's the basic code to do it. Just drop this code in a file inside /server/boot/ (the docs are wrong on that, it must be in the directory I mention). Of course, you'll need to tailor it for your use case:
var loopback = require('loopback');
var ds = loopback.createDataSource('mysql', {
"host": "yourhost",
"port": 1234,
"database": "foobar",
"username": "someuser",
"password": "somepass"
});
// Discover and build models from a given table
ds.discoverAndBuildModels('PERSON', {visited: {}, associations: true},
function (err, models) {
// Now we have a list of models keyed by the model name
// You only need the rest of this if wanted to inspect what came in...
// For example, you could find the first record from the table
// and verify info or something.
models.Person.findOne({}, function (err, person) {
if(err) {
// handle this if need be...
console.error(err);
return;
}
// Some code using `person`
});
});
Good luck!
I was assuming the "discovery tool" was for finding patterns, clusters or other such things in the data, but following the recommendation of @jakarella I dug deeper.
It was even easier than that, because you can do everything via StrongLoop Arc (GUI). I always prefer the CLI, but to get a general overview it is sometimes better to start with the GUI if you don't yet know much about the subject.
Anyway, first of all you need to connect your datasource (installing the driver beforehand). After that, through StrongLoop Arc you can "discover the models", choosing your tables (taking care to run Arc and restart it every time you get a new model), and that's it: you get an API over your datasource (for testing, go to the explorer).
I have only described the main tasks in general terms, but if anyone needs more detail please let me know. I hope this helps anyone else who is looking to do something similar.
Thanks, guys, for the interest.