I have a C# project in Microsoft Visual Studio, and I need to send JSON data to my Arduino via an Ethernet Shield.
Is this possible? How do I do it?
Yes, you can do that.
There are several JSON libraries for Arduino:
Arduino JSON
aJSON
json-arduino
There are some differences between them; one is memory allocation (dynamic for aJSON and json-arduino, static for Arduino JSON).
I have only used Arduino JSON; its GitHub documentation and comparison convinced me.
Remember that the Arduino has very little memory, so you should avoid sending big JSON messages; with a message like the one below, you shouldn't have any memory issues.
{"led":"255,255,255","tv":"on","air":"32"}
You can build a very decent web client just by following and "merging" these two tutorials:
Create a WS client
JSON Parser
Good luck!
You could grab the HTTP request when it arrives on the Arduino and manually parse it to get the JSON keys and values.
This is probably a bit more work than just using a library, but you can save quite a lot of memory by not having to include an entire library.
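To give an idea of what that manual parsing looks like, here is the scanning logic sketched in Python just for brevity; on the Arduino itself you would do the same thing with String.indexOf()/substring() or strtok() on a char buffer.

# Minimal manual extraction of a string value from a flat JSON message,
# without a JSON library. Assumes the simple {"key":"value",...} shape shown above.
def get_value(message, key):
    marker = '"' + key + '":"'
    start = message.find(marker)
    if start == -1:
        return None                      # key not present
    start += len(marker)
    end = message.find('"', start)       # value runs until the next quote
    return message[start:end]

body = '{"led":"255,255,255","tv":"on","air":"32"}'
print(get_value(body, "led"))            # -> 255,255,255
print(get_value(body, "tv"))             # -> on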
Since I have no idea what other code is running on your Arduino, or whether you're using an Uno or a Mega, you might need that extra memory.
If memory is no issue, take a look at Gonza's answer!
Good Luck!
I just want to point out that JSON parsing on the Arduino can be very slow, because the device (Arduino Uno) has a clock speed of only 16 MHz and only 32 KB of memory. So consider using a simple text response instead, for example pipe-separated values parsed with something like substring().
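For example, instead of the JSON message above you could send something like 255,255,255|on|32 with a fixed field order. Sketched in Python only to show the format; on the Arduino you would split it with String.indexOf()/substring().

# Pipe-separated alternative to the JSON message: fixed field order, no parser needed.
response = "255,255,255|on|32"
led, tv, air = response.split("|")
print(led, tv, air)   # -> 255,255,255 on 32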
Related
I'm writing data to a JSON file in Processing with the saveJSONObject command. I would like to access that JSON file with another program (MAX/MSP) while my sketch is still open. The problem is, MAX is unable to read from the file while my sketch is running. Only after I close the sketch is MAX able to import data from my file.
Is Processing keeping that file open somehow while the sketch is running? Is there any way I can get around this problem?
It might be easier to stream your data straight to MaxMSP using the OSC protocol. On the Processing side, have a look at the oscP5 library and on the Max side at the udpreceive object.
You could send your JSON object as a string and unpack it in Max (maybe using the JavaScript support already present in Max), but it might be simpler to mimic the structure of your JSON object in the arguments of the OSC message, which you can simply unpack in Max directly.
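The mapping is essentially "one OSC address per field, the value as the argument". Your sketch would use oscP5 for this; purely to illustrate the idea, here is the equivalent in Python with the python-osc package. The port and addresses are arbitrary and would need to match your udpreceive object in Max.

from pythonosc.udp_client import SimpleUDPClient

# Arbitrary port; must match the port given to [udpreceive] in Max.
client = SimpleUDPClient("127.0.0.1", 7400)

data = {"tempo": 120, "track": "intro"}   # arbitrary example data
for key, value in data.items():
    # One OSC message per JSON field: /tempo 120, /track "intro"
    client.send_message("/" + key, value)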
Probably, because I/O is usually buffered (notably for performance reasons, and also because the hardware does I/O in blocks).
Try to flush the output channel, perhaps using PrintWriter::flush or something similar.
Details are implementation specific (and might be operating system specific).
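To show the principle (sketched in Python, since the exact call depends on which Processing API you use): keep the file handle around and flush it while the program is still running, so another process can read the data before your program exits.

import json
import os

data = {"frame": 120, "score": 42}   # arbitrary example data

f = open("data.json", "w")
json.dump(data, f)
f.flush()                  # push the userspace buffer to the OS
os.fsync(f.fileno())       # optionally force the OS to write the blocks to disk
# ... the program keeps running; another process can now read data.json ...
f.close()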
Mashery IOdocs is really a great tool for documenting APIs.
I'm using it for a fairly big project with more than 50 methods and complex structures sent to the API, so my JSON config file is more than 4,000 lines long.
I self-host IOdocs on a VPS along with other things, and the documentation is awfully slow because of my long JSON file.
Any ideas for coping with this latency, other than the obvious one of splitting my JSON config file into several smaller files?
I have a fork of IO Docs with some performance improvements which may help. In this instance they involve stripping out json-minify (which is only used to allow comments in the source specifications), caching the specifications server-side, and not having to load the specification via a synchronous AJAX call on the client.
Here's my use case:
I am implementing a Finatra server that should be able to receive many large concurrent requests.
These requests have a large body (several megabytes) made up of many small JSON objects concatenated together.
I'd like to avoid loading the entire request body into memory. I'm looking for a way to read the request body in chunks, and to use a JSON parser that supports this sort of asynchronous parsing.
In node.js this can be achieved with the jsonsp package (see the example: https://github.com/jaredhanson/node-jsonsp/blob/master/examples/twitter-stream/app.js).
Can I do something similar with Finatra, and if so, how?
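(To show the pattern I'm after independently of the framework: in Python I would keep a buffer and repeatedly pull complete objects off the front with json.JSONDecoder.raw_decode as chunks arrive. A rough sketch:)

import json

decoder = json.JSONDecoder()
buffer = ""

def on_chunk(chunk, handle):
    """Feed one network chunk; call handle(obj) for every complete JSON object."""
    global buffer
    buffer += chunk
    while True:
        stripped = buffer.lstrip()
        if not stripped:
            buffer = ""
            return
        try:
            obj, end = decoder.raw_decode(stripped)
        except ValueError:
            buffer = stripped            # incomplete object, wait for more data
            return
        handle(obj)
        buffer = stripped[end:]

# Example: two small objects arriving split across chunks
on_chunk('{"id": 1}{"id"', print)
on_chunk(': 2}', print)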
PS -
I also posted the question here, but got no answer so far.
This is not currently possible with Finatra. Finatra will not call your route until the entire request has been received and buffered into a ChannelBuffer. In addition, Finatra reads the request as a single chunk, so you cannot receive any body larger than roughly 2 MB; setting com.twitter.finatra.config.maxRequestSize to something higher than 2048 will cause it to crash at runtime.
I've switched to the Play Framework, using the embedded NettyServer and the "String Interpolating Routing DSL" to keep a DSL similar to Finatra's.
I need to transfer data (objects) between a client and a server, and Twisted seems like a good way to accomplish this. I've done a lot of searching but still haven't found any example that explains the basic principle, so any simple code would help.
Thanks!
EDIT
Both the client and the server are written in Python.
The data may be large, so I need fast, reliable transmission (I've taken a look at producers; are they a good fit?).
Flask is great, but I'm using another framework, so the whole networking side relies on Twisted.
It's hard to tell whether your question is more about JSON, Python, or Twisted, but here's an overview; more can follow once the specifics are known. Perhaps you could add some more info to your question so we can offer more assistance. :-)
Re JSON: JSON is just a string with a defined structure. If you are working in Python and have an object to send as JSON, you need to convert the object to a JSON string using:
import json
objectName = {"some": "data", "to": "send"}   # any JSON-serializable object (dict, list, etc.)
jsonString = json.dumps(objectName)
If your client is JavaScript, then instead of json.dumps you would use JSON.stringify(objectName).
If you intend to use JavaScript clients, some frameworks, like jQuery, make this very easy.
Python's json.dumps has a lot of optional arguments, most of which you won't need. You can see the options at https://docs.python.org/2/library/json.html
Python is Python; I assume you know how to create and populate objects. Will your client be Python, JavaScript, or something else? From a JavaScript client to a Python server you would most likely use Ajax to send requests and get responses.
Twisted lets you easily create a server that listens on a given port; when data arrives, an event fires and supplies the data received, and you can then do whatever you need to with it. Just be careful about doing blocking things like database inserts, since the server may miss data or otherwise misbehave if you block its event loop. Twisted can be difficult to learn initially, but it is a very powerful, reliable, and well-proven system. One alternative to consider, particularly if your clients are not Python, is node.js; in my opinion, Node is a little easier to grasp initially, and there are thousands of add-on modules that let you do almost anything you'd want. I use both Twisted and Node for different things.
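As a rough sketch of the Twisted side (assuming one JSON object per line over TCP, on an arbitrary port, 8007 here), the receiving server could look something like this:

import json

from twisted.internet import reactor, protocol
from twisted.protocols.basic import LineReceiver


class JsonLineProtocol(LineReceiver):
    """Receives newline-delimited JSON objects and hands them to the application."""

    def lineReceived(self, line):
        try:
            obj = json.loads(line)
        except ValueError:
            return                       # ignore malformed lines
        print("received:", obj)          # do something non-blocking with the data


factory = protocol.ServerFactory()
factory.protocol = JsonLineProtocol

reactor.listenTCP(8007, factory)         # arbitrary port
reactor.run()

The client would connect with reactor.connectTCP (or even a plain socket) and write one json.dumps(...) per line; for the large transfers you mention, producers are indeed Twisted's mechanism for throttling the stream.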
Neither node.js nor Twisted is something you can use to quickly spin up a server or client without some study and experimentation. Using either of them properly and confidently, with all their features and goodness, requires a bit of research and work on your part.
There are excellent frameworks like Flask that can be used to build a server that reacts to a number of different Ajax calls from a client; a single server can respond to several different kinds of requests, instead of needing a separate server for each Ajax type.
This is a small library that serializes an object with all its children to JSON and also parses it back to a fully working object:
https://github.com/Toubs/PyJSONSerialization/
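If you only need the serializing direction, a minimal standard-library sketch of the same idea (assuming plain objects whose state lives in their attributes) is to fall back to each object's __dict__; turning the JSON back into fully working objects is the part where a library like the one above earns its keep.

import json

class Child(object):
    def __init__(self):
        self.value = 42

class Parent(object):
    def __init__(self):
        self.name = "example"
        self.child = Child()

obj = Parent()
# Serialize the whole object graph by dumping each object's attribute dict.
json_string = json.dumps(obj, default=lambda o: o.__dict__)
print(json_string)   # -> {"name": "example", "child": {"value": 42}}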
Some of my friends are designing a game, and I am helping them out by implementing the game's backend server. The game is written in Flash, and I plan to develop the server in node.js because (a) it would be a cool project for learning node.js, and (b) it's fast, which is important for games.
The server's architecture is based on messages sent between the server and client (sort of like Minecraft's server protocol). The message format I have so far is a byte (the packet type), two bytes (the message length) and that many bytes (the message data, which is a mapping of key-value pairs). Problem is, I really don't want to develop my own serialization format (because while I probably could, implementing it would be a pain compared to using an existing solution).
Unfortunately, I am having problems finding a good candidate for the message data serialization format.
ActionScript's own remoting format might work, but I don't like it much.
JSON has support in node.js (obviously) and in ActionScript, but it's also textual and I would prefer binary for enhanced speed.
MessagePack looked like a good candidate, but I can't find an ActionScript implementation. (There's one called as3-msgpack on Google Code, but I get weird errors and can't access it.)
BSON has an ActionScript implementation, but no node.js support besides their MongoDB library (and I'm planning on using Redis).
So, can anyone offer any other serialization formats that I might have missed? Or should I just stick with one of these (or roll my own)?
Isn't that why HTTP supports gzipped content? Just use JSON and gzip the content when you send it. The time spent gzipping is more than recovered by the reduced latency of the transmission.
Check this article for more on gzip with Actionscript. On node.js I think that gzip-compress is fairly popular.
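The saving is easy to see; here is the principle sketched with Python's standard library (node.js has the built-in zlib module, and the Actionscript article above covers the Flash side):

import gzip
import json

# A repetitive JSON payload, the kind that compresses well.
payload = json.dumps([{"x": i, "y": i * 2, "state": "idle"} for i in range(500)])

compressed = gzip.compress(payload.encode("utf-8"))
print(len(payload), "bytes raw ->", len(compressed), "bytes gzipped")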
Actually, if I were in your shoes I would implement two methods and time them: use JSON because it is common and easy to do, but then implement AMQP as well and compare them. If you want to scale this massively, you might find that AMQP makes it easier. Also, message queuing is just such a nice fit for the node.js world view.
AMQP on Actionscript, and someone doing something similar on node.js.
Leverage JSAMF in Node.js for AMF communications with Flash.
http://www.jamesward.com/2010/07/07/amf-js-a-pure-javascript-amf-implementation/
If you wanted to, you could create your entire API in client-side JavaScript and use JSON as the data exchange format, then call ExternalInterface from ActionScript to communicate with the client JavaScript API, which would make for an elegant server-side solution.
It is worth noting that Flash Player has built-in support for decompressing gzip-compressed data. It may be worth compressing some of your JSON objects, such as localised string tables and game configuration data, which can grow to a few hundred KB but are only loaded once when the game loads.
I'm working on a version of MessagePack for AS3.
The current version handles the basics (encoding and decoding); streaming support is planned for the future.
Check the project page: https://github.com/loteixeira/as3-msgpack