Zigbee packet analysis using Killerbee

I am working with TinyOS on MicaZ sensor nodes (a ZigBee platform), and I am using Killerbee to analyze the data packets. Can anyone suggest how to read those hexadecimal values?
The node ID I assign while burning (programming) the nodes does not appear in the data packets at all.

The best tool I've found for analyzing ZigBee / 802.15.4 packets is Wireshark. You just need to figure out a way of getting the raw packets into pcap format. I've got a blog post on adapting the Microchip ZENA analyzer to output pcap format here:
http://jdesbonnet.blogspot.com/2011/02/using-microchip-zena-zigbee802154.html
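If you need to roll the conversion yourself, the pcap container is simple: a 24-byte global header followed by a 16-byte record header per packet. Here is a minimal sketch in Python, assuming you already have the raw 802.15.4 frames as byte strings (the file name and frame source are placeholders; link type 195 is DLT_IEEE802_15_4_WITHFCS):

```python
import struct
import time

DLT_IEEE802_15_4_WITHFCS = 195  # 802.15.4 frames including the FCS

def write_pcap(filename, frames):
    """Write an iterable of raw 802.15.4 frames (bytes) to a pcap file."""
    with open(filename, "wb") as f:
        # Global header: magic, version 2.4, thiszone, sigfigs, snaplen, linktype
        f.write(struct.pack("<IHHiIII",
                            0xA1B2C3D4, 2, 4, 0, 0, 65535,
                            DLT_IEEE802_15_4_WITHFCS))
        for frame in frames:
            ts = time.time()
            sec, usec = int(ts), int((ts % 1) * 1_000_000)
            # Record header: ts_sec, ts_usec, captured length, original length
            f.write(struct.pack("<IIII", sec, usec, len(frame), len(frame)))
            f.write(frame)

# write_pcap("capture.pcap", frames)  # then open capture.pcap in Wireshark
```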

Related

Binary data in JSON

I am using JSON because it's readable and offers flexibility as a transmission protocol for IPC. Part of the exchange between processes is a requirement to transfer large binary files (megabytes).
I am using UDP with JSON as the transport protocol. Currently the binary data is encoded as hex strings with no delimiters, so a single 8-bit character represents each 4-bit nibble.
I'm looking for ways of keeping the JSON protocol while transferring the binary data more efficiently.
The reason is that UDP packets are limited in size, and converting each nibble to a byte doubles the data size, which slows down the transfer.
Can anyone think of a better way of sending the binary data in a JSON packet without losing anything? A sketch of the current scheme follows.
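For concreteness, the nibble-to-hex scheme described above looks like this in Python; the doubling in size is visible directly (Base64 is included purely as a size comparison, since it is a common denser text encoding):

```python
import base64
import binascii
import json

payload = bytes(range(32))  # stand-in for a chunk of the binary file

hex_str = binascii.hexlify(payload).decode("ascii")  # one char per 4-bit nibble
b64_str = base64.b64encode(payload).decode("ascii")  # ~4/3 expansion instead of 2x

print(len(payload), len(hex_str), len(b64_str))       # 32 64 44
msg = json.dumps({"type": "chunk", "data": hex_str})  # hex string rides inside JSON
```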
I recommend Parquet for submitting the information. Parquet is a single-table binary format; it is widely used in machine learning in Python. Here are some examples: Link Link
If your concern is UDP packet size, maybe try sockets instead: Link
I hope it helps you, greetings :)

Stream from JMS queue and store in Hive/MySQL

I have the following setup (that I cannot change) and I'd like some advice from people who have been down that road. I'm not sure if this is the right place to ask, but here goes anyway.
Various JSON messages are placed on different channels of a JMS queue (Universal Messaging/webMethods).
Before the data can be stored in relational-style DBs it has to be transformed: renamed, arrays flattened and some structures from nested objects extracted.
Data has to be appended to MySQL (as a serving layer for a visualization tool) and Hive (for long-term storage).
We're stuck on Spark 1.4.1 and may move to 1.6.0 in a few months' time. So, structured streaming is not (yet) an option.
At some point the events will be streamed directly to real-time dashboards, so having something in place that is capable of doing that now would be ideal.
Ideally, coding is done in Scala (because we already have a considerable batch-based repo using Spark and Scala), so the minimal requirement is JVM-based.
I've looked at Spark Streaming but it does not have a JMS adapter and as far as I can tell operating on JSON would be done using a SQLContext instance on the DStream's RDDs. I understand that it's possible to write a custom adapter, but then I'm not sure if Spark is still the best/easiest solution. I've also looked at the doc for Samza and Flink but did not find much for JMS and/or JSON, at least not natively.
Apache Camel seems like it might have a substantial set of connectors, but I'm not too familiar with it, and I get the impression it does not do the streaming part, 'just' the bit where you connect to various systems. There's also Akka, although I get the impression it's more of a replacement for messaging systems, and JMS is a given here.
There is an almost bewildering number of available tools, and at this point I'm at a loss what to look at or what to look out for. Based on your experience, what do you recommend I use to pick up the messages, transform them, and insert them into Hive and MySQL? A sketch of the Spark pattern I have in mind follows.
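To make the Spark option concrete, my understanding of the SQLContext-on-DStream pattern is roughly the following (a PySpark sketch of the same shape I'd write in Scala, against the 1.4-era API; the socket source stands in for the custom JMS receiver I'd have to write, and the field names are illustrative):

```python
from pyspark import SparkContext
from pyspark.streaming import StreamingContext
from pyspark.sql import SQLContext

sc = SparkContext(appName="jms-json-etl")
ssc = StreamingContext(sc, 10)  # 10-second micro-batches

# Stand-in source; a real setup needs a custom receiver for the JMS queue.
lines = ssc.socketTextStream("localhost", 9999)

def process(time, rdd):
    if rdd.isEmpty():
        return
    sql_context = SQLContext(rdd.context)
    df = sql_context.read.json(rdd)          # infer schema from the JSON messages
    flat = df.select("id", "payload.value")  # rename/flatten/extract as needed
    # flat.write.jdbc(...) for MySQL; flat.write.saveAsTable(...) for Hive

lines.foreachRDD(process)
ssc.start()
ssc.awaitTermination()
```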

Why can't I read a JSON file with a different program while my Processing sketch is still open?

I'm writing data to a JSON file in Processing with the saveJSONObject command. I would like to access that JSON file with another program (MAX/MSP) while my sketch is still open. The problem is, MAX is unable to read from the file while my sketch is running. Only after I close the sketch is MAX able to import data from my file.
Is Processing keeping that file open somehow while the sketch is running? Is there any way I can get around this problem?
It might be easier to stream your data straight to MaxMSP using the OSC protocol. On the Processing side, have a look at the oscP5 library and on the Max side at the udpreceive object.
You could send your JSON object as a string and unpack it in Max (maybe using the JavaScript support already present in Max), but it might be simpler to mimic the structure of your JSON object in the arguments of the OSC message, which you can simply unpack in Max directly.
Probably because I/O is usually buffered (notably for performance reasons, and also because the hardware does I/O in blocks).
Try flushing the output channel, perhaps using PrintWriter::flush or something similar.
The details are implementation-specific (and might be operating-system-specific). A sketch of the idea follows.
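As a language-neutral sketch of the point (shown in Python; in Processing the flush would happen on whatever writer sits behind saveJSONObject):

```python
import json
import os

f = open("data.json", "w")
json.dump({"sensor": 42}, f)
f.flush()             # push the library's buffer down to the OS
os.fsync(f.fileno())  # optionally force the OS to commit the blocks to disk
f.close()             # closing flushes too, which is why the data appears on exit
```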

Webservice C# JSON to Arduino

I have a project in Microsoft Visual Studio C# and I have to send JSON data to my Arduino via an Ethernet Shield.
This is how it works (image omitted):
Is it possible? How do I do it?
Yes. You can do that.
There are some Arduino JSON libraries:
Arduino JSON
aJSON
json-arduino
There are some differences between them; one is memory allocation (dynamic for aJSON and json-arduino, static for Arduino JSON).
I have only used Arduino JSON; I was convinced by the GitHub documentation and comparison.
Remember the Arduino has almost no memory, so you should avoid sending big JSON messages, but if the message is like the one below, you shouldn't have any memory issues.
{"led":"255,255,255","tv":"on","air":"32"}
You can build a very decent web-service client just by following and "merging" these two tutorials:
Create a WS client
JSON Parser
Good luck!
You could grab the HTTP request when it arrives on the Arduino and manually parse it to get the JSON keys/values.
This is probably a bit more work than just using a library, but you could save quite some memory by not having to include an entire library.
Since I have no idea what other code is running on your Arduino, or whether you're using an Uno or a Mega, you might need the extra available memory.
If memory is no issue, take a look at Gonza's answer!
Good Luck!
I just want to point out that JSON parsing on the Arduino could be very slow, because the device's (Arduino Uno) clock speed is only 16 MHz and its flash memory is only 32 KB. So consider a simple text response instead, parsing pipe-separated values with something like substring(). A sketch of the idea follows.
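A minimal illustration of that pipe-separated alternative (shown in Python for brevity; on the Arduino you would walk the string with indexOf()/substring()):

```python
# Compact stand-in for the JSON message {"led":"255,255,255","tv":"on","air":"32"}
values = ["255,255,255", "on", "32"]
line = "|".join(values)         # "255,255,255|on|32"

led, tv, air = line.split("|")  # a fixed field order replaces the JSON keys
```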

Serialization format common to node.js and ActionScript?

Some of my friends are designing a game, and I am helping them out by implementing the game's backend server. The game is written in Flash, and I plan to develop the server in node.js because (a) it would be a cool project for learning node.js, and (b) it's fast, which is important for games.
The server's architecture is based on messages sent between the server and client (sort of like Minecraft's server protocol). The message format I have so far is one byte (the packet type), two bytes (the message length), and that many bytes (the message data, which is a mapping of key-value pairs); it's sketched in code below. The problem is, I really don't want to develop my own serialization format, because while I probably could, implementing it would be a pain compared to using an existing solution.
Unfortunately, I am having problems finding a good candidate for the message data serialization format.
ActionScript's own remoting format might work, but I don't like it much.
JSON has support in node.js (obviously) and in ActionScript, but it's also textual and I would prefer binary for enhanced speed.
MessagePack looked like a good candidate, but I can't find an ActionScript implementation. (There's one called as3-msgpack on Google Code, but I get weird errors and can't access it.)
BSON has an ActionScript implementation, but no node.js support besides their MongoDB library (and I'm planning on using Redis).
So, can anyone offer any other serialization formats that I might have missed? Or should I just stick with one of these (or roll my own)?
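For reference, the framing I described packs and unpacks trivially; a Python sketch (the function names are illustrative):

```python
import struct

def frame(ptype: int, payload: bytes) -> bytes:
    # 1 byte packet type, 2 bytes big-endian message length, then the payload
    return struct.pack(">BH", ptype, len(payload)) + payload

def unframe(buf: bytes):
    ptype, length = struct.unpack_from(">BH", buf)
    return ptype, buf[3:3 + length]
```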
Isn't that why HTTP supports gzipped content? Just use JSON and gzip the content when you send it. The time spent gzipping is more than recovered by the reduced latency of the transmission.
Check this article for more on gzip with ActionScript. On node.js I think that gzip-compress is fairly popular.
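The round trip itself is a few lines in most environments; here it is in Python for illustration (node.js has a built-in zlib module for the same job):

```python
import gzip
import json

doc = {"players": [{"id": i, "x": i * 1.5} for i in range(100)]}
raw = json.dumps(doc).encode("utf-8")

packed = gzip.compress(raw)   # repetitive JSON compresses very well
print(len(raw), len(packed))  # compressed size is a fraction of the raw size

restored = json.loads(gzip.decompress(packed))
assert restored == doc        # lossless round trip
```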
Actually, if I were in your shoes I would implement two methods and time them. Use JSON because it is common and easy to do. But then implement AMQP instead and compare them. If you want to massively scale this, then you might find that AMQP makes it easier. Also, message queuing is just such a nice fit for the node.js world view.
AMQP on ActionScript, and someone doing something similar on node.js.
Leverage JSAMF in Node.js for AMF communications with Flash.
http://www.jamesward.com/2010/07/07/amf-js-a-pure-javascript-amf-implementation/
If you wanted to, you could create your entire API in client-side JavaScript and use JSON as the data exchange format, then call ExternalInterface from ActionScript to communicate with the client JavaScript API, which would make for an elegant server-side solution.
It is worth noting that Flash Player has built-in support for decompressing gzip-compressed data. It may be worth compressing some of your JSON objects: things like localised string tables, game configuration data, etc., which can grow to a few hundred KB but are only loaded once at game load.
I'm working on a version of MessagePack for AS3.
The current version does the basics (encoding/decoding); streaming is planned for the future.
Check the project page: https://github.com/loteixeira/as3-msgpack
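To give a feel for what the format buys you over JSON, here is the same pack/unpack shape in Python (assumes the msgpack package; the AS3 and node.js implementations mirror it):

```python
import json
import msgpack  # pip install msgpack

msg = {"type": 1, "pos": [10, 20], "hp": 255}

packed = msgpack.packb(msg)               # compact, self-describing binary
print(len(json.dumps(msg)), len(packed))  # the binary form is roughly half the JSON text

assert msgpack.unpackb(packed) == msg     # lossless round trip
```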