Serialization format common to node.js and ActionScript?

Some of my friends are designing a game, and I am helping them out by implementing the game's backend server. The game is written in Flash, and I plan to develop the server in node.js because (a) it would be a cool project for learning node.js, and (b) it's fast, which is important for games.
The server's architecture is based on messages sent between the server and client (sort of like Minecraft's server protocol). The message format I have so far is a byte (the packet type), two bytes (the message length) and that many bytes (the message data, which is a mapping of key-value pairs). Problem is, I really don't want to develop my own serialization format (because while I probably could, implementing it would be a pain compared to using an existing solution).
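A minimal Python sketch of that framing (the big-endian byte order and the helper names are just assumptions for illustration):

import struct

def pack_message(packet_type, payload):
    # 1 byte packet type, 2 bytes payload length, then the payload itself
    return struct.pack(">BH", packet_type, len(payload)) + payload

def unpack_message(frame):
    # Reverse of pack_message: returns (packet_type, payload_bytes)
    packet_type, length = struct.unpack(">BH", frame[:3])
    return packet_type, frame[3:3 + length]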
Unfortunately, I am having problems finding a good candidate for the message data serialization format.
ActionScript's own remoting format might work, but I don't like it much.
JSON has support in node.js (obviously) and in ActionScript, but it's also textual and I would prefer binary for enhanced speed.
MessagePack looked like a good candidate, but I can't find an ActionScript implementation. (There's one called as3-msgpack on Google Code, but I get weird errors and can't access it.)
BSON has an ActionScript implementation, but no node.js support besides their MongoDB library (and I'm planning on using Redis).
So, can anyone offer any other serialization formats that I might have missed? Or should I just stick with one of these (or roll my own)?

Isn't that why HTTP supports gzipped content? Just use JSON and gzip the content when you send it. The time spent gzipping is more than recovered by the reduced latency of the transmission.
Check this article for more on gzip with ActionScript. On node.js, I think gzip-compress is fairly popular.
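As a rough Python sketch of the idea (the example message contents are made up):

import gzip
import json

payload = {"type": "move", "x": 12, "y": 7}  # made-up example message
compressed = gzip.compress(json.dumps(payload).encode("utf-8"))
# The receiving side reverses the steps:
original = json.loads(gzip.decompress(compressed).decode("utf-8"))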

Actually, if I were in your shoes I would implement two methods and time them. Use JSON because it is common and easy to do. But then implement AMQP instead and compare them. If you want to massively scale this then you might find that AMQP makes it easier. Also, message queuing is just such a nice fit into the node.js world view.
There is AMQP on ActionScript, and someone doing something similar on node.js.

Leverage JSAMF in Node.js for AMF communications with Flash.
http://www.jamesward.com/2010/07/07/amf-js-a-pure-javascript-amf-implementation/

If you wanted to, you could create your entire API in client-side JavaScript and use JSON as the data exchange format, then call ExternalInterface from ActionScript to communicate with the client JavaScript API, which would make for an elegant server-side solution.
It is worth noting that Flash Player has built-in support for decompressing gzip-compressed data. It may be worth compressing some of your JSON objects - things like localised string tables, game configuration data, etc., which can grow to a few hundred KB but are only loaded once at game load.

I'm working on a version of MessagePack for AS3.
The current version does the basics (encoding/decoding); stream support is planned for the future.
Check the project page: https://github.com/loteixeira/as3-msgpack

Related

Protocol Buffer vs Json - when to choose one over another

Can anyone explain when to use Protocol Buffers instead of JSON in a microservices architecture, and vice versa? Both for synchronous and asynchronous communication.
When to use JSON
You need or want data to be human readable
Data from the service is directly consumed by a web browser
Your server side application is written in JavaScript
You aren’t prepared to tie the data model to a schema
You don’t have the bandwidth to add another tool to your arsenal
The operational burden of running a different kind of network service is too great
Pros of ProtoBuf
Relatively smaller size
Guarantees type-safety
Prevents schema-violations
Gives you simple accessors
Fast serialization/deserialization
Backward compatibility
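For a concrete feel of the protobuf side, here is a hedged Python sketch; it assumes a hypothetical user.proto compiled by protoc into a user_pb2 module, none of which comes from the posts above:

import json
import user_pb2  # hypothetical module generated by protoc from user.proto

# Assumed schema: message User { string name = 1; int32 id = 2; }
user = user_pb2.User(name="alice", id=42)
wire_bytes = user.SerializeToString()              # compact, schema-checked binary
as_json = json.dumps({"name": "alice", "id": 42})  # human-readable, typically larger

decoded = user_pb2.User()
decoded.ParseFromString(wire_bytes)  # round-trip back to a typed object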
While we are at it, have you looked at flatbuffers?
Some of these aspects are covered here: google protocol buffers vs json vs XML
Reference:
https://codeclimate.com/blog/choose-protocol-buffers/
https://codeburst.io/json-vs-protocol-buffers-vs-flatbuffers-a4247f8bda6f
I'd use JSON when the consumer is, or could possibly be, written in a language with built-in native support for JSON (JavaScript is an example), is a web browser, or where human readability is wanted. Speaking of which, at least for asynchronous calls, many developers enjoy the convenience of examining the contents of the queue directly for debugging, and even during the normal course of development. Depending on the tech stack used, it may or may not be worth the trade-off to use protobuf just to reduce network load, since any performance increase won't buy you much in the async world. And it's not like we need to write a bunch of boilerplate code anymore, like we used to with JSON marshalling and unmarshalling in most languages.
I'd use protobuf for everything else... if there are any other use cases left for it given the considerations above. There are advantages you might see, such as performance, reduced network load, the backward compatibility offered by its versioning scheme, the lovely documentation that magically comes with proto files, and some validation! If for some reason you have a lot of REST or other synchronous calls between microservices, protobuf can be sent over the wire instead of JSON without many trade-offs, if any at all, while offering a heap of advantages.

twisted - transfer data using json

I need to transfer data (objects) between client and server, and Twisted seems a good way to accomplish this. I've been doing a lot of searching but still haven't found any example that helps me understand the basic principle, so any simple code would help.
Thanks!
EDIT
Both client and server are written in python
The data may be large, so I need fast, reliable transmission (I've taken a look at producers; is that a good approach?)
Flask is great, but I am using another framework, so the whole networking thing relies on Twisted.
It's hard to tell whether your question is more about JSON, Python, or Twisted, but here's an overview; more can follow once the specifics are known. Perhaps you could add some more info to your question so we can offer more assistance :-)
Re JSON: JSON is just a string with a defined structure. If you are working in Python and have an object to send as JSON, then you need to convert the object to a JSON string, for example:
import json

# objectName is the Python object (dict, list, etc.) you want to send
json_string = json.dumps(objectName)
If your client is JavaScript, then instead of json.dumps you would use JSON.stringify(objectName).
If you intend to use javascript for clients then some of the frameworks like jQuery make it very easy.
Python's json.dumps has a lot of optional arguments, most of which you won't need. You can see the options at https://docs.python.org/2/library/json.html
Python is python, I assume you know how to create and populate objects. Will your client be python or javascript or something else? From a javascript client to a python server you would most likely use Ajax to send requests and get responses.
Twisted allows you to easily create a server that will listen on a given port and, when data arrives, fire an event that supplies the data received. You can then do whatever you need to with the data. Just be careful about doing blocking things like database inserts, since the server may miss some data or otherwise misbehave if you interrupt its event loop. Twisted can be difficult to learn initially, but it is a very powerful and reliable system that is well proven. One alternative to consider, particularly if your clients are not Python, is node.js. In my opinion, node is a little bit easier to grasp initially, and there are thousands of add-on modules that let you do almost anything you'd want. I use both Twisted and node for different things.
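As a very small sketch of that pattern, here is a hypothetical Twisted server that expects one JSON object per line (the class names and the port are made up):

import json
from twisted.internet import reactor
from twisted.internet.protocol import Factory
from twisted.protocols.basic import LineReceiver

class JsonLineProtocol(LineReceiver):
    def lineReceived(self, line):
        # Called by Twisted whenever a full line arrives; each line is assumed to be JSON
        obj = json.loads(line.decode("utf-8"))
        self.sendLine(json.dumps({"received": obj}).encode("utf-8"))

class JsonLineFactory(Factory):
    def buildProtocol(self, addr):
        return JsonLineProtocol()

reactor.listenTCP(8007, JsonLineFactory())  # arbitrary port
reactor.run()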
Neither node.js nor Twisted is software that you can use to just quickly spin up a server or client without some study and experimentation. Using Twisted or Node.js properly and confidently, with all their features and goodness, requires a bit of research and work on your part.
There are excellent frameworks like Flask that can be used to build a server that can react to a number of different Ajax calls from a client - you can have a single server be able to respond to several different kinds of requests instead of having a server for each Ajax type.
This is a small library that serializes an object with all its children to JSON and also parses it back to a fully working object:
https://github.com/Toubs/PyJSONSerialization/

Messaging library for jeroMQ

I have chosen jeroMQ for building an asynchronous message channel for publishing content from multiple clients. On the other end, server-side workers process the requests and notify a client only if the server decides to, based on the message received.
Digging deeper, I'm looking for a messaging library to marshal/unmarshal messages. I found the kvpmsg class, which does the job for simple key-value pairs.
I don't want to reinvent the wheel if some standard library exists that can be applied to bigger objects.
It seems like you are asking for data serialization libraries. Check Wikipedia for a list and a comparison of data serialization formats.
Also there is a relevant entry in ZeroMQ FAQ explaining why ZeroMQ doesn't include any serialization format:
Does ØMQ include APIs for serializing data to/from the wire representation?
No. This design decision adheres to the UNIX philosophy of "do one thing and do it well". In the case of ØMQ, that one thing is moving messages, not marshaling data to/from binary representations.
Some middleware products do provide their own serialization API. We believe that doing so leads to bloated wire-level specifications like CORBA (1055 pages). Instead, we've opted to use the simplest wire formats possible which ensure easy interoperability, efficiency and reduce the code (and bug) bloat.
If you wish to use a serialization library, there are plenty of them out there. See for example
Google Protocol Buffers
MessagePack
JSON-GLib
C++ BSON Library
Note that serialization implementations might not be as performant as you might expect. You may need to benchmark your workloads with several serialization formats and libraries in order to understand performance and which format/implementation is best for your use case (ease of development must also be considered).
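The question is about jeroMQ (Java), but purely to illustrate plugging a serialization library into ZeroMQ, here is a Python sketch using pyzmq with MessagePack; the endpoint and message contents are made up:

import msgpack
import zmq

def publisher():
    # One process: serialize a dict with MessagePack and publish it over ZeroMQ
    sock = zmq.Context().socket(zmq.PUB)
    sock.bind("tcp://*:5556")
    sock.send(msgpack.packb({"event": "score", "player": "p1", "value": 10}))

def subscriber():
    # Another process: receive the raw bytes and deserialize back into a dict
    sock = zmq.Context().socket(zmq.SUB)
    sock.connect("tcp://localhost:5556")
    sock.setsockopt(zmq.SUBSCRIBE, b"")
    return msgpack.unpackb(sock.recv())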

What is the most efficient way to send OData payloads over the wire? "Dense JSON?"

I'm designing a distributed application that will consist of a variety of REST services. Lately I've been going back and forth about whether to implement my REST services using the ASP.NET MVC 4 Web API or OData. Web API seems like it will someday be what I need, but right now it's only half-baked. Specifically, it only has a partial implementation of OData-style URI querying and doesn't do hypermedia out of the box.
So this forces me to take another long hard look at OData. I really like the URI querying capability and structural hypermedia for lazy loading; I think I will use these features a lot in my application. However, the Atom Pub specification appears to be grossly inefficient.
I recently read a post about an efficient format for OData which mentions "dense JSON" but such a thing does not appear to actually exist. Is this true? And even if there's no such thing as dense JSON, regular JSON is still much more efficient than Atom Pub, correct?
Is there any situation where I would want to use Atom Pub over JSON?
There should be very little difference between ATOM and JSON on the semantic level with OData. Also most OData servers (WCF Data Services for sure) support both, so it's a choice of the client which one to use. As the blog post from Pablo mentions, to get the best payload size you should enable HTTP compression. It works great on both ATOM and JSON.
Reading JSON tends to be faster (XML parsing is kind of expensive), but that's if you're concerned with CPU consumption on the client. If I remember correctly, last time I saw the numbers, the compressed payload size for ATOM and JSON is not that different.
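If you want to sanity-check that yourself, a throwaway Python sketch along these lines (with made-up sample data) makes the comparison easy:

import gzip
import json

# Made-up sample data; substitute a real OData response for meaningful numbers
json_payload = json.dumps([{"id": i, "name": "item %d" % i} for i in range(100)]).encode()
xml_payload = ("<feed>" + "".join(
    "<entry><id>%d</id><title>item %d</title></entry>" % (i, i) for i in range(100)
) + "</feed>").encode()

for label, payload in (("JSON", json_payload), ("Atom-style XML", xml_payload)):
    print(label, len(payload), "raw,", len(gzip.compress(payload)), "bytes gzipped")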
Atom Pub is usually easier to consume in a client that has good XML or Atom libraries available but no JSON support, and vice versa. But other than that, there should not be much of a difference.

RPC for java/python with rest support, HTML monitoring and goodies

Here's my set of requirements: I'm looking for an RPC framework such as thrift, avro, protobuf (when adding services to it) which supports:
Easy and intuitive IDL. No serial numbers, no manual versioning, simple... Avro is a good example of this.
Works with Java and Python
Supports both a fast binary protocol as well as an HTTP-based RESTful style. I'd like to be able to use it for both backend-to-backend communication (Java-Java or Python-Java) as well as frontend-to-backend communication (JavaScript to Java).
The REST support needs to include &param=value input for GET/POST requests (configurable per request) and output in three possible formats: JSON, JSONP, and XML.
Compact, fast, backward compatible, easy to upgrade etc...
Provides some nice monitoring interfaces such as: JMX, web page status reports (e.g. packets in, packets out, error rate etc)
Ops friendly... no need to take the whole site down to release new versions
Both sync and async communication
... other goodies are welcome...
Is there something out there?
So far I've looked at thrift and avro and they are both nice in some ways, but don't check all my list.
Thanks
That's a pretty tall order - some of the requirements are met by:
Avro, Thrift, Protobuf, and ICE from ZeroC.
ICE is probably the most performant.