TensorFlow export compute graph to XML, JSON, etc.

I want to export a TensorFlow compute graph to XML or something similar so I can modify it with an external program and then re-import it. I found Meta Graph but this exports in a binary format which I wouldn't know how to modify.
Does such capability exist?

The native serialization format for TensorFlow's dataflow graph uses protocol buffers, which have bindings in many different languages. You can generate code that should be able to parse the binary data from the two message schemas: tensorflow.GraphDef (a lower-level representation) and tensorflow.MetaGraphDef (a higher-level representation, which includes a GraphDef and other information about how to interpret some of the nodes in the graph).
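If your target language has a protocol buffer compiler plugin, you can generate those bindings directly from the .proto files in the TensorFlow source tree. A rough sketch using Java as the target (the paths are from a typical checkout and may vary between TensorFlow versions, and the .proto files they import need to be generated as well):

# Run from the root of a TensorFlow source checkout.
protoc --proto_path=. --java_out=gen tensorflow/core/framework/graph.proto
protoc --proto_path=. --java_out=gen tensorflow/core/protobuf/meta_graph.proto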
If there is no protocol buffer implementation for your target language, you can generate JSON from the Python protocol buffer object. For example, the following generates a string containing a JSON representation of a GraphDef:
import tensorflow as tf
from google.protobuf import json_format

with tf.Graph().as_default() as graph:
    # Add nodes to the graph...
    graph_def = graph.as_graph_def()
    json_string = json_format.MessageToJson(graph_def)
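Going the other direction, the same module can parse that JSON back into a GraphDef message, which you can then re-import after modifying it externally. A minimal sketch, assuming the TF 1.x-style API used above:

# Parse the JSON string back into a GraphDef protocol buffer.
recovered_def = json_format.Parse(json_string, tf.GraphDef())

# Import the recovered graph definition into a fresh graph.
with tf.Graph().as_default() as new_graph:
    tf.import_graph_def(recovered_def, name="")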

Related

how to parse JSON string in mingw-w64 g++ on Windows 10?

I am writing C++ using mingw-w64 g++ on windows with VS Code.
I need to be able to parse some JSON string.
mingw doesn't seem to have any built-in JSON support.
What is the way to set up JSON support in mingw-w64 on windows 10?
There are quite a few libraries for handling JSON from C/C++ that you can use.
To name a few that I have been able to compile with MinGW-w64:
JSON-C
Description : JSON-C implements a reference counting object model that allows you to easily construct JSON objects in C, output them as JSON formatted strings and parse JSON formatted strings back into the C representation of JSON objects.
https://github.com/json-c/json-c
libjansson
Description : Jansson is a C library for encoding, decoding and manipulating
JSON data.
http://www.digip.org/jansson/
libjson-glib
Description : JSON-GLib is a library providing serialization and deserialization support for the JavaScript Object Notation (JSON) format described by RFC 4627.
http://live.gnome.org/JsonGlib
json-parser
Description : Very low footprint JSON parser written in portable ANSI C
https://github.com/udp/json-parser
jsonh
Description : json parser for C and C++
https://github.com/sheredom/json.h
jsmn
Description : jsmn (pronounced like "jasmine") is a minimalistic JSON parser
in C. It can be easily integrated into the resource-limited projects or embedded systems.
http://zserge.com/jsmn.html
tiny-json
Description : tiny-json is a versatile and easy to use json parser in C suitable for embedded systems. It is fast, robust and portable. It is not only a tokenizer. You can get data in string format or get the primitives values in C type
variables without performance loss.
https://github.com/rafagafe/tiny-json
ujson4c
Description : A more user friendly layer for decoding JSON in C/C++ based on
the ultra fast UltraJSON library
https://github.com/esnme/ujson4c/
cajun-jsonapi
Description : CAJUN is a C++ API for the JSON data interchange format with an emphasis on an intuitive, concise interface. The library provides JSON types and operations that mimic standard C++ as closely as possible in concept and design.
https://github.com/cajun-jsonapi/cajun-jsonapi
frozen
Description : JSON parser and generator for C/C++ with scanf/printf like interface. Targeting embedded systems.
https://github.com/cesanta/frozen
jq
Description : jq is a lightweight and flexible command-line JSON processor.
https://stedolan.github.io/jq/
js0n
Description : Flexible Zero-Footprint JSON Parser in C
https://github.com/quartzjer/js0n
libfastjson
Description : a fast json library for C
https://github.com/rsyslog/libfastjson
libxo
Description : The libxo library allows an application to generate text, XML,
JSON, and HTML output using a common set of function calls. The application decides at run time which output style should be produced.
https://github.com/Juniper/libxo
microjson
Description : Tiny JSON parser in C that uses only fixed-extent storage.
http://www.catb.org/esr/microjson/
minijsonreader
Description : A DOM-less JSON parser that can parse a JSON object without allocating a single byte of memory
https://github.com/giacomodrago/minijson_reader
minijsonwriter
Description : A simple, little-overhead, allocation-free, and extensible C++
JSON writer, directly wrapping a std::ostream
https://github.com/giacomodrago/minijson_writer
pdjson
Description : A public domain JSON parser focused on correctness, ANSI C99 compliance, full Unicode (UTF-8) support, minimal memory footprint, and a simple API. As a streaming API, arbitrary large JSON could be processed with a small amount of memory (the size of the largest string in the JSON). It seems most C JSON libraries suck in some significant way: broken string support (what if the string contains \u0000?), broken/missing Unicode support, or crappy software license (GPL or "do no evil"). This library intends to avoid these flaws.
https://github.com/skeeto/pdjson
picojson
Description : a header-file-only, JSON parser serializer in C++
https://github.com/kazuho/picojson
sajson
Description : Lightweight, extremely high-performance JSON parser for C++11
https://github.com/chadaustin/sajson
smalljsonparser
Description : This is a simple, one-file JSON parser in C. It is designed for highly resource-constrained systems. It uses no memory allocation, and can stream data, so that the whole file does not need to reside in memory.
https://github.com/DagAgren/SmallJSONParser
univalue
Description : C++ universal value object and JSON library
https://github.com/jgarzik/univalue
Following Brecht's list, I tried json-parser. The following is how I made it work; hope this helps folks who are not familiar with the process.
Do this from the MSYS2 terminal that comes with MinGW-w64 g++, because it has the make command.
cd mycppbase
git clone https://github.com/json-parser/json-parser.git
cd json-parser
export PATH=/c/msys64/mingw64/bin:$PATH
./configure
make
Three files are important:
json.h
libjsonparser.a
libjsonparser.so
cd myexampledir/
g++ myjson.cpp -o myjson \
-I "/c/.../mycppbase/json-parser" \
-L "/c/.../mycppbase/json-parser" \
-l:libjsonparser.a
UPDATE 2022/11/20:
The previous example links the library statically.
To link dynamically, we need to rename the .so file to a .dll file.
The following was done in a Git Bash terminal and worked.
mv libjsonparser.so libjsonparser.dll
cd myexampledir/
g++ myjson.cpp -o myjson \
-I "/c/.../mycppbase/json-parser" \
-L "/c/.../mycppbase/json-parser" \
-ljsonparser
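For completeness, a minimal myjson.cpp might look something like this. It is only a sketch: the JSON string and field handling are made up for illustration, and it assumes json-parser's documented json_parse/json_value_free C API from json.h:

#include <cstdio>
#include <cstring>
#include "json.h"

int main() {
    const char raw[] = "{\"name\": \"widget\", \"count\": 3}";

    // Parse the raw text into json-parser's json_value tree.
    json_value *root = json_parse(raw, std::strlen(raw));
    if (root == NULL || root->type != json_object) {
        std::fprintf(stderr, "failed to parse JSON\n");
        return 1;
    }

    // Walk the top-level object and print each member.
    for (unsigned int i = 0; i < root->u.object.length; ++i) {
        const json_object_entry &e = root->u.object.values[i];
        if (e.value->type == json_string)
            std::printf("%s = %s\n", e.name, e.value->u.string.ptr);
        else if (e.value->type == json_integer)
            std::printf("%s = %lld\n", e.name, (long long)e.value->u.integer);
    }

    json_value_free(root);
    return 0;
}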

Purpose of "resolveJsonModule"?

The setting I am referencing is shown in the snippet below:
{
  "compilerOptions": {
    "resolveJsonModule": true
  }
}
I don't really understand why TS language engineers would add a flag for "resolveJsonModule"? Either an environment supports resolving JSON as module via an import statement (or require() method), or the environment doesn't. Why bother with the extra complexity?
Context
Historically, Node has included a specialized JSON loader (unrelated to ECMA standards) to allow importing JSON data only in CommonJS mode.
Standardized importing of anything at all (ES modules) is only a relatively recent phenomenon in ECMAScript. Importing text files containing valid JSON, parsed as native JS data ("importing JSON") is described in a proposal that is still only in stage 3.
However, there has been recent movement in regard to implementation of the above mentioned proposal:
V8 implemented it in June (Chrome 91+)
TypeScript v4.5.0 implemented it in November
Deno v1.17.0 implemented it in December
Node LTS v16.14.0 implemented it last Tuesday (behind a CLI flag --experimental-json-modules)
TypeScript
TypeScript is a static type-checker, but also a compiler (technically a transpiler), and transforms your TS source code syntax into a syntax that is valid JavaScript for the runtime environment you have specified in your TSConfig. Because there are different runtime environments with different capabilities, the way that you configure the compiler affects the transformed JavaScript that is emitted. In regard to defaults, the compiler uses an algorithmic logic to determine settings. (I can't summarize that here: you honestly have to read the entire reference in order to understand it.) Because loading of JSON data has been a non-standard, specialized operation until extremely recently, it has not been a default.
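In practice, enabling the flag just lets the compiler add .json files to the module graph and type them from their contents. A minimal sketch of what it permits (assuming a data.json file next to the source file and Node-style module resolution):

// Compiles only when "resolveJsonModule" is enabled; the type of `data`
// is inferred at compile time from the contents of data.json.
import data from "./data.json";
console.log(data);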
Alternatives
All JS runtimes offer alternatives to an import statement for importing textual JSON data (which can then be parsed using JSON.parse), and none of them require configuring the compiler in the ways that you asked about:
Note: the data parsed from the JSON strings imported using these methods will not participate in the "automatic" type inference capabilities of the compiler module graph because they aren't part of the compilation graph: so they'll be typed as any (or possibly unknown in an extremely strict configuration).
Browser and Deno: window.fetch
Deno: Deno.readTextFile
Node: fs.readFile
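For example, the Node option might look like the following. This is only a sketch: it assumes an ES module context (for top-level await) and a data.json file next to the compiled script:

import { readFile } from "fs/promises";

// Read the JSON text at runtime and parse it; no compiler flag involved.
const raw = await readFile(new URL("./data.json", import.meta.url), "utf8");
const data: unknown = JSON.parse(raw);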
Additionally, because all JSON (JavaScript Object Notation) is valid JS, you can simply prepend the data in your JSON file with export default , and then save the file as data.js instead of data.json, and then import it as a standard module: import {default as data} from './data.js';.
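Concretely, that trick is just the following (file contents made up for illustration):

// data.js -- formerly data.json, with "export default " prepended:
export default { "answer": 42, "tags": ["json", "modules"] }

// consumer.ts:
import { default as data } from './data.js';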
Final notes about inferred types:
I prefer to audit the JSON I'm importing and use manually-written types for the data (written either by myself or someone else and imported from a module/declaration file), rather than relying on the types the compiler infers from import statements (which I have found to be too narrow on many occasions). I do this by assigning the parsed JSON data to a new variable using a type assertion.
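That assertion pattern looks something like this (the Data type and the JSON text are made-up examples):

// Manually-written type for the audited JSON data.
type Data = { id: number; name: string };

const jsonText = '{"id": 1, "name": "widget"}';

// Assert the parsed value to the audited type instead of relying on inference.
const data = JSON.parse(jsonText) as Data;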

Read graph into NetworkX from JSON file

I have downloaded my Facebook data. I got the data in the form of JSON files.
Now I am trying to read these JSON files into NetworkX. I don't find any function to read graph from JSON file into NetworkX.
In another post, I found info about reading a graph from JSON, where the JSON file was earlier created from NetworkX using json.dump().
But here in my case I have downloaded the data from Facebook. Is there any function to read graph from JSON file into NetworkX?
Unlike Pandas tables or NumPy arrays, JSON files have no rigid structure, so one can't write a function that converts any JSON file to a NetworkX graph. If you want to construct a graph based on JSON, you have to pick out the needed information yourself. You can load a file with the json.loads function, extract all nodes and edges according to your rules, and then put them into your graph with the add_nodes_from and add_edges_from functions.
For example, for a Facebook JSON file you can write something like this:
import json
import networkx as nx

# Load the downloaded Facebook JSON data.
with open('fbdata.json') as f:
    json_data = json.loads(f.read())

G = nx.DiGraph()

# One node per author (using ids so they match the edge endpoints below).
G.add_nodes_from(
    elem['from']['id']
    for elem in json_data['data']
)

# An edge from each author to the item they posted.
G.add_edges_from(
    (elem['from']['id'], elem['id'])
    for elem in json_data['data']
)

nx.draw(
    G,
    with_labels=True
)
And get the resulting graph (image not reproduced here).

export R list into Julia via JSON

suppose I have this list in R
x = list(a=1:3,b=8:20)
and I write this to a json file on disk with
library(jsonlite)
cat(toJSON(x),file="f.json")
how can I use the Julia JSON package to read that? Can I?
# Julia
using JSON
JSON.parse("/Users/florianoswald/f.json")
throws an error - I guess it expects a JSON string.
Any alternatives? I would benefit from being able to pass a list (i.e. a nested structure) rather than tabular data. thanks!
If you want to do this with the current version of JSON you can use Julia's readall method to get a string from a file.
Pkg.clone("JSON") will get you the latest development version of JSON.jl (as opposed to the latest released version) – it seems parsefile is not released yet.

Parse JSON with R

I am fairly new to R, but the more I use it, the more I see how powerful it really is over SAS or SPSS. Just one of the major benefits, as I see them, is the ability to get and analyze data from the web. I imagine this is possible (and maybe even straightforward), but I am looking to parse JSON data that is publicly available on the web. I am not a programmer by any stretch, so any help and instruction you can provide will be greatly appreciated. Even if you point me to a basic working example, I probably can work through it.
RJSONIO from Omegahat is another package which provides facilities for reading and writing data in JSON format.
rjson does not use S4/S3 methods and so is not readily extensible, but it is still useful. Unfortunately, it does not use vectorized operations and so is too slow for non-trivial data. Similarly, for reading JSON data into R, it is somewhat slow and so does not scale to large data, should this be an issue.
Update (new Package 2013-12-03):
jsonlite: This package is a fork of the RJSONIO package. It builds on the parser from RJSONIO but implements a different mapping between R objects and JSON strings. The C code in this package is mostly from the RJSONIO Package, the R code has been rewritten from scratch. In addition to drop-in replacements for fromJSON and toJSON, the package has functions to serialize objects. Furthermore, the package contains a lot of unit tests to make sure that all edge cases are encoded and decoded consistently for use with dynamic data in systems and applications.
The jsonlite package is easy to use and tries to convert json into data frames.
Example:
library(jsonlite)
# URL that returns JSON data (Stack Exchange badges API)
url <- 'https://api.stackexchange.com/2.2/badges?order=desc&sort=rank&site=stackoverflow'
# read url and convert to data.frame
document <- fromJSON(txt=url)
Here is the missing example, using rjson:
library(rjson)
url <- 'http://someurl/data.json'
document <- fromJSON(file=url, method='C')
The fromJSON() functions in RJSONIO, rjson and jsonlite don't return a simple 2D data.frame for complex nested JSON objects.
To overcome this you can use tidyjson. It takes in JSON and always returns a data.frame. It is currently not available on CRAN; you can get it here: https://github.com/sailthru/tidyjson
Update: tidyjson is now available on CRAN; you can install it directly using install.packages("tidyjson")
For the record, rjson and RJSONIO do convert the JSON into R objects, but they don't really give you analysis-ready data per se. For instance, I receive ugly MongoDB data in JSON format, convert it with rjson or RJSONIO, then use unlist and tons of manual correction to actually parse it into a usable matrix.
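That clean-up step might look something like the following sketch; the file name, the flattening, and the column count are made up for illustration:

library(rjson)

# Deserialize: this yields a deeply nested list, not a table.
doc <- fromJSON(file = "mongo_dump.json")

# Flatten the nested lists into one atomic vector...
flat <- unlist(doc)

# ...then manually reshape it (here assuming 4 fields per record).
mat <- matrix(flat, ncol = 4, byrow = TRUE)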
Try the code below, using RJSONIO, in the console:
library(RJSONIO)
library(RCurl)
json_file = getURL("https://raw.githubusercontent.com/isrini/SI_IS607/master/books.json")
json_file2 = RJSONIO::fromJSON(json_file)
head(json_file2)