**TypeError: can't pickle tensorflow.python.client._pywrap_tf_session.TF_Operation objects** - pickle

I am using Tensorflow-addons API for a machine translation project.
I want to load my models and make predictions without training them again.
My architecture is an Encoder-Decoder with the Bahdanau attention mechanism.
When I try to serialize the encoder and decoder objects to a file with the dill module, I get this error:
TypeError: can't pickle tensorflow.python.client._pywrap_tf_session.TF_Operation objects
How can I resolve this?
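The usual fix is to not pickle the live model object at all: TF/Keras models hold handles into the C runtime (sessions, graph operations) that pickle and dill cannot traverse. Instead, save only the trainable state (e.g. `model.save_weights(...)` or `tf.saved_model.save(...)`) and rebuild the encoder/decoder architecture in code before loading the weights back. The same idea, sketched library-agnostically with `__getstate__`/`__setstate__` (the `Translator` class and its fields are hypothetical stand-ins, not TensorFlow API):

```python
import os
import pickle

class Translator:
    """Stand-in for an encoder/decoder wrapper that holds an unpicklable handle."""
    def __init__(self, weights):
        self.weights = weights               # plain, picklable state
        self._session = open(os.devnull)     # stand-in for a C-level TF handle

    def __getstate__(self):
        # Exclude the unpicklable handle; keep only plain state.
        state = self.__dict__.copy()
        del state["_session"]
        return state

    def __setstate__(self, state):
        # Rebuild the handle when the object is loaded back.
        self.__dict__.update(state)
        self._session = open(os.devnull)

model = Translator(weights=[0.1, 0.2, 0.3])
blob = pickle.dumps(model)       # works: the handle is excluded
restored = pickle.loads(blob)    # weights survive, handle is rebuilt
```

With real Keras subclassed models, `save_weights`/`load_weights` plays the role of `__getstate__`/`__setstate__` here: weights go to disk, and the architecture (including the attention layers) is reconstructed from your own code.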

Related

Catboost how to save model to python memory object and not disk

We'd like to use CatBoost in an environment where we don't have permission to save data to disk. We found: https://github.com/catboost/tutorials/blob/master/model_analysis/model_export_as_json_tutorial.ipynb
Is there a way to pipe the model into an in-memory Python JSON object without saving it to disk?
Although it won't be JSON, you can use the protected method _serialize_model on the CatBoostClassifier to get the model blob. To load it, use CatBoostClassifier.load_model(blob=serialized_model).

converting json annotation to coco format

I have annotated my data using VoTT, and its default output format is JSON. I wanted to load my data into a Detectron2 model, but it seems the required format is COCO.
Can anyone tell me how to convert my data from VoTT JSON to COCO format?
My classmates and I have created a Python package called PyLabel to help with this and other labeling tasks. You can see an example in this notebook: https://github.com/pylabel-project/samples/blob/main/coco2voc.ipynb.
You might be able to use the package's importer tool to import your data and convert it to COCO.
You can find the code for the package here: https://github.com/pylabel-project/.
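If the importer doesn't cover VoTT directly, note that COCO is just a JSON layout with three top-level lists (`images`, `annotations`, `categories`), so you can assemble it by hand. A minimal sketch assuming VoTT-style axis-aligned bounding boxes (the VoTT field names below are from memory and may need adjusting to your export):

```python
import json

# Hypothetical VoTT-style annotation: one image, one tagged box.
vott = {
    "asset": {"name": "img1.jpg", "size": {"width": 640, "height": 480}},
    "regions": [
        {"tags": ["cat"],
         "boundingBox": {"left": 10, "top": 20, "width": 100, "height": 50}},
    ],
}

categories = {"cat": 1}  # tag name -> COCO category id
coco = {
    "images": [{"id": 1, "file_name": vott["asset"]["name"],
                "width": vott["asset"]["size"]["width"],
                "height": vott["asset"]["size"]["height"]}],
    "annotations": [],
    "categories": [{"id": cid, "name": name} for name, cid in categories.items()],
}

for i, region in enumerate(vott["regions"], start=1):
    bb = region["boundingBox"]
    coco["annotations"].append({
        "id": i,
        "image_id": 1,
        "category_id": categories[region["tags"][0]],
        # COCO bboxes are [x, y, width, height] in absolute pixels.
        "bbox": [bb["left"], bb["top"], bb["width"], bb["height"]],
        "area": bb["width"] * bb["height"],
        "iscrowd": 0,
    })

coco_json = json.dumps(coco, indent=2)
```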

How to convert Model to JSON

When I naively use Jackson to convert to JSON, I receive this exception:
Caused by: com.fasterxml.jackson.databind.exc.InvalidDefinitionException: No serializer found for class org.apache.cayenne.access.DefaultDataRowStoreFactory and no properties discovered to create BeanSerializer
Edit: I'd like to do something like this:
ObjectContext context = cayenneRuntime.newContext();
List<User> users = ObjectSelect.query(User.class).select(context);
Map<String, Object> json = Json.mapper.convertValue(users, Map.class);
Are there any existing solutions? Thanks
Considering that in the general case Cayenne gives you not just objects but a virtual graph of objects, serialization to JSON becomes a quirkier topic than it first appears.
The short answer: you'd have to manually build JSON for whatever subgraph of your object graph you need.
While not a direct answer, it may be worth mentioning that the Agrest framework (formerly LinkRest) supports rule-based serialization of Cayenne object graphs to JSON. It is not a standalone component, though: it will only work if you use it for your REST services.

How to instantiate Spark ML object from JSON?

To create a Spark ML object, we just need to know:
The type of model
The parameters for the model
I am just brainstorming a way to pass this information using JSON and instantiate a Spark ML object from it.
For example, given this JSON:
{
  "model": "RandomForestClassifier",
  "numTrees": 10,
  "featuresCol": "binaryFeatures"
}
it would instantiate a Random Forest model:
val rf = new RandomForestClassifier().setNumTrees(10).setFeaturesCol("binaryFeatures")
It is fairly straightforward to write a custom JSON serializer/deserializer on my own. Scala's pattern matching seems like a good fit for dynamically instantiating an object from its name as a string. However, when the object gets more complex (i.e. supporting pipelines), the custom serializer becomes hard to maintain.
Is there any existing implementation for this? If not, what should the json structure look like?
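Absent an off-the-shelf deserializer, one maintainable route is an explicit registry mapping model names to constructors, with the remaining JSON keys applied through Spark-style `set<Param>` mutators. A library-agnostic Python sketch of the pattern (the `RandomForestClassifier` class below is a dummy stand-in for the real Spark estimator):

```python
import json

class RandomForestClassifier:
    """Dummy stand-in for the Spark estimator, with Spark-style fluent setters."""
    def __init__(self):
        self.numTrees, self.featuresCol = 20, "features"  # defaults
    def setNumTrees(self, n):
        self.numTrees = n
        return self
    def setFeaturesCol(self, c):
        self.featuresCol = c
        return self

# Explicit registry: only whitelisted model names can be instantiated.
MODEL_REGISTRY = {"RandomForestClassifier": RandomForestClassifier}

def from_json(spec: str):
    params = json.loads(spec)
    model = MODEL_REGISTRY[params.pop("model")]()
    for name, value in params.items():
        # Map "numTrees" -> setNumTrees, mirroring the setter convention.
        getattr(model, "set" + name[0].upper() + name[1:])(value)
    return model

rf = from_json(
    '{"model": "RandomForestClassifier", "numTrees": 10, "featuresCol": "binaryFeatures"}'
)
```

The registry plays the same role as a Scala pattern match on the model name, but keeps the name-to-constructor mapping in one data structure instead of spreading it over match arms.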

TensorFlow export compute graph to XML, JSON, etc

I want to export a TensorFlow compute graph to XML or something similar so I can modify it with an external program and then re-import it. I found MetaGraph, but it exports in a binary format which I wouldn't know how to modify.
Does such capability exist?
The native serialization format for TensorFlow's dataflow graph uses protocol buffers, which have bindings in many different languages. You can generate code that should be able to parse the binary data from the two message schemas: tensorflow.GraphDef (a lower-level representation) and tensorflow.MetaGraphDef (a higher-level representation, which includes a GraphDef and other information about how to interpret some of the nodes in the graph).
If there is no protocol buffer implementation for your target language, you can generate JSON from the Python protocol buffer object. For example, the following generates a string containing a JSON representation of a GraphDef:
import tensorflow as tf
from google.protobuf import json_format
with tf.Graph().as_default() as graph:
    # Add nodes to the graph...
    graph_def = graph.as_graph_def()

json_string = json_format.MessageToJson(graph_def)
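For the re-import step, `json_format.Parse` does the reverse, filling a fresh message from the (possibly externally modified) JSON string; in the TensorFlow case the target would be a new `GraphDef`. A minimal round-trip sketch using a generic protobuf message (`Struct`) in place of `GraphDef`, since the mechanics are identical:

```python
from google.protobuf import json_format
from google.protobuf.struct_pb2 import Struct

# Build a small protobuf message (stand-in for a GraphDef).
msg = Struct()
msg.update({"op": "MatMul", "inputs": 2})

# Serialize to JSON, edit it externally, then parse it back into a message.
json_string = json_format.MessageToJson(msg)
roundtrip = json_format.Parse(json_string, Struct())
```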