I've used JSONField in my serializer and, as the user in the thread "store json as dict" points out, DRF with MySQL stores a JSONField as a dict.
However, I would rather store it as a JSON string {"tags": {"data": "test"}} instead of the default behavior of storing it as a dict, {'tags': {'data': 'test'}}. Daniel suggests overriding JSONField like so:
class JSONField(serializers.Field):
    def to_representation(self, obj):
        return json.loads(obj)
    ...
However, the data is still stored as a dict.
In my serializers.py
I've overridden the JSONField class and then used it like this:
class SchemaSerializer(serializers.ModelSerializer):
    js_data = serializers.JSONField()
However, it still saves the value as a dict.
Expected behavior:
Save it as JSON (POST)
Retrieve it as a dict (GET), so that the renderer can parse it as JSON
I am currently doing this manually using json.dumps and json.loads, but I'm looking for a better way.
The reason for doing this is that although I have an API, there are cases where users read my DB directly, and they need the field to be valid JSON.
Django (2.0.1)
Python 3.5
djangorestframework (3.7.7)
Serializers allow complex data such as querysets and model instances to be converted to native Python datatypes that can then be easily rendered into JSON, XML or other content types.
See more in the serializers docs.
What you need is:
class SchemaSerializer(serializers.ModelSerializer):
    class Meta:
        model = YOUR_MODEL_NAME
        fields = A_LIST_OF_FIELDS
and then in your view:
class SchemaView(mixins.ListModelMixin, generics.GenericAPIView):
    queryset = YOUR_MODEL_NAME.objects.all()
    serializer_class = SchemaSerializer
Do you mean you want to use the actual JSON type in your database backend? If so, you would want to use the appropriate JSONField type on your model, rather than in the serializer specifically.
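Either way, the round trip the question is after can be sketched in plain Python. The two helpers below are hypothetical stand-ins for what a custom field's to_internal_value/to_representation hooks would do, assuming the model column is a plain text field holding a JSON string:

```python
import json

# Hypothetical helpers mirroring a custom DRF field's hooks, assuming
# the model column is a plain TextField that holds a JSON string.
def to_internal_value(data):
    # dict from the API request -> JSON string for the DB column
    return json.dumps(data)

def to_representation(value):
    # JSON string from the DB -> dict for the renderer
    return json.loads(value)

stored = to_internal_value({"tags": {"data": "test"}})
print(stored)                     # the DB column now holds a JSON string
print(to_representation(stored))  # the API still returns a dict
```

This keeps the column readable as JSON for anyone querying the DB directly, while the API continues to speak dicts.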
What I'm trying to do is similar to this Stack Overflow question: basically converting .seq.gz JSON files to Parquet files with a properly defined schema.
I don't want to infer the schema; rather, I would like to define my own, ideally as Scala case classes so they can be reused as models by other jobs.
I'm not too sure whether I should deserialise my JSON into a case class and let toDS() implicitly convert my data, like below:
spark
  .sequenceFile(input, classOf[IntWritable], classOf[Text])
  .mapValues(
    json => deserialize[MyClass](json.toString) // JSON to case class instance
  )
  .toDS()
  .write.mode(SaveMode.Overwrite)
  .parquet(outputFile)
...or whether I should use a Spark DataFrame schema instead, or even a Parquet schema. Either way, I don't know how to do it.
My objective is to have full control over my models and to map JSON types (which is a poorer format) onto Parquet types.
Thanks!
I'm passing a Laravel model's dataset to a VueJS2 component via AJAX/axios and rendering it fine.
However, there is a JSON column in the model which stores a valid JSON object; the data could look like {"key": "value"}. It's worth noting that I can work with it without issue in Laravel controllers etc., thanks to a cast on the model (protected $casts = ['the_json_column' => 'array'];).
When I pass this model to VueJS2 via axios/AJAX, all of the properties in the array behave as usual: I can iterate over them and render them in the VueJS2 component DOM.
That is, until I interact with the_json_column, which despite Laravel's cast is being passed to VueJS2 as a string, e.g. "{\"key\": \"value\"}".
Is there a more elegant way than doing JSON.parse(data.the_json_column).key in my VueJS2 component every time I want to interact with the JSON column data?
The solution I've gone with is decoding the data property manually in the VueJS2 template,
e.g. JSON.parse(data.key_which_is_actually_json).property_in_the_object
Any Laravel-based code (accessors, casts, etc.) stops applying once the property is transferred to the VueJS2 component over HTTP, as VueJS2 doesn't inspect the received data's properties and decode them.
VueJS2 seems to decode only the top level of properties in the data received.
You may create your own accessor and convert the column to an array manually before the model is serialised:
public function getTheJsonColumnAttribute($value)
{
    return json_decode($value, true);
}
While it may seem that Laravel simply treated that column as a mere 'string' value on the way out, this lets you verify that a conversion does indeed happen.
I am using MongoDB 3.4 and Python 2.7. I have retrieved a document from the database; I can print it, and its structure indicates it is a Python dictionary. I would like to write out the contents of this document as a JSON file. When I create a simple dictionary like d = {"one": 1, "two": 2} I can write it to a file using json.dump(d, open("text.txt", 'w')).
However, if I replace d in the above code with the document I retrieve from MongoDB, I get the error
ObjectId is not JSON serializable
Suggestions?
As you have found out, the issue is that the value of _id is an ObjectId.
The ObjectId class is not understood by the default JSON encoder, so it cannot be serialised. You would get a similar error for ANY Python object that the default JSONEncoder does not understand.
One alternative is to write your own custom encoder to serialise ObjectId. However, rather than reinventing the wheel, you can use the PyMongo/bson utility module bson.json_util.
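For illustration, a minimal custom encoder could look like the sketch below. FakeObjectId is a stand-in for bson.ObjectId so the snippet runs without PyMongo installed; the real class stringifies the same way:

```python
import json

class MongoEncoder(json.JSONEncoder):
    # default() is called only for objects the base encoder cannot handle;
    # ObjectId has a sensible str() form (its 24-character hex string).
    def default(self, obj):
        return str(obj)

class FakeObjectId:
    """Stand-in for bson.ObjectId, just for this example."""
    def __str__(self):
        return "5a8f1e2b9d4c3b1a2f3e4d5c"

doc = {"_id": FakeObjectId(), "name": "test"}
print(json.dumps(doc, cls=MongoEncoder))
```

A blanket str() fallback is lossy for round-tripping, which is why bson.json_util (which emits MongoDB Extended JSON) is the better choice in practice.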
For example:
from bson import json_util

with open("text.json", "w") as f:
    f.write(json_util.dumps(d))
Note that json_util.dumps already returns a string, so passing its result through json.dump would encode it a second time.
The issue is that _id is actually an object and is not natively serializable. Replacing the _id with a string, as in mydocument['_id'] = '123', fixed the issue.
I'm using Django 1.8 and in some of my code I just do:
self.request.session['message'] = [
    _(u'Tag!'),
    _(u'Abt!'),
    _(u'Click here to hide this message')]
Then when the page refreshes I get this error:
<django.utils.functional.__proxy__ object at 0x04805F70> is not JSON serializable
Of course I've googled it, and I've read the documentation, which says that "JSON supports only string keys" and that the JSON serializer from django.core.signing "can only serialize basic data types".
Unless I'm wrong, arrays made of strings are basic data types. Moreover that code has been there for 6 months without a problem.
What am I missing?
It seems that what you are trying to serialise are not strings; they are lazy translation objects (i.e. strings marked for translation that have not been evaluated yet).
Most likely there is a line in the same file similar to this one:
from django.utils.translation import ugettext_lazy as _
To use a translation function that is not lazy (i.e. one that returns translated strings rather than lazy translation objects), change it to:
from django.utils.translation import ugettext as _
Alternatively you can force evaluation of lazy translation objects before serialising them by calling str() on them.
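The failure mode can be reproduced without Django. LazyString below is a hypothetical stand-in for the lazy translation proxy; calling str() forces it just as it would for the real thing:

```python
import json

class LazyString:
    """Stand-in for django.utils.functional's lazy translation proxy."""
    def __init__(self, s):
        self._s = s
    def __str__(self):
        return self._s

messages = [LazyString(u'Tag!'), LazyString(u'Abt!')]

try:
    json.dumps(messages)  # fails: the proxy is not a basic data type
except TypeError as e:
    print(e)

print(json.dumps([str(m) for m in messages]))  # works once evaluated
```

This also explains why the code "worked for 6 months": the error only appears on the code path where the session is serialised with the lazy objects still unevaluated.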
I'm working with django + piston, and so far I've created a few URLs that return my Django models as JSON quite well. I didn't even have to specify an emitter; the JSON serialization is done automagically.
Now I need to return a JSON-serialized object that does not extend Django's Model class. If I return its value, piston returns the __str__ value, and even if I add the JSON emitter nothing happens. I don't want to put the JSON serialization in the __str__ method, since that wouldn't be correct.
What's the correct approach for this?
You can use Python's json module to do this.
Snippet:
>>> import json
>>> json.dumps(['foo', {'bar': ('baz', None, 1.0, 2)}])
'["foo", {"bar": ["baz", null, 1.0, 2]}]'
If you're creating this in the context of a Django view, you'll want to make sure you're sending back the right content type. That is,
from django.http import HttpResponse
return HttpResponse(your_json_string, mimetype='application/json')
(In Django 1.7 and later, the keyword is content_type rather than mimetype.)
Note that this doesn't necessarily work for all types in general. It's usually a good idea to construct a dict containing exactly what you want to return, then serialize it.
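As a sketch of that last point (Point and point_to_dict are made-up names, not part of piston), building the dict by hand before serialising might look like:

```python
import json

class Point:
    """A plain object that does not extend Django's Model class."""
    def __init__(self, x, y):
        self.x, self.y = x, y

def point_to_dict(p):
    # Expose exactly the fields the API should return, nothing else
    return {"x": p.x, "y": p.y}

print(json.dumps(point_to_dict(Point(1, 2))))
```

Keeping this mapping explicit means the JSON shape of your response is decoupled from the object's internal attributes.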