I'm not able to serialize a ValuesQuerySet object to JSON. I've found multiple solutions to this problem, but this case is different because I need to follow the foreign key values.
from task_manager.models import UserTasks
data=UserTasks.objects.filter(user__username="root",server_id=2).values("server_id__mnemonic")
The previous query returns something like this:
>>> print data
[{'server_id__mnemonic': u'lol'}, {'server_id__mnemonic': u'lol'}, {'server_id__mnemonic': u'lol'},.......]
But when I try to serialize it to JSON, the following exception is raised:
>>> json_data = serializers.serialize('json',data)
Traceback (most recent call last):
File "<console>", line 1, in <module>
File "C:\Python27\lib\site-packages\django\core\serializers\__init__.py", line 122, in serialize
s.serialize(queryset, **options)
File "C:\Python27\lib\site-packages\django\core\serializers\base.py", line 45, in serialize
concrete_model = obj._meta.concrete_model
AttributeError: 'dict' object has no attribute '_meta'
>>> type(data)
<class 'django.db.models.query.ValuesQuerySet'>
I've found a suggestion in the official Django documentation: if you only want a subset of fields to be serialized, you can specify a fields argument to the serializer:
from django.core import serializers
data = serializers.serialize('xml', SomeModel.objects.all(), fields=('name','size'))
But with this code, I cannot get the foreign key values I want.
Thanks
values() gives you a ValuesQuerySet, which you can serialize by converting it to a list and using the json module; there is no need to involve Django serializers here:
import json
from task_manager.models import UserTasks
data = UserTasks.objects.filter(user__username="root",server_id=2).values("server_id__mnemonic")
print json.dumps(list(data))
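If you need to return that JSON from a view, a minimal sketch could look like this (the view function here is hypothetical, not part of the question or answer):
import json
from django.http import HttpResponse
from task_manager.models import UserTasks

def user_task_mnemonics(request):
    # list() forces evaluation of the ValuesQuerySet into plain dicts
    data = UserTasks.objects.filter(user__username="root", server_id=2).values("server_id__mnemonic")
    return HttpResponse(json.dumps(list(data)), content_type="application/json")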
Another option would be to use serializers.serialize(), specifying the fields argument:
data = UserTasks.objects.filter(user__username="root",server_id=2)
print serializers.serialize('json', data, fields=('server_id__mnemonic', ))
I have a JSON file with some lines like:
"updatedAt" : ISODate("2018-11-20T09:32:16.732+0000"),
I tried json.loads, but it raises an error: json.decoder.JSONDecodeError: Expecting value: line 2 column 13 (char 15).
I believe the problem is the ISODate() part, but how can I handle that with Python?
Many thanks
This is not valid JSON, to begin with. I guess the ISODate("...") is generated by MongoDB, maybe by dumping the ISODate() helper directly into the JSON instead of its string representation?
In any case, you could use a regex on the whole JSON string to get rid of the ISODate("..."), retrieve the date as a string, and then use python-dateutil to parse the value into a datetime.datetime.
Something to the tune of
import json
import re

import dateutil.parser

json_str = ....
clean_json = re.compile(r'ISODate\(("[^"]+")\)').sub(r'\1', json_str)
json_obj = json.loads(clean_json)
# use dateutil.parser.parse(s) to parse each date string into a datetime.datetime
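For example, a minimal sketch of that last step, assuming a document with an "updatedAt" key as in the question:
import json
import re
import dateutil.parser

# Hypothetical one-line input taken from the question; the real data would be read from the file.
json_str = '{"updatedAt": ISODate("2018-11-20T09:32:16.732+0000")}'

clean_json = re.sub(r'ISODate\(("[^"]+")\)', r'\1', json_str)
doc = json.loads(clean_json)

# Parse the ISO 8601 string into a datetime.datetime
updated_at = dateutil.parser.parse(doc["updatedAt"])
print(updated_at)  # 2018-11-20 09:32:16.732000+00:00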
I store a blob of JSON in the Datastore using a JsonProperty.
I don't know the structure of the JSON data.
I am using endpoints proto datastore in order to retrieve my data.
The problem is that the JSON property is encoded in base64, and I want a plain JSON object.
For the example, the JSON data will be:
{
    "first": 1,
    "second": 2
}
My code looks something like:
import endpoints
from google.appengine.ext import ndb
from protorpc import remote
from endpoints_proto_datastore.ndb import EndpointsModel


class Model(EndpointsModel):
    data = ndb.JsonProperty()


@endpoints.api(name='myapi', version='v1', description='My Sample API')
class DataEndpoint(remote.Service):

    @Model.method(path='mymodel2', http_method='POST',
                  name='mymodel.insert')
    def MyModelInsert(self, my_model):
        my_model.data = {"first": 1, "second": 2}
        my_model.put()
        return my_model

    @Model.method(path='mymodel/{entityKey}',
                  http_method='GET',
                  name='mymodel.get')
    def getMyModel(self, model):
        print(model.data)
        return model


API = endpoints.api_server([DataEndpoint])
When I call the API to get a model, I get:
POST /_ah/api/myapi/v1/mymodel2
{
    "data": "eyJzZWNvbmQiOiAyLCAiZmlyc3QiOiAxfQ=="
}
where eyJzZWNvbmQiOiAyLCAiZmlyc3QiOiAxfQ== is the base64 encoding of {"second": 2, "first": 1}.
And the print statement gives me: {u'second': 2, u'first': 1}
So, in the method, I can explore the JSON blob data as a Python dict.
But in the API call, the data is encoded in base64.
I expected the API call to give me:
{
    "data": {
        "second": 2,
        "first": 1
    }
}
How can I get this result?
After the discussion in the comments of your question, let me share a code sample that you can use to store a JSON object in Datastore (it will be stored as a string) and later retrieve it in such a way that:
It will show as plain JSON after the API call.
You will be able to parse it again to a Python dict using eval.
I hope I understood your issue correctly, and that this helps you with it.
import endpoints
from google.appengine.ext import ndb
from protorpc import remote
from endpoints_proto_datastore.ndb import EndpointsModel


class Sample(EndpointsModel):
    column1 = ndb.StringProperty()
    column2 = ndb.IntegerProperty()
    column3 = ndb.StringProperty()


@endpoints.api(name='myapi', version='v1', description='My Sample API')
class MyApi(remote.Service):

    # URL: .../_ah/api/myapi/v1/mymodel - POSTS A NEW ENTITY
    @Sample.method(path='mymodel', http_method='GET', name='Sample.insert')
    def MyModelInsert(self, my_model):
        my_dict = {'first': 1, 'second': 2}
        dict_str = str(my_dict)
        my_model.column1 = "Year"
        my_model.column2 = 2018
        my_model.column3 = dict_str
        my_model.put()
        return my_model

    # URL: .../_ah/api/myapi/v1/mymodel/{ID} - RETRIEVES AN ENTITY BY ITS ID
    @Sample.method(request_fields=('id',), path='mymodel/{id}', http_method='GET', name='Sample.get')
    def MyModelGet(self, my_model):
        if not my_model.from_datastore:
            raise endpoints.NotFoundException('MyModel not found.')
        my_dict = eval(my_model.column3)
        print("This is the Python dict recovered from a string: {}".format(my_dict))
        return my_model


application = endpoints.api_server([MyApi], restricted=False)
I have tested this code using the development server, but it should work the same in production using App Engine with Endpoints and Datastore.
After querying the first endpoint, it will create a new Entity, which you will be able to find in Datastore and which contains a property column3 with your JSON data in string format.
Then, if you use the ID of that entity to retrieve it, your browser will show the string without any strange encoding, just plain JSON.
And in the console, you will be able to see that this string can be converted to a Python dict (or also to JSON, using the json module if you prefer).
I hope I have not missed any point of what you want to achieve, but I think all the most important points are covered by this code: a property holding a JSON object, storing it in Datastore, retrieving it in a readable format, and being able to use it again as a JSON/dict.
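As a side note on the design choice (my own variation, not the tested code from the answer above): if the stored string is produced with json.dumps() instead of str(), it is valid JSON and can be read back with json.loads() rather than eval():
import json

payload = {'first': 1, 'second': 2}

stored_text = json.dumps(payload)    # '{"first": 1, "second": 2}' would go into column3
recovered = json.loads(stored_text)  # back to a Python dict when reading the entity

print(recovered['second'])  # 2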
Update:
I think you should have a look at the list of available Property Types yourself, in order to find which one fits your requirements better. However, as an additional note, I have done a quick test working with a StructuredProperty (a property inside another property), by adding these modifications to the code:
# Define the nested model (your JSON object)
class Structured(EndpointsModel):
    first = ndb.IntegerProperty()
    second = ndb.IntegerProperty()


# Here I added a new property for simplicity; remember, StackOverflow does not write code for you :)
class Sample(EndpointsModel):
    column1 = ndb.StringProperty()
    column2 = ndb.IntegerProperty()
    column3 = ndb.StringProperty()
    column4 = ndb.StructuredProperty(Structured)


# Modify this endpoint definition to add a new property
@Sample.method(request_fields=('id',), path='mymodel/{id}', http_method='GET', name='Sample.get')
def MyModelGet(self, my_model):
    if not my_model.from_datastore:
        raise endpoints.NotFoundException('MyModel not found.')
    # Add the new nested property here
    my_dict = eval(my_model.column3)
    my_model.column4 = my_dict
    print(json.dumps(my_model.column3))
    print("This is the Python dict recovered from a string: {}".format(my_dict))
    return my_model
With these changes, the response of the call to the endpoint shows that column4 is now a JSON object itself (although it is not printed inline, I do not think that should be a problem).
I hope this helps too. If this is not the exact behavior you want, maybe you should play around with the available Property Types, but I do not think there is one that will give you a Python dict (or JSON object) without previously converting it to a string.
Folks,
I just spent a good amount of time trying to look this up -- I must be missing something basic.
I have a Python object; all I want to do is insert this object into MongoDB.
This is what I have:
from pymongo import Connection
import json
conn = Connection()
db = conn.cl_database
postings = db.postings_collection
class Posting(object):
    def __init__(self, link, found=None, expired=None):
        self.link = link
        self.found = found
        self.expired = expired
posting = Posting('objectlink1')
value = json.dumps(posting, default=lambda x:x.__dict__)
postings.insert(value)
throws this error:
Traceback (most recent call last):
File "./mongotry.py", line 21, in <module>
postings.insert(value)
File "build/bdist.macosx-10.7-intel/egg/pymongo/collection.py", line 302, in insert
File "build/bdist.macosx-10.7-intel/egg/pymongo/database.py", line 252, in _fix_incoming
File "build/bdist.macosx-10.7-intel/egg/pymongo/son_manipulator.py", line 73, in transform_incoming
TypeError: 'str' object does not support item assignment
Seems like it is because json.dumps() returns a string.
Now if I do a loads of the value before inserting, it works fine:
posting = Posting('objectlink1')
value = json.dumps(posting, default=lambda x:x.__dict__)
value = json.loads(value)
postings.insert(value)
What is the most straightforward way to do this?
Thanks!
What is value in your initial code?
It should be a dict, not a class instance.
This should work:
postings.insert(posting.__dict__)
You are misusing the insert method for the collection. Review here: http://api.mongodb.org/python/current/api/pymongo/collection.html#pymongo.collection.Collection.insert
What you need to insert is a document: a dict with keys and values. Simply trying to insert a string is not appropriate. json.dumps returns a string in JSON format; if you are just dumping and reloading it to get a dict, the JSON step is not necessary.
Insert exactly how your documents should look:
postings.insert({"key":"value"})
Or convert your class instance directly into the dict you want to store as a document and then insert it. It works with your json.dumps()/json.loads() round trip because that ultimately does give you a dict.
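Putting both answers together, a minimal sketch reusing the setup from the question (pymongo's older Connection/insert API, as in the original code):
from pymongo import Connection

conn = Connection()
postings = conn.cl_database.postings_collection

class Posting(object):
    def __init__(self, link, found=None, expired=None):
        self.link = link
        self.found = found
        self.expired = expired

# Insert the instance's attribute dict directly; no JSON round trip needed.
posting = Posting('objectlink1')
postings.insert(posting.__dict__)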
I am trying to import JSON data from a URL and extract the value of a specific key using Python 2.7. I tried the following:
import urllib
import json
daily_stock = urllib.urlopen('http://www.bloomberg.com/markets/api/bulk-time-series/price/NFLX%3AUS?timeFrame=1_DAY')
stock_json = json.load(daily_stock)
print stock_json
The output is:
[{u'lastPrice': 95.9, u'lastUpdateDate': u'2016-04-22', u'price': [{u'value': 95.45, u'dateTime': u'2016-04-22T13:30:00Z'} ...
u'dateTimeRanges': {u'start': u'2016-04-22T13:30:00Z', u'end': u'2016-04-22T20:30:00Z'}}]
When I try to retrieve the value of 'lastPrice':
print stock_json["lastPrice"]
I get the following error:
TypeError: list indices must be integers, not str
Please help.
stock_json is a list with a single dictionary inside; get the dictionary by index:
print stock_json[0]["lastPrice"]
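Based on the structure shown in the output above, the nested fields can be reached the same way:
quote = stock_json[0]          # the single dict inside the list
print quote["lastPrice"]       # 95.9
print quote["lastUpdateDate"]  # 2016-04-22

# "price" is itself a list of dicts
for point in quote["price"]:
    print point["dateTime"], point["value"]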
I have a web service (WS) using Flask/Python 2.7. One JSON object is passed to the WS. I have been successful in capturing the object and returning the whole JSON.
I have looked all over for examples (many just print a test dataset in Python) and have tried json.dumps, json.loads, json.dump, json.load, for loops, etc.
What I would like to do seems simple, and I know the problem is on my end, but I get errors no matter what I try. I am trying to parse the JSON, put the values into variables, and do "stuff".
This works:
@app.route('/v1/test', methods=['POST'])
def api_message():
    if request.headers['Content-Type'] == 'application/json':
        return "JSON Message: " + json.dumps(request.json, separators=(',', ':'))
    else:
        return "415 Unsupported Media Type"
This does not (nor do many variations of it using different approaches):
jsonobject = json.dumps(request.json)
pstring = json.loads(jsonobject)
for key, value in pstring.iteritems():
    return value
What I want to do (pseudo code):
for each JSON
    get the name/value pairs into a place where I can do something like this (which was done on a flat file)
input_data = pd.read_csv(sio, delimiter=',', names=columns)
probs = model.predict_proba(input_data)
I am sure I did not make this as clear as I could have, but it is a challenge because I get errors like the ones below (examples, not all at once of course) with all the different things I try:
AttributeError: 'dict' object has no attribute 'translate'
TypeError: 'dict' object is not callable
AttributeError: 'str' object has no attribute 'iteritems'
So after all that, what is the right way to do this?
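One possible way to approach this (a sketch based on my own assumptions, since no accepted answer is shown here): Flask's request.get_json() already returns a Python dict, so there is no need for a json.dumps()/json.loads() round trip, and the dict can be handed to pandas directly:
import pandas as pd
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route('/v1/test', methods=['POST'])
def api_message():
    if request.headers['Content-Type'] != 'application/json':
        return "415 Unsupported Media Type", 415

    payload = request.get_json()          # already a dict; no json.dumps/json.loads needed
    input_data = pd.DataFrame([payload])  # one row per JSON object, columns taken from the keys
    # probs = model.predict_proba(input_data)  # 'model' is the classifier from the question's pseudo code
    return jsonify(payload)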