Using json with jsondatetime

json.dumps gives an error if both json and jsondatetime are imported. The error is the following:
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/lib/python3.5/json/__init__.py", line 230, in dumps
    return _default_encoder.encode(obj)
TypeError: encode() missing 1 required positional argument: 'o'
But if I import only json, json.dumps works fine. I don't know how to deal with this; I need jsondatetime as well.
This works:
import json
json.dumps({'DbName': 'DB','Hostname': '10.0.0.6','DbUsername':'SYSTEM'})
'{"Hostname": "10.0.0.6","DbName": "DB", "DbUsername": "SYSTEM"}'
This does not work:
import jsondatetime
import json
json.dumps({'DbName': 'DB', 'Hostname': '10.0.0.6', 'DbUsername': 'SYSTEM'})
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/lib/python3.5/json/__init__.py", line 230, in dumps
    return _default_encoder.encode(obj)
TypeError: encode() missing 1 required positional argument: 'o'

jsondatetime is a drop-in replacement for json, so you should only have:
import jsondatetime as json
From the documentation:
JSON-datetime is a very simple wrapper around Python simplejson loads method. It decodes datetime values contained in JSON strings.
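For example, a minimal sketch, assuming jsondatetime really does behave as a drop-in replacement as described above (so both dumps and loads are available through the alias):

import jsondatetime as json

# dumps works through the alias, so there is no clash with the standard-library encoder
print(json.dumps({'DbName': 'DB', 'Hostname': '10.0.0.6', 'DbUsername': 'SYSTEM'}))

# loads is the part jsondatetime wraps: date-like strings may come back as
# datetime objects, depending on the formats jsondatetime recognizes
record = json.loads('{"created": "2016-05-31T21:00:00"}')
print(type(record['created']))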

module 'pyarrow' has no attribute 'decimal'

I'm trying to set the datatype to decimal, but I get an error that pyarrow doesn't have the attribute decimal:
>>> import pyarrow
>>> pyarrow.decimal(8)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
AttributeError: module 'pyarrow' has no attribute 'decimal'
I found that pa.decimal128(18) works instead.
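For example, a minimal sketch (assuming a reasonably recent pyarrow, where decimal128 takes a precision and an optional scale):

import pyarrow as pa
from decimal import Decimal

# decimal128(precision, scale) is the current way to build a decimal type
dec_type = pa.decimal128(18, 2)

# build an array with that type to confirm it accepts Python Decimal values
arr = pa.array([Decimal('12.34'), Decimal('0.01')], type=dec_type)
print(arr.type)  # shows the decimal type with its precision and scale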

Upload Pandas dataframe as a JSON object in Cloud Storage

I have been trying to upload a Pandas dataframe to a JSON object in Cloud Storage using a Cloud Function. Following is my code:
def upload_blob(bucket_name, source_file_name, destination_blob_name):
    """Uploads a file to the bucket."""
    storage_client = storage.Client()
    bucket = storage_client.get_bucket(bucket_name)
    blob = bucket.blob(destination_blob_name)
    blob.upload_from_file(source_file_name)
    print('File {} uploaded to {}.'.format(
        source_file_name,
        destination_blob_name))

final_file = pd.concat([df, df_second], axis=0)
final_file.to_json('/tmp/abc.json')
with open('/tmp/abc.json', 'r') as file_obj:
    upload_blob('test-bucket', file_obj, 'abc.json')
I am getting the following error on the line blob.upload_from_file(source_file_name):
Deployment failure:
Function failed on loading user code. Error message: Code in file main.py can't be loaded.
Detailed stack trace: Traceback (most recent call last):
  File "/env/local/lib/python3.7/site-packages/google/cloud/functions/worker.py", line 305, in check_or_load_user_function
    _function_handler.load_user_function()
  File "/env/local/lib/python3.7/site-packages/google/cloud/functions/worker.py", line 184, in load_user_function
    spec.loader.exec_module(main)
  File "<frozen importlib._bootstrap_external>", line 728, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/user_code/main.py", line 6, in <module>
    import datalab.storage as gcs
  File "/env/local/lib/python3.7/site-packages/datalab/storage/__init__.py", line 16, in <module>
    from ._bucket import Bucket, Buckets
  File "/env/local/lib/python3.7/site-packages/datalab/storage/_bucket.py", line 21, in <module>
    import datalab.context
  File "/env/local/lib/python3.7/site-packages/datalab/context/__init__.py", line 15, in <module>
    from ._context import Context
  File "/env/local/lib/python3.7/site-packages/datalab/context/_context.py", line 20, in <module>
    from . import _project
  File "/env/local/lib/python3.7/site-packages/datalab/context/_project.py", line 18, in <module>
    import datalab.utils
  File "/env/local/lib/python3.7/site-packages/datalab/utils/__init__.py", line 15
    from ._async import async, async_function, async_method
                            ^
SyntaxError: invalid syntax
What could be causing this error?
You are passing a string to blob.upload_from_file(), but this method requires a file object. You probably want to use blob.upload_from_filename() instead. Check the sample in the GCP docs.
Alternatively, you could open the file to get a file object and keep using blob.upload_from_file(), but that adds unnecessary extra lines:
with open('/tmp/abc.json', 'r') as file_obj:
    upload_blob('test-bucket', file_obj, 'abc.json')
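If you switch to blob.upload_from_filename(), the helper can take the path directly. A sketch using the same bucket and file names as the question; the helper name here is just for illustration:

from google.cloud import storage

def upload_blob_from_path(bucket_name, local_path, destination_blob_name):
    """Uploads a local file to the bucket by path."""
    storage_client = storage.Client()
    bucket = storage_client.get_bucket(bucket_name)
    blob = bucket.blob(destination_blob_name)
    # upload_from_filename() takes a path string, so no open() call is needed
    blob.upload_from_filename(local_path)

upload_blob_from_path('test-bucket', '/tmp/abc.json', 'abc.json')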
Use a bucket object instead of a string, something like upload_blob(conn.get_bucket(mybucket), '/tmp/abc.json', 'abc.json').

JSON Parsing with Nao robot - AttributeError

I'm using a NAO robot with naoqi version 2.1 and Choregraphe on Windows. I want to parse JSON from a file attached to the behavior. I attached the file as described in that link.
Code:
def onLoad(self):
    self.filepath = os.path.join(os.path.dirname(ALFrameManager.getBehaviorPath(self.behaviorId)), "fileName.json")

def onInput_onStart(self):
    with open(self.filepath, "r") as f:
        self.data = self.json.load(f.get_Response())
        self.dataFromFile = self.data['value']
        self.log("Data from file: " + str(self.dataFromFile))
But when I run this code on the robot (connected over a router), I get an error:
[ERROR] behavior.box :_safeCallOfUserMethod:281 _Behavior__lastUploadedChoregrapheBehaviorbehavior_1136151280__root__AbfrageKontostand_3__AuslesenJSONDatei_1: Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/naoqi.py", line 271, in _safeCallOfUserMethod
    func()
  File "<string>", line 20, in onInput_onStart
  File "/usr/lib/python2.7/site-packages/inaoqi.py", line 265, in <lambda>
    __getattr__ = lambda self, name: _swig_getattr(self, behavior, name)
  File "/usr/lib/python2.7/site-packages/inaoqi.py", line 55, in _swig_getattr
    raise AttributeError(name)
AttributeError: json
I already tried to understand the code at the corresponding lines, but I couldn't fix the error. I do know that the type of my object f is 'file'. How can I open the JSON file as a JSON file?
Your problem comes from this:
self.json.load(f.get_Response())
... there is no such thing as "self.json" on a Choregraphe box; import json and then call json.load. And what is get_Response? That method doesn't exist on anything in Python that I know of.
You might want to first try making a standalone python script (that doesn't use the robot) that can read your json file before you try it with choregraphe. It will be easier.
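Putting that together, a minimal sketch of the corrected box method, keeping the filepath logic from the question and assuming the attached file contains plain JSON:

import json  # the standard module, instead of the non-existent self.json

def onInput_onStart(self):
    with open(self.filepath, "r") as f:
        # json.load() takes the file object directly; no get_Response() needed
        self.data = json.load(f)
    self.dataFromFile = self.data['value']
    self.log("Data from file: " + str(self.dataFromFile))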

Python 3 JSON deserializer: JSON object must be str, not 'bytes'

There are a couple of questions related to this error:
JSON object must be str, not 'bytes'
However, apart from the obvious solution of reading and decoding the response, I didn't learn anything special.
Here is example to the problem:
>>> import json
>>> from urllib.request import urlopen
>>> url = 'http://echo.jsontest.com/key/value/one/two'
>>> with urlopen(url) as request:
... json.load(request)
...
Traceback (most recent call last):
  File "<stdin>", line 2, in <module>
  File "C:\Python35\lib\json\__init__.py", line 268, in load
    parse_constant=parse_constant, object_pairs_hook=object_pairs_hook, **kw)
  File "C:\Python35\lib\json\__init__.py", line 312, in loads
    s.__class__.__name__))
TypeError: the JSON object must be str, not 'bytes'
So my question is: why doesn't Python's JSON deserializer (which accepts file-like objects with a .read() method) try to handle this, when the response headers hint at everything it needs to know:
Content-Type: application/json; charset=ISO-8859-1
Headers are a hint, not a guarantee, but that cannot be a reason not to try the obvious, IMHO.
Use json.loads instead of json.load
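For example, one way to do that on Python 3.5 is to read and decode the body yourself before calling json.loads. A sketch; note that from Python 3.6 onward json.loads also accepts bytes directly:

import json
from urllib.request import urlopen

url = 'http://echo.jsontest.com/key/value/one/two'
with urlopen(url) as response:
    # use the charset from the Content-Type header, falling back to UTF-8
    charset = response.headers.get_content_charset() or 'utf-8'
    data = json.loads(response.read().decode(charset))
print(data)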

python ordered_dict from json

I am using Python 2.6.6 and trying to generate an OrderedDict from a JSON string. I understand that I could use the object_pairs_hook of the json decoder/loads, but unfortunately it is not supported in 2.6.6. Is there any way out?
e.g.
template_s = '{ "aa": {"_type": "T1"}, "bb": {"_type": "T11"}}'
json.loads(template_s, object_pairs_hook=OrderedDict)
>>> json.loads(json_str, object_pairs_hook=OrderedDict)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/lib64/python2.6/json/__init__.py", line 318, in loads
    return cls(encoding=encoding, **kw).decode(s)
TypeError: __init__() got an unexpected keyword argument 'object_pairs_hook'
Thanks
I was able to do the same with simplejson:
import simplejson as json
json.loads(config_str, object_pairs_hook=json.OrderedDict)
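For reference, a short usage sketch; json.OrderedDict here relies on simplejson exposing an OrderedDict class at the top level, as the snippet above assumes (on Python 2.7+, collections.OrderedDict would work as well):

import simplejson as json

template_s = '{ "aa": {"_type": "T1"}, "bb": {"_type": "T11"}}'
data = json.loads(template_s, object_pairs_hook=json.OrderedDict)

# keys come back in document order rather than hash order
print(data.keys())  # ['aa', 'bb']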