How to turn an array into a string in Mako? - mako

I have an object in which an array of strings is stored. I have tried different ways, but none of them worked.
For example:
${', '.join(data.array_of_strings)}

By "array of strings" do you mean a list/tuple of strings, or do mean this array of type char. If you mean a list/tuple of strings, then the snippet you have in your question should work. Here is a minimal working example.
test.py
from mako.template import Template
class Data(object): pass
data = Data()
data.array_of_strings = ['a', 'b', 'c']
print(Template("${', '.join(data.array_of_strings)}").render(data=data))
Which, when executed, produces the following:
$ python test.py
a, b, c
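One caveat worth noting: str.join raises a TypeError if the list contains non-string items. A minimal sketch of the plain-Python equivalent, using a hypothetical mixed-type list, converts each item first; the same generator expression works inside a Mako `${...}` block:

```python
# Inside a Mako template the equivalent expression would be:
#     ${', '.join(str(x) for x in data.values)}
# (data.values is a hypothetical attribute, for illustration only)
values = ['a', 1, 3.5]  # mixed types, so a plain join would fail
joined = ', '.join(str(x) for x in values)
print(joined)  # a, 1, 3.5
```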

Related

Micropython: bytearray in json-file

I'm using MicroPython in the newest version, together with a DS18B20 temperature sensor. An address of such a sensor is, for example, b'(b\xe5V\xb5\x01<:'. This is the string representation of a bytearray. If I use it to save the address in a JSON file, I run into some problems:
If I store "b'(b\xe5V\xb5\x01<:'" directly, then after reading the JSON file there are no single backslashes, and I get b'(bxe5Vxb5x01<:' inside Python.
If I escape the backslashes like "b'(b\\xe5V\\xb5\\x01<:'", I get double backslashes in Python: b'(b\\xe5V\\xb5\\x01<:'.
How do I get a single backslash?
Thank you
You can't save bytes in JSON with MicroPython. As far as JSON is concerned, that's just some string. Even if you got it to give you what you think you want (i.e. single backslashes), it still wouldn't be bytes. So you are faced with making some form of conversion, no matter what.
One idea is to convert the bytes to an int, and then convert it back when you open the file. Below is a simple example. Of course you don't have to use a class with static methods to do this; it just seemed like a good way to wrap it all into one, without even needing an instance hanging around. You can put the entire class in some other file, import it where necessary, and just call its methods as you need them.
import math, ujson, utime

class JSON(object):
    @staticmethod
    def convert(data: dict, convert_keys=None) -> dict:
        if isinstance(convert_keys, (tuple, list)):
            for key in convert_keys:
                if isinstance(data[key], (bytes, bytearray)):
                    data[key] = int.from_bytes(data[key], 'big')
                elif isinstance(data[key], int):
                    data[key] = data[key].to_bytes(1 if not data[key] else int(math.log(data[key], 256)) + 1, 'big')
        return data

    @staticmethod
    def save(filename: str, data: dict, convert_keys=None) -> None:
        # dump doesn't seem to like working directly with open
        with open(filename, 'w') as doc:
            ujson.dump(JSON.convert(data, convert_keys), doc)

    @staticmethod
    def open(filename: str, convert_keys=None) -> dict:
        return JSON.convert(ujson.load(open(filename, 'r')), convert_keys)
# example with both styles of bytes for the sake of being thorough
json_data = dict(address=bytearray(b'\xFF\xEE\xDD\xCC'), data=b'\x00\x01\x02\x03', date=utime.mktime(utime.localtime()))
keys = ['address', 'data']  # list of keys to convert to int/bytes

JSON.save('test.json', json_data, keys)
json_data = JSON.open('test.json', keys)
print(json_data)  # {'date': 1621035727, 'data': b'\x00\x01\x02\x03', 'address': b'\xff\xee\xdd\xcc'}
You may also want to note that with this method you never actually touch any JSON: you put in a dict, you get out a dict, and all the JSON is managed "behind the scenes". Regardless of all of this, I would say using struct would be a better option, but you said JSON, so my answer is about JSON.
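Another common pattern worth considering (not part of the answer above, just an alternative sketch) is storing the bytes as a hex string, which JSON can hold safely. MicroPython ships this in the ubinascii module; the example below uses CPython's equivalent binascii module:

```python
import json
import binascii  # on MicroPython the equivalent module is ubinascii

address = b'(b\xe5V\xb5\x01<:'

# bytes -> hex string, which survives the JSON round trip unchanged
payload = json.dumps({'address': binascii.hexlify(address).decode()})

# hex string -> bytes on the way back
restored = binascii.unhexlify(json.loads(payload)['address'])
print(restored == address)  # True
```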

How to convert a multi-dimensional dictionary to json file?

I have uploaded a *.mat file that contains a 'struct' to my jupyter lab using:
from pymatreader import read_mat
data = read_mat(mat_file)
Now I have a multi-dimensional dictionary, for example:
data['Forces']['Ss1']['flap'].keys()
Gives the output:
dict_keys(['lf', 'rf', 'lh', 'rh'])
I want to convert this into a JSON file, keyed exactly by the keys that already exist, without doing so manually, because I want to apply this to many *.mat files with varying numbers of keys.
EDIT:
Unfortunately, I no longer have access to MATLAB.
An example for desired output would look something like this:
json_format = {
"Forces": {
"Ss1": {
"flap": {
"lf": [1,2,3,4],
"rf": [4,5,6,7],
"lh": [23 ,5,6,654,4],
"rh": [4 ,34 ,35, 56, 66]
}
}
}
}
ANOTHER EDIT:
So after making lists of the subkeys (I won't elaborate on it), I did this:
FORCES = []
for ind in individuals:
    for force in forces:
        for wing in wings:
            FORCES.append({
                ind: {
                    force: {
                        wing: data['Forces'][ind][force][wing].tolist()
                    }
                }
            })
Then, to save:
with open(f'{ROOT_PATH}/Forces.json', 'w') as f:
    json.dump(FORCES, f)
That worked, but only because I looked up all of the keys manually... Also, for some reason, I have square brackets at the beginning and at the end of this JSON file.
The json package will output dictionaries to JSON:
import json
with open('filename.json', 'w') as f:
    json.dump(data, f)
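One snag: pymatreader typically returns NumPy arrays as the leaf values, and json.dump refuses to serialize those. The default hook can convert anything with a .tolist() method on the fly, so no manual key traversal is needed. A sketch, using the stdlib array type to stand in for NumPy arrays (which also have .tolist()):

```python
import json
from array import array  # stand-in for numpy arrays, which also have .tolist()

data = {'Forces': {'Ss1': {'flap': {'lf': array('d', [1, 2, 3, 4])}}}}

def to_serializable(obj):
    # json.dump calls this for any object it cannot encode itself
    if hasattr(obj, 'tolist'):
        return obj.tolist()
    raise TypeError(f'not JSON serializable: {type(obj)}')

text = json.dumps(data, default=to_serializable)
print(text)  # {"Forces": {"Ss1": {"flap": {"lf": [1.0, 2.0, 3.0, 4.0]}}}}
```

Because the hook walks the structure for you, this keeps whatever nesting and key names the .mat file had, with no hand-written loops.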
If you are using MATLAB R2016b or later and want to go straight from MATLAB to JSON, check out jsonencode and jsondecode. For your purposes, jsonencode
encodes data and returns a character vector in JSON format.
(MathWorks docs)
Here is a quick example that assumes your data is in the MATLAB variable test_data and writes it to the file specified in the variable json_file:
json_data = jsonencode(test_data);
writematrix(json_data, json_file);
Note: Some MATLAB data formats cannot be translated into JSON due to limitations of the JSON specification. However, it sounds like your data fits the JSON specification well.

How to convert os.stat_result to a JSON object?

I know that json.dumps can be used to convert variables into a JSON representation. Sadly, the conversion of Python 3's os.stat_result class is a string consisting of an array representing the values of the class instance.
>>> import json
>>> import os
>>> json.dumps(os.stat('/'))
'[16877, 256, 24, 1, 0, 0, 268, 1554977084, 1554976849, 1554976849]'
I would, however, much prefer to have the os.stat_result converted to a JSON object. How can I achieve this?
It seems that the trouble is that os.stat_result does not have a .__dict__.
Seeing the result of this:
>>> import os
>>> str(os.stat('/'))
'os.stat_result(st_mode=16877, st_ino=256, st_dev=24, st_nlink=1, st_uid=0, st_gid=0, st_size=268, st_atime=1554977084, st_mtime=1554976849, st_ctime=1554976849)'
makes me hope there is a swift way to turn a Python class instance (e.g. os.stat_result) into a JSON representation that is an object.
As gst mentioned, doing it manually would look like this:
def stat_to_json(fp: str) -> dict:
    s_obj = os.stat(fp)
    return {k: getattr(s_obj, k) for k in dir(s_obj) if k.startswith('st_')}
I would however much prefer to have the os.stat_result converted to a JSON object. How can I achieve this?
If by JSON you mean a dict with keys st_mode, st_ino, etc., then the answer is: manually.
With a pre-built list of keys:
# http://thepythoncorner.com/dev/writing-a-fuse-filesystem-in-python/
def dict_of_lstat(lstat_res):
    lstat_keys = ['st_atime', 'st_ctime', 'st_gid', 'st_mode', 'st_mtime', 'st_nlink', 'st_size', 'st_uid', 'st_blocks']
    return dict((k, getattr(lstat_res, k)) for k in lstat_keys)

lstat_dict = dict_of_lstat(os.lstat(path))

def dict_of_statvfs(statvfs_res):
    statvfs_keys = ['f_bavail', 'f_bfree', 'f_blocks', 'f_bsize', 'f_favail', 'f_ffree', 'f_files', 'f_flag', 'f_frsize', 'f_namemax']
    return dict((k, getattr(statvfs_res, k)) for k in statvfs_keys)

statvfs_dict = dict_of_statvfs(os.statvfs(path))
This is longer, but faster than filtering with k.startswith('st_').
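Either variant plugs straight into json.dumps. A self-contained sketch using the dir()-based filter from above (os.stat('.') is just a portable stand-in path):

```python
import json
import os

def stat_to_json(fp: str) -> dict:
    # pick every st_* attribute off the os.stat_result
    s_obj = os.stat(fp)
    return {k: getattr(s_obj, k) for k in dir(s_obj) if k.startswith('st_')}

d = stat_to_json('.')
print(json.dumps(d)[:60])  # starts with '{': a JSON object, not an array
```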

Pandas Series And Read CSV

I'm using a regular dictionary to store matrices, then converting that dict to a Pandas Series and writing it out to a CSV. I then use pd.read_csv() on the CSV file, but the returned items are all strings: each one is literally a string of the entire matrix of values. Is there any way I can make them floats?
The datatypes to read in are an argument to read_csv. From the function help:
dtype : Type name or dict of column -> type
Data type for data or columns. E.g. {'a': np.float64, 'b': np.int32}
so you'd make a read call like:
mypd = pd.read_csv('my.csv', dtype={'a': np.float64, 'b': np.int32})
where a, b, etc. match the column names in your input file.
You can also cast the type of a column after it's been read into a DataFrame.
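For the original problem, where each CSV cell holds the string form of a whole matrix, dtype won't help, since the cell is not a single number. read_csv's converters parameter can parse such cells back into nested lists. A sketch, assuming the cells were written in Python literal syntax:

```python
import ast
import io

import pandas as pd

# a CSV where one cell is the string form of an entire matrix
csv_text = 'name,matrix\nA,"[[1.0, 2.0], [3.0, 4.0]]"\n'

# converters maps column -> callable applied to each raw cell string
df = pd.read_csv(io.StringIO(csv_text),
                 converters={'matrix': ast.literal_eval})

print(df['matrix'][0][1][0])  # 3.0, a real float rather than a string
```

ast.literal_eval is the safe counterpart of eval: it only accepts Python literals, so a malformed or malicious cell raises instead of executing code.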

Haskell: Parsing JSON data into a Map or a list of tuples?

Is there a way to automatically convert JSON data into Data.Map or just a list of tuples?
Say, if I have:
{Name : "Stitch", Age : 3, Friend: "Lilo"}
I'd like it to be converted into:
fromList [("Name","Stitch"), ("Age",3), ("Friend","Lilo")]
.. without defining a Stitch data type.
I am happy to parse integers into strings in the resulting map. I can just read them into integers later.
You can use aeson. See Decoding a mixed-type object in its documentation's tutorial:
>>> import qualified Data.ByteString.Lazy.Char8 as BS
>>> :m +Data.Aeson
>>> let foo = BS.pack "{\"Name\" : \"Stitch\", \"Age\" : 3, \"Friend\": \"Lilo\"}"
>>> decode foo :: Maybe Object
Just fromList [("Friend",String "Lilo"),("Name",String "Stitch"),("Age",Number 3.0)]
An Object is just a HashMap from Text keys to Value values, where Value is a sum type representing JSON values.