I have three float arrays of data (A, B, C) that I want to send to Event Hub.
The issue here is that the classes are nested and I can't figure out how to shape the data so that it sends successfully.
Here is the code that I am trying:
import logging
import time

import numpy as np
from azure.eventhub import EventHubClient, EventData

logger = logging.getLogger("azure")

ADDRESS = ""
USER = ""
KEY = ""

try:
    if not ADDRESS:
        raise ValueError("No EventHubs URL supplied.")

    # Create Event Hubs client
    client = EventHubClient(ADDRESS, debug=False, username=USER, password=KEY)
    sender = client.add_sender(partition="0")
    client.run()

    x_value = np.arange(100)

    try:
        start_time = time.time()
        for i in range(100000):
            A = np.asarray([1, 2, 3, 4])
            B = np.asarray([2, 3, 4, 5])
            C = np.asarray([3, 4, 5, 6])
            message = [A, B, C]
            sender.send(EventData(body=message))
            time.sleep(1)
    except:
        raise
    finally:
        end_time = time.time()
        client.stop()
        run_time = end_time - start_time
        logger.info("Runtime: {} seconds".format(run_time))

except KeyboardInterrupt:
    pass
With this code I am seeing the error 'ValueBody' object has no attribute 'append'.
I think the class encodes the message differently depending on its type.
Instead of sending a single message as one serialized string, I want to send the data so that I can receive it in one of these two ways:
1. Receive one EventData whose body is the list [A, B, C].
2. Receive three EventData messages A, B, C separately, as three different objects.
For the second way, I am not sure it would work since I am using only one partition, and the three EventData messages A, B, C may arrive interleaved when I receive them, which I do not want.
I have confirmed with Microsoft support; the feedback is that it only works for a list of strings or bytes, but does not work for a list of ints.
You should also note that even if you send a list of strings, the receiver will concatenate all elements of the list into a single message. For example, if you send the list ["a","b","c","d"], the receiver will see the whole string "abcd".
I have tested this myself, and the above information is correct.
So for your case, you should consider converting the list to a string or JSON string; when receiving it, you can parse the data back into whatever shape you need.
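For example, here is a minimal sketch (reusing the sender from the question's v1 EventHubClient setup; received_event is a placeholder name for an EventData obtained on the receiving side) that sends the three arrays as one JSON string and parses it back:

import json
import numpy as np

A = np.asarray([1, 2, 3, 4])
B = np.asarray([2, 3, 4, 5])
C = np.asarray([3, 4, 5, 6])

# Serialize the three arrays into a single JSON string before sending.
payload = json.dumps({"A": A.tolist(), "B": B.tolist(), "C": C.tolist()})
sender.send(EventData(body=payload))

# On the receiving side, decode the body back into the three lists.
decoded = json.loads(received_event.body_as_str())
A_received, B_received, C_received = decoded["A"], decoded["B"], decoded["C"]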
Hope it helps.
Related
The following code goes over the 10 pages of JSON returned by a GET request to the URL and counts how many records satisfy the condition that bloodPressureDiastole is between the specified limits. It does the job, but I was wondering if there is a better or cleaner way to achieve this in Python.
import urllib.request
import urllib.parse
import json

baseUrl = 'https://jsonmock.hackerrank.com/api/medical_records?page='
count = 0

for i in range(1, 11):
    url = baseUrl + str(i)
    f = urllib.request.urlopen(url)
    response = f.read().decode('utf-8')
    response = json.loads(response)
    lowerlimit = 110
    upperlimit = 120
    for elem in response['data']:
        bd = elem['vitals']['bloodPressureDiastole']
        if bd >= lowerlimit and bd <= upperlimit:
            count = count + 1

print(count)
There is no field access to JSON content because json.loads gives you a dict object (see the conversion table in the json module documentation). A dict provides access via the __getitem__ method (dict[key]) rather than __getattr__ (object.field), since keys may be any hashable objects, not only strings. Moreover, even strings cannot serve as fields if they start with digits or coincide with the names of built-in dictionary methods.
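For example, a minimal sketch of the difference (the values are just illustrative, shaped like the records in the question):

import json

record = json.loads('{"vitals": {"bloodPressureDiastole": 115}}')
print(record["vitals"]["bloodPressureDiastole"])  # 115 -- item access works
print(record.vitals)  # AttributeError: 'dict' object has no attribute 'vitals'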
Despite this, you can define your own custom class implementing the desired behavior with acceptable key names. json.loads has an argument object_hook, where you may pass any callable (function or class) that takes a dict as its sole argument (not only the top-level one but every dict in the JSON, recursively) and returns something as the result. If your JSON documents match some template, you can define a class with predefined fields for the JSON content, and even with methods, in order to get a robust Python object as part of your domain logic.
For instance, let's implement access through fields. I get the JSON content from the response.json method of requests, but it accepts the same arguments as the json package. Comments in the code contain remarks about how to make your code more Pythonic.
from collections import Counter

from requests import get


class CustomJSON(dict):
    def __getattr__(self, key):
        return self[key]

    def __setattr__(self, key, value):
        self[key] = value


LOWER_LIMIT = 110  # Constants should be in uppercase.
UPPER_LIMIT = 120

base_url = 'https://jsonmock.hackerrank.com/api/medical_records'
# It is better to use special tools for handling URLs
# in order to avoid possible exceptions in the future.
# By the way, your option could look clearer with f-strings,
# which can put values from variables (not only) in place:
# url = f'https://jsonmock.hackerrank.com/api/medical_records?page={i}'

counter = Counter(normal_pressure=0)
# It might be left as it was. This option is useful
# in case of additionally counting any other information.

for page_number in range(1, 11):
    # Pass the page number as a query-string parameter (params),
    # not as a request body (data), so each page is actually fetched.
    records = get(
        base_url, params={"page": page_number}
    ).json(object_hook=CustomJSON)
    # Python has a pile of libraries for handling URL requests and responses.
    # urllib is a standard library rewritten from scratch for Python 3.
    # However, there is a more featured (connection pooling, redirects, proxies,
    # SSL verification, etc.) and convenient third-party
    # (this is the only disadvantage) library: urllib3.
    # Based on it, requests provides an easier, more convenient and friendlier way
    # to work with URL requests. So I highly recommend using it
    # unless you are aiming for complex connections and URL processing.
    for elem in records.data:
        if LOWER_LIMIT <= elem.vitals.bloodPressureDiastole <= UPPER_LIMIT:
            counter["normal_pressure"] += 1

print(counter)
I am trying to figure out how to get this YubiHSM 2 to work for signing ETH transactions. I have been using the Python library, and so far I have a basic setup. Below is an abbreviated version of what I have:
import binascii

from web3 import Web3, HTTPProvider
from yubihsm import YubiHsm
from yubihsm.defs import OBJECT, CAPABILITY, ALGORITHM
from yubihsm.objects import AsymmetricKey
from cryptography.hazmat.primitives import serialization

web3_endpoint = ''
web3 = Web3(HTTPProvider(web3_endpoint))

hsm = YubiHsm.connect("http://localhost:12345")
session = hsm.create_session_derived(1, "password")

key = session.get_object(1, OBJECT.ASYMMETRIC_KEY)
# key = AsymmetricKey.generate(session, 1, "EC Key", 1, CAPABILITY.SIGN_ECDSA, ALGORITHM.EC_K256)
pub_key = key.get_public_key()

# raw_pub = pub_key.public_bytes(
#     encoding=serialization.Encoding.DER,
#     format=serialization.PublicFormat.SubjectPublicKeyInfo
# )
raw_pub = pub_key.public_bytes(
    encoding=serialization.Encoding.X962,
    format=serialization.PublicFormat.UncompressedPoint
)
print("Public key (Uncompressed):\n", binascii.b2a_hex(raw_pub))

unindexPub = raw_pub[1:]
public_key_hash = Web3.keccak(unindexPub)
address_bytes = public_key_hash[-20:]
address = address_bytes.hex()
print(address)
So far I consistently get the same public key each time, and it looks correct; by correct I mean that the formatting looks right and it has the correct number of bytes.
1) Should I be using the commented-out DER encoding of the public key, or the uncompressed X9.62 encoding that I have above?
From there, this is where things get a bit weird.
import ecdsa
from cryptography.hazmat.primitives import hashes
# serializable_unsigned_transaction_from_dict and encode_transaction come from
# the helper serialization functions mentioned below.

transaction = {
    'to': Web3.toChecksumAddress('0x785AB1daE1b0Ee3f2412aCF55e4153A9517b07e1'),
    'gas': 21000,
    'gasPrice': Web3.toWei(5, 'gwei'),
    'value': 1,
    'nonce': 1,
    'chainId': 4,
}

serializable_transaction = serializable_unsigned_transaction_from_dict(transaction)
transaction_hash = serializable_transaction.hash()
print(transaction_hash.hex())

# Sign the transaction hash and calculate the v value.
signature = key.sign_ecdsa(transaction_hash, hashes.SHA3_256())
r, s = ecdsa.util.sigdecode_der(signature, ecdsa.SECP256k1.generator.order())
print("r: " + str(r) + "\ns: " + str(s))
v = 28

# Encode the transaction along with the full signature and send it.
encoded_transaction = encode_transaction(serializable_transaction, vrs=(v, r, s))
web3.eth.sendRawTransaction(encoded_transaction)
I am setting v to 28; I have also tested it with 27. I could compute the proper value from the chainId, but that shouldn't be necessary just to get a valid signature (one that recovers to the same public key each time). Sometimes I get the error "invalid sender" and other times I get the error "insufficient gas". If I take the signature output and use a JavaScript library to recover the public key, I get a different public key each time, even though the YubiHSM 2 keeps producing the same public key in this Python app.
I have also commented out the hashing function in sign_ecdsa, since I am passing in data that is already hashed (in order to use Keccak-256).
Is there something I am missing? Why are these transactions not signing correctly for ETH?
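For reference, here is a minimal sketch of what I mean by "recoverable" (assuming the eth-keys library; r, s, transaction_hash and address are the values computed above): it checks which recovery id reproduces the HSM's address instead of hard-coding v.

from eth_keys import keys

# secp256k1 curve order; Ethereum expects the "low-s" form of the signature.
SECP256K1_N = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141
if s > SECP256K1_N // 2:
    s = SECP256K1_N - s

# Try both recovery ids and keep the one that recovers the HSM's address.
for recovery_id in (0, 1):
    sig = keys.Signature(vrs=(recovery_id, r, s))
    recovered = sig.recover_public_key_from_msg_hash(transaction_hash)
    if recovered.to_address() == '0x' + address:
        v = recovery_id + 27  # or chainId * 2 + 35 + recovery_id for EIP-155
        break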
I am getting some of those helper serialization functions (serializable_unsigned_transaction_from_dict and encode_transaction) from an external library.
Thanks
x = fopen('pm10_data.txt');
fseek(x, 8,0);
dat = fscanf (x,'%f',[2,1000]);
dat = transpose(dat);
a = dat(:,1);
b = dat(:,2);
[r,p] = cor_test (a,b)
fclose(x);
r
p
This is what I got:
r =

  scalar structure containing the fields:

    method = Pearson's product moment correlation
    params = 76
    stat = 6.2156
    dist = t
    pval = 2.5292e-08
    alternative = !=

Run error
error: element number 2 undefined in return list
error: called from
    tester.octave at line 7 column 6
Presumably you're referring to the cor_test function from the statistics package, even though you don't show loading this in your workspace.
According to the documentation of cor_test:
The output is a structure with the following elements:
PVAL The p-value of the test.
STAT The value of the test statistic.
DIST The distribution of the test statistic.
PARAMS The parameters of the null distribution of the test statistic.
ALTERNATIVE The alternative hypothesis.
METHOD The method used for testing.
If no output argument is given, the p-value is displayed.
This seems to be what you're getting too.
If you want the p-value explicitly from that structure, you can access it as r.pval.
The syntax [a, b, ...] = functionname(args, ...) expects the function to return more than one output argument, and captures all the returned outputs into the named variables (i.e. a, b, etc.).
In this case, cor_test only returns a single output argument, even though that argument is a struct (which means it has fields you can access).
The error you're getting effectively means you requested a second output argument p, but the function you're using does not return a second output argument. It only returns that struct you already captured in r.
The following script describes the decoding of a JSON object that is received via MQTT. In this case, we shall take the following JSON object as an example:
{"00-06-77-2f-37-94":{"publish_topic":"/stations/test","sample_rate":5000}}
After it is received and decoded in the handleOnReceive function, the local function saveTable is called with the decoded object, which looks like this:
["00-06-77-2f-37-94"] = {
publish_topic = "/stations/test",
sample_rate = 5000
}
The goal of the saveTable function is to go through the table above and assign the values "/stations/test" and 5000 to the variables pubtop and rate, respectively. However, when I print each of the two variables, nil is returned in both cases.
How can I extract the values of this table and save them in the mentioned variables?
If I can only save the strings "publish_topic = "/stations/test"" and "sample_rate = 5000" at first, would I need to parse these to get the values above and save them, or is there another way?
local pubtop
local rate

local function saveTable(t)
    local conversionTable = {}
    for k, v in pairs(t) do
        if type(v) == "table" then
            conversionTable[k] = string.format("%q: {", k)
            printTable(v)
            print("}")
        else
            print(string.format("%q:", k) .. v .. ",")
        end
    end
    pubtop = conversionTable[0]
    rate = conversionTable[1]
end

local lua_value

local function handleOnReceive(topic, data, _, _)
    print("handleOnReceive: topic '" .. topic .. "' message '" .. data .. "'")
    print(data)
    lua_value = JSON:decode(data)
    saveTable(lua_value)
    print(pubtop)
    print(rate)
end

client:register('OnReceive', handleOnReceive)
Previous question in this thread: Decode and Parse JSON to Lua
The function I gave you was meant to recursively print table contents. It was not meant to be modified to extract specific values.
Your modifications do not make any sense. Why would you store that string in conversionTable[k]? You obviously have no idea what you're doing here. No offense, but you should learn some basics before you continue.
I gave you that function so you can print whatever the result of your JSON decode is.
If you know you get what you expect, there is no point in recursively iterating through that table.
Just do it like this:
for k, v in pairs(lua_value) do
    print(k)
    print(v.publish_topic)
    print(v.sample_rate)
end
Now please read the Lua reference manual and do some beginner tutorials.
You're wasting a lot of time and resources trying to implement things like this when you do not know how to access the elements of a table. That is the most basic and important operation in Lua.
I know that I can use LIST_TO_ASCI to convert a report to ASCII, but I would like to have a higher-level data format like JSON, XML, or CSV.
Is there a way to get something that is easier to handle than ASCII?
Here is the report I'd like to convert:
The conversion needs to be executed in ABAP on the result of a report that was submitted like this:
SUBMIT <REPORT_NAME> ... EXPORTING LIST TO MEMORY AND RETURN.
You can get access to the SUBMIT list in memory like this:
call function 'LIST_FROM_MEMORY'
  TABLES
    listobject = t_list
  EXCEPTIONS
    not_found  = 1
    others     = 2.
if sy-subrc <> 0.
  message 'Unable to get list from memory' type 'E'.
endif.

call function 'WRITE_LIST'
  TABLES
    listobject = t_list
  EXCEPTIONS
    empty_list = 1
    others     = 2.
if sy-subrc <> 0.
  message 'Unable to write list' type 'E'.
endif.
And the final step of the solution (conversion of the result table to JSON) was already answered for you in your question.
I found a solution here: http://zevolving.com/2015/07/salv-table-22-get-data-directly-after-submit/
This is the code:
DATA: lt_outtab TYPE STANDARD TABLE OF alv_t_t2.
FIELD-SYMBOLS: <lt_outtab> LIKE lt_outtab.
DATA lo_data TYPE REF TO data.

" Let the model know
cl_salv_bs_runtime_info=>set(
  EXPORTING
    display  = abap_false
    metadata = abap_false
    data     = abap_true
).

SUBMIT salv_demo_table_simple
  AND RETURN.

TRY.
    " Get data from the SALV model
    cl_salv_bs_runtime_info=>get_data_ref(
      IMPORTING
        r_data = lo_data
    ).
    ASSIGN lo_data->* TO <lt_outtab>.
    BREAK-POINT.
  CATCH cx_salv_bs_sc_runtime_info.
ENDTRY.
Big thanks to Sandra Rossi; she gave me the hint about cx_salv_bs_sc_runtime_info.
Related answer: https://stackoverflow.com/a/52834118/633961