Weird KeyError (Python) - json

So, I have to work with this JSON (from URL):
{'player': {'racing': 25260.154000000017, 'player': 259114.57700000296}, 'farming': {'fishing': 33783.390999999414, 'mining': 29048.60500000002, 'farming': 25334.504000000023}, 'piloting': {'piloting': 25570.18800000001, 'cargos': 3080.713000000036, 'heli': 10433.977000000004}, 'physical': {'strength': 198358.86700000675}, 'business': {'business': 50922.88500000005}, 'trucking': {'mechanic': 2724.5620000000004, 'garbage': 755.642999999997, 'trucking': 223784.99700000713, 'postop': 1411.4190000000006}, 'train': {'bus': 669.1940000000001, 'train': 1363.805999999999}, 'ems': {'fire': 25449.43400000001, 'ems': 13844.628000000012}, 'hunting': {'skill': 4179.033000000316}, 'casino': {'casino': 18545.526000000027}}
It is indeed one line. I am trying to make it so that, for example, I can get racing, which is the first one you see. For this, you need to go into Player first, and then you can get to Racing. How do I do this?
My current code:
def allthethings():
    # Grab all the skills
    geturl = "http://server.tycoon.community:30120/status/data/" + str(setting_playerid)
    print(geturl)
    a = requests.get(geturl, headers={"X-Tycoon-Key": setting_apikeyTT}).json()
    jsonconverted = a["data"]["gaptitudes_v"]
    print(jsonconverted)
    # Convert JSON into many, many variables
    Raw_RACR = jsonconverted['player.racing']
    print(Raw_RACR)
I believe this is all the code that is needed.
Also, this is the error:
KeyError: 'player.racing'
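For what it's worth, the parsed data is a nested dictionary, so the keys have to be indexed one at a time rather than joined with a dot. A minimal sketch of the fix, using the jsonconverted variable from the code above:

Raw_RACR = jsonconverted['player']['racing']  # index the outer dict, then the inner one
print(Raw_RACR)  # 25260.154000000017

Indexing with 'player.racing' looks for a single key literally named 'player.racing', which doesn't exist, hence the KeyError.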

How do I read keyboard events from file?

I have read this question, which is similar and gets me most of the way.
The finished code isn't posted in the answer, but I believe I have followed its instructions and managed to get it working -- except after the file has been reopened.
Playback works perfectly fine immediately after recording; however, I want to save the data and read it back later, every time I run the program, without having to re-record it each time.
import keyboard
import threading
from keyboard import KeyboardEvent
import time
import json

def record(file='record.txt'):
    f = open(file, 'w+')
    keyboard_events = []
    keyboard.start_recording()
    starttime = time.time()
    keyboard.wait('esc')
    keyboard_events = keyboard.stop_recording()
    print(starttime, file=f)
    for kevent in range(0, len(keyboard_events)):
        print(keyboard_events[kevent].to_json(), file=f)
    f.close()

def play(file="record.txt", speed=1):
    f = open(file, 'r')
    lines = f.readlines()
    f.close()
    keyboard_events = []
    for index in range(1, len(lines)):
        keyboard_events.append(keyboard.KeyboardEvent(**json.loads(lines[index])))
    starttime = float(lines[0])
    keyboard_time_interval = keyboard_events[0].time - starttime
    keyboard_time_interval /= speed
    k_thread = threading.Thread(target=lambda: time.sleep(keyboard_time_interval) == keyboard.play(keyboard_events, speed_factor=speed))
    k_thread.start()
    k_thread.join()
I am not especially new to coding or to Python, but this problem perplexes me. I've tested all the variables, and none of them are being sustained outside of the record function.
(I don't fully understand lambda, threading, or **json.loads, but I don't think that's the problem.)
What's going on here?
For extra bonus points, if this is possible to do asynchronously, that'd be amazing. One problem at a time, though.
Just in case anyone else ever has the same problem as me, just add this at the start of your code. No idea why it works, but it does.
keyboard.start_recording()
temp = keyboard.stop_recording()
You can forget about the temp variable immediately.
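For what it's worth, a rough sketch of how the workaround fits together with the functions above; my guess is that the dummy record/stop cycle forces the keyboard module to install its global hooks before play() runs, but that is only an assumption:

import keyboard

keyboard.start_recording()
temp = keyboard.stop_recording()  # result discarded; presumably this just initializes the hooks

play('record.txt')  # playback from file now works on a fresh run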

Trying to define a function that creates lists from files and uses random.choices to choose an element from the weighted lists

I'm trying to define a function that will create lists from multiple text files and print a random element from one of the weighted lists. I've managed to get the function to work with random.choice for a single list.
def test_rollitems():
    my_commons = open('common.txt')
    all_common_lines = my_commons.readlines()
    common = []
    for i in all_common_lines:
        common.append(i)
    y = random.choice(common)
    print(y)
When I tried adding a second list to the function, it wouldn't work; my program just closes when the function is called.
def Improved_rollitem():
    # create the lists from the files
    my_commons = open('common.txt')
    all_common_lines = my_commons.readlines()
    common = []
    for i in all_common_lines:
        common.append(i)
    my_uncommons = open('uncommon.txt')
    all_uncommon_lines = my_uncommons.readlines()
    uncommon = []
    for i in all_uncommon_lines:
        uncommon.apend(i)
    y = random.choices([common, uncommon], [80, 20])
    print(y)
Can anyone offer any insight into what I'm doing wrong or missing?
Never mind. I figured this out on my own! I was having issues with Geany, so I installed PyCharm and was able to work through the issue. The correct code is:
def Improved_rollitem():
    # create the lists from the files
    my_commons = open('common.txt')
    all_common_lines = my_commons.readlines()
    common = []
    for i in all_common_lines:
        common.append(i)
    my_uncommons = open('uncommon.txt')
    all_uncommon_lines = my_uncommons.readlines()
    uncommon = []
    for i in all_uncommon_lines:
        uncommon.append(i)
    y = random.choices([common, uncommon], [.8, .20])
    if y == [common]:
        for i in [common]:
            print(random.choice(i))
    if y == [uncommon]:
        for i in [uncommon]:
            print(random.choice(i))
If there's a better way to do something like this, it would certainly be cool to know though.
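For the record, one tidier way to do this (a sketch, not the poster's code): random.choices returns a list, so you can take its first element and pick a line from it directly, and with blocks close the files for you.

import random

def improved_rollitem():
    # read each file into a list of lines
    with open('common.txt') as f:
        common = f.readlines()
    with open('uncommon.txt') as f:
        uncommon = f.readlines()
    # pick one of the two lists with an 80/20 weighting, then one element from it
    chosen = random.choices([common, uncommon], weights=[80, 20])[0]
    print(random.choice(chosen))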

How to get dataset into array

I have worked through all the tutorials and searched for "load csv tensorflow" but just can't get the logic of it all. I'm not a total beginner, but I don't have much time to complete this, and I've been suddenly thrown into TensorFlow, which is unexpectedly difficult.
Let me lay it out:
Very simple CSV file of 184 columns that are all float numbers. A row is simply today's price, three buy signals, and the previous 180 days' prices.
close = tf.placeholder(float, name='close')
signals = tf.placeholder(bool, shape=[3], name='signals')
previous = tf.placeholder(float, shape=[180], name = 'previous')
This article: https://www.tensorflow.org/guide/datasets
It covers loading pretty well. It even has a section on converting to numpy arrays, which is what I need to train and test the 'net. However, as the author says in the article leading to this web page, it is pretty complex. It seems like everything is geared toward data manipulation, whereas we have already normalized our data (nothing has really changed in AI since 1983 in terms of inputs, outputs, and layers).
Here is a way to load it, but not into NumPy, and with no example of leaving the data unmanipulated.
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    with open('/BTC1.csv') as csv_file:
        csv_reader = csv.reader(csv_file, delimiter=',')
        line_count = 0
        for row in csv_reader:
            ?????????
            line_count += 1
I need to know how to get the CSV file into the
close = tf.placeholder(float, name='close')
signals = tf.placeholder(bool, shape=[3], name='signals')
previous = tf.placeholder(float, shape=[180], name = 'previous')
so that I can follow the tutorials to train and test the net.
Your question isn't entirely clear to me. If I understand correctly (tell me if I'm wrong), you are asking how to feed data into your model? There are several ways to do so.
Use placeholders with feed_dict during the session. This is the most basic and easiest one, but it often suffers from training performance issues. For further explanation, check this post.
Use queues. Hard to implement and poorly documented; I don't suggest it, because it has been superseded by the third method.
Use the tf.data API.
...
So, to answer your question with the first method:
# get your array outside the session
import csv
import numpy as np

with open('/BTC1.csv') as csv_file:
    csv_reader = csv.reader(csv_file, delimiter=',')
    # csv.reader yields strings, so cast everything to float
    dataset = np.asarray([data for data in csv_reader], dtype=np.float32)

close_col = dataset[:, 0]       # column 0: today's price
signal_cols = dataset[:, 1:4]   # columns 1-3: the three buy signals
previous_cols = dataset[:, 4:]  # columns 4-183: the previous 180 days' prices

# let's say you load 100 rows each time for training
batch_size = 100

# define placeholders as you did
...

with tf.Session() as sess:
    ...
    for i in range(number_iter):
        start = i * batch_size
        end = (i + 1) * batch_size
        sess.run(train_operation, feed_dict={close: close_col[start:end],
                                             signals: signal_cols[start:end],
                                             previous: previous_cols[start:end]})
By the third method:
# retrieve your columns like before
...
# let's say you load 100 rows each time for training
batch_size = 100

# construct your input pipeline
batch = tf.data.Dataset.from_tensor_slices((close_col, signal_cols, previous_cols))
batch = batch.shuffle(close_col.shape[0]).batch(batch_size)  # mix the data, then assemble batches
iterator = batch.make_initializable_iterator()
iter_init_operation = iterator.initializer
# get-next-batch operations, run automatically at each iteration within the session
c_it, s_it, p_it = iterator.get_next()

# replace the close, signals, previous placeholders in your model by c_it, s_it, p_it
...

with tf.Session() as sess:
    # you need to initialize the iterator
    sess.run([tf.global_variables_initializer(), iter_init_operation])
    ...
    for i in range(number_iter):
        sess.run(train_operation)
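One side note, not in the original answer: with a tf.data pipeline like this, it is usually worth chaining .prefetch(1) after .batch(...), so the next batch is prepared while the current one is training; that is what actually overlaps input preparation with the model's work.

batch = batch.shuffle(close_col.shape[0]).batch(batch_size).prefetch(1)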
Good luck!

BaseHTTPRequestHandler hangs on self.rfile.read()

I implemented a Python server using BaseHTTPRequestHandler, and it often hangs while reading from the socket file object. It doesn't seem to matter how many bytes I read: I could read 30k bytes and it won't hang, or I could read 7k bytes and it will hang. It is reading a Base64-encoded image string, so I understand if it takes a second or two to read, but it literally just hangs.
And then sometimes, when I press CTRL-C, it'll unhang and magically read everything. It's really bizarre. Any help will be appreciated. Thanks. Also, this is Python 2.7.
Code:
def do_POST(self):
    print self.rfile
    # Processing HTTP POST request data
    content_len = int(self.headers.getheader('content-length'))
    print 'Reading from HTTP header. Size: %s' % (content_len)
    # THIS IS WHERE IT HANGS
    post_body_json = self.rfile.read(content_len)
    print 'Got it. Moving on, now.'
    post_body = json.loads(post_body_json)
    image_data = post_body.get('img_string_b64', 'No Image String')
    print 'Decoding image string.'
    # Processing image data
    image_name = 'image.jpg'
    decoded_str = base64.decodestring(image_data)
    self.write_image_to_system(decoded_str, image_name)
    print 'Getting text translation.'
    opencv_handler = OpenCVHandler()
    # Get translation from OpenCV then play text audio
    text_trans = opencv_handler.get_text_translation_from_image(image_name)
    opencv_handler.play_audio_translation_from_text(text_trans)
    # Responding to the POST requester.
    # text_trans = 'Translated'
    response = text_trans
    self.send_response(200)  # OK
    self.send_header('Content-type', 'text/html')
    self.end_headers()
    self.wfile.write(response)
    return
I had the same issue with the self.rfile.read(length) call blocking on AJAX POST requests.
It was due to an earlier statement:

form = cgi.FieldStorage(
    fp=self.rfile,
    headers=self.headers,
    environ={'REQUEST_METHOD': 'POST'}
)

Once that was removed, it worked.
I hope this helps.
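To spell out the failure mode (my reading of the answer above, illustrated on a stripped-down, hypothetical handler): cgi.FieldStorage parses the form by reading the request body from self.rfile, so a later read(content_len) waits for content_len bytes that have already been consumed and blocks until the client hangs up or the process is interrupted.

def do_POST(self):
    # parsing the form here already reads the POST body from self.rfile
    form = cgi.FieldStorage(fp=self.rfile,
                            headers=self.headers,
                            environ={'REQUEST_METHOD': 'POST'})
    content_len = int(self.headers.getheader('content-length'))
    post_body = self.rfile.read(content_len)  # hangs: those bytes are gone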
I had a similar problem. I found out I had to convert the data to a string, like: post_body_json = str(self.rfile.read(content_len)). That kept it from hanging there.

Converting a while loop into a function? Python

In order to condense my code, I am trying to turn one of my while loops into a function. I have tried numerous times and have yet to get the same result when running it as I do by just leaving the while loop in place.
Here's the while loop:
while True:
    i = find_lowest_i(logs)
    if i == -1:
        break
    print "i=", i
    tpl = logs[i].pop(0)
    print tpl
    out.append(tpl)
    print out
And here's what I have so far for my function:
def mergesort(list_of_logs):
    i = find_lowest_i(logs)
    out = []
    while True:
        if i == -1:
            break
        print "i=", i
        tpl = logs[i].pop(0)
        print tpl
        out.append(tpl)
        print out
    return out
Thanks in advance. This place is a safe-haven for a beginner programmer.
It looks like the parameter to your function is list_of_logs, but you're still using logs inside the function's body. The simplest fix is probably to rename the parameter from list_of_logs to logs. One more thing: in your function, i is computed once, before the loop, whereas the original while loop recomputes find_lowest_i(logs) on every iteration; move that assignment back inside the loop, or i will never change.
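Putting both fixes together, a corrected version might look like this (a sketch; find_lowest_i and the Python 2 prints are kept from the question):

def mergesort(logs):
    out = []
    while True:
        i = find_lowest_i(logs)  # recompute the lowest index on every pass
        if i == -1:
            break
        print "i=", i
        tpl = logs[i].pop(0)
        print tpl
        out.append(tpl)
        print out
    return out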