I have an async generator that I'm trying to pass into a keras model fit_generator, but the async generator returns an object, not a generator.
I've tried googling, but I haven't found a solution. This seems to be a pretty specific problem.
It is the intention of asyncio to keep async generators distinct from regular generators; see the answer here.
However, if you have decided that you won't run the async generator concurrently elsewhere and your only goal is to avoid RAM overflow, you can convert the async generator to a regular one by manually iterating it and awaiting each new item:
import asyncio


async def my_gen():
    for i in range(10):
        yield i
        await asyncio.sleep(0.5)


def to_sync_generator(ait):
    loop = asyncio.get_event_loop()
    try:
        while True:
            try:
                # Ask the async generator for its next item and block until it's ready.
                coro = ait.__anext__()
                res = loop.run_until_complete(coro)
            except StopAsyncIteration:
                return
            else:
                yield res
    finally:
        # Make sure the async generator is finalized properly.
        coro = loop.shutdown_asyncgens()
        loop.run_until_complete(coro)


# Check:
if __name__ == '__main__':
    for i in to_sync_generator(my_gen()):
        print(i)
P.S. I didn't test the code much.
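As a rough, untested illustration of the original use case: the converted generator can then be handed straight to fit_generator. Here model and batches() are placeholders, assumed to be a compiled Keras model and an async generator yielding (x_batch, y_batch) tuples respectively.

# Placeholder usage: `model` and `batches()` are assumptions, not code from above.
model.fit_generator(to_sync_generator(batches()),
                    steps_per_epoch=100,
                    epochs=10)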
I am new to MongoDB and the Scala language. I am using Scala to connect to MongoDB locally, with the following dependency:
// https://mvnrepository.com/artifact/org.mongodb.scala/mongo-scala-driver
libraryDependencies += "org.mongodb.scala" %% "mongo-scala-driver" % "4.2.3"
What I tried:
object Demo extends App {
  val mongoClient: MongoClient = MongoClient("mongodb://127.0.0.1:27017/")
  val database: MongoDatabase = mongoClient.getDatabase("DemoDB")
  println(database)
  val collection: MongoCollection[Document] = database.getCollection("demodata")
  val observable = collection.find()
}
The above code returns the data in the following format:
FindObservable(com.mongodb.reactivestreams.client.internal.FindPublisherImpl@6253c26)
I also tried:
observable.subscribe(new Observer[Document] {
  override def onNext(result: Document): Unit = println(result.toJson())
  override def onError(e: Throwable): Unit = println("Failed" + e.getMessage)
  override def onComplete(): Unit = println("Completed")
})
I also tried the printResult() and printHeadResult() methods, but none of these approaches is working.
Please help; thanks in advance.
The Mongo Scala driver works in a non-blocking manner by returning Observables, which need to be subscribed to in order to consume the published data.
When you subscribe to the observable like the following,
object Demo extends App {
  val mongoClient: MongoClient = MongoClient("mongodb://127.0.0.1:27017/")
  val database: MongoDatabase = mongoClient.getDatabase("DemoDB")
  println(database)
  val collection: MongoCollection[Document] = database.getCollection("demodata")
  val observable = collection.find()

  observable.subscribe(new Observer[Document] {
    override def onNext(result: Document): Unit = println(result.toJson())
    override def onError(e: Throwable): Unit = println("Failed" + e.getMessage)
    override def onComplete(): Unit = println("Completed")
  })
}
your code does not wait for the observable to actually publish anything; it just finishes right after subscribing, hence you don't get anything.
You can either add something like a Thread.sleep(5000) at the end to block and give the observable some time to (hopefully finish and) publish the data.
Or, you can add val resultSeq = observable.collect to block and collect all of the published data in a single Seq.
I found this link; it works for the printResult() and printHeadResult() methods:
Printing query results from Mongodb in Scala using mongo-scala-driver
I'm developing a simple window that performs some operations when it is closed. Here is an extract of my code:
from javax.swing import *
from java.awt import *
from java.awt.event import *
from java.io import *
import javax.swing.table.DefaultTableModel as DefaultTableModel
class registro(JFrame):
    def __init__(self):
        super(registro, self).__init__()
        self.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE)
        self.setExtendedState(JFrame.MAXIMIZED_BOTH)
        #[...]
        headers = ('Data e orario',
                   'Personale UO Q&A',
                   'Tipologia di attività '.decode('utf-8'),
                   'Personale incontrato con strutture di appartenenza',
                   'Note')
        self.model = DefaultTableModel([["", "", "", "", ""]], headers)
        self.table = JTable(self.model)
        #[...]
        self.addWindowListener(self.onClose())
        #[...]

    def onClose(self):
        class saver(WindowAdapter):
            tableModel = self.model

            def windowClosing(self, event):
                print tableModel  # HERE IS THE ERROR!!!!!!!!!

        return saver()
The error reported on the highlighted line is the following:
NameError: global name 'tableModel' is not defined
Although I assigned the variable inside the listener class (to avoid confusion between the two selfs), I don't understand why it is not recognized. I'm almost a novice with object-oriented programming and Swing windows in Jython, and I hope this is not a (very) serious shortcoming on my part!
Many thanks in advance.
There's a fairly subtle scope issue here, which is mostly about Python syntax, but also about what code you want to have access to the tableModel. The tableModel variable is not visible by default because you are inside the onClose() function. A defensive solution to this is to explicitly pass the needed variable into the new saver object. I personally prefer this as it more explicitly declares the inputs for saver objects.
class WindowAdapter:
    None


class App:
    def __init__(self):
        self.model = 'DUMMYMODEL'

    def onClose(self):
        class Saver(WindowAdapter):
            def __init__(self, tableModel):
                WindowAdapter.__init__(self)
                self.tableModel = tableModel

            def windowClosing(self, event):
                print(self.tableModel)

        return Saver(self.model)


if __name__ == '__main__':
    app = App()
    sv = app.onClose()
    sv.windowClosing(event=None)
(This code is cut down and in pure Python to show it is largely scoping related.)
An alternative would be using the Python global keyword to expose the tableModel variable to all lower scopes.
class WindowAdapter:
    None


class App:
    def __init__(self):
        self.model = 'DUMMYMODEL'

    def onClose(self):
        global tableModel
        tableModel = self.model

        class Saver(WindowAdapter):
            def windowClosing(self, event):
                print(tableModel)

        return Saver()


if __name__ == '__main__':
    app = App()
    sv = app.onClose()
    sv.windowClosing(event=None)
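Another variant, not from the original answer, relies on closures: a method defined inside the nested class can read a local variable of onClose directly, because functions nested inside a function still close over that function's locals. Only names assigned in the class body itself are out of reach this way, which is exactly why the original tableModel = self.model class attribute was not visible to windowClosing. A minimal sketch, using the same dummy stand-ins as above:

class WindowAdapter:
    None


class App:
    def __init__(self):
        self.model = 'DUMMYMODEL'

    def onClose(self):
        model = self.model  # plain local, captured by the closure below

        class Saver(WindowAdapter):
            def windowClosing(self, event):
                print(model)  # resolved through onClose's enclosing scope

        return Saver()


if __name__ == '__main__':
    app = App()
    sv = app.onClose()
    sv.windowClosing(event=None)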
I have several functions such as
def plot_lines(...):
def plot_setup():
def BP4_avg(...):
which all work fine, but when I add a calling function main() it breaks:
def main():
    ...
    plot_setup()
    BP4_avg(...)
    plt.show()


if __name__ == "__main__":
    main()
Any ideas?
If I remove main() and just have
plot_setup()
BP4_avg(...)
plt.show()
the program works.
Thanks
In the first version you're just defining the functions but you are not calling them, so everything works fine.
In the second version (the one with main()) you are actually executing these functions, and one of them breaks...
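To illustrate the distinction with a runnable toy (the bodies of plot_setup and BP4_avg below are invented stand-ins, not the original code): defining functions can never raise, so an error only shows up once main() actually calls them, and the resulting traceback points at whichever helper is failing.

import matplotlib.pyplot as plt


def plot_setup():
    # stand-in body; the real function presumably configures the figure
    plt.figure()


def BP4_avg(values):
    # stand-in body; plots a running average of the given values
    averages = [sum(values[:i + 1]) / float(i + 1) for i in range(len(values))]
    plt.plot(averages)


def main():
    plot_setup()
    BP4_avg([1, 2, 3, 4])
    plt.show()


if __name__ == "__main__":
    main()  # nothing above executes until this call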
Here's the code that illustrates the problem:
from PyQt4 import QtGui
app = QtGui.QApplication([])
dialog = QtGui.QDialog()
button = QtGui.QPushButton('I crash')
layout = QtGui.QHBoxLayout()
layout.addWidget(button)
dialog.setLayout(layout)
def crash(): raise Exception('Crash!')
button.clicked.connect(crash)
button.click()
print 'I should not happen'
When I run that, PyQt4 handles the error for me. My console displays a stack trace with 'Crash!' etc. in it, and I see 'I should not happen'.
This is not useful, because I've been handed a large application with many handlers, and I need to force all their errors up into my face (and into my - ahem - automated tests). Each time I run, errors escape my nets, and they would require excessive and useless try:except blocks, inside every handler, just to catch them all.
Put another way, I want good code to be very good, and bad code to be very bad. Not whitewashed.
Apologies if this is already asked, but when I e-search for it, I naturally get thousands of newbies asking basic error handling questions (or, worse, I get newbies asking how to turn OFF their wayward exceptions!;)
How do I override PyQt4's default error handling, so I can propagate or log errors myself? And please don't answer sys.excepthook, either - it catches the errors that PyQt4 doesn't catch.
This is not the answer, you silly website. Stop forcing us to fit into a preconceived notion of an idealized thread template.
The un-answer is to use my test framework setUp() to hook in an exception handler:
def setUp(self):
    self.no_exceptions = True

    def testExceptionHook(type, value, tback):
        self.no_exceptions = False
        sys.__excepthook__(type, value, tback)

    sys.excepthook = testExceptionHook
Where that says "self.no_exceptions = False", I would much rather simply say self.fail(''). However, because Python's unit test library insists on throwing exceptions just to register test failures, and because PyQt insists on snarfing all exceptions, we have a deadlock.
To fit into unittest.TestCase's silly preconceived notion of an idealized test case, I have to instead set a variable, then detect it in the teardown:
def tearDown(self):
    self.assertTrue(self.no_exceptions)
This is still not ideal, but at least it will force me to spend more time paying attention to the errors, instead of spending that time complaining about them on technical websites.
The root question: How to turn off PyQt's magic error handler? - remains unanswered...
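For reference, here is the same idea assembled into one self-contained TestCase. This is an untested sketch, and restoring the previous hook in tearDown is an addition of mine rather than part of the snippets above.

import sys
import unittest


class HandlerExceptionTest(unittest.TestCase):
    def setUp(self):
        self.no_exceptions = True
        self._old_hook = sys.excepthook

        def testExceptionHook(type, value, tback):
            # Record the failure, but still print the traceback.
            self.no_exceptions = False
            sys.__excepthook__(type, value, tback)

        sys.excepthook = testExceptionHook

    def tearDown(self):
        sys.excepthook = self._old_hook  # restore the original hook
        self.assertTrue(self.no_exceptions)

    def test_handlers(self):
        pass  # exercise the PyQt4 handlers here


if __name__ == '__main__':
    unittest.main()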
I think the answer is that this isn't a 'feature' of PyQt, but a consequence inherent to the design that lets signals/slots work (remember that the signal/slot communication goes through a C++ layer as well).
This is ugly, but it does a bit of an end-run around your problem:
from PyQt4 import QtGui
import sys   # needed for sys.excepthook below
import time

app = QtGui.QApplication([])


class exception_munger(object):
    def __init__(self):
        self.flag = True
        self.txt = ''
        self.type = None

    def indicate_fail(self, etype=None, txt=None):
        self.flag = False
        if txt is not None:
            self.txt = txt
        self.type = etype

    def reset(self):
        tmp_txt = self.txt
        tmp_type = self.type
        tmp_flag = self.flag
        self.flag = True
        self.txt = ''
        self.type = None
        return tmp_flag, tmp_type, tmp_txt


class e_manager():
    def __init__(self):
        self.old_hook = None

    def __enter__(self):
        em = exception_munger()

        def my_hook(type, value, tback):
            # Record the exception, then fall back to the default handler.
            em.indicate_fail(type, value)
            sys.__excepthook__(type, value, tback)

        self.old_hook = sys.excepthook
        sys.excepthook = my_hook
        self.em = em
        return self

    def __exit__(self, *args, **kwargs):
        sys.excepthook = self.old_hook


def mang_fac():
    return e_manager()


def assert_dec(original_fun):
    def new_fun(*args, **kwargs):
        with mang_fac() as mf:
            res = original_fun(*args, **kwargs)
        # Re-raise anything the hook recorded while the slot was running.
        flag, etype, txt = mf.em.reset()
        if not flag:
            raise etype(txt)
        return res
    return new_fun


@assert_dec
def my_test_fun():
    dialog = QtGui.QDialog()
    button = QtGui.QPushButton('I crash')
    layout = QtGui.QHBoxLayout()
    layout.addWidget(button)
    dialog.setLayout(layout)

    def crash():
        time.sleep(1)
        raise Exception('Crash!')

    button.clicked.connect(crash)
    button.click()


my_test_fun()
print 'should not happen'
This will not print 'should not happen' and gives you something to catch with your automated tests (with the correct exception type).
In [11]: Traceback (most recent call last):
File "/tmp/ipython2-3426rwB.py", line 68, in crash
Exception: Crash!
---------------------------------------------------------------------------
Exception Traceback (most recent call last)
<ipython-input-11-6ef4090ab3de> in <module>()
----> 1 execfile(r'/tmp/ipython2-3426rwB.py') # PYTHON-MODE
/tmp/ipython2-3426rwB.py in <module>()
/tmp/ipython2-3426rwB.py in new_fun(*args, **kwargs)
Exception: Crash!
In [12]:
The stack trace is jacked up, but you can still read the first one that was printed out.
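To tie this back to automated tests, here is a rough, untested sketch of driving the decorator above from unittest. The import line is a placeholder, since assert_dec would have to live in some module of your own; qt_exception_tools is a hypothetical name.

import unittest

from PyQt4 import QtGui

# Placeholder import: assumes assert_dec from the code above was moved into
# a module of your own (the name qt_exception_tools is hypothetical).
from qt_exception_tools import assert_dec

app = QtGui.QApplication([])  # widgets need a QApplication before creation


class SlotErrorTest(unittest.TestCase):
    def test_slot_exception_is_reraised(self):
        @assert_dec
        def exercise():
            button = QtGui.QPushButton('I crash')

            def crash():
                raise Exception('Crash!')

            button.clicked.connect(crash)
            button.click()  # PyQt swallows the slot's exception...

        # ...but assert_dec re-raises what the hook recorded.
        self.assertRaises(Exception, exercise)


if __name__ == '__main__':
    unittest.main()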
I'm learning Backbone.js and Flask (and Flask-SQLAlchemy). I chose Flask because I read that it plays well with Backbone for implementing RESTful interfaces. I'm currently following a course that uses (more or less) this model:
class Tasks(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    title = db.Column(db.String(80), unique=True)
    completed = db.Column(db.Boolean, unique=False, default=False)

    def __init__(self, title, completed):
        self.title = title
        self.completed = completed

    def json_dump(self):
        return dict(title=self.title, completed=self.completed)

    def __repr__(self):
        return '<Task %r>' % self.title
I had to add a json_dump method in order to send JSON to the browser. Otherwise, I would get errors like object is not JSON serializable, so my first question is:
Is there a better way to do serialization in Flask? It seems that some objects are serializable but others aren't, but in general, it's not as easy as I expected.
After a while, I ended up with the following views to take care of each type of request:
@app.route('/tasks')
def tasks():
    tasks = Tasks.query.all()
    serialized = json.dumps([c.json_dump() for c in tasks])
    return serialized


@app.route('/tasks/<id>', methods=['GET'])
def get_task(id):
    tasks = Tasks.query.get(int(id))
    serialized = json.dumps(tasks.json_dump())
    return serialized


@app.route('/tasks/<id>', methods=['PUT'])
def put_task(id):
    task = Tasks.query.get(int(id))
    task.title = request.json['title']
    task.completed = request.json['completed']
    db.session.add(task)
    db.session.commit()
    serialized = json.dumps(task.json_dump())
    return serialized


@app.route('/tasks/<id>', methods=['DELETE'])
def delete_task(id):
    task = Tasks.query.get(int(id))
    db.session.delete(task)
    db.session.commit()
    serialized = json.dumps(task.json_dump())
    return serialized


@app.route('/tasks', methods=['POST'])
def post_task():
    task = Tasks(request.json['title'], request.json['completed'])
    db.session.add(task)
    db.session.commit()
    serialized = json.dumps(task.json_dump())
    return serialized
In my opinion, it seems a bit verbose. Again, what is the proper way to implement them? I have seen some extensions that offer RESTful interfaces in Flask but those look quite complex to me.
Thanks
I would use a module to do this, honestly. We've used Flask-Restless for some APIs, you might take a look at that:
https://flask-restless.readthedocs.org/en/latest/
However, if you want to build your own, you can use SQLAlchemy's introspection to output your objects as key/value pairs.
http://docs.sqlalchemy.org/en/rel_0_7/core/schema.html#metadata-reflection
Something like this, although I always have to triple-check I got the syntax right, so take this as a guide more than working code.
@app.route('/tasks')
def tasks():
    tasks = Tasks.query.all()
    output = []
    for task in tasks:
        row = {}
        for field in Tasks.__table__.c:
            # Look the value up by the column's name on the model instance.
            row[field.name] = getattr(task, field.name, None)
        output.append(row)
    return jsonify(data=output)
I found this question which might help you more. I'm familiar with SQLAlchemy 0.7 and it looks like 0.8 added some nicer introspection techniques:
SQLAlchemy introspection
Flask provides the jsonify function to do this. Check out how it works here.
Your json_dump method is fine, though the code can be made more concise. See this snippet:
@app.route('/tasks')
def tasks():
    tasks = Tasks.query.all()
    return jsonify(data=[c.json_dump() for c in tasks])
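The single-item views can be condensed the same way. A brief, untested sketch, assuming the Tasks model, json_dump helper, app, and db from the question:

from flask import jsonify, request


@app.route('/tasks/<int:id>', methods=['GET'])
def get_task(id):
    # get_or_404 returns a 404 response instead of None for a missing row
    task = Tasks.query.get_or_404(id)
    return jsonify(data=task.json_dump())


@app.route('/tasks', methods=['POST'])
def post_task():
    task = Tasks(request.json['title'], request.json['completed'])
    db.session.add(task)
    db.session.commit()
    return jsonify(data=task.json_dump())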