I have an application which needs to connect to an RDS (Postgres) proxy with IAM authentication. It makes use of the create_app method.
def create_app():
    connex_app = connexion.App(__name__, specification_dir=base_apispec_dir)
    connex_app.add_api("swagger.yaml", strict_validation=True)
    app = connex_app.app
    app.config.from_object(get_configuration())
    ma.init_app(app)
    db.init_app(app)
    return connex_app
In this post there is an example of how to do this with SQLAlchemy, but how do we connect with Flask-SQLAlchemy? In the example they use the @event.listens_for() decorator from SQLAlchemy, but for that I need the engine, which I do not have.
It is possible to get it from the SQLAlchemy object, but this gives the following error: "No application found. Either work inside a view function or push an application context."
Does anyone know how to make this connection work so the IAM token can be refreshed every time it expires, or just before that?
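One approach that should work (this is only a sketch: the do_connect event and the boto3 generate_db_auth_token call are real APIs, but the DB_HOST, DB_PORT, and DB_USER config keys are assumptions): push an application context inside create_app so Flask-SQLAlchemy will hand over its engine, then register a listener on that engine so a fresh IAM token is generated for every new connection the pool opens.
import boto3
from sqlalchemy import event

def create_app():
    connex_app = connexion.App(__name__, specification_dir=base_apispec_dir)
    connex_app.add_api("swagger.yaml", strict_validation=True)
    app = connex_app.app
    app.config.from_object(get_configuration())
    ma.init_app(app)
    db.init_app(app)

    # Flask-SQLAlchemy only exposes its engine inside an app context,
    # which is what the "No application found" error is complaining about.
    with app.app_context():
        engine = db.engine
        rds = boto3.client("rds")  # region/credentials come from the environment

        @event.listens_for(engine, "do_connect")
        def provide_iam_token(dialect, conn_rec, cargs, cparams):
            # Runs for every new DB connection, so an expired token is
            # simply replaced the next time the pool connects.
            cparams["password"] = rds.generate_db_auth_token(
                DBHostname=app.config["DB_HOST"],   # assumed config keys
                Port=app.config.get("DB_PORT", 5432),
                DBUsername=app.config["DB_USER"],
            )

    return connex_app
Connections that are already open keep working after the token expires, since RDS only checks the token at connect time, so regenerating it in the listener should be enough to cover expiry.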
I'm a real newbie at DB programming and Next.js.
I have a Next.js API server that receives JSON data and should INSERT it into MySQL with Prisma.
According to the Prisma documentation, the code for inserting a new record is as follows:
const result = await prisma.[db_name].create({data: [json_data_name]});
for the PUT data [json_data_name].
For every HTTP PUT request the DB connection count increased, and after about 500 inserts the problem "Too many connections..." broke out.
I don't think the issue is a limit of 500 inserted records or so;
I think the Prisma call prisma.[db_name].create makes a new connection each time.
How can I insert 2000-3000 HTTP PUT payloads into a MySQL DB with Prisma in a Next.js API server?
const prisma = new PrismaClient();
The code above was moved out of the API handler, and then the problem was gone!
For every call of the API handler, a new Prisma client was created and used.
That means a new connection to the DB was created each time.
Thank you.
I'm new to using Svelte and would like to create an ordering website using Svelte. I know that I will need a database to keep track of the orders, customer names, prices, etc. I have used MySQL before, but I haven't learned how to connect a database to a website.
Is there a specific database that you can use if you are using Svelte?
Or is there a way to connect MySQL to Svelte?
I have searched for this on YouTube and Google, but I'm not sure whether it's different when you're using Svelte, so I wanted to make sure.
Note: I have not started this project yet, so I do not have any code to show; I just want to know how you can connect a database if you're using Svelte.
Svelte is a front-end JavaScript framework that runs in the browser.
Traditionally, in order to use a database like MySQL from a front-end project such as a Svelte app (which contains only HTML, CSS, and JS), you would have to do it through a separate backend project. You then let the Svelte app and the backend project communicate over a REST API. The same applies to other front-end libraries/frameworks like React, Angular, Vue, etc.
There are still many ways to achieve this. Since you are focusing on Svelte, here are a few options:
1. Sapper
Sapper is an application framework powered by Svelte. You can also write backend code using Express or Polka so that you can connect to the database of your choice (MySQL / MongoDB).
2. Use a serverless database
If you want to keep your app simple and just focus on the Svelte app, you can use a cloud-based database such as Firebase. Svelte can talk to it directly via its JavaScript SDK.
3. Monolithic architecture
To connect to MySQL in the backend, you would need to use a server-side language such as Node.js (Express), PHP, or Python, or whatever you are familiar with. You can then embed the Svelte app or use an API to pass data to the Svelte app (see the sketch after this list).
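If you go with option 3 and a Python backend, the server side could look roughly like the sketch below. Everything here is an assumption for illustration: the choice of Flask and the mysql-connector-python driver, the orders table with its columns, and the connection details.
import mysql.connector
from flask import Flask, jsonify

app = Flask(__name__)

def get_connection():
    # Placeholder credentials; read these from config or environment variables.
    return mysql.connector.connect(
        host="localhost", user="shop", password="secret", database="shop"
    )

@app.route("/api/orders")
def list_orders():
    conn = get_connection()
    cursor = conn.cursor(dictionary=True)
    cursor.execute("SELECT id, customer_name, price FROM orders")
    rows = cursor.fetchall()
    cursor.close()
    conn.close()
    return jsonify(rows)

if __name__ == "__main__":
    app.run(port=5000)
The Svelte app then simply calls fetch('/api/orders') and renders the JSON it gets back.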
I can give an example with MongoDB.
You have to install the library:
npm install mongodb
or add it to package.json.
Then you have to make a connection file that you call every time you need to use the DB:
import mongo from "mongodb";

let client = null;
let db = null;

// Reuse a single client across calls instead of reconnecting every time.
export async function init() {
    if (!client) {
        client = await mongo.MongoClient.connect("mongodb://localhost");
        db = client.db("name-of-your-db");
    }
    return { client, db };
}
For a complete example with an insert, you can see this video:
https://www.youtube.com/watch?v=Mey2KZDog_A
You can use PouchDB, which gives you direct access to IndexedDB in the browser. No backend is needed for this.
The client-side PouchDB can then be replicated/synced with a remote CouchDB. This can all be done inside your Svelte app from the client side.
It is pretty easy to set up:
import PouchDB from 'pouchdb';

var db = new PouchDB('dbname');

// Create or update a document.
db.put({
    _id: 'dave@gmail.com',
    name: 'David',
    age: 69
});

// React to changes in the local database.
db.changes().on('change', function() {
    console.log('Ch-Ch-Changes');
});

// Replicate to a remote CouchDB.
db.replicate.to('http://example.com/mydb');
More on pouchdb.com.
The client can also save the data offline first and connect to a remote database later.
As I read it, the question is mostly about connecting to a backend, not a database. It is a pity, but the Svelte app template has no way to connect to a backend out of the box.
As for me, I'm using Express middleware in front of the Rollup dev server. This way you are able to proxy some requests to a backend server. Check the code below:
const proxy = require('express-http-proxy');
const app = require('express')();
app.use('/data/', proxy(
    'http://backend/data',
    {
        proxyReqPathResolver: req => {
            return '/data' + req.url;
        }
    }
));
app.use('/', proxy('http://127.0.0.1:5000'));
app.listen(5001);
This script opens port 5001, where the /data/ URL is proxied to the backend server, while port 5000 is still served by the Rollup dev server. So at http://localhost:5001/ you have the Svelte instance connected to the backend via the /data/ URL, where you can send requests to fetch data from the database.
I have a project that is connected to an external database, and I only have view access to it. So I created my models with the managed=False flag.
I was wondering how I can find out in Django that a change has happened in that database. Is there a solution for this in Django, or should I find a way for that database and my Django app to communicate, e.g. sockets, database triggers, etc.?
More details:
Imagine my model is like this:
class Alert(models.Model):
    key = models.CharField(max_length=20)

    class Meta:
        managed = False
Now I want to be notified in Django each time that database is updated. Is there a signal that can capture database updates so I can do something in Django?
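If the external database happens to be PostgreSQL and its owner can add a trigger that runs NOTIFY when the watched table changes, one option is to LISTEN on that channel and turn every notification into a custom Django signal. This is only a sketch; the alert_changes channel, the dsn argument, and the alert_changed signal are all hypothetical names.
import select
import psycopg2
import psycopg2.extensions
import django.dispatch

# Custom signal the rest of the Django app can connect receivers to.
alert_changed = django.dispatch.Signal()

def listen_for_changes(dsn):
    conn = psycopg2.connect(dsn)
    conn.set_isolation_level(psycopg2.extensions.ISOLATION_LEVEL_AUTOCOMMIT)
    cur = conn.cursor()
    cur.execute("LISTEN alert_changes;")
    while True:
        # Wait up to 5 seconds for a notification, then loop again.
        if select.select([conn], [], [], 5) == ([], [], []):
            continue
        conn.poll()
        while conn.notifies:
            notify = conn.notifies.pop(0)
            # Hand the update over to Django-side receivers.
            alert_changed.send(sender=None, payload=notify.payload)
You could run this loop from a management command. If triggers cannot be added because you only have view access, polling a timestamp column on a schedule is the usual fallback.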
So I have a Flask-SocketIO application started with Gunicorn using the worker class geventwebsocket.gunicorn.workers.GeventWebSocketWorker and 1 worker. I have found that in a particular case I get a [CRITICAL] WORKER TIMEOUT and the API dies (there are probably other cases). I was able to reproduce the issue by doing this:
def test_get_all(self):
    pool = ThreadPool(3)
    entity_route = [self.API_ROUTE, self.API_ROUTE, self.API_ROUTE]
    pool.map(self.get_entity, entity_route)

def get_entity(self, route):
    rest_client.post(route, json={
        "email": DEFAULT_USER_EMAIL,
        "password": DEFAULT_USER_PASSWORD
    }, status_code=200).json()
So I am calling API_ROUTE 3 times in parallel. Inside the controller of API_ROUTE I make a function call that updates a field of an entity in the PSQL database and of a document in the Elasticsearch instance. At that moment the API freezes, and after the default 30-second timeout it dies. If I comment out the call to ES or to PSQL, it passes with no problem. I tried the eventlet worker class and that fixed it, but then other routes failed because the responses seemed to get mixed up. So I am not sure which worker class to use, because I need the WebSocket functionality.
I also tried to use a lock around the function that calls Elasticsearch and PSQL, but it still fails. Something like this:
from gevent.threading import Lock

lock = Lock()

lock.acquire()
try:
    entity.update(**data)
finally:
    lock.release()
If someone could point me to how to set up monitoring of the greenlets with Gunicorn, and also explain to me what is happening, that would be great. Also, the application is running in Docker and Kubernetes (minikube locally).
Thank you !!
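One likely cause, though this is an assumption since the question does not show the database driver: the gevent worker can only switch greenlets on I/O that has been monkey-patched, and psycopg2 blocks at the C level, so a slow or unpatched Postgres call freezes every greenlet in the single worker until Gunicorn's 30-second timeout kills it. A sketch of making psycopg2 cooperative via the psycogreen package, wired into a Gunicorn config file (the file name and log line are illustrative):
# gunicorn_conf.py
from psycogreen.gevent import patch_psycopg

def post_fork(server, worker):
    # Make psycopg2 yield to the gevent hub instead of blocking the
    # whole worker while it waits on PostgreSQL.
    patch_psycopg()
    worker.log.info("psycopg2 patched for gevent cooperation")
Start Gunicorn with -c gunicorn_conf.py so the hook runs in each worker; HTTP calls to Elasticsearch should already yield once the worker has monkey-patched the standard library.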
We have a system with a master and multiple slaves.
Currently everything happens on the master, and the slaves are just there for backup.
We use CodeIgniter as our development platform.
Now we have decided to use the slaves for the read queries and the master for the write queries.
I have been told that this is not doable without modifying the source code, because a proxy can't know the type of the query.
Any idea how to proceed with this without causing too much damage to a perfectly working system?
We will use this: http://dev.mysql.com/downloads/mysql-proxy/
It does exactly what we want.
More info here:
http://jan.kneschke.de/2007/8/1/mysql-proxy-learns-r-w-splitting/
http://www.infoq.com/news/2007/10/mysqlproxyrwsplitting
http://archive.oreilly.com/pub/a/databases/2007/07/12/getting-started-with-mysql-proxy.html
This is something I was also looking into. A few months back I did something like this, but with 3 web servers in front of master/slave MySQL servers. The first web server had mod_proxy enabled to redirect requests to a read server and a write server; all requests come to this first server. If a POST, PUT, or DELETE request comes in, it goes to the write server; all GET or normal requests go to the read server.
Here you can find the mod_proxy settings which I used:
http://pastebin.com/a30BRHFq
Here you can read about load balancing:
http://www.rackspace.com/knowledge_center/article/simple-load-balancing-with-apache
I'm still looking for a better solution with less hardware involved.
I figured out another solution through CI: create two database connections in the database.php file. Keep the slave MySQL server as the default database connection and add another connection for the write-only server.
You can use this base model to extend:
https://github.com/jamierumbelow/codeigniter-base-model
You need to extend your models with this model. It has functionality for callbacks before and after insert, update, delete, and get queries; you only need to add one custom method or callback, change_db_group.
// this method goes in MY_Model
function change_db_group()
{
    $this->_database = $this->load->database('writedb', TRUE);
}
Now your example model:
class Example_Model extends MY_Model {
    protected $_table = 'example_table';
    protected $before_create = array('change_db_group');
    protected $before_update = array('change_db_group');
    protected $before_delete = array('change_db_group');
}
Your database connection will be changed before executing insert, update, or delete queries.