I have written a program that works with the MySQL database powering my website, and I recently bought some reseller hosting. However, the hosting company has restricted external access to MySQL on the shared server, so I was going to set up an external MySQL database on another server that can be accessed remotely. To do this I need a PHP file on my reseller server that can connect to both the local database and the remote database and sync them when the application requests it via a URL.
Does anyone know the best method to achieve this?
Try using SQLyog's Database synchronization tool
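If you would rather script it yourself than use a tool, the PHP-file approach described in the question could look roughly like the sketch below. It is a minimal one-way sync only: the items table, its columns, the one-day change window, and the shared-secret token are all assumptions, and error handling is omitted.

    <?php
    // sync.php - minimal one-way sync, triggered via a URL such as
    //   https://example.com/sync.php?token=...
    // Assumes a table `items` with an `id` primary key and an
    // `updated_at` timestamp column; adjust to your real schema.

    // Reject requests that do not carry the shared secret.
    if (($_GET['token'] ?? '') !== 'CHANGE_ME_SECRET') {
        http_response_code(403);
        exit('Forbidden');
    }

    // Connect to the local (shared-host) and remote databases.
    $local  = new mysqli('localhost', 'local_user', 'local_pass', 'local_db');
    $remote = new mysqli('remote.example.com', 'remote_user', 'remote_pass', 'remote_db');

    // Copy rows changed in the last day from local to remote.
    // REPLACE INTO overwrites rows that already exist with the same primary key.
    $rows = $local->query(
        "SELECT id, name, updated_at FROM items
         WHERE updated_at > NOW() - INTERVAL 1 DAY"
    );

    $stmt = $remote->prepare(
        "REPLACE INTO items (id, name, updated_at) VALUES (?, ?, ?)"
    );

    $count = 0;
    while ($row = $rows->fetch_assoc()) {
        $stmt->bind_param('iss', $row['id'], $row['name'], $row['updated_at']);
        $stmt->execute();
        $count++;
    }

    echo "Synced $count rows";

A true two-way sync would also need change tracking on both sides and some conflict-resolution rule, which is exactly the kind of complexity a dedicated tool handles for you.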
I want to develop a desktop application (in VB) and connect it to an online database (MySQL). I already have my hosting site, so I thought: why not create a MySQL database on my domain (which, by the way, is included in the pack) and connect my desktop application to this database?
Why an online database? Because I want to develop a mobile application that keeps me informed of any changes in my business without having to go to my office.
The problem is that my VB6 application cannot connect to my MySQL database because of restrictions on the hosting domain. I wonder how the thousands of apps that work with online data do it. What is their magic solution? Or did I miss something?
Any ideas, please? I have worked with VB6 and a local database, but that is no longer productive.
You can create a simple PHP file on your server that takes the data from your VB6 app and stores it in your database. You can also create a simple PHP file that queries your database and returns the results in a format your app can read. This is the typical way for a VB6 app to interact with a server-side database.
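As a rough illustration of that pattern (the orders table, the action names, and the credentials are made up, and authentication and validation are left out), the PHP side might look like this; the VB6 app would call it over HTTP and parse the response:

    <?php
    // api.php - minimal sketch of a PHP bridge between a VB6 app and MySQL.
    // The VB6 app calls this over HTTP instead of opening a direct
    // MySQL connection.

    $db = new mysqli('localhost', 'db_user', 'db_pass', 'my_db');

    $action = $_GET['action'] ?? '';

    if ($action === 'add_order') {
        // The VB6 app POSTs form fields: customer, amount.
        $stmt = $db->prepare("INSERT INTO orders (customer, amount) VALUES (?, ?)");
        $customer = $_POST['customer'] ?? '';
        $amount   = (float)($_POST['amount'] ?? 0);
        $stmt->bind_param('sd', $customer, $amount);
        $stmt->execute();
        echo 'OK';

    } elseif ($action === 'list_orders') {
        // Return rows as JSON; the client parses this.
        $result = $db->query("SELECT id, customer, amount FROM orders ORDER BY id DESC");
        $rows = [];
        while ($row = $result->fetch_assoc()) {
            $rows[] = $row;
        }
        header('Content-Type: application/json');
        echo json_encode($rows);

    } else {
        http_response_code(400);
        echo 'Unknown action';
    }

On the VB6 side you would issue the HTTP requests with something like the Internet Transfer Control or WinHTTP and parse the returned text or JSON.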
I'm a newbie to web development. My team at school is using J2EE and MySQL to develop a web app that will be deployed on AWS. We use GitHub for version control.
I am just wondering: if I use MySQL from my terminal to add tables to the local "test" database, how can my teammates get access to them? Should I deploy the database somewhere, or maybe create the tables in code so that my teammates automatically get the tables in their local databases when they run the code? But then how can the data already stored in the database be shared?
Sorry for the naive question; I tried to do some research online, but the results seem to be more advanced and about PHP rather than J2EE... It would also be great if you could recommend some good resources for me to read, since I believe this is a very fundamental concept that I should know.
You can maintain the database's schema in your code so that it can be committed to source control and shared with the others. This is good practice regardless of how you use a test database during development.
Your team members will not be able to easily access your local database. For a distributed development environment it would be best to host your test database on a remote server, such as on an EC2 instance in a public subnet or in RDS. Then you can pass along the database's connection information (host, port) and credentials to the other team members.
Pay attention to the security group when creating the database, whether in EC2 or RDS. You can open it up to the world (0.0.0.0/0) or, to tighten security, narrow it to just your team members' IP addresses. Otherwise the team members will not be able to connect to the database.
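For example, with the AWS CLI, an inbound rule that allows MySQL's default port from a single teammate's address might look like the following (the security-group ID and IP address are placeholders):

    aws ec2 authorize-security-group-ingress \
        --group-id sg-0123456789abcdef0 \
        --protocol tcp \
        --port 3306 \
        --cidr 203.0.113.10/32

You can add one such rule per teammate, or do the same thing in the AWS console under the security group's inbound rules.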
It's hard for your team members to access your local database from another computer. It's a lot better to host your database on a remote server, such as an EC2 instance on AWS or Compute Engine on GCP. Then your MySQL database can be accessed by anyone with an authorized connection and a whitelisted IP address.
Another solution is using a cloud-based data warehouse like Snowflake or Acho Studio. Once you have the MySQL database connected to the data warehouse, your teammates should be able to access the tables you've authorized them to see.
This way you can also share your SQL queries with your teammates so they can run them against the MySQL server themselves.
You can create a test environment using VMs or containers and share it with your team members. You should also pay attention to how you keep track of changes in these test environments. For example, a database together with its schema can be shared as a Docker image, and you can version-control these images to track the changes.
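A minimal sketch of that idea, assuming the official mysql image and hypothetical schema.sql / seed-data.sql files: any .sql files copied into /docker-entrypoint-initdb.d/ are executed the first time a container is initialized, so the schema and test data travel with the image.

    # Dockerfile
    FROM mysql:8.0
    # Scripts in this directory run automatically on first startup,
    # so every teammate gets the same schema and seed data.
    COPY schema.sql seed-data.sql /docker-entrypoint-initdb.d/

Teammates then build and run it locally, for example:

    docker build -t team/test-db:1.0 .
    docker run -d -e MYSQL_ROOT_PASSWORD=secret -p 3306:3306 team/test-db:1.0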
I have a number of apps running on Dreamhost and am creating a small app that I will use locally to present reports on information stored in various databases. I do not want to host this reporting tool online, so I need the app to connect to the remote MySQL server from my machine.
CodeIgniter has been giving me an error all afternoon saying it can't connect to the remote server; I'm guessing this is down to Dreamhost's security restrictions on remote access. I don't have a dedicated IP address, so it isn't ideal to allow external access on the MySQL user account for a single IP. I've tried exploring a dynamic DNS service such as No-IP, but I'm not sure whether I've configured it properly, or whether the service is even capable of letting my app access the remote MySQL server whenever it needs to.
Does anyone know how to get this working correctly, or am I even on the right track?
I am designing the backend of my iOS application. The backend has a separate database and application server, running MySQL and Django on different machines. Up to now, I have connected my application server to my database server in a simple way: I changed the database host in the application server's settings to point to the remote database server, and created a new remote host in the database server's configuration allowing the application server to access the database. It all works fine, and I have decided to go with this setup for production. Then, while reading the Instagram engineering blog, I saw them mention PgBouncer for pooling connections to their PostgreSQL database server. What is the need for something like this? Is it only about performance, or is it a production-friendly approach for communication between the database and the application server? Is my general approach too amateur?
Your approach is not amateur at all. The purpose of a connection pooler like PgBouncer in your case would be to eliminate the connection setup time incurred on each request Django handles. For example, on Heroku, which is hosted on AWS servers, this could eat up 40-50ms of each request.
Now, if you had a master/slave setup or something like that, a connection pooler could also provide you with failover functionality (just as an example).
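For what it's worth, PgBouncer itself is PostgreSQL-only; with a MySQL backend you would look at an equivalent such as ProxySQL, or simply Django's persistent connections (CONN_MAX_AGE). But the kind of configuration the Instagram post refers to is small; a rough sketch, with host names, database name, and pool sizes as placeholders, and Django then pointed at port 6432 instead of 5432:

    ; pgbouncer.ini (sketch)
    [databases]
    ; Django connects to PgBouncer, which forwards to the real server.
    appdb = host=10.0.0.5 port=5432 dbname=appdb

    [pgbouncer]
    listen_addr = 127.0.0.1
    listen_port = 6432
    auth_type = md5
    auth_file = /etc/pgbouncer/userlist.txt
    ; transaction pooling returns a server connection to the pool
    ; as soon as each transaction finishes
    pool_mode = transaction
    max_client_conn = 200
    default_pool_size = 20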
I would like to create a desktop application that works with data on a MySQL server running on a remote machine.
So each user has a copy of the desktop app and edits data on the remote MySQL server.
Now my problem is that the MySQL server will not allow connections from other hosts.
Question: is this just the wrong way of building the app? If not, how do I give any host access to the MySQL server?
(I know I can open it up for a specific IP, but that won't work as the app could be running anywhere.)
You should front the database on the server with a thin service layer, where you can do some validation and processing of the data, perform authentication, and so on. You would then expose the methods of that service layer as web services, which your client apps would call using SOAP/XML, REST/JSON, etc. In general, it is a bad idea to expose your database directly even when your application stays within a LAN, and a terrible one to expose it on the internet.
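As a very rough sketch of such a service layer (PHP is used here only as an example; the endpoint, table, columns, and API-key check are all made up), the desktop app would talk HTTP/JSON to something like this instead of speaking the MySQL protocol directly:

    <?php
    // items.php - sketch of a thin service layer in front of MySQL.
    // Desktop clients call this over HTTPS and never see the database.

    // Very crude API-key check; a real service would use proper auth.
    if (($_SERVER['HTTP_X_API_KEY'] ?? '') !== 'CHANGE_ME') {
        http_response_code(401);
        exit;
    }

    $db = new mysqli('localhost', 'db_user', 'db_pass', 'app_db');
    header('Content-Type: application/json');

    if ($_SERVER['REQUEST_METHOD'] === 'GET') {
        // Read: return all items as JSON.
        $result = $db->query("SELECT id, name, quantity FROM items");
        echo json_encode($result->fetch_all(MYSQLI_ASSOC));

    } elseif ($_SERVER['REQUEST_METHOD'] === 'POST') {
        // Write: validate input on the server before touching the database.
        $data = json_decode(file_get_contents('php://input'), true);
        if (!isset($data['name'], $data['quantity'])) {
            http_response_code(422);
            echo json_encode(['error' => 'name and quantity are required']);
            exit;
        }
        $stmt = $db->prepare("INSERT INTO items (name, quantity) VALUES (?, ?)");
        $qty = (int)$data['quantity'];
        $stmt->bind_param('si', $data['name'], $qty);
        $stmt->execute();
        echo json_encode(['id' => $db->insert_id]);
    }

The client then only needs an HTTP library and a JSON parser, and you can change the schema, add validation, or move the database without touching every installed copy of the app.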