I developed a Laravel application (using MySQL for my database, both locally and remotely) for a small business, and they asked me to make the application usable even when there is no internet connection. I don't know where to start. What should I do to sync my local DB (the one on the company's computer) with the DB hosted on a DigitalOcean droplet, so that my application can still be used in the event of an internet disconnection?
When I searched, I found the concept of replication, but the problem is that the database contains links to files and pictures stored in the public/storage folder of my Laravel project, so I assume I am going to have to sync the two storage folders as well (the one on the DO server and the one at my company). What should I do?
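For context, here is a minimal sketch of keeping both databases configured in the same Laravel application, assuming hypothetical connection names `local` and `remote` in `config/database.php`; the hosts and credentials are placeholders, and this on its own does not do any syncing (replication or a scheduled sync job would still be needed):

```php
<?php
// config/database.php (excerpt) -- a minimal sketch, assuming hypothetical
// connection names 'local' and 'remote'; hosts and credentials are placeholders.

return [

    'default' => env('DB_CONNECTION', 'local'),

    'connections' => [

        // MySQL instance running on the company's own machine,
        // reachable even when the internet connection is down.
        'local' => [
            'driver'   => 'mysql',
            'host'     => env('DB_LOCAL_HOST', '127.0.0.1'),
            'database' => env('DB_LOCAL_DATABASE', 'app'),
            'username' => env('DB_LOCAL_USERNAME', 'app'),
            'password' => env('DB_LOCAL_PASSWORD', ''),
        ],

        // MySQL instance on the DigitalOcean droplet.
        'remote' => [
            'driver'   => 'mysql',
            'host'     => env('DB_REMOTE_HOST', 'droplet.example.com'),
            'database' => env('DB_REMOTE_DATABASE', 'app'),
            'username' => env('DB_REMOTE_USERNAME', 'app'),
            'password' => env('DB_REMOTE_PASSWORD', ''),
        ],
    ],
];
```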
Related
Where are databases actually stored? Do we store the database in the web server itself (like NGINX or Apache), or is there some other kind of server dedicated only to the database? If it is the latter, how would we connect to the database from another machine?
For example, I've tinkered with MySQL databases stored on my local machine, which I used to create web applications: how could I store (host) those databases somewhere else, on another machine, and still use them in the same web application running on a completely different machine?
Where is the database stored?
The database -- the relational database management system -- is a network-accessible hunk of server code. MySQL, for example, is a server software package.
That server software runs on a computer -- in internet parlance a server machine -- somewhere. Maybe it's on your laptop. Maybe it's on a server machine in the next room. Maybe it's on a virtual machine you rent from AWS, Azure, DigitalOcean, or some other cloud vendor.
The machine hosting the database for a web site can be the same machine that hosts the web site's web server, or a different machine. Modestly sized web sites often run the database and the web server on the same machine. Bigger sites often have multiple web server machines all using just one database machine.
As long as the web server code can reach the database server via a TCP/IP connection, you have a working configuration.
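For example, a PHP script running on one machine can reach a MySQL server running on a completely different machine just by pointing the connection at that machine's address. A minimal sketch, with a placeholder hostname and credentials:

```php
<?php
// Connecting to a MySQL server running on a *different* machine.
// 'db.example.com', the database name, and the credentials are placeholders;
// the only real requirement is that port 3306 is reachable over TCP/IP.
$pdo = new PDO(
    'mysql:host=db.example.com;port=3306;dbname=myapp;charset=utf8mb4',
    'app_user',
    'secret',
    [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]
);

// Swapping the host for 127.0.0.1 is the only change needed to talk to a
// local server instead -- the application code stays the same.
$row = $pdo->query('SELECT VERSION() AS v')->fetch(PDO::FETCH_ASSOC);
echo $row['v'], PHP_EOL;
```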
Where is the data stored?
The database server software uses the file system (disk, SSD drives, or network-attached storage) of its host machine to store the data in its tables. The structure of the file-system files used by the database server software is a huge topic far beyond the scope of a StackOverflow answer. Suffice it to say that those files are useless without the database server software to read, write, and back them up (with a few special exceptions).
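If you are curious where those files live on a particular server, you can ask MySQL itself. A small sketch (the connection details are placeholders):

```php
<?php
// Asking the server where it keeps its data files. The connection details
// are placeholders; the query itself is standard MySQL.
$pdo = new PDO('mysql:host=127.0.0.1;dbname=mysql', 'root', 'secret');
$row = $pdo->query("SHOW VARIABLES LIKE 'datadir'")->fetch(PDO::FETCH_ASSOC);
// Typically prints something like: datadir = /var/lib/mysql/ on Linux.
echo $row['Variable_name'], ' = ', $row['Value'], PHP_EOL;
```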
I am trying to host 100+ WordPress websites on Azure. For that I will need MySQL databases, and I am confused about which route to take. With a quick Google search, I found mainly three options:
Use MySQL Preview (which is not good for a production environment)
Purchase a costly third-party subscription from ClearDB or Bitnami
Set up a Windows/Linux VM, host a LAMP server on it, and run the MySQL server there.
I am just trying to find out whether there are any other, better options. Thank you.
Update: One primary requirement is that those WordPress websites are already created and I am migrating them to new hosting. I need a separate database for each WordPress site.
I have done extensive research, and I feel that I have good candidate solutions, but I still lack the knowledge to decide which one I should implement. Ideally, I would like to hear from someone who has actually implemented a solution to a similar problem.
The Problem
Our project consists of a community of distributed nodes (25 nodes). The nodes run on Linux computers and are installed in typical residential settings (behind NAT), with wide dispersion both geographically and ISP-wise.
Our software on each node collects a variety of its own unique data, which is logged to a MySQL DB on the local host (the node) that is not directly WAN-accessible. We also have a web interface on each node that uses the local node DB to let the local user visualize certain data and parameters; this is only accessible on the LAN.
We typically set up and maintain an open SSH port from our labs to each node. All node databases have the exact same schema but completely different data. We need an automated way to collect all data from all the nodes and get it to our WAN-accessible lab servers (Windows 7 servers, but they can be Linux if that provides a better solution). We have narrowed the options down to the following:
Solutions:
Create a .bat script that sequentially connects to each node over SSH to import the data.
Use the web interface that runs on each node to periodically query the local DB and save that data to a central MySQL server. I know I can connect to two databases in PHP, so this seems doable (see the sketch after this list).
Use the MySQL-supported “slave-master” replication setup, which will duplicate all remote databases on the server.
Use the MySQL-supported FEDERATED engine setup, which will link local tables to remote ones.
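As a rough illustration of option 2, here is a minimal sketch of a PHP job that could run on each node, connecting to both the local node database and the central lab database and copying over any rows the central server does not yet have. The `readings` table, its columns, the node id, hosts, and credentials are all hypothetical placeholders:

```php
<?php
// A minimal sketch of option 2, assuming a hypothetical `readings` table with
// an auto-increment `id`; hosts, credentials, and column names are placeholders.

// Local node database (only reachable on the node itself).
$local = new PDO('mysql:host=127.0.0.1;dbname=nodedb', 'node', 'secret',
    [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]);

// Central lab database (WAN-accessible).
$central = new PDO('mysql:host=lab.example.com;dbname=labdb', 'collector', 'secret',
    [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]);

// Ask the central server which rows it already has from this node.
$nodeId = 17; // placeholder node identifier
$stmt = $central->prepare('SELECT COALESCE(MAX(source_id), 0) FROM readings WHERE node_id = ?');
$stmt->execute([$nodeId]);
$lastSent = (int) $stmt->fetchColumn();

// Copy anything newer than that from the local DB to the central one.
$rows = $local->prepare('SELECT id, recorded_at, value FROM readings WHERE id > ?');
$rows->execute([$lastSent]);

$insert = $central->prepare(
    'INSERT INTO readings (node_id, source_id, recorded_at, value) VALUES (?, ?, ?, ?)'
);
foreach ($rows as $r) {
    $insert->execute([$nodeId, $r['id'], $r['recorded_at'], $r['value']]);
}
```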
Questions:
Are all these viable solutions?
Any major cons I should be aware of for the viable ones?
Are there better solutions available (paid or otherwise)?
My company has a desktop application developed in VB.NET using DevExpress controls. The back-end database is MySQL.
The company is in retail and has two stores in the same city. Both stores are always busy and customers are always waiting at the counter. It is basically a desktop CRM application with many modules: apart from the invoice/receipt module, it has a delivery module, an installation module, a service/repair module, an accounts receivable module, and many other modules used by the company's various back-office departments. Other resources/hardware such as a barcode printer, receipt printer, and barcode scanner are connected to the CRM on the desktop PC.
Currently, there are around 55 clients always connected to the server and using the application.
Problem:
Until a couple of weeks ago, the company had no issue using this desktop application with a single MySQL server, as all clients were connected via LAN or WLAN.
Now the situation has changed and a new requirement has arisen: the company plans to open new stores at a great distance. Such stores cannot be connected to the current central database via LAN or WLAN. Each new branch would have around 20-30 clients, say “Branch Clients”.
There would also be field executives working from their laptops, say “Remote Clients”. They will only have a 3G internet connection on their laptops.
Thought 1: Install the desktop application on all branch PCs and connect them to the central MySQL database server over the internet.
Not possible: the connection over the internet would be far too slow for fetching such a huge amount of data. The data really is huge: for example, if a client opens “Customer Master”, there are more than 600,000 rows, which takes a lot of bandwidth and time to load over the internet. And there are many more modules that load a lot of data.
Also, if the internet connection is lost, clients would not be able to operate the application at all. Customers waiting in line for a receipt would go crazy if they had to wait that long.
Thought 2: Install a new MySQL server at each branch store; all of the desktop PCs there would connect to that local branch server, and the local branch server would be connected to the central server via MySQL replication.
Not possible: since MySQL replication only works one way, we cannot implement this structure. The application needs to move data from the central server to the branch servers and from each branch back to the central server in real time. MySQL replication is also limited to replicating with a single server, so we could not replicate with multiple branch stores. There is a cluster server option, but the company cannot afford the licensing cost.
Thought 3: Somebody suggested that I convert the entire desktop application into a web application and put the database on a cloud server.
Not possible: I think that given the current requirements (fast access), the environment (retail store POS), and the hardware (printers, scanners) connected to the clients, a web application with a cloud database server is not advisable. Also, in the event of an internet outage, the entire store would go down.
Thought 4: Somebody suggested that I move from MySQL to MSSQL and keep the desktop application as it is. MSSQL can sync with multiple servers in real time over the internet; it has no limitations like MySQL's one-way replication and single replication connection.
I guess that, to get a faster and more reliable database connection, installing a local branch server is essential. But I don't know how those different branch servers could be connected to the central server.
My Questions:
• What is the best way to resolve the above issues under the given conditions and successfully fulfill the company's requirements: a faster, constant connection to the database server, plus real-time updates between all branches and the central server? If the internet connection is down, a delay in the real-time updates is acceptable, but clients should not be prevented from working.
• Would migrating from MySQL to MSSQL resolve the issue? Data migration itself is not a problem, as there are many tools that convert a database from one platform to another. The issue is that the application is very large, with hundreds of queries written for MySQL. I guess I would have to change all those queries as well, because MySQL and MSSQL queries are not identical. Do I have to change all of the queries, or just a small percentage? Or is there a tool that converts MySQL queries to MSSQL queries?
• In general, how do such small-to-medium retail companies set up their infrastructure and applications? Let me know some ideas.
Is there a local MySQL database I can create on my system, just like the localhost we run on our own machines?
I never thought about this situation, since we have beta servers and live servers with the databases already set up in our offices.
I am creating an app on my own, which is why I need to set up a local DB of my own.
You can use WAMP (or LAMP on Linux), which is an integrated web development environment giving you access to your own LOCAL SQL databases through your own machine's localhost. You will have full control to create tables and administer them. I am currently doing that myself.
WAMP
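Once WAMP (or LAMP) is running, connecting to that local MySQL instance from PHP looks roughly like this; a minimal sketch, with a placeholder database name and WAMP's common default of an empty root password (check your own setup):

```php
<?php
// Connecting to the local MySQL server that WAMP/LAMP installs. The database
// name and credentials are placeholders (WAMP's default root password is
// often empty, but check your own installation).
$pdo = new PDO('mysql:host=localhost;dbname=myapp;charset=utf8mb4', 'root', '',
    [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]);

// Create a table and administer it entirely on your own machine.
$pdo->exec('CREATE TABLE IF NOT EXISTS notes (
    id INT AUTO_INCREMENT PRIMARY KEY,
    body TEXT NOT NULL
)');
$pdo->prepare('INSERT INTO notes (body) VALUES (?)')->execute(['hello, local db']);
echo $pdo->query('SELECT COUNT(*) FROM notes')->fetchColumn(), PHP_EOL;
```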