Mail, website, forum under one database - MySQL

I've tried to find an answer to my question but I couldn't find the right one yet (I'd be glad if you pointed me to one). I'm a newbie when it comes to running services (websites, forums, wikis, email); I'm mostly experimenting.
I have a couple of websites (mainly WordPress), a mail server, a forum, wikis, and file sharing (ownCloud) hosted on one server.
Until now, every time I installed a new service I would create a new database (MySQL), just as the install READMEs advise. I would now like to connect some of the services together, mainly through a unified user database.
What is the best way to do this? Is having multiple databases, versus one, heavier on my server's CPU? Is it secure? Is it easy to administer?
If CPU load isn't an issue with multiple databases, is it possible to create a user database and link it to the databases of the services I'd like to connect?

Having multiple applications (forum, wiki, ...) access the same database is not likely to have any effect on CPU usage, but there are other drawbacks:
Table names used by the applications might conflict (many of them will have a "session" or "posts" table, for example). Some web apps can prefix their table names with a string, e.g. "wp_session" and "wp_posts", to get around such conflicts.
Yes, it's less secure. When one of the applications has a security hole and someone manages to access its database, the data of all the applications is compromised.
Multiple databases are likely to be easier to manage when doing application upgrades, backups, and when removing applications from or adding them to the mix.
Accidentally break that one shared database, and you break all the apps at once.
To get the applications to use the same authentication database, it's usually not enough to point them at the same database: they're likely to use different schemas for storing user information (different columns in the user table), different password hashing, and so on.
The question is quite broad, and the specific answer depends a lot on the actual applications you're using. In general, the best approach is probably to pick applications that support a protocol such as OpenID or OAuth, or an authentication backend such as an LDAP directory or PAM (Pluggable Authentication Modules). These methods let you keep a single user database managed in a single place; the apps just all need to work with the same backend. In any case, it's likely to be quite a learning experience to get it running smoothly.
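As an illustration of the LDAP route, checking a user's credentials is essentially one bind against the shared directory. A minimal sketch in Python using the ldap3 package (the host, base DN, and user layout are assumptions you'd replace with your own directory's values):

```python
# Minimal sketch: verifying a user's credentials against a shared LDAP
# directory, so every app authenticates against the same user database.
# Assumes the ldap3 package and a directory at ldap.example.com with
# users stored under ou=people,dc=example,dc=com (hypothetical values).
from ldap3 import Server, Connection, ALL

def check_credentials(username, password):
    server = Server("ldap://ldap.example.com", get_info=ALL)
    user_dn = f"uid={username},ou=people,dc=example,dc=com"
    # A successful bind means the directory accepted the password.
    conn = Connection(server, user=user_dn, password=password)
    if not conn.bind():
        return False
    conn.unbind()
    return True

if __name__ == "__main__":
    print(check_credentials("alice", "secret"))  # True if the bind succeeds
```

Every application then points at the same directory, and there is exactly one place where users are created, disabled, and have their passwords changed.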

Related

SaaS with exposed SQL

For a client I'm going to deliver a SaaS solution; SaaS in the sense that it's still closed to a limited set of clients who have to sign a contract with us, so it's not open to the whole world, and the client base will be around 5-10 companies.
Our first client, the pilot client so to speak, has a requirement that they can perform SQL queries (read-only) on the data, so they can do analysis in Excel alongside what our application serves.
My question is that, for maintenance reasons, I would prefer to serve everything from the same codebase, but I'm wondering how I can make sure clients can't access other clients' SQL records.
I'm using Laravel, so the solution with separate installations would be to build everything as maintainable packages and upgrade all installations from there, but this can grow into a lot of work.
I'm still not sure how to do it as a single installation; maybe the answer is a separate database per client? That would require a central database to point them to the right database, of course, or maybe only some of the tables would live in another database, but it already sounds like a mess to me.
In Laravel it is possible to have multiple database connections, so your idea of giving each client their own database is going to be the most secure option.
Have your default database be your main application database, holding settings, auth, and so on.
For each client, store their data in a separate per-client database, and only allow them to query that database.
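As an illustration of that split, the read-only requirement can be enforced in MySQL itself by giving each client an account that can only SELECT from its own database. A rough sketch, assuming the mysql-connector-python package; the database and account names are invented:

```python
# Rough sketch: a per-client database plus a SELECT-only account, so a
# client can run ad-hoc queries for Excel but can't see anyone else's
# data. Uses mysql-connector-python; all names here are invented.
import mysql.connector

def provision_readonly_client(client_slug, client_password, admin_password):
    conn = mysql.connector.connect(host="localhost", user="root",
                                   password=admin_password)
    cur = conn.cursor()
    db = f"client_{client_slug}"   # e.g. client_acme
    user = f"ro_{client_slug}"     # e.g. ro_acme
    cur.execute(f"CREATE DATABASE IF NOT EXISTS `{db}`")
    # %% is a literal % here, because passing parameters triggers
    # %-substitution in the connector.
    cur.execute(f"CREATE USER IF NOT EXISTS '{user}'@'%%' IDENTIFIED BY %s",
                (client_password,))
    # SELECT only, and only on this client's own database.
    cur.execute(f"GRANT SELECT ON `{db}`.* TO '{user}'@'%'")
    cur.close()
    conn.close()
```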
Although I don't know the specifics of your application, my real suggestion is to avoid direct SQL access completely and build an API.
Your SaaS clients should not have to be concerned with the internal implementation of your database structure. A well-built API gives you the freedom to modify the database as needed, and gives the SaaS client the peace of mind that their "interface" is not in a permanent state of flux.
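To make the API suggestion concrete, here is a minimal read-only endpoint sketched with Flask; the route, table, and credentials are invented for illustration, and the client gets stable JSON instead of raw table access:

```python
# Sketch of the API alternative: expose curated, read-only views of the
# data instead of raw SQL. Flask and the /reports/orders route are
# illustrative choices, not part of the original answer.
from flask import Flask, jsonify
import mysql.connector

app = Flask(__name__)

@app.route("/reports/orders")
def orders_report():
    conn = mysql.connector.connect(
        host="localhost", user="api_ro", password="...", database="client_acme"
    )
    cur = conn.cursor(dictionary=True)
    # The query is an internal detail; the JSON shape is the contract,
    # so the schema can change without breaking the client's Excel pulls.
    cur.execute("SELECT id, placed_at, total FROM orders ORDER BY placed_at")
    rows = cur.fetchall()
    cur.close()
    conn.close()
    return jsonify(rows)

if __name__ == "__main__":
    app.run(port=5000)
```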

MySQL Multiple Databases for User Authentication

I have a few related web sites, and it seems rather unfortunate that they have completely separate user databases. I've been contemplating different options for unifying them:
1. Rework the sites to run on one copy of my content management system rather than on independent software. Pros: seems clean. Cons: complicated by needing to rewrite a lot of the backend of one site to support the other site's features.
2. Use the OAuth backend I wrote for interfacing with Facebook to authenticate back and forth between the sites. Pros: uses OAuth for what it was meant to do. Cons: requires at least some redundancy, since I'd need to store duplicate user data on both sites, and this could get out of sync. It also seems like overkill for two sites running on the same server.
3. Connect to both databases whenever an account is created or modified on either site, and apply the modification to the other site. Pros: avoids the risk of falling out of sync and the complications of exchanging OAuth data between the sites. Cons: requires full duplication of user information between the sites.
4. Choose one of the sites as having the canonical database, and have the user authentication mechanism of the other site connect to the first site's MySQL database while still connecting to a separate database for the rest of that site's functionality.
I'm not totally happy with any of the options, although #4 feels like the simplest to implement as I think about it. Nonetheless, before I embark on such a project, I thought I'd ask about potential pitfalls I might be overlooking, since none of the ideas is entirely trivial. I'd appreciate advice on which might be considered "best practice" and, perhaps more importantly, which one would have the most impact on server resources. I'm using Perl's DBD::mysql to interact with the databases.
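For concreteness, option #4 amounts to holding two connections at once: one to the canonical user database and one to the site's own. A hedged sketch of that shape (shown in Python for brevity; DBD::mysql code would look analogous, and the table layout and hashing scheme are assumptions):

```python
# Sketch of option #4: this site authenticates against the other site's
# canonical user database, but uses its own database for everything else.
# Table layout and hashing scheme are assumptions; shown in Python,
# though the same shape applies with Perl's DBD::mysql.
import hashlib
import mysql.connector

auth_db = mysql.connector.connect(host="localhost", user="site_b",
                                  password="...", database="site_a_users")
site_db = mysql.connector.connect(host="localhost", user="site_b",
                                  password="...", database="site_b_content")

def authenticate(username, password):
    cur = auth_db.cursor()
    cur.execute("SELECT password_hash FROM users WHERE username = %s",
                (username,))
    row = cur.fetchone()
    cur.close()
    if row is None:
        return False
    # Both sites must agree on one hashing scheme; unsalted SHA-256 is
    # shown only for brevity. Use a salted scheme (bcrypt, etc.) in practice.
    return row[0] == hashlib.sha256(password.encode()).hexdigest()
```

The main pitfall this makes visible is the one from the first answer above: both sites must share the password-hashing scheme, not just the table.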

Cloud service for a large number of small MySQL databases?

I have an application which is going to be distributed to a hosting platform, most probably phpfog.
It is very similar to how WordPress.com operates, where each customer can host their own individual installation of the app on our servers. We host the 'work' files and provide the database. (However, it is NOT WordPress; it's a custom app.)
Each user of the application has their own separate MySQL database.
I am wondering what the most cost-effective service would be to provide this. It seems that most cloud services offer, for instance, one massive 50GB database. It is definitely conceivable that, instead of individual databases, we could have one huge one and prefix all the tables per user, but that seems really bloated and unwieldy. It's also not really possible without major structural changes to have one big database for everyone (and the same tables inside it for everyone), as the app is primarily designed to be standalone.
Each database really won't get that big. We are talking low GB; I'd say the biggest would be 5GB. However, there will be a LOT of them, as obviously it's one per customer.
What would be the most cost- and performance-effective way of handling this?
Amazon RDS in fact provides a full database server rather than a single individual database; I had misread their sales page.
In this case, RDS is a drop-in replacement for the existing MySQL databases and will work perfectly.
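For what it's worth, the per-customer layout maps onto a single RDS instance quite directly, since one MySQL server can host many small databases. A hedged sketch of the connection side (the endpoint, credentials, and naming convention are invented):

```python
# Sketch: each customer's installation gets its own small database on one
# shared RDS MySQL server, so the standalone app needs no structural
# changes. The endpoint and naming scheme are hypothetical.
import mysql.connector

RDS_HOST = "myapp.abc123.us-east-1.rds.amazonaws.com"  # made-up endpoint

def connect_for(customer_id, password):
    # One server, many small databases: app_1001, app_1002, ...
    # Each customer's MySQL account only has rights on its own database.
    return mysql.connector.connect(
        host=RDS_HOST,
        user=f"app_{customer_id}",
        password=password,  # fetched per customer from your config store
        database=f"app_{customer_id}",
    )

db = connect_for(1001, "per-customer-secret")
```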

What data entry system should I choose for multiple users at multiple sites?

I've just started working on a project that will involve multiple people entering data from multiple geographic locations. I've been asked to prepare forms in Access 2003 to facilitate this data entry. Right now, copies of the DB (with my tables and forms) will be distributed to each of the sites, returned to me, and then I get to hammer them all together. I can do that, but I'm hoping that there is a better way - if not for this project, then for future projects.
We don't have any funding for real programming support, so it's up to me. I am comfortable with HTML, CSS, and SQL, have played around with Django a fair bit, and am a decently fast learner. I don't have much time to design forms, but they don't have to actually function for a few months.
I think there are some substantial benefits to web-based forms (primary keys are set centrally, I can monitor data entry, form changes are immediately and universally deployed, I don't have to do tech support for different versions of Access). But I'd love to hear from voices of experience about the actual benefits and hazards of this stuff.
This is very lightweight data entry - three forms attached to three tables, linked by person ID, certainly under 5000 total records. While this is hardly bank account-type information, I do take the security of these data seriously, so that's an additional consideration. Any specific technology recommendations?
Options that involve Access:
1. Use Jet replication. If the machines where the data editing is done can be connected via wired LAN to the central network, synchronization would be very easy to implement (via simple Direct Synchronization, only a couple of lines of code). If not (as seems to be the case here), it's an order of magnitude more complex and requires significant setup of the remote systems. For an ongoing project it can be a very good solution; for a one-off, not so much. See the Jet Replication Wiki for lots of information on Jet replication. One advantage of this solution is that it works completely offline (i.e., with no Internet connection).
2. Use Access for the front end and SQL Server (or some other server database) for the back end. Provide a mechanism for remote users to connect to the centrally hosted database server, either over VPN (preferred) or by exposing a non-standard port to the open Internet (not recommended). For lightweight editing, this shouldn't require too much optimization of the Access app to get something usable, but it isn't going to be as fast as a local connection, and how slow it is will depend on the users' Internet connections. This solution does require an Internet connection.
3. Host the Access app on a Windows Terminal Server. If the infrastructure is available and there's a budget for CALs (or the CALs are already in place), this is a very, very easy way to share an Access app. Like #2, it requires an Internet connection, but it puts all the administration in one central location and requires no development beyond what's already been done to create the existing Access app.
For non-Access solutions, it's a matter of building a web front end. For the size app you've outlined, that sounds pretty simple for the person who already knows how to do that, not so much for the person who doesn't!
Even though I'm an Access developer, based on what you've outlined I'd probably recommend a lightweight web-based front end, as simple as possible with no bells and whistles. I use PHP, but obviously any web scripting environment would be appropriate.
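To give a sense of scale, a front end like that can be a single short script: one form, one insert, primary keys assigned centrally on the server. A minimal sketch using Flask and SQLite (illustrative choices; Django, which the asker has played with, would do just as well, and the field names are invented):

```python
# Sketch of a minimal web data-entry form: the server owns the primary
# keys, all sites hit the same URL, and form changes deploy centrally.
import sqlite3
from flask import Flask, request

app = Flask(__name__)

FORM = """
<form method="post">
  Person ID: <input name="person_id">
  Value: <input name="value">
  <input type="submit">
</form>
"""

@app.route("/", methods=["GET", "POST"])
def entry():
    if request.method == "POST":
        conn = sqlite3.connect("entries.db")
        conn.execute("CREATE TABLE IF NOT EXISTS entries "
                     "(id INTEGER PRIMARY KEY, person_id TEXT, value TEXT)")
        conn.execute("INSERT INTO entries (person_id, value) VALUES (?, ?)",
                     (request.form["person_id"], request.form["value"]))
        conn.commit()
        conn.close()
        return "Saved." + FORM
    return FORM
```

A real deployment would sit behind HTTPS with some form of login, which covers the security concern the asker raised.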
I agree with David: a web-based solution sounds the most suitable.
I use CodeCharge Studio for that: it has a very Access-like interface, lots of wizards to create online forms, and so on. CCS supports a number of different programming languages; I use PHP, as part of a LAMP stack.

MySQL AND Filemaker Pro?

I have a client that wants to use Filemaker for a few things in their office, and may have me building a web app.
The last time I used, or thought about, or even heard of, FileMaker was about 10 years ago, and I seem to remember that I don't want to use it as the back end of a sophisticated web app, so I am thinking of trying to sell them on MySQL.
However, will their FileMaker database talk to MySQL? Any idea how best to talk them down from FileMaker?
You may have a hard time talking them out of FileMaker, because it was actually a pretty clever tool for making small, in-house database applications, and it had a very loyal user base. But you're right: it's not a good tool for making a web application.
I had a similar problem with a client who was still using a custom dBase IV application. Fortunately, Perl's CPAN archive has modules for talking to anything. So I wrote a script that exported the entire dBase IV database every night, and uploaded it into MySQL as a set of read-only tables.
Unfortunately, this required taking MySQL down for 30 minutes every night. (It was a big database, and we had to convert free-form text to HTML.) So we switched to PostgreSQL, and performed the entire database update as a single transaction.
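The reason that worked is that PostgreSQL's DDL is transactional: the drop, recreate, and reload all become visible atomically at commit, so readers never see a half-loaded database. A rough sketch of such a reload using psycopg2 (the table, file, and connection details are invented):

```python
# Sketch: rebuild the read-only mirror tables inside one transaction.
# psycopg2 opens a transaction implicitly; nothing is visible to other
# sessions until commit().
import csv
import psycopg2

conn = psycopg2.connect("dbname=mirror user=loader")  # hypothetical DSN
cur = conn.cursor()
# In PostgreSQL, even DROP TABLE and CREATE TABLE are transactional.
cur.execute("DROP TABLE IF EXISTS contacts")
cur.execute("CREATE TABLE contacts (id integer, name text, notes text)")
with open("export.csv") as f:
    for row in csv.reader(f):
        cur.execute("INSERT INTO contacts VALUES (%s, %s, %s)", row)
conn.commit()  # the new data appears atomically here
cur.close()
conn.close()
```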
But what if you need read-write access to the FileMaker database? In that case, you've got several choices, most of them bad:
Build a bi-directional synchronization tool.
Get rid of FileMaker entirely. If the client's FileMaker databases are trivial, this may be relatively easy. I'd begin by writing a quick-and-dirty clone of their most important databases and demoing it to them in a web browser.
The client may actually be best served by a FileMaker-based web application. If so, refer them to Google.
But how do you sell the client on a given choice? It's probably best to lay out the costs and benefits of each choice, and let the client decide which is best for their business. You might lose the job, but you'll maintain a reputation for honest advice, and you won't get involved in a project that's badly suited to your client.
We develop solutions with both FileMaker and PHP/MySQL. Our recommendation is to build the web app in a technology optimised for web apps, such as PHP/MySQL.
Having said that, FileMaker does have a solid PHP API, so if the web app has relatively lightweight demands (e.g. in-house use), use that and save yourself the trouble of synchronisation.
FileMaker's ESS technology lets FileMaker use an SQL database as the backend data source, which gives you two options:
1. Use ESS as a nice, tight way to synchronise right within FileMaker; that way you'd have a "native" data source to work with within the FileMaker solution per se.
2. Use ESS to allow FileMaker to be used as a reporting/data-mining/casual query-and-edit tool directly on the MySQL tables; it works sweet.
We've found building a sophisticated application in FileMaker with an ESS/MySQL backend to be very tricky, so whether you pick option 1 or 2 above depends on how sophisticated and heavy-duty the FileMaker usage is.
Otherwise, SyncDek has a good reputation as a third-party solution for automating synchronisation.
I've been tackling similar problems and found a couple of solutions that emk hasn't mentioned...
FileMaker can link to external SQL data sources (ESS), so you can use ODBC to connect to a MySQL (or other) database and share data. We tried it and, to be honest, found it to be pretty slow.
SyncDek is a product that claims to allow you to perform data replication and data transmission between FileMaker, MySQL, and other structured sources.
It is possible to use FileMaker's Instant Web Publishing as a web service that your app can then push and pull data through. We found a couple of wrappers for this in Python and PHP.
You can put a trigger in the FileMaker database so that every time a record (or the part of a record you're interested in) changes, it calls a web service that updates a MySQL or memcached copy of that data for your website to read.
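The receiving end of that last option could be a small web service that mirrors each changed record into MySQL. A hedged sketch with Flask (the route, payload fields, and table name are illustrative assumptions, not part of any FileMaker API):

```python
# Sketch of the web service a FileMaker script trigger could call on each
# record change, mirroring the record into MySQL for the website to read.
from flask import Flask, request
import mysql.connector

app = Flask(__name__)

@app.route("/record-changed", methods=["POST"])
def record_changed():
    data = request.get_json()
    conn = mysql.connector.connect(host="localhost", user="mirror",
                                   password="...", database="site")
    cur = conn.cursor()
    # Upsert: insert the record, or refresh it if the id already exists.
    cur.execute(
        "INSERT INTO contacts (id, name, phone) VALUES (%s, %s, %s) "
        "ON DUPLICATE KEY UPDATE name = VALUES(name), phone = VALUES(phone)",
        (data["id"], data["name"], data["phone"]),
    )
    conn.commit()
    cur.close()
    conn.close()
    return "ok"
```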
I've found that people like FileMaker because it gives them a very visual interface onto their data; it's very easy to make quite large self-contained applications without much development knowledge. But when it comes to collaboration among many users, or presenting this data in a format other than the FileMaker application, we found performance to be a real problem.