MySQL Databases Online - mysql

I recently downloaded the community version of MySQL. I'm relatively new to databases, so I may have some misconceptions. However, based on my understanding, the downloaded database is local to my computer, so if my computer ever gets destroyed or wiped, the database is no longer accessible and cannot be referenced by programs.
If this is the case, is there a way to make the database not local to my computer, but based online so that it will always be accessible regardless of the condition of my computer?
EDIT: It would be best if the database were free, if possible. In addition, I have no hosting accounts currently.

If you are running the database on your machine, then true - if the computer gets wiped or destroyed, so does your database and data.
If you want it online, then Amazon Web Services (AWS) is probably your best bet. You create an RDS instance there, and it lives in the cloud. Not for the faint of heart. Not that it is particularly difficult, but probably not for the novice.
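For example, once you have an AWS account and the AWS CLI configured, creating a small MySQL RDS instance looks roughly like this (the identifier, password, and instance class below are placeholders; the web console wizard does the same thing with clicks):

    aws rds create-db-instance \
        --db-instance-identifier my-database \
        --db-instance-class db.t3.micro \
        --engine mysql \
        --master-username admin \
        --master-user-password 'pick-a-strong-password' \
        --allocated-storage 20 \
        --publicly-accessible

You would still need to open port 3306 in the instance's security group before your programs can connect to it from outside.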

I also had the same problem and the solution came from here
All the free hosts I found with MySQL were way too limited, and 99% didn't give you remote connections.
So you set up a free VPS with a nice Linux distro, install MySQL, and off you go.
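A rough sketch of that setup, assuming an Ubuntu/Debian VPS (package names and config paths vary by distro and MySQL version):

    sudo apt-get install mysql-server
    # By default MySQL only listens on 127.0.0.1. Change the bind-address line
    # (in /etc/mysql/my.cnf or /etc/mysql/mysql.conf.d/mysqld.cnf) to 0.0.0.0,
    # then restart:
    sudo service mysql restart

    # Create a user that is allowed to connect remotely:
    mysql -u root -p -e "CREATE USER 'appuser'@'%' IDENTIFIED BY 'pick-a-password';
                         GRANT ALL PRIVILEGES ON appdb.* TO 'appuser'@'%';"

Remember to open port 3306 in the VPS firewall, and if you can, restrict the grant to your own IP instead of '%'.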

Related

Architecture - Code locally (GIT), Database in the Cloud

We are a small team of developers looking for the ideal working environment.
Our current setup:
Everyone develops locally on their own machine. Code is managed/shared with Git (Bitbucket). We have a small MySQL server on our local network where we all share the same database across projects. If we want to share something with a client, we have a remote server where we move the code and database to.
Our preferred setup:
Code stays where it is. But we would like to move the MySQL databases into the cloud, to a remote server which we can access from our local machines.
What we've tried:
Amazon RDS (free tier), which uses the smallest instance. This was horribly slow. The question here is, does it get really fast with a bigger instance? Page loads can't take 5 seconds for the database requests alone. What instance do we theoretically need in order to get really good performance?
Google Cloud SQL was honestly also too slow. I actually tested a bigger instance, which was a lot better than Amazon but still not useful for our use case.
Do you know any other services which provide such functionality? (MySQL remotely accessible)
Do you have any suggestions how we maybe can rethink our whole process of developing?

Moving XAMPP project to real server

I'm a college student and an amateur in web development. I've been working on a query system to query some FoxPro database tables for the company I am interning at. I was asked to implement my project on their local server (running Windows Server 2003), and I am not sure where to begin, since this will be my first time working with a real server and I have almost zero knowledge about it.
The project was done with PHP, JavaScript, MySQL and jQuery and was developed in XAMPP. It will be accessed by everyone on the office intranet. I need to set up a MySQL database for the login as well.
1) Would it be better to do this on IIS or Apache?
2) I am aware that XAMPP uses Apache, but if I implement my project on IIS, will there be any difference or will I need to change my code?
Any advice is greatly appreciated!
The answer to this question is governed by:
1) the tech your project uses, and
2) your need to coexist with the other services on that company's W2003 server.
If your project happens to be the first web app to be served from that server, just set up XAMPP, move your files and database to the server, and be done with it.
But that's unlikely. You need to talk to whoever set up the last web app on that server. It's possible to set up a server so that some web pages are served by IIS and others by Apache, but then those pages need different port numbers. That will make things more confusing and less convenient for your users.
PHP/MySQL/ODBC works fine on IIS if that's what you need to use. You may have to monkey around with permissions and so forth to get it tweaked out. But it will perform just fine.
The take-home lesson for you as you develop this small-scale web app is to build things like this to fit the intended production deployment environment. XAMPP is a popular and successful development setup PRECISELY BECAUSE there are approximately Avogadro's number of $4-per-month Linux-Apache-MySQL-PHP hosting services in the world that are entirely compatible with it. But that's not your deployment target this time.
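For what it's worth, the ODBC side in PHP is straightforward either way; here is a minimal sketch, assuming you have created a DSN for the FoxPro tables in the Windows ODBC Data Source Administrator (the DSN and table names below are made up):

    <?php
    // 'FoxProData' is a hypothetical System DSN pointing at the FoxPro tables
    $conn = odbc_connect('FoxProData', '', '');
    $result = odbc_exec($conn, 'SELECT * FROM orders');
    while ($row = odbc_fetch_array($result)) {
        print_r($row);   // each row comes back as an associative array
    }
    odbc_close($conn);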

RDBMS setup itself with Application installation

I want to know whether there is any RDBMS which will install along with our application's executable file, so we don't need to set up the database on every single machine. The user just installs the app, and the RDBMS also sets itself up during installation.
There is more to this than meets the eye. The first question to answer is whether the RDBMS will be used concurrently.
If not, a simple file-based engine will most likely be sufficient. I recommend SQLite, as it is as cross-platform as can be. There is no setup involved; you can simply package the sqlite.[so|dll|dylib] with your application (there's a short sketch after this answer).
If yes, you have to think about which installation should be the "master" installation, with the others being slaves. It might not be a good idea to use a salesman's laptop as the master, even if it happens to be the first machine available. Dedicated servers exist for a reason. A lot of products have an installer that asks for an installation mode such as Standalone/Server/Client. Choosing Server will mostly result in calling the installer for the RDBMS first.
My personal opinion is: if a non-trivial app is multi-tiered like in the second case, the tiers should be installed one by one by someone who has an idea what these tiers are and do. If not, you create an accident waiting to happen. But again, this is my personal opinion only.
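To illustrate the file-based route, here is a minimal sketch using PHP's PDO SQLite driver (assuming the pdo_sqlite extension is enabled; the file and table names are just examples):

    <?php
    // Opens (or creates) a database file next to the application; no server, no setup step.
    $db = new PDO('sqlite:' . __DIR__ . '/app.db');
    $db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
    $db->exec('CREATE TABLE IF NOT EXISTS customers (id INTEGER PRIMARY KEY, name TEXT)');
    $db->exec("INSERT INTO customers (name) VALUES ('Example Customer')");

Whatever language the application is written in, the pattern is the same: the "database" is just a file that the library opens.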

ExpressionEngine : git : local development : remote database

To those of you that are trying to be good little developers and version control their ExpressionEngine sites with git, how do you handle your database?
In my limited experience with multiple developers working on one ExpressionEngine site, we've had to all run off of a single MySQL development database running on a remote web server. For those of you that have tried this, it is PAINFULLY slow. Page loads can easily take 5-10 seconds making development extremely difficult. It would be quicker to work off of a remote development server. I am trying to steer away from working off of a remote MySQL server in order to be able to work from anywhere and not depend on Internet connection speed/quality.
Just wondering how others handle their MySQL databases.
Do all of your developers run off of one central database? Have you dealt with slowness issues like we have?
Do you keep your database under version control? How do you handle export/imports among multiple developers and multiple branches?
With one developer I can import/export/commit the database very easily but as soon as you add another developer to the mix, it gets very VERY muddy. Looking forward to hearing everyone's thoughts on this mammoth topic.
Thanks!
It seems a lot of time is lost on failing DNS requests with a remote database.
Start your MySQL server (mysqld) with --skip-name-resolve. (More information on this topic can be found here: http://dev.mysql.com/doc/refman/5.0/en/host-cache.html)
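The same thing as a permanent setting, in the [mysqld] section of the server's my.cnf/my.ini:

    [mysqld]
    skip-name-resolve

Note that with name resolution disabled, account host entries in your grants have to be IP addresses (or localhost) rather than host names.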
Having a remote database still seems to be the best way for us to work on a project with multiple developers.
I almost always use a central database for development. Depending on which host you use, the speed difference may not be huge.
Obviously, if you're not making changes to the database, i.e. only doing template development, keeping the database in sync is less of an issue, so you could potentially bring up a local copy of the database. You just have to remember to repeat any database changes if you do end up making some.
As far as version control goes, I keep a copy of my base EE install's SQL file in my base repository. Other than that, I don't usually keep copies of the database in Git, so I don't do a lot of importing/exporting, etc.
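If you do want to snapshot the database into the repository now and then, mysqldump is enough for that (the database and file names here are just examples):

    mysqldump -u username -p ee_site > ee_site.sql
    # and to load it into a fresh local copy:
    mysql -u username -p ee_site < ee_site.sql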
Have you looked at the EE Profiler recently? You'll probably notice in the neighborhood of 20-80 queries on your home page, depending on its complexity.
The problem is that, for each query, MySQL must execute a remote request for data, download the response, and then present ExpressionEngine with its data. The 20-80 round trips to the database are what's causing your delay, and I don't think there is much you can do about it. When using a remote (outside our network) database, I get the same delay as you.
When MySQL is running on your machine or the production server, it doesn't have the added network requests causing latency in its requests for data. This is the difference.
As for fixes, all you can do is move to a database hosted on your internal network. We have a Linux machine that mimics our production environment that we use for staging. Since it's on our network, we can use the local IP address in our database.php file. This is much faster.
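For reference, the relevant lines in ExpressionEngine 2's config/database.php look roughly like this (the IP and credentials are placeholders):

    <?php
    $db['expressionengine']['hostname'] = '192.168.1.50'; // staging DB server on the local network
    $db['expressionengine']['username'] = 'ee_user';
    $db['expressionengine']['password'] = 'secret';
    $db['expressionengine']['database'] = 'ee_site';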
The problem that we still have is the issue of channels/fields/entries. When a developer is working on a new section, they'll likely need to create a new channel and fields and/or new entries. When we're ready to push that functionality to production, we have to manually make those changes on the production server, as there is no way to reliably export them. I am hopeful about this add-on though; we'll see.
In my company (4 developers) we each run our own DB locally. But recently I tested Rackspace Cloud Databases (there are other cloud DB providers too) for a heavy DB that could become difficult to run on a little laptop. It's relatively less expensive than running our own DB server, and it can be set up or deleted in a minute.

How to log Windows 7 network traffic & disk usage with MySQL?

I'm running Windows 7 Pro and have a few servers running. One of the servers is an SSH / file server that was made via Cygwin. I already have logging set up internally using syslogd; however, it does not provide adequate logging. When a user is connected to the server I can see him/her in the Windows 7 Resource Monitor, and it shows his/her IP address as well as how much data is being sent/received. When a user is downloading a file from the file server I can also see in the Resource Monitor what file he/she is downloading by looking at the disk usage.
Herein lies the first question: How can I log users' IP addresses, the time they connect and disconnect, what files they download, and what their download speed was, to a MySQL database?
In addition to the aforementioned server, I also use IIS to host a website, and would like to have some sort of network logging for it.
If I could find a tool that would work for both of these servers that would be the best solution.
I did some searching and found a program called Snort that looks like it would work for the network side of things, but not for the disk usage. I'm not familiar with this program at all, but at first glance maybe it could accomplish part of what I want to do? Maybe there is an easier/better way?
I'm pretty new to MySQL and know very little about network and disk logging so any and all help and guidance would be much appreciated. Thanks!
Advanced Web Statistics does a pretty good job of making sense of the IIS log files, and though it will give you more information than you need, it will certainly give you the information you want. It is open source, and my hosting provider uses it for the ASP.NET sites I have developed.
As far as logging the information to MySQL:
I am assuming that you already have, or know how to get the information and you simply want to log it to a MySQL DB.
1st, you will need to create the database.
2nd, you need the MySQL connector for your programming language of choice. The MySQL ADO.NET connector is excellent and easy to use. I am also assuming you know at least one programming language and how to connect it to a database. If not, I recommend C# with ADO.NET; it is super easy and there are plenty of tutorials online.
3rd, write a program to send your information to the database when you receive it.
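For step 1, the table might look something like this; the column names are just a guess at the fields you listed (IP, connect/disconnect time, file, speed):

    CREATE TABLE connection_log (
        id INT AUTO_INCREMENT PRIMARY KEY,
        ip_address VARCHAR(45) NOT NULL,        -- 45 chars covers IPv6
        connected_at DATETIME NOT NULL,
        disconnected_at DATETIME NULL,
        file_downloaded VARCHAR(255) NULL,
        download_speed_kbps INT NULL
    );

    -- step 3 then boils down to INSERTs like:
    INSERT INTO connection_log (ip_address, connected_at, disconnected_at, file_downloaded, download_speed_kbps)
    VALUES ('192.168.1.23', '2013-05-01 10:15:00', '2013-05-01 10:42:00', 'report.pdf', 850);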