How to log Windows 7 network traffic & disk usage with MySQL?

I'm running Windows 7 Pro and have a few servers running. One of the servers is an SSH / file server that was made via Cygwin. I already have logging set up internally using syslogd; however, it does not provide adequate logging. When a user is connected to the server I can see him/her in the Windows 7 Resource Monitor, which shows the IP address as well as how much data is being sent/received. When a user is downloading a file from the file server, I can also see in the Resource Monitor what file is being downloaded by looking at the disk usage.
Herein lies the first question: How can I log users' IP address, the time they connect & disconnect, what files they download, and what their download speed was, to a database in MySQL?
In addition to the aforementioned server, I also use IIS to host a website, and would like some sort of network logging for it as well.
If I could find a tool that would work for both of these servers that would be the best solution.
I did some searching and found a program called Snort that looks like it would work for the network side of things, but not for the disk usage. I'm not familiar with this program at all, but at first glance maybe it could accomplish part of what I want to do? Maybe there is an easier/better way?
I'm pretty new to MySQL and know very little about network and disk logging so any and all help and guidance would be much appreciated. Thanks!

Advanced Web Statistics (AWStats) does a pretty good job of making sense of the IIS log files, and though it will give you more information than you need, it will certainly give you the information you want. It is open source, and my hosting provider uses it for the ASP.NET sites I have developed.
As far as logging the information to MySQL:
I am assuming that you already have, or know how to get the information and you simply want to log it to a MySQL DB.
1st, you will need to create the database and a table to hold the log entries.
2nd, you need the MySQL connector for your programming language of choice. The MySQL ADO.NET connector (Connector/NET) is excellent and easy to use. I am also assuming you know at least one programming language and how to connect it to a database. If not, I recommend C# with ADO.NET; it is super easy and there are plenty of tutorials online.
3rd, write a program that sends the information to the database as you receive it; a minimal sketch follows.
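For illustration, here is a minimal sketch of steps 1 and 3 in PHP with PDO (the C#/ADO.NET route suggested above follows the same shape with MySqlConnection/MySqlCommand). The database, table, and column names, the credentials, and the sample event values are all hypothetical; your collector would supply the real data:

    <?php
    // Step 1: connect and create a table for connection events.
    // Database name, credentials, table and column names are hypothetical.
    $pdo = new PDO('mysql:host=127.0.0.1;dbname=serverlogs', 'loguser', 'secret');
    $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

    $pdo->exec('CREATE TABLE IF NOT EXISTS connection_log (
        id INT AUTO_INCREMENT PRIMARY KEY,
        ip VARCHAR(45) NOT NULL,          -- 45 characters also fits IPv6
        logged_at DATETIME NOT NULL,
        file_downloaded VARCHAR(255) NULL,
        bytes_sent BIGINT NULL
    )');

    // Step 3: insert a row whenever your collector observes an event.
    $stmt = $pdo->prepare('INSERT INTO connection_log (ip, logged_at, file_downloaded, bytes_sent)
                           VALUES (?, NOW(), ?, ?)');
    $stmt->execute(['203.0.113.7', '/files/report.pdf', 1048576]);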

Related

Architecture - Code locally (GIT), Database in the Cloud

We are a small team of developers looking for the ideal working environment.
Our current setup:
Everyone develops locally on their own machine. Code is managed/shared with git (Bitbucket). We have a small MySQL server in our local network where we all share the same database across projects. If we want to share something with a client, we move the code and database to a remote server.
Our preferred setup:
Code stays where it is, but we would like to move the SQL databases to a remote server in the cloud which we can access from our local machines.
What we've tried:
Amazon RDS (free tier), which uses the smallest instance. This was horribly slow. The question here is: does it get really fast with a bigger instance? Page loads can't take 5 seconds for the database requests alone. What instance would we theoretically need to get really good performance?
Google Cloud SQL was honestly also too slow. I actually tested a bigger instance, which was a lot better than Amazon but still not usable for our use case.
Do you know any other services which provide such functionality? (MySQL remotely accessible)
Do you have any suggestions how we maybe can rethink our whole process of developing?

MySQL Databases Online

I recently downloaded the community version of MySQL. I'm relatively new to databases, so I may have some misconceptions. However, based on my understanding, the database I downloaded is local to my computer, so if my computer ever gets destroyed or wiped, the database is no longer accessible and cannot be referenced by programs.
If this is the case, is there a way to make the database not local to my computer, but based online so that it will always be accessible regardless of the condition of my computer?
EDIT: It would be best if the database were free, if possible. In addition, I currently have no hosting accounts.
If you are running the database on your machine, then true - if the computer gets wiped or destroyed, so does your database and data.
If you want it online, then Amazon Web Services (AWS) is probably your best bet. You create an RDS instance there, and it lives in the cloud. Not for the faint of heart: not that it is particularly difficult, but it's probably not for the novice.
I also had the same problem and the solution came from here
All the free hosts I found with MySQL were way too limited, and 99% didn't give you remote connections.
So you set up a free VPS with a nice Linux distro, install MySQL, and off you go...
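Once MySQL is running on the VPS, connecting remotely is just a matter of pointing your client at the VPS's address instead of localhost. A quick PHP test (host name and credentials are hypothetical; the server must also allow remote clients via its bind-address setting and a user account not restricted to localhost):

    <?php
    // Host, port, and credentials are hypothetical -- substitute your VPS details.
    // The only difference from a local setup is the host name.
    try {
        $pdo = new PDO('mysql:host=db.example.com;port=3306;dbname=myapp', 'appuser', 'secret');
        echo "Connected to the remote MySQL server\n";
    } catch (PDOException $e) {
        // A failure here usually means the server's bind-address or a firewall
        // is blocking remote clients, or the user is restricted to localhost.
        echo 'Connection failed: ' . $e->getMessage() . "\n";
    }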

How best to deploy this multi-tier app?

We currently have an application that runs on one dedicated server. I'd like to move it to OpenShift. It has:
A public-facing web app written in PHP
A Java app for administrators running on WildFly
A MySQL database
A filesystem containing lots of images and documents that must be accessible to both the Java and PHP apps. A third party FTPs a data file to the server every day, and a Perl script loads it into the DB and the filesystem.
A Perl script occasionally runs ffmpeg to generate videos, reading images from and writing videos to the filesystem.
Is OpenShift a good solution for this, or would it be better to use AWS directly instead (for instance, because they have dedicated file system components)?
Thanks
Michael Davis
Ottawa
The shared file system will definitely be the biggest issue here. You could get around it fairly easily, though, by setting up your applications to use Amazon S3 or some other shared cloud file system.
As for the rest of the application, if I were setting this up I would:
Set up a scaled PHP application. Even if you set the scaling to just one gear, this allows you to put the MySQL database on its own gear, and even to choose a different size for it, such as having medium web gears (that run PHP) and a large gear that runs the MySQL database. This also allows your WildFly gear to access the database, since the database gear will have an FQDN (fully qualified domain name) that any of the applications on your account can reach. Keep in mind, however, that it will use a non-standard port instead of 3306; a sketch of reading the connection details follows.
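On the PHP gear, the MySQL cartridge publishes its host, port, and credentials as environment variables. A sketch, assuming the standard OpenShift v2 variable names (verify against your own gear's environment):

    <?php
    // Reading the MySQL cartridge's connection details on the PHP gear.
    $host = getenv('OPENSHIFT_MYSQL_DB_HOST');     // FQDN of the database gear
    $port = getenv('OPENSHIFT_MYSQL_DB_PORT');     // non-standard, i.e. not 3306
    $user = getenv('OPENSHIFT_MYSQL_DB_USERNAME');
    $pass = getenv('OPENSHIFT_MYSQL_DB_PASSWORD');
    $name = getenv('OPENSHIFT_APP_NAME');          // default schema name

    $pdo = new PDO("mysql:host=$host;port=$port;dbname=$name", $user, $pass);

The WildFly gear does not get these variables automatically, which leads to the next point.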
Then you can set up your WildFly server at whatever size you want. But keep in mind that the MySQL connection variables will not be there; you will have to put them into your Java application manually.
As for the Perl script, depending on how intensive it is, you could run it on its own gear of whatever size with some extra storage, or you could co-locate it with either the PHP or Java application as a cron job. You can have it store the files on Amazon S3 and pull them down/upload them as it performs the ffmpeg operations. Since OpenShift is also hosted on Amazon (in the US-EAST region), these operations should be pretty fast, as long as you also put your S3 bucket in the US-EAST region.
Those are my thoughts, hope it helps. Feel free to ask questions if you have them. You can also visit http://help.openshift.com and under "Contact Us" click on "Submit a request" and make sure you reference this StackOverflow question so I know what you are talking about, you can ask any questions you might have and we can discuss solutions for them.

ExpressionEngine : git : local development : remote database

To those of you that are trying to be good little developers and version control their ExpressionEngine sites with git, how do you handle your database?
In my limited experience with multiple developers working on one ExpressionEngine site, we've all had to run off of a single MySQL development database on a remote web server. For those of you that have tried this, it is PAINFULLY slow: page loads can easily take 5-10 seconds, making development extremely difficult. It would almost be quicker to develop directly on a remote development server. I am trying to steer away from working off of a remote MySQL server, in order to be able to work from anywhere and not depend on Internet connection speed/quality.
Just wondering how others handle their MySQL databases.
Do all of your developers run off of one central database? Have you dealt with slowness issues like we have?
Do you keep your database under version control? How do you handle export/imports among multiple developers and multiple branches?
With one developer I can import/export/commit the database very easily but as soon as you add another developer to the mix, it gets very VERY muddy. Looking forward to hearing everyone's thoughts on this mammoth topic.
Thanks!
With a remote database, it seems a lot of time is lost on failing DNS lookups.
Start mysqld with the --skip-name-resolve option (or add skip-name-resolve to the [mysqld] section of my.cnf). More information on this topic can be found here: http://dev.mysql.com/doc/refman/5.0/en/host-cache.html
Having a remote database still seems to be the best way for us to work on a project with multiple developers.
I almost always use a central database for development. Depending which host you use, the speed difference may not be huge.
Obviously, if you're not making changes to the database, i.e. only doing template development, keeping the database in sync matters less, so you could potentially bring up a local copy of the database. You just have to remember to replay any database changes if you do end up making some.
As far as version control, I keep a copy of my base EE install's SQL file in my base repository. Other than that I don't usually keep copies of the database in Git, so I don't do a lot of importing/exporting, etc.
Have you looked at the EE Profiler recently? You'll probably notice in the neighborhood of 20-80 queries on your home page, depending on its complexity.
The problem is that, for each query, MySQL must execute a remote request for data, download the response, and then hand ExpressionEngine its data. Those 20-80 round trips to the database are what's causing your delay, and I don't think there is much you can do about it. When using a remote (outside our network) database, I get the same delay as you.
When MySQL is running on your machine or the production server, it doesn't have the added network latency in its requests for data. This is the difference.
As for fixes, all you can do is move to a database hosted on your internal network. We have a Linux machine that mimics our production environment that we use for staging. Since it's on our network, we can use the local IP address in our database.php file. This is much faster.
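In ExpressionEngine 2.x the change lives in system/expressionengine/config/database.php; a sketch with a hypothetical internal IP and hypothetical credentials (adjust the path and keys to your EE version):

    <?php
    // config/database.php (ExpressionEngine 2.x style).
    // Pointing at a staging box on the local network instead of a remote host.
    $active_group = 'expressionengine';

    $db['expressionengine']['hostname'] = '192.168.1.50'; // hypothetical LAN IP
    $db['expressionengine']['username'] = 'ee_user';
    $db['expressionengine']['password'] = 'secret';
    $db['expressionengine']['database'] = 'ee_site';
    $db['expressionengine']['dbdriver'] = 'mysql';
    $db['expressionengine']['dbprefix'] = 'exp_';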
The problem that we still have is the issue of channels/fields/entries. When a developer is working on a new section, they'll likely need to create a new channel and fields and/or new entries. When we're ready to push that functionality to production, we have to make those changes manually on the production server, as there is no way to reliably export them. I am hopeful about this addon, though; we'll see.
In my company (4 developers) we each run our own DB locally. But recently I tested Rackspace Cloud Databases (there are other cloud DB providers too) for a heavy DB that could become difficult to run on a small laptop. It's less expensive than running our own DB server, and it can be set up or deleted in a minute.

MySQL Database Hacked, NOT injections

Three weeks ago, I found a list of my website's users and their info posted on Pastebin, giving away all their private data. I ran updates and protected against SQL injections. I also added a pre-request hook that saves the SQL in text form to a LOG table whenever user input is involved, so I can analyse any injection in case my protection isn't enough.
Then today the same post appeared on Pastebin again, with recent entries, so I checked the LOG table, only to find clean entries. Is there anything other than injections I should worry about? The web seems to give info about injections only!
Could they have had access to the DB password in a PHP file on the server, and could they have connected from an external server?
Should I change the DB password frequently?
Is there any non-script solution, like a hosting security plan or something similar, that would be effective enough?
I am receiving physical threats from hacked users and would really like to close this quickly...
If you're implementing your own protection against user input, you're probably doing it wrong. Most standard database libraries will give you a way of passing in parameters to queries where it will be sanitised properly, and these will have been coded with more things in mind than you're probably aware of. Reinventing the wheel in anything security-related is a bad idea!
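For example, with PHP's PDO (a minimal sketch, assuming an existing PDO connection $pdo; the table and column names are hypothetical), the user's input is sent to the server separately from the SQL text, so it can never change the structure of the query:

    <?php
    // Unsafe: user input concatenated straight into the SQL string.
    // $rows = $pdo->query("SELECT * FROM users WHERE name = '" . $_GET['name'] . "'");

    // Safe: the parameter travels separately from the statement, so input like
    // "' OR '1'='1" is treated as a plain value, never as SQL.
    $stmt = $pdo->prepare('SELECT * FROM users WHERE name = ?');
    $stmt->execute([$_GET['name']]);
    $rows = $stmt->fetchAll();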
Other things to worry about:
Password policy (strong passwords)
Access to your database server (is it firewalled?)
SSH access to your server (again, firewalled?)
Keeping all of your software up-to-date
Just to add to the other answers that you've had so far: if someone is posting the contents of your database online, then you need to assume that the server(s) running the application and database have been compromised, as once they've gained initial access, it's likely that they'll have placed rootkits or similar tools onto the server to keep access to it.
As to how they got in, there are a number of potential options, depending on the architecture of your solution, and it's impossible to say which is the case without more details. Some of the more likely options would be:
SSH passwords
Administrative web apps (e.g. phpMyAdmin) with common passwords or vulnerabilities
Access via hosting service (e.g. weak passwords on administrative login panels)
If the site is PHP-based, Remote File Inclusion issues are a distinct possibility
If you can, I'd recommend engaging a forensics or incident response company to help you recover the data and rebuild. Failing that, I'd recommend getting a backup from before the compromise and using it to rebuild the server, then ensuring that all software is updated and patched and that passwords differ from those on the compromised system, before bringing it back online.
The best protection for this is to allow connections to the MySQL database only from the machine where your application runs.
First of all, make sure network access to the MySQL database is "need to know" - in most cases this is a simple bind-address = 127.0.0.1 in the [mysqld] section of my.cnf.
Next, change the DB password - just because you can.
Now think of this: if somebody got your DB password from your PHP files, you are already in deep s***t: nothing stops him or her from just repeating that stunt! You need to audit your application for backdoors (the after-the-fact problem) and for how the guys got in there (the before-the-fact problem). Check your Apache logs for requests with unusual GET parameters - a filename in there is usually a dead giveaway.
I agree with Razvan. Also if you're running any CMS or prepackaged web pages, make sure they're the latest version. They most likely access as localhost from the web server. Hackers follow the change logs of those and every time a security patch is released, they attack published vulnerabilities on servers running the older version. It's often performed in bulk by crawlers. Odds are they have a database with your server listed as running old versions of things.
First you need to ensure that this "php file" containing the DB password(s) is not within the web root directory, otherwise they could simply access it like: http://mydomain.com/dbpassword.php.
Second, immediately change the passwords used to access your database.
Third, ensure that MySQL only accepts connections from 'localhost', rather than allowing connections from anywhere ('%'). And if it is a dedicated server, you should "harden" the box and add an IPTables rule so MySQL access is only allowed from the server's own IP. These changes ensure that even if they did get your DB username/password, they cannot reach the database from a remote computer; instead they would have to exploit your application, or SSH into your server, to gain access to your database.
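A quick way to audit which hosts each account may connect from (a sketch; run it on the server itself, and note the administrative credentials here are hypothetical):

    <?php
    // Lists each MySQL account and the hosts it may connect from; a host of
    // '%' means "from anywhere" and deserves a close look.
    $pdo = new PDO('mysql:host=127.0.0.1', 'root', 'rootpassword');
    foreach ($pdo->query('SELECT user, host FROM mysql.user ORDER BY user') as $row) {
        printf("%-20s %s\n", $row['user'], $row['host']);
    }

Accounts that should only be used by the application itself can then be pinned down, e.g. with RENAME USER 'app'@'%' TO 'app'@'localhost'; in the mysql client.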
Next, you should disable all user accounts on your site and force users to update their passwords through a closed-loop verification process (e.g. an emailed confirmation link). This ensures no ongoing malicious activity is occurring with your users or their accounts.
These are just a few steps to take; there are others, such as tracking local users' login activity. It is possible that one of your system's user accounts has been compromised (rooted). The point is, you need to consider all points of access to your system and the services on it; if you are unable to do that, it may be time to hire or contract a seasoned sysadmin to help you.
If this is shared web hosting, and another user is logged in with shell access and is able to guess the path to your web root, and the password configuration (PHP or other script) file is world readable, then the user can read it.
This is one of the most common vulnerabilities and is very easy to exploit.
If this is the case:
To correct the issue, you need to move the configuration file out of your web root folder and/or change the permissions on it so that it's not world-readable, and then change your database password. A sketch of the first option follows.
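A common layout (all paths and names here are hypothetical): the credentials file sits one level above the web root and returns an array, so the web server can never hand it out as a URL:

    <?php
    // /home/user/public_html/index.php -- the only part inside the web root.
    // /home/user/config.php lives one directory up, so it has no URL, and
    // should be chmod 600 so other shell users on shared hosting cannot read it.
    $config = require __DIR__ . '/../config.php'; // returns an array of credentials
    $pdo = new PDO(
        'mysql:host=' . $config['db_host'] . ';dbname=' . $config['db_name'],
        $config['db_user'],
        $config['db_pass']
    );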
Most likely, the user would not be able to inject anything into your application.
Changing the database server so that it's only accessible locally or to your web server would do no good, since the malicious user would be on the same web server and still be able to access it.
If you did not see any malicious queries, then they are probably accessing your DB via the MySQL command line (or phpMyAdmin or another tool), and not through your application.
Enabling the general query log (SET GLOBAL general_log = 'ON';) would allow you to see all queries in plain text in the log, but if this is shared web (and MySQL server) hosting, you probably won't be able to enable it.
This is something you may wish to report to your web host. They may be able to find the attacker and suspend their account or provide you with evidence.