The best way to store click and view statistics? - mysql

I have some items and I want to store statistics such as daily clicks and the traffic source of the page, for hundreds of items. Based on the stats I save, I want to be able to generate charts of the clicks for each day (like on Google Analytics) and to show the number of clicks from each traffic source.
I'm not a specialist, but I'm thinking of storing the statistics in a MySQL table for a single day and then writing them out to multiple .xml files. I have a slow, cheap server and I'm searching for the best method, please help me!
These "items" are embedded in other websites. I control these items using PHP.

Since these items are embedded in other websites, storing this info on every request is a no-go.
It means you would either need to install and set up MySQL on those other websites, which is unlikely to happen,
or connect to a remote MySQL server, which is quite expensive for each request,
especially when you say yourself that you only have a "cheap" server.
Additionally, you risk bringing down the websites with the embedded items when your MySQL server fails.

It is better to use Google Analytics to track the visited pages correctly instead of developing your own tracker.
It will show you daily and country-wise visitors.
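If you do decide to roll your own tracking despite the caveats above, a common pattern is to keep one pre-aggregated row per item, day, and traffic source, so each click costs a single cheap upsert instead of an ever-growing event log. Below is a minimal sketch of that idea in Python, assuming the PyMySQL driver and a hypothetical item_stats table; all names are made up for the example.

```python
# Minimal sketch of daily, per-source click aggregation in MySQL.
# Assumes a hypothetical table:
#   CREATE TABLE item_stats (
#       item_id INT NOT NULL,
#       day DATE NOT NULL,
#       source VARCHAR(100) NOT NULL,
#       clicks INT NOT NULL DEFAULT 0,
#       PRIMARY KEY (item_id, day, source)
#   );
import datetime

import pymysql  # assumed driver; any DB-API connector works the same way


def record_click(conn, item_id: int, source: str) -> None:
    """Increment today's click counter for (item, source) with one upsert."""
    with conn.cursor() as cur:
        cur.execute(
            """
            INSERT INTO item_stats (item_id, day, source, clicks)
            VALUES (%s, %s, %s, 1)
            ON DUPLICATE KEY UPDATE clicks = clicks + 1
            """,
            (item_id, datetime.date.today(), source),
        )
    conn.commit()


if __name__ == "__main__":
    conn = pymysql.connect(host="localhost", user="stats",
                           password="secret", database="stats")
    record_click(conn, item_id=42, source="example-referrer.com")
```

With the counters pre-aggregated, the daily chart becomes a single SELECT ... GROUP BY day over a small table, which even a slow, cheap server can handle.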

Related

MySQL Server Split Scaling Databases/Tables among Users

I'm relatively new to databases and have been looking into a solution that will allow users, under my server, to access their own data and no one else's. I want these databases to be scalable per user, so if more space is needed to store files, they can get it with little intervention.
I was looking into MySQL, as I have only done single-database work with it, and was trying to see how this could potentially be done. Would the best course of action be to set up a database for each user? That way the tables in each database are separate, password protected, and cannot exchange data with the tables in other databases. I know there can be essentially unlimited databases and tables, in addition to table sharding/partitioning, so I think this is a solid choice, but was wondering if anyone who has worked with MySQL more had any input.
Thanks
EDIT: Update for clarification of desires. What I essentially want is a platform where I am the owner, but users can log in to access their own data. This data will probably mostly consist of files, such as PDFs; I cannot tell how large they will be, but I am planning for the worst. Users will be able to use a web application to view, download, upload, sort, and delete these files. So in addition to creating files, there will be the ability to see historic files and download those as well if desired. What my platform provides is the framework around these files, with fields auto-filled where I can, as well as the UI for file management. My concern is the architecture of having multiple users with separate data that must be kept separate and scalable, without the reads/writes crashing the server.
It sounds like you are looking to store the users' "files" as BLOBs in the database, which doesn't necessarily lend itself to scaling well in the first place. Depending on the type of files, the best solution is generally to enforce security in the application layer and use cloud-based storage for the files themselves. If you need an additional layer of security (i.e. users can only access the files assigned to them), there are a number of options; one such option, assuming you were using S3, would be to use IAM profiles generated when the user account is set up. The same applies to any third-party cloud storage with an API.
Having an individual database per user would be an administrative nightmare unless you could generate each database on login (which would require a separate data store for credentials anyway, so it would be somewhat pointless), and it also would not work in a BLOB storage scenario.
If you can detail a little more precisely what you are trying to achieve and why, there will surely be plenty of answers.
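To make the "security in the application layer plus cloud storage" idea concrete, here is a minimal sketch that scopes each user to their own prefix in an S3 bucket and hands out short-lived links via boto3. The bucket name and key layout are assumptions for the example; in production you would pair this with the per-user IAM profiles mentioned above.

```python
# Minimal sketch: per-user file access via S3 prefixes and presigned URLs.
# Assumes boto3 and a hypothetical bucket "user-files"; the key layout
# "users/<user_id>/<filename>" is an assumption for this example.
import boto3

s3 = boto3.client("s3")
BUCKET = "user-files"  # hypothetical bucket name


def user_key(user_id: int, filename: str) -> str:
    """Keep every user's objects under their own prefix."""
    return f"users/{user_id}/{filename}"


def download_link(user_id: int, filename: str, expires: int = 3600) -> str:
    """Return a short-lived URL so the app never proxies file bytes itself."""
    return s3.generate_presigned_url(
        "get_object",
        Params={"Bucket": BUCKET, "Key": user_key(user_id, filename)},
        ExpiresIn=expires,
    )


def upload_link(user_id: int, filename: str, expires: int = 3600) -> str:
    """Presigned PUT so uploads also bypass the application server."""
    return s3.generate_presigned_url(
        "put_object",
        Params={"Bucket": BUCKET, "Key": user_key(user_id, filename)},
        ExpiresIn=expires,
    )
```

The application simply verifies that the logged-in user matches user_id before generating a link; the file bytes themselves never pass through (or crash) your server.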

Business Intelligence: Live reports from MySQL

I wanted to create a (nearly) live dashboard from MySQL databases. I tried Power BI, SSRS, and other similar tools, but they were not as fast as I wanted. What I have in mind is for the data to be updated every minute or even less. Is it possible? And are there any free (or inexpensive) tools for this?
Edit: I want to build a wallboard to show some data on a big TV screen, and I need it to be real-time. I tried SSRS auto-refresh as well, but it shows a loading sign and is very slow; Power BI uses Azure, which is very complex to configure and blocked in my country.
This topic has many more layers than the question of which tool is best for this case.
You have to consider
Velocity
Veracity
Variety
Kind
Use Case
of the data. Sure, these dimensions usually only come up when talking about Big Data, but they will give you a feeling for the size and complexity of your data.
Loading
Is the data already loaded so that you "just" use it? Or do you also need to load it in real time or near-real time (for clarification, read this answer here)?
Polling/Pushing
Do you want to poll data every x seconds or minutes? Or do you want to work event-based? What requirements make it necessary to show data this fast?
Use case
Do you want to show financial data? Do you need to show data about error and system logs of servers and applications? Do you want to generate insights as soon as a visitor of a webpage makes a request?
Conclusion
When thinking about those questions, keep in mind that this should just be a hint to go in one direction or another. Depending on the data and the use case, you might use an ELK stack (for logs), Power BI (for financial data), or even some scripts (for billing).
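If plain polling turns out to be acceptable for the wallboard, you do not even need a BI tool in the loop: a small script can re-run the dashboard query every minute. Below is a minimal sketch, assuming the PyMySQL driver and a made-up orders table; it only illustrates the polling loop, not the display.

```python
# Minimal polling sketch: re-run a dashboard query every 60 seconds.
# Assumes PyMySQL and a hypothetical "orders" table; swap in your own query.
import time

import pymysql


def fetch_metrics(conn):
    """One cheap aggregate query; keep it indexed so polling stays fast."""
    with conn.cursor() as cur:
        cur.execute(
            "SELECT COUNT(*), COALESCE(SUM(amount), 0) "
            "FROM orders WHERE created_at >= CURDATE()"
        )
        return cur.fetchone()


if __name__ == "__main__":
    conn = pymysql.connect(host="localhost", user="dash",
                           password="secret", database="shop",
                           autocommit=True)  # so each poll sees fresh data
    while True:
        count, total = fetch_metrics(conn)
        print(f"orders today: {count}, revenue: {total}")
        time.sleep(60)  # poll interval; lower it if the query stays cheap
```

An event-based setup would instead push each change to the wallboard the moment it happens (e.g. over a websocket), which is more work but removes the polling delay entirely.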

What database/technology to use for a notification system on a node.js site?

I'm looking to implement notifications within my node.js application. I currently use mysql for relational data (users, submissions, comments, etc). I use mongodb for page views only.
To build a notification system, does it make more sense (from a performance standpoint) to use mongodb vs MySQL?
Also, what's the convention for showing new notifications to users? At first, I was thinking I'd have a notification icon; the user clicks on it, and it makes an AJAX call to fetch all new notifications for that user. But I want to show the user that the icon is actually worth clicking (either with a different color or a bubble with the number of new notifications, like Google Plus does).
I could do it when the user logs in, but that would mean the user would only see new notifications when they logged out and back in (because the count would be saved in their session). Should I poll for updates? I'm not sure if that's the recommended method, as it seems like overkill to show a single digit (or more, depending on the number of notifications).
If you're using Node then you can 'push' notifications to a connected user via websockets. The linked document is an example of one well-known websocket engine that has good performance and good documentation. That way your application can send notifications to any user, sets of users, or everyone, based on simple queries that you set up.
Data storage is a different question. Generally MySQL does have poor performance in cases of high scalability, and Mongo does generally have a quicker read-query response, but it depends on what data structure you wish to use. If your data is in a simple key-value structure with no real need for relational data, then perhaps a memory store such as Redis would be the most suitable.
This answer has more information on your question too, if you want to follow up and investigate further.
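For the unread-count bubble specifically, a memory store such as Redis fits well: keep a per-user counter that is incremented when a notification is created and reset when the user opens the panel. Here is a minimal sketch with the redis-py client; the key scheme is a made-up convention for this example.

```python
# Minimal sketch: per-user unread-notification counters in Redis.
# Assumes the redis-py client; the "notif:unread:<user_id>" key scheme
# is a made-up convention for this example.
import redis

r = redis.Redis(host="localhost", port=6379, db=0)


def notify(user_id: int) -> None:
    """Called whenever a new notification is created for the user."""
    r.incr(f"notif:unread:{user_id}")


def unread_count(user_id: int) -> int:
    """What the AJAX call (or a websocket push) returns for the badge."""
    value = r.get(f"notif:unread:{user_id}")
    return int(value) if value else 0


def mark_all_read(user_id: int) -> None:
    """Reset the badge when the user opens the notification panel."""
    r.delete(f"notif:unread:{user_id}")
```

The notifications themselves can stay in MySQL or Mongo; Redis only holds the hot counter, so the badge check never touches the main database.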

Django Log File vs MySql Database

So I am going to be building a website using the Django web framework. This website will have an advertising component. Whenever an advertisement is clicked, I need to record it, since we charge the customer every time a separate user clicks on the advertisement. So my question is: should I record all the click entries in a log file, or should I just create a Django model and record the data in a MySQL database? I know it's easier to create a model, but I am worried about what happens if there is a lot of traffic to the website. Please give me some advice. I appreciate you taking the time to read and address my concerns.
Awesome. Thank you. I will definitely use a database.
Traditionally, this sort of interaction is stored in a DB. You could do it in a log, but I see at least two disadvantages:
log rotation
the fact that after logging you'll still have to process the data in a meaningful manner.
IMO, you could put it in a separate DB (see the multiple-databases feature in Django). This way the load would be somewhat more balanced.
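For reference, Django's multiple-database support comes down to a second entry in DATABASES plus a router that steers the click models to it. A minimal sketch follows, with the "clicks" alias, the database names, and the adclicks app label all invented for the example:

```python
# Minimal sketch of Django's multiple-database feature for click tracking.
# The "clicks" alias, database names, and "adclicks" app label are
# assumptions for this example.

# settings.py
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.mysql",
        "NAME": "main_site",
        # user/password/host omitted for brevity
    },
    "clicks": {
        "ENGINE": "django.db.backends.mysql",
        "NAME": "click_stats",
    },
}
DATABASE_ROUTERS = ["myproject.routers.ClickRouter"]  # hypothetical path


# myproject/routers.py
class ClickRouter:
    """Send everything in the adclicks app to the 'clicks' database."""

    app_label = "adclicks"

    def db_for_read(self, model, **hints):
        return "clicks" if model._meta.app_label == self.app_label else None

    def db_for_write(self, model, **hints):
        return "clicks" if model._meta.app_label == self.app_label else None

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        if app_label == self.app_label:
            return db == "clicks"
        return db == "default"
```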
You should save all clicks to a DB. A database is designed to handle exactly the kind of data you are trying to save.
Additionally, a database will allow you to analyze your data a lot more simply than a flat file. If you want to graph traffic by country, by user agent, or by date range, this will be almost trivial in a database, but parsing gigantic log files would be more involved.
Also, a database will be easier to extend. Right now you are just tracking clicks, but what happens if you want to start pushing advertisements that require some additional user action or conversion? You will be able to extend beyond clicks extremely easily in a database.
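To illustrate the model route, a click record only needs a handful of fields, and indexes keep the billing queries fast under traffic. This is a minimal sketch under assumed names (the Advertisement model and app layout are hypothetical), not a full design:

```python
# Minimal sketch of a Django model for billable ad clicks.
# "Advertisement" and the field names are assumptions for this example.
from django.db import models


class AdClick(models.Model):
    advertisement = models.ForeignKey(
        "ads.Advertisement",  # hypothetical model in a hypothetical app
        on_delete=models.CASCADE,
        related_name="clicks",
    )
    # Who clicked, for "charge once per separate user" deduplication.
    session_key = models.CharField(max_length=40, db_index=True)
    ip_address = models.GenericIPAddressField()
    clicked_at = models.DateTimeField(auto_now_add=True, db_index=True)

    class Meta:
        # One billable click per user session per advertisement.
        constraints = [
            models.UniqueConstraint(
                fields=["advertisement", "session_key"],
                name="one_click_per_session",
            )
        ]
```

An insert like this is a single indexed write, which MySQL handles comfortably at typical ad-click volumes; the log-file route only starts to pay off at a scale where you would be batching clicks anyway.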

efficient way to store sitewide user activity

I tried searching on Stack Overflow as well as googling around a lot, but I am not able to find answers to my problem (I guess I'm searching with the wrong keywords/terms).
We are in the process of building a recommendation engine. While we initially log all user activity in custom logs (we use Ruby/Rails), we need to do an end-of-day scan of that file and arrange it by user. We also have some other user data coming in from other places (Facebook activity, Twitter timeline, etc.), and hence by EOD we want all data for a particular user to be saved somewhere, so we can then run our analyzer code on all of the user's data to generate the recommendations.
The problem is that we are generating a lot of data, and while for the time being we are using a MySQL table to store it all, we are not sure how long we can continue to do this as our user base grows (we are still testing internally with about 10 users with a lot of activity). Plus, as eager developers, we would like to try out something new that can satisfy our needs.
Any pointers in this direction will be very helpful.
Check out Amazon Elastic MapReduce. It was built for this very type of thing.
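To give a flavor of what such a job looks like, the end-of-day "group all activity by user" step maps naturally onto MapReduce. Below is a minimal sketch using the mrjob library, which can run the same job locally or on EMR; the tab-separated "user_id<TAB>action" log format is a made-up assumption for the example.

```python
# Minimal mrjob sketch: group the day's activity log lines by user.
# Assumes tab-separated lines "user_id<TAB>action"; the format is a
# made-up example. mrjob can run this locally or on Amazon EMR.
from mrjob.job import MRJob


class GroupActivityByUser(MRJob):

    def mapper(self, _, line):
        # Emit (user_id, action) for every log line.
        user_id, action = line.split("\t", 1)
        yield user_id, action

    def reducer(self, user_id, actions):
        # All of one user's activity arrives here together,
        # ready for the recommendation analyzer.
        yield user_id, list(actions)


if __name__ == "__main__":
    GroupActivityByUser.run()
```

Run it locally with `python group_activity.py activity.log`, or add `-r emr` to run on Elastic MapReduce once AWS credentials are configured.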