I've developed an iPhone app that allows users to send/receive files to/from a server.
At this point I wish to add a database to my server-side application (a C# socket server) in order to keep track of the following:
personal card (name, age, email, etc.) - a user can (but isn't obligated to) fill one out
the number of files a user sent and received so far
app stats, which are sent every once in a while and contain info such as the number of errors in the app, the device's OS version, etc.
the number of total files sent/received in the past hour (for all users)
each user has a unique 36-digit hex ID, "AF41-FB11-.....-FFFF"
The DB should provide answers to the following: which users are "heavy users", how many files were exchanged in the past day/hour/month, and whether there is a correlation between OS version and number of errors.
Unfortunately I'm not very familiar with DB design, so I thought about the following:
a users table which will contain:
uniqueUserID | Personal Card | Files Sent | Files Received
an App stats table (each user can have many records)
uniqueUserID | number_of_errors | OS_version | .... | submission_date_time
a general stats table (new record added every hour)
| total_files_received_in_the_last_hour | total_files_sent_in_the_last_hour | submission_date_time
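To make this concrete, here's a rough sketch of that schema in MySQL DDL (column names and types are my guesses; CHAR(36) for the hex ID and the nullable personal-card columns are assumptions):

CREATE TABLE users (
    unique_user_id CHAR(36) PRIMARY KEY,   -- the 36-digit hex ID
    name VARCHAR(100) NULL,                -- personal card fields are
    age INT NULL,                          -- nullable because filling
    email VARCHAR(255) NULL,               -- out the card is optional
    files_sent INT NOT NULL DEFAULT 0,
    files_received INT NOT NULL DEFAULT 0
) ENGINE=InnoDB;

CREATE TABLE app_stats (
    id INT AUTO_INCREMENT PRIMARY KEY,
    unique_user_id CHAR(36) NOT NULL,
    number_of_errors INT NOT NULL,
    os_version VARCHAR(32) NOT NULL,
    submission_date_time DATETIME NOT NULL,
    FOREIGN KEY (unique_user_id) REFERENCES users (unique_user_id)
) ENGINE=InnoDB;

CREATE TABLE general_stats (
    id INT AUTO_INCREMENT PRIMARY KEY,
    total_files_received_in_the_last_hour INT NOT NULL,
    total_files_sent_in_the_last_hour INT NOT NULL,
    submission_date_time DATETIME NOT NULL
) ENGINE=InnoDB;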
My questions are:
performance-wise, does it make sense to collect and store data per user inside the server-side application and flush it all to the DB once an hour (e.g. open a connection, UPDATE/INSERT the rows, close the connection)? Or should I simply update the DB on every transaction (file sent/received) as the user performs it?
Should I create a different primary key, other than the 36-digit ID?
Does this design make sense?
I'm using MySQL 5.1 with InnoDB, and the DBMS is on the same machine as the server-side app.
Any insights will be helpful!
I have a theory question. I have an Access database and I want to track cost by task. Currently I have a task-tracker table that stores each user's Hours/HourlyRate and Overtime/OvertimeRate, among other things (work order no., project no., etc.). I don't think this is the best way to store the data, because users can look at the table and see each other's rates. Until now it didn't matter much, but I'm about to give this database to more users.
I was thinking of keeping the rate data in a separate table linked to the ID of the task table and not allowing users to access that table, but then I couldn't do an after-update event, since the users won't have write access to it. Either that, or store the rates in a separate database with a start and end date for each given rate. For instance:
Ed | Rate $0.01 | StartDate 01/01/1999 | EndDate 12/31/1999
Ed | Rate $0.02 | StartDate 01/01/2000 | EndDate 12/31/2000
This way I can store the costing data in a separate database that the users don't have access to, and just calculate the cost information every time I need it, based on date and unique user ID. I was wondering what solutions others have come up with in MS Access for this type of situation.
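A sketch of the cost calculation against such a date-ranged rates table (generic SQL; the table and column names are made up, and Access syntax differs slightly, e.g. date literals use # delimiters):

SELECT t.work_order_no, t.project_no,
       t.hours * r.hourly_rate + t.overtime * r.overtime_rate AS task_cost
FROM task_tracker AS t, user_rates AS r
WHERE r.user_id = t.user_id
  AND t.work_date BETWEEN r.start_date AND r.end_date;

Because each rate row carries a start and end date, the join picks up whichever rate was in effect when the task was worked.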
I have a bunch of users in a project in the users table. I have a second table I use for notifications. When a user gets a notification, it's added to the notifications table with his user_id and a read (true/false) flag.
Now, if I want to add a notification for every user (such as the site going offline, etc.), how would I add a row for every user in the users table? The new rows would be identical except for the auto-incremented row id and the user id each row has to carry.
Of course I could write a PHP script to generate a ton of SQL queries and send them all at once as a multi-query insert, but that seems computationally expensive if I have thousands of users.
Conceptual query:
FOR EACH users.id AS user_id:
    INSERT INTO notifications VALUES (NULL, user_id, 'We will be going offline at 5:00!', '0')
Is that doable?
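For what it's worth, MySQL can express this loop as one set-based statement (a sketch; the notifications column names are assumed from the description above, and read is backtick-quoted since READ is a reserved word):

INSERT INTO notifications (user_id, msg, `read`)
SELECT id, 'We will be going offline at 5:00!', '0'
FROM users;

The auto-incremented id is simply omitted from the column list so MySQL assigns it.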
Thanks!
If you have thousands of users, generating thousands of SQL queries and sending them to your MySQL server doesn't matter much.
But things change if you have millions of users.
One affordable choice is to write the notification to a separate table, for example system_notifications, and display both notifications and system_notifications on your webpage.
Then save the ids of the users who have already read these notifications.
e.g.
system_notifications table
| id | message |
system_notifications_read_users table
| id | user_id |
Then you only need to touch the database when a user actually reads the notifications.
And when a system notification expires, for instance once your downtime is over, you can remove it from both of the tables above.
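A sketch of that layout in MySQL (names follow the tables above; the ? in the query stands for the current user's id, bound as a parameter):

CREATE TABLE system_notifications (
    id INT AUTO_INCREMENT PRIMARY KEY,
    message TEXT NOT NULL
) ENGINE=InnoDB;

CREATE TABLE system_notifications_read_users (
    id INT NOT NULL,       -- references system_notifications.id
    user_id INT NOT NULL,  -- references users.id
    PRIMARY KEY (id, user_id)
) ENGINE=InnoDB;

-- system notifications this user has not read yet
SELECT n.id, n.message
FROM system_notifications AS n
LEFT JOIN system_notifications_read_users AS r
       ON r.id = n.id AND r.user_id = ?
WHERE r.user_id IS NULL;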
I'm creating a script which queries a server for the logged-in clients and then, for each client, checks whether their id is registered in the database.
Assume the clients with IDs 1, 3 and 5 are logged in, while the database has only registered clients with IDs 1, 2, 3 and 4. Client 5 is thus new to the system and needs to be registered in the database.
My question is: how do I do this efficiently? Taking the client ids one by one and searching the database each time is no doubt the easiest approach, but how efficient is that? Are there ways to do this differently in order to minimize execution time and database load?
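One set-based alternative (a sketch, assuming the clients table's primary key is the client id): hand MySQL the whole logged-in list at once and let it skip the ids it already knows:

INSERT IGNORE INTO clients (id)
VALUES (1), (3), (5);  -- the ids reported by the server

INSERT IGNORE silently drops rows that would violate the primary key, so in the example above only client 5 is actually inserted.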
Thanks.
I will make my question very simple.
I have a Ruby on Rails app, backed by MySQL.
I click a button on page-1; it goes to page-2 and lists a table of ten company names.
This list of ten companies is randomly generated (based on the logic behind the clicked button on page-1) from a COMPANIES table which has 10k company names.
How do I count the number of times each COMPANY name is displayed on page-2 over a day?
Example: at the end of day 1
COMPANY_NAME | COUNT
A | 2300
B | 100
C | 500
D | 10000
Now, from the research I have done, raw inserts would be costly, and I've learned there are two common ways to do this:
Open a Unix file and write into it; at the end of the day, INSERT its contents into the database.
Negative: if the file is accessed concurrently, it will lead to locking issues.
Memcache the counts and bulk-insert them into the DB (see the sketch below).
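For the bulk-insert step, a single statement can upsert the whole batch of counters (a sketch; the company_display_counts table with a unique key on (company_id, day) is an assumption):

INSERT INTO company_display_counts (company_id, day, display_count)
VALUES (1, '2012-01-01', 2300),
       (2, '2012-01-01', 100)
ON DUPLICATE KEY UPDATE display_count = display_count + VALUES(display_count);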
What is the best way to do it in rails?
Are there any other ways to do this?
Use redis. Redis is good for maintaining in-memory data, and it has atomic increments (among other data structures and operations).
What's the best storage mechanism (in terms of the database to use and the scheme for storing all the records) for a system built to track whois record changes? The program will run once a day, and a record should be kept of what the previous value was and what the new value is.
I'd welcome suggestions on the database and thoughts on how to store the different records/fields so that data is not redundant/duplicated.
(Added) My thoughts on one mechanism to store the data.
Example case showing the sale of the domain "sample.com" from PersonA to PersonB on 1/1/2010:
Table_DomainNames
DomainId | DomainName
1 | example.com
2 | sample.com
Table_ChangeTrack
DomainId | DateTime | RegistrarId | RegistrantId | (others)
2 | 1/1/2009 | 1 | 1
2 | 1/1/2010 | 2 | 2
Table_Registrars
RegistrarId | RegistrarName
1 | GoDaddy
2 | 1&1
Table_Registrants
RegistrantId | RegistrantName
1 | PersonA
2 | PersonB
All tables are "append-only". Does this model make sense? Table_ChangeTrack should only get a new row when there is a change in ANY of the monitored fields.
Is there any way of making this more efficient / tighter from a size point of view?
The primary data is the existence of, or changes to, the whois records. This suggests that your primary table be:
<id, domain, effective_date, detail_id>
where the detail_id points to actual whois data, likely normalized itself:
<detail_id, registrar_id, admin_id, tech_id, ...>
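In MySQL DDL, that might look like the following (a sketch; everything beyond the column names above is an assumption):

CREATE TABLE whois_records (
    id INT AUTO_INCREMENT PRIMARY KEY,
    domain VARCHAR(255) NOT NULL,
    effective_date DATE NOT NULL,
    detail_id INT NOT NULL
);

CREATE TABLE whois_details (
    detail_id INT AUTO_INCREMENT PRIMARY KEY,
    registrar_id INT NOT NULL,
    admin_id INT NOT NULL,
    tech_id INT NOT NULL
    -- ...further normalized references (nameservers, addresses, ...)
);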
But do note that most registrars consider the information their property (whether it is or not) and have warnings like:
TERMS OF USE: You are not authorized to access or query our Whois database through the use of electronic processes that are high-volume and automated except as reasonably necessary to register domain names or modify existing registrations...
From which you can expect that they'll cut you off if you read their databases too much.
You could
store the checksum of a normalized form of the whois record data fields for comparison.
store the original and current version of the data (possibly in compressed form), if required.
store diffs of each detected change (possibly in compressed form), if required.
It is much like how incremental backup systems work. Maybe you can get further inspiration from there.
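As a sketch of the checksum idea (table and column names are made up; SHA-1 of a normalized text form of the record is one workable checksum, and @normalized_record would be set by the application):

CREATE TABLE whois_snapshots (
    domain VARCHAR(255) PRIMARY KEY,
    checksum CHAR(40) NOT NULL,   -- SHA-1 of the normalized record
    raw_record MEDIUMBLOB NULL    -- optional (compressed) last full copy
);

-- a row comes back only if today's record differs from the stored one
SELECT domain
FROM whois_snapshots
WHERE domain = 'sample.com'
  AND checksum <> SHA1(@normalized_record);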
You can write VBScript in an Excel file to go out and query a webpage (in this case, the particular 'whois' URL for a specific site) and then store the results back to a worksheet in Excel.