I have a MySQL table which stores scores of the users. Every time a user answers a question correctly, I increase his or her score by one using an AJAX request. The request sends just one integer, the id of the question.
My Question is: How to prevent fake AJAX requests?
As it is just an integer I can't check whether the request is fake or not. So the only solution I have come up with is to add an extra column to my table, named "yesterday_score"; as its name describes, it is a column that changes at 00:00 and saves the user's score. If a user increases his score by more than 300 in a day, I assume it is a hack, and I prevent it.
Check the answer with your back end and do the increment there, not in the front end.
Never trust user input: that is rule number one!
Rather than sending the number to the database, you can use the database's own language to update the number. So in MySQL:
UPDATE users SET score = score + 1 WHERE user_id = 12
user_id can be verified by comparing it with the session or something of the sort. Be sure to use prepared statements too.
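For example, a minimal PHP sketch of that flow, assuming a PDO connection, a questions table that holds the correct answers, and the user id kept in the session (these names are assumptions, not your actual schema):

<?php
session_start();

// Assumed PDO connection; adjust credentials to your setup.
$pdo = new PDO('mysql:host=localhost;dbname=quiz', 'dbuser', 'dbpass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$questionId = (int) ($_POST['question_id'] ?? 0);
$answer     = $_POST['answer'] ?? '';

// Verify the answer on the server; never trust a "correct" flag from the client.
$stmt = $pdo->prepare('SELECT correct_answer FROM questions WHERE id = ?');
$stmt->execute([$questionId]);
$correct = $stmt->fetchColumn();

if ($correct !== false && $answer === $correct) {
    // Increment server side, for the logged-in user taken from the session.
    $stmt = $pdo->prepare('UPDATE users SET score = score + 1 WHERE user_id = ?');
    $stmt->execute([$_SESSION['user_id']]);
}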
I read a lot of related pages; some users suggested solutions like: "If a user hits 10 headshots in 10 ms then you kick him. Write a clever cheat detection algorithm."
And there is an answer to the same question:
There is no way to avoid forged requests in this case, as the client browser already has everything necessary to make the request; it is only a matter of some debugging for a malicious user to figure out how to make arbitrary requests to your backend, and probably even using your own code to make it easier. You don't need "cryptographic tricks", you need only obfuscation, and that will only make forging a bit inconvenient, but still not impossible.
on this page:
How to block external http requests? (securing AJAX calls)
I might also use PHPIDS. But for now I think I will stick with my solution: I add another column and hold the user's "yesterday score", and if the user gets more than 100 score today I will know he is definitely cheating, so I won't add the extra score.
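As a rough sketch of that cap in PHP, assuming a PDO connection and the yesterday_score column described above (the 100 limit is the one mentioned here):

// Only increment while today's gain stays under the daily cap.
$stmt = $pdo->prepare(
    'UPDATE users
        SET score = score + 1
      WHERE user_id = ?
        AND (score - yesterday_score) < 100'
);
$stmt->execute([$_SESSION['user_id']]);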
Add a hidden field to the form, and put md5(session_id()) in it.
If the answer is correct, call session_regenerate_id();
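A minimal sketch of that idea (the field name and the $answerIsCorrect check are placeholders):

<?php
session_start();

// When rendering the question form:
$token = md5(session_id());
echo '<input type="hidden" name="token" value="' . $token . '">';

// When handling the AJAX submission:
if (isset($_POST['token']) && hash_equals(md5(session_id()), $_POST['token'])) {
    if ($answerIsCorrect) {              // checked server side, as above
        // ... increment the score here ...
        session_regenerate_id();         // the old token stops working
    }
}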
Related
I am working on a little package using PHP and MySQL to handle entries for events. After completing an entry form the user will see all his details on a page called something like website.com/entrycomplete.php?entry_id=15 where the entry_id is a sequential number. Obviously it will be laughably easy for a nosey person to change the entry_id number and look at other people's entries.
Is there a simple way of camouflaging the entry_id? Obviously I'm not looking to secure the Bank of England so something simple and easy will do the job. I thought of using MD5 but that produces quite a long string so perhaps there is something better.
Security through obscurity is no security at all.
Even if the id's are random, that doesn't prevent a user from requesting a few thousand random id's until they find one that matches an entry that exists in your database.
Instead, you need to secure the access privileges of users, and disallow them from viewing data they shouldn't be allowed to view.
Then it won't matter if the id's are sequential.
If the users do have some form of authentication/login, use that to determine if they are allowed to see a particular entry id.
If not, instead of using a url parameter for the id, store it in and read it from a cookie. And be aware that this is still not secure. An additional step you could take (short of requiring user authentication) is to cryptographically sign the cookie.
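As a rough illustration of signing the cookie with an HMAC (the secret and cookie name are placeholders, not a full implementation):

<?php
$secret = 'a-long-random-server-side-secret';

// Setting the cookie: store the id together with an HMAC of it.
$entryId   = 15;
$signature = hash_hmac('sha256', (string) $entryId, $secret);
setcookie('entry', $entryId . '.' . $signature);

// Reading the cookie: recompute the HMAC and reject tampered values.
list($id, $sig) = explode('.', $_COOKIE['entry'] ?? '.', 2);
if (hash_equals(hash_hmac('sha256', $id, $secret), $sig)) {
    // $id has not been altered by the client.
}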
A better way to implement this is to show only the records that belong to that user. Say the id is the unique identifier for each user. Now store both entry_id and id in your table (say table name is entries).
Now when the user requests a record, add another condition to the MySQL query, like this:
SELECT * FROM entries WHERE entry_id = 5 AND id = 30;
So if entry_id 5 does not belong to this user, it will not have any result at all.
Coming to restricting the user from changing his own id, you can implement JWT tokens. You can issue a token on login and add it to every call. You can then verify and decode the token in the back end and get the user's actual id out of it.
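For example, a sketch using the firebase/php-jwt package (the package choice, the HS256 secret and the variable names are assumptions; use whatever JWT library you prefer):

<?php
use Firebase\JWT\JWT;
use Firebase\JWT\Key;

$secret = 'a-long-random-secret';

// On login: issue a token that carries the user's real id.
$token = JWT::encode(['sub' => $loggedInUserId, 'exp' => time() + 3600], $secret, 'HS256');

// On every call: verify the token and take the id from it,
// never from a query parameter the client can edit.
$decoded = JWT::decode($jwtFromRequest, new Key($secret, 'HS256'));
$userId  = $decoded->sub;

// Then: SELECT * FROM entries WHERE entry_id = ? AND id = ?  (bind $userId)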
In my Vue.js app the main focus is working with prospects. Prospects have many related things like contacts, listings, and half a dozen other objects/tables.
They also have interactions, of which there could be 30 or more per prospect, while most things like emails or phones would have 1-3 results. I load 50 prospects at a time into the front end.
I'm trying to decide if loading it all into the front end to work 50 prospects at a time is a good idea, or if I should have a JSON column with interactions as part of the prospects table that I would update each time an interaction is saved, with minimal info like date, type, subject...
It seems like an extra step (and duplicate data; how important is that?) to update the JSON column with each interaction, but it also seems like it would save looking up and loading data all the time.
I'm not a programmer, but I have been teaching myself how to do what I need for my business with tutorials and YouTube; any opinions from those who deal with this professionally would be appreciated.
Also, if anyone wants to tell me how to ask this question in a better-formatted way, I'm all ears.
Thanks
Imagine you have 1,000 records but you are sending only 50 of them, and your user filters by price. Will you display only the filtered data from the 50, or from all 1,000?
That depends on whether you want to expose all 1,000 records to the front end. It's a choice between that and calling your server API every time.
If you are calling the server, consider using a cache like Redis to store your results.
Pseudo code:

Read request received:
    Check the Redis cache - Redis.get('key')
    If the key exists - return the cached results.
    Else:
        Check MySQL for the latest results.
        Redis.set('key', latest results)
        Return the results.

Create request received:
    Write to MySQL.
    Redis.delete('key')  // the next request to view will create a new cache with fresh data.

Your key can be anything, e.g. your URL ('/my/url').
https://laravel.com/docs/8.x/redis
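If you're on Laravel (the docs linked above), a rough sketch of that pattern with the Cache facade; Prospect and Interaction are placeholder model names, and the 300-second TTL is arbitrary:

<?php
use Illuminate\Http\Request;
use Illuminate\Support\Facades\Cache;
use Illuminate\Support\Facades\Route;

// Read: serve the cached batch if it exists, otherwise query MySQL and cache it.
Route::get('/prospects', function () {
    return Cache::remember('prospects:first50', 300, function () {
        return Prospect::with('interactions')->limit(50)->get();
    });
});

// Write: saving an interaction clears the cache, so the next read is fresh.
Route::post('/interactions', function (Request $request) {
    $interaction = Interaction::create($request->all());
    Cache::forget('prospects:first50');
    return $interaction;
});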
My app's functionality is like Tinder. I will go through the workflow.
App loads 10 Hunts (like Tinder profiles)
User accepts or rejects them
Once the user accepts or rejects, the hunt is removed (marked as seen so that it doesn't come back again)
When the hunt count becomes 2, the app loads the next 10 hunts. (This is not a second page, as seen hunts are already removed)
Here is the tricky part. When it queries the database again, the response would include the 2 hunts which the user hasn't yet accepted or rejected. To avoid duplication I drop the first 2 hunts from the response. But a problem occurs if the query is run after one more accept or reject: I would remove 2 hunts expecting the normal behaviour, but this would eliminate one hunt which is not a duplicate.
What would the best solution be to get all the hunts which come after a certain id? I can use WHERE id NOT IN by passing the ids, but I would like to know if there is a better solution, as I see this would be a pretty common scenario.
I hope I made myself very clear.
The solutions which I have thought of but do not really like:
Pass ids of the 2 hunts back and exclude them in the results
Remove duplicates from hunts once I receive response back in my app
All suggestions are welcome. I'm using Rails, so ActiveRecord solutions are also welcome.
It may be wise to add a status column with values 0,1,2 for unseen, rejected, accepted.
Then, when your user accepts / rejects each item, update that column.
To get the oldest (lowest-id-value) chunk of items for a user, use something like this:
SELECT *
FROM hunts
WHERE user = ? AND status = 0
ORDER BY id
LIMIT 10
Build an index on (user, status, id), and MySQL can optimize this query.
I am developing an app with PhoneGap and have been storing the user id and user level in local storage, for example:
window.localStorage["userid"] = "20";
This is populated once the user has logged in to the app. It is then used in AJAX requests to pull in their information and things related to their account (some of it quite private). The app is also being used in a web browser, as I am using the exact same code for the web. Is there a way this can be manipulated? For example, could the user change its value in order to get back info that isn't theirs?
If, for example, another app in their browser stores the same key "userid", it will be overwritten and then they will get someone else's data back in my app.
How can this be prevented?
Before going further into attack vectors: storing this kind of sensitive data on the client side is not a good idea. Use a token instead, because every single piece of data stored on the client side can be spoofed by attackers.
Your concerns are right. The possible attack vector here is related to Insecure Direct Object Reference. Let me show one example.
You are storing the userID client side, which means you cannot trust that data anymore.
window.localStorage["userid"] = "20";
Hackers can change that value to anything they want. Probably they will change it to a value lower than 20, because in the most common case 20 comes from a column configured as auto increment, which means there should be a valid user whose userid is 19, or 18, or less.
Let me assume that your application has a module for getting products by userid. The backend query would then be similar to the following one.
SELECT * FROM products WHERE owner_id = 20
When hackers change that value to something else, they will manage to get data that belongs to someone else. They could also get the chance to remove or update data that belongs to someone else.
The possible malicious attack vectors really depend on your application and features. As I said before, you need to figure this out and not expose sensitive data like the userID.
Using a token instead of the userID is going to stop those break-in attempts. The only thing you need to do is create one more column named "token" and use it instead of the userid. (Don't forget to generate long and unpredictable token values.)
SELECT * FROM products WHERE owner_id = 'iZB87RVLeWhNYNv7RV213LeWxuwiX7RVLeW12'
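A quick sketch of generating and using such a token in PHP ($pdo, $userId and $tokenFromClient are placeholders):

<?php
// Generate a long, unpredictable token once (e.g. at registration) and store it.
$token = bin2hex(random_bytes(32));   // 64 hex characters
$stmt  = $pdo->prepare('UPDATE users SET token = ? WHERE user_id = ?');
$stmt->execute([$token, $userId]);

// Later, look records up by the token the client sends, not by a guessable id.
$stmt = $pdo->prepare(
    'SELECT p.* FROM products p
     JOIN users u ON u.user_id = p.owner_id
     WHERE u.token = ?'
);
$stmt->execute([$tokenFromClient]);
$products = $stmt->fetchAll();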
I'm about to implement a list of topics in my forum, and I'd like to add a sort of flag like "read / not read yet" for each message, with regard to each user of my website.
I was thinking of something like this: a table watched_topics with id (INT), user (VARCHAR) and topic_id (INT). When a user views the page, I'll insert this information (if it doesn't already exist).
When another user inserts a new message into a topic, I'll delete from the table watched_topics all lines with that topic_id.
That could cause trouble: think of 9,000 topics and 9,000 users who have watched all the topics: the table would get huge (9,000 x 9,000 = 81,000,000 rows).
So I think this is not the best strategy to implement this kind of feature! Any suggestion would be appreciated :)
Cheers
May I suggest a different approach?
Make use of the web browser's history mechanism.
Every topic can get a new, unique URL every time a new message is added there. It could include the number of messages, last modified time or a combination of both.
If the user has seen the topic, he must have visited it, so properly set up CSS can help identify the read ones. You can even use some client-side scripts to modify the behaviour of the page based on that.
Another way to do it would be to keep the watched-topics table the way you want to, but also store the last visit time in the user's profile and show as read all topics that haven't changed since that time.
However it's pretty safe to assume that all users reading all topics is very unlikely.
Your suggestion sounds good. I would also make the user field a foreign key; it gives you a bit more flexibility.
Are you sure all 9,000 topics will be read by all 9,000 users? I mean, is that realistic? Like you said, topic entries are deleted when a new message is added. And when that happens, another 9,000 entries are deleted :)
I would index the table and go with your suggestion (with the user_id change). If the table size gets in your way, you can always change the implementation later. Most likely it will never be an issue anyway.
For the deletion: you could store the latest msg-ID the user saw. This way you don't have to perform a lot of delete actions every time a message is posted in a much-viewed topic.
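A minimal sketch of that last-seen idea in PHP/MySQL, assuming a watched_topics(user_id, topic_id, last_seen_message_id) table with a unique key on (user_id, topic_id); all names are placeholders:

<?php
// When the user opens a topic, record the newest message id they have seen.
$stmt = $pdo->prepare(
    'INSERT INTO watched_topics (user_id, topic_id, last_seen_message_id)
     VALUES (?, ?, ?)
     ON DUPLICATE KEY UPDATE last_seen_message_id = VALUES(last_seen_message_id)'
);
$stmt->execute([$userId, $topicId, $latestMessageId]);

// A topic counts as "unread" when it has a message newer than the stored id,
// so posting a new message needs no delete at all.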