I got this message when I tried to log off from a site:
You are Successfully logged Out of blah blah. It is recommended that you close your browser when finished to avoid unauthorized reentry.
Why do they want us to restart the browser? I know it has something to do with sessions/cookies. What potential threat will not restarting cause? Is there a way to avoid this restart and still be safe?
Thanks
Theoretically, if you don't close your browser, someone can sit down at your machine and click the back button to see information associated with whatever you were doing - that could be bank accounts, credit card info, personal information - whatever. Unless the site screwed up horribly in their disabling of your session, they can't actually look at anything you didn't look at in that session, or actually change anything, but just being able to see the information might be all they need.
The reason this happens is that browsers tend to cache pages - if you click the back button, it will often load the page from its cache, rather than download it all again.
Of course, it's entirely possible that your browser decided not to cache these particular pages - I seem to recall that at least IE will never cache HTTPS traffic, although I could be wrong - but by showing the message at logout time, the site makes sure you can't say you haven't been warned.
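If you run the site yourself, the usual server-side mitigation is to mark sensitive pages as uncacheable, so the Back button has nothing cached to replay after logout. A minimal Flask-style sketch (the route and content are made up):

    from flask import Flask, make_response

    app = Flask(__name__)

    @app.route("/account")  # hypothetical sensitive page
    def account():
        resp = make_response("account details here")
        # Ask the browser (and intermediate proxies) not to keep a copy,
        # so Back after logout can't replay the page from cache.
        resp.headers["Cache-Control"] = "no-store"
        resp.headers["Pragma"] = "no-cache"  # for legacy HTTP/1.0 agents
        return resp

    if __name__ == "__main__":
        app.run()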
Not all browsers actually remove deleted cookies until they are shut down.
I don't think restarting the browser by itself will make any change to cookies. You can configure browsers to clear all private data when they start, and in those cases restarting can be useful. Otherwise there isn't any advantage to doing this.
The only thing that restarting the browser would do is delete session cookies. I don't see any reason why the site couldn't delete its own cookies, so it shouldn't be a problem.
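For what it's worth, a site deleting its own session cookie at logout is a one-liner in most frameworks; a minimal Flask-style sketch (cookie and route names are made up):

    from flask import Flask, make_response

    app = Flask(__name__)

    @app.route("/logout")
    def logout():
        resp = make_response("You are logged out.")
        # Under the hood this sends a Set-Cookie header with an expiry in the past.
        resp.delete_cookie("session_id")
        return resp

    if __name__ == "__main__":
        app.run()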
Another possibility (aside from cookies) is that they are using Basic Authentication or Digest Authentication. With these mechanisms, there is no portable way to tell the web browser to clear the site's authentication details.
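To illustrate the problem: with Basic Authentication the browser memorizes the credentials and resends them with every request until it shuts down, and the header is just a reversible encoding. A small sketch with made-up credentials:

    import base64

    user, password = "alice", "s3cret"  # made-up credentials

    token = base64.b64encode(f"{user}:{password}".encode()).decode()

    # This is what the browser silently attaches to every request,
    # with no portable way for the site to make it stop:
    print(f"Authorization: Basic {token}")
    # Authorization: Basic YWxpY2U6czNjcmV0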
I was checking the Application tab in the Chrome DevTools to debug an analytics issue and I noticed that the cookies section was empty, although the cookies menu under the lock icon in the Chrome address bar seemed to show the cookie that I was interested in.
I did a bunch of refreshing, clearing site data, and restarting Chrome with no change in behaviour.
Is this a bug in Chrome? Why is there a discrepancy between the two menus? Does it have to do with httpOnly or secure cookies?
Not a very important question, mostly just curious!
I don't have an authoritative answer on this, but I think the explanation is in the language at the top of this dialog: it says "The following cookies were set when you viewed this page". So it just shows whatever was set originally – or even "at some point" when you viewed the page. The message is not clear on this detail.
I can definitely confirm from my own tests that it will keep listing cookies that have since been deleted. Dev tools however always show the current state of cookie storage.
Whether it's a bug or not I suppose one could discuss at length, seeing as the behavior is not in conflict with the description. But it's certainly not expected behavior, and its usefulness for non-technical users is I think at least questionable.
This also lists "folder" type entries for indexed DBs, local storage and session storage, which are not useful at all because you can't drill down. All of this, plus the missing punctuation in the message on top, makes me think this is orphaned or forgotten code.
I also have no idea where that info is stored or how one might be able to purge it.
ETA: After some more experience with this discrepancy, I have come to the conclusion that the list shown via the lock icon records any kind of write access to the cookies. Since there is no dedicated "delete" operation for cookies - you delete them by setting them with an expiration time in the past - deletions still count as write access. So the dev tools appear to show only cookies that are still effectively there, i.e. that would be sent to the server in a request, while the list behind the lock icon is just a log of write accesses to cookie storage. This is still not definitive, but it completely fits all my observations so far.
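For the curious, the "delete is just a write" point is easy to see in the header a server emits; a small sketch with Python's standard library:

    from http import cookies

    jar = cookies.SimpleCookie()
    jar["session_id"] = ""  # empty value
    # There is no DELETE verb for cookies: you overwrite with a past expiry.
    jar["session_id"]["expires"] = "Thu, 01 Jan 1970 00:00:00 GMT"
    jar["session_id"]["path"] = "/"

    print(jar.output())
    # Set-Cookie: session_id=""; expires=Thu, 01 Jan 1970 00:00:00 GMT; Path=/

A log of write accesses would record this header as an entry even though, from the dev tools' point of view, the cookie no longer exists.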
I'm having some issues with Chrome canceling some HTTP requests and I'm suspecting cached authentication data to be the cause. Let me first write down some important factors about the application I'm writing.
I was using Basic Authentication scheme for some time to guard several services and resources in my web app.
In the meantime I was using/testing the app heavily using Chrome with my main Google Account fully synced. Most frequently I was using my name - "lukasz" - as the username in Basic Auth.
Recently I have switched my application to use Digest Authentication.
Now, some of the HTTP requests I'm making are failing with status=failed, for no apparent reason. It only happens when I'm using the user "lukasz"; if I enter some other unique username, there is no problem.
I looked everywhere in the backend and frontend and I couldn't trace the issue to our code. I can easily reproduce this with user "lukasz" each time. So I reverted my code to Basic Auth (while not touching the rest of the app) and the problem was gone.
That led me to think that there is something wrong with cached passwords. So I cleared the cache in Chrome, but that didn't help. After several hours of analyzing the issue I decided to make sure I was running a fresh instance of Chrome, so I reinstalled it (deleting the disk data along the way). TADAAA! The problem was gone and I couldn't reproduce it anymore.
Then I synchronized my Google Account with this newly installed Chrome, and after a short while the requests to my app started failing again!! So I took a deeper look at this (cleaning profile data from disk and redoing all the steps), and indeed it looks like the problem starts as soon as my account is synced with the cloud!
Yes, I know it sounds dodgy. It sounds ridiculous. It sounds stupid. But I am almost sure that those two problems are somehow related (failing requests and account sync).
My idea is this: Chrome somehow remembered that I was using "lukasz/my-pass" with Basic Auth for certain services. After I switched to Digest Auth the same combination of credentials (lukasz/my-pass) is now acting funny. Perhaps under the hood Chrome still thinks that this is Basic Auth and cancels requests when it learns otherwise?
UPDATE:
I did some low-level debugging with chrome://net-internals/ and it appears that the problem occurs while reading a cache entry. This seems to confirm my initial assumption.
I did some investigation and found this article. Apparently, always adding a "Last-Modified" header to my HTTP response solved the issue in Chrome (I'm still having some problems in FF, but that's off topic).
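For reference, the workaround amounts to stamping every response; a minimal Flask-style sketch (illustrative only - my actual app is different):

    from datetime import datetime, timezone
    from flask import Flask

    app = Flask(__name__)

    @app.after_request
    def add_last_modified(response):
        # Chrome's cache-entry validation reportedly misbehaves when this
        # header is absent; its mere presence was enough in my case.
        response.headers.setdefault(
            "Last-Modified",
            datetime.now(timezone.utc).strftime("%a, %d %b %Y %H:%M:%S GMT"),
        )
        return response

    if __name__ == "__main__":
        app.run()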
However, it still doesn't solve my issue entirely: why were the requests failing in the first place?
You could try using incognito mode and see what happens. It may give you some hints without having to clear the cache or reinstall Chrome.
Also take a look at How to clear basic authentication details in chrome
I need to capture an image from a web page without a security warning.
The page where I need webcam functionality cannot be switched to the https protocol.
I've installed root certificates and made them trusted.
I tried to insert an iframe (pointing to the secure https://mysecurepage.com) inside the page (http://mypage.com), but it did not work.
#bjelli is correct - this is a major security flaw for any internet content. Just imagine if you could go to a website which would start taking photos/recording everything going on without any permissions or notifications!
However, I am working on an intranet project where disabling the prompt would be quite safe.
If you are in this sort of position, there is one thing you can do:
Google Chrome Policies
If you are deploying the browser, you can override the security prompt for sites you specify. I don't know if you are working in such an environment, but this is the only way you can avoid the prompt altogether. Similar things would probably apply to other browsers too.
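As a sketch of how that can be deployed on a managed Linux machine - the policy name VideoCaptureAllowedUrls comes from the Chromium policy list, but double-check it against your Chrome version, and note the file name below is made up:

    # Writes a managed-policy file whitelisting an origin for camera access,
    # so getUserMedia() on that origin skips the permission prompt.
    # Needs to run as root; Chrome reads this directory on Linux.
    import json
    import os

    policy = {
        # Origins allowed to capture video without prompting (illustrative URL):
        "VideoCaptureAllowedUrls": ["http://mypage.com"],
    }

    path = "/etc/opt/chrome/policies/managed/webcam-policy.json"  # hypothetical file name
    os.makedirs(os.path.dirname(path), exist_ok=True)
    with open(path, "w") as f:
        json.dump(policy, f, indent=2)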
As defined in http://www.w3.org/TR/mediacapture-streams/
When the getUserMedia() method is called, the user agent MUST run the following steps:
[9 steps omitted]
Prompt the user in a user agent specific manner for permission to provide the entry script's origin with a MediaStream object representing a media stream.
[...]
If the user grants permission to use local recording devices, user agents are encouraged to include a prominent indicator that the devices are "hot" (i.e. an "on-air" or "recording" indicator).
If the user denies permission, jump to the step labeled failure below. If the user never responds, this algorithm stalls on this step.
If a browser does not behave as described here, it is a serious security problem. If you find a way of making a browser skip the permission step, you have found a security vulnerability.
What do you do if you find a security problem?
Report it IMMEDIATELY! Wikipedia: Vulnerability Disclosure
Firefox: http://www.mozilla.org/security/#For_Developers
Internet Explorer: http://technet.microsoft.com/en-us/security/ff852094.aspx
Safari: https://ssl.apple.com/support/security/
Chrome: http://www.google.com/about/appsecurity/
Opera: http://www.opera.com/security/policy
This is not just a question of technical possibilities, it's also a question of professional ethics: What kind of job would I not take on? Should I be loyal to my customer, or should I think of the welfare of the public? When do I just follow orders, when do I stop bad stuff from happening, and when do I blow the whistle?
Here are some starting points for computing professionals to think about the ethics of their work:
http://www.acm.org/about/se-code
http://www.acm.org/about/code-of-ethics
http://www.ieee.org/about/corporate/governance/p7-8.html
http://www.gi.de/?id=120
Some browsers do not ask the client whether they want to use the application cache, but simply download the whole thing on the first visit (e.g. the browser on Android). That can cause trouble when the application cache is many MBs and the client is on a mobile network - that is expensive! And is it possible to stop the cache from being downloaded? Pressing the menu button on e.g. Android will not close the browser; it will keep running in the background.
Is it a good idea to only add the manifest based on a cookie that is set when the client pushes a "hey, I want to offline-cache this site" button? Will that cause any new challenges?
I've tested this, and it seems to work. It needs more testing though. I can provide a link to the site in about two weeks if anyone is interested.
One possible solution, as you hinted at yourself, would be to ask the user if he/she wants to cache the app offline, and if so, only then redirect to a page which has a link to the manifest file (which is subsequently downloaded by the browser). You cannot stop a download midway, unless of course your network disconnects.
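Here is a minimal sketch of that flow, assuming a Flask-style server (the cookie, route, and manifest names are made up); the manifest attribute is only emitted once the user has opted in:

    from flask import Flask, make_response, request

    app = Flask(__name__)

    PAGE = """<!DOCTYPE html>
    <html{manifest}>
    <head><title>Demo</title></head>
    <body><a href="/optin">Cache this site for offline use</a></body>
    </html>"""

    @app.route("/")
    def index():
        opted_in = request.cookies.get("offline_ok") == "1"
        # Only a page served WITH the manifest attribute triggers the
        # (potentially large) application cache download.
        return PAGE.format(manifest=' manifest="app.manifest"' if opted_in else "")

    @app.route("/optin")
    def optin():
        resp = make_response("Offline caching enabled - reload the page.")
        resp.set_cookie("offline_ok", "1")
        return resp

    if __name__ == "__main__":
        app.run()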
I have a page that is just a non interactive display for a shop window.
Obviously, I don't link to it, and I'd also like to avoid people stumbling across it (by Google etc).
It will always be powered by Chrome.
I have thought of...
Checking User Agent for Chrome
Ensuring resolution is 1920 x 1080 (not that useful as it is a client side check)
Banning under robots.txt to keep Google out of it
Do you have any more suggestions?
Should I not really worry about it?
Not that I would EVER recommend what I'm about to suggest - but how about filtering by IP address? Since your provider IP is rarely going to change, you can use JavaScript to kick out or deny requests from IP addresses other than yours. Maybe a clean redirect to http://www.google.com or something silly like that. Although I would still suggest locking it down with a login and password and just having it write a never-expiring cookie. That's still not a great idea, but a fair bit better than the road you're trucking down right now.
You could always limit the connections by IP address (if you know it ahead of time and it's reliable):
Apache's access control
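If the page happens to be served by a small Python app rather than Apache, the same server-side check is only a few lines; a sketch with Flask (the allowed address is made up):

    from flask import Flask, abort, request

    app = Flask(__name__)

    ALLOWED_IPS = {"203.0.113.7"}  # the shop's static IP (made up)

    @app.before_request
    def ip_gate():
        # A server-side check, unlike a JavaScript one, can't be bypassed by
        # the client. Caveat: behind a reverse proxy, remote_addr is the
        # proxy's address, not the visitor's.
        if request.remote_addr not in ALLOWED_IPS:
            abort(404)  # pretend the page doesn't exist

    @app.route("/")
    def display():
        return "shop window display"

    if __name__ == "__main__":
        app.run()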
If it is just for a shop window, do you even need access to a web page?
You can host the file locally.
Personally, I wouldn't worry about it, if no-one is linking to it externally it is unlikely to ever be found by search engines.