I am using an API bot for Stake.com, and for that I need the cl_clearance cookie. However, the cl_clearance is not updating when I take any action on the site. After clearing my cookies, no cl_clearance value even shows up in F12 -> Network. What can I do to get a cl_clearance value back? I have already tried clearing cookies, refreshing the page, and using a proxy, but nothing is working to get the cl_clearance back so I can put it into my code.
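(For context, here is a minimal sketch of how a bot might attach that cookie to its requests once a value is available again; the endpoint, cookie value, and user agent below are placeholders, not Stake's actual API.)

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class ClearanceRequest {
    public static void main(String[] args) throws Exception {
        // Placeholder: in practice this value is copied from the browser's
        // cookie store after the Cloudflare challenge has been passed.
        String clearance = "PASTE_CL_CLEARANCE_VALUE_HERE";

        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://stake.com/api/example")) // hypothetical endpoint
                .header("Cookie", "cl_clearance=" + clearance)
                // Clearance cookies are typically tied to the user agent that
                // solved the challenge, so send the same one the browser used.
                .header("User-Agent", "Mozilla/5.0 (Windows NT 10.0; Win64; x64)")
                .GET()
                .build();

        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode());
    }
}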
Related
I have a web application, and we call a third party to process some data. Once it's done, the third party redirects back to my application (it's a POST redirection). To keep the session, we are using cookies. After the Google Chrome update that changed the default to SameSite=Lax, I updated our cookies to be sent with SameSite=None; Secure to work around the issue. Since Google Chrome version 91, this implementation no longer works and I'm getting a session expiry issue. Can somebody help fix this issue for Google Chrome version 91 and later? I'm using Java.
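(For reference: the Servlet Cookie API, up to Servlet 4.0, has no SameSite setter, so the attribute is commonly added by writing the Set-Cookie header by hand. A minimal sketch, with illustrative names:)

import javax.servlet.http.HttpServletResponse;

public class SameSiteCookieHelper {
    // The javax.servlet Cookie class cannot express SameSite, so the
    // Set-Cookie header is written directly. Name and value are illustrative.
    public static void addSessionCookie(HttpServletResponse response,
                                        String name, String value) {
        response.addHeader("Set-Cookie",
                name + "=" + value + "; Path=/; Secure; HttpOnly; SameSite=None");
    }
}

(Tomcat can also apply this globally through the sameSiteCookies attribute of the CookieProcessor element in context.xml.)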
The best that we have been able to come up with is a client-side meta refresh. When the third party posts back to our application, we have a page filter that sends the request to a "refreshMeta" page, similar to https://www.w3.org/TR/WCAG20-TECHS/H76.html. This has to happen without calling .getSession() anywhere, because that would cause a new session to be created. The page then refreshes in the browser and sends all the original cookies back to the server, because the request now comes from the same domain and no new session was created.
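(A minimal sketch of such a filter, assuming javax.servlet 4.0; the callback path and page contents are placeholders:)

import java.io.IOException;
import java.io.PrintWriter;
import javax.servlet.*;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Intercept the third party's POST and answer with a same-origin page that
// immediately meta-refreshes, so the browser re-requests the URL with all
// original cookies attached. Note that request.getSession() is never
// called, since that would create a fresh session.
public class RefreshMetaFilter implements Filter {

    @Override
    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
            throws IOException, ServletException {
        HttpServletRequest request = (HttpServletRequest) req;
        HttpServletResponse response = (HttpServletResponse) res;

        // "/callback-path" is a placeholder for the URL the third party posts to.
        if ("POST".equals(request.getMethod())
                && request.getRequestURI().endsWith("/callback-path")) {
            response.setContentType("text/html;charset=UTF-8");
            PrintWriter out = response.getWriter();
            // 0-second meta refresh back to the same URL, now requested as a
            // GET from our own domain (see WCAG technique H76 linked above).
            out.println("<!DOCTYPE html><html><head>");
            out.println("<meta http-equiv=\"refresh\" content=\"0;url="
                    + request.getRequestURI() + "\">");
            out.println("</head><body>Redirecting...</body></html>");
            return;
        }
        chain.doFilter(req, res);
    }
}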
I will say this worked for a while, but it looks like there was a change in Tomcat that prevents this approach from working like it did on earlier versions, which is why I'm back looking for another solution.
I am trying out Google Colab, but I keep getting this pop-up box that says:
Error
Could not access the resources needed to display output.
This is probably because third-party cookies are not allowed by your
browser.
NotSupportedError: Failed to register a ServiceWorker: The user
denied permission to use Service Worker.
While turning off "Block third-party cookies" worked for me, as suggested here, I would like to keep the setting turned on at all times for the sake of our privacy.
Currently I have to keep another tab open next to the Colab tab so that, after I have finished using Colab, I can turn it off right away and not forget it. But I would have to do this every time I use Colab.
To solve this, I have tried to follow the Chrome help guide and added https://colab.research.google.com and [*.]google.com to the Allow list under Cookies. However, the error pop-up still shows. I also tried https://colab.research.google.com[/*], but Chrome said it's not a valid domain.
Is there a way to allow cookies for the Colab domain only?
The output cell is an <iframe> element. It has a URL like
https://jbe1910iol-colab.googleusercontent.com/v2/usercontent/8b5e8f2bbe60490e/outputframe.html
So, you can try adding [*.]googleusercontent.com to the whitelist as well.
Not sure if it will work though. Hope it does.
Following up on Korakot Chaovavanich's explanation, here are the steps for Google Chrome:
1. Type this in the address bar to open the cookie settings page: chrome://settings/content/cookies
2. In the Allow section, click the Add button.
3. Paste this: [*.]googleusercontent.com
That's all.
I've noticed that Google keeps visiting some of my URLs each time I boot up Google Chrome. Does anyone know why this might be?
This wouldn't be much of a problem, except that it keeps hitting a login URL for a system I've built, and each time there's an unknown login call I receive a text message... so it's kind of annoying.
The IP range I keep receiving these visits from is 66.102.9.*.
Sure, I could block this IP range, but first I'd like to know why I keep receiving these visits. Does anyone have any ideas?
Perhaps it is your Chrome start page, and you could change it in the settings.
That's where I'd start, unless you have already checked that.
If that's not it, try the Google Chrome forums.
When you use Google Chrome, it sends GET requests to Google's servers for the browser's update checks and for Chrome app updates.
Chrome sends requests to multiple URLs when it's checking for and downloading updates. The order of requests is determined dynamically at runtime, and both HTTP and HTTPS might be tried. The following list of hostnames and paths can change at any time without notice:
www.google.com/dl/*
*.gvt1.com
tools.google.com/service/update2
dl.google.com/*
google.com/dl/*
clients2.google.com
update.googleapis.com/service/update2
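(If you'd rather special-case these hits than block the range outright, here is a minimal sketch, assuming a Java servlet backend, which the question doesn't actually specify:)

import javax.servlet.http.HttpServletRequest;

public class LoginAlertGuard {
    // Hypothetical helper: suppress the SMS alert for hits coming from the
    // 66.102.9.* range mentioned in the question, while still serving them.
    public static boolean shouldSendSms(HttpServletRequest request) {
        return !request.getRemoteAddr().startsWith("66.102.9.");
    }
}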
So basically the first hit returns a 302, and the user agent (in this case my web browser) is invited to send the SAME hit again to another URL. But this time, even though I'm getting a 200, the hit is sent incorrectly because many of the params are not being sent. (This happens sometimes, not always.)
First hit: (screenshot)
Second hit: (screenshot)
To clarify a little more, I'm getting this problem with an addToCart (Enhanced Ecommerce) hit, and it's causing product sales to be recorded without the product ever being added to the cart. I checked the dataLayer and everything seems to be OK.
So I'm in doubt: is it a problem with my web browser, or a problem with Analytics?
If the web browser version is needed, you can read it in the user agent shown in the screenshots.
I had the same issue when I tried to manually insert the Google Analytics tracking code (gtag) in my <head>.
The first step was to remove the Google Analytics code from the header. I had used the plugin 'Insert Headers and Footers'.
Afterwards I installed the MonsterInsights plugin, went through the launch wizard, and the 302 redirect disappeared.
I didn't really want to use a heavy plugin to resolve this issue, but unfortunately it was the only solution for me.
I copied an answer from here: https://tonnygaric.com/blog/prevent-google-analytics-from-making-requests-to-stats-g-doubleclick-net. You need to change a configuration in the Google Analytics dashboard.
The answer is pretty simple: go to GA -> Admin -> Tracking Info -> Data Collection, disable both toggles, and click Save. It can take up to 24 hours for the changes to take effect.
I spent the day debugging my website, because I seem to be getting randomly logged out, but only in Chrome on Android.
After reviewing the server logs, I see requests from my Android tablet's IP hitting my server for links that I never clicked on. After some experimentation, I see that every couple of links I click, Chrome fetches another link from the page at random that was not clicked.
The issue is that there is an <a> link in the page with href="logout", which signs the user out, and Chrome calls this all on its own, disconnecting the session. I suppose if I changed the link to trigger a POST operation Chrome would not fetch it, but I can't see why Chrome would be fetching links that were not clicked.
This is very odd, and it does not occur in Firefox or in Chrome on Windows.
Not sure if this is some sort of virus on the tablet, or something Google is doing to check the content of pages that it would not have access to without the session.
I have seen Google do this before, but only when there were Google ads in the page; Google would then fetch the links twice so that it could get the page content to choose the ad. This seems like a huge privacy issue, as Chrome is fetching private data from the session.
So the issue is that Chrome is using a new feature, "prefetching".
This is a "feature" that has Chrome speculatively fetch linked URLs from the page, even though they were never clicked.
This seems like a very error-prone "feature" for Chrome to enable by default. It could serve the user cached or stale data, or change the server's state, causing obscure, difficult-to-debug issues. It will also use double the user's data (and the server's CPU), which you would think would not be desirable to most users who pay for their data.
I confirmed this by disabling the feature in Chrome.
My solution was to switch the logout call to use a POST through a <form> element.
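(A minimal sketch of the server side of that change, assuming a Java servlet backend; the redirect target is a placeholder:)

import java.io.IOException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.http.HttpSession;

// Only an explicit form POST ends the session; the GETs issued by the
// prefetcher are refused and have no side effects.
public class LogoutServlet extends HttpServlet {

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        resp.sendError(HttpServletResponse.SC_METHOD_NOT_ALLOWED);
    }

    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        HttpSession session = req.getSession(false);
        if (session != null) {
            session.invalidate();
        }
        resp.sendRedirect("login"); // placeholder post-logout page
    }
}

(Chrome's prefetcher also typically marks its requests with a Purpose: prefetch header, newer builds with Sec-Purpose, which a filter could check; but refusing GET altogether is more robust, since a state-changing logout shouldn't be a GET anyway.)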
See https://www.technipages.com/google-chrome-prefetch