iOS 13 and iOS 12 iframe cross-domain set-cookie policy is different - html

On iOS 12, the cookie cannot be set the first time, but it works after the second try.
On iOS 13, the cookie cannot be set even after refreshing the page many times.
Has anyone faced this new issue,
or does anyone have a solution for setting the cookies?
My situation is as below:
Step 1: Site A > Site B
- Site B will set a cookie
Step 2: Site B returns a response link to Site A
- Site A uses the cookie set at Site B and appends its value to the response link
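For reference, a cookie set from a cross-site iframe nowadays generally needs the SameSite=None and Secure attributes, and Safari's tracking prevention on iOS 13 may still refuse it in a third-party context. Below is a minimal Node.js sketch of how Site B might emit such a cookie; the cookie name, value, and port are placeholders and not taken from the question.
const http = require('http');

http.createServer((req, res) => {
  // Hypothetical cookie: SameSite=None must be paired with Secure (HTTPS) to be accepted cross-site
  res.writeHead(200, {
    'Set-Cookie': 'site_b_token=abc123; Path=/; Secure; SameSite=None',
    'Content-Type': 'text/plain'
  });
  res.end('cookie set by Site B');
}).listen(3000); // in practice this would sit behind HTTPS
Even with those attributes, Safari may still require a user gesture and the Storage Access API before an embedded Site B can use its cookies.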

Related

Finding out how to get this specific URL

In another question, a user took this url (1), which contains a data table, and somehow converted it into this url (2) in order to scrape it using JSON and Beautiful Soup.
My question is: how do you get the second, scrape-friendly URL given the first one?
The user who somehow got the 2nd url was asked how he got it, but it has been a while and he never responded. Here is a link to the original thread.
You can do that by using Google Chrome Developer Tools (and with other browsers as well).
Open Google Chrome browser and navigate to the first URL
Open Developer Tools (⌘ + option + i)
Head to "Network" tab
Click on "Preserve log" and on "XHR" (since it's an XMLHttpRequest)
Reload the page and you'll see the XMLHttpRequest to the second URL
Note: In this case I guessed that it was loaded by an XHR, but I'd recommend clicking "All" instead of "XHR" next time. You'll see more results and you'll need to filter and/or take some more time to find the relevant call/request, but it'll be more accurate.
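Once the XHR endpoint shows up in the Network tab, you can request it directly and parse the JSON. A minimal Node.js sketch, assuming a hypothetical endpoint URL standing in for whatever you find in the Network tab:
// Node 18+ ships a global fetch; the URL below is a placeholder for the one found in the Network tab
const url = 'https://example.com/data.json';
fetch(url)
  .then((response) => response.json())
  .then((data) => console.log(data))
  .catch((err) => console.error(err));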

Chrome sends two requests when downloading a PDF (and cancels one of them)

I noticed that whenever you download a PDF in Chrome, it consistently makes two requests and then cancels one of them. This causes the request to be registered twice in my web app, which I don't want. Is there a way to get Chrome to make only one request for PDFs?
I've researched this topic quite a bit now and have not found a sufficient answer. Closely related answers suggest that Chrome is looking for a favicon, but the Network tab shows that it is actually making the same request twice and then cancelling the second one.
Is there a way to prevent Chrome from making the second request?
Below is a link to a random PDF file I found through Google which, when clicked, should demonstrate the behavior. I would have posted a picture of my Network tab in DevTools, but this is my first post on Stack Overflow and the site won't let me upload a picture.
https://www.adobe.com/enterprise/accessibility/pdfs/acro6_pg_ue.pdf
It looks like a bug in Chrome: https://bugs.chromium.org/p/chromium/issues/detail?id=587709
The problem is that when Chrome loads an iframe that returns a PDF stream, it writes an "embed" tag inside that iframe which again contains the same URL as the iframe. This triggers a second request for that URL, which Chrome immediately cancels (see the Network tab).
But by that time, the damage is done.
We have the same issue here, and it does not occur in Firefox or IE.
We're still looking for a good solution to this problem.
I'm still trying to find a proper solution, but as a partial "fix" for now you have two options.
1) Set the Content-Disposition to "attachment" in the header.
Setting it to "inline" causes Chrome to fire a second, cancelled request.
So, for example, you can do something like this (Node.js response in the example):
res.writeHead(200, {
  'Content-Type': 'application/pdf',
  'Access-Control-Allow-Origin': '*',
  'Content-Disposition': 'attachment; filename=print.pdf'
});
Unfortunately, this solution forces the browser to download the PDF straight away instead of rendering it inline, which may not be desirable.
2) adding "expires" in the headers
this solution will always fire a second cancelled call but it's ignored by the server
so for example you can do something like that (nodejs resp in example)
res.writeHead(200, {
  'Content-Type': 'application/pdf',
  'Access-Control-Allow-Origin': '*',
  'Content-Disposition': 'inline; filename=print.pdf',
  // Expires expects an HTTP-date string, so format the Date accordingly
  'Expires': new Date(Date.now() + 60000).toUTCString()
});
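Put together, a self-contained Node.js sketch of option 2 might look like the following; the file path, file name, and port are placeholders, not part of the original answer.
const http = require('http');
const fs = require('fs');

http.createServer((req, res) => {
  // Serve the PDF inline with a short Expires header, as in option 2 above
  res.writeHead(200, {
    'Content-Type': 'application/pdf',
    'Access-Control-Allow-Origin': '*',
    'Content-Disposition': 'inline; filename=print.pdf',
    'Expires': new Date(Date.now() + 60000).toUTCString() // one minute from now
  });
  fs.createReadStream('./print.pdf').pipe(res); // placeholder path to the PDF on disk
}).listen(3000);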
I had the same problem in an iframe. I turned off the PDF Viewer extension and the problem disappeared. My guess is that the extension downloads the file twice: the first time to get the size, and the second time to download it with a progress bar (using the size gathered in the first request).
I've tried the other solutions and none worked for me. I'm a little late, I know, but just for the record, I solved this in the following manner:
Adding the download attribute:
In my case I was using a form, so it goes like this:
<form action="/package.zip" method="POST" download>
This worked on Brave and Safari, which previously showed the same problem; I expect it will work for Chrome as well.
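If you are not using a form, the same download hint can be given from script. Here is a small plain-JavaScript sketch, not taken from the answer above; the file path and name are placeholders.
// Create a temporary anchor with the download attribute and click it programmatically
const link = document.createElement('a');
link.href = '/package.zip';      // placeholder path
link.download = 'package.zip';   // hint to download instead of navigating/rendering
document.body.appendChild(link);
link.click();
link.remove();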
In my case the problem wasn't browser related. I noticed that our scrollbar plugin's (OverlayScrollbars) DOM manipulations were reloading the embedded PDF data and calling the controller more than once, because of the plugin's construct and destroy events. After I initialized the scrollbar before the DOM was ready, the problem was solved.

When does omniture video use trackingServerSecure vs trackingServer?

I have the following config flags in my com.omniture.AppMeasurement setup (swc version FAS-3.4.7)
trackingServer: "metrics.my-site.com",
trackingServerSecure: "smetrics.my-site.com"
and when visiting https://my-site.com I'm seeing beacons go to metrics.my-site.com instead of smetrics.my-site.com. The page tracking is reporting to the secure host, but the SWF-based video metrics are not. The HTML5 fallback also behaves correctly; only the SWF has this problem. (The SWF is also being loaded via HTTPS.)
What determines where the beacons go? If it's just the parent URL, what else could cause this to default to the HTTP beacon URL?
Looks like this might work:
actionSource = new ActionSource();
actionSource.ssl = true;

CFCookie dies on http meta reload

So I'm writing an experiment program. One of the steps includes querying an entry to see if people are ready to move on. I'm more used to PHP, so a "" always did the trick. However, with ColdFusion the following page [posted on Pastebin at the bottom of the page] runs through once, does the meta refresh, and then the cookie dies.
So with flags, I can see that the cookie existed during the first run, but on the second run and thereafter the cookie dies and brings the entire experiment to a halt.
So my question is: do ColdFusion's cfcookies randomly die after a meta refresh? If so, is there a ColdFusion workaround?
The page with the problem : http://pastebin.com/1BJLahHZ
The page that pulls information from a form and stores it into a cookie : http://pastebin.com/ekP5Ea0U
*The timer on the cookie is two hours [timer = createTimeSpan(0,2,0,0)], so I am pretty sure it's not that.
Thanks ahead.
You can't create a cookie and then immediately follow it with a cflocation; the HTTP headers needed to tell your browser that a cookie was created are flushed away when a cflocation occurs.
Redesign your logic so that your <CFCOOKIE> sets are done on pages that have no chance of being redirected away from.
The cookie will never get set when you use <cflocation url="http://cbees-dev/newTR3/wait.cfm">, because the redirect occurs before the page is loaded and rendered to the client, so the cookie is never set.
Use JavaScript instead:
<script type="text/javascript">
location.href='http://cbees-dev/newTR3/wait.cfm';
</script>
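A different, purely client-side variation, which is not what the answer above does: set the cookie from JavaScript on the same page and then redirect, so no server-side cookie headers are involved at all. The cookie name here is made up for illustration.
<script type="text/javascript">
// Hypothetical cookie with a two-hour lifetime, mirroring createTimeSpan(0,2,0,0) above
document.cookie = 'experimentReady=1; path=/; max-age=' + (2 * 60 * 60);
location.href = 'http://cbees-dev/newTR3/wait.cfm';
</script>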

Problem getting the Referer page

I'm trying to get the referer page, but I have a problem: sometimes I get the wrong referer page.
For example:
I have 3 pages. Page 1 links to page 2, and page 2 runs a process and then redirects to page 3. So when I try to get the referer page on page 3, I get page 1 and not page 2.
I think the problem is page 2: this page doesn't show anything to the user, it only runs a procedure.
Do you have any idea how I can get the referer page correctly?
Thanks.
I'm using TCL with OpenACS.
It's difficult to answer without knowing exactly what you're trying to do. If page 2 only calls a procedure, what about putting the contents of page 2 into an ad_proc and then calling that proc from page 3? Or can page 2 redirect to other places when it is finished?
If you give more info, I'm sure I can help. The normal way I would pass referrer info in OpenACS is to use a variable called return_url, which I pass from one page to the next as a hidden form element. There are lots of examples of that in OpenACS. Alternatively, you could use ad_set_client_property to store it on page 1 and then use ad_get_client_property on page 3 to read it.
Thanks to everyone.
I already solved my problem using <meta HTTP-EQUIV="REFRESH" content="0; url=page3">, but in OpenACS there is a function that does this.
I replaced
ad_returnredirect
with
util_ReturnMetaRefresh
and in this way I can now read the correct referer page.
How are you sending the user from page 2 to page 3? With PHP:
header("location:")
or an HTML redirect?
If you are using header("location:") it will probably not work. Try using an HTML redirect like:
<meta HTTP-EQUIV="REFRESH" content="0; url=page3">
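For illustration only, here is a minimal Node.js sketch (hypothetical routes, not OpenACS or PHP) of the two redirect styles discussed above: an HTTP Location redirect, where the intermediate page never becomes a document in the browser, and an HTML meta refresh, where it does.
const http = require('http');

http.createServer((req, res) => {
  if (req.url === '/redirect-302') {
    // HTTP-level redirect: the browser goes straight on to /page3
    res.writeHead(302, { 'Location': '/page3' });
    res.end();
  } else if (req.url === '/redirect-meta') {
    // Meta refresh: this page is rendered first, then the browser navigates to /page3
    res.writeHead(200, { 'Content-Type': 'text/html' });
    res.end('<meta http-equiv="refresh" content="0; url=/page3">');
  } else {
    res.writeHead(200, { 'Content-Type': 'text/plain' });
    res.end('page 3');
  }
}).listen(3000); // placeholder port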