I've built a web application that uses CORS to communicate with the API.
The API accepts all origins and a set of headers, and is written using Play! Framework.
On every response from the app, the following filter adds these headers:
override def doFilter(action: EssentialAction): EssentialAction = EssentialAction { request =>
  action.apply(request).map(_.withHeaders(
    "Access-Control-Allow-Origin" -> "*",
    "Access-Control-Allow-Methods" -> "GET, POST, PUT, DELETE, OPTIONS",
    "Access-Control-Allow-Headers" -> "Origin, X-Requested-With, Content-Type, Accept, Host, Api-Token"
  ))
}
Everything works great in Firefox and Chrome on the desktop (tested under Windows and Linux), but it fails on my Android phone (in both Chrome and the native browser).
I enabled Chrome debugging via USB, and I can clearly see that Chrome doesn't get further than the OPTIONS request made to the server. Here's the request:
Request URL:http://127.0.0.1:9100/auth/login
Request Headers:
Access-Control-Request-Headers:accept, origin, x-requested-with, content-type
Access-Control-Request-Method:POST
Origin:http://192.168.0.15:9000
Referer:http://192.168.0.15:9000/login
User-Agent:Mozilla/5.0 (Linux; Android 4.1.1; HTC One S Build/JRO03C) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/27.0.1453.90 Mobile Safari/537.36
And it stops here: no response from the server (but the Network tab indicates "Pending" for this request!).
So now I don't know what is wrong. How can I fix this problem?
It may be a red herring, but I see in the headers you show the string:
Request URL:http://127.0.0.1:9100/auth/login
It seems to me that the page tries to connect to localhost (127.0.0.1). On your phone, 127.0.0.1 refers to the phone itself, which isn't running the server, so the request fails.
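One common fix, assuming the API runs on the same machine that serves the page, is to build the API base URL from the page's own hostname instead of hardcoding 127.0.0.1. A minimal sketch (the helper name is hypothetical; the port 9100 matches the API port shown in the question):

```javascript
// Sketch: build the API base URL from the hostname that served the page,
// so a phone that loaded the page from 192.168.0.15 also calls the API
// there instead of on its own loopback address.
function apiBaseUrl(pageHostname, apiPort) {
  return 'http://' + pageHostname + ':' + apiPort;
}

// On the desktop the page host and the API host are the same machine;
// on the phone the page host is the development machine's LAN IP.
console.log(apiBaseUrl('192.168.0.15', 9100)); // http://192.168.0.15:9100
```

In the browser you would call it as `apiBaseUrl(window.location.hostname, 9100)`, so the same code works on the desktop and on the phone.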
I am using Watir for testing an internal application. The tests recently stopped working.
These are the simplified steps of the test:
require 'watir'
Selenium::WebDriver::Chrome.path = 'PATH_TO_CHROME_EXE'
browser = Watir::Browser.new :chrome, :options => {:options => {'useAutomationExtension' => false}}
URL = "TEST_URL"
browser.goto URL
After the goto line executes, the browser fails to navigate to the page. When inspecting the network activity, the request status shows as canceled. I also noticed the "Sec-Fetch-User" field populated in the header:
Sec-Fetch-Mode: navigate
Sec-Fetch-User: ?1
Upgrade-Insecure-Requests: 1
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/77.0.3865.90 Safari/537.36
If I directly enter the test URL into the browser, the field is not populated and the login pop-up appears.
This is my setup:
jruby 9.2.5.0 (2.5.0) 2018-12-06 6d5a228 Java HotSpot(TM) 64-Bit Server VM 25.102-b14 on 1.8.0_102-b14 +jit [mswin32-x86_64]
Version 77.0.3865.90 (Official Build) (64-bit)
Watir 6.16.5
ChromeDriver 77.0.3865.40 (f484704e052e0b556f8030b65b953dce96503217-refs/branch-heads/3865@{#442})
Did anyone run into a similar issue? Would it be possible to suppress the header?
Steve
I ran into exactly the same problem with Sec-Fetch-User: ?1.
The only way I could get the webpage I wanted to load was to ditch chromedriver and use the Firefox driver instead. Firefox did not send those headers.
I think that, if you really need to use chromedriver and you need to change the headers, your option is to use a proxy like https://github.com/lightbody/browsermob-proxy
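As an illustration (this is not browsermob-proxy's actual API), the filtering such a proxy would do amounts to dropping the Sec-Fetch-* fields from each request before forwarding it. A minimal sketch:

```javascript
// Sketch: remove all Sec-Fetch-* fields from a header map, the way a
// request filter in an intercepting proxy would before forwarding.
function stripSecFetchHeaders(headers) {
  const out = {};
  for (const [name, value] of Object.entries(headers)) {
    if (!name.toLowerCase().startsWith('sec-fetch-')) {
      out[name] = value;
    }
  }
  return out;
}

const filtered = stripSecFetchHeaders({
  'Sec-Fetch-Mode': 'navigate',
  'Sec-Fetch-User': '?1',
  'Upgrade-Insecure-Requests': '1',
});
console.log(filtered); // only 'Upgrade-Insecure-Requests' survives
```

Note that the match is case-insensitive, since header field names are case-insensitive in HTTP.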
I'm storing my static files on a different domain, and I want my remote static server to return files with CORS headers. But the server only sends the CORS headers when it receives an "Origin" header in the request. Unfortunately, Chrome and Firefox don't send this header when I try to load my JS script with a plain
<script type="text/javascript" src="https://my-another-static-server.com/js/script.js"></script>
In Chrome's network panel I see only these request headers for this request:
Accept:*/*
Accept-Encoding:gzip, deflate, sdch
Accept-Language:ru,en-US;q=0.8,en;q=0.6
Cache-Control:no-cache
Connection:keep-alive
Host:my-another-static-server.com
Pragma:no-cache
Referer:http://localhost:63342/example.html
User-Agent:Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/46.0.2490.71 Safari/537.36
I know that browsers add the Origin header to cross-domain requests made with XMLHttpRequest, but here the request is issued by the browser's native script loading.
There is a crossorigin="anonymous" attribute that solves this problem. With it on the script tag, the browser also sends the Origin header when it loads the script.
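For reference, the corrected markup can be produced like this (the helper is hypothetical; the attribute name crossorigin and value anonymous are the standard ones):

```javascript
// Sketch: build a <script> tag string with the crossorigin attribute,
// which makes the browser fetch the script in CORS mode and send Origin.
function corsScriptTag(src) {
  return '<script crossorigin="anonymous" src="' + src + '"></script>';
}

console.log(corsScriptTag('https://my-another-static-server.com/js/script.js'));
```

Keep in mind that once the script is requested in CORS mode, the static server must answer with an Access-Control-Allow-Origin header that matches, or the load will fail.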
In a commercial web application that uses POST requests frequently, it was necessary to perform history.go(-1) to trigger back navigation. As many others have experienced, I am getting a
Confirm Form Resubmission ERR_CACHE_MISS
error in Chrome. However, the same works just fine in Firefox (this is not a duplicate, please read on).
It is true that rendering content from POST requests (without the Post/Redirect/Get design pattern) is what leads to the above problem.
However, while looking for an alternative solution, I observed that on certain web sites/apps it is possible to navigate back in Chrome (cache hit), whereas on other sites it fails. I have inspected all the HTTP headers from the successful sites, and the headers do not seem to make any difference.
Could it possibly be the browser acting differently based on the SSL certificate used by the web application, or what else could be the reason back navigation works on certain web sites?
Example working web applications :
http://gmail.com/ - Enter some random email. Click next. Enter incorrect password multiple times and use browser back button to navigate back.
https://support.hostgator.com/ - Enter some random text in search box (do this several times). Use browser back button to navigate back.
POST request used in the failing web application:
POST /post3.jsp HTTP/1.1
Host: 192.168.1.111
Connection: keep-alive
Content-Length: 18
Cache-Control: max-age=0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8
Origin: https://192.168.1.111
Upgrade-Insecure-Requests: 1
User-Agent: Mozilla/5.0 (Windows NT 6.3; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.99 Safari/537.36
Content-Type: application/x-www-form-urlencoded
Referer: https://192.168.1.111/post2.jsp
Accept-Encoding: gzip, deflate
Accept-Language: en-US,en;q=0.8
I identified that in Chrome, back navigation fails in the above workflow unless the SSL certificate used for the HTTPS communication is a valid, trusted certificate.
If you are using a self-signed certificate, add the CA certificate to the Trusted Root Certification Authorities and everything should work as expected.
I am using the $http.get method to hit the MySQL service and fetch the records of a particular table. POST and PUT are working fine; it fails only on the GET method.
In the request headers we have this:
Provisional headers shown
Accept-Language:en-US,en;q=0.8
Access-Control-Request-Headers:accept, authorization
Access-Control-Request-Method:GET
Host:d.example.com
Origin:http://localhost
Referer:http://localhost/xxxx/
User-Agent:Mozilla/5.0 (Windows NT 5.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/44.0.2403.125 Safari/537.36
It shows "Provisional headers shown", and when this happens the API call fails. Can you give some suggestions on where it goes wrong? I actually set all the headers in the MySQL service:
header("Access-Control-Allow-Origin: *");
header("Access-Control-Allow-Headers: Origin, X-Requested-With, Content-Type, Accept, Authorization");
header("Access-Control-Allow-Credentials: true");
header("Access-Control-Allow-Methods: GET, POST, OPTIONS");
header("Accept: text/html,application/json,text/plain");
header("Content-Type: application/json");
I found a solution.
Actually I am using two services: one for MySQL and the other a microservice for MongoDB.
The problem was passing the authentication header to the MySQL service. I removed the header for the MySQL service calls and the problem was solved, using the code below:
$http.defaults.headers.common = {};
This may help others.
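A more targeted alternative than clearing all common default headers is to drop the Authorization header only for requests going to the MySQL service. Sketched here as a plain function shaped like an AngularJS request interceptor (the host name d.example.com is taken from the question's header dump; treat the matching rule as an assumption about your setup):

```javascript
// Sketch: strip the Authorization header only for requests aimed at the
// MySQL service, leaving headers for the MongoDB microservice untouched.
function stripAuthForMysql(config) {
  if (config.url.indexOf('d.example.com') !== -1 && config.headers) {
    delete config.headers.Authorization;
  }
  return config;
}

const config = stripAuthForMysql({
  url: 'http://d.example.com/records',
  headers: { Authorization: 'Bearer abc', Accept: 'application/json' },
});
console.log(config.headers); // Authorization removed, Accept kept
```

In AngularJS, this logic would live in a request interceptor registered via $httpProvider.interceptors, so it runs on every $http call without touching the global defaults.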
So I understand how a manifest file works, and I am getting all the resources I need from the Network tab in Chrome's developer console. When I turn off my server and run the webpage offline, it works. But after the first refresh, the page shows the error "HTTP Error 404. The requested resource is not found." I checked which file was missing, and it was the text/html that was the webpage itself. Here is the header:
Request URL:https://offline1.exactbid.net/
Request Method:GET
Status Code:404 Not Found
Request Headers:
Accept:text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Encoding:gzip,deflate,sdch
Accept-Language:en-US,en;q=0.8
Cache-Control:max-age=0
Connection:keep-alive
Host:offline1.exactbid.net
Referer:https://offline1.exactbid.net/
User-Agent:Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/28.0.1500.63 Safari/537.36
Response Headers:
Connection:close
Content-Length:315
Content-Type:text/html; charset=us-ascii
Date:Fri, 28 Jun 2013 23:09:27 GMT
Server:Microsoft-HTTPAPI/2
UPDATE
So it seems that after I refresh the page, everything I had in the appcache disappears. Does it have to do with Cache-Control: max-age=0? I read some material online saying that it is similar to no-cache. Is that correct?
It also seems that when I try to start the page offline, it says that the cache is obsolete.
OK, after experimenting I figured out the what but not the why. I was using ASP.NET for the web page and I did not have a site master page, so I was declaring the manifest in my default.aspx directly. After I changed my code to use a site master page and declared the manifest file there, the problem was fixed. Now when I am offline and refresh, the manifest download fails (supposed to happen) but no longer labels the appcache as obsolete. I hope this helps someone. If someone knows the why, I would mark that as the answer; until then, this is it.
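To see when Chrome marks the cache obsolete, you can hook the application cache lifecycle events. A minimal sketch, written against any object with an addEventListener method so it is easy to test; in a page you would pass window.applicationCache:

```javascript
// Sketch: log the AppCache lifecycle events so an 'obsolete' transition
// (the manifest request came back 404/410) becomes visible in the console.
function watchAppCache(cache, log) {
  ['cached', 'noupdate', 'updateready', 'error', 'obsolete'].forEach(function (name) {
    cache.addEventListener(name, function () {
      log('appcache event: ' + name);
    });
  });
}
```

Usage in the page: `watchAppCache(window.applicationCache, console.log);`. An obsolete event means the server answered the manifest request with a 404, which is exactly what wipes the cache after a refresh.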