httpd-2.4.18 mod_http2 works with curl and nghttp but doesn't work with the browser - google-chrome

I have installed httpd-2.4.18 with nghttp2 1.6.0 and curl 7.46 to set up an HTTP/2 server. The server appears to speak HTTP/2 when I test it with the curl and nghttp commands (as you can see below), but when I use the browser (Google Chrome 47.0.2526.106) the response headers are HTTP/1.1 instead of HTTP/2, and the SPDY indicator is grey (it should be blue). Does anybody know why?
Commands Used
The curl command below tells me that HTTP/2 works properly:
eloy#eloy-OptiPlex-745:/usr/local/apache2/logs$ curl --http2 -I http://localhost
HTTP/1.1 101 Switching Protocols
Upgrade: h2c
Connection: Upgrade
HTTP/2.0 200
date:Thu, 07 Jan 2016 21:38:06 GMT
server:Apache/2.4.18 (Unix) OpenSSL/1.0.2e
last-modified:Mon, 11 Jun 2007 18:53:14 GMT
etag:"2d-432a5e4a73a80"
accept-ranges:bytes
content-length:45
content-type:text/html
The same with nghttp: the following command also suggests the HTTP/2 server is working properly:
eloy#eloy-OptiPlex-745:/usr/local/apache2/logs$ nghttp -uv http://localhost
[ 0.000] Connected
[ 0.000] HTTP Upgrade request
GET / HTTP/1.1
host: localhost
connection: Upgrade, HTTP2-Settings
upgrade: h2c
http2-settings: AAMAAABkAAQAAP__
accept: */*
user-agent: nghttp2/1.6.0
[ 0.001] HTTP Upgrade response
HTTP/1.1 101 Switching Protocols
Upgrade: h2c
Connection: Upgrade
[ 0.001] HTTP Upgrade success
[ 0.001] recv SETTINGS frame <length=6, flags=0x00, stream_id=0>
Response headers from browser:
HTTP/1.1 304 Not Modified
Date: Thu, 07 Jan 2016 21:49:40 GMT
Server: Apache/2.4.18 (Unix) OpenSSL/1.0.2e
Connection: Upgrade, Keep-Alive
Keep-Alive: timeout=5, max=100
ETag: "2d-432a5e4a73a80"
Request Headers from browser:
GET / HTTP/1.1
Host: localhost
Connection: keep-alive
Cache-Control: max-age=0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8
Upgrade-Insecure-Requests: 1
User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/47.0.2526.106 Safari/537.36
Accept-Encoding: gzip, deflate, sdch
Accept-Language: es-ES,es;q=0.8
If-None-Match: "2d-432a5e4a73a80"
If-Modified-Since: Mon, 11 Jun 2007 18:53:14 GMT

Browsers do not support the HTTP/1.1 to HTTP/2 upgrade (h2c).
The only way to use HTTP/2 from a browser is via TLS and ALPN.
Having said that, your "Request headers from browser" above are actually response headers and vice versa, so it's difficult to tell what you are actually doing. The request headers lack the necessary Upgrade bits.
If you make a cleartext request from a browser (i.e. using the http scheme), the browser will not try to upgrade and you will stay in HTTP/1.1 mode.
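You can check the ALPN negotiation the same way a browser would, with a few lines of Python's standard library. This is only a sketch: "localhost" is a placeholder, and it assumes your httpd has a TLS vhost with mod_http2 loaded and `Protocols h2 http/1.1` configured.

```python
import socket
import ssl

def negotiated_protocol(host, port=443):
    """Offer h2 and http/1.1 via ALPN and return what the server selects."""
    ctx = ssl.create_default_context()
    # For a local test server with a self-signed certificate you may need:
    #   ctx.check_hostname = False
    #   ctx.verify_mode = ssl.CERT_NONE
    ctx.set_alpn_protocols(["h2", "http/1.1"])
    with socket.create_connection((host, port)) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return tls.selected_alpn_protocol()

# negotiated_protocol("localhost") returns "h2" once the TLS vhost has
# mod_http2 enabled; over plain http there is no ALPN step at all.
```

If this returns "http/1.1" (or None), the browser will also stay on HTTP/1.1, regardless of what the h2c upgrade path does for curl and nghttp.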

Related

How to bypass Cloudflare 503 Please turn JavaScript on and reload the page

I know this might be a frequently asked question, but I believe this is a different one.
Cloudflare blocks programmatically sent requests by responding with status code 503 and the message "Please turn JavaScript on and reload the page.". Both the Python requests module and the curl command hit this error. However, browsing the same URL on the same host with the Chrome browser works fine, even in Incognito mode.
I have made these attempts, but all of them failed to bypass it:
Use cloudscraper module. Like this
Copy all the headers including user-agent, cookie from opened browser page. Like this
Use mechanize module. Like this
Use requests_html to run JS scripts on the page. Like this
From my inspection, in a newly opened Chrome Incognito window, visiting https://onlinelibrary.wiley.com/doi/full/10.1111/jvs.13069 triggers the following requests:
The browser sends a request to the URL with no cookies. The server responds 302, redirecting to the same URL with a cookieSet=1 query param, i.e. https://onlinelibrary.wiley.com/doi/full/10.1111/jvs.13069?cookieSet=1. The response also contains Set-Cookie headers and has no body.
The browser sends a request to the redirected URL with the newly set cookies. The server responds 302, redirecting back to the original URL with no query param. The response contains no Set-Cookie header and has no body.
The browser sends a request to the original URL with the previously set cookies. The server responds 200 with the HTML content we'd like to see as its body.
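That three-step flow can be reproduced step by step. A minimal sketch (the helper and its fetch parameter are hypothetical names; with the requests library, a Session's cookie jar plays the browser's role of carrying the Set-Cookie values between hops):

```python
def follow_redirects(fetch, url, max_hops=5):
    """Walk a 302 chain by hand, one hop at a time.
    `fetch` performs a single request and must not follow redirects itself;
    cookie handling is left to whatever client `fetch` wraps."""
    response = fetch(url)
    hops = 0
    while response.is_redirect and hops < max_hops:
        response = fetch(response.headers["Location"])
        hops += 1
    return response

# With requests (cookies persist across hops via the Session):
#   s = requests.Session()
#   r = follow_redirects(lambda u: s.get(u, allow_redirects=False), url)
```

Against this site the final hop may still be the 503 challenge, since Cloudflare evidently does not rely on the cookie dance alone.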
However, with a curl request without redirection enabled (i.e. without the -L flag), I get status code 503 and an HTML response body that says "Please turn JavaScript on and reload the page.".
curl -i -v 'https://onlinelibrary.wiley.com/doi/abs/10.1111/jvs.13069' \
--header 'accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9' \
--header 'accept-encoding: gzip, deflate, br' \
--header 'accept-language: en-US,en;q=0.9,zh-CN;q=0.8,zh;q=0.7,ja;q=0.6' \
--header 'cache-control: no-cache' \
--header 'cookie: MAID=k4Rf/MejFqG1LjKdveWUPQ==; I2KBRCK=1; rskxRunCookie=0; rCookie=i182bjtmkm3tmujm7wb4xl6fx8wuv; osano_consentmanager_uuid=35ffb0d0-e7e0-487a-a6a5-b35cad9e589f; osano_consentmanager=EtuJH5neWpR-w0VyI9rGqVBE85dQA-2D4f3nUxLGsObfRMLPNtojj-WolqO0wrIvAr3wxlwRPXQuL0CCFvMIDZxXehUBFEicwFqrV4kgDwBshiqbbfh1M3w3V6WWcesS8ZLdPX4iGQ3yTPaxmzpOEBJbeSeY5dByRkR7P2XyOEDAWPT8by2QQjsCt3y3ttreU_M3eV_MJCDCgknIWOyiKdL_FBYJz-ddg8MFAb1N8YBTRQbQAg8r-bSO1vlCqPyWlgzGn-A5xgIDWlCDIpej0Xg2rjA=; JSESSIONID=aaaFppotKtA-t7ze73Rjy; SERVER=WZ6myaEXBLGhNb4JIHwyZ2nYKk2egCfX; MACHINE_LAST_SEEN=2022-08-05T00%3A52%3A30.362-07%3A00; __cf_bm=d9mhQ_ZtETjf41X0VuxDl6GkIZbQtNLJnNIOtDoIPuA-1659685954-0-AXLwPXO1kJb2/IQc+zIesAsL71FoLTgRJqS5M5fxizuFMTw92mMT/yRv5cIq6ZMiRcZE1DchGsO2ZZMdv+/P4JSdUDMAcepY/oXIKFQgauELPNrwiwG/7XYXFRy91+qreazjYASX6Fq0Ir90MNfJ8EcWc10KJyGvSN7QtledQ6Lu9B5S1tqHoxlddPAMOtdL6Q==; lastRskxRun=1659686676640' \
--header 'pragma: no-cache' \
--header 'sec-ch-ua: ".Not/A)Brand";v="99", "Google Chrome";v="103", "Chromium";v="103"' \
--header 'sec-ch-ua-mobile: ?0' \
--header 'sec-ch-ua-platform: "macOS"' \
--header 'sec-fetch-dest: document' \
--header 'sec-fetch-mode: navigate' \
--header 'sec-fetch-site: none' \
--header 'sec-fetch-user: ?1' \
--header 'upgrade-insecure-requests: 1' \
--header 'user-agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/103.0.0.0 Safari/537.36'
* Trying 162.159.129.87...
* TCP_NODELAY set
* Connected to onlinelibrary.wiley.com (162.159.129.87) port 443 (#0)
* ALPN, offering http/1.1
* Cipher selection: ALL:!EXPORT:!EXPORT40:!EXPORT56:!aNULL:!LOW:!RC4:#STRENGTH
* successfully set certificate verify locations:
* CAfile: /Users/cosmo/anaconda3/ssl/cacert.pem
CApath: none
* TLSv1.2 (OUT), TLS header, Certificate Status (22):
* TLSv1.2 (OUT), TLS handshake, Client hello (1):
* TLSv1.2 (IN), TLS handshake, Server hello (2):
* TLSv1.2 (IN), TLS handshake, Certificate (11):
* TLSv1.2 (IN), TLS handshake, Server key exchange (12):
* TLSv1.2 (IN), TLS handshake, Server finished (14):
* TLSv1.2 (OUT), TLS handshake, Client key exchange (16):
* TLSv1.2 (OUT), TLS change cipher, Client hello (1):
* TLSv1.2 (OUT), TLS handshake, Finished (20):
* TLSv1.2 (IN), TLS change cipher, Client hello (1):
* TLSv1.2 (IN), TLS handshake, Finished (20):
* SSL connection using TLSv1.2 / ECDHE-ECDSA-AES128-GCM-SHA256
* ALPN, server accepted to use http/1.1
* Server certificate:
* subject: C=US; ST=California; L=San Francisco; O=Cloudflare, Inc.; CN=sni.cloudflaressl.com
* start date: Apr 17 00:00:00 2022 GMT
* expire date: Apr 17 23:59:59 2023 GMT
* subjectAltName: host "onlinelibrary.wiley.com" matched cert's "onlinelibrary.wiley.com"
* issuer: C=US; O=Cloudflare, Inc.; CN=Cloudflare Inc ECC CA-3
* SSL certificate verify ok.
> GET /doi/abs/10.1111/jvs.13069 HTTP/1.1
> Host: onlinelibrary.wiley.com
> accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9
> accept-encoding: gzip, deflate, br
> accept-language: en-US,en;q=0.9,zh-CN;q=0.8,zh;q=0.7,ja;q=0.6
> cache-control: no-cache
> cookie: MAID=k4Rf/MejFqG1LjKdveWUPQ==; I2KBRCK=1; rskxRunCookie=0; rCookie=i182bjtmkm3tmujm7wb4xl6fx8wuv; osano_consentmanager_uuid=35ffb0d0-e7e0-487a-a6a5-b35cad9e589f; osano_consentmanager=EtuJH5neWpR-w0VyI9rGqVBE85dQA-2D4f3nUxLGsObfRMLPNtojj-WolqO0wrIvAr3wxlwRPXQuL0CCFvMIDZxXehUBFEicwFqrV4kgDwBshiqbbfh1M3w3V6WWcesS8ZLdPX4iGQ3yTPaxmzpOEBJbeSeY5dByRkR7P2XyOEDAWPT8by2QQjsCt3y3ttreU_M3eV_MJCDCgknIWOyiKdL_FBYJz-ddg8MFAb1N8YBTRQbQAg8r-bSO1vlCqPyWlgzGn-A5xgIDWlCDIpej0Xg2rjA=; JSESSIONID=aaaFppotKtA-t7ze73Rjy; SERVER=WZ6myaEXBLGhNb4JIHwyZ2nYKk2egCfX; MACHINE_LAST_SEEN=2022-08-05T00%3A52%3A30.362-07%3A00; __cf_bm=d9mhQ_ZtETjf41X0VuxDl6GkIZbQtNLJnNIOtDoIPuA-1659685954-0-AXLwPXO1kJb2/IQc+zIesAsL71FoLTgRJqS5M5fxizuFMTw92mMT/yRv5cIq6ZMiRcZE1DchGsO2ZZMdv+/P4JSdUDMAcepY/oXIKFQgauELPNrwiwG/7XYXFRy91+qreazjYASX6Fq0Ir90MNfJ8EcWc10KJyGvSN7QtledQ6Lu9B5S1tqHoxlddPAMOtdL6Q==; lastRskxRun=1659686676640
> pragma: no-cache
> sec-ch-ua: ".Not/A)Brand";v="99", "Google Chrome";v="103", "Chromium";v="103"
> sec-ch-ua-mobile: ?0
> sec-ch-ua-platform: "macOS"
> sec-fetch-dest: document
> sec-fetch-mode: navigate
> sec-fetch-site: none
> sec-fetch-user: ?1
> upgrade-insecure-requests: 1
> user-agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/103.0.0.0 Safari/537.36
>
< HTTP/1.1 503 Service Temporarily Unavailable
HTTP/1.1 503 Service Temporarily Unavailable
< Date: Fri, 05 Aug 2022 08:56:14 GMT
Date: Fri, 05 Aug 2022 08:56:14 GMT
< Content-Type: text/html; charset=UTF-8
Content-Type: text/html; charset=UTF-8
< Transfer-Encoding: chunked
Transfer-Encoding: chunked
< Connection: close
Connection: close
< X-Frame-Options: SAMEORIGIN
X-Frame-Options: SAMEORIGIN
< Permissions-Policy: accelerometer=(),autoplay=(),camera=(),clipboard-read=(),clipboard-write=(),fullscreen=(),geolocation=(),gyroscope=(),hid=(),interest-cohort=(),magnetometer=(),microphone=(),payment=(),publickey-credentials-get=(),screen-wake-lock=(),serial=(),sync-xhr=(),usb=()
Permissions-Policy: accelerometer=(),autoplay=(),camera=(),clipboard-read=(),clipboard-write=(),fullscreen=(),geolocation=(),gyroscope=(),hid=(),interest-cohort=(),magnetometer=(),microphone=(),payment=(),publickey-credentials-get=(),screen-wake-lock=(),serial=(),sync-xhr=(),usb=()
< Cache-Control: private, max-age=0, no-store, no-cache, must-revalidate, post-check=0, pre-check=0
Cache-Control: private, max-age=0, no-store, no-cache, must-revalidate, post-check=0, pre-check=0
< Expires: Thu, 01 Jan 1970 00:00:01 GMT
Expires: Thu, 01 Jan 1970 00:00:01 GMT
< Expect-CT: max-age=604800, report-uri="https://report-uri.cloudflare.com/cdn-cgi/beacon/expect-ct"
Expect-CT: max-age=604800, report-uri="https://report-uri.cloudflare.com/cdn-cgi/beacon/expect-ct"
< Set-Cookie: __cf_bm=Z8oUUTMhz8K.._yzicdZVzO49fmFKCtgS2CDTlnFvpU-1659689774-0-ARUAfH3m6VNwz09gKVsRECZkXJf5BdqNsW+oIPcy1oKzvppiMWxz7HGFkEwMuGHGzrHRDy5nV+VVj74AxTN8ThozSiHa/8sYH0IwMMe62woC; path=/; expires=Fri, 05-Aug-22 09:26:14 GMT; domain=.onlinelibrary.wiley.com; HttpOnly; Secure; SameSite=None
Set-Cookie: __cf_bm=Z8oUUTMhz8K.._yzicdZVzO49fmFKCtgS2CDTlnFvpU-1659689774-0-ARUAfH3m6VNwz09gKVsRECZkXJf5BdqNsW+oIPcy1oKzvppiMWxz7HGFkEwMuGHGzrHRDy5nV+VVj74AxTN8ThozSiHa/8sYH0IwMMe62woC; path=/; expires=Fri, 05-Aug-22 09:26:14 GMT; domain=.onlinelibrary.wiley.com; HttpOnly; Secure; SameSite=None
< Vary: Accept-Encoding
Vary: Accept-Encoding
< Strict-Transport-Security: max-age=15552000
Strict-Transport-Security: max-age=15552000
< Server: cloudflare
Server: cloudflare
< CF-RAY: 735e5184085e52cb-LAX
CF-RAY: 735e5184085e52cb-LAX
<
<!DOCTYPE HTML>
<html lang="en-US">
...... (HTML codes saying "Please turn JavaScript on and reload the page")
Rendered by Postman, the HTML shows the same message.
And yes, Postman cannot visit the URL either.
From these observations, I believe the site behaves differently when it receives a first request from a browser versus from curl. But I don't know how Cloudflare tells a human (using a browser) apart from a bot (using curl). As I have described before, the two kinds of clients have no difference in:
IP address (they are tested on the same host)
context (both requests are the very first request)
headers (headers are copied from browser to the command line)
I ran into the same problem and solved it by adding headers to one of the solutions already mentioned in the initial post, using cloudscraper.
import cloudscraper

url = 'https://onlinelibrary.wiley.com/doi/full/10.1111/jvs.13069'
headers = {
    'Wiley-TDM-Client-Token': 'your-Wiley-TDM-Client-Token',
    'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/106.0.0.0 Safari/537.36'
}
scraper = cloudscraper.create_scraper()
r = scraper.get(url, headers=headers)
print(r.status_code)
This code snippet returns a status code of 200.
It seems that Cloudflare checks for the presence of the 'Wiley-TDM-Client-Token' header rather than for the actual token: removing it from the headers results in a 'Detected a Cloudflare version 2 challenge' error. The code posted here already works without adding your own TDM key, but I expect that paid content requires a TDM token with the right licenses.

Chrome is not caching JS and CSS

I am running an Express.js server and have tested caching in Chrome and Firefox. The headers are shown after the questions.
Could anyone tell me why Chrome is not caching the JS and CSS files?
How can I make Chrome cache the files?
Chrome request headers (initial and repeated):
GET /js/app.js HTTP/1.1
Host: 10.64.30.105
Connection: keep-alive
Accept: */*
User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/46.0.2490.80 Safari/537.36
Referer: https://10.64.30.105/
Accept-Encoding: gzip, deflate, sdch
Accept-Language: en-US,en;q=0.8,zh-CN;q=0.6,zh;q=0.4
Chrome response headers (initial and repeated):
HTTP/1.1 200 OK
Access-Control-Allow-Origin:
Access-Control-Allow-Credentials:
Access-Control-Allow-Methods:
Access-Control-Allow-Headers:
Accept-Ranges: bytes
Date: Thu, 12 Nov 2015 22:16:57 GMT
Cache-Control: public, max-age=31536000
Last-Modified: Thu, 12 Nov 2015 16:02:47 GMT
ETag: W/"XsMH2eh+CkXmU96uopajGg=="
Content-Type: application/javascript
Vary: Accept-Encoding
Content-Encoding: gzip
Connection: keep-alive
Transfer-Encoding: chunked
The same request is cached in Firefox.
Firefox initial request headers:
GET /js/app.js HTTP/1.1
Host: 10.64.30.105
User-Agent: Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:42.0) Gecko/20100101 Firefox/42.0
Accept: */*
Accept-Language: en-US,en;q=0.5
Accept-Encoding: gzip, deflate
Referer: https://10.64.30.105/
Connection: keep-alive
Pragma: no-cache
Cache-Control: no-cache
Firefox initial response headers:
HTTP/1.1 200 OK
Accept-Ranges: bytes
Date: Thu, 12 Nov 2015 22:30:27 GMT
Cache-Control: public, max-age=31536000
Last-Modified: Thu, 12 Nov 2015 16:02:47 GMT
Etag: W/"XsMH2eh+CkXmU96uopajGg=="
Content-Type: application/javascript
Vary: Accept-Encoding
Content-Encoding: gzip
Connection: keep-alive
Transfer-Encoding: chunked
There is a problem in Chrome with gzipped resources such as .js and .css files and the Vary: Accept-Encoding header.
Please check my answer given here: https://stackoverflow.com/a/40726246/135785
This solved the problem for me:
<FilesMatch "(\.js\.gz|\.css\.gz)$">
# Serve correct encoding type.
Header set Content-Encoding gzip
# Force proxies to cache gzipped & non-gzipped css/js files separately.
BrowserMatch "Chrome" ChromeFound
Header append Vary Accept-Encoding env=!ChromeFound
</FilesMatch>
Check your Apache config for "Header append Vary Accept-Encoding".
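Independently of the Chrome quirk, it helps to confirm the server side is sending cache-friendly headers at all. A small sketch (the helper name is made up; the header values are the ones from the question's dump):

```python
def cache_report(headers):
    """Summarize the response headers that affect browser caching.
    `headers` is a plain dict, e.g. copied from DevTools."""
    h = {k.lower(): v for k, v in headers.items()}
    cc = h.get("cache-control", "")
    return {
        "cacheable": "max-age" in cc and "no-store" not in cc,
        "validator": h.get("etag") or h.get("last-modified"),
        "vary": h.get("vary"),
        "gzipped": h.get("content-encoding") == "gzip",
    }

# The headers from the question:
report = cache_report({
    "Cache-Control": "public, max-age=31536000",
    "ETag": 'W/"XsMH2eh+CkXmU96uopajGg=="',
    "Vary": "Accept-Encoding",
    "Content-Encoding": "gzip",
})
# report["cacheable"] is True: the server side looks fine, which is why the
# Vary/gzip interaction in Chrome is the remaining suspect.
```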

Replicating a Post action from a HTTP header

I am trying to send a remote POST action to a website to change from one state to another automatically at specific intervals; however, I am unable to decipher the HTTP header information to get the desired result. Every time I send the POST, the website doesn't accept it, so obviously I have constructed the POST incorrectly.
The working HTTP header information I captured is as follows:
http://URLXXX.com/p/9998812/update_availability
POST /p/9998812/update_availability HTTP/1.1
Host: URLXXX.com
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:39.0) Gecko/20100101 Firefox/39.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-US,en;q=0.5
Accept-Encoding: gzip, deflate
DNT: 1
Referer: http://URLXXX.com/p/9998812/s/fwkA-irHT-2kMfS
Cookie:
Connection: keep-alive
Content-Type: application/x-www-form-urlencoded
Content-Length: 138
utf8=%E2%9C%93&_method=put&authenticity_token=xMiaIdT%2Fnw%2FPbsYq%2BmVaLFnH362HIvIdXQQX3D%2F4uEo%3D&product%5Bstate%5D=active&commit=Save
HTTP/1.1 302 Found
Server: nginx/1.8.0
Date: Mon, 20 Jul 2015 06:00:16 GMT
Content-Type: text/html; charset=utf-8
Content-Length: 93
Connection: keep-alive
Status: 302 Found
Location: http://URLXXX.com/p/9998812
Set-Cookie: makara-force-master=master; expires=Mon, 20-Jul-2015 06:00:21 GMT
Set-Cookie: csrf-param=authenticity_token; path=/
Set-Cookie: _ssn=c8a813425bc34cd850277f5745ff957e; domain=.URLXXX.com; path=/; expires=Mon, 20-Jul-2015 06:30:16 GMT; HttpOnly
X-UA-Compatible: IE=Edge,chrome=1
Cache-Control: no-cache
X-Request-Id: 3bcc2b5f06cdd5215a613e01726559d9
X-Runtime: 0.160565
X-Served-By: app102.c1.prod
----------------------------------------------------------
http://URLXXX.com/p/9998812
GET /p/9998812 HTTP/1.1
Host: URLXXX.com
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:39.0) Gecko/20100101 Firefox/39.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-US,en;q=0.5
Accept-Encoding: gzip, deflate
DNT: 1
Referer: http://URLXXX.com/p/9998812/s/fwkA-irHT-2kMfS
Cookie:
Connection: keep-alive
HTTP/1.1 301 Moved Permanently
Server: nginx/1.8.0
Date: Mon, 20 Jul 2015 06:00:16 GMT
Content-Type: text/html; charset=utf-8
Content-Length: 127
Connection: keep-alive
Status: 301 Moved Permanently
Vary: User-Agent
Location: http://URLXXX.com/p/9998812/productdetail
X-UA-Compatible: IE=Edge,chrome=1
Cache-Control: no-cache
Set-Cookie: _ssn=c8a813425bc34cd850277f5745ff957e; domain=.URLXXX.com; path=/; expires=Mon, 20-Jul-2015 06:30:16 GMT; HttpOnly
X-Request-Id: 8b68dac5cb24355d19aa46c9ac22df61
X-Runtime: 0.020239
X-Served-By: app103.c1.prod
----------------------------------------------------------
http://URLXXX.com/p/9998812/productdetail
GET /p/9998812/productdetail HTTP/1.1
Host: URLXXX.com
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:39.0) Gecko/20100101 Firefox/39.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-US,en;q=0.5
Accept-Encoding: gzip, deflate
DNT: 1
Referer: http://URLXXX.com/p/9998812/s/fwkA-irHT-2kMfS
Cookie:
Connection: keep-alive
HTTP/1.1 200 OK
Server: nginx/1.8.0
Date: Mon, 20 Jul 2015 06:00:17 GMT
Content-Type: text/html; charset=utf-8
Transfer-Encoding: chunked
Connection: keep-alive
Status: 200 OK
Vary: User-Agent
X-UA-Compatible: IE=Edge,chrome=1
Etag: W/"2e41a435d3ea497f97654949f587fb46"
Cache-Control: max-age=0, private, must-revalidate
Set-Cookie: csrf-param=authenticity_token; path=/
Set-Cookie: _ssn=c8a813425bc34cd850277f5745ff957e; domain=.URLXXX.com; path=/; expires=Mon, 20-Jul-2015 06:30:17 GMT; HttpOnly
X-Request-Id: 7334d318bb69997260fd1b24f2d290de
X-Runtime: 0.369774
X-Served-By: app101.c1.prod
Content-Encoding: gzip
----------------------------------------------------------
Any help someone can provide would be great. I really just want to understand how to pass the same parameters to the site so I can replicate the function.
Thanks
You need to send the request line, the headers, a blank line, and then the data (in the case of a POST request). For the example above, it should look like this:
POST /p/9998812/update_availability HTTP/1.1
Host: URLXXX.com
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:39.0) Gecko/20100101 Firefox/39.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-US,en;q=0.5
Accept-Encoding: gzip, deflate
DNT: 1
Referer: http://URLXXX.com/p/9998812/s/fwkA-irHT-2kMfS
Cookie:
Connection: keep-alive
Content-Type: application/x-www-form-urlencoded
Content-Length: 138
utf8=%E2%9C%93&_method=put&authenticity_token=xMiaIdT%2Fnw%2FPbsYq%2BmVaLFnH362HIvIdXQQX3D%2F4uEo%3D&product%5Bstate%5D=active&commit=Save
You didn't say what you are using to make the request to the server. There are several very powerful tools (such as curl on Unix/Linux systems) and lots of powerful, friendly libraries (for instance, requests for Python) for almost every language that you can use to make HTTP requests. These handle most of the protocol details for you.
If you are writing your own HTTP client using lower-level networking libraries, you should seriously consider using one of these tools instead. If you have a good reason not to, look at the RFC that specifies HTTP (http://www.w3.org/Protocols/rfc2616/rfc2616.html) or at a comprehensive resource on HTTP.
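For example, in Python the captured POST boils down to a form dictionary. One caveat: authenticity_token is a CSRF token, and the value below (taken from the capture) is session-specific, so the server will normally reject a replay; you would have to scrape a fresh token from the page first.

```python
from urllib.parse import urlencode

# Form fields decoded from the captured request body.
form = {
    "utf8": "\u2713",  # the check mark that was sent as %E2%9C%93
    "_method": "put",
    "authenticity_token": "xMiaIdT/nw/PbsYq+mVaLFnH362HIvIdXQQX3D/4uEo=",
    "product[state]": "active",
    "commit": "Save",
}

# urlencode reproduces the exact wire format from the capture,
# including the Content-Length of 138.
body = urlencode(form)

# With the requests library, the encoding and the Content-Type /
# Content-Length headers are all handled for you:
#   requests.post("http://URLXXX.com/p/9998812/update_availability", data=form)
```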

Google compute engine console return 399 error code

When I try to reach https://cloud.google.com/console (after logging in), I get a 399 error code:
Remote Address:212.179.154.246:443
Request URL:https://cloud.google.com/console
Request Method:GET
And the response headers are:
HTTP/1.1 399 Internal Server Error
status: 399 Internal Server Error
version: HTTP/1.1
cache-control: no-cache, no-store, max-age=0, must-revalidate
content-encoding: gzip
content-length: 141
content-type: text/html; charset=utf-8
date: Tue, 13 May 2014 12:01:20 GMT
expires: Fri, 01 Jan 1990 00:00:00 GMT
pragma: no-cache
server: Google Frontend
strict-transport-security: max-age=31536000
vary: Accept-Encoding
x-frame-options: DENY
x-pan-versionid: racy-novel-20140502-rc22.375698005067444148
Any idea why this is happening?
Reaching it from other devices works, and sometimes Incognito mode works as well.
Thanks!
That's a strange error! Perhaps try connecting directly to the new console at https://console.developers.google.com/ ?

Getting duplicate http requests when setting content-type to utf-8 inside http-equiv meta tag

I've found that my site is receiving each HTTP request twice. I'm using an Apache 2 server. For example, when I visit index.php I get two separate requests for index.php (images and CSS files are only requested once), so the page is served twice and every database operation is done twice.
I've found that this is caused by the http-equiv meta tag. When I set the content-type attribute to UTF-8 I get this behaviour; removing the tag or setting another encoding (such as ISO-8859-1) eliminates the issue.
This is the HTML code for that meta tag:
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8"/>
Here are the sent and received headers captured by the Live HTTP Headers plugin, showing the duplicate request:
http://oposiziones.dev/
GET / HTTP/1.1
Host: oposiziones.dev
User-Agent: Mozilla/5.0 (X11; Linux x86_64; rv:7.0.1) Gecko/20100101 Firefox/7.0.1
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: es-es,en-us;q=0.7,en;q=0.3
Accept-Encoding: gzip, deflate
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
Connection: keep-alive
Referer: http://oposiziones.dev/error-53_q0.html
Cookie: PHPSESSID=jeup12fp5lpoo5t9k052qt7tl7
HTTP/1.1 200 OK
Date: Mon, 21 Nov 2011 11:53:25 GMT
Server: Apache/2.2.20 (Ubuntu)
X-Powered-By: PHP/5.3.6-13ubuntu3.2
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
Pragma: no-cache
Vary: Accept-Encoding
Content-Encoding: gzip
Content-Length: 6496
Keep-Alive: timeout=5, max=100
Connection: Keep-Alive
Content-Type: text/html
http://oposiziones.dev/
GET / HTTP/1.1
Host: oposiziones.dev
User-Agent: Mozilla/5.0 (X11; Linux x86_64; rv:7.0.1) Gecko/20100101 Firefox/7.0.1
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: es-es,en-us;q=0.7,en;q=0.3
Accept-Encoding: gzip, deflate
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
Connection: keep-alive
Referer: http://oposiziones.dev/error-53_q0.html
Cookie: PHPSESSID=jeup12fp5lpoo5t9k052qt7tl7
HTTP/1.1 200 OK
Date: Mon, 21 Nov 2011 11:53:26 GMT
Server: Apache/2.2.20 (Ubuntu)
X-Powered-By: PHP/5.3.6-13ubuntu3.2
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
Pragma: no-cache
Vary: Accept-Encoding
Content-Encoding: gzip
Content-Length: 6385
Keep-Alive: timeout=5, max=99
Connection: Keep-Alive
Content-Type: text/html
Does anybody have an idea of how to solve this? I need to keep the UTF-8 encoding because my database data is stored as UTF-8, and everything should be encoded as UTF-8.
I guess this is an Apache encoding issue, but I have no idea why it happens.
Thanks in advance!
I didn't find out why this happens, but I solved the issue by adding this directive to the Apache configuration file /etc/apache2/conf.d/charset:
AddDefaultCharset UTF-8
This option overrides any http-equiv charset meta tag, so the content is always sent as UTF-8. That is fine if all your content should be served in that encoding, but it won't be a solution if you use several encodings.
You can move this directive into your .htaccess so it doesn't affect the whole server, just the folder/site you want.
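The application can achieve the same thing per response by declaring the charset in the Content-Type header itself, so the browser never has to fall back to the http-equiv meta tag. A generic sketch of building that header value (in PHP this would be header('Content-Type: text/html; charset=UTF-8') before any output):

```python
def content_type(mime="text/html", charset="UTF-8"):
    """Build a Content-Type header value with an explicit charset,
    so the browser knows the encoding before parsing begins."""
    return f"{mime}; charset={charset}"

# The header the server should emit for the pages above:
header_value = content_type()  # "text/html; charset=UTF-8"
```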