Burp Suite is only intercepting my HTTP traffic. How can I also intercept HTTPS traffic on Ubuntu? I need to install Burp's CA certificate, but how?
Obtaining the certificate:
When Chrome is configured to use Burp as a proxy, go to http://burp/cert and the DER-encoded certificate will be downloaded automatically.
Alternatively, export the certificate from Burp Suite under the Proxy -> Options tab, via "Import / export CA certificate", choosing DER format.
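If you prefer to script the download, here is a minimal Python sketch (assuming Burp is listening on the default 127.0.0.1:8080; the output filename cacert.der is just a placeholder) that fetches the DER certificate through the proxy itself:

import urllib.request

# Route the request through Burp; the proxy itself serves its CA at http://burp/cert
proxy = urllib.request.ProxyHandler({"http": "http://127.0.0.1:8080"})
opener = urllib.request.build_opener(proxy)
with opener.open("http://burp/cert") as resp, open("cacert.der", "wb") as out:
    out.write(resp.read())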
Install the certificate:
Either double-click it in your file browser (Nautilus in my case), or import it directly into Chrome:
Go to Settings -> Show advanced settings... (at the bottom) -> HTTPS/SSL: Manage certificates -> Authorities (tab) -> Import.
In the file selector you must set the file filter to 'DER-encoded binary...' or 'All files' to make your certificate file visible; the default filter is Base64-encoded ASCII and our file is DER-encoded.
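As an aside, if you ever need the Base64/PEM form of the same certificate (some tools will not accept DER), here is a quick conversion sketch in Python, assuming the file was saved as cacert.der:

import ssl

# Read the DER bytes and re-encode them as PEM (Base64 with header/footer lines)
with open("cacert.der", "rb") as f:
    der = f.read()
with open("cacert.pem", "w") as f:
    f.write(ssl.DER_cert_to_PEM_cert(der))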
Now, for the step I was missing in other explanations: in the Chrome certificate manager, in the Authorities tab (where you just imported the certificate), find the newly imported certificate.
In my case it was listed as "Untrusted", which meant I still got the SSL warnings and the red padlock. Click on "untrusted PortSwigger CA" and click Edit...
Check "Trust this certificate for identifying websites." and click "OK". In my case the text "untrusted" didn't disappear directly but after restarting Chrome, the PortSwigger CA was trusted and SSL proxying works.
If this is a duplicate please tell me, but I haven't found a similar explanation.
For Mac: configuring the Burp Suite proxy with HTTPS and fixing the "Your connection is not private" message
1. Configure Chrome to use Burp as a Proxy
You can view detailed instructions for this step here:
https://support.portswigger.net/customer/portal/articles/1783070-configuring-safari-to-work-with-burp
Make sure you hit OK and Apply
2. Download and Install the Burp Certificate
http://burp/cert
You need to have the proxy enabled to do this. Once it's downloaded, double-click it to install it. Save it to the login keychain.
3. Modify certificate permissions
Open Keychain Access and search for "portswigger" to find the certificate. Right click and hit "Get Info".
Select "Always Trust".
The red Your connection is not private message should be gone now.
In Kali Linux with the Chromium browser, this worked for me:
Start Burp Suite.
Open the Chromium web browser and enter the URL 127.0.0.1:8080.
Click on "CA Certificate" to download the Burp Suite certificate.
Save file "Cacert.der" is the certifcate.
Note: when i try import directly to chromium with "der" extension the web browser did not recognized the file So the solution was next:
Open Firefox and click in settings or Preferences.
search certificates. View Image
view Certificates. View Image
Click on Import button and search cert.der previosly downloaded.
Then export (Firefox automatically export file with another extension "PortSwiggerCA.crt").
Now we can import the certificate in chromium web browser (The file "PortSwiggerCA.crt"). To import is the same steps for firefox:
Settings -> Search "certificates" -> view certificates -> authorities -> import
A site I use has suddenly moved to a new domain. The old site 301 redirects to the new site. All my settings for the site have been lost, because they are stored in localstorage on the old domain.
Normally, if I need to view everything in localstorage, I go to the site and open developer tools, and there it is.
However, if I try that now, I can only see the new empty localstorage on the new origin, which does not help me at all.
How do I view the localstorage of a site that I cannot load?
I tried interrupting the redirect by:
Disabling redirects in Chrome settings for this domain (no effect)
Hitting ESC while it loads (failed after repeated attempts)
Writing a Tampermonkey script (never runs)
window.addEventListener("beforeunload", function() { debugger; }, false) (stops, but devtools localstorage never shows old domain, even if I step through)
At no point am I stopped on the previous domain, allowing me to view the associated localstorage.
But I can see that it still exists if I go to Chrome's settings | All cookies and site data, and filter for the domain. But, unlike cookies (whose contents are viewable from this settings page), the only things localstorage tells me here are the origin, the size on disk, and the last modified date. Not the contents. But this IS something stored on my computer somehow, and I should be able to access it, right?
How do I browse the contents of localstorage locally, on my local machine?
Is there a way to redirect dev tools to a specific domain?
Is there a way of "tricking" the browser into thinking I'm on a specific domain, so that I can then open dev tools and view the localstorage like normal?
Or is there an obvious way to browse localstorage that I'm just overlooking?
Expanding on wOxxOm's solution:
For context, I'm on Ubuntu 22.04 with Python3 and mkcert already installed.
Edit the hosts file
sudo gedit /etc/hosts
I add a new line at the bottom: 127.0.0.1 (the domain I'm trying to recover)
Now, I don't have anything running on port 80, so for a regular website I could have run:
sudo python3 -m http.server 80
(sudo because Ubuntu will complain that I don't have permissions to use that port otherwise)
But here's the twist: nothing showed up in localStorage when I did that. Why? Because the original site wasn't on http, it was on https, and localStorage is scoped per origin (scheme + host + port), so the http:// version of the domain is a different origin.
Move to a folder you can leave files in.
I moved to my development folder
Generate local certificate
mkcert (target domain)
mkcert -install
This creates a couple of files in this folder: whateverdomain.pem and whateverdomain-key.pem.
Write a small Python wrapper (from this StackOverflow answer):
import http.server, ssl

# Serve the current directory over HTTPS on the standard port 443
server_address = ('localhost', 443)
httpd = http.server.HTTPServer(server_address, http.server.SimpleHTTPRequestHandler)

# ssl.wrap_socket() is deprecated (removed in Python 3.12); an SSLContext does the same job
context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
context.load_cert_chain(certfile='whateverdomain.pem', keyfile='whateverdomain-key.pem')
httpd.socket = context.wrap_socket(httpd.socket, server_side=True)

httpd.serve_forever()
The port is 443 for HTTPS, and the key file is loaded together with the certificate.
Run the new script
sudo python3 simpleserver.py
sudo again, because we're binding a privileged port.
Open the target domain in the browser.
Open DevTools: right-click on the page and choose "Inspect", or use the keyboard shortcut Ctrl+Shift+I.
Go to the Application tab.
Success! My localstorage is finally viewable. From here I was able to copy out the values I needed.
We're done, so clean up:
Stop the Python script.
Finally, re-edit /etc/hosts and remove or comment out the redirect line.
Was there a simpler way I could have run https? Maybe. But this wasn't difficult to set up, and I did recover my inaccessible localstorage, and that's the important thing.
I have a self-signed certificate that I generated as a .p12 and imported into the Mac Keychain. I've set the trust to Always Trust; however, Google Chrome still shows Not Secure and prompts me with "Your connection is not private". Safari trusts the certificate just fine. Does anyone know how to fix this? See the image below (it's not incognito, it's just a dark theme).
This worked for me, but I cannot reproduce to confirm.
1) Open Chrome
2) Go to chrome://flags/#allow-insecure-localhost
3) Enable it. Then relaunch.
4) Then disable it again. Then relaunch.
I use this for a local domain formatted like https://xxx.xxx.com.local, i.e. it wasn't just for https://localhost.
You should add an X.509 SAN (subjectAltName) extension to the cert.
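For reference, here is a minimal sketch with the Python cryptography library that generates a self-signed certificate carrying a SAN entry (the hostname dev.example.local and the filenames are placeholders):

from datetime import datetime, timedelta, timezone
from cryptography import x509
from cryptography.x509.oid import NameOID
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import rsa

key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "dev.example.local")])
now = datetime.now(timezone.utc)
cert = (
    x509.CertificateBuilder()
    .subject_name(name)
    .issuer_name(name)                      # self-signed: issuer == subject
    .public_key(key.public_key())
    .serial_number(x509.random_serial_number())
    .not_valid_before(now)
    .not_valid_after(now + timedelta(days=365))
    # The SAN is what Chrome actually checks the hostname against
    .add_extension(x509.SubjectAlternativeName([x509.DNSName("dev.example.local")]),
                   critical=False)
    .sign(key, hashes.SHA256())
)
with open("dev.pem", "wb") as f:
    f.write(cert.public_bytes(serialization.Encoding.PEM))
with open("dev-key.pem", "wb") as f:
    f.write(key.private_bytes(serialization.Encoding.PEM,
                              serialization.PrivateFormat.TraditionalOpenSSL,
                              serialization.NoEncryption()))

You would still need to import the result into Keychain and mark it trusted, as described above.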
The Network tab in the Google Chrome Developer Tools window shows almost all http requests made, but does not seem to capture anything when the http request results in a file being downloaded.
How can I capture download requests in Google Chrome?
Your file download is probably happening by opening a new window. The Network tab of Developer Tools only captures the requests of the current tab.
For example, the following link will download a file, but it will not appear in the Network tab:
Click Here to Download file
Similar things can be done with JavaScript (window.open, a dynamic hyperlink or iframe), which will also not appear in the Network tab.
You can check various JavaScript approaches here.
I have observed similar behaviour in the past.
You can check chrome://net-internals in older versions of Chrome, and chrome://net-export/ in newer versions, to monitor any type of request made by any instance/tab of Chrome.
Note: you can inspect Chrome's internal network events by typing chrome://net-export/ in Chrome's URL box.
I have faced a similar issue, and here's how I solved it.
Issue:
Debug an anchor link that downloads a file when clicked.
Debugging Process:
Steps
Go to chrome://settings/content/automaticDownloads?search=download and disable auto download
Open Chrome DevTools, Settings -> Global -> Auto-open DevTools for popups
Open Chrome DevTools, Settings -> Console -> Preserve log upon navigation
I hope that helps.
This works for a single download request without changing any Chrome settings. It does not, however, automatically display all download requests triggered in a different tab or window.
Trigger the download in the GUI.
Open Chrome's download history (chrome://downloads/).
Right-click your download and Copy link address.
Open DevTools, paste the link into the address bar of the corresponding Chrome tab and execute it.
The download-request shows in the DevTools.
You can use Fiddler for a more fine-grained look into your network traffic:
https://www.telerik.com/fiddler
*I don't work for Fiddler.
What do you mean by capture?
If you mean that nothing shows up in the Preview or Response tab, it's because the response is the actual file being downloaded.
I recently tried downloading Oracle JDK 11 with DevTools open on the Network tab, and the download request did appear there.
I have no particular configuration in this version of Chrome (Version 71.0.3578.98 (Official Build) (64-bit)).
As @jlvaquero said, if you're trying to get as much detail as possible, try Wireshark on your own local PC.
In my case I can see it by downloading a document from Google Drive with the download speed throttled to 3G.
Step one: Open the developer toolbar with F12.
Step two: Go to the Network tab and locate the video in question. To help, filter by clicking on Media.
Step three: If the video has no protection, you can right-click it, open it in a new tab, and download it with Ctrl+S. If this does not work, it is because the video has parameters preventing it. In that case, right-click again, go to the Copy section, and click "Copy as cURL".
Step four: Go to your Linux terminal (if you use Windows, you're on your own). If you don't have curl installed, run sudo apt install curl, then paste the cURL command copied from the developer bar.
Step five: Before executing the command, add --output video.mp4 to the end of it, since the response is a binary. Add --insecure only if you have certificate problems. Wait for the download to complete and be happy! (A rough Python equivalent is sketched below.)
Note: this link can help you: https://www.hanselman.com/blog/HowToDownloadEmbeddedVideosWithF12ToolsInYourBrowser.aspx
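If you'd rather do the final download from a script than from curl, here is a rough Python equivalent using the third-party requests library (the URL and headers are placeholders you would copy from the "Copy as cURL" output; verify=False mirrors --insecure):

import requests

url = "https://example.com/path/to/video.mp4"   # placeholder
headers = {"User-Agent": "Mozilla/5.0", "Referer": "https://example.com/"}  # placeholders

# Stream the binary body straight to disk instead of buffering it in memory
with requests.get(url, headers=headers, stream=True, verify=False) as r:
    r.raise_for_status()
    with open("video.mp4", "wb") as f:
        for chunk in r.iter_content(chunk_size=8192):
            f.write(chunk)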
Google Chrome has been updated to support downloads in the Network Tab
This question was asked in February of 2018, and at the time Google Chrome did not support downloads in the Network tab.
I verified this by downloading the 64.0.3282.140 build of Google Chrome, then attempting to download Spotify as an example; no event appeared in the Network tab.
Any Google Chrome version released in 2019 or later will capture all download requests in the Network tab.
I followed all the answers here, and nothing worked for me... nothing at all.
I'm on Windows 10, using Chrome version 54.0.2840.99 m,
trying to access my QNAP TS-453a locally at a static IP address (10.1.1.1):
https://10.1.1.1/cgi-bin/
I tried using imported certificates, self-signed ones, exporting and re-importing the default one, etc.; nothing works.
Some help would be really really appreciated
Valid as of Chrome v58.0.3029:
Visit the site in Chrome.
Open Developer Tools (F12)
Navigate to Security tab
Click "View certificate"
Click Details > Copy to file
Choose a save location on your local machine
Open Chrome settings
Toggle "Show Advanced Settings" (bottom of screen)
Navigate to HTTPS/SSL > Manage certificates
Click "Trusted Root Certification Authorities"
Click Import
Navigate to the cert you just stored
Quit Chrome (Ctrl+Shift+Q) and re-visit your site
NOTE:
Chrome recently (as of 05/15/17) began to require that the cert's subjectAltName field be set. This question received an answer that tells you how to do so.
In general, to troubleshoot this kind of problem, open Developer Tools, go to the Security tab, and you will see what Chrome deems wrong with that certificate.
It is likely that it doesn't include a subjectAltName extension, and the solution for adding one is here: https://stackoverflow.com/a/56530824/2873507
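If you want to check for the extension from the command line instead of DevTools, here is a small sketch using Python's ssl module plus the cryptography package (the hostname is a placeholder; verification is disabled on purpose so the self-signed certificate can still be fetched):

import socket, ssl
from cryptography import x509

host = "www.example-dev.local"        # placeholder for your site
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.check_hostname = False            # we only want to read the certificate,
ctx.verify_mode = ssl.CERT_NONE       # not to validate it
with socket.create_connection((host, 443)) as sock:
    with ctx.wrap_socket(sock, server_hostname=host) as ssock:
        der = ssock.getpeercert(binary_form=True)
cert = x509.load_der_x509_certificate(der)
try:
    san = cert.extensions.get_extension_for_class(x509.SubjectAlternativeName)
    print("SAN:", san.value.get_values_for_type(x509.DNSName))
except x509.ExtensionNotFound:
    print("No subjectAltName extension - Chrome will reject this certificate")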
I'm using Chrome 50.0.2661.87.
I develop locally and want to use self signed certificates (created with MAMP PRO 3.2.0).
I do the following steps:
Go to the site (which points via hosts file to 127.0.0.1): E.g. https://www.my-local-dev-page.com
Get an error: NET::ERR_CERT_COMMON_NAME_INVALID
I click continue
I click on the red lock in the top left corner; inside the certificate popup I choose "export to file".
I choose "DER-encoded binary X.509 (.CER)" (the German UI label is "DER-codiert-binär").
I click next and close Chrome.
I import this CER file both via the Windows user certificate store and in Chrome's SSL settings under the trusted root authorities.
I open Chrome again and the error appears again.
If I click "continue" again, it seems to work somehow, but after a few requests it falls back and the error comes again.
What I would expect is that ALL requests in the future work without a problem and that a green lock is shown on the top left.
What am I missing?
Thanks!
I got it.
The problem was that the certificate was created with a wrong CN.
Now everything works as expected!
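For anyone else chasing the same thing: a quick way to see which CN a certificate file was actually issued for is a short sketch with the Python cryptography library (the filename is a placeholder; use load_pem_x509_certificate for PEM files):

from cryptography import x509

# Print the subject (including CN) the exported DER certificate was issued for
with open("www.my-local-dev-page.com.cer", "rb") as f:   # placeholder filename
    cert = x509.load_der_x509_certificate(f.read())
print(cert.subject.rfc4514_string())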