Chrome doesn't show un-minified code despite the source map being present - google-chrome

I’m using Grunt and UglifyJS to generate source maps for my AngularJS app. It produces a file customDomain.js and customDomain.js.map.
JS file
The last line of customDomain.js looks like this:
//# sourceMappingURL=customDomain.js.map
Map file
I find two references to customDomain.js inside of customDomain.js.map, one at the beginning:
"sources":["../../../.tmp/concat/scripts/customDomain.js"]
I think this looks weird so I trim it to:
"sources":["customDomain.js"]
The second reference is at the end:
"file":"customDomain.js"
...which I leave as it is.
Testing
When I run my app in Chrome, I expect to see my development code when I click on customDomain.js, but I do not.
I can see on the console output from my web server that customDomain.js.map is indeed requested from the browser:
200 /js/customDomain.js.map (gzip)
What is missing?

"sources":["customDomain.js"] should be relative to the customDomain.map.js file.
Make sure they are in the same directory on your server if this is the case for you.
"file":"customDomain.js" should be changed to the name of the map file, in your case this would be "file":"customDomain.map.js".
Here's a map file example taken from treehouse (sourceRoot may be unnecessary in your case):
{
  "version": 3,
  "file": "script.js.map",
  "sources": [
    "app.js",
    "content.js",
    "widget.js"
  ],
  "sourceRoot": "/",
  "names": ["slideUp", "slideDown", "save"],
  "mappings": "AAA0B,kBAAhBA,QAAOC,SACjBD,OAAOC,OAAO..."
}
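Rather than hand-editing the generated map, it is usually easier to let the build write correct paths in the first place. Here is a minimal sketch, assuming grunt-contrib-uglify is the task producing the bundle (the paths are illustrative, not necessarily your layout):

// Gruntfile.js (sketch) -- assumes grunt-contrib-uglify; adjust paths to your project
module.exports = function (grunt) {
  grunt.initConfig({
    uglify: {
      dist: {
        options: {
          sourceMap: true,
          // write the map next to the minified file so the relative
          // paths in "sources" resolve on the server
          sourceMapName: 'dist/js/customDomain.js.map'
        },
        files: {
          'dist/js/customDomain.js': ['.tmp/concat/scripts/customDomain.js']
        }
      }
    }
  });

  grunt.loadNpmTasks('grunt-contrib-uglify');
};

With both files deployed to the same directory, the sourceMappingURL comment and the "sources" entries should line up without manual edits.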

Related

Why does Chrome DevTools show multiple garbled versions of my source code for my Vue application?

I have a Vue application and I'm trying to debug it in Chrome DevTools. The problem is that when I try to find the file I want to debug, I get a list of files with the same name plus some weird hash tacked onto the end.
When I open any one of these files, I get some garbled minified code.
Sometimes I can find the file I want (with the original source code) but sometimes not.
What are these weird files, and how can I find the file I want (with the original source code)? Is there a way of getting DevTools to list only the original source code files?
Thanks.
What tool in dev tools are you using to get that list? Seems like a list of cached files, so it's showing all the old versions of your code.
If you go to the Network tab and reload the page, you should see a list of all the resources downloaded by the browser. Choose the JS filter and you should see your Vue bundle (built by webpack) somewhere in that list.
To allow Chrome to display the source correctly, you need to generate source maps in development builds.
I am not sure what tool you are using to build and bundle, but it is likely that it already supports this.
Chrome Details:
https://developer.chrome.com/docs/devtools/javascript/source-maps/
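If the app is built with Vue CLI (an assumption; the question only mentions Vue and a webpack bundle), the minimal change is to set webpack's devtool option for development builds, roughly like this (the next answer shows a fuller configuration):

// vue.config.js (minimal sketch, assuming a Vue CLI project)
module.exports = {
  configureWebpack: {
    // 'source-map' produces readable original sources in DevTools;
    // see https://webpack.js.org/configuration/devtool/ for the trade-offs
    devtool: process.env.NODE_ENV === 'development' ? 'source-map' : false
  }
}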
OMG - debugging my debugging environment. It's SO maddening.
I'm working with Vue v2, and I'm using vuetify in my app. Here is a complete vue.config.js configuration that solved this problem for me.
// vue.config.js file
const path = require('path')
const { defineConfig } = require('@vue/cli-service')

module.exports = defineConfig({
  transpileDependencies: [
    'vuetify'
  ],
  configureWebpack: config => {
    if (process.env.NODE_ENV === 'development') {
      // See available source maps:
      // https://webpack.js.org/configuration/devtool/#devtool
      config.devtool = 'eval-source-map'
      // console.log(`NOTICE: vue.config.js directive: ${config.devtool}`)
      config.output.devtoolModuleFilenameTemplate = info => {
        let resPath = path.normalize(info.resourcePath)
        let isVue = resPath.match(/\.vue$/)
        let isGenerated = info.allLoaders
        let generated = `webpack-generated:///${resPath}?${info.hash}`
        let vuesource = `vue-source:///${resPath}`
        return isVue && isGenerated ? generated : vuesource
      }
      config.output.devtoolFallbackModuleFilenameTemplate =
        'webpack:///[resource-path]?[hash]'
    }
  },
})
I found a workaround for this. While you cannot see the source code of your file, just change the code (add a console.log or something) in the file you want to see while Vue is hot-reloading your changes. The source code then becomes reachable when you check the developer console.
There is a surprising number of developers I meet on projects who have no idea there are official browser extensions for debugging Vue, the Router, Vuex, etc.
Stumbling across this question prompted me to post this life-saving link for those who land here and have missed the existence of this essential tool:
https://devtools.vuejs.org/guide/installation.html

Google Chrome: DOMException: Registration failed - manifest empty or missing

I am trying to implement Push Notifications on my website (using Pushpad). Therefore I created a manifest.json with the following content:
{
  "gcm_sender_id": "my_gcm_sender_id",
  "gcm_user_visible_only": true
}
Of course, I created a valid GCM account and have a sender ID.
I put the manifest.json into my root directory and I also added this line to my index.php:
<link rel="manifest" href="/manifest.json">
Using Firefox, everything works fine and I can send and receive push notifications (so I think the manifest include works), but Chrome won't work...
The console shows the following error:
Uncaught (in promise) DOMException: Registration failed - manifest empty or missing
I searched Google for a long time and tried everything I found, but nothing works.
What I tried:
created the manifest.json in a plain text editor and saved it with type "All Files" (so no hidden .txt extension) and UTF-8 encoding
restarted Chrome
cleared Chrome's cache, history, etc.
I really hope somebody can help me.
For me it was a redirect. The manifest.json must return a 200 status code (must be directly available from the server), without any redirects.
You can check the response via
wget --max-redirect=0 https://example.com/manifest.json
or
curl -I https://example.com/manifest.json
I faced the same issue. Adding the manifest link right after the opening <head> tag worked for me. Cheers!
This may be an issue with your Service Worker scope. I ran into a similar problem when I rearranged my files/directories. Make sure your sw.js is on the same level as your manifest.json, otherwise the service worker won't be able to find your manifest. Try putting them both in the root of your directory. Optionally, you can specify the scope of your service worker by adding it to serviceWorker.register():
if ('serviceWorker' in navigator) {
  navigator.serviceWorker.register('/sw-test/sw.js', {scope: '/sw-test/'})
    .then(function(reg) {
      // registration worked
      console.log('Registration succeeded. Scope is ' + reg.scope);
    }).catch(function(error) {
      // registration failed
      console.log('Registration failed with ' + error);
    });
}
Read more here:
https://developer.mozilla.org/en-US/docs/Web/API/Service_Worker_API/Using_Service_Workers
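For comparison, a minimal sketch of the same registration when sw.js and manifest.json are both served from the site root, so the default scope covers the whole origin:

// Sketch: sw.js and manifest.json both live at the site root
if ('serviceWorker' in navigator) {
  navigator.serviceWorker.register('/sw.js')
    .then(function(reg) {
      // default scope is '/', so every page on the origin is controlled
      console.log('Registration succeeded. Scope is ' + reg.scope);
    })
    .catch(function(error) {
      console.log('Registration failed with ' + error);
    });
}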
Was wondering if your manifest.json is publicly accessible?
If not, maybe you can try making it publicly accessible to see if that helps.
Also, it seems that current Chrome, when fetching manifest.json, won't send cookies.
Because I didn't find an answer anywhere on the web, but managed to get it working after some time, I want to provide my solution for other users who probably have the same problem:
In the file where I included the Pushpad files, I had written some PHP code before the <head> tag to include some files, e.g. for the database connection. After I moved the PHP code below the <head> tag, everything worked fine.
There seem to be three ways to fix this bug:
a) No redirects for "manifest.json" file.
b) Put a link to this file at the top of the <head> tag.
c) Make sure there is no other manifest file linked before this one, because it seems the web push script will try to import the first one and fail due to the wrong data.
I have tried all three and finally forced Chrome to behave.
Adding the following block fixed this for me:
// This handler lives in the service worker script
self.addEventListener('push', (event) => {
  const title = 'Get Started With Workbox';
  const options = {
    body: event.data.text()
  };
  event.waitUntil(self.registration.showNotification(title, options));
});

Google Chrome / Chromium event filter, local file hostname?

I'm developing an extension for Chrome and I'd like to filter the webNavigation.onCompleted event so that it only fires on certain domains.
I'm using it like this:
chrome.webNavigation.onCompleted.addListener(function(details) {
  // some code here...
}, {
  url: {hostEquals: 'www.foo.bar'}
});
That works. Then I started testing this on a test page located on my local computer. That's when I ran into a problem: what's the value of the filter when the URL points to a file on the local computer (i.e. what's the hostname of a local file from the Chrome event filter's perspective):
file:///home/usr/testfile.html
I know the URL doesn't really contain a hostname, but I think it should be possible to filter these kinds of addresses too. I've tested different variations, like 'file:///', 'localhost' and '/', but none of them seem to get the job done. Leaving the filter empty means no filtering at all.
The extension works fine without the filter, but I need to get this system to work with it.
To match a file URL that starts with "/home/user/", just use urlPrefix, e.g. as follows:
chrome.webNavigation.onCompleted.addListener(function(details) {
  // Some code here...
}, {
  url: [{
    hostEquals: 'example.com',
  }, {
    // Note: This filter will only match if
    // 1) You've declared the "file://*" or "<all_urls>" permission
    //    in manifest.json, and
    // 2) You've visited chrome://extensions and ticked the checkbox
    //    "Allow access to file URLs" at your extension.
    urlPrefix: 'file:///home/user/'
  }],
});
For other filters, see the documentation at chrome.webNavigation.onCompleted.
PS. The hostname of a file: URL is whatever comes between file:// and the path. In the case of file:///, it's an empty string.
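Given that the host is effectively empty, another option (hedged; it relies on the schemes property documented for chrome.events.UrlFilter, not on anything from this answer) is to filter on the URL scheme instead of the host:

chrome.webNavigation.onCompleted.addListener(function(details) {
  // Fires for any file:// page the extension is allowed to see;
  // the file-access requirements noted above still apply
  console.log('Completed navigation to', details.url);
}, {
  url: [{ schemes: ['file'] }]
});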

Grunt doesn't run json_server task from grunt-json-server

I am using the configuration the npm page gives as an example, yet when I try to run the task using either grunt.run.task(['json_server']) or in concurrent: { server: { tasks: ['json_server'] } }, Grunt doesn't even print the task name in the console. It doesn't even give me an error if I remove the db file it tries to point to.
See: https://github.com/tfiwm/grunt-json-server/issues/4
The library uses registerMultiTask; when changed to registerTask, it worked fine for me.
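A related gotcha with multi-tasks is that they need at least one named target in initConfig and are usually run as task:target. A sketch of how that can look (the port and db option names follow the plugin's README as far as I recall, so treat them as assumptions):

// Gruntfile.js (sketch) -- grunt-json-server configured with a "dev" target
module.exports = function (grunt) {
  grunt.initConfig({
    json_server: {
      dev: {
        options: {
          port: 13300,   // assumed option name
          db: 'db.json'  // assumed option name; path to your fake database
        }
      }
    }
  });

  grunt.loadNpmTasks('grunt-json-server');

  // run the specific target rather than the bare task name
  grunt.registerTask('serve', ['json_server:dev']);
};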

Importing local json file using d3.json does not work

I am trying to import a local .json file using d3.json().
The file filename.json is stored in the same folder as my HTML file.
Yet the json parameter of the callback is null.
d3.json("filename.json", function(json) {
root = json;
root.x0 = h / 2;
root.y0 = 0;});
. . .
}
My code is basically the same as in this d3.js example
If you're running in a browser, you cannot load local files.
But it's fairly easy to run a dev server: on the command line, simply cd into the directory with your files, then:
python -m SimpleHTTPServer
(or python -m http.server using python 3)
Now in your browser, go to localhost:8000 (or whatever port is shown on the command line).
The following used to work in older versions of d3:
var json = {"my": "json"};
d3.json(json, function(json) {
root = json;
root.x0 = h / 2;
root.y0 = 0;
});
In version d3.v5, you should do it as
d3.json("file.json").then(function(data){ console.log(data)});
Similarly, with csv and other file formats.
You can find more details at https://github.com/d3/d3/blob/master/CHANGES.md
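For completeness, a small sketch of the same promise style with error handling added (the file name is just a placeholder):

// d3 v5+: d3.json returns a Promise, so failures surface in .catch
d3.json("filename.json")
  .then(function(data) {
    console.log(data); // use the parsed JSON here
  })
  .catch(function(error) {
    console.error(error); // 404s, CORS problems and parse errors end up here
  });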
Adding to the previous answers: it's simple to use the HTTP server provided by most Linux/Mac machines (just by having Python installed).
Run the following command in the root of your project
python -m SimpleHTTPServer
Then, instead of accessing file://.....index.html, open your browser at http://localhost:8000 (or whatever port the server reports). This way the browser fetches all the files in your project without being blocked.
http://bl.ocks.org/eyaler/10586116
Refer to this code; it reads from a file and creates a graph.
I also had the same problem, but later I figured out that the problem was in the JSON file I was using (an extra comma). If you are getting null here, try printing the error you are getting, for example:
d3.json("filename.json", function(error, graph) {
  alert(error);
});
This works in Firefox; in Chrome, somehow, it doesn't print the error.
Loading a local CSV or JSON file with (d3)js is not allowed by default; browsers prevent you from doing it. There are some solutions to get it working, though. The following line basically does not work (CSV or JSON) because it is a local import:
d3.csv("path_to_your_csv", function(data) {console.log(data) });
Solution 1:
Disable the security in your browser
Different browsers have different security settings that you can disable. This solution can work, and you can load your files. Disabling them is, however, not advisable: it will make you vulnerable to all kinds of threats. On the other hand, who is going to use your software if you tell them to manually disable their security?
Disable the security in Chrome:
--disable-web-security
--allow-file-access-from-files
Solution 2:
Load your csv/json file from a website.
This may seem like a weird solution, but it works. It is an easy fix, though it can be impractical. See here for an example and check out the page source. This is the idea:
d3.csv("https://path_to_your_csv", function(data) {console.log(data) });
Solution 3:
Start your own server, e.g. with Python.
Such a server does not include all kinds of security checks. This may be a solution when you experiment with your code on your own machine; in many cases, though, it is not the solution when you have users. This example will serve HTTP on port 8888 unless that port is already taken:
python -m http.server 8888
python -m SimpleHTTPServer 8888 &
Open the (Chrome) browser address bar and type the address below. This will open index.html; in case you have a different name, type the path to that local HTML page.
localhost:8888
Solution 4:
Use localhost and CORS
You could use localhost and CORS, but the approach is not user-friendly, because setting it up may not be so straightforward.
Solution 5:
Embed your data in the HTML file
I like this solution the most. Instead of loading your CSV, you can write a script that embeds your data directly in the HTML. This lets users use their favorite browser, and there are no security issues. This solution may not be so elegant, because your HTML file can grow very large depending on your data, but it will work. See here for an example and check out the page source.
Remove this line:
d3.csv("path_to_your_csv", function(data) { })
Replace with this:
var data = [
  $DATA_COMES_HERE$
];
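To make the replacement concrete, here is what it can look like with made-up rows (the real content of $DATA_COMES_HERE$ is whatever your csv/json held):

// Purely hypothetical data, only to illustrate the shape of the replacement
var data = [
  { "name": "a", "value": 1 },
  { "name": "b", "value": 2 }
];

// From here on, use data exactly as if d3.csv/d3.json had delivered it
data.forEach(function(d) {
  console.log(d.name, d.value);
});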
You can't readily read local files, at least not in Chrome, and possibly not in other browsers either.
The simplest workaround is to include your JSON data in your script file, get rid of the d3.json call, and keep the code from the callback you passed to it.
Your code would then look like this:
json = { ... };
root = json;
root.x0 = h / 2;
root.y0 = 0;
...
I have used this
d3.json("graph.json", function(error, xyz) {
if (error) throw error;
// the rest of my d3 graph code here
}
so you can refer to your json file by using the variable xyz and graph is the name of my local json file
Use the resource as a local variable
// Inline the data in a variable instead of loading it from a file
var filename = {x0: 0, y0: 0};

// You can use a different name for the function than json;
// this override simply hands the object straight to the callback
d3.json = (x, cb) => cb.call(null, x);

d3.json(filename, function(json) {
  root = json;
  root.x0 = h / 2;
  root.y0 = 0;
});
// ...