Say I was to sign in to a new user on Chrome and had extensions from a previous computer synced to my account. Where are these extensions downloaded from after I sign in? I thought maybe update_url in manifest.json, but I'm not sure.
Let's examine an actual sync payload through chrome://sync-internals/
"SPECIFICS": {
  "encrypted": true,
  "extension": {
    "disable_reasons": "0",
    "enabled": true,
    "id": "ijglncoabcgieiokjmgdogpefdblmnle",
    "incognito_enabled": false,
    "installed_by_custodian": false,
    "name": "Desktop Notifications for Stack Exchange",
    "remote_install": false,
    "update_url": "https://clients2.google.com/service/update2/crx",
    "version": "1.6.12"
  }
},
From this, Chrome does see the update_url, which it can query for the download URL according to the auto-update protocol. Note that the actual extension is not hosted at that URL; it merely returns instructions on where to download it from. In this instance, the download will come from the Chrome Web Store.
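To illustrate, the response from an update_url is an XML "update manifest"; for a Web Store item it is shaped roughly like the fragment below (the codebase value here is a placeholder, and exact attributes can vary by protocol version):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<gupdate xmlns="http://www.google.com/update2/response" protocol="2.0">
  <app appid="ijglncoabcgieiokjmgdogpefdblmnle">
    <!-- codebase points at the actual .crx download; version is the latest published -->
    <updatecheck codebase="https://clients2.googleusercontent.com/crx/blobs/PLACEHOLDER/extension.crx"
                 version="1.6.12" />
  </app>
</gupdate>
```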
It will also sync disabled status across instances: a disabled extension will still be downloaded, but will be disabled immediately.
Since non-enterprise Chrome can't install from anywhere but CWS, and non-CWS enterprise installs take their update_urls from policies and not from Sync, I think the answer to your question is almost certainly "from CWS". I can't test it, though.
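For the curious, here is a sketch of how such an update-check query can be assembled, following the shape of the legacy extension autoupdate protocol (real Chrome sends more parameters, and newer protocol versions differ):

```javascript
// Build an update-check URL for a given update_url and extension id,
// in the style of the legacy Chrome extension autoupdate protocol.
// Illustrative sketch only; not what Chrome literally sends today.
function buildUpdateCheckUrl(updateUrl, extensionId, currentVersion) {
  // Per-extension parameters are themselves URL-encoded into a single "x" value.
  const x = `id=${extensionId}&v=${currentVersion}&uc`;
  const url = new URL(updateUrl);
  url.searchParams.set("response", "updatecheck");
  url.searchParams.set("x", x); // searchParams encodes the inner "&" and "=" for us
  return url.toString();
}

const checkUrl = buildUpdateCheckUrl(
  "https://clients2.google.com/service/update2/crx",
  "ijglncoabcgieiokjmgdogpefdblmnle",
  "1.6.12"
);
console.log(checkUrl);
```

The server answers such a request with the XML update manifest, whose codebase attribute is the real download location.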
I know that it is possible to download the derivatives via their respective urns. However, the SVF2 object in the manifest doesn't contain its urn. Therefore, I cannot download the derivative as explained here or here. Is this not supported yet? And can I compute the urn from the data returned in the manifest?
Extract of a manifest example:
{
  "urn": "SOME_URN",
  "derivatives": [
    {
      "hasThumbnail": "true",
      "children": [
        {
          "useAsDefault": true,
          "role": "3d",
          "hasThumbnail": "true",
          "children": [
            {
              ...
            },
            {
              ...
            },
            {
              "role": "graphics",
              "mime": "application/autodesk-svf2",
              "guid": "SOME_GUID",
              "type": "resource"
            }
          ],
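A small helper like the following (an illustrative sketch, not part of the Forge SDK) can walk such a manifest and confirm that only the top-level urn is present; the nested SVF2 resource nodes carry a guid but no urn:

```javascript
// Recursively walk a translation-manifest-like object and collect every
// "urn" property found, with the path where it occurred.
function collectUrns(node, path = "$", found = []) {
  if (node === null || typeof node !== "object") return found;
  if (Array.isArray(node)) {
    node.forEach((child, i) => collectUrns(child, `${path}[${i}]`, found));
    return found;
  }
  if (typeof node.urn === "string") found.push({ path, urn: node.urn });
  for (const [key, value] of Object.entries(node)) {
    if (key !== "urn") collectUrns(value, `${path}.${key}`, found);
  }
  return found;
}

// With an SVF2 manifest, only the root urn shows up:
const manifest = {
  urn: "SOME_URN",
  derivatives: [
    {
      children: [
        { role: "graphics", mime: "application/autodesk-svf2", guid: "SOME_GUID", type: "resource" },
      ],
    },
  ],
};
console.log(collectUrns(manifest)); // one entry: { path: "$", urn: "SOME_URN" }
```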
I'd like to make clear that it is possible to download the SVF2 'files', since your web browser can do it; therefore, you can access the data as well. The files are actually cached in your browser, see below.
The Viewer downloads an extra manifest file (otg_model.json) which contains additional information. But downloading the 'files' to your local machine will not help, since it requires a lot of setup to get the Viewer working properly with local SVF2 storage. With the current state of the technology, it is highly recommended that you do not try this in production. For development and debugging, there is a sample posted here which can help. But please be careful with the Autodesk EULA when doing offline workflows. This sample is a replacement for the old extract.autodesk.io sample, as people were abusing that website, and it works with both SVF and SVF2.
To answer the question in the comment section: SVF2 is still in beta, and access to the underlying data/files will probably only become available at the end of the beta. The main reason is that SVF2 and the Viewer code evolve too rapidly today to make everything generally available. So unless you keep updating them on your local machine, things may break, and therefore Autodesk is limiting the access.
Sorry for disappointing you, but ...
Unfortunately, it's expected behavior. SVF2 doesn't have a concept of a URN, and you cannot download SVF2 for offline viewing at this moment, since it's unsupported.
I have developed a Chrome extension that modifies web pages for an ASP.NET system used at my workplace.
Due to the new cookie restrictions introduced in recent versions of Chrome, I have to remove the SameSite=Lax cookie and replace it with a SameSite=None; Secure cookie.
The organisation has recently updated Chrome from 75 to 80. Now it works for some people and is broken for others.
When attempting to use the chrome.cookies API, the error is: Unchecked runtime.lastError: Failed to parse or set cookie named "ASP.NET_SessionId".
Everybody appears to be running the same version of Chrome, and the cookie key is always the same.
See below for code. I have replaced the urls for this example.
function sameSiteCookieMaker() {
  chrome.cookies.get({
    "url": "https://example.example.example.com.au",
    "name": "ASP.NET_SessionId"
  }, function(cookie) {
    if (!cookie) {
      console.warn("Cookie not found:", chrome.runtime.lastError);
      return;
    }
    var state = cookie.value; // keep the session value before removing the cookie
    chrome.cookies.remove({
      "url": "https://example.example.example.com.au",
      "name": "ASP.NET_SessionId"
    }, function() {
      // Re-create the cookie with SameSite=None; Secure
      chrome.cookies.set({
        "url": "https://example.example.example.com.au",
        "domain": "example.example.example.com.au",
        "httpOnly": true,
        "name": "ASP.NET_SessionId",
        "path": "/",
        "sameSite": "no_restriction",
        "secure": true,
        "storeId": "0",
        "value": state
      });
    });
  });
}
I got the same error. I solved it by following these rules:
url should contain the value of domain. You are not allowed to use a different domain name for url.
Chrome rejects any url with a leading dot, like https://.example.com/.
url should match secure. That is, whenever secure is true, url should start with https.
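The rules above can be sketched as a small pre-flight check (my own helper, not part of the chrome.cookies API) to run on the details object before calling chrome.cookies.set:

```javascript
// Validate a chrome.cookies.set() details object against the three rules above.
// Returns a list of problems; an empty list means the combination looks consistent.
function validateCookieDetails(details) {
  const problems = [];
  // Rule 2: a leading dot in the host may not even parse as a URL.
  if (/^[a-z]+:\/\/\./i.test(details.url)) {
    problems.push("url has a leading dot in its host");
    return problems;
  }
  const url = new URL(details.url); // throws if details.url is not a valid URL
  // Rule 1: the url's host must contain the domain value.
  if (details.domain && !url.hostname.endsWith(details.domain.replace(/^\./, ""))) {
    problems.push("url host does not contain the domain value");
  }
  // Rule 3: secure cookies need an https url.
  if (details.secure && url.protocol !== "https:") {
    problems.push("secure is true but url is not https");
  }
  return problems;
}

console.log(validateCookieDetails({
  url: "http://example.com.au/",
  domain: "example.com.au",
  secure: true
})); // flags the http-vs-secure mismatch
```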
I got the same error, and after a little research I found out my cookie size exceeded the size limit.
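For reference, browsers generally cap a single cookie at roughly 4096 bytes of name plus value (the exact accounting varies per browser, so treat the limit here as an assumption), and a quick check like this can catch the problem before chrome.cookies.set fails:

```javascript
// Rough check that a cookie's name + value stay under the usual ~4 KB per-cookie cap.
// The 4096-byte limit is a common browser default, not a guaranteed constant.
function cookieFits(name, value, limit = 4096) {
  const size = new TextEncoder().encode(name + value).length; // UTF-8 byte length
  return { size, fits: size <= limit };
}

console.log(cookieFits("ASP.NET_SessionId", "x".repeat(5000)));
// { size: 5017, fits: false }
```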
I am trying to publish a Chrome extension with the following manifest file.
Every time I publish my app it is getting rejected.
Updated
{
  "manifest_version": 2,
  "name": "Aiwozo",
  "description": "AI Work Zone Web Automation Extension is an component developed for browser interaction to implement automation on web applications.",
  "version": "1.1",
  "icons": {
    "16": "static/activate_icons/Aiwozo16.png",
    "32": "static/activate_icons/Aiwozo32.png",
    "64": "static/activate_icons/Aiwozo64.png",
    "128": "static/activate_icons/Aiwozo128.png"
  },
  "background": {
    "scripts": ["background.js"]
  },
  "browser_action": {
    "default_icon": {
      "16": "static/activate_icons/Aiwozo16.png",
      "32": "static/activate_icons/Aiwozo32.png",
      "64": "static/activate_icons/Aiwozo64.png",
      "128": "static/activate_icons/Aiwozo128.png"
    },
    "default_title": "Artificial Intelligence Work Zone"
  },
  "permissions": ["nativeMessaging", "<all_urls>"],
  "web_accessible_resources": [
    "css/general.css",
    "static/activate_icons/AIwozo16.png",
    "static/activate_icons/AIwozo32.png",
    "static/activate_icons/AIwozo64.png",
    "static/activate_icons/AIwozo128.png",
    "static/deactivate_icons/AIwozo16.png",
    "static/deactivate_icons/AIwozo32.png",
    "static/deactivate_icons/AIwozo64.png",
    "static/deactivate_icons/AIwozo128.png"
  ]
}
It seems that Google now requires you, the developer, to provide an explanation of what your extension does (according to the Single Purpose Policy) and an explanation of why specific permissions are needed.
This is on the Privacy tab of the "new" Developer Dashboard's listing:
Until those fields are filled out, Web Store blocks publishing of new extensions and updated versions of existing extensions.
On the plus side: it doesn't mean you have failed a review yet. So with good explanations you may be able to get this published. In your particular case though, those are broad permissions + arbitrary code execution. It will be tough.
Try with
"version": "1.0",
Instead of
"version": "0.01",
Manifest version 2 and its new content_security_policy are now mandatory for Chrome extensions.
I read some docs about 'sandbox mode', which seems to be a workaround for inline JavaScript, but I still have a big issue.
After some refactoring, I got the following error:
"Unsafe JavaScript attempt to access frame with URL chrome-extension://mafcgphdkdbjlngfndodameheehmfhac/eventpage.html from frame with URL chrome-extension://mafcgphdkdbjlngfndodameheehmfhac/DCE24DB153A80B735442BF97F168AE6C.cache.html. Domains, protocols and ports must match."
I can't understand why 2 files from the same extension don't have the same "Domains, protocols and ports"!
NB: Here is a part of my manifest:
"permissions": [
"http://*/",
"tabs"
],
"background": {
"page": "eventpage.html",
"persistent": false
},
"sandbox": {
"pages": [
"sandbox.html",
"DCE24DB153A80B735442BF97F168AE6C.cache.html"
]
}
...
Sandboxed pages are allowed to bypass the extension's Content Security Policy in part because sandboxing forces them into a unique origin. They don't have access to the extension's special APIs, nor can they grab its data.
http://developer.chrome.com/trunk/extensions/sandboxingEval.html offers a description of the workflow we'd suggest you use with sandboxed pages. In short, you'll need to replace direct access between the frame and its parent with postMessage-based communication.
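A minimal sketch of that postMessage workflow (the handleCommand name and the message shape are my own, not from the docs): the embedding page talks to the sandboxed iframe only through messages, never by reaching into its frame directly.

```javascript
// Message handling logic: given an incoming message, produce a reply.
// Kept as a pure function so the window wiring below stays trivial.
function handleCommand(message) {
  if (message.command === "render") {
    // In a real sandboxed page this is where you'd run the templating /
    // eval-style work that the extension CSP forbids outside the sandbox.
    return { command: "rendered", html: `<b>${message.context.name}</b>` };
  }
  return { command: "error", reason: `unknown command: ${message.command}` };
}

// --- Wiring, in the sandboxed page (e.g. DCE24DB153A80B735442BF97F168AE6C.cache.html): ---
// window.addEventListener("message", (event) => {
//   event.source.postMessage(handleCommand(event.data), event.origin);
// });
//
// --- Wiring, in the embedding page (e.g. eventpage.html): ---
// const iframe = document.getElementById("sandboxFrame");
// iframe.contentWindow.postMessage({ command: "render", context: { name: "World" } }, "*");
// window.addEventListener("message", (event) => console.log(event.data.html));
```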
I'm trying to use text-to-speech in a Chrome app, but I'm getting an error when trying to load the app.
My manifest.json looks like this:
{
  "name": "APPNAME",
  "description": "DESCRIPTION",
  "version": "3",
  "app": {
    "urls": ["APPURL"],
    "launch": { "web_url": "APPURL" }
  },
  "icons": { "24": "icon24.png", "128": "icon128.png" },
  "permissions": ["tts"]
}
The error I'm getting reads "Could not load extension from <PATH>. Access to permission 'tts' denied."
Removing the "app" part of the manifest seems to allow it to load without problems. That would make me think that TTS is limited to Chrome extensions, but the docs suggest otherwise. Changing the "tts" permission to the "cookies" permission results in the same error, but changing it to "clipboardRead" does not.
I'm attempting to load the app via: Tools > Extensions > Load unpacked extension, and I'm using Chrome 16 on Ubuntu 11.10.
Can anyone tell me what I'm doing wrong?
It turned out that some permissions are only available to extensions and packaged apps. I was trying to use tts with a hosted web app, where it is unfortunately not available.
That said, the Web Speech API is now available, along with Speech Synthesis.
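For completeness, a minimal sketch of speaking text with the Web Speech API in an ordinary page (no manifest permission needed); pickVoice is my own helper, not part of the API:

```javascript
// Pick the first voice matching a BCP 47 language prefix, e.g. "en".
function pickVoice(voices, langPrefix) {
  return voices.find((v) => v.lang.startsWith(langPrefix)) || null;
}

// --- Browser usage (speechSynthesis only exists in a page context): ---
// const utterance = new SpeechSynthesisUtterance("Hello from the Web Speech API");
// const voice = pickVoice(speechSynthesis.getVoices(), "en");
// if (voice) utterance.voice = voice;
// speechSynthesis.speak(utterance);

console.log(pickVoice([{ name: "A", lang: "de-DE" }, { name: "B", lang: "en-US" }], "en"));
// picks the en-US voice named "B"
```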