Yammer embed open-graph feed for an object, not a URL?

Is it possible to create an open-graph URL for an object that is not a www URL, like a network drive location, for example?
I tried it, and the embed feed seems to load only for www URLs.
Here are my questions:
Is it possible not to use a normal www URL for the object?
Where is the documentation on which config options are available for the open-graph feed?
Where is the documentation for the objectProperties config? I'd like to provide all of the properties manually and disable the mechanism that tries to fetch them through embed.ly or whatever.
This works fine:
yam.connect.embedFeed({
    container: "#embedded-feed",
    network: "m-files.com",
    feedType: "open-graph",
    objectProperties: {
        url: "https://www.microsoft.com",
        type: "file",
        title: "Yammer.pdf"
    }
});
But this one, for example, doesn't:
yam.connect.embedFeed({
    container: "#embedded-feed",
    network: "m-files.com",
    feedType: "open-graph",
    objectProperties: {
        // UNC path to a file on a network drive (backslashes escaped for the JS string literal)
        url: "\\\\192.168.1.8\\Something\\somefile.pdf",
        type: "file",
        title: "Yammer.pdf"
    }
});

Related

How can background and popup connect when using webpack?

I am using webpack to develop a Chrome extension. My webpack.config.js looks like this:
entry: {
    background: ['babel-polyfill', './src/background'],
    content: ['babel-polyfill', './src/content'],
    popup: ['babel-polyfill', './src/popup'],
},
output: {
    filename: '[name].js',
    path: path.resolve('./dist/'),
    publicPath: '/',
},
webpack builds the files into the dist directory, and I reference them in manifest.json for Chrome to use. However, I found that because of webpack's bundling, dist/background.js and dist/popup.js can no longer connect as before:
// popup.js
var bg = chrome.extension.getBackgroundPage();
bg.test();
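For context, the manifest.json mentioned above might look roughly like this (a sketch, assuming Manifest V2 since chrome.extension.getBackgroundPage() is used; popup.html is assumed to load popup.js with a script tag, and the file names match the [name].js output above):
{
    "manifest_version": 2,
    "name": "My Extension",
    "version": "1.0",
    "background": {
        "scripts": ["background.js"]
    },
    "browser_action": {
        "default_popup": "popup.html"
    },
    "content_scripts": [
        {
            "matches": ["<all_urls>"],
            "js": ["content.js"]
        }
    ]
}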
I want to pass some variables from popup.js to background.js. What can I do in this situation?
I just used the chrome.storage API to solve this problem; both the background page and the popup share the same storage.
As for common functions, put them in a shared JavaScript file and import it into both background and popup with webpack.
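A minimal sketch of that approach (the settings key and the stored values are placeholders chosen for illustration):
// popup.js: write the values the background page should see
chrome.storage.local.set({ settings: { userId: 42, theme: 'dark' } }, function () {
    console.log('settings saved');
});

// background.js: read the values back, and react to later changes
chrome.storage.local.get('settings', function (items) {
    console.log('current settings', items.settings);
});

chrome.storage.onChanged.addListener(function (changes, areaName) {
    if (areaName === 'local' && changes.settings) {
        console.log('settings updated', changes.settings.newValue);
    }
});
chrome.storage.onChanged also lets the background react immediately when the popup writes a new value, instead of polling.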

Source maps are detected in Chrome but the original source is not loaded, using webpack 2

When running an application that is built using webpack 2, source maps are detected in Chrome but the original source is not loaded.
I'm using webpack 2 beta 21.
These files used to be detected automatically: when a breakpoint was put in the JS file that webpack outputs, the source view would jump to the original source that was fed into webpack. But now the original sources no longer show up.
config:
var path = require("path");
var webpack = require("webpack");
var WebpackBuildNotifierPlugin = require('webpack-build-notifier');

const PATHS = {
    app: path.join(__dirname, '../client'),
    build: path.join(__dirname, '../public')
};

module.exports = {
    entry: {
        app: PATHS.app + '/app.js'
    },
    output: {
        path: PATHS.build,
        filename: '[name].js'
    },
    devtool: "source-map",
    module: {
        loaders: [
            {
                test: /\.js?$/,
                loader: 'babel-loader',
                include: [
                    path.resolve(__dirname, 'client'),
                ],
                exclude: /node_modules/
            },
            {
                test: /\.css/,
                loader: "style!css"
            }
        ]
    },
    resolve: {
        // you can now require('file') instead of require('file.js')
        extensions: ['', '.js', '.json']
    },
    plugins: [
        new WebpackBuildNotifierPlugin()
    ]
};
Generated files with source maps won't automatically redirect to their original files, because there's potentially a 1-to-many relationship.
If you see the message Source Map Detected, the original file should already appear in the side file tree or in the file explorer via Ctrl + P. If you don't know the original file name, you can open the source map file itself.
The source map path can be identified by a //# sourceMappingURL= comment or the X-SourceMap header:
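For example (file names here are illustrative), the reference looks like one of these:
at the end of the generated app.js:
    //# sourceMappingURL=app.js.map
or sent as an HTTP response header:
    X-SourceMap: /app.js.map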
Open up the source map via its URL and look at the sources property for the original file names:
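An illustrative excerpt of such a map for the config above (the exact paths will differ):
{
    "version": 3,
    "file": "app.js",
    "sources": ["webpack:///./client/app.js"],
    "sourcesContent": ["..."],
    "mappings": "..."
}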
The original file should then be visible in the Sources panel.
If you don't see the message Source Map Detected, you can manually add an external source map by right-clicking in the Sources panel and selecting Add Source Map.
Additional Resources
If that still doesn't work, you can try a source map validator.
For webpack specifically, you can configure the devtool option.
If you're mapping to a workspace, that means you already have the source code, so including it again in your source map creates unnecessary redundancy.
Use nosources-source-map instead.
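A sketch of that change in the config shown above:
// webpack.config.js
// emit a map with file names and line mappings,
// but without embedding the original source text
devtool: "nosources-source-map",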
The issue with external source maps was fixed in Chrome 52, but it looks like you've got your devtool set differently from mine. I use:
devtool: '#source-maps'
How are you building your source? If you're running with -d, it will switch to inline source maps.

RequireJS text plugin: cannot load HTML from other domain

I'd like to fetch some HTML from another domain with require.js. I know that CORS policies don't allow this easily. Note: I have configured the web server (with Access-Control-Allow-Origin "*" and other directives) and require.js far enough that all JS and CSS files (CSS via the require-css plugin) get loaded from the other domain as expected; only fetching HTML causes problems. In the browser's network log I can see that the HTML content does get loaded. However, this content does not get passed to the require function! The browser gets the content, but require.js doesn't provide it as a parameter.
My configuration:
requirejs.config({
    baseUrl: "http://some.other.domain/",
    paths: {
        jquery: 'ext/jquery/jquery.min',
        htmlTemplate: 'test.html?',
        siteCss: '../css/site'
    },
    shim: {
        htmlTemplate: [
            'css!siteCss'
        ]
    },
    config: {
        text: {
            useXhr: function (url, protocol, hostname, port) {
                return true;
            }
        }
    },
    map: {
        '*': {
            text: 'ext/require/text',
            css: 'ext/require/css.min'
        }
    }
});
require(['text!htmlTemplate'], function (htmlTemplate) {
    console.log(htmlTemplate); // prints 'undefined' into the console
});
Two notes: the useXhr configuration is taken from the question "require.js text plugin adds '.js' to the file name", but it makes no difference whether it is there or not. And I appended a ? to the htmlTemplate path; with this, the .js does not get appended to the URL and the browser loads the HTML content, but, as said before, require.js still does not pass it on as the htmlTemplate parameter.
What can I do? I've read that a file generated by the require.js optimizer wouldn't have this problem anymore (however that works...). But I need to be able to develop my JS without running the optimizer after every edit.
Update: Found one solution but I'd be happy if anyone can provide the 'right' solution.
I've found the actual problem! This part:
config: {
    text: {
        useXhr: function (url, protocol, hostname, port) {
            return true;
        }
    }
},
should really do it. However, I found out that it wasn't called at all; instead, the default implementation was called, and that returned false.
To make it work, it is necessary to use the right key in the config section, since the map section doesn't seem to be applied there.
So this is the right configuration that fetches HTML from the other domain:
requirejs.config({
    baseUrl: "http://some.other.domain/",
    paths: {
        jquery: 'ext/jquery/jquery.min',
        htmlTemplate: 'test.html', // ---> removed the '?'
        siteCss: '../css/site'
    },
    shim: {
        htmlTemplate: [
            'css!siteCss'
        ]
    },
    config: {
        'ext/require/text': { // ---> full path is required!!!
            useXhr: function (url, protocol, hostname, port) {
                return true;
            }
        }
    },
    map: {
        '*': {
            text: 'ext/require/text',
            css: 'ext/require/css.min'
        }
    }
});
require(['text!htmlTemplate'], function (htmlTemplate) {
    console.log(htmlTemplate); // now prints HTML into the console!!!
});
Hallelujah!
Found the right hint here. Another option might be to set a path for text. In any case, the configuration has to use the right key so that the function actually gets called...
I think I've found a solution. From the requirejs/text documentation:
So if the text plugin determines that the request for the resource is on another domain, it will try to access a ".js" version of the resource by using a script tag. Script tag GET requests are allowed across domains. The .js version of the resource should just be a script with a define() call in it that returns a string for the module value.
Because of that, I changed the configuration to the following, so the text plugin is not used anymore:
requirejs.config({
    baseUrl: "http://some.other.domain/",
    paths: {
        jquery: 'ext/jquery/jquery.min',
        htmlTemplate: 'test.html', // removed the '?'
        siteCss: '../css/site'
    },
    shim: {
        htmlTemplate: [
            'css!siteCss'
        ]
    },
    map: {
        '*': {
            // removed the text plugin
            css: 'ext/require/css.min'
        }
    }
    // removed the useXhr configuration for the text plugin
});
require(['htmlTemplate'], function (htmlTemplate) {
    console.log(htmlTemplate); // prints '<div>Here I am!</div>' into the console
});
Now http://some.other.domain/test.html.js gets loaded. The content of test.html is:
define(function () {
    return '<div>Here I am!</div>';
});
So I surrounded the HTML with a little bit of JS; no problem for me. And now htmlTemplate is set to the expected string. It's not pure HTML anymore, but since it is a fixed template (i.e. not generated), that is acceptable.
I am getting "No 'Access-Control-Allow-Origin' header is present on the requested resource" after adding this code:
config: {
    'ext/require/text': { // required!!!
        useXhr: function (url, protocol, hostname, port) {
            return true;
        }
    }
},
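That error means the XHR is now being made, but the server on the other domain is not answering with the CORS header (the question's author handled this with Apache directives). Purely as an illustrative sketch, the same idea in a Node/Express static server, which is an assumption and not what the question used, would look like:
var express = require('express');
var app = express();

// allow any origin to read the responses (restrict to a specific origin in production)
app.use(function (req, res, next) {
    res.setHeader('Access-Control-Allow-Origin', '*');
    next();
});

// serve test.html, the JS and the CSS from this domain
app.use(express.static(__dirname + '/public'));

app.listen(80);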

Unable to get data from JSON file on local server (port 3000)

I am hosting an HTML page on localhost:8888 with a MAMP server, and I am trying to get some data from a JSON file that I am hosting on localhost:3000 on the 'categories' route. First, I wanted to know: is this possible?
If it is not, can I route the JSON data to another site instead? If it is possible, here is the script I have embedded in my HTML:
<script>
    $(document).ready(function () {
        setInterval(test, 500);
        console.log("document ready");
        alert('page ready');
    });

    function test() {
        $.ajax({
            url: "HTTP://localhost:3000/categories",
            dataType: 'jsonp',
            success: function (json) {
                $("#Address1").html(json[0]["id"]);
            }
        });
    }
</script>
Here is the JSON file:
[{"_id":"5624711f1a530785d511e747","__v":0,"name":"Beverages","description":"Soft drinks, coffees, teas, beers, and ales","created":"2015-10-19T04:27:11.649Z"}]
Currently, it doesn't display any data. I have tried plain JS instead of jQuery, but it doesn't help.
This is what I got in the Chrome console: GET localhost:3000/categories?callback=jQuery1113012827125121839345_1445236000644&_=1445236000645 net::ERR_UNKNOWN_URL_SCHEME
Added http://; it does not make a difference.
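For reference, a sketch of the same request as a plain cross-origin JSON call with an absolute URL. This assumes the service on :3000 sends an Access-Control-Allow-Origin header that permits the page's origin; note also that dataType 'jsonp' only works if the server actually wraps the response in the callback, and that the sample document uses "_id", not "id":
function test() {
    $.ajax({
        url: "http://localhost:3000/categories",
        dataType: 'json', // plain JSON over CORS instead of JSONP
        success: function (json) {
            $("#Address1").html(json[0]["_id"]);
        },
        error: function (xhr, status, err) {
            console.log("request failed:", status, err);
        }
    });
}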

Extracting an MP4 direct link from JWPlayer?

I am trying to extract a video link from a streaming website.
So I use PHP with file_get_contents to open the video link and then explode to find the generated link.
It looks like config5/videoid/generatedhash/, but you can also access it with config5/videoid/.
So when I open this link, I see this:
document.write('<div id="mediaspace2" style="width:880px;height:495px">');
jwplayer.key = "key";
document.write('<div id="mediaspace" itemprop="video"></div>');

var scrwid = window.screen.width;
var def = false;
if (scrwid < 1200) def = true;

jwplayer("mediaspace").setup({
    sources: [{
        file: "http://linkto.mp4",
        label: "1080p HD",
        type: "mp4"
    }, {
        file: "http://linkto.mp4",
        label: "720p HD",
        type: "mp4"
    }, {
        file: "http://linkto.mp4",
        label: "360p",
        type: "mp4",
        "default": def
    }
and a lot more stuff
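Purely to illustrate the extraction step (the question does it in PHP with file_get_contents and explode; this sketch uses JavaScript and the placeholder URLs from the snippet above), the file/label pairs can be pulled out of that blob with a regular expression:
// 'page' would hold the text of the config5/videoid/ response
var page = '...'; // placeholder: the fetched JWPlayer setup snippet

var links = [];
var re = /file:\s*"([^"]+)"\s*,\s*label:\s*"([^"]+)"/g;
var match;
while ((match = re.exec(page)) !== null) {
    links.push({ file: match[1], label: match[2] });
}
console.log(links); // e.g. [{ file: "http://linkto.mp4", label: "1080p HD" }, ...]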
When I copy one of the links and open it in Chrome, everything is fine. But when I try to embed it with an HTML5 video tag, it serves a file named na.flv instead.
It seems to be a protection mechanism on the website.
How can I get around it? :/