How to get metadata from Amazon Kinesis Video Streams via Video.js and http-streaming? - aws-sdk

I am currently working on the client side of Amazon Kinesis Video Streams, using video.js and http-streaming to display the video.
However, on the stream server there is some metadata (text only) attached to each fragment (see this link: https://aws.amazon.com/about-aws/whats-new/2018/10/kinesis-video-streams-fragment-level-metadata-support/).
I don't know how to get this data using the AWS JavaScript SDK (e.g. https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/KinesisVideoMedia.html).
I've tested the getMedia function, but it does not work as expected (it only returns the media info once, not for each fragment):
var kinesisvideomedia = new AWS.KinesisVideoMedia({
    //apiVersion: '2017-09-30',
    region: options.region,
    accessKeyId: options.accessKeyId,
    secretAccessKey: options.secretAccessKey,
    endpoint: response.DataEndpoint
});
// 3. Create the parameters for getMedia()
var mopts = {
    StartSelector: {
        StartSelectorType: 'EARLIEST'
    },
    StreamName: streamName
};
kinesisvideomedia.getMedia(mopts, function (error, vmresp) {
    if (error) {
        console.log(error);
    }
    //console.log(vmresp);
});
Many thanks for any support!

Your parameters only tell getMedia to start from the earliest fragment in the stream. If you want to get all the following fragments, you have to pass the ContinuationToken that was returned by the previous getMedia response when making the additional getMedia calls.
Regarding the fragment-level metadata, you need to parse the response payload yourself, for instance as shown in this example, using the video streams parser library.
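For example, a hedged sketch of such a follow-up call (continuationToken here is assumed to have been extracted from the previous response, e.g. from the AWS_KINESISVIDEO_CONTINUATION_TOKEN MKV tag):
// Assumption: continuationToken was read from the previous GetMedia payload
var nextParams = {
    StreamName: streamName,
    StartSelector: {
        StartSelectorType: 'CONTINUATION_TOKEN',
        ContinuationToken: continuationToken
    }
};
kinesisvideomedia.getMedia(nextParams, function (error, response) {
    if (error) {
        console.log(error);
    }
    // response.Payload again contains MKV data, including the fragment metadata tags
});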

getMedia is not well documented in the JS aws-sdk; the main trick is to use request.createReadStream() in order to stream the media chunks.
You could do it like this:
var kinesisvideomedia = new AWS.KinesisVideoMedia();
var kinesisvideo = new AWS.KinesisVideo();
const params = {
    APIName: "GET_MEDIA",
    StreamName: streamName
};
kinesisvideo.getDataEndpoint(params, function (err, data) {
    if (err) {
        throw (err);
    }
    console.log("Changing endpoint to", data.DataEndpoint);
    kinesisvideomedia.endpoint = data.DataEndpoint;
    var mopts = {
        StartSelector: {
            StartSelectorType: 'EARLIEST'
        },
        StreamName: streamName
    };
    const request = kinesisvideomedia.getMedia(mopts);
    const stream = request.createReadStream();
    stream.on('data', function (data) { console.log("data", data); });
});
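The payload you get from createReadStream() is raw MKV, and the fragment metadata lives in its Tags (TagName / TagString pairs), so you still need an MKV/EBML parser to extract it. Below is a rough sketch using the third-party ebml npm package; the package's event shape is an assumption on my side, so verify it against the parser you actually install:
// npm install ebml  -- node-ebml exposes a Decoder that is a Transform stream
const ebml = require('ebml');
const decoder = new ebml.Decoder();
let lastTagName = null;

decoder.on('data', function (chunk) {
    const kind = chunk[0];     // 'start' | 'tag' | 'end'
    const element = chunk[1];
    if (kind !== 'tag') {
        return;
    }
    if (element.name === 'TagName') {
        lastTagName = element.value; // or element.data.toString(), depending on the parser version
    } else if (element.name === 'TagString' && lastTagName) {
        console.log('fragment metadata:', lastTagName, '=', element.value);
        lastTagName = null;
    }
});

// pipe the GetMedia payload into the decoder instead of (or in addition to) logging it
stream.pipe(decoder);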

Related

How to use a Javascript file to refresh/reload a div from an HTML file?

I am using Node JS and have a JS file, which opens a connection to an API, works with the receiving API data and then saves the changed data into a JSON file. Next I have an HTML file, which takes the data from the JSON file and puts it into a table. At the end I open the HTML file in my browser to look at the visualized table and its data.
What I would like to happen is that the table (or, more specifically, a DIV with an ID inside the table) from the HTML file refreshes itself when the JSON data gets updated by the JS file. Kind of like a "live table/website" that I can watch change over time without the need to press F5.
Instead of just opening the HTML file locally, I have tried serving it from the JS file like this:
const http = require('http');
const path = require('path');
const fs = require('fs'); // needed for fs.readFile below
const browser = http.createServer(function (request, response) {
    var filePath = '.' + request.url;
    if (filePath == './') {
        filePath = './Table.html';
    }
    var extname = String(path.extname(filePath)).toLowerCase();
    var mimeTypes = {
        '.html': 'text/html',
        '.css': 'text/css',
        '.png': 'image/png',
        '.js': 'text/javascript',
        '.json': 'application/json'
    };
    var contentType = mimeTypes[extname] || 'application/octet-stream';
    fs.readFile(filePath, function (error, content) {
        response.writeHead(200, { 'Content-Type': contentType });
        response.end(content, 'utf-8');
    });
}).listen(3000);
This creates a working server and I am able to see the page in the browser, but sadly it doesn't update itself the way I want. I thought about some kind of function which gets called right after the JSON file is saved and tells the div to reload itself.
I also read about something like window.onload, location.load() or getElementById(), but I am not able to figure out the right way.
What can I do?
Thank you.
Websockets!
Though they might sound scary, it's very easy to get started with websockets in NodeJS, especially if you use Socket.io.
You will need two dependencies in your node application:
"socket.io": "^4.1.3",
"socketio-wildcard": "^2.0.0"
your HTML File:
<script type="module" src="https://cdnjs.cloudflare.com/ajax/libs/socket.io/4.0.0/socket.io.js"></script>
Your CLIENT SIDE JavaScript file:
var socket = io();
socket.on("update", function (data) { // "update" can be any sort of string, treat it like an event name
    console.log(data);
    // the rest of the code to update the html
});
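Since the original question was about refreshing a specific DIV, the body of that "update" handler could be as simple as this (myTableDiv and buildTableHtml are placeholders for your own element id and rendering helper):
socket.on("update", function (data) {
    // re-render only the div that holds the table
    var div = document.getElementById("myTableDiv"); // placeholder id from your Table.html
    div.innerHTML = buildTableHtml(data);            // your own (hypothetical) helper that turns the JSON into table rows
});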
your NODE JS file:
import { Server } from "socket.io";
// other code...
let io = new Server(server);
let activeConnections = {};
io.sockets.on("connection", function (socket) {
    // 'connection' is a "magic" key
    // track the active connections
    activeConnections[socket.id] = socket;
    socket.on("disconnect", function () {
        /* Not required, but you can add special handling here to prevent errors */
        delete activeConnections[socket.id];
    });
    socket.on("update", (data) => {
        // "update" is any sort of key
        console.log(data);
    });
});
// Example with Express
app.get('/some/api/call', function (req, res) {
    var data = null; // <-- your API processing result goes here
    Object.keys(activeConnections).forEach((id) => {
        // look the socket up by its id before emitting
        activeConnections[id].emit('update', data);
    });
    res.send(data);
});
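To cover the part of the question about pushing an update right after the JSON file is saved, a minimal sketch (data.json and the saveAndBroadcast name are placeholders) could look like this:
// in the NODE JS file, wherever you currently write the JSON to disk
import fs from "fs";

function saveAndBroadcast(newData) {
    fs.writeFile("data.json", JSON.stringify(newData), function (err) {
        if (err) throw err;
        // tell every connected browser to re-render its table
        io.emit("update", newData);
    });
}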
Finally, shameful self promotion, here's one of my "dead" side projects using websockets, because I'm sure I forgot some small detail, and this might help. https://github.com/Nhawdge/robert-quest

How can I use ngCordova File api to save JSON?

I'm trying to save JSON data in my Ionic app to the local device storage. I would like to use the ngCordova File plugin. I can't seem to find any tutorials or example apps that use the exact methods they have in the docs.
Has anyone used this plugin before to save JSON data? How did you do it?
ngCordova takes away a lot of the ugliness of writing files using the file writer API.
This example has been adapted from the docs, and uses writeFile(path, file, data, replace), where path is one of the cordova.file directory constants, file is a string name for the file, and data is the string representation of the data (so we will use JSON.stringify()). replace is a boolean that, when true, overwrites the existing contents of the file.
//Write using cordova.file.dataDirectory, see File System Layout section for more info
var json = {"test": "hello world"};
$cordovaFile.writeFile(cordova.file.dataDirectory, "hello.json", JSON.stringify(json), true)
    .then(function (success) {
        // success
    }, function (error) {
        // error
        console.log(error); // error mappings are listed in the documentation
    });
For a controller, supposing we are using the controllerAs syntax, it could look something like this:
angular.module("...") // your existing module name goes here
    .controller("...", ['$cordovaFile', function ($cordovaFile) {
        var vm = this;
        vm.writeFile = function (fileName) {
            ionic.Platform.ready(function () {
                // will execute when the device is ready, or immediately if the device is already ready
                var json = {"test": "hello world"};
                $cordovaFile.writeFile(cordova.file.dataDirectory, "hello.json", JSON.stringify(json), true)
                    .then(function (success) {
                        // success
                    }, function (error) {
                        // error
                        console.log(error); // error mappings are listed in the documentation
                    });
            });
        };
    }]);
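To read the JSON back later, ngCordova also exposes $cordovaFile.readAsText(path, file); a minimal sketch, reusing the same placeholder file name as above:
$cordovaFile.readAsText(cordova.file.dataDirectory, "hello.json")
    .then(function (result) {
        // result is the raw string contents of the file
        var json = JSON.parse(result);
        console.log(json.test); // "hello world"
    }, function (error) {
        console.log(error);
    });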

AngularJS File Upload to Backend Express Server

I am trying to do a file upload in AngularJS, using the angular-file-upload library (https://github.com/danialfarid/angular-file-upload).
Here is my code:
// ===============================My HTML File===========================
<input type="file" ng-file-select="onFileSelect($files)">

// ===============================My Controller==========================
$scope.formObj = {
    name: "Test"
};
var fileToUpload;
$scope.onFileSelect = function (file) {
    fileToUpload = file[0];
};
// POST request to /api/items
$scope.addItem = function () {
    console.log($scope.formObj);
    $scope.upload = $upload.upload({
        url: '/api/items',
        method: 'POST',
        data: { myObj: $scope.formObj },
        file: fileToUpload
    }).success(function (data, status, headers, config) {
        console.log("success");
    });
};

// ================================My Backend=============================
// This is the function that will receive the POST request to /api/items
exports.create = function (req, res) {
    console.log(req.body); // req.body is just an empty object ==> {}
    // apparently, I found all the data in req._readableState.buffer[0]
    // in the form of a buffer
    var buffer = req._readableState.buffer[0];
    // trying to console.log the buffer.toString() results in something similar to this:
    // { name: "Test", image: Object }
    console.log(buffer.toString());
    return res.send(200);
};
So my backend receives the formObj with all its properties and values; however, the actual file data itself, whether as a buffer, base64, or anything else, never gets received.
I wonder why. This is my first time working with file uploads, so I don't fully understand the concept.
Please point me in the right direction.
If you are using the latest version of Express, you'll notice that
app.use(express.multipart()); is no longer bundled with Express.
So make the following configuration changes in express.js:
var multer = require('multer');
app.use(multer({ dest: './uploads/'}));
After doing this, you will find the form data and the uploaded file in req.body and req.files respectively.
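For completeness, a rough sketch of the receiving route under that setup (with multer 0.x used as global middleware, the uploaded file normally shows up on req.files keyed by the field name, while the other form fields land in req.body; exact property names may differ between multer versions):
exports.create = function (req, res) {
    console.log(req.body);  // the extra data sent with the upload, e.g. { myObj: ... }
    console.log(req.files); // e.g. { file: { originalname: ..., path: 'uploads/...' } }
    return res.send(200);
};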
Hope it helps

A solution for streaming JSON using oboe.js in AngularJS?

I'm pretty new to Angular so maybe I'm asking the impossible but anyway, here is my challenge.
As our server cannot paginate JSON data, I would like to stream the JSON and add it page by page to the controller's model. The user doesn't have to wait for the entire stream to load, so I refresh the view for every X (pagesize) records.
I found oboe.js for parsing the JSON stream and added it to my project using bower (bower install oboe --save).
I want to update the controller's model during the streaming. I did not use the $q implementation of promises, because only one .resolve(...) is possible and I want multiple pages of data loaded via the stream, so $digest needs to be called with every page. The RESTful service that is called is /service/tasks/search.
I created a factory with a search function which I call from within the controller:
'use strict';
angular.module('myStreamingApp')
    .factory('Stream', function() {
        return {
            search: function(schema, scope) {
                var loaded = 0;
                var pagesize = 100;
                // JSON streaming parser oboe.js
                oboe({
                    url: '/service/' + schema + '/search'
                })
                    // process every node which has a schema
                    .node('{schema}', function(rec) {
                        // push the record to the model data
                        scope.data.push(rec);
                        loaded++;
                        // if another page has been received then refresh the view
                        if (loaded % pagesize === 0) {
                            scope.$digest();
                        }
                    })
                    .fail(function(err) {
                        console.log('streaming error' + (err.thrown ? err.thrown.message : ''));
                    })
                    .done(function() {
                        scope.$digest();
                    });
            }
        };
    });
My controller:
'use strict';
angular.module('myStreamingApp')
    .controller('MyCtrl', function($scope, Stream) {
        $scope.data = [];
        Stream.search('tasks', $scope);
    });
It all seems to work. After a while, however, the system gets slow and the HTTP call doesn't terminate after refreshing the browser. The browser (Chrome) also crashes when too many records are loaded.
Maybe I'm on the wrong track, because passing the scope to the factory's search function doesn't "feel" right, and I suspect that calling $digest on that scope is giving me trouble. Any ideas on this subject are welcome, especially if you have an idea for an implementation where the factory (or service) returns a promise so I could use
$scope.data = Stream.search('tasks');
in the controller.
I dug in a little further and came up with the following solution. It might help someone:
The factory (named Stream) has a search function which is passed the parameters for the Ajax request and a callback function. The callback is called for every page of data loaded by the stream. The callback function is invoked via a deferred.promise so the scope can be updated automatically with every page. To access the search function I use a service (named Search) which initially returns an empty array of data. As the stream progresses, the factory calls the callback function passed by the service and the page is added to the data.
I can now call the Search service from within a controller and assign the return value to the scope's data array.
The service and the factory:
'use strict';
angular.module('myStreamingApp')
    .service('Search', function(Stream) {
        return function(params) {
            // initialize the data
            var data = [];
            // add the data page by page using a stream
            Stream.search(params, function(page) {
                // a page of records has been received,
                // add each record to the data
                _.each(page, function(record) {
                    data.push(record);
                });
            });
            return data;
        };
    })
    .factory('Stream', function($q) {
        return {
            // the search function calls the oboe module to get the JSON data in a stream
            search: function(params, callback) {
                // the defer will be resolved immediately
                var defer = $q.defer();
                var promise = defer.promise;
                // counter for the received records
                var counter = 0;
                // I use an arbitrary page size
                var pagesize = 100;
                // initialize the page of records
                var page = [];
                // call the oboe function to start the stream
                oboe({
                    url: '/api/' + params.schema + '/search',
                    method: 'GET'
                })
                    // once the stream starts we can resolve the defer
                    .start(function() {
                        defer.resolve();
                    })
                    // for every node containing an _id
                    .node('{_id}', function(node) {
                        // we push the node to the page
                        page.push(node);
                        counter++;
                        // if the pagesize is reached, return the page using the promise
                        if (counter % pagesize === 0) {
                            promise.then(callback(page));
                            // initialize the page
                            page = [];
                        }
                    })
                    .done(function() {
                        // when the stream is done, make sure the last page of nodes is returned
                        promise.then(callback(page));
                    });
                return promise;
            }
        };
    });
Now I can call the service from within a controller and assign the response of the service to the scope:
$scope.mydata = Search({schema: 'tasks'});
Update August 30, 2014
I have created an angular-oboe module based on the above solution, structured a bit more cleanly.
https://github.com/RonB/angular-oboe

DevExtreme datasource can't load Data Service data

I tried to accomplish the tutorial here, and when I used their data service, it worked just fine.
I modified the source to my data service (WCF Data Service v5.6, OData V2), and the list just shows the Loading sign and nothing happens.
The code should load any data type; it just has to be mapped accordingly. My service is available through the browser, I checked.
Here is the code:
DevExTestApp.home = function (params) {
    var viewModel = {
        dataSource: DevExpress.data.createDataSource({
            load: function (loadOptions) {
                if (loadOptions.refresh) {
                    try {
                        var deferred = new $.Deferred();
                        $.get("http://192.168.1.101/dataservice/dataservice.svc/People")
                            .done(function (result) {
                                var mapped = $.map(result, function (data) {
                                    return {
                                        name: data.Name
                                    };
                                });
                                deferred.resolve(mapped);
                            });
                    }
                    catch (err) {
                        alert(err.message);
                    }
                    return deferred;
                }
            }
        })
    };
    return viewModel;
};
What else should I set?
The try-catch block would not help in this case, because the data loading is async. Instead, subscribe to the fail callback:
$.get(url)
    .done(doneFunc)
    .fail(failFunc);
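Applied to your load function, that could look roughly like this (same URL and mapping as in your code, just with the failure path wired to the deferred so errors become visible):
$.get("http://192.168.1.101/dataservice/dataservice.svc/People")
    .done(function (result) {
        var mapped = $.map(result, function (data) {
            return { name: data.Name };
        });
        deferred.resolve(mapped);
    })
    .fail(function (jqXHR, textStatus, errorThrown) {
        // reject instead of leaving the list stuck on "Loading"
        deferred.reject(errorThrown || textStatus);
    });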
Another common problem with accessing a web service from JavaScript is the Same-Origin Policy. Your OData service has to support either CORS or JSONP. Refer to this discussion.