A solution for streaming JSON using oboe.js in AngularJS?

I'm pretty new to Angular, so maybe I'm asking the impossible, but anyway, here is my challenge.
As our server cannot paginate JSON data, I would like to stream the JSON and add it page by page to the controller's model. The user shouldn't have to wait for the entire stream to load, so I refresh the view for every X (pagesize) records.
I found oboe.js for parsing the JSON stream and added it to my project using Bower (bower install oboe --save).
I want to update the controller's model while the data is streaming. I did not use the $q implementation of promises, because only one .resolve(...) is possible and I want multiple pages of data loaded via the stream, so $digest needs to be called for every page. The RESTful service that is called is /service/tasks/search.
I created a factory with a search function which I call from within the controller:
'use strict';

angular.module('myStreamingApp')
  .factory('Stream', function() {
    return {
      search: function(schema, scope) {
        var loaded = 0;
        var pagesize = 100;
        // JSON streaming parser oboe.js
        oboe({
          url: '/service/' + schema + '/search'
        })
        // process every node which has a schema
        .node('{schema}', function(rec) {
          // push the record to the model data
          scope.data.push(rec);
          loaded++;
          // if another page has been received, refresh the view
          if (loaded % pagesize === 0) {
            scope.$digest();
          }
        })
        .fail(function(err) {
          // parenthesize the ternary, otherwise the string concatenation
          // is evaluated first and the condition is always truthy
          console.log('streaming error ' + (err.thrown ? err.thrown.message : ''));
        })
        .done(function() {
          scope.$digest();
        });
      }
    };
  });
My controller:
'use strict';

angular.module('myStreamingApp')
  .controller('MyCtrl', function($scope, Stream) {
    $scope.data = [];
    Stream.search('tasks', $scope);
  });
It all seems to work. After a while, however, the system gets slow, and the HTTP call doesn't terminate after refreshing the browser. The browser (Chrome) also crashes when too many records are loaded.
Maybe I'm on the wrong track, because passing the scope to the factory's search function doesn't "feel" right, and I suspect that calling $digest on that scope is giving me trouble. Any ideas on this subject are welcome, especially if you have an idea for an implementation where the factory (or service) could return a promise and I could use
$scope.data = Stream.search('tasks');
in the controller.

I dug in a little further and came up with the following solution. It might help someone:
The factory (named Stream) has a search function which is passed parameters for the Ajax request and a callback function. The callback is called for every page of data loaded by the stream. The callback function is invoked via a deferred promise so the scope can be updated automatically with every page. To access the search function I use a service (named Search) which initially returns an empty array of data. As the stream progresses, the factory calls the callback function passed by the service and the page is added to the data.
I can now call the Search service from within a controller and assign the return value to the scope's data array.
The service and the factory:
'use strict';

angular.module('myStreamingApp')
  .service('Search', function(Stream) {
    return function(params) {
      // initialize the data
      var data = [];
      // add the data page by page using a stream
      Stream.search(params, function(page) {
        // a page of records is received.
        // add each record to the data
        _.each(page, function(record) {
          data.push(record);
        });
      });
      return data;
    };
  })
  .factory('Stream', function($q) {
    return {
      // the search function calls the oboe module to get the JSON data in a stream
      search: function(params, callback) {
        // the defer will be resolved immediately
        var defer = $q.defer();
        var promise = defer.promise;
        // counter for the received records
        var counter = 0;
        // I use an arbitrary page size.
        var pagesize = 100;
        // initialize the page of records
        var page = [];
        // call the oboe function to start the stream
        oboe({
          url: '/api/' + params.schema + '/search',
          method: 'GET'
        })
        // once the stream starts we can resolve the defer.
        .start(function() {
          defer.resolve();
        })
        // for every node containing an _id
        .node('{_id}', function(node) {
          // we push the node to the page
          page.push(node);
          counter++;
          // if the pagesize is reached, return the page using the promise
          if (counter % pagesize === 0) {
            promise.then(callback(page));
            // initialize the page
            page = [];
          }
        })
        .done(function() {
          // when the stream is done, make sure the last page of nodes is returned
          promise.then(callback(page));
        });
        return promise;
      }
    };
  });
Now I can call the service from within a controller and assign the response of the service to the scope:
$scope.mydata = Search({schema: 'tasks'});
Update August 30, 2014
I have created an angular-oboe module with the above solution, structured a little better:
https://github.com/RonB/angular-oboe

Related

How to get metadata from Amazon Kinesis Video Streams via Video.js and http-streaming?

I am working on the client side of Amazon Kinesis Video Streams, using video.js and http-streaming to display video.
However, on the stream server there is some metadata (text only) for each fragment (see this link: https://aws.amazon.com/about-aws/whats-new/2018/10/kinesis-video-streams-fragment-level-metadata-support/).
I don't know how to get this data using the AWS JavaScript SDK (e.g. https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/KinesisVideoMedia.html).
I've tested the getMedia function, but it does not work as expected (it only gets the media info once, not for each fragment):
var kinesisvideomedia = new AWS.KinesisVideoMedia({
  //apiVersion: '2017-09-30',
  region: options.region,
  accessKeyId: options.accessKeyId,
  secretAccessKey: options.secretAccessKey,
  endpoint: response.DataEndpoint
});

// 3. Create the parameters for getMedia()
var mopts = {
  StartSelector: {
    StartSelectorType: 'EARLIEST'
  },
  StreamName: streamName
};

kinesisvideomedia.getMedia(mopts, function (error, vmresp) {
  if (error) {
    console.log(error);
  }
  //console.log(vmresp);
});
Many thanks for any support!
Your parameters only tell getMedia to grab the earliest fragment from the stream. If you want to get the following fragments, you have to pass the ContinuationToken that was returned in the response of the previous getMedia call when making additional calls to getMedia.
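For illustration, a follow-up call could look roughly like the sketch below. 'CONTINUATION_TOKEN' is a documented StartSelectorType, but the token value itself (continuationToken here) is an assumption: it has to be extracted from the previous response payload first.

// minimal sketch of a follow-up getMedia call that resumes from a
// previously obtained continuation token (continuationToken is assumed
// to have been extracted from the earlier response payload)
var nextOpts = {
  StartSelector: {
    StartSelectorType: 'CONTINUATION_TOKEN',
    ContinuationToken: continuationToken
  },
  StreamName: streamName
};

kinesisvideomedia.getMedia(nextOpts, function (error, response) {
  if (error) {
    console.log(error);
  }
  // process the next chunk of the stream here
});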
Regarding the fragment-level metadata, you need to parse the response payload, for example using the Kinesis Video Streams parser library.
getMedia is not well documented in the JS aws-sdk; the main trick is to use request.createReadStream() in order to stream the media chunks.
You could do it like this:
var kinesisvideomedia = new AWS.KinesisVideoMedia();
var kinesisvideo = new AWS.KinesisVideo();

const params = {
  APIName: "GET_MEDIA",
  StreamName: streamName
};

kinesisvideo.getDataEndpoint(params, function(err, data) {
  if (err) {
    throw(err);
  }
  console.log("Changing endpoint to", data.DataEndpoint);
  kinesisvideomedia.endpoint = data.DataEndpoint;

  var mopts = {
    StartSelector: {
      StartSelectorType: 'EARLIEST'
    },
    StreamName: streamName
  };

  const request = kinesisvideomedia.getMedia(mopts);
  const stream = request.createReadStream();
  stream.on('data', function(data) { console.log("data", data); });
});

How to respond with plain text in a FeathersJS websocket API?

I defined a Feathers service API as below:
class Monitor {
  find(_) {
    const metrics = prom.register.metrics();
    log.info(metrics);
    return new Promise((resolve) => {
      resolve({text: metrics});
    });
  }
}

function restFormatter(req, res) {
  res.format({
    'text/plain': function() {
      log('xxxx:', res);
      res.end(`The Message is: "${res.data}"`);
    }
  });
}

module.exports = function () {
  const app = this;
  // Initialize our service with any options it requires
  const service = new Monitor();
  app.configure(rest(restFormatter)).use('/metrics', service);
  // Get our initialized service so that we can bind hooks
  const monitorService = app.service('/metrics');
  // Set up our before hooks
  monitorService.before(hooks.before);
  // Set up our after hooks
  monitorService.after(hooks.after);
  return service;
};

module.exports.Monitor = Monitor;
When I call this API from the browser, I get the response below:
"# HELP nodejs_gc_runs_total Count of total garbage collections.\n# TYPE nodejs_gc_runs_total counter\n\n# HELP nodejs_gc_pause_seconds_total Time spent in GC Pause in seconds.\n# TYPE nodejs_gc_pause_seconds_total counter\n\n# HELP nodejs_gc_reclaimed_bytes_total Total number of bytes reclaimed by GC.\n# TYPE nodejs_gc_reclaimed_bytes_total counter\n"
From the above output you can see that FeathersJS doesn't return the data in plain text format; it turns my response text into a JSON string. Below is the output from a plain Express service shown in the browser:
# HELP nodejs_gc_runs_total Count of total garbage collections.
# TYPE nodejs_gc_runs_total counter
# HELP nodejs_gc_pause_seconds_total Time spent in GC Pause in seconds.
# TYPE nodejs_gc_pause_seconds_total counter
# HELP nodejs_gc_reclaimed_bytes_total Total number of bytes reclaimed by GC.
# TYPE nodejs_gc_reclaimed_bytes_total counter
# HELP newConnection The number of requests served
# TYPE newConnection counter
This output is what I really want. How can I make the FeathersJS service return the above output?
Below is my FeathersJS configuration:
app
  .use(compress())
  .options('*', cors())
  .use(cors())
  .use('/', serveStatic(app.get('public')))
  .use(bodyParser.json())
  .use(bodyParser.urlencoded({extended: true}))
  .configure(hooks())
  .configure(rest())
  .configure(
    swagger({
      docsPath: '/docs',
      uiIndex: path.join(__dirname, '../public/docs.html'),
      info: {
        title: process.env.npm_package_fullName,
        description: process.env.npm_package_description
      }
    })
  )
  .configure(
    primus(
      {
        transformer: 'websockets',
        timeout: false
      },
      (primus) => {
        primus.library();
        primus.save(path.join(__dirname, '../public/dist/primus.js'));
      }
    )
  )
  .configure(services)
  .configure(middleware);
You are configuring feathers-rest twice, which is why you still get the old output. Remove the app.configure(rest(restFormatter)) from your service file, and then either change .configure(rest()) to .configure(rest(restFormatter)) in the main file so the formatter applies to all services, or register a custom middleware for the service that does the formatting just for that service:
app.use('/metrics', service, function(req, res) {
  res.format({
    'text/plain': function() {
      log('xxxx:', res);
      res.end(`The Message is: "${res.data}"`);
    }
  });
});
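For the first option (applying the formatter to all services), the only change would be the rest() line in the main configuration file shown in the question; roughly (a sketch, keeping the rest of the chain as it already is):

app
  .configure(hooks())
  // pass the custom formatter so it applies to all services
  .configure(rest(restFormatter))
  .configure(services)
  .configure(middleware);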

html fetch multiple files

I would like to fetch multiple files at once using the new fetch API (https://fetch.spec.whatwg.org/). Is this possible natively? If so, how should I do it, leveraging promises?
var list = [];
var urls = ['1.html', '2.html', '3.html'];
var results = [];

urls.forEach(function(url, i) { // (1)
  list.push( // (2)
    fetch(url).then(function(res) {
      // res.blob() itself returns a promise, so wait for it
      // before storing the result
      return res.blob().then(function(blob) {
        results[i] = blob; // (3)
      });
    })
  );
});

Promise
  .all(list) // (4)
  .then(function() {
    alert('all requests finished!'); // (5)
  });
This is untested code! Additionally, it relies on Array.prototype.forEach and the new Promise object of ES6. The idea works like this:
1. Loop through all URLs.
2. For each URL, fetch it with the fetch API and store the returned promise in list.
3. Additionally, when the request has finished, store the result in results.
4. Create a new promise that resolves when all promises in list are resolved (i.e., all requests have finished).
5. Enjoy the fully populated results!
While implementing Boldewyn's solution in Kotlin, I pared it down to this:
fun fetchAll(vararg resources: String): Promise<Array<out Response>> {
    return Promise.all(resources.map { fetch(it) }.toTypedArray())
}
Which roughly translates to this in JavaScript:
function fetchAll(...resources) {
  var destination = []
  resources.forEach(it => {
    destination.push(fetch(it))
  })
  return Promise.all(destination)
}
Earlier, I tried to use map instead of forEach + pushing to a new array, but for some reason that simply didn't work.
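For reference, a map-based version in plain JavaScript would look something like the sketch below; map over the array of URLs should produce exactly the array of promises that Promise.all expects (the URLs in the usage example are hypothetical):

// sketch: build the array of fetch promises with map instead of
// forEach + push, then wait for all of them
function fetchAll(...resources) {
  return Promise.all(resources.map(resource => fetch(resource)));
}

// usage (hypothetical URLs)
fetchAll('1.html', '2.html', '3.html').then(responses => {
  console.log('all requests finished!', responses.length);
});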

Angularjs $resource and $http synchronous call?

I want to write two services, one with $http.get and one with $resource.
The service should receive a JSON object and looks like this; at the moment this code lives directly in my controller and not in a service:
var csvPromise = $http.get(base_url + 'DataSource/1').success(function(data) {
  $scope.data4 = JSON.stringify(data);
});
The problem is that I want to save the received data in $scope.data4 and use it after the $http.get call, but the value is empty.
Directly after this call there is an object that needs this value:
new myObject($scope.data4)
so myObject must wait until the data has arrived.
Or can I make a synchronous call with $http or $resource?
How can I do this? I have found so many examples with promises and .then but nothing has worked for me.
EDIT: I have now written a service but it didn't work:
var test = angular.module('myApp.getCSV', ['ngResource']);

test.factory('getCSV', function($log, $http, $q, $resource) {
  return {
    getData: function(id) {
      var csvPromise = $http.get(base_url + 'DataSource/' + id)
        .success(function(data) {
          return data;
        });
      return csvPromise;
    }
  };
});
and then in my controller I call this:
getCSV.getData(1).then(function(theData) {
  $scope.data4 = JSON.stringify(theData);
  new myObject($scope.data4);
});
But this did not work. I thought that once $http.get receives the data, the then function would be called.
I don't believe you can do synchronous calls. That said, you have at least two options:
1) Pass in the data using the $routeProvider resolve feature. From the documentation:
An optional map of dependencies which should be injected into the controller. If any of these dependencies are promises, the router will wait for them all to be resolved or one to be rejected before the controller is instantiated. If all the promises are resolved successfully, the values of the resolved promises are injected
An example of how to use this:
$routeProvider
  .when('/your/path', {
    templateUrl: '/app/yourtemplate.html',
    controller: 'yourController',
    resolve: {
      data: ['$route', '$http', function($route, $http) {
        return $http.get(base_url + 'DataSource/1');
      }]
    }
  });
And then in your controller:
app.controller('yourController', ['$scope', 'data', function($scope, data) {
  // "data" here is the full $http response object
  $scope.data4 = JSON.stringify(data);
  var yourObj = new myObject($scope.data4);
}]);
2) The second option is to use promises and only instantiate your new myObject($scope.data4) once the promise successfully completes.
Your code needs to be changed just a bit:
$scope.data4 = '';
var csvPromise = $http.get(base_url + 'DataSource/1');

csvPromise.then(function(data) {
  // note: with $http, the success handler receives the full response object
  $scope.data4 = JSON.stringify(data);
}, function(data) {
  // error handling should go here
  window.alert(data);
});
This should give you what it sounds to me like you need.
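As a side note on the EDIT in the question: the getCSV factory there returns the promise from .success(), whose resolved value is the full response object; the value returned inside the .success callback is simply ignored. A minimal sketch of a version that resolves with just the response body (reusing the test module, base_url and endpoint from the question) could look like this:

test.factory('getCSV', function($http) {
  return {
    getData: function(id) {
      // chain with .then and return response.data so callers
      // receive only the JSON body
      return $http.get(base_url + 'DataSource/' + id)
        .then(function(response) {
          return response.data;
        });
    }
  };
});

With that, the getCSV.getData(1).then(...) call from the question would receive the parsed data directly.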
As far as I know, there's no way to make a synchronous call with $http or $resource. It's hard-coded in the AngularJS core file:
xhr.open(method, url, true);
And you don't want to hurt your users either by blocking the browser while waiting for the data to arrive. You'd better show how you arrived at "nothing has worked for me" so we can start working on fixing it.
Have you tried calling new myObject($scope.data4) inside the success method?
$http.get(...).success(function(data) {
  $scope.data4 = JSON.stringify(data); // I've no idea why you need this.
  var stuff = new myObject($scope.data4); // This is now yours.
});

Service retrieves data from datastore but does not update ui

I have a service which retrieves data from the datastore (Web SQL). Afterwards, it stores the data in an AngularJS array. The problem is that this does not trigger changes to the UI.
In contrast, if after retrieving the data from the datastore I call a web service using a GET request and append the results to the previous array, all the data updates the UI.
Any suggestions? Is it possible that I fill the array before Angular binds the variable?
Can I somehow delay the execution of the service?
Most of the code has been taken from the following example: http://vojtajina.github.io/WebApp-CodeLab/FinalProject/
In order for the UI to "magically" update, the changes must happen on properties of the $scope (within a digest cycle). For example, when retrieving some users from a REST resource, I might do something like this:
app.controller("UserCtrl", function($http) {
$http.get("users").success(function(data) {
$scope.users = data; // update $scope.users IN the callback
}
)
Though there is a better way to retrieve data before a template is loaded (via routes/ng-view):
app.config(function($routeProvider) {
  $routeProvider
    .when("/users", {
      templateUrl: "pages/user.html",
      controller: "UserCtrl",
      resolve: {
        // users will be available on UserCtrl (inject it)
        // a factory can't be injected into a config block, so inject it
        // into the resolve function instead; it returns a promise which
        // must be resolved before $routeChangeSuccess
        users: ['userFactory', function(userFactory) {
          return userFactory.getUsers();
        }]
      }
    });
});
app.factory("userFactory", function($http, $q) {
var factory = {};
factory.getUsers = function() {
var delay = $q.defer(); // promise
$http.get("/users").success(function(data){
delay.resolve(data); // return an array of users as resolved object (parsed from JSON)
}).error(function() {
delay.reject("Unable to fetch users");
});
return delay.promise; // route will not succeed unless resolved
return factory;
});
app.controller("UserCtrl", function($http, users) { // resolved users injected
// nothing else needed, just use users it in your template - your good to go!
)
I have implemented both methods, and the latter is far preferable for two reasons:
It doesn't load the page until the resource is resolved. This allows you to show a loading icon, etc., by attaching handlers to $routeChangeStart and $routeChangeSuccess.
Furthermore, it plays better with 'enter' animations, in that your items don't annoyingly play the enter animation every time the page is loaded (since $scope.users is pre-populated, as opposed to being updated in a callback once the page has loaded).
Assuming you're assigning the data to the array in the controller, call $scope.$apply() afterwards to have the UI update.
Ex:
$scope.portfolio = {};

$scope.getPortfolio = function() {
  $.ajax({
    url: 'http://website.com:1337/portfolio',
    type: 'GET',
    success: function(data, textStatus, jqXHR) {
      $scope.portfolio = data;
      $scope.$apply();
    },
    error: function(jqXHR, textStatus, errorThrown) {
      console.log(errorThrown);
    }
  });
};
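The same pattern applies to the Web SQL case from the question: the executeSql success callback runs outside Angular's digest loop, so the assignment has to be wrapped in $scope.$apply() for the UI to pick it up. A minimal sketch (the database and table names here are made up for illustration):

// sketch: reading rows from Web SQL and pushing them onto a scope array
// inside $scope.$apply(), so Angular notices the change
var db = openDatabase('mydb', '1.0', 'example database', 2 * 1024 * 1024);

$scope.items = [];

db.transaction(function(tx) {
  tx.executeSql('SELECT * FROM items', [], function(tx, resultSet) {
    $scope.$apply(function() {
      for (var i = 0; i < resultSet.rows.length; i++) {
        $scope.items.push(resultSet.rows.item(i));
      }
    });
  });
});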