Low Latency (50ms) Video Streaming with Node.js and HTML5

OBJECTIVE:
I'm building an FPV robot and I want to control it with a web browser over a local Wi-Fi connection.
I'm using a Raspberry Pi 3B+ with Raspbian Stretch. I built my own motor control and power regulator HAT.
After lots of research and testing, I decided to use Node.js as the HTTP server and socket.io to provide low latency bidirectional communication with my robot. This stack achieves about 7ms of latency.
Picture of the robot
PROBLEM:
I need to stream low latency video from a USB camera attached to the RPI to the browser. My target is to achieve at least 640x480 resolution at 10FPS with 50ms of latency or better. I'm happy to sacrifice visual fidelity to get a quicker response from my robot.
If possible I would like to stream over UDP so that lost packets don't stall the stream.
If possible I would like to stream video that modern web browsers can natively decode. I'd like to use the H.264 codec and the HTML5 video tag.
I can fall back to a JavaScript player if there is no other option.
WHAT I TRIED:
I did extensive research and tried many tools.
Among others, I tried VLC, mjpg-streamer, GStreamer and raspivid. A few times I got a stream the web browser could view, but at best I got a latency of 700ms at 320x240. Very, very far from my target.
Currently I'm looking into WebRTC solutions.
QUESTION:
I'd like suggestions for Node.js packages or other solutions that provide a UDP H.264 video stream that can be decoded by an HTML5 video tag, with a target latency of 50ms.
Thanks
UPDATE:
Thanks for your answers! I'll keep updating this question and I'll post the solution once it works.
PUSH INDIVIDUAL FRAMES
I tried a different approach by pushing individual 200KB 640x480 JPEG frames through a websocket, and I got a latency of about 190ms. I can probably do a lot better by reusing objects, but I'm putting this attempt on hold for now.
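A minimal sketch of how such a frame pusher could look on the server side (illustrative only: it assumes ffmpeg is installed and the camera is on /dev/video0; the port, frame rate and quality settings are placeholders, and the client is expected to draw each binary websocket message it receives):
//Sketch of the "push individual JPEG frames" approach.
//ffmpeg emits an MJPEG stream on stdout; each complete JPEG is broadcast as one websocket message.
var spawn = require("child_process").spawn;
var websocket = require("ws");
var frame_server = new websocket.Server({ port: 8083, perMessageDeflate: false });
var ffmpeg = spawn("ffmpeg", [
    "-f", "v4l2", "-framerate", "10", "-video_size", "640x480", "-i", "/dev/video0",
    "-f", "image2pipe", "-codec:v", "mjpeg", "-q:v", "7", "-"
]);
var buffer = Buffer.alloc(0);
ffmpeg.stdout.on("data", function (chunk) {
    buffer = Buffer.concat([buffer, chunk]);
    //A JPEG ends with the EOI marker 0xFF 0xD9; emit every complete frame accumulated so far
    var end;
    while ((end = buffer.indexOf(Buffer.from([0xff, 0xd9]))) !== -1) {
        var frame = buffer.slice(0, end + 2);
        buffer = buffer.slice(end + 2);
        frame_server.clients.forEach(function (client) {
            if (client.readyState === websocket.OPEN) { client.send(frame); }
        });
    }
});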
UPDATE2:
While researching WebRTC I found a stack that looked easy enough.
Server side, it uses V4L2 as the capture driver and FFmpeg to transcode the camera feed locally into an MPEG-1 HTTP stream with TS encapsulation; Node.js then relays the stream into a websocket.
Client side, a JavaScript player decodes the MPEG-1 TS stream and paints it onto a canvas element in the HTML page.
It achieves 640x480@20FPS with 240ms of latency.
Good enough for an MVP, but I'll keep working to get it down.
Code in the answer.

I adapted code from here and integrated it with an http server and socket.io controls:
https://github.com/phoboslab/jsmpeg
Server:
V4L2 -> FFMPEG (MPEG1 TS) -> NODE HTTP Server -> NODE Websocket broadcast
Client:
Websocket -> Javascript (Decode MPEG1 TS and paint to html canvas) -> Html Canvas
This stack achieves 640x480@20FPS with 240ms of latency. Still far from my target, but good enough as an MVP. The controls in both directions have a latency of 7ms, which is excellent.
This stack is held back by the transcoding and decoding stages, and the RPI gets really hot. The transport of raw data through the websocket looks good; I'm going to profile the latency of each step in the future.
Execution:
pi@MazeRunner:~ $ node node.js &
pi@MazeRunner:~ $ ffmpeg -f v4l2 -framerate 20 -video_size 640x480 -i /dev/video0 -f mpegts -codec:v mpeg1video -s 640x480 -b:v 600k -bf 0 http://localhost:8080/mystream
Server side Node.js
//operating system library. Used to get local IP address
var os = require("os");
//file system library. Used to load file stored inside back end server (https://nodejs.org/api/fs.html)
var fs = require("fs");
//http system library. Handles basic html requests
var http = require("http").createServer(http_handler);
//url library. Used to process html url requests
var url = require("url");
//Websocket
var io = require("socket.io")(http);
//Websocket used to stream video
var websocket = require("ws");
//-----------------------------------------------------------------------------------
// CONFIGURATION
//-----------------------------------------------------------------------------------
//Port the server will listen to
var server_port = 8080;
var websocket_stream_port = 8082;
//Path of the http and css files for the http server
var file_index_name = "index.html";
var file_css_name = "style.css";
var file_jsplayer_name = "jsmpeg.min.js";
//Http and css files loaded into memory for fast access
var file_index;
var file_css;
var file_jsplayer;
//Name of the local video stream
var stream_name = "mystream";
//-----------------------------------------------------------------------------------
// DETECT SERVER OWN IP
//-----------------------------------------------------------------------------------
//If just one interface, store the server IP Here
var server_ip;
//Get local IP address of the server
//https://stackoverflow.com/questions/3653065/get-local-ip-address-in-node-js
var ifaces = os.networkInterfaces();
Object.keys(ifaces).forEach
(
function (ifname)
{
var alias = 0;
ifaces[ifname].forEach
(
function (iface)
{
if ('IPv4' !== iface.family || iface.internal !== false)
{
// skip over internal (i.e. 127.0.0.1) and non-ipv4 addresses
return;
}
if (alias >= 1)
{
// this single interface has multiple ipv4 addresses
console.log('INFO: Server interface ' +alias +' - ' + ifname + ':' + alias, iface.address);
}
else
{
server_ip = iface.address;
// this interface has only one ipv4 address
console.log('INFO: Server interface - ' +ifname, iface.address);
}
++alias;
}
);
}
);
//-----------------------------------------------------------------------------------
// HTTP SERVER
//-----------------------------------------------------------------------------------
// Fetch and serves local files to client
//Create http server and listen to the given port
http.listen
(
server_port,
function( )
{
console.log('INFO: ' +server_ip +' listening to html requests on port ' +server_port);
//Pre-load http, css and js files into memory to improve http request latency
file_index = load_file( file_index_name );
file_css = load_file( file_css_name );
file_jsplayer = load_file( file_jsplayer_name );
}
);
//-----------------------------------------------------------------------------------
// HTTP REQUESTS HANDLER
//-----------------------------------------------------------------------------------
// Answer to client http requests. Serve http, css and js files
function http_handler(req, res)
{
//If client asks for root
if (req.url == '/')
{
//Request main page
res.writeHead( 200, {"Content-Type": detect_content(file_index_name),"Content-Length":file_index.length} );
res.write(file_index);
res.end();
console.log("INFO: Serving file: " +req.url);
}
//If client asks for css file
else if (req.url == ("/" +file_css_name))
{
//Serve the css file
res.writeHead( 200, {"Content-Type": detect_content(file_css_name),"Content-Length" :file_css.length} );
res.write(file_css);
res.end();
console.log("INFO: Serving file: " +req.url);
}
//If client asks for the javascript player file
else if (req.url == ("/" +file_jsplayer_name))
{
//Serve the javascript player file
res.writeHead( 200, {"Content-Type": detect_content(file_jsplayer_name),"Content-Length" :file_jsplayer.length} );
res.write(file_jsplayer);
res.end();
console.log("INFO: Serving file: " +req.url);
}
//Listening to the port the stream from ffmpeg will flow into
else if (req.url == ("/" +stream_name))
{
res.connection.setTimeout(0);
console.log( "Stream Connected: " +req.socket.remoteAddress + ":" +req.socket.remotePort );
req.on
(
"data",
function(data)
{
streaming_websocket.broadcast(data);
/*
if (req.socket.recording)
{
req.socket.recording.write(data);
}
*/
//console.log("broadcast: ", data.length);
}
);
req.on
(
"end",
function()
{
console.log("local stream has ended");
if (req.socket.recording)
{
req.socket.recording.close();
}
}
);
}
//If client asks for an unhandled path
else
{
res.end();
console.log("ERR: Invalid file request" +req.url);
}
}
//-----------------------------------------------------------------------------------
// WEBSOCKET SERVER: CONTROL/FEEDBACK REQUESTS
//-----------------------------------------------------------------------------------
// Handle websocket connection to the client
io.on
(
"connection",
function (socket)
{
console.log("connecting...");
socket.emit("welcome", { payload: "Server says hello" });
//Periodically send the current server time to the client in string form
var time_interval = setInterval
(
function()
{
socket.emit("server_time", { server_time: get_server_time() });
},
//Send every 333ms
333
);
//Stop the periodic update when this client disconnects, otherwise the timer leaks
socket.on("disconnect", function() { clearInterval(time_interval); });
socket.on
(
"myclick",
function (data)
{
timestamp_ms = get_timestamp_ms();
socket.emit("profile_ping", { timestamp: timestamp_ms });
console.log("button event: " +" client says: " +data.payload);
}
);
//"ArrowLeft"
socket.on
(
"keyboard",
function (data)
{
timestamp_ms = get_timestamp_ms();
socket.emit("profile_ping", { timestamp: timestamp_ms });
console.log("keyboard event: " +" client says: " +data.payload);
}
);
//Profile packets from the client are answers that allow the server to compute the round-trip time
socket.on
(
"profile_pong",
function (data)
{
timestamp_ms_pong = get_timestamp_ms();
timestamp_ms_ping = data.timestamp;
console.log("Pong received. Round trip time[ms]: " +(timestamp_ms_pong -timestamp_ms_ping));
}
);
}
);
//-----------------------------------------------------------------------------------
// WEBSOCKET SERVER: STREAMING VIDEO
//-----------------------------------------------------------------------------------
// Websocket Server
var streaming_websocket = new websocket.Server({port: websocket_stream_port, perMessageDeflate: false});
streaming_websocket.connectionCount = 0;
streaming_websocket.on
(
"connection",
function(socket, upgradeReq)
{
streaming_websocket.connectionCount++;
console.log
(
'New websocket Connection: ',
(upgradeReq || socket.upgradeReq).socket.remoteAddress,
(upgradeReq || socket.upgradeReq).headers['user-agent'],
'('+streaming_websocket.connectionCount+" total)"
);
socket.on
(
'close',
function(code, message)
{
streaming_websocket.connectionCount--;
console.log('Disconnected websocket ('+streaming_websocket.connectionCount+' total)');
}
);
}
);
streaming_websocket.broadcast = function(data)
{
streaming_websocket.clients.forEach
(
function each(client)
{
if (client.readyState === websocket.OPEN)
{
client.send(data);
}
}
);
};
//-----------------------------------------------------------------------------------
// FUNCTIONS
//-----------------------------------------------------------------------------------
//-----------------------------------------------------------------------------------
// SERVER DATE&TIME
//-----------------------------------------------------------------------------------
// Get server time in string form
function get_server_time()
{
my_date = new Date();
return my_date.toUTCString();
}
//-----------------------------------------------------------------------------------
// TIMESTAMP
//-----------------------------------------------------------------------------------
// Profile performance in ms
// Milliseconds since the Unix epoch; only differences are used, so wrap-around is not an issue
function get_timestamp_ms()
{
var my_date = new Date();
return my_date.getTime();
}
//-----------------------------------------------------------------------------------
// FILE LOADER
//-----------------------------------------------------------------------------------
// Load files into memory for improved latency
function load_file( file_name )
{
var file_tmp;
var file_path = __dirname +"/" +file_name;
//HTML index file
try
{
file_tmp = fs.readFileSync( file_path );
}
catch (err)
{
console.log("ERR: " +err.code +" failed to load: " +file_path);
throw err;
}
console.log("INFO: " +file_path +" has been loaded into memory");
return file_tmp;
}
//-----------------------------------------------------------------------------------
// CONTENT TYPE DETECTOR
//-----------------------------------------------------------------------------------
// Return the right content type to give correct information to the client browser
function detect_content( file_name )
{
if (file_name.includes(".html"))
{
return "text/html";
}
else if (file_name.includes(".css"))
{
return "text/css";
}
else if (file_name.includes(".js"))
{
return "application/javascript";
}
else
{
throw "invalid extension";
}
}
Client side HTML
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8"/>
<title>Maze Runner</title>
<link rel="stylesheet" href="style.css">
<script type="text/javascript" src="/socket.io/socket.io.js"></script>
<script type="text/javascript">
var host_ip = document.location.hostname;
console.log("connecting to host: ", host_ip);
//Reference to the html text box control (looked up lazily in fill_label, since this script runs before the body is parsed)
var textbox_input1;
//Connect to the server via websocket
var mysocket = io("http://" +host_ip +":8080");
//Long lived frame object
var last_frame;
//-----------------------------------------
// CONNECTION ACKNOWLEDGE
//-----------------------------------------
// Link is initiated by the client
// Server sends a welcome message when the link is established
// Server could send an auth token to keep track of individual clients and login data
mysocket.on
(
"welcome",
(message) =>
{
console.log("Server websocket connession acknoweldged... " +message.payload);
}
)
//-----------------------------------------
// SERVER->CLIENT CONTROLS
//-----------------------------------------
// Server can send an async message to dynamically update the page without reloading
// This is an example message with the server local date and time in string form
mysocket.on
(
"server_time",
(message) =>
{
fill_label( message.server_time );
console.log("Server sent his local time... " +message.server_time);
}
)
function fill_label( payload )
{
//Look up the text box on first use: the element doesn't exist yet when this script is parsed
if (!textbox_input1) { textbox_input1 = window.document.getElementById("my_text_box"); }
textbox_input1.value=payload;
}
//-----------------------------------------
// CLIENT->SERVER CONTROLS
//-----------------------------------------
// Controls inside the webpage can emit async events to the server
// In this example I have a push button and I catch keyboard strokes
//Handler for a pushbutton
function socket_button_handler()
{
mysocket.emit("myclick", { payload: "button was clicked" });
console.log("Button was clicked...");
}
//Listen for keystrokes
window.document.addEventListener
(
"keypress",
function onEvent(event)
{
//Inform the server that a key has been pressed
mysocket.emit("keyboard", { payload: event.key });
console.log("Key press...");
}
);
//-----------------------------------------
// PING-PONG
//-----------------------------------------
// Server sends ping messages with a timestamp
// Client answers with pongs to allow server to profile latency of the channel
//Profile messages mean the server wants to compute the round-trip time
mysocket.on
(
"profile_ping",
(message) =>
{
//Answer back with the received timestamp so that the server can compute the round-trip time
mysocket.emit("profile_pong", { timestamp: message.timestamp });
console.log( "server wants a pong. server absolute timestamp[ms]: " +message.timestamp );
}
);
</script>
</head>
<body>
<h1>HTML+CSS server + low latency websocket server</h1>
<!-- button control with socket emitter as handler -->
<p> This button will emit a websocket event. The server will be informed in real time of the event. </p>
<button id="my_button" type="button" onclick="socket_button_handler()">Websocket Button!</button>
<!-- input text control -->
<p> This input can be filled through websockets directly by the server in real time </p>
<input id="my_text_box" type="text" value="" size="40">
<!-- canvas object, it's painted by the javascript video decoder -->
<p> This canvas is painted by the javascript player and shows the live stream.</p>
<canvas id="video-canvas" width=640 height=480></canvas>
<!-- Javascript video decoder, take in a data stream from a websocket and paint on a canvas -->
<script type="text/javascript" src="jsmpeg.min.js"></script>
<script type="text/javascript">
var mycanvas = document.getElementById("video-canvas");
var url = "ws://" + host_ip +":8082/";
var player = new JSMpeg.Player(url, {canvas: mycanvas});
</script>
</body>
</html>
Javascript Player
You can get the javascript player I used from here:
https://github.com/phoboslab/jsmpeg/blob/master/jsmpeg.min.js

I'd like suggestions for Node.js packages or other solutions that provide a UDP H.264 video stream that can be decoded by an HTML5 video tag, with a target latency of 50ms.
That's almost certainly not possible in that configuration.
If you drop the video tag requirement, and use just straight WebRTC in the browser, you may be able to get down to about 150ms.
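For illustration, the browser side of such a WebRTC setup could look roughly like this (a minimal sketch: it reuses the existing socket.io connection for signalling with hypothetical "webrtc_offer"/"webrtc_answer"/"webrtc_ice" event names, and assumes something on the Pi, for example a GStreamer webrtcbin pipeline or a native WebRTC application, answers the offer; none of that server side is shown):
// Receive-only peer connection; on a LAN no STUN/TURN servers are needed
var pc = new RTCPeerConnection({ iceServers: [] });
pc.ontrack = function (event) {
    // Assumes a <video id="remote-video" autoplay playsinline muted> element in the page
    document.getElementById("remote-video").srcObject = event.streams[0];
};
pc.onicecandidate = function (event) {
    if (event.candidate) { mysocket.emit("webrtc_ice", event.candidate); }
};
// We only want to receive the camera stream from the robot
pc.addTransceiver("video", { direction: "recvonly" });
pc.createOffer()
    .then(function (offer) { return pc.setLocalDescription(offer); })
    .then(function () { mysocket.emit("webrtc_offer", pc.localDescription); });
mysocket.on("webrtc_answer", function (answer) {
    pc.setRemoteDescription(new RTCSessionDescription(answer));
});
Because the browser decodes the incoming stream natively, the transcoding and JavaScript decoding stages identified above as the bottleneck disappear.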

Related

Web MIDI on Chrome works with a local server but not when served in the cloud

I built a website that uses the Chrome Web MIDI interface (based on navigator.requestMIDIAccess) that works fine on a local development server, but when pushed to a cloud server it fails, saying that navigator.requestMIDIAccess is not a function. Same code, same browser. I'll try to include the relevant code:
function initializeMidi() {
navigator.requestMIDIAccess()
.then(
(midi) => midiReady(midi),
(err) => console.log('Something went wrong', err));
}
window.onload = (event) => {
initializeMidi();
};
// this next function builds a list of radio buttons to select the MIDI device
function midiReady(midi) {
globalMidi = midi.outputs
parentElement = document.getElementById('midi-devices-div')
parentElement.innerHTML = ''
var lastMidiPortName = null
midi.outputs.forEach(function (port, key) {
addRadioButton(parentElement, port)
lastMidiPortName = port.name
})
var n = window.localStorage.getItem('selectedMidiPortName')
if (n)
{
var e = document.getElementById(n)
e.checked = true
}
}
The Web MIDI interface is only exposed to secure contexts; you must serve your document over https://.
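A quick way to confirm this before calling the API (a small sketch; window.isSecureContext is the standard flag for this, and http://localhost counts as a secure context, which is why the local development server works while a plain-http cloud deployment does not):
function initializeMidi() {
    if (!window.isSecureContext) {
        console.log('Not a secure context: Web MIDI is unavailable, serve the page over https');
        return;
    }
    if (!('requestMIDIAccess' in navigator)) {
        console.log('This browser does not support Web MIDI at all');
        return;
    }
    navigator.requestMIDIAccess()
        .then(
            (midi) => midiReady(midi),
            (err) => console.log('Something went wrong', err));
}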

GDrive API v3 files.get download progress?

How can I show progress of a download of a large file from GDrive using the gapi client-side v3 API?
I am using the v3 API, and I've tried to use a Range request in the header, which works, but the download is very slow (below). My ultimate goal is to play back 4K video. GDrive limits playback to 1920x1280. My plan was to download chunks to IndexedDB via the v3 API and play from the locally cached data. I have this working using the code below via Range requests, but it is unusably slow. A normal download of the full 438 MB test file directly (e.g. via the GDrive web page) takes about 30-35s on my connection, and, coincidentally, each 1 MB Range request takes almost exactly the same 30-35s. It feels like the GDrive back-end is reading and sending the full file for each subrange?
I've also tried using XHR and fetch to download the file, which fails. I've been using the webContent link (which typically ends in &export=download) but I cannot get access headers correct. I get either CORS or other odd permission issues. The webContent links work fine in <image> and <video> src tags. I expect this is due to special permission handling or some header information I'm missing that the browser handles specifically for these media tags. My solution must be able to read private (non-public, non-sharable) links, hence the use of the v3 API.
For video files that are smaller than the GDrive limit, I can set up a MediaRecorder and use a <video> element to get the data with progress. Unfortunately, the 1920x1080 limit kills this approach for larger files, where progress feedback is even more important.
This is the client-side gapi Range code, which works, but is unusably slow for large (400 MB - 2 GB) files:
const getRange = (start, end, size, fileId, onProgress) => (
new Promise((resolve, reject) => gapi.client.drive.files.get(
{ fileId, alt: 'media', Range: `bytes=${start}-${end}` },
// { responseType: 'stream' }, Perhaps this fails in the browser?
).then(res => {
if (onProgress) {
const cancel = onProgress({ loaded: end, size, fileId })
if (cancel) {
reject(new Error(`Progress canceled download at range ${start} to ${end} in ${fileId}`))
}
}
return resolve(res.body)
}, err => reject(err)))
)
export const downloadFileId = async (fileId, size, onProgress) => {
const batch = 1024 * 1024
try {
const chunks = []
for (let start = 0; start < size; start += batch) {
const end = Math.min(size, start + batch - 1)
const data = await getRange(start, end, size, fileId, onProgress)
if (!data) throw new Error(`Unable to get range ${start} to ${end} in ${fileId}`)
chunks.push(data)
}
return chunks.join('')
} catch (err) {
return console.error(`Error downloading file: ${err.message}`)
}
}
Authentication works fine for me, and I use other GDrive commands just fine. I'm currently using drives.photos.readonly scope, but I have the same issues even if I use a full write-permission scope.
Tangentially, I'm unable to get a stream when running client-side using gapi (works fine in node on the server-side). This is just weird. If I could get a stream, I think I could use that to get progress. Whenever I add the commented-out line for the responseType: 'stream', I get the following error: The server encountered a temporary error and could not complete your request. Please try again in 30 seconds. That’s all we know. Of course waiting does NOT help, and I can get a successful response if I do not request the stream.
I switched to using XMLHttpRequest directly, rather than the gapi wrapper. Google provides these instructions for using CORS that show how to convert any request from using gapi to an XHR. You can then attach to the onprogress event (and onload, onerror and others) to get progress.
Here's the drop-in replacement code for the downloadFileId method in the question, with a bunch of debugging scaffolding:
const xhrDownloadFileId = (fileId, onProgress) => new Promise((resolve, reject) => {
const user = gapi.auth2.getAuthInstance().currentUser.get()
const oauthToken = user.getAuthResponse().access_token
const xhr = new XMLHttpRequest()
xhr.open('GET', `https://www.googleapis.com/drive/v3/files/${fileId}?alt=media`)
xhr.setRequestHeader('Authorization', `Bearer ${oauthToken}`)
xhr.responseType = 'blob'
xhr.onloadstart = event => {
console.log(`xhr ${fileId}: on load start`)
const { loaded, total } = event
onProgress({ loaded, size: total })
}
xhr.onprogress = event => {
console.log(`xhr ${fileId}: loaded ${event.loaded} of ${event.total} ${event.lengthComputable ? '' : 'non-'}computable`)
const { loaded, total } = event
onProgress({ loaded, size: total })
}
xhr.onabort = event => {
console.warn(`xhr ${fileId}: download aborted at ${event.loaded} of ${event.total}`)
reject(new Error('Download aborted'))
}
xhr.onerror = event => {
console.error(`xhr ${fileId}: download error at ${event.loaded} of ${event.total}`)
reject(new Error('Error downloading file'))
}
xhr.onload = event => {
console.log(`xhr ${fileId}: download of ${event.total} succeeded`)
const { loaded, total } = event
onProgress({ loaded, size: total })
resolve(xhr.response)
}
xhr.onloadend = event => console.log(`xhr ${fileId}: download of ${event.total} completed`)
xhr.ontimeout = event => {
console.warn(`xhr ${fileId}: download timeout after ${event.loaded} of ${event.total}`)
reject(new Error('Timeout downloading file'))
}
xhr.send()
})
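Called, for example, like this (the file id, the progress callback, and the object-URL playback are illustrative):
// Hypothetical usage: log progress, then play the downloaded blob locally
xhrDownloadFileId('FILE_ID', ({ loaded, size }) => console.log(`loaded ${loaded} of ${size}`))
  .then(blob => {
    document.querySelector('video').src = URL.createObjectURL(blob)
  })
  .catch(err => console.error(err))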

Multiple native apps are run by connecting with native messaging when the extension is reloaded

In a Chrome extension, I use native messaging to call a local application. But I found an issue: every time I reload the extension, it seems like a new process is created for the application. According to the documentation, the app will be ended if the port is disconnected or the page is closed. Does that mean reloading the extension won't close the background page? How can I solve this problem? Also, I cannot find my local application process in the Chrome task manager.
// background.js
var port = null;
connectToNativeHost();
// Receive message from other js
chrome.runtime.onMessage.addListener(
function(request, sender, sendResponse) {
console.log("background recieved message from " + sender.url + JSON.stringify(request));
parseMessage(request);
}
);
//onNativeDisconnect
function onDisconnected()
{
console.log(chrome.runtime.lastError);
console.log('disconnected from native app.');
port = null;
}
// Receive message from native app
function onNativeMessage(message)
{
console.log('received message from native app: ' + JSON.stringify(message));
}
//connect to native host and get the communication port
function connectToNativeHost()
{
var nativeHostName = 'com.group_project.time_tracker';
port = chrome.runtime.connectNative(nativeHostName);
port.onMessage.addListener(onNativeMessage);
port.onDisconnect.addListener(onDisconnected);
console.log("connected");
}
// Send message to native app
function sendMessage(message)
{
port.postMessage(message);
console.log('send message to native app: ' + JSON.stringify(message));
}

PWA: Chrome warning "Service worker does not have the 'fetch' handler"

I'm currently unsuccessfully trying to make my PWA installable. I have registered a ServiceWorker and linked a manifest, and I am listening for the beforeinstallprompt event.
My ServiceWorker listens to every fetch event.
My problem is that the install banner created from that event is only shown on desktop Chrome, but on mobile I get a warning in the Chrome inspection tab "Application", in the "Manifest" section:
Installability
Service worker does not have the 'fetch' handler
You can check the message on https://dev.testapp.ga/
window.addEventListener('beforeinstallprompt', (e) => {
// Stash the event so it can be triggered later.
deferredPrompt = e;
mtShowInstallButton();
});
manifest.json
{"name":"TestApp","short_name":"TestApp","start_url":"https://testapp.ga/loginCheck","icons":[{"src":"https://testapp.ga/assets/icons/launcher-ldpi.png","sizes":"36x36","density":0.75},{"src":"https://testapp.ga/assets/icons/launcher-mdpi.png","sizes":"48x48","density":1},{"src":"https://testapp.ga/assets/icons/launcher-hdpi.png","sizes":"72x72","density":1.5},{"src":"https://testapp.ga/assets/icons/launcher-xhdpi.png","sizes":"96x96","density":2},{"src":"https://testapp.ga/assets/icons/launcher-xxhdpi.png","sizes":"144x144","density":3},{"src":"https://testapp.ga/assets/icons/launcher-xxxhdpi.png","sizes":"192x192","density":4},{"src":"https://testapp.ga/assets/icons/launcher-web.png","sizes":"512x512","density":10}],"display":"standalone","background_color":"#ffffff","theme_color":"#0288d1","orientation":"any"}
ServiceWorker:
//This array should NEVER contain any file which doesn't exist. Otherwise no single file can be cached.
var preCache=[
'/favicon.png',
'/favicon.ico',
'/assets/Bears/bear-standard.png',
'/assets/jsInclude/mathjax.js',
'/material.js',
'/main.js',
'functions.js',
'/material.css',
'/materialcolors.css',
'/user.css',
'/translations.json',
'/roboto.css',
'/sw.js',
'/'
];
//Please specify the version of your app. For every new version, all files are refreshed.
var appVersion="v0.2.1";
//Please specify all files which should never be cached
var noCache=[
'/api/'
];
//On installation of the app, all files from preCache are stored automatically.
self.addEventListener('install', function(event) {
event.waitUntil(
caches.open(appVersion+'-offline').then(function(cache) {
return cache.addAll(preCache).then(function(){
console.log('mtSW: Given files were successfully pre-cached')
});
})
);
});
function shouldCache(url) {
//Checking if url is market as noCache
var isNoCache=noCache.includes(url.substr(8).substr(url.substr(8).indexOf("/")))||noCache.includes((url.substr(8).substr(url.substr(8).indexOf("/"))).substr(0,(url.substr(8).substr(url.substr(8).indexOf("/"))).indexOf("?")));
//Checking of hostname of request != current hostname
var isOtherHost=url.substr(8).substr(0,url.substr(8).indexOf("/"))!=location.hostname&&url.substr(7).substr(0,url.substr(7).indexOf("/"))!=location.hostname;
return((url.substr(0,4)=="http"||url.substr(0,3)=="ftp") && isNoCache==false && isOtherHost==false);
}
//If any fetch fails, it will look for the request in the cache and serve it from there first
self.addEventListener('fetch', function(event) {
//Trying to answer with "online" version if fails, using cache.
event.respondWith(
fetch(event.request).then(function (response) {
if(shouldCache(response.url)) {
console.log('mtSW: Adding file to cache: '+response.url);
caches.open(appVersion+'-offline').then(function(cache) {
cache.add(new Request(response.url));
});
}
return(response);
}).catch(function(error) {
console.log( 'mtSW: Error fetching. Serving content from cache: ' + error );
//Check to see if you have it in the cache
//Return response
//If not in the cache, then return error page
return caches.open(appVersion+'-offline').then(function (cache) {
return cache.match(event.request).then(function (matching) {
var report = !matching || matching.status == 404?Promise.reject('no-match'): matching;
return report
});
});
})
);
})
I checked the mtShowInstallButton function. It's fully working on desktop.
What does this mean? On the Desktop, I never got this warning, just when using a handheld device/emulator.
The fetch handler is used to fetch the JSON manifest file. Try reading the Google docs again.
To install a PWA on mobile, the manifest file needs to be fetched, and it is fetched by the service worker through its fetch handler.
Here is the code:
fetch('examples/example.json')
.then(function(response) {
// Do stuff with the response
})
.catch(function(error) {
console.log('Looks like there was a problem: \n', error);
});
For more about fetch and the manifest, try this.

ProgressEvent.loaded is always the same as ProgressEvent.total, which causes the progress to be fake

I'm trying to implement progress bar on a website.
The Problem:
ProgressEvent.loaded is always the same as ProgressEvent.total, which prevents the progress bar from showing the real state of the upload. Within the first second after the XHR request is sent, it looks like the upload has finished, but the server is actually still receiving parts of the file.
JS:
My js code(the part of the progress) looks like that:
xhr.upload.onprogress = function (event) {
var progress = Math.round(event.lengthComputable ? event.loaded * 100 / event.total : 0);
that._onProgressItem(item, progress);
};
The property lengthComputable is true.
The event.loaded is 4354707, the same as event.total, which is 4354707.
C# Server Side:
public async Task<FileResultViewModel> Upload(string type)
{
string ServerUploadFoler = "...";
// Verify that this is an HTML Form file upload request
if (!Request.Content.IsMimeMultipartContent())
{
throw new HttpResponseException(Request.CreateResponse(HttpStatusCode.UnsupportedMediaType));
}
// Create a stream provider for setting up output streams
var streamProvider = new MultipartFormDataStreamProvider(ServerUploadFolder);
// Read the MIME multipart asynchronously content using the stream provider we just created.
await Request.Content.ReadAsMultipartAsync(streamProvider);
string guid = String.Empty;
if (serverUploadMoveFolder != ServerUploadFolder)
{
foreach (MultipartFileData fileData in streamProvider.FileData)
{
guid = Guid.NewGuid().ToString();
string newFileName = serverUploadMoveFolder + guid + GetExtension(uploadType);
FileInfo fi = new FileInfo(fileData.LocalFileName);
fi.MoveTo(newFileName);
}
}
// Create response
return new FileResultViewModel
{
FileName = guid
};
}
Chrome debug after 1 second of upload with a file of 4.2MB:
In fiddler after the request has completed:
My questions are:
How does the browser know the loaded size? How does it split the file into parts, and based on what parameters?
How does the xhr.upload.onprogress event get updated with the progress? Is it the server that reports its progress, and if so, where is that in the code, because I didn't handle it?
Why doesn't the loaded property show the real size of the part already uploaded?