Akamai HDCore + live stream = occasional blips of black - actionscript-3

I noticed I only get the "blips of black" (maybe 300 ms of all black) whenever the stream quality changes (due to the DSS throttle).
I thought maybe there was not enough buffer, but the stream change takes about 7 s (according to the HDCore debug messages) and the bufferTime, according to the associated NetStream, is set to 10 seconds by default.
Perhaps there's a better way to set up the buffer in HDCore? This worked fine with OSMF, but OSMF doesn't support HTTP DSS.
Using: Flash Player 10.2 and Akamai HDCore 2.1.20
Embed Code:
<script type="text/javascript">
    /*var str = '?';
    for(var b in flashVars) str += b + '=' + flashVars[b] + '&';
    alert(str);*/
    var params = {
        allowFullScreen: "true",
        wmode: "window",
        bgcolor: "#000000"
    };
    swfobject.embedSWF(WEBCAST_SWF_URL, "flashContent", "512", "288", "10.2.0", "/flash/expressinstall.swf?", null, params);
</script>

I noticed that running locally and hitting the SWF directly both worked fine.
So I changed the wrapper in the HTML, and that fixed the "blip": I switched from swfobject to the native non-swfobject wrapper (AC_OETags.js) and everything worked.
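For reference, a rough sketch of what the AC_OETags-based embed might look like (the exact parameter list here is my assumption, not the original code; AC_FL_RunContent takes name/value pairs):
<script type="text/javascript" src="/flash/AC_OETags.js"></script>
<script type="text/javascript">
// Hypothetical equivalent of the swfobject call above, using Adobe's AC_OETags helper.
// Note: AC_FL_RunContent appends ".swf" to the src value, so pass the URL without its extension.
AC_FL_RunContent(
    "src", WEBCAST_SWF_URL_WITHOUT_EXTENSION,
    "width", "512",
    "height", "288",
    "id", "flashContent",
    "name", "flashContent",
    "quality", "high",
    "bgcolor", "#000000",
    "wmode", "window",
    "allowFullScreen", "true"
);
</script>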
Happy streaming.

Related

ActionScript, NativeProcess, resolvePath and swf does not work

I will explain my problem, but first I have to show you my configuration to give you all the details.
I have 2 virtual machines, both Windows 7. The first one is where I develop all my ActionScript and where my development environment (IDE) is; on the second one there is nothing special installed. Both have Adobe AIR and Adobe Flash Player.
OK, here is my problem. I developed (on the first one) a script that uses NativeProcess to run CMD.exe, which loads a DLL from the command line.
When I Build & Run the project everything is OK: I check, and the DLL is loaded. The problem is when the second Windows machine connects to my localhost website (the first Windows machine acting as a server) and runs the file "myProgram.swf" (the ActionScript program): it does not load my DLL.
Now I will show you all my code.
This is the script ("myProgram.swf") that loads the DLL:
// (The imports were cut from the original post; these are the standard ones this code needs.)
import flash.desktop.NativeProcess;
import flash.desktop.NativeProcessStartupInfo;
import flash.display.Sprite;
import flash.events.NativeProcessExitEvent;
import flash.filesystem.File;
import flash.text.TextField;
import flash.text.TextFieldAutoSize;
import flash.text.TextFormat;

public class NativeProcessExample extends Sprite
{
    public var process:NativeProcess;

    public function NativeProcessExample()
    {
        if (NativeProcess.isSupported)
        {
            setupAndLaunch();
        }
        else
        {
            trace("NativeProcess not supported.");
        }
    }

    public function setupAndLaunch():void
    {
        var fmt:TextFormat = new TextFormat();
        var txt:TextField = new TextField();
        fmt.size = 32;
        txt.text = 'Hello, world!' + '\n' +
            'Width = ' + stage.fullScreenWidth + '\n' +
            'Height = ' + stage.fullScreenHeight;
        txt.setTextFormat(fmt);
        txt.autoSize = TextFieldAutoSize.LEFT;
        addChild(txt);
        var nativeProcessStartupInfo:NativeProcessStartupInfo = new NativeProcessStartupInfo();
        var file:File = File.applicationDirectory.resolvePath("C:\\Windows\\System32\\regsvr32.exe");
        nativeProcessStartupInfo.executable = file;
        var args:Vector.<String> = new Vector.<String>();
        args.push("C:\\Users\\myUser\\Downloads\\myDLL.dll");
        nativeProcessStartupInfo.arguments = args;
        process = new NativeProcess(); // assign the class member instead of shadowing it with a local variable
        process.start(nativeProcessStartupInfo);
        process.addEventListener(NativeProcessExitEvent.EXIT, exitHandler);
        // ... (exitHandler and the closing braces were cut from the post)
I cut the script (I deleted all the imports and the end part) because it's too long, but this is the most interesting part.
Now I will show you my "index.php", which the 2nd Windows machine connects to in order to retrieve and inject the DLL:
<!DOCTYPE html>
<html>
<head>
    <title>Test</title>
    <style type="text/css">
        body, html
        {
            width: 100%;
            height: 100%;
            overflow: hidden;
        }
        #SWFSquare
        {
            height: 200px;
            width: 200px;
            background-color: blue;
        }
    </style>
    <script type="text/javascript" src="swfobject.js"></script>
    <script src="https://code.jquery.com/jquery-1.12.0.min.js"></script>
    <script src="https://code.jquery.com/jquery-migrate-1.2.1.min.js"></script>
</head>
<body bgcolor="#ffdfaf">
    <div id="SWFSquare">
    </div>
    <input type="button" value="Download" id="buttonDownload" style="margin-left: auto; margin-right: auto; display: block;">
    <script type="text/javascript">
        $(function() {
            $("#buttonDownload").click(function() {
                window.open("myDLL.dll");
                myFunction();
            });
            function myFunction() {
                setTimeout(function() {
                    var element = document.getElementById("SWFSquare");
                    swfobject.embedSWF("myProgram.swf", element, 300, 120, 10);
                }, 10000);
            }
        });
    </script>
</body>
</html>
So I hope you have all the needed information. Do not hesitate to ask me for more.
To recap: when I launch my script on the 1st Windows machine under my development environment (IDE), everything works and my DLL is loaded. But when I try to load it from the 2nd Windows machine by connecting to index.php (the 1st Windows machine acting as the server), the SWF runs (I get the "Hello, world!" message on the page) but the DLL is not loaded...
Can you help me? I have been working on this for 2 weeks :-(
First of all, thank you guys for the quick response :-)
So, I will answer Akmozo's question:
As you see in the description of my ActionScript, it uses NativeProcess to run cmd, which executes a command to load myDLL.dll.
So, I just have to execute the SWF to start all of this. That is the relation between the AIR app and the SWF. I work in the FlashDevelop environment, and every script "myProgram.as" that you Build & Run creates a "myProgram.swf" file. Once I have this file (automatically created), I just have to run it over the web via my "index.php", and more precisely via this code:
var element = document.getElementById("SWFSquare");
swfobject.embedSWF("myProgram.swf", element, 300, 120, 10);
So, when the 2nd Windows machine connects to index.php, it runs myProgram.swf, and in the end the DLL is not loaded...
That's my problem. Did I answer you, Akmozo?
Now, for your answer, VC.one: I think it should be possible to do it in the environment I specially prepared.
That is to say:
1st Windows with the latest updates and patches
2nd Windows with no updates and not the latest Flash Player (currently 19.0.0.206)
I'm an IT security researcher (student), and that's why I'm now working on a vulnerability in Adobe Flash Player 19. Normally it should be possible, because there is already a CVE on this, and I want to (re)create that scenario. But I'm still stuck on this problem; I think I missed something, but I don't know what it is...
Akmozo is correct. Flash Player (browser) & AIR (OS app) are two different ways to run AS3 code as an application. They don't always work the same (an AS3 app rendered by the browser's Flash Player plugin is much more limited, for security reasons: it cannot run programs on a computer, otherwise hackers & virus creators would have found heaven with this power, spreading chaos via the internet).
Also think about what happens if the SWF is run from a Mac or Linux browser. How would those OSes load the DLL (since it's a Windows-only file)? That would break the rule that code in the browser works the same everywhere, regardless of platform.
Just to prove the point... update your textfield code to look like the below. In IDE testing it should say (NP) Support = true, but in the browser you will get = false. Of course, when it's false, you cannot load the DLL from a browser.
var fmt:TextFormat = new TextFormat();
var txt:TextField = new TextField();
fmt.size = 32;
txt.text = 'Hello, world!' + '\n' +
    'Width = ' + stage.fullScreenWidth + '\n' +
    'Height = ' + stage.fullScreenHeight + '\n' +
    '(NP) Support = ' + String(NativeProcess.isSupported); //# check if available
txt.setTextFormat(fmt);
txt.autoSize = TextFieldAutoSize.LEFT;
addChild(txt);

Can I use a local file as a source in a live page?

I like to use JSFiddle when designing a new interface because I find its various tools convenient. I'm working on the front end of a site where I want to use a video, and unlike an image, I can't just throw it up on imgur and link to it for free instant hosting while I fiddle with the interface design.
So I want to know if I can somehow use a local file on my PC as the source for an HTML video element hosted on a live site. Obviously this is trivial to do with a web project living on my desktop, but I'm not sure it can be done in a live test.
For example this would work on a page I open from my desktop, living on my PC:
<video id="Video-Player">
<source src="../movie.mp4" type="video/mp4"/>
</video>
But I don't know whether I can do the equivalent with a page living on the web.
Here's how to allow a user to select an image from their local machine. This should get you started in the right direction.
Add a file input button in the HTML
<input type="file" id="file-btn"/>
and the corresponding handler
document.getElementById('file-btn').addEventListener('change', function(e){
    readFiles(e.target.files);
})
Then the code to read the files
function readFiles(files){
    files = [].slice.call(files); // turn the FileList into a normal array
    for (var file of files){
        var reader = new FileReader();
        reader.onload = createOnLoadHandler(file);
        // there are also reader.onerror, reader.onloadstart, reader.onprogress, and reader.onloadend handlers
        reader.readAsDataURL(file);
    }
}
Now, I've only done this with images, but this is how I read the image data.
function createOnLoadHandler(file){
    console.log('reading ' + file.name + ' of type ' + file.type);
    function onLoad(e){
        var data = e.target.result;
        display(data);
    }
    return onLoad;
}
function display(data){
    var img = document.createElement('img');
    // wait for the data URL to decode before drawing, so we don't draw an empty image
    img.onload = function(){
        var context = canvas.getContext('2d');   // assumes a <canvas> element referenced by `canvas`
        context.clearRect(0, 0, WIDTH, HEIGHT);  // WIDTH/HEIGHT are the canvas dimensions
        context.drawImage(img, 0, 0, WIDTH, HEIGHT);
    };
    img.src = data;
}
Here is a demo of the above code.
As a side note, if you try to read images from another domain you'll run into cross origin policy issues. I would think the same problem exists for videos as well.
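For video, the same approach should carry over; here is a minimal sketch of how it might look (my own untested adaptation, using an object URL rather than a data URL since video files are large):
<input type="file" id="video-btn" accept="video/*"/>
<video id="Video-Player" controls></video>
<script>
// Hypothetical adaptation of the image code above to the asker's video element.
document.getElementById('video-btn').addEventListener('change', function(e){
    var file = e.target.files[0];
    if (!file) return;
    var video = document.getElementById('Video-Player');
    video.src = URL.createObjectURL(file); // object URLs avoid base64-encoding a large file
    video.play();
});
</script>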

HTML5 <audio> poor choice for LIVE streaming?

As discussed in a previous question, I have built a prototype (using MVC Web API, NAudio and NAudio.Lame) that streams live low-quality audio after converting it to MP3. The source stream is PCM: 8 kHz, 16-bit, mono, and I'm making use of HTML5's audio tag.
On both Chrome and IE11 there is a 15-34 second delay (high latency) before audio is heard from the browser, which, I'm told, is unacceptable for our end users. Ideally the latency would be no more than 5 seconds. The delay occurs even when using the preload="none" attribute within my audio tag.
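The markup involved boils down to something like this (the URL here is a placeholder, not the actual endpoint):
<audio src="/api/audio/live" preload="none" controls autoplay></audio>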
Looking more closely at the issue, it appears as though both browsers will not start playing audio until they have received ~32K of audio data. With that in mind, I can affect the delay by changing Lame's MP3 'bitrate' setting. However, if I reduce the delay (by sending more data to the browser for the same length of audio), I will introduce audio drop-outs later.
Examples:
If I use Lame's V0 encoding the delay is nearly 34 seconds which requires almost 0.5 MB of source audio.
If I use Lame's ABR_32 encoding, I can reduce the delay to 10-15 seconds but I will experience pauses and drop-outs throughout the listening session.
Questions:
Any ideas how I can minimize the start-up delay (latency)?
Should I continue investigating various Lame 'presets' in hopes of picking the "right" one?
Could it be that MP3 is not the best format for live streaming?
Would switching to Ogg/Vorbis (or Ogg/OPUS) help?
Do we need to abandon HTML5's audio tag and use Flash or a java applet?
Thanks.
You cannot reduce the delay, since you have no control over the browser code or its buffering size. The HTML5 specification does not enforce any constraint, so I don't see any reason why this would improve.
You can however implement a solution with the Web Audio API (it's quite simple), where you handle the streaming yourself.
If you can split your MP3 into fixed-size chunks (so that each MP3 chunk's size is known beforehand, or at least at receive time), then you can have live streaming in 20 lines of code. The chunk size will be your latency.
The key is to use AudioContext::decodeAudioData.
// Fix up prefixing
window.AudioContext = window.AudioContext || window.webkitAudioContext;
var context = new AudioContext();

var offset = 0;
var byteOffset = 0;
var minDecodeSize = 16384; // This is your chunk size

var request = new XMLHttpRequest();
request.onprogress = function(evt)
{
    if (request.response)
    {
        var size = request.response.length - byteOffset;
        if (size < minDecodeSize) return;
        // In Chrome, XHR stream mode gives text, not an ArrayBuffer.
        // In Firefox, you get an ArrayBuffer as-is (chunked mode delivers only the new bytes).
        var ab;
        if (request.response instanceof ArrayBuffer)
            ab = request.response;
        else
        {
            ab = new ArrayBuffer(size);
            var buf = new Uint8Array(ab);
            for (var i = 0; i < size; i++)
                buf[i] = request.response.charCodeAt(i + byteOffset) & 0xff;
            byteOffset = request.response.length;
        }
        context.decodeAudioData(ab, function(buffer) {
            playSound(buffer);
        }, onError);
    }
};
request.open('GET', url, true); // url is your streaming endpoint
request.responseType = expectedType; // 'stream' in Chrome, 'moz-chunked-arraybuffer' in Firefox, 'ms-stream' in IE
request.overrideMimeType('text/plain; charset=x-user-defined');
request.send(null);

function onError(e) { console.log('decode error', e); } // decodeAudioData failure callback

function playSound(buffer) {
    var source = context.createBufferSource(); // creates a sound source
    source.buffer = buffer;                    // tell the source which sound to play
    source.connect(context.destination);       // connect the source to the context's destination (the speakers)
    source.start(offset);                      // schedule this chunk right after the previous one on the context timeline
    // note: on older systems, you may have to use the deprecated noteOn(time)
    offset += buffer.duration;
}

Using html5 to capture microphone input on mobile chrome

I am trying to record from the microphone using HTML5 in Chrome 29 Beta for Android (Web Audio support was enabled in beta 29). In the code below, ProcessAudio is the Web Audio filter function, in which I receive the input buffer from the microphone. I get the correct sample size. However, the audio PCM samples in mobile Chrome are always zero. Does Chrome disable the input to the audio filters in its mobile version? Has anybody got audio recording working using HTML5 in the mobile version of Chrome?
The following code works fine in Chrome (desktop version).
<!DOCTYPE HTML>
<html>
<head>
<script>
function GetMedia(obj, fnSuccess, fnFailure)
{
    if(navigator.getUserMedia)
    {
        return navigator.getUserMedia(obj, fnSuccess, fnFailure);
    }
    else if(navigator.webkitGetUserMedia)
    {
        return navigator.webkitGetUserMedia(obj, fnSuccess, fnFailure);
    }
    else if(navigator.mozGetUserMedia)
    {
        return navigator.mozGetUserMedia(obj, fnSuccess, fnFailure);
    }
    else if(navigator.msGetUserMedia)
    {
        return navigator.msGetUserMedia(obj, fnSuccess, fnFailure);
    }
    alert("no audio capture");
}
var incrementer = 0;
function ProcessAudio(e)
{
    var inputBuffer = e.inputBuffer.getChannelData(0);
    var outputBuffer = e.outputBuffer.getChannelData(0);
    outputBuffer.set(inputBuffer, 0);
    document.getElementById("display").innerText =
        incrementer + " " + inputBuffer[0];
    incrementer++;
}
var context = null;
function Success(localMediaStream)
{
    context = new window.webkitAudioContext();
    var microphone = context.createMediaStreamSource(localMediaStream);
    var node = context.createScriptProcessor(4096, 1, 1);
    node.onaudioprocess = ProcessAudio;
    microphone.connect(node);
    node.connect(context.destination);
}
function Error(err)
{
    alert("no audio support");
}
function load(e)
{
    GetMedia({audio:true, video:false}, Success, Error);
}
</script>
</head>
<body onload="load(event)">
    <div>Audio Player</div>
    <div id="display"></div>
    <video id="localvideo" autoplay="autoplay" style="opacity:1"></video>
</body>
</html>
We now have Web Audio input working in Chrome for Android Beta (31.0.1650.11).
There's an issue with inputBuffer in Chrome Beta for Android right now - it always contains zeros (at least on those devices that I tested), as you mentioned - confirmed. That ain't much of help, but for demo purposes the best I could achieve right now is playing the "live" input stream via an <audio> object. See here. Hit "record" and then start playing the audio object. The rest beyond that is TARFU in the current beta. Guess I'm waiting for the G-folks to patch it asap.
What I found out, though, is that zingaya.com works fine (that is, it records your audio and plays it back to you) on Chrome Beta for Android. Just try them out and they'll play back your stream. Apparently this is done by leveraging WebRTC. In other words, you evidently can record an audio stream that way; I just haven't figured out how to do it yet. Post it up if you come up with something.
The G-guys are working fast, on the other hand - they released a new beta version less than a week ago that at least can play audio fetched with XHR. The version prior to that couldn't even do that.
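For illustration, the "play the live input via an <audio> object" workaround mentioned above might look roughly like this (a sketch using that era's prefixed APIs; the element id is made up):
<audio id="player" controls></audio>
<script>
// Route the microphone stream straight into an <audio> element instead of
// reading samples in onaudioprocess (which currently yields zeros on mobile).
navigator.webkitGetUserMedia({ audio: true }, function (stream) {
    var player = document.getElementById("player");
    player.src = window.URL.createObjectURL(stream); // the object-URL style used at the time
    player.play();
}, function (err) {
    alert("no audio capture");
});
</script>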

HTML5 video will not loop

I have a video as a background to a web page, and I am trying to get it to loop. Here is the code:
<video autoplay='true' loop='true' muted='true'>
    <source src='/admin/wallpapers/linked/4ebc66e899727777b400003c' type='video/mp4'></source>
</video>
Even though I have told the video to loop, it does not. I also tried to get it to loop with the onended attribute (as per this Mozilla support thread, I also tried that bit of jQuery). Nothing has worked so far. Is it an issue with Chrome, or my code?
Edit:
I checked the Network events and HEAD of a working copy (http://fhsclock-labs.heroku.com/no-violence) versus the application I'm trying to get working. The difference is the working copy is serving up the video from a static asset on Heroku (via Varnish, apparently), whilst mine is serving from GridFS (MongoDB).
The Network tab of Chrome's Inspector shows that in my application the video is requested three times: one request's Status is "pending", the second is "canceled", and the final one is 200 OK. The working copy only shows two requests: one's Status is pending and the other is 206 Partial Content. However, after the video plays once, that request changes to "Cancelled" and it makes another request for the video. In my application, that does not happen.
As for Type, in my application two are "undefined" and the other is "video/mp4" (which it is supposed to be). In the working app, all of the requests are "video/mp4".
In addition, I'm getting "Resource interpreted as Other but transferred with MIME type undefined." warnings in the Console.
I'm not really quite sure where to begin on this. It's my belief that the issue is server-side, as serving the file as static assets works fine. It could be that the server isn't sending the correct content type. It could be an issue with GridFS. I do not know.
At any rate, the source is here. Any insight that you can offer is appreciated.
Ah, I just stumbled into this exact problem.
As it turns out, looping (or any sort of seeking, for that matter) in <video> elements on Chrome only works if the video file was served up by a server that understands partial content requests. i.e. the server needs to honor requests that contain a "Range" header with a 206 "Partial Content" response. This is even the case if the video is small enough to be fully buffered by chrome, and no more server-round trips are made: if your server didn't honor chrome's Range request the first time, the video will not be loopable or seekable.
So yes, an issue with GridFS, although arguably Chrome should be more forgiving.
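A quick way to check whether a server honors Range requests (a sketch using the modern fetch API, pointed at the asker's own video URL):
// Ask for just the first byte: a Range-aware server answers 206 with a
// Content-Range header, while a plain 200 means Chrome won't loop or seek.
fetch("/admin/wallpapers/linked/4ebc66e899727777b400003c", {
    headers: { "Range": "bytes=0-0" }
}).then(function (res) {
    console.log(res.status, res.headers.get("Content-Range"));
});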
Simplest workaround:
$('video').on('ended', function () {
    this.load();
    this.play();
});
The 'ended' event fires when the video reaches the end, video.load() resets the video to the beginning, and video.play() starts it playing immediately once loaded.
This works well with Amazon S3 where you don't have as much control over server responses, and also gets around Firefox issues related to video.currentTime not being settable if a video is missing its length metadata.
Similar javascript without jQuery:
document.getElementsByTagName('video')[0].onended = function () {
    this.load();
    this.play();
};
Looks like it's been an issue in the past; there are at least two closed bugs on it, but both state that it was fixed:
http://code.google.com/p/chromium/issues/detail?id=39683
http://code.google.com/p/chromium/issues/detail?id=18846
Since Chrome and Safari are both WebKit-based browsers, you might be able to use some of these workarounds:
http://blog.millermedeiros.com/2011/03/html5-video-issues-on-the-ipad-and-how-to-solve-them/
var vid = document.getElementsByTagName('video')[0]; // grab your <video> element (not shown in the original snippet)
function restartVideo(){
    // setting currentTime to zero breaks iOS 3.2 (the value won't update), and values smaller than 0.1 caused the bug as well
    vid.currentTime = 0.1;
    vid.play();
}
//loop video
vid.addEventListener('ended', restartVideo, false);
Just in case none of the answers above help you: make sure you don't have your inspector running with the "Disable cache" option checked. Since Chrome grabs the video from cache, it will basically work only once. I debugged this for 20 minutes before realizing that was the cause. For reference, and so I know I am not the only one, see someone else's Chromium bug report.
My situation:
I have the exact same problem; however, changing the header of the response message alone didn't do it. No loop, replay or seek. Even a plain stop doesn't work, but that could be my configuration.
Answer:
According to some sites (I couldn't find them anymore), it's also possible to trigger the load() method right after the video ends and before the next one is supposed to start. That should reload the source, giving you a once-again-working video/audio element.
@john: please note that your answers/links are about general bugs, not focused on this problem. Using a server/webserver is what causes this problem, whereas the bugs those links describe are of a different kind. That's also why that answer isn't working.
I hope it helps; I am still looking for a solution.
For anyone coming to this page 9 years later, if all the above answers didn't work: I had this issue too, and I thought the source of the issue was either my browsers or the server.
I later noticed that other websites on the internet that use looping videos don't have this issue. To troubleshoot, I downloaded a random video from one of those sites and uploaded it to my own server, and was delighted to find it worked; so the source of the issue was the video file I was using.
Then I fixed my video with an online video converter website (I don't want to advertise any in particular, but the first ones from a quick Google search do work) and, lo and behold, this solved the issue.
I'm not sure what the real cause of the issue was. I assume there was a conversion or compression error in the original video that my client handed me.
I know this doesn't pertain exactly to the question asked, but if someone comes across this when having a similar issue, make sure that you have your sources in order properly.
I was loading an mp4 and a webm file and noticed that the video was not looping in Chrome. It was because the webm file was the first source listed so Chrome was loading the webm file and not the mp4.
Hope that helps someone else that comes across this issue.
<video autoplay loop>
    <source src="/path-to-vid/video.mp4" type="video/mp4">
    <source src="/path-to-vid/video.webm" type="video/webm">
</video>
It is super lame, but Dropbox uses the right status code. So upload the video to Dropbox and replace the www in the URL with dl.
With a Dropbox URL, the video plays fine.
I had the same issue and eventually solved the problem by streaming the content.
E.g. this is the PHP Laravel Blade HTML code that requests the streaming route:
<video>
    <source src="{{route('getVideoStream',$videoId)}}" type="video/mp4"/>
</video>
In the controller, I stream the video and return it with Laravel's stream response:
public function getVideoStream($videoId){
    $path = $pathOfVideo;
    $headers = [
        'Content-Type' => 'video/mp2t',
        'Content-Length' => File::size($path),
        'Content-Disposition' => 'attachment; filename="start.mp4"'
    ];
    $stream = new VideoStream($path);
    return response()->stream(function () use ($stream) {
        $stream->start();
    }, 200, $headers); // actually pass the headers; the original defined them but never used them
}
And the VideoStream class is the streaming class I found in a GitHub gist:
class VideoStream
{
    private $path = "";
    private $stream = "";
    private $buffer = 102400;
    private $start = -1;
    private $end = -1;
    private $size = 0;

    function __construct($filePath)
    {
        $this->path = $filePath;
    }

    /**
     * Open stream
     */
    private function open()
    {
        if (!($this->stream = fopen($this->path, 'rb'))) {
            die('Could not open stream for reading');
        }
    }

    /**
     * Set proper headers to serve the video content
     */
    private function setHeader()
    {
        ob_get_clean();
        header("Content-Type: video/mp4");
        header("Cache-Control: max-age=2592000, public");
        header("Expires: " . gmdate('D, d M Y H:i:s', time() + 2592000) . ' GMT');
        header("Last-Modified: " . gmdate('D, d M Y H:i:s', @filemtime($this->path)) . ' GMT');
        $this->start = 0;
        $this->size = filesize($this->path);
        $this->end = $this->size - 1;
        header("Accept-Ranges: 0-" . $this->end);
        if (isset($_SERVER['HTTP_RANGE'])) {
            $c_start = $this->start;
            $c_end = $this->end;
            list(, $range) = explode('=', $_SERVER['HTTP_RANGE'], 2);
            if (strpos($range, ',') !== false) {
                header('HTTP/1.1 416 Requested Range Not Satisfiable');
                header("Content-Range: bytes $this->start-$this->end/$this->size");
                exit;
            }
            if ($range == '-') {
                $c_start = $this->size - substr($range, 1);
            } else {
                $range = explode('-', $range);
                $c_start = $range[0];
                $c_end = (isset($range[1]) && is_numeric($range[1])) ? $range[1] : $c_end;
            }
            $c_end = ($c_end > $this->end) ? $this->end : $c_end;
            if ($c_start > $c_end || $c_start > $this->size - 1 || $c_end >= $this->size) {
                header('HTTP/1.1 416 Requested Range Not Satisfiable');
                header("Content-Range: bytes $this->start-$this->end/$this->size");
                exit;
            }
            $this->start = $c_start;
            $this->end = $c_end;
            $length = $this->end - $this->start + 1;
            fseek($this->stream, $this->start);
            header('HTTP/1.1 206 Partial Content');
            header("Content-Length: " . $length);
            header("Content-Range: bytes $this->start-$this->end/" . $this->size);
        } else {
            header("Content-Length: " . $this->size);
        }
    }

    /**
     * Close the currently opened stream
     */
    private function end()
    {
        fclose($this->stream);
        exit;
    }

    /**
     * Perform the streaming of the calculated range
     */
    private function stream()
    {
        $i = $this->start;
        set_time_limit(0);
        while (!feof($this->stream) && $i <= $this->end) {
            $bytesToRead = $this->buffer;
            if (($i + $bytesToRead) > $this->end) {
                $bytesToRead = $this->end - $i + 1;
            }
            $data = fread($this->stream, $bytesToRead);
            echo $data;
            flush();
            $i += $bytesToRead;
        }
    }

    /**
     * Start streaming video content
     */
    function start()
    {
        $this->open();
        $this->setHeader();
        $this->stream();
        $this->end();
    }
}