QtWebEngine quicknanobrowser has no sound on embedded Linux

I have compiled QtWebEngine for my i.MX6 embedded device. When I try to play a YouTube video with quicknanobrowser, the video plays but there is no sound. In fact, there is no sound when I test-play the audio files at hpr.dogphilosophy.net/test, even though the site reports that the codecs are supported by the browser.
I have enabled pulseaudio, gstreamer, ffmpeg, opus, vpx, and libwebp, and yet there is still no sound.
However, playing a video with gst-launch does produce sound.
Is there something in quicknanobrowser that leaves sound disabled, or are there components I need to add to the embedded system?
Edit: ALSA, PulseAudio, and GStreamer are all working fine with sound.

You need to force QtWebEngine to use ALSA; on embedded systems it is disabled by default.
In qt5.7/qtwebengine/src/3rdparty/chromium/media/media.gyp there is a test that checks whether this is an embedded build:
# Enable ALSA and Pulse for runtime selection.
['(OS=="linux" or OS=="freebsd" or OS=="solaris") and ((embedded!=1 and chromecast==0) or is_cast_desktop_build==1)', {
  # ALSA is always needed for Web MIDI even if the cras is enabled.
  'use_alsa%': 1,
  'conditions': [
    ['use_cras==1', {
      'use_pulseaudio%': 0,
    }, {
      'use_pulseaudio%': 1,
    }],
  ],
}, {
  'use_alsa%': 0,
  'use_pulseaudio%': 0,
}],
I changed the last use_alsa% to 1, and in qt5.7/qtwebengine/src/core/config/embedded_linux.pri I added a new flag:
use_alsa=1
With these settings I have audio on my embedded ARM Linux device, and with the flag:
enable_webrtc=1
I am also able to start a WebRTC session with video and audio.

Related

Media Recorder Chrome; Opentok audio not captured

I'm looking for assistance from anyone who might have some insight into this. I'm having a problem with the MediaRecorder API built into Chrome, specifically on macOS, a problem that does not appear at all on Windows.
I need to capture the desktop screen as well as the desktop audio, and currently I get the stream using:
navigator.mediaDevices.getDisplayMedia({
  video: {
    cursor: 'always',
    width: 1280,
    height: 720,
    frameRate: {
      ideal: 12,
      max: 15
    },
  },
  audio: true
})
This of course prompts the user to select a screen/application/tab.
Following that, I load the stream into a new MediaRecorder:
new MediaRecorder(screenCaptureStream, {mimeType: "video/webm"})
On Windows this all works fine: as long as the "Share Audio" checkbox at the bottom left is checked, audio is present in the resulting file.
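A quick way to debug cases like this is to check programmatically whether the captured stream actually contains an audio track before starting the recorder. A minimal sketch (the helper name is my own, not part of any API):

```javascript
// Hypothetical helper: report whether a captured MediaStream actually
// carries audio and video tracks, so a silent capture is caught before
// recording instead of after.
function describeCapture(stream) {
  return {
    hasAudio: stream.getAudioTracks().length > 0,
    hasVideo: stream.getVideoTracks().length > 0,
  };
}

// In the browser (not runnable outside one):
//   const stream = await navigator.mediaDevices.getDisplayMedia({ video: true, audio: true });
//   if (!describeCapture(stream).hasAudio) console.warn('no audio track captured');
```

Note that an audio track can be present yet silent (as in the MediaInfo result below), so this check only rules out the case where no track was shared at all.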
On Mac, however, the only time a "Share Audio" option is offered is when a Chrome tab is selected. This is fine for something like a YouTube tab playing music; both video and audio are captured without a problem.
However, on a tab running an OpenTok video session, no audio coming from the session is captured. At first I thought ffmpeg was processing it incorrectly, but simply playing back the raw webm file gives no audio either. Checking with MediaInfo, the file does report an audio stream at "48.0 kHz, 32 bits, 1 channel, Opus".
Because this isn't a problem on Windows, I imagine the OpenTok session is somehow outputting audio in a way that bypasses the Chrome tab altogether, so that even if the user can hear the audio, the recorder doesn't capture it.
Does anyone know how to let Chrome record audio for the Application/Screen choices on macOS?
EDIT: Upon further experimentation, I've discovered that the issue lies in TokBox streams themselves.
Setup: using a second PC, I hooked my phone up to a line-in port to use as the "mic" for the TokBox stream and started the session again. On the same page I have an embedded YouTube iframe with a random video, and in Firefox, another video entirely.
After selecting the entire screen and making sure "Share Audio" is checked, I verify that I can hear the phone's audio on the other PC, and it's fine. The recording is running, and the checks for an audio track pass. I then switch to the next audio source by muting the 'phone' stream and start playing the YouTube embed. After a little while I stop it and play the Firefox video.
The result: the embed and Firefox audio were captured, but NONE of the TokBox audio was. (This was done on Windows, so now I know it's not a Mac issue.)
Where is it being output?

Howto: Save screencast to video file ChromeOS?

Two Chrome apps/extensions have caught my eye on the webstore:
Screencastify
Snagit
I am aware of chrome.desktopCapture and how I can use getUserMedia() to capture a live stream of a user's desktop.
Example:
navigator.webkitGetUserMedia({
  audio: false,
  video: {
    mandatory: {
      chromeMediaSource: 'desktop',
      chromeMediaSourceId: desktop_id,
      minWidth: 1280,
      maxWidth: 1280,
      minHeight: 720,
      maxHeight: 720
    }
  }
}, successCallback, errorCallback);
I'd love to create my own screencast app that allows audio recording as well as embedding webcam capture in a given corner of the video, like Screencastify does.
I understand capturing the desktop and the audio and video of the user, but how do you put it all together and make it into a video file?
I'm assuming there is a way to create a video file from a getUserMedia() stream on ChromeOS. Something that only ChromeOS has implemented?
How is it done? Thanks in advance for your answers.
The actual encoding and saving of the video file isn't something that's been implemented in Chrome as of yet. Mozilla has it in a nascent form at the moment. I'm unsure of its state in ChromeOS. I can give you a little information I've gleaned during development with the Chrome browser, however.
The two ways to encode, save, and distribute a media stream as a video are client-side and server-side.
Server-side:
Requires a media server of some kind. The best free/open-source solution I've found so far is Kurento. The media stream is uploaded (in chunks or whole) or streamed to the media server, where it is encoded and saved for later use. This also works with peer-to-peer connections by acting as a middleman, recording the data as it streams through.
Client-side:
This is all about browser-based encoding. There are currently two working options that I've tested successfully in Chrome.
Whammy.js:
This method uses a canvas hack: it saves arrays of webp images and then encodes them into a webm container. While slow, it works well for video. There is no audio support; I'm working on that at the moment.
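The canvas hack can be sketched as follows. The encoder object stands in for Whammy's documented API (new Whammy.Video(fps), add(), compile()), which I haven't re-verified here, so treat the call names as assumptions:

```javascript
// Sketch of the canvas hack: draw each video frame onto a canvas and hand
// the canvas to the encoder, which snapshots it as a webp frame; compile()
// then wraps the collected frames in a webm container.
function recordFrames(video, canvas, encoder, frameCount) {
  const ctx = canvas.getContext('2d');
  for (let i = 0; i < frameCount; i++) {
    // Copy the current video frame onto the canvas.
    ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
    encoder.add(canvas);
  }
  return encoder.compile();
}
```

In a real page this loop would be driven by requestAnimationFrame or a timer so frames are sampled as the video plays, rather than all at once.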
videoconverter.js (formerly ffmpeg-asm.js):
This is a straight port of ffmpeg to JavaScript using Emscripten. It works with both audio and video. It's also gigantic, script-wise, at around 25 MB uncompressed. The other reason I'm not using it in production is the shaky licensing ground that ffmpeg is on at the moment.
It has not been optimized as much as it could be. It would probably be quite a project to make it reliably production-ready.
Hopefully that at least gives you avenues of research.

WebRTC - enable sound in chrome doesn't work

I need to add mute/unmute sound controls to my app using WebRTC.
I created this function:
this.StreamSound = function(aStream)
{
    mStream.getAudioTracks()[0].enabled = aStream;
}
It works in Firefox, but not in Chrome (with aStream == false) :/
Any ideas?
(I'm testing it via a local "video" tag.)
For a local stream, to mute the local video you should set the 'muted' property on the local video tag.
If using jQuery, the code is:
$('#localvideo').prop('muted', true);
For muting audio that is being sent over the PeerConnection to a remote browser, setting enabled on the audio tracks should work in both Firefox and Chrome.
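A sketch of the track-level approach that mutes every outgoing audio track rather than only track [0] (stream corresponds to mStream in the question):

```javascript
// Toggle every audio track on the stream being sent over the PeerConnection.
// enabled = false silences the track without stopping it, so it can be
// re-enabled later without renegotiating the connection.
function setAudioEnabled(stream, enabled) {
  for (const track of stream.getAudioTracks()) {
    track.enabled = enabled;
  }
}
```

Note that this does not silence the local preview; for that, set muted on the local video element as described above.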

knitr mp4 movie embedding does not work on Windows XP

I knit an Rmd file (to HTML) with a chunk producing an mp4 movie:
```{r clock, fig.width=7, fig.height=6, fig.show='animate'}
par(mar = rep(3, 4))
for (i in seq(pi/2, -4/3 * pi, length = 12)) {
  plot(0, 0, pch = 20, ann = FALSE, axes = FALSE)
  arrows(0, 0, cos(i), sin(i))
  axis(1, 0, "VI"); axis(2, 0, "IX")
  axis(3, 0, "XII"); axis(4, 0, "III"); box()
}
```
knitr generates the following HTML code to embed the mp4 movie:
<p><video controls="controls" loop="loop"><source src="figure/clock.mp4" type="video/mp4" />video of chunk clock</video></p>
The mp4 movie is created correctly in the figure subfolder, but it does not appear in the HTML output when I open it on a Windows XP machine, whether with Chrome, Firefox, or Internet Explorer.
Here is a (temporary) example: http://stla.overblog.com/ellipse-chart-test - this is not the "clock" example, but it shows exactly the same rendering problem. I can see the movie with Chrome on a Windows Vista machine, but not on my Windows XP machine.
What is the explanation? Is it really a problem with the OS, or with browser versions?
tl;dr Browsers really do use the OS to perform some media-decoding tasks. Work around it by a) providing alternate media streams, b) using the most compatible media format for your audience, c) using a plugin (i.e. Flash), or d) recommending that users install an MP4 plugin.
This is in fact a 'problem' with the OS. Many browsers, like other programs on a given platform, use operating-system resources to accomplish certain tasks. This is particularly true for procedures protected by intellectual-property rights.
Your codec (h.264, aka "MP4") happens to be a particularly fiercely fought-over piece of IP. Browsers therefore don't go to great lengths to license the IP themselves, but instead use the licensed codecs of the host system.
In your case, Windows XP happens not to be able to decode your video's media format, and the browser doesn't seem to be able to do so by itself.
Your alternatives now:
Provide additional media streams in your video tag (see Wikipedia for an example).
Find out which browser the majority of your XP users run, then choose a natively supported format (webm for Chrome or ogg for Firefox).
Just use Flash to play the MP4 (as in the pre-HTML5 days).
Tell users to install an OS-level plugin to play h.264; you could even do that in the fallback text. I won't recommend a specific product, but there are many.
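The alternate-streams option can be sketched like this, building on the markup knitr already emits (the webm/ogv files would have to be generated alongside the mp4; the extra file names here are placeholders):

```html
<video controls="controls" loop="loop">
  <source src="figure/clock.mp4" type="video/mp4" />
  <source src="figure/clock.webm" type="video/webm" />
  <source src="figure/clock.ogv" type="video/ogg" />
  <p>video of chunk clock (install an MP4-capable codec or browser to view)</p>
</video>
```

The browser tries each source in order and plays the first format it can decode; the final paragraph is the fallback text mentioned in option d).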

Does HTML5 audio stream, or download and then play

I intend to put together a web-based player for myself. Is it possible to stream audio files using the HTML5 audio tag, or will they fully download and then play? Is any special server configuration required to serve these files?
Also, what if the audio is not a file on the server but is generated dynamically on the server side, with the raw bytes written to the response stream?
http://www.w3.org/TR/2010/WD-html5-20101019/video.html#audio
edit:
What's up with all that downloading?
Opera, Chrome and Safari will automatically download the whole video file even if it hasn't started to play yet. Firefox 3.6 only loads enough to render a frame and determine duration, unless the autobuffer attribute is present. Note that the spec changed from autobuffer to preload, which hasn't been implemented anywhere yet. Opera plans to change to the Firefox behavior of only loading enough to render a frame and determine duration by default, unless the preload attribute says otherwise.
source: http://dev.opera.com/articles/view/everything-you-need-to-know-about-html5-video-and-audio/
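Given the behavior quoted above, markup that asks the browser not to buffer the file up front looks like this (per the quote, preload support varied between browsers at the time; the file name is a placeholder):

```html
<audio src="song.mp3" controls preload="none">
  Your browser does not support the HTML5 audio element.
</audio>
```

preload accepts none (don't fetch until play), metadata (fetch duration and headers only), and auto (browser's choice, typically full buffering).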