Realtime streaming video to HTML5 with Raspberry Pi, GStreamer

I'm trying to make a live stream from a Raspberry Pi camera available on an HTML5 web page. Because of a combination of factors, I would like to stream it to an external server PC (running Windows 7), and that server should then serve the stream to the HTML page.
I'm able to get the stream from the Raspberry Pi and stream it with Gstreamer to an external server like this:
Raspberry Pi:
raspivid -n -t 0 -rot 270 -w 960 -h 720 -fps 30 -b 2000000 -o - | gst-launch-1.0 -e -vvvv fdsrc ! h264parse ! rtph264pay pt=96 config-interval=1 ! udpsink host=<external IP> port=5000
External server:
gst-launch-1.0 -e -v udpsrc port=5000 ! application/x-rtp, payload=96 ! rtpjitterbuffer ! rtph264depay ! avdec_h264 ! fpsdisplaysink sync=false text-overlay=false
As a result, I can display the live video stream through GStreamer (its D3D video sink) on the external server PC.
Now I have a problem:
I want to display this as HTML5 video served by Apache on the server PC instead of the GStreamer D3D video output.
I have searched for a solution for a long time but couldn't find anything.
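One route worth exploring (a sketch I have not verified on Windows 7; the htdocs paths below are only placeholders) is to let GStreamer on the server repackage the incoming RTP/H.264 into HLS segments that Apache serves as static files, so an HTML5 video element (natively in Safari, or via hls.js elsewhere) can play them:
gst-launch-1.0 -e -v udpsrc port=5000 ! application/x-rtp, payload=96 ! rtpjitterbuffer ! rtph264depay ! h264parse ! mpegtsmux ! hlssink playlist-location=C:/Apache24/htdocs/live/playlist.m3u8 location=C:/Apache24/htdocs/live/segment%05d.ts target-duration=5 max-files=10
The page would then point a video tag (or hls.js) at /live/playlist.m3u8. This assumes the hlssink element from gst-plugins-bad is present in the Windows GStreamer install; the trade-off is a few seconds of added latency compared to the raw UDP/RTP path.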

Related

iOS simulator does not change language when editing .GlobalPreferences.plist in GitHub Actions

I'm creating a CI flow that uses Appium and the iOS simulator on macos-latest. My app should change language along with the simulator language. I read that editing the .GlobalPreferences.plist file and then booting the simulator should switch it to Japanese, but the simulator still comes up with the default language (en).
Node.js: 16
Java: 11
Appium: 1.22.3
macOS: latest
iOS Runtime: 12.4
Device: iPhone X simulator
xcrun simctl create TestiPhone com.apple.CoreSimulator.SimDeviceType.iPhone-X com.apple.CoreSimulator.SimRuntime.iOS-12-4 > deviceid.txt
DEVICEUUID=`cat deviceid.txt`
echo $DEVICEUUID
plutil -p ~/Library/Developer/CoreSimulator/Devices/$DEVICEUUID/data/Library/Preferences/.GlobalPreferences.plist
plutil -replace AppleLocale -string "ja_US" ~/Library/Developer/CoreSimulator/Devices/$DEVICEUUID/data/Library/Preferences/.GlobalPreferences.plist
plutil -replace AppleLanguages -json "[ \"ja\" ]" ~/Library/Developer/CoreSimulator/Devices/$DEVICEUUID/data/Library/Preferences/.GlobalPreferences.plist
echo "Verify locale and language ~ JP"
plutil -p ~/Library/Developer/CoreSimulator/Devices/$DEVICEUUID/data/Library/Preferences/.GlobalPreferences.plist
xcrun simctl boot $DEVICEUUID
xcrun simctl bootstatus $DEVICEUUID
xcrun simctl install booted /Users/runner/work/appiumclonetest/appiumclonetest/BuildFiles/mobile.app
When I use iOS 15.0, the .GlobalPreferences.plist file does not exist in ~/Library/Developer/CoreSimulator/Devices/$DEVICEUUID/data/Library/Preferences. Where can I find it?
Can I change the simulator language by editing the .GlobalPreferences.plist file, or do I need to change something else to make it work? I have also searched for similar discussions, but with no luck.
Thanks
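One workaround that is often suggested for newer runtimes (a sketch I have not verified on iOS 15; the ja_JP locale is just an example of the intent above) is to write the preferences through simctl spawn defaults after the device has booted, instead of editing the plist on disk:
xcrun simctl boot $DEVICEUUID
xcrun simctl spawn $DEVICEUUID defaults write .GlobalPreferences AppleLanguages -array ja
xcrun simctl spawn $DEVICEUUID defaults write .GlobalPreferences AppleLocale -string ja_JP
xcrun simctl shutdown $DEVICEUUID   # restart so the new settings take effect
xcrun simctl boot $DEVICEUUID
Appium's language and locale capabilities are another option to look into, since they set this per session rather than per device.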

Display received gstreamer mjpeg on HTML page

I am receiving an MJPEG stream with GStreamer and have not been able to display it in a simple HTML5 page.
The send command:
gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw,format=YUY2,width=640,height=480,framerate=30/1 ! jpegenc ! rtpjpegpay ! udpsink host=<IP> port=5600
The receive command:
gst-launch-1.0 udpsrc port=5600 ! application/x-rtp,encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink
The receive command works fine, opening a new window that shows the stream as expected.
However, I was not able to find a way of displaying it in an HTML page.
Do you have any suggestions on what to look for, since I am new to the media streaming field?
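For what it's worth, one frequently cited trick (a sketch only; I have not tested it recently, the boundary string and port are arbitrary, and newer browsers may reject the header-less response) is to keep the frames as JPEG, wrap them in a multipart stream, and point an img element at it:
gst-launch-1.0 udpsrc port=5600 ! application/x-rtp,encoding-name=JPEG,payload=26 ! rtpjpegdepay ! multipartmux boundary=frameboundary ! tcpserversink host=0.0.0.0 port=8081
The page then embeds it as <img src="http://<server>:8081/">. A more robust setup puts a small HTTP relay in front of the TCP socket so a proper multipart/x-mixed-replace content type header is sent.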

Does the QEMU emulator have a checkpoint function?

I am using the QEMU emulator for AArch64 and want to create an external checkpoint (or fast forward) that saves everything I need to restart the system from the point where the checkpoint was created. (In fact, I want to skip the boot step.) I have only found material on QEMU VM snapshots and fast forwarding, but it does not seem to work for the emulator. Is there a checkpoint function for the QEMU emulator?
A savevm snapshot should do what you want. The short answer is that you need to set up a QCOW2 disk for the snapshots to be saved to, and then in the monitor you can use the 'savevm' command to take the snapshot. Then the command line '-loadvm' option will let you resume from there. This all works fine in emulation of AArch64.
https://translatedcode.wordpress.com/2015/07/06/tricks-for-debugging-qemu-savevm-snapshots/ has a more detailed tutorial.
Minimal example
Peter's answer just worked for me, but let me provide a fully reproducible example.
I have fully automated everything at: https://github.com/cirosantilli/linux-kernel-module-cheat/tree/1e0f0b492855219351b0bfa2eec4d3a6811fcaaa#snapshot
The key step is to convert the image to qcow2 as explained at: https://docs.openstack.org/image-guide/convert-images.html
cd buildroot/output.x86_64~/images
qemu-img convert -f raw -O qcow2 rootfs.ext2 rootfs.ext2.qcow2
And the final QEMU command used was:
./buildroot/output.x86_64~/host/usr/bin/qemu-system-x86_64 -m 128M -monitor telnet::45454,server,nowait -netdev user,hostfwd=tcp::45455-:45455,id=net0 -smp 1 -M pc -append 'root=/dev/vda nopat nokaslr norandmaps printk.devkmsg=on printk.time=y console=ttyS0' -device edu -device virtio-net-pci,netdev=net0 -drive file=./buildroot/output.x86_64~/images/rootfs.ext2.qcow2,if=virtio,format=qcow2 -kernel ./buildroot/output.x86_64~/images/bzImage -nographic
To test it out, log into the VM and run:
i=0; while true; do echo $i; i=$(($i + 1)); sleep 1; done
Then on another shell, open the monitor:
telnet localhost 45454
savevm my_snap_id
The counting continues. Then, if we load the VM:
loadvm my_snap_id
the counting goes back to where we saved. This shows that CPU and memory states were reverted.
We can also verify that the disk state is reverted. Guest:
echo 0 >f
Monitor:
savevm my_snap_id
Guest:
echo 1 >f
Monitor:
loadvm my_snap_id
Guest:
cat f
And the output is 0.
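For completeness (using the same snapshot name as above), Peter's point about the -loadvm command line option means you can resume directly at boot by appending it to the QEMU command shown earlier, and qemu-img can list the snapshots stored in the qcow2 image:
qemu-img snapshot -l ./buildroot/output.x86_64~/images/rootfs.ext2.qcow2
# append this to the qemu-system-x86_64 command above to boot straight into the saved state
-loadvm my_snap_id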

GStreamer + V4L2loopback as Chrome compatible webcam

I am trying to create a virtual camera in Chrome using v4l2loopback where the incoming video is H264 via RTP.
I have had some success in getting a GStreamer test video recognized in Chrome with MediaStreamTrack.getSources:
$ sudo modprobe v4l2loopback
$ gst-launch-1.0 videotestsrc ! v4l2sink device=/dev/video0
This works well; Chrome will display the video test source.
However, when I use an incoming H.264/RTP source, the device does not show up in MediaStreamTrack.getSources. For example:
gst-launch-1.0 -v tcpclientsrc host=<IPADDRESS> port=5000 ! gdpdepay ! rtph264depay ! avdec_h264 ! videoconvert ! v4l2sink device=/dev/video0
What is the reason for this? What would the solution be?
I had thought this might have to do with the video formats, and that perhaps the correct caps needed to be set through v4l2loopback.
This looks like a bug in gstreamer or v4l2loopback. It is somehow related to how variable frame rate is handled.
I managed to reproduce it in this way:
Start pipeline transmitting video from network to /dev/video0
$ gst-launch-1.0 -v tcpserversrc port=5000 \
! gdpdepay ! rtph264depay \
! decodebin \
! v4l2sink device=/dev/video0
Start pipeline transmitting some video to port 5000
$ gst-launch-1.0 -v videotestsrc \
! x264enc ! rtph264pay ! gdppay \
! tcpserversink port=5000
Try to get video from /dev/video0
$ gst-launch-1.0 v4l2src device=/dev/video0 ! autovideosink
...
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Device '/dev/video0' is not a capture device.
Now, note the caps for v4l2sink in the debug log of the first pipeline.
/GstPipeline:pipeline0/GstV4l2Sink:v4l2sink0.GstPad:sink: caps = video/x-raw, format=(string)I420, width=(int)320, height=(int)240, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, colorimetry=(string)bt601, framerate=(fraction)0/1
It mentions framerate=(fraction)0/1, which in GStreamer's terms means the frame rate is variable. According to v4l2sink's source code, it seems that it feeds this same frame rate to the v4l2loopback kernel module, but v4l2loopback does not understand a zero frame rate.
(This is only a hypothesis; I still need to check whether this is what really happens.)
To work around this bug, you can fix the frame rate. Just add a videorate element to the first pipeline:
$ gst-launch-1.0 -v tcpserversrc port=5000 \
! gdpdepay ! rtph264depay \
! decodebin \
! videorate ! video/x-raw, framerate=25/1 \
! v4l2sink device=/dev/video0
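Separately, if Chrome still refuses to list the loopback device, it may be worth reloading the module with the exclusive_caps parameter, which makes the node advertise only capture capabilities once a producer is attached; this is what Chrome's device enumeration expects. (A sketch only; the parameter depends on the v4l2loopback version installed, and the device number and label below are arbitrary.)
$ sudo rmmod v4l2loopback
$ sudo modprobe v4l2loopback exclusive_caps=1 video_nr=0 card_label="GStreamer loopback"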

Saving security camera stream

I have an Axis M1011 camera and I want to continuously save its stream, splitting it into multiple files.
Then I want to register it in a MySQL database (I think only the information about each file).
How can this be done?
I looked at ffmpeg, but I think I would lose some frames between the successive connections.
One simple script is the following: it saves a video every minute, at 1 fps.
The videos are saved in a directory hierarchy of year/month/day/hour/... With a path structured like that, I don't know whether it is useful to store the path in the database.
b=.avi
i=0
while true; do
    # build a year/month/day/hour directory and a per-minute file name
    path=`date +%Y/%m/%d/%H/`
    file=`date +%H:%M-%d_%m_%Y`
    mkdir -p "$path"
    e=$file$b
    echo "$e"
    # record one minute of the MJPEG stream in the background, then start the next capture
    ffmpeg -r 1 -t "00:01:00" -f mjpeg -i "http://address/mjpg/video.mjpg?streamprofile=lowprofile" "$path$e" &
    sleep 60
    i=`expr $i + 1`
done
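To cover the MySQL part, a minimal sketch (the database, table, and column names here are made up for illustration) is to insert a row for each file right after ffmpeg is started, inside the same loop:
# assumes a table like: CREATE TABLE recordings (id INT AUTO_INCREMENT PRIMARY KEY, path VARCHAR(255), created_at DATETIME);
mysql -u camuser -p'campass' camdb -e "INSERT INTO recordings (path, created_at) VALUES ('$path$e', NOW());"
Storing only the relative path and timestamp keeps the database small, and the files themselves stay on disk in the directory tree built by the script.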