Display RTSP stream in Adobe AIR - actionscript-3

I am working on a project that involves displaying the video feed from an IP camera in Adobe AIR. I know that Flash has no native support for the RTSP protocol, so I am evaluating all possible routes I can take to solve this:
1. Use Adobe Media Server to convert the incoming RTSP stream to RTMP and then use the Flash API (NetConnection and NetStream) directly.
2. Write a custom class to fetch, decode and display the stream in Adobe AIR. [I am unable to confirm whether this is possible due to insufficient information online.]
3. Give up on RTSP and instead fetch a JPEG/MJPEG sequence of images and display them in AIR, which is relatively easy but with doubtful live performance, since the camera's JPEG/MJPEG refresh interval and AIR's own fetch interval are independent.
4. Use the DirectShow Video Source Filter for JPEG and M-JPEG IP Cameras to process the JPEG/MJPEG stream, create a virtual webcam device (the filter does this automatically) and then use the Camera class to display the video feed in AIR (see the sketch after this list).
5. Use webcam 7, a piece of software designed to handle RTSP, JPEG/MJPEG and other stream protocols for many camera brands/models. It installs a system driver that creates a virtual camera, which all other applications can then use as a normal webcam. Unfortunately this software is buggy and often becomes unstable (though that could be specific to my camera model) and may even crash.
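For completeness, the AIR side of options 4 and 5 would just be the standard webcam path. A minimal sketch, assuming the virtual device installed by the driver shows up in Camera.names:

// Minimal sketch for options 4/5: render a (virtual) webcam in AIR.
// Assumes the DirectShow filter or webcam 7 driver appears in Camera.names.
import flash.media.Camera;
import flash.media.Video;

var camera:Camera = Camera.getCamera(); // or pass the device's index in
                                        // Camera.names as a String to pick
                                        // the virtual device explicitly
if (camera)
{
    camera.setMode(640, 480, 25); // match the camera's resolution and frame rate
    var video:Video = new Video(640, 480);
    video.attachCamera(camera);
    addChild(video);
}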
Are there any better, easier options that might not require any third-party software?
EDIT:
In case anybody else bumps into the same problem:
As suggested by Rudolfs Bundulis, I decided to use AIR's NativeProcess API to run FFmpeg, which fetches the RTSP stream, transcodes it, and feeds it to the Flash player.
You might want to look at these for more specific steps:
http://www.purplesquirrels.com.au/2013/02/converting-video-with-ffmpeg-and-adobe-air/
https://www.youtube.com/watch?v=6N7eN9wvAGQ
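For reference, here is a minimal sketch of the NativeProcess part, assuming a bundled ffmpeg.exe (the extendedDesktop profile is required) and a hypothetical RTSP URL; the FFmpeg arguments are illustrative rather than a tested command line:

// Feed a NetStream in Data Generation Mode from FFmpeg's stdout.
import flash.desktop.NativeProcess;
import flash.desktop.NativeProcessStartupInfo;
import flash.events.ProgressEvent;
import flash.filesystem.File;
import flash.media.Video;
import flash.net.NetConnection;
import flash.net.NetStream;
import flash.utils.ByteArray;

// Data Generation Mode: connect(null) + play(null), then appendBytes()
var connection:NetConnection = new NetConnection();
connection.connect(null);
var stream:NetStream = new NetStream(connection);
stream.play(null);

var video:Video = new Video(640, 480);
video.attachNetStream(stream);
addChild(video);

// Launch FFmpeg and have it transcode the RTSP feed to FLV on stdout
var info:NativeProcessStartupInfo = new NativeProcessStartupInfo();
info.executable = File.applicationDirectory.resolvePath("ffmpeg.exe");
info.arguments = Vector.<String>(["-i", "rtsp://192.168.0.10/stream1", "-f", "flv", "-"]);

var process:NativeProcess = new NativeProcess();
process.addEventListener(ProgressEvent.STANDARD_OUTPUT_DATA, onStdout);
process.start(info);

function onStdout(event:ProgressEvent):void
{
    // Append whatever FFmpeg has written so far; the first bytes appended
    // are the FLV header FFmpeg emits, which appendBytes() expects.
    var bytes:ByteArray = new ByteArray();
    process.standardOutput.readBytes(bytes, 0, process.standardOutput.bytesAvailable);
    stream.appendBytes(bytes);
}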

Take the route described in option 2: write an Adobe AIR native extension (ANE) that uses FFmpeg to handle the RTSP stream, decode it and pass the RGB data back to AIR for rendering. The hardest part would be compiling FFmpeg if you need cross-platform functionality; however, since you mention DirectShow, which is Windows-only, I assume you are bound to Windows. Zeranoe provides prebuilt FFmpeg libraries for Windows, Stack Overflow has a lot of topics on decoding a stream using FFmpeg, and then all you need is a callback into AIR and you're good.

Related

Transcoding Audio/Video/Image file in Android Device

I am working on a chat application like WhatsApp, and I want to transcode media files before uploading them to the server. I have gone through many links but cannot decide which method to use. Is there a straightforward way to transcode on Android?
FFmpeg: I found it is a highly CPU-intensive process that will consume more battery power.
MediaCodec: I want to transcode using MediaCodec but cannot find clear steps to understand the process.
Best link to get an idea about transcoding
Library to transcode using MediaCodec (it has many bugs)
We used both implementations in our video editing app: the MediaCodec implementation if the Android version is >= 4.3, and FFmpeg otherwise.
The problems with using FFmpeg:
As you said, it is a CPU-intensive process and thus consumes more battery.
The x264 encoder is licensed under the GPL, so you might want to use the OpenH264 encoder instead, which only supports the Baseline profile, so video quality is not the best.
Since it uses a software encoder, processing speed is relatively slow, at least compared to the MediaCodec implementation.
MediaCodec also has some cons, though, for example:
If you want to do transcoding, the Android version needs to be >= 4.3, unless you want to deal with color format conversion yourself, which is a complete mess since each vendor may have its own color format implementation. (Since 4.3, MediaCodec supports encoding from an input Surface.)
Hardware encoders may behave differently across models. (For example, some encoders produce B-frames, which Android's MediaMuxer does not yet support, so you may want to use FFmpeg for the muxing part.)
So I would say: if you only need to support newer Android versions, use MediaCodec; but if you want to be safe (it is easier to write code that works on all devices) and do not really mind the performance, use FFmpeg with OpenH264.
Android's MediaCodec is a relatively better way to transcode on the client since it uses its own low-level buffer processing, but it does not provide the elaborate tweaking freedom that FFmpeg does.
MediaCodec is also CPU-intensive when holding and processing the buffers, but considerably less so than FFmpeg.

Capture audio stream from NaCl with Web Audio API?

I have a compiled NaCl module that plays audio directly on the speakers. Is it possible to capture the audio buffer with the Web Audio API, or do I need to get the source code for the NaCl module and modify it (and learn Pepper and C++) to achieve this?
Not entirely sure (at all) how NaCl works, but https://github.com/mattdiamond/Recorderjs can be used to capture audio with the Web Audio API. It seems likely it would work in this scenario too.

WebRTC to render H.264 video streams

I am developing a web application in which I need to integrate H.264 UDP streams. I am currently using HTML5. I want to know whether I can do this with WebRTC before I proceed.
WebRTC has lots of code related to H.264 decoding and rendering (e.g. here is WebRTC calling Mac APIs).
However, WebRTC is a monolithic piece of code, in that the networking and decoding are tied together. I don't believe you can build your own networking component and then hand the H.264 stream to WebRTC to decode and render.

XNA Audio and WASAPI conflict?

I've recently been working on integrating some code for a VoIP application. On one side, the UI (mainly the dialer) uses the XNA audio framework to play sound bytes (DTMF tones) on button presses; on the other, the actual call module uses WASAPI for capturing and rendering audio. After integrating the parts, I was seeing the AUDCLNT_BUFFERFLAGS_SILENT flag during the call, and no audio was getting through. I disabled all traces of XNA and tried again, only to see the call work just fine (no silent-buffer flag present).
Do XNA and WASAPI not play nice? Is there a way to keep using XNA for the sound bytes and WASAPI for calls?
Just to tie things up here: Microsoft has stated that while XNA is still available on WP8, it has been officially deprecated. So, assuming the problem stems from cross-API head-butting, the answer is to use another API. I found that MediaElement is a suitable replacement for the XNA sound effects.

How to customize stream protocol to flash client

I need to create a custom communication channel between the server and a Flash client. For example, I want to write a UDP protocol with error correction; it is much faster than TCP and does not suffer from the same routing problems. Unfortunately, I cannot think of how to replace the existing approach:
_stream = new NetStream(_connection);
_video.attachNetStream(_stream);
This encapsulates all the communication and I have no control over it. I understand that I can use appendBytes(), but I am not sure what exactly to pass to this function. I can do anything on the server side. My video is H.264 and my audio is AAC.
Unless it's an AIR application, you can't: it's a native API that already handles the application layer (in OSI model terms).
If you want to make your own, use the flash.net.DatagramSocket class (available in AIR 2+) for your application layer and NetStream.appendBytes() for audio/video decoding and playback, fed with FLV/F4V chunks (see the sketch after these links):
http://help.adobe.com/en_US/FlashPlatform/reference/actionscript/3/flash/net/DatagramSocket.html
http://help.adobe.com/en_US/FlashPlatform/reference/actionscript/3/flash/net/NetStream.html#appendBytes%28%29
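A minimal sketch of that combination, assuming the server sends in-order FLV chunks over UDP to a hypothetical port (a real protocol would add sequence numbers and error correction before appending):

// Receive FLV chunks over UDP and play them via appendBytes() (AIR only).
import flash.events.DatagramSocketDataEvent;
import flash.media.Video;
import flash.net.DatagramSocket;
import flash.net.NetConnection;
import flash.net.NetStream;
import flash.utils.ByteArray;

var connection:NetConnection = new NetConnection();
connection.connect(null); // no server: Data Generation Mode
var stream:NetStream = new NetStream(connection);
stream.play(null);        // enables appendBytes()

var video:Video = new Video(640, 480);
video.attachNetStream(stream);
addChild(video);

var socket:DatagramSocket = new DatagramSocket();
socket.addEventListener(DatagramSocketDataEvent.DATA, onDatagram);
socket.bind(9999); // hypothetical local port
socket.receive();

function onDatagram(event:DatagramSocketDataEvent):void
{
    // Each datagram is assumed to carry the next FLV chunk in order;
    // the very first bytes appended must form a valid FLV header.
    var bytes:ByteArray = new ByteArray();
    event.data.readBytes(bytes);
    stream.appendBytes(bytes);
}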
I was betting on UDP, but never got it working in Flash. Let me explain:
In the browser, there is really no way to use UDP: Flash applications there run in a sandbox that only talks TCP.
AIR is used for desktop applications, which run in a desktop wrapper that has direct access to sockets and other capabilities.
That's it! You have to use TCP.
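In that case the appendBytes() approach from the sketch above still works; only the transport changes. A minimal TCP variant, with a hypothetical host and port (in the browser the server must also serve a socket policy file):

// TCP variant: a raw Socket feeding the same Data Generation Mode NetStream.
import flash.events.ProgressEvent;
import flash.net.NetConnection;
import flash.net.NetStream;
import flash.net.Socket;
import flash.utils.ByteArray;

var connection:NetConnection = new NetConnection();
connection.connect(null);
var stream:NetStream = new NetStream(connection);
stream.play(null); // attach `stream` to a Video object as in the UDP sketch

var socket:Socket = new Socket();
socket.addEventListener(ProgressEvent.SOCKET_DATA, onSocketData);
socket.connect("example.com", 9999);

function onSocketData(event:ProgressEvent):void
{
    // TCP preserves ordering, so bytes can be appended as they arrive
    var bytes:ByteArray = new ByteArray();
    socket.readBytes(bytes, 0, socket.bytesAvailable);
    stream.appendBytes(bytes);
}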