I'm looking for some suggestions or pointers on where to look or how to get started with a project for Windows Phone 8.1. The idea is pretty simple in my mind: I want to constantly record video to a memory stream, keeping only, say, the last five seconds, and then an event will trigger saving the video stream to a file on the phone.
I was originally thinking I could save raw frames to a ring buffer and size it based on the raw frame size × frame rate. Now I realize that might not work, because the video provided by the MediaCapture class will be encoded. Digging around on Stack Overflow, I came across the idea of using Media Foundation Transforms (MFTs), but that sounds a lot more complicated than what I originally had in mind.
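Just to put rough numbers on the ring-buffer idea (assuming, say, 720p NV12 at 30 fps): one raw frame is about 1280 × 720 × 1.5 ≈ 1.4 MB, so five seconds would be around 150 frames, or roughly 200 MB, which would be a lot to hold in memory on a phone even before the encoding question comes into it.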
Looking around the Development Reference material on MSDN, I'm guessing the MediaCapture class will be my friend. Can I somehow define a fixed-size stream for use with MediaCapture.StartRecordToStreamAsync, and then on my event hand it off to MediaCapture.StartRecordToStorageFileAsync? Or perhaps there is a more appropriate way to do this that I should investigate?
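To make it concrete, here's roughly the shape I have in mind (just a sketch assuming the WP8.1 Runtime APIs; ClipRecorder is a name I made up, and note that it buffers the entire session in memory rather than only the last five seconds, which is exactly the part I can't figure out):

using System;
using System.Threading.Tasks;
using Windows.Media.Capture;
using Windows.Media.MediaProperties;
using Windows.Storage;
using Windows.Storage.Streams;

class ClipRecorder
{
    private readonly MediaCapture _capture = new MediaCapture();
    private readonly InMemoryRandomAccessStream _buffer = new InMemoryRandomAccessStream();

    // Start encoding camera output into the in-memory stream.
    public async Task StartAsync()
    {
        await _capture.InitializeAsync();
        var profile = MediaEncodingProfile.CreateMp4(VideoEncodingQuality.Auto);
        await _capture.StartRecordToStreamAsync(profile, _buffer);
    }

    // On the trigger event: stop recording and copy the buffered stream to a file.
    public async Task SaveOnTriggerAsync()
    {
        await _capture.StopRecordAsync();
        StorageFile file = await ApplicationData.Current.LocalFolder.CreateFileAsync(
            "clip.mp4", CreationCollisionOption.ReplaceExisting);
        using (IRandomAccessStream fileStream = await file.OpenAsync(FileAccessMode.ReadWrite))
        {
            await RandomAccessStream.CopyAndCloseAsync(
                _buffer.GetInputStreamAt(0), fileStream.GetOutputStreamAt(0));
        }
    }
}

The open question is whether something like this can be capped so that only the most recent few seconds are kept, or whether that inevitably means dropping down to MFTs.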
Related
It seems that the media pipeline in Windows Phone 8.1 is broken because of a number of memory management issues.
When you create a background audio app that uses IMediaSource to stream audio in Windows Phone Runtime 8.1, the app's components eventually throw OutOfMemoryException and even StackOverflowException under some circumstances. When looking through the memory dumps, there's a lot of uncollected garbage inside.
The discussion started on the MSDN forums and progressed to this conclusion. I have created a WPDev UserVoice suggestion so that the Windows Phone team can notice this, but I still hope that it's me (and the other folks from the MSDN forums) who is wrong and that there is a solution to the issue.
I also have a small CodePlex project that suffers from this, and there's an issue report there about this exact problem.
I hope that with the help of the community this issue can be worked around or passed directly to the Microsoft development team to investigate and eliminate. Thanks!
Update 1:
There's a kind of workaround for StackOverflowException, but it doesn't help against OutOfMemoryException.
Okay, so it seems that the problem is actually with the lifetime of byte arrays in .NET.
In order to resolve the memory problem, one should use the Windows Runtime's Windows.Storage.Streams.IBuffer. Don't create lots of new .NET byte arrays in any form, neither with a plain new byte[] nor via the System.Runtime.InteropServices.WindowsRuntime.WindowsRuntimeBuffer class, since the latter is a managed implementation of the IBuffer interface.
Once allocated, those byte arrays live a long time because they get pinned by OverlappedData structures, and they push the background audio task over its memory threshold. IBuffers (the real Windows Runtime ones, like the Windows.Storage.Streams.Buffer class) contain native arrays that are deallocated as soon as the IBuffer's reference count reaches zero; they don't rely on the GC.
What I've found out is that this problem is not specific to background audio. Actually, I have seen a lot of other questions about similar problems. The solution is to use the Windows Runtime backend where possible, because it's unmanaged and frees resources as soon as they have zero references.
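A minimal sketch of what this looks like in a WP8.1 MediaStreamSource sample handler (the class name, source stream, and timestamp handling here are placeholders, not a complete implementation):

using System;
using Windows.Media.Core;
using Windows.Storage.Streams;

class WinRtBufferSampleProvider
{
    // Placeholder for whatever actually feeds the audio data.
    private readonly IInputStream _source;

    // One real WinRT buffer (Windows.Storage.Streams.Buffer), allocated once
    // and reused, instead of a fresh managed byte[] per sample request.
    private readonly Windows.Storage.Streams.Buffer _buffer =
        new Windows.Storage.Streams.Buffer(4096);

    private TimeSpan _timestamp; // advanced elsewhere; placeholder here

    public WinRtBufferSampleProvider(IInputStream source)
    {
        _source = source;
    }

    public async void OnSampleRequested(MediaStreamSource sender,
        MediaStreamSourceSampleRequestedEventArgs args)
    {
        var deferral = args.Request.GetDeferral();
        // ReadAsync fills the WinRT buffer natively; no managed array gets pinned.
        IBuffer filled = await _source.ReadAsync(
            _buffer, _buffer.Capacity, InputStreamOptions.None);
        args.Request.Sample = MediaStreamSample.CreateFromBuffer(filled, _timestamp);
        deferral.Complete();
    }
}

The key point is that the Buffer here is a genuine Windows Runtime object, so the memory behind it is released by reference counting rather than waiting on the GC.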
Thanks to Soonts for pointing me in the right direction!
They had memory issues with the way MSS (MediaStreamSource) manages its memory, but they silently fixed that in some update: WP7 Background Audio - Memory Leak or Not?
I’m not sure, but I think the problem is your code. You just shouldn’t call var buffer = new byte[4096]; each time a sample is requested. Doing so may work on the PC, but for the embedded platform, I don’t think it’s a good idea to stress the memory manager that much.
In my MediaStreamSource implementation, I use a single circular buffer that is allocated when the MSS is constructed, and portions of that buffer are reused indefinitely during playback. In my GetSampleAsync, I construct an instance of my Stream-implementing class, which doesn't own any memory but only holds a reference to a portion of that circular buffer. This way, only a few small objects are allocated/deallocated during playback, so the audio stream data doesn't load the memory manager.
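Roughly, that Stream-implementing class looks something like the sketch below (simplified: it ignores wrap-around and just exposes one contiguous slice of the shared buffer, and the class name is made up):

using System;
using System.IO;

// Read-only view over a slice of a buffer owned by the MediaStreamSource.
// The stream itself allocates nothing; it only remembers an offset and a length.
class BufferSliceStream : Stream
{
    private readonly byte[] _shared; // the single circular buffer, allocated once
    private readonly int _offset;    // where this sample's data starts
    private readonly int _count;     // how many bytes belong to this sample
    private int _position;

    public BufferSliceStream(byte[] shared, int offset, int count)
    {
        _shared = shared;
        _offset = offset;
        _count = count;
    }

    public override int Read(byte[] dest, int destOffset, int maxCount)
    {
        int n = Math.Min(maxCount, _count - _position);
        Array.Copy(_shared, _offset + _position, dest, destOffset, n);
        _position += n;
        return n;
    }

    public override long Seek(long offset, SeekOrigin origin)
    {
        if (origin == SeekOrigin.Begin) _position = (int)offset;
        else if (origin == SeekOrigin.Current) _position += (int)offset;
        else _position = _count + (int)offset;
        return _position;
    }

    public override bool CanRead { get { return true; } }
    public override bool CanSeek { get { return true; } }
    public override bool CanWrite { get { return false; } }
    public override long Length { get { return _count; } }
    public override long Position
    {
        get { return _position; }
        set { _position = (int)value; }
    }
    public override void Flush() { }
    public override void SetLength(long value) { throw new NotSupportedException(); }
    public override void Write(byte[] buffer, int offset, int count) { throw new NotSupportedException(); }
}

GetSampleAsync can then build its MediaStreamSample over one of these slices, so nothing allocated per sample owns any significant memory.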
I want to extract frames from a live video stream on Windows Phone 8. I'm grabbing the video stream from a device via a Bluetooth connection. I also want to do some image processing on the extracted frames. This processing should run in the background and should notify the user in defined situations. I've searched the internet for my requirements, but I didn't find any solution that could handle that kind of scenario.
So please let me know: is it possible to implement that kind of application using the Windows Phone 8 SDK? If so, please be kind enough to provide some details and directions that I should consider. Otherwise, please let me know what the major issues with my scenario are.
Thank you very much.
Is there any way to buffer video in a Windows Phone 8 app?
I want to create an app that buffers the last 30 seconds or so of video so that the user can tap the screen and get a video file that includes the 30 seconds of video taken prior to their tapping the screen.
I've looked at both the .NET CaptureSource API and the WP8-only AudioVideoCaptureDevice; both look like they record directly to a file in IsolatedStorage:
For CaptureSource you use a FileSink object to write an mp4 file of your recorded video.
For AudioVideoCaptureDevice you can write to a RandomAccessStream. WP8 doesn't have InMemoryRandomAccessStream, though, so the only way I see to get a RandomAccessStream is to create one from a storage file (see the sketch below).
For CaptureSource you could write your own VideoSink class to buffer your video and use that instead of FileSink, but then you would be stuck working with the raw video data, and you'd have to write your own encoder to get it into a format like MP4.
Is there anything I'm missing, or is buffering video just not possible on WP8 unless you write your own encoder?
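For reference, the file-backed route for AudioVideoCaptureDevice that I mentioned above looks roughly like this (a sketch; the class name, file name, and resolution are illustrative), though of course it records straight to storage rather than into a memory buffer I could trim:

using System;
using System.Threading.Tasks;
using Windows.Foundation;
using Windows.Phone.Media.Capture;
using Windows.Storage;
using Windows.Storage.Streams;

class FileBackedCapture
{
    private AudioVideoCaptureDevice _camera;

    public async Task StartAsync()
    {
        // WP8 has no InMemoryRandomAccessStream, so back the stream with a file.
        var file = await ApplicationData.Current.LocalFolder.CreateFileAsync(
            "capture.mp4", CreationCollisionOption.ReplaceExisting);
        IRandomAccessStream stream = await file.OpenAsync(FileAccessMode.ReadWrite);

        // The resolution should be one of the camera's supported capture resolutions
        // (AudioVideoCaptureDevice.GetAvailableCaptureResolutions); 640x480 is just an example.
        _camera = await AudioVideoCaptureDevice.OpenAsync(
            CameraSensorLocation.Back, new Size(640, 480));
        await _camera.StartRecordingToStreamAsync(stream);
    }

    public async Task StopAsync()
    {
        await _camera.StopRecordingAsync();
    }
}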
I'm not sure you can do this, for various reasons. Maybe you could cache video in memory by making your own implementation of IRandomAccessStream, but, as you noted, you would have to work with raw video in the first instance, and depending on the resolution, 30 seconds of raw video and audio can weigh more than the total memory allowed for the application, so your app could be closed by the system.
I don't know if you could use a MediaElement to play the video without showing it to the user and then, when the user taps play, rewind to the start position and show it, since the OS automatically caches streamed videos (this is just an idea; I haven't tested it in any way).
Sorry for not being more useful :(
I've spent plenty of time on this problem, but it looks like I need some help. I have a web conference application which provides the ability to stream live video, chat, share documents, draw on a whiteboard, share the desktop, etc. Now I want to record everything that happens in a given webinar, including video and sound. So I'm looking for tools that can help with this goal.
Here's the input data:
This is an Adobe Flash based application
It uses the Wowza server
Everything should be recorded on the server
Many webinars can be in recording mode at the same time
The recording should be produced as a video file (FLV, MP4 or whatever)
What I've done so far and what problems I have:
I have implemented recording on the server side. But this is not a video; it is just a list of commands to recreate the recorded webinar. It works, but it has lots of limitations and problems with rewinding.
And now I'm testing this FLV Encoding library. I created an AIR application that starts on the server when a recording is needed, connects to the given webinar, and takes screenshots of itself with the BitmapData.draw() method. It works pretty neatly, but it has some limitations that I'm looking for help with:
First of all, there is the sound problem. I have no idea how to catch all sounds from all sources in Flash. So far, from my tests and googling, I conclude that SoundMixer.computeSpectrum() won't help me do this. Maybe this could be done on the server side by mixing all the streams at the right time, but I think that could lead to synchronization problems, and I would prefer to capture sound on the client. Maybe there is a way to capture the audio byte array from an RTMP stream somehow?
Security problems. We have two kinds of them. The first is with streaming videos: the BitmapData.draw() method throws exceptions even after adding <StreamAudioSampleAccess>true</StreamAudioSampleAccess> and <StreamVideoSampleAccess>true</StreamVideoSampleAccess> on the server. There are lots of posts about this problem and no good solution.
But the more complex problem is that YouTube videos can be opened in a webinar using the API player, and in this situation I have no idea how to resolve the security problem. Maybe someone knows a way or a workaround to use BitmapData.draw() on the YouTube AS3 player?
Or maybe there is another good way to solve my recording issue?
The free Apache OpenMeetings conferencing software [1] has a Java recording application inside, which should work in the 3.0 release. Just use it.
[1] http://openmeetings.apache.org/
I'm currently working on a dynamic MP3 player in AS3. The player will also support continuous (i.e. endless) radio streams.
Because my player will include a seek bar, I allow the user to seek through the Sound object's data. Now, I know that with a continuous stream the data stored in the user's RAM keeps growing, since downloading never stops. This means that after a few hours of streaming, a lot of RAM is being used by my app. I've tested the app on my own machine, which has a very high spec, and the app crashes in my browser. When I say the app crashes, I mean the whole of Flash crashes, so I have to restart my browser in order to use Flash again. I know my app is the cause, as Flash has never crashed for me in the past; it only does it after my app has been streaming for 2+ hours.
So what I want to do is only allow the user to cache up to an hour's worth of audio. After an hour, I want to clear the first half of the Sound object's data, so that only the most recent half hour of audio is stored and available for seeking.
So I have my stream:
var soundObj:Sound = new Sound();
soundObj.load(new URLRequest('stream.mp3'));
// etc.
and soundObj is where the data is stored. So my question: how would I clear the first 30 minutes of audio from that object?
Perhaps the Sound class is not meant to reliably play "unlimited" MP3 files, which seems to be your case. It is made to play normal MP3 "songs". Two hours of MP3 sound can easily accumulate to be larger than 200 megabytes of data.
But there is a good solution - use the NetConnection and NetStream classes to stream audio instead. There are many tutorials out there. You will also be able to stream your MP3s, just a bit differently - a central server will be involved, which will transcode the MP3s on the fly, delivering them to you in a true "streaming" manner. One such server is Adobe Flash Media Server, an overpriced piece of work from Adobe. A lot of free and open-source alternatives exist that will work fine for your purposes - Red5 and nginx-rtmp, to name a couple that I have tested myself.