H.264 stream muxing: duration of resulting file is shorter than recording time

I am muxing an H.264 stream from a v4l device into an AVI container using the approach outlined in the following Stack Overflow question.
The resulting files are playable, but a (let's say) 30-second recording produces a file that is only 10 seconds long. In other words, 30 seconds elapse between pressing the 'Start recording' button and pressing 'Stop recording', but the file is only 10 seconds long (as shown in Windows Media Player). Muxing starts immediately once I press the 'Start recording' button.
Any ideas on how I could approach this problem?

The problem was with the fps parameter:
AVStream *pst = avformat_new_stream(fc, 0);
vi = pst->index;
AVCodecContext *pcc = pst->codec;
_M; // error-checking/logging macro carried over from the linked example
avcodec_get_context_defaults3(pcc, NULL); // NULL gives generic defaults; defaults3 expects an AVCodec*, not a media type
pcc->codec_type = AVMEDIA_TYPE_VIDEO;
pcc->codec_id = codec_id;
pcc->bit_rate = br;
pcc->width = w;
pcc->height = h;
pcc->time_base.num = 1;
int fps = 30; // problem here: hardcoded rate doesn't match what the device actually delivers
pcc->time_base.den = fps;
As it turned out, the H.264 stream produces frames at 13 fps. Once I set fps = 13, the file duration became aligned with the recording time.
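A minimal sketch of the general fix (the probe logic and variable names here are my own illustration, not the original code): measure the rate the device actually delivers before muxing, and use that for the stream time base instead of a hardcoded value.
// Hypothetical probe: count frames delivered by the v4l device over a few
// seconds before muxing starts, then derive the time base from that.
int frames_seen = 39;      // e.g. counted during a 3-second probe
int probe_seconds = 3;
int measured_fps = frames_seen / probe_seconds;  // 13 for this device

pcc->time_base.num = 1;
pcc->time_base.den = measured_fps;  // now the container duration matches wall-clock time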

Related

Web Audio API plays beep, beep, ... beep at different rates

I am trying to play "beep" sound at different rate based on some sensor readings inside a browser window.
The idea is to "beep, beep, beep, ... beep" faster when the sensor reading is high, and "beep,...beep" slower when the sensor reading is low, all in real-time.
The sensor reading is fed into the browser via socket.io. I can already control a progress bar moving up and down. The audio feedback is an extra feature.
After some googling, I am thinking about using the Web Audio API: creating a sine-wave oscillator and turning it on/off by connecting/disconnecting a gain node.
My question is how to control the timing correctly, say to beep at a rate anywhere from 1 Hz to 20 Hz, and how to change that rate dynamically.
I would most specifically NOT turn an oscillator on and off by connecting and disconnecting it; you'd have to do that from the main thread, so the timing would not be very predictable.
You can actually do this with a modulating low-frequency oscillator: check out this code:
var context = new AudioContext();
//defaults to A440Hz, sine wave
var src = context.createOscillator();
// Now let's create a modulator to turn beeps on and off
var mod = context.createOscillator();
mod.type="square";
mod.frequency.value = "2"; // Start at 2Hz
var gain = context.createGain();
var scaler = context.createGain();
src.connect(gain);
gain.connect(context.destination);
mod.connect(scaler); // Mod signal is [-1,1]
scaler.gain.value = 0.5; // we need it to be [-0.5,0.5]
gain.gain.value = 0.5; // then it's summed with 0.5, so [0,1]
scaler.connect(gain.gain);
//start it up
src.start(0);
mod.start(0);
// to change rate, change mod.frequency.value to desired frequency
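To tie this back to the question, one way to drive the beep rate from the socket.io feed is sketched below. The client variable socket, the event name 'reading', and the assumption that readings arrive normalized to [0, 1] are all mine, not from the original post.
var socket = io(); // assumed socket.io client connection
socket.on('reading', function (value) {
    // Map a normalized reading in [0, 1] onto the 1-20 Hz beep range
    mod.frequency.value = 1 + 19 * value;
});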

How can I correctly calculate the time delta?

I'm trying to create a game whose movement is independent of the frame rate, in which myObject moves to the right at one unit per millisecond. However, I don't know how to calculate deltaTime in this code:
var currentTime = 0;
var lastTime = 0;
var deltaTime = 0;
while (play) {
    // Retrieve the current time
    currentTime = Time.now();
    deltaTime = currentTime - lastTime;
    lastTime = currentTime;
    // Move myObject at the rate of one unit per millisecond
    myObject.x += 1 * deltaTime;
}
Let's say the first frame takes 30 ms, so deltaTime should be 30, but it is 0, because we only know the time at the start of the frame, not at the end. Then the second frame takes 40 ms, so deltaTime is 30 and myObject.x becomes 30. However, the total elapsed time is 70 ms (30 ms in the 1st frame + 40 ms in the 2nd), so myObject.x should be 70, not 30.
I'm not simulating physics; I'm just trying to move myObject relative to the elapsed time (not the frame).
How do I calculate deltaTime correctly?
I know that some game engines use a fixed chunk of time, or tick, so they are effectively animating ahead of time. I've also read Glenn Fiedler's article on fixing your timestep and many others, but I'm still confused.
Try this:
float LOW_LIMIT = 0.0167f;  // Keep At/Below 60fps
float HIGH_LIMIT = 0.1f;    // Keep At/Above 10fps

float lastTime = Time.now();
while (play) {
    float currentTime = Time.now();
    float deltaTime = (currentTime - lastTime) / 1000.0f; // elapsed time in seconds
    if (deltaTime < LOW_LIMIT)
        deltaTime = LOW_LIMIT;
    else if (deltaTime > HIGH_LIMIT)
        deltaTime = HIGH_LIMIT;
    lastTime = currentTime;
    myObject.x += 1000 * deltaTime; // equivalent to one unit per ms (i.e. 1000 per s)
}
There is a lot wrong with this, but it makes the basic concept easy to illustrate.
First, notice that you need to initialize lastTime with some value BEFORE the loop starts. You can use an earlier value (e.g. Time.now() - 33) so that the first frame yields the desired delta, or just use Time.now() as I did (you will see that we limit the delta in the loop anyway).
Next, you get the current time at the start of each frame and use it to calculate the time elapsed since the last loop (which will be zero on the first run of this example). Then I like to convert it to seconds, because it makes much more sense to work in "per second" than "per millisecond"; feel free to remove the / 1000.0f part to keep it in ms.
Then you need to limit deltaTime to some usable range (for the example I used 10-60 fps, but you can change this as needed). This simply prevents the loop from running too fast or too slow. The HIGH_LIMIT is especially important because it prevents very large delta values, which can cause chaos in a game loop (better to be somewhat inaccurate than to have the code break down); the LOW_LIMIT prevents zero (or very small) time steps, which can be equally problematic (especially for physics).
Finally, once you've calculated the new deltaTime for this frame, you save the current time for use during the next frame.
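Since the question mentions Glenn Fiedler's article, here is a minimal sketch of the fixed-timestep (accumulator) pattern it describes, in the same pseudocode style as above; the 10 ms step size is an arbitrary choice of mine:
float STEP = 0.01f;                // fixed simulation step: 10 ms
float accumulator = 0.0f;
float lastTime = Time.now();
while (play) {
    float currentTime = Time.now();
    accumulator += (currentTime - lastTime) / 1000.0f; // bank the real elapsed time, in seconds
    lastTime = currentTime;
    while (accumulator >= STEP) {   // consume it in fixed-size chunks
        myObject.x += 1000 * STEP;  // one unit per ms, advanced STEP seconds at a time
        accumulator -= STEP;
    }
}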

Incorrect buffer length in GStreamer

I have the following function that processes a buffer object containing a video frame supplied by GStreamer
def __handle_videoframe(self, appsink):
    """
    Callback method for handling a video frame

    Arguments:
    appsink -- the sink to which gst supplies the frame (not used)
    """
    buffer = self._videosink.emit('pull-buffer')
    caps = buffer.get_caps()  # get_caps is a method; the original was missing the call parentheses
    (w, h) = caps[0]["width"], caps[0]["height"]
    reqBufferLength = w * h * 3  # required buffer length for a raw RGB image of these dimensions
    print "Buffer length: " + str(len(buffer.data))
    print "Needed length: " + str(reqBufferLength)
    img = pygame.image.frombuffer(buffer.data, self.vidsize, "RGB")
    self.screen.blit(img, self.vidPos)
    pygame.display.flip()
When running this code, however, pygame crashes because the supplied buffer is larger than required, and pygame needs the sizes to match exactly. I know this is probably caused by faulty encoding of the movie being played (most movies do run fine), but is there a way to account for this contingency? Is there a way to resize the buffer on the fly to the correct size? I have tried simply cutting off the tail of the buffer at the required length; the movie then plays, but the output is corrupted.
OK, a better solution was to use buffer proxies; they are less fussy about the length of the buffer.
img_sfc = pygame.Surface(video_dimensions, pygame.SWSURFACE, 24, (255, 65280, 16711680, 0))
img_buffer = img_sfc.get_buffer()
Then for each new frame:
img_buffer.write(buffer.data, 0)
pygame.display.get_surface().blit(img_sfc.copy(), vid_pos)
And voilà, even incorrectly formatted buffers appear on screen without problems.
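Putting the two pieces together, the callback from the question would look roughly like this (a sketch: it assumes the surface and its proxy are created once, e.g. in __init__, and stored as self._img_sfc and self._img_buffer, names of my choosing):
def __handle_videoframe(self, appsink):
    buffer = self._videosink.emit('pull-buffer')
    # The buffer proxy tolerates a source that is slightly longer than the surface needs
    self._img_buffer.write(buffer.data, 0)
    self.screen.blit(self._img_sfc.copy(), self.vidPos)
    pygame.display.flip()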

AS3 mp3 import error: The AMF encoding of the arguments cannot exceed 40K

NOTE: I have seen the other question about Error #2084 ("The AMF encoding of the arguments cannot exceed 40K");
my problem is different. My ByteArray is NOT empty and its length is less than 40960.
My code is a simple one. I got this mp3 recording fla from this link: http://www.jordansthings.com/blog/?p=5
It uses the shinemp3 encoder.
I just wanted to play the recorded sound rather than saving it, so I added the following to the button handler that saves the recorded file:
private function onWavClick(e:MouseEvent):void
{
    // WRITE ID3 TAGS
    var sba:ByteArray = mp3Encoder.mp3Data;
    sba.position = sba.length - 128;
    sba.writeMultiByte("TAG", "iso-8859-1");
    sba.writeMultiByte("Microphone Test 1-2, 1-2 " + String.fromCharCode(0), "iso-8859-1"); // Title
    sba.writeMultiByte("jordansthings " + String.fromCharCode(0), "iso-8859-1"); // Artist
    sba.writeMultiByte("Jordan's Thingz Bop Volume 1 " + String.fromCharCode(0), "iso-8859-1"); // Album
    sba.writeMultiByte("2010" + String.fromCharCode(0), "iso-8859-1"); // Year
    sba.writeMultiByte("www.jordansthings.com " + String.fromCharCode(0), "iso-8859-1"); // comments
    sba.writeByte(57);
    //new FileReference().save(sba, "FlashMicrophoneTest.mp3") // this saves the file. I don't need it.

    // my addition
    var snd:Sound = new Sound();
    var channel:SoundChannel = new SoundChannel();
    trace(sba.length);
    snd.loadCompressedDataFromByteArray(sba, sba.length);
    channel = snd.play();
}
Moreover, even if this works, does it mean I cannot load an array larger than 40K?
Before calling loadCompressedDataFromByteArray, you should set the position of your ByteArray to 0. For example:
sba.position = 0;
snd.loadCompressedDataFromByteArray(sba,sba.length);
I noticed that in my application the bytesAvailable on the ByteArray was 0. This was because the position of the ByteArray was at its end. The error message is confusing: it says you are exceeding 40K when you are not; it should say that there is nothing to load from the ByteArray.
Also, I can confirm that I can load ByteArrays larger than 40K. I tested with a 126 KB mp3 ByteArray.
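A quick way to confirm that this is the problem before loading (a sketch, using the same sba as above):
trace(sba.position, sba.length, sba.bytesAvailable); // bytesAvailable == length - position
sba.position = 0; // rewind; bytesAvailable is now sba.length again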
In my case the problem was not the size of the ByteArray I wanted to read; I was reading and playing 30 MB mp3 files without problems (that's a lot, I know!). The problem was that I was reading the file many times, and after the first read the position of the ByteArray was at the end. So you have to reset the position to 0 every time you want to read that ByteArray. To be safe, I assume it is good practice to do so.

AS3 - How much time until next frame / screen draw

I have a generative art app, and I'd like it to draw as many cycles as possible each frame without reducing the framerate. Is there a way to tell how much time is left until the screen updates/refreshes?
I figure that if I can approximate how many milliseconds each cycle takes, I can run cycles until the time left is less than the average (or peak) cycle time, let the screen refresh, and then run another set of cycles.
If you want your app to run at N frames per second, then you can draw in a loop for 1/N seconds, where N is typically the stage framerate (which you can get and set):
import flash.utils.getTimer;
import flash.events.Event;
private var _time_per_frame:uint;
... Somewhere in your main constructor:
stage.frameRate = 30;
_time_per_frame = 1000 / stage.frameRate;
addEventListener(Event.ENTER_FRAME, handle_enter_frame);
...
private function handle_enter_frame(e:Event):void
{
    var t0:uint = getTimer();
    while (getTimer() - t0 < _time_per_frame) {
        // ... draw some stuff
    }
}
Note that this is somewhat of a simplification and may result in a slower framerate than specified by stage.frameRate, because Flash needs some time to perform the rendering in between frames. But if you're blitting (drawing to a Bitmap on screen), as opposed to drawing vectors or adding Shapes to the screen, then I think the above should actually be pretty accurate.
If the code results in slower-than-desired framerates, you could try something as simple as only taking half the allotted time for a frame, leaving the other half for Flash to render:
_time_per_frame = 500 / stage.frameRate;
There are also FPS monitors around that you could use to monitor your framerate while drawing. Google as3 framerate monitor.
Put this code in your main object and check; it will trace the time between the start of each frame.
addEventListener(Event.ENTER_FRAME, oef);

var step:Number = 0;
var last:Number = new Date().getTime(); // getTime() is an instance method, so a Date object is needed

function oef(e:Event):void {
    var time:Number = new Date().getTime();
    step = time - last; // ms elapsed since the previous frame started
    last = time;
    trace(step);
}
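As a side note, flash.utils.getTimer() (already used in the first answer) returns milliseconds since the SWF started and avoids allocating a new Date every frame; an equivalent sketch (the names oef2, lastMs and nowMs are mine):
import flash.utils.getTimer;

addEventListener(Event.ENTER_FRAME, oef2);
var lastMs:int = getTimer();

function oef2(e:Event):void {
    var nowMs:int = getTimer();
    trace(nowMs - lastMs); // ms since the previous frame started
    lastMs = nowMs;
}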