Incorrect buffer length GStreamer - pygame

I have the following function that processes a buffer object containing a video frame supplied by GStreamer:
def __handle_videoframe(self, appsink):
    """
    Callback method for handling a video frame
    Arguments:
    appsink -- the sink to which gst supplies the frame (not used)
    """
    buffer = self._videosink.emit('pull-buffer')
    (w, h) = buffer.get_caps()[0]["width"], buffer.get_caps()[0]["height"]
    reqBufferLength = w * h * 3  # Required buffer length for a raw RGB image of these dimensions
    print "Buffer length: " + str(len(buffer.data))
    print "Needed length: " + str(reqBufferLength)
    img = pygame.image.frombuffer(buffer.data, self.vidsize, "RGB")
    self.screen.blit(img, self.vidPos)
    pygame.display.flip()
When running this code, however, pygame crashes because the supplied buffer is larger than required, and pygame expects the sizes to match exactly. I know this is probably caused by faulty encoding of the movie being played (most movies do run fine), but is there a way to account for this contingency? Is there a way to resize the buffer on the fly to the correct size? I have tried just cutting off the tail of the buffer at the required length; the movie then plays, but the output is corrupted.

OK, a better solution was to use buffer proxies; they are less fussy about the length of the buffer.
img_sfc = pygame.Surface(video_dimensions, pygame.SWSURFACE, 24, (255, 65280, 16711680, 0))
img_buffer = img_sfc.get_buffer()
Then for each new frame:
img_buffer.write(buffer.data, 0)
pygame.display.get_surface().blit(img_sfc.copy(), vid_pos)
And voila, even incorrectly formatted buffers appear on screen without problems.
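For completeness, here is a minimal sketch (not from the original post) of what the per-frame callback could look like with this approach. It reuses the attribute names from the question (self._videosink, self.vidPos) and assumes img_sfc and img_buffer are stored on self and created once at initialization:

def __handle_videoframe(self, appsink):
    # Pull the frame and write its raw bytes straight into the surface's buffer proxy,
    # which (as noted above) is more tolerant of an over-long source buffer.
    buffer = self._videosink.emit('pull-buffer')
    self.img_buffer.write(buffer.data, 0)
    pygame.display.get_surface().blit(self.img_sfc.copy(), self.vidPos)
    pygame.display.flip()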


h264 stream muxing: duration of resulting file is shorter than recording time

I am muxing an H264 stream from a v4l device into an AVI container using the approach outlined in the following Stack Overflow question.
The resulting files are playable, but a (let's say) 30-second recording produces a file that is only 10 seconds long. In other words, 30 seconds elapse between pressing the 'Start recording' button and pressing the 'Stop recording' button, but the file is only 10 seconds long (as shown in Windows Media Player). Muxing starts immediately once I press the 'Start recording' button.
Any ideas on how I could approach this problem?
The problem was with the fps parameter:
AVStream *pst = avformat_new_stream(fc, 0);
vi = pst->index;
AVCodecContext *pcc = pst->codec;
_M;
avcodec_get_context_defaults3(pcc, AVMEDIA_TYPE_VIDEO);
pcc->codec_type = AVMEDIA_TYPE_VIDEO;
pcc->codec_id = codec_id;
pcc->bit_rate = br;
pcc->width = w;
pcc->height = h;
pcc->time_base.num = 1;
int fps = 30; // problem here
pcc->time_base.den = fps;
As it turned out, the H264 stream produces frames at 13 fps. Once I set fps = 13, the file duration became aligned with the expected recording time.

CMSIS real-FFT on 8192 samples in Q15

I need to perform an FFT on a block of 8192 samples on an STM32F446 microcontroller.
For that I wanted to use the CMSIS DSP library as it's available easily and optimised for the STM32F4.
My 8192 samples of input will ultimately be values from the internal 12-bit ADC (left-aligned and converted to q15 by flipping the sign bit), but for testing purposes I'm feeding the FFT with test buffers.
With CMSIS's FFT functions, only the Q15 version supports lengths of 8192. Thus I am using arm_rfft_q15().
Because the FFT functions of the CMSIS library include by default about 32k of LUTs (to support many FFT lengths), I have "rewritten" them to remove all the tables corresponding to lengths other than the one I'm interested in. I haven't touched anything except removing the unused code.
My samples are stored on an external SDRAM that I access via DMA.
When using the FFT, I have several problems:
Both my source buffer and my destination buffer get modified;
the result is not at all what I expect.
To double-check that the results were wrong, I did an IFFT right after the FFT, but it just confirmed that the code wasn't working.
Here is my code:
status_codes FSM::fft_state(void)
{
    // Flush the SDRAM section
    si_ovf_buf_clr_u16((uint16_t *)0xC0000000, 8192);
    q15_t* buf = (q15_t*)(0xC0000000);
    for(int i = 0; i < 50; i++)
        buf[i] = 0x0FFF; // Fill the buffer with test vector (50 sp gate)
    // initialise FFT
    // ---> Forward, 8192 samples, bitReversed
    arm_rfft_instance_q15 S;
    if(arm_rfft_init_q15(&S, 8192, 0, 1) != ARM_MATH_SUCCESS)
        return state_error;
    // perform FFT
    arm_rfft_q15(&S, (q15_t*)0xC0000000, (q15_t*)0xC0400000);
    // Post-shift by 12, in place (see doc)
    arm_shift_q15((q15_t*)0xC0400000, 12, (q15_t*)0xC0400000, 16384);
    // Init inverse FFT
    if(arm_rfft_init_q15(&S, 8192, 1, 1) != ARM_MATH_SUCCESS)
        return state_error;
    // Perform iFFT
    arm_rfft_q15(&S, (q15_t*)0xC0400000, (q15_t*)0xC0800000);
    // Post shift
    arm_shift_q15((q15_t*)0xC0800000, 12, (q15_t*)0xC0800000, 8192);
    return state_success;
}
And here is the result (from GDB)
PS: I'm using ChibiOS - not sure if it is relevant.

How can I record sound with 16 bits per sample (16 bit depth)?

I am trying to record PCM sound in Flash (using the Microphone class). I use the org.bytearray.micrecorder.MicRecorder helper class.
In the Microphone class I cannot find a property like bitDepth or bitsPerSample.
I always get 32 bits.
Is it possible to do this?
UPDATE: The asker John812 was able to solve this by using:
bit16_bytes.writeShort( data.readFloat() * 32767 );
(see comments below for context)
METHOD #2: Based on my experience with the loadPCMFromByteArray method, I have something you could try, but I've only used it with an actual 32-bit WAVE file played back via loadPCMFromByteArray.
The AS3 Microphone class records 32-bit samples. You have to write the conversion of samples to a different bit depth yourself. I have no idea how many samples you are processing, but the general code below shows you how to convert. Note: * 512 means use your actual sample amount (for example * 4096 or * 8192). If you get the numbers wrong there will be hiss/distortion, so either experiment starting from small values or provide the full details in your question for a more helpful edit/answer.
CONVERT: Assuming your recorded byteArray is called data
public var bit16_bytes : ByteArray; // will hold the 16-bit version

public function convert_to16Bit () : void
{
    bit16_bytes = new ByteArray();
    data.position = 0;
    // if you get noise/distortion try either: 256, 512, 1024, 2048, 4096 or 8192
    while (data.bytesAvailable >= 4) // read while a full 32-bit sample remains
    {
        bit16_bytes.writeShort( data.readInt() * 512 ); // multiply by samples amount
    }
    data = new ByteArray();        // recycle for re-use
    bit16_bytes.position = 0;      // reset or else E-O-File error
    bit16_bytes.readBytes( data ); // copy 16-bit back into data byte-array
}
To run the above function, whenever you're ready just add the line convert_to16Bit(); inside whatever function handles your "recording complete" situation.

AS3 mp3 import error: The AMF encoding of the arguments cannot exceed 40K

NOTE: I have seen the other question about Error #2084: The AMF encoding of the arguments cannot exceed 40K.
My problem is different: my array is NOT of length 0, and it is less than 40960 bytes.
My code is a simple one. I got this mp3 recording fla from this link: http://www.jordansthings.com/blog/?p=5
It uses the shinemp3 encoder.
I just wanted to play the recorded sound rather than saving it. So I added the following to the button that saves the recorded file:
private function onWavClick(e:MouseEvent)
{
    // WRITE ID3 TAGS
    var sba:ByteArray = mp3Encoder.mp3Data;
    sba.position = sba.length - 128;
    sba.writeMultiByte("TAG", "iso-8859-1");
    sba.writeMultiByte("Microphone Test 1-2, 1-2 " + String.fromCharCode(0), "iso-8859-1"); // Title
    sba.writeMultiByte("jordansthings " + String.fromCharCode(0), "iso-8859-1"); // Artist
    sba.writeMultiByte("Jordan's Thingz Bop Volume 1 " + String.fromCharCode(0), "iso-8859-1"); // Album
    sba.writeMultiByte("2010" + String.fromCharCode(0), "iso-8859-1"); // Year
    sba.writeMultiByte("www.jordansthings.com " + String.fromCharCode(0), "iso-8859-1"); // Comments
    sba.writeByte(57);
    //new FileReference().save(sba, "FlashMicrophoneTest.mp3") // this saves the file. I don't need it.

    // my addition
    var snd:Sound = new Sound();
    var channel:SoundChannel = new SoundChannel();
    trace(sba.length);
    snd.loadCompressedDataFromByteArray(sba, sba.length);
    channel = snd.play();
}
Moreover: even if this works... I cannot load an array larger than 40K???
Before calling loadCompressedDataFromByteArray, you should set the position of your ByteArray to 0. For example:
sba.position = 0;
snd.loadCompressedDataFromByteArray(sba,sba.length);
I noticed that in my application the bytesAvailable on the ByteArray was 0. This was because the position on the ByteArray was at the end of the ByteArray. The error message is confusing because it says you are exceeding 40K while you are not. It should be saying that there is nothing to load from the ByteArray.
Also, I can confirm that I can load ByteArrays that are larger than 40K; I tested with a 126 KB mp3 ByteArray.
In my case the problem was not the size of the ByteArray I wanted to read. I was reading and playing 30 MB mp3 files without problems (that's a lot, I know!). The problem was that I was reading the file many times, and after the first read the position of the ByteArray was at the end. So you have to reset the position to 0 any time you want to read that ByteArray; to be safe, I assume it is good practice to do so anyway.

Resizing a video player

I have a pygame-based video player; it uses the Berkelium browser for the GUI, and underneath that the VLC libraries for playing streams. I did not make the player, but I would like to add a "resize" option, which is not currently present.
The application is made for OS X.
On initialization of the player, pygame.display.set_mode(screensize, OPENGL|DOUBLEBUF|RESIZABLE) is called. The problem is that when I resize the window beyond the screen size, the parts outside the screen are not visible (they show up bugged), and if I call pygame.display.set_mode again, the app crashes because of its dependencies (the browser and VLC).
So I decided to initialize the screen using:
pygame.display.set_mode(screenresolution, OPENGL|DOUBLEBUF|RESIZABLE)
which should set it to the maximum size, and then switch back to the original, smaller resolution (as when resizing with the mouse).
How do I do that? How do I simulate the RESIZABLE flag and the mouse resize action?
The player resizes nicely to smaller sizes; it is only bigger sizes that are a problem.
class InputAndDisplay:
    def init(self, fullscreen=False):
        global pygame, sys
        # if 'darwin' in sys.platform:
        #     import sys
        #     sys.argv = ['Core.py']
        #     import pygame.macosx
        #     pygame.macosx.init()
        pygame.init()
        global screenSize
        self.screenSize = screenSize
        if fullscreen:
            try: self.screenSize = pygame.display.list_modes()[0]
            except Exception, e: print e
            flags = OPENGL|DOUBLEBUF|FULLSCREEN
        else:
            flags = RESIZABLE|OPENGL|DOUBLEBUF
        print 'screenSize:', self.screenSize
        try:
            povrsina = pygame.display.set_mode(self.screenSize, flags)
        except Exception, e:
            print e
            self.screenSize = screenSize
            povrsina = pygame.display.set_mode(self.screenSize, RESIZABLE|OPENGL|DOUBLEBUF)
        pygame.display.set_caption('NetTVBox-WebPlayer')
        pygame.mouse.set_pos(list(self.screenSize))
        if 'darwin' not in sys.platform:
            pygame.mouse.set_visible(False)
        #pygame.key.set_repeat(300, 100)
        if 'darwin' not in sys.platform:
            pygame.mixer.quit()
        if 'linux' in sys.platform: # try to turn off vsyncing
            import ctypes
            libgl = ctypes.cdll.LoadLibrary('libGL.so.1')
            proc = libgl.glXGetProcAddressARB('glXSwapIntervalMESA')
            _glXSwapIntervalMESA = ctypes.CFUNCTYPE(ctypes.c_int, ctypes.c_int)(proc)
            _glXSwapIntervalMESA(0)
        glMatrixMode(GL_PROJECTION)
        glLoadIdentity()
        #gluPerspective(45.0, 640/480.0, 0.1, 100.0)
        glOrtho(-0.5, 0.5, -0.5, 0.5, -5, 5)
        #glTranslatef(0.0, 0.0, -3.0)
        #glRotatef(25, 1, 0, 0)
        glMatrixMode(GL_MODELVIEW)
        glLoadIdentity()
        glEnable(GL_TEXTURE_2D)
        glShadeModel(GL_FLAT)
        glClearColor(0.0, 0.0, 0.0, 0.0)
        glClearDepth(1.0)
        glEnable(GL_DEPTH_TEST)
        glDepthFunc(GL_LEQUAL)
        glHint(GL_PERSPECTIVE_CORRECTION_HINT, GL_NICEST)
        self.textures = glGenTextures(2)
        self.threeDmode = MODE_3D_OFF
That's the init in inputAndDisplay.py.
The main file is Core.py, which calls it and does a lot of other stuff as well. That's why I didn't put all the code here; there is a lot of it and not all of it is important.
Here is part of Core.py:
input_and_display = InputAndDisplay()
input_and_display.init(self.fullscreen)
rc = RemoteControl()
rc.init()
media = Media()
if 1: # !
    media.config = {
        'buffering': state.mediaplayer_buffering,
        'contrast': state.mediaplayer_contrast,
        'saturation': state.mediaplayer_saturation,
    }
media.init()
browser = Browser()
browser.init()
Thanks in advance for the help
It sounds like you're missing the VIDEORESIZE event to detect resizing.
(See the related events VIDEOEXPOSE and ACTIVEEVENT at http://www.pygame.org/docs/ref/event.html)
http://www.pygame.org/docs/ref/display.html#pygame.display.set_mode
"When the display mode is set, several events are placed on the pygame event queue. pygame.QUIT is sent when the user has requested the program to shutdown. The window will receive pygame.ACTIVEEVENT events as the display gains and loses input focus. If the display is set with the pygame.RESIZABLE flag, pygame.VIDEORESIZE events will be sent when the user adjusts the window dimensions. Hardware displays that draw direct to the screen will get pygame.VIDEOEXPOSE events when portions of the window must be redrawn."
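A minimal sketch (not from the original player) of picking up that event in the existing event loop; since calling pygame.display.set_mode again reportedly crashes this player, the example only reacts by updating the GL viewport to the new window size (assuming the OpenGL bindings already imported in inputAndDisplay.py):

for event in pygame.event.get():
    if event.type == pygame.VIDEORESIZE:
        # event.w / event.h hold the size the user dragged the window to
        glViewport(0, 0, event.w, event.h)
    elif event.type == pygame.VIDEOEXPOSE:
        # part of the window was uncovered and needs to be redrawn
        pass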
" Then the display mode is set, several events are placed on the pygame event queue. pygame.QUIT is sent when the user has requested the program to shutdown. The window will receive pygame.ACTIVEEVENT events as the display gains and loses input focus. If the display is set with the pygame.RESIZABLE flag, pygame.VIDEORESIZE events will be sent when the user adjusts the window dimensions. Hardware displays that draw direct to the screen will get pygame.VIDEOEXPOSE events when portions of the window must be redrawn. "