Using DX11 and DXVA2 - H.264

I am trying to test decoding an H.264/H.265 video (containing just a single I-frame) using DX11 and DXVA2. This is on Windows 7, so I probably have to interop between two D3D11 devices, one with the 11.1 feature set and the other with 9.3. Since there is a severe lack of samples showing how to load an H.264 file and decode it using DXVA, is there a guide on how to lay out the data to feed into DXVA for decoding? I've read this How do I use Hardware accelerated video/H.264 decoding with directx 11 and windows 7? as well as https://msdn.microsoft.com/en-us/library/windows/desktop/hh162912(v=vs.85).aspx, but neither explains how to do the above.
Thanks

If you want a working sample to understand how to feed data into DXVA, look here: MFNode. Under MFTDxva2Decoder you will see how to feed the data. It is for the MPEG-1/2 file format, but the same applies to H.264 (with some nuances, of course).
EDIT
See my answer: How do I use Hardware accelerated video/H.264 decoding with directx 11 and windows 7?
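To give a rough picture of the buffer layout DXVA2 expects for one frame, here is a minimal sketch of submitting a single H.264 frame to an already-created IDirectXVideoDecoder (the D3D9/DXVA2 path). It is not taken from MFNode; decoder and surface creation, bitstream parsing, and the actual values of the DXVA_PicParams_H264 and slice fields are assumed to be handled elsewhere, per the DXVA H.264 specification.

// Sketch: submitting one H.264 frame to an already-created IDirectXVideoDecoder.
// Assumes pDecoder, pSurface (decode render target) and the parsed SPS/PPS/slice
// data are prepared elsewhere; error handling is omitted for brevity.
#include <d3d9.h>
#include <dxva2api.h>
#include <dxva.h>
#include <cstring>

HRESULT DecodeOneFrame(IDirectXVideoDecoder* pDecoder,
                       IDirect3DSurface9* pSurface,
                       const DXVA_PicParams_H264& picParams,
                       const DXVA_Qmatrix_H264& qMatrix,
                       const DXVA_Slice_H264_Short& sliceInfo,
                       const BYTE* pNalData, UINT nalSize)
{
    pDecoder->BeginFrame(pSurface, nullptr);

    void* pBuffer = nullptr;
    UINT bufferSize = 0;

    // 1. Picture parameters buffer (DXVA_PicParams_H264 filled from the SPS/PPS).
    pDecoder->GetBuffer(DXVA2_PictureParametersBufferType, &pBuffer, &bufferSize);
    memcpy(pBuffer, &picParams, sizeof(picParams));
    pDecoder->ReleaseBuffer(DXVA2_PictureParametersBufferType);

    // 2. Inverse quantization matrix buffer.
    pDecoder->GetBuffer(DXVA2_InverseQuantizationMatrixBufferType, &pBuffer, &bufferSize);
    memcpy(pBuffer, &qMatrix, sizeof(qMatrix));
    pDecoder->ReleaseBuffer(DXVA2_InverseQuantizationMatrixBufferType);

    // 3. Bitstream buffer: the slice NAL unit(s) with start codes; the DXVA H.264
    //    spec expects the data to be padded (typically to a 128-byte multiple).
    pDecoder->GetBuffer(DXVA2_BitStreamDateBufferType, &pBuffer, &bufferSize);
    memcpy(pBuffer, pNalData, nalSize);
    pDecoder->ReleaseBuffer(DXVA2_BitStreamDateBufferType);

    // 4. Slice control buffer describing where each slice lives in the bitstream
    //    buffer (short format here; long format may be required depending on the
    //    decoder's ConfigBitstreamRaw setting).
    pDecoder->GetBuffer(DXVA2_SliceControlBufferType, &pBuffer, &bufferSize);
    memcpy(pBuffer, &sliceInfo, sizeof(sliceInfo));
    pDecoder->ReleaseBuffer(DXVA2_SliceControlBufferType);

    // Describe the four compressed buffers and kick off the decode.
    DXVA2_DecodeBufferDesc desc[4] = {};
    desc[0].CompressedBufferType = DXVA2_PictureParametersBufferType;
    desc[0].DataSize             = sizeof(picParams);
    desc[1].CompressedBufferType = DXVA2_InverseQuantizationMatrixBufferType;
    desc[1].DataSize             = sizeof(qMatrix);
    desc[2].CompressedBufferType = DXVA2_BitStreamDateBufferType;
    desc[2].DataSize             = nalSize;
    desc[3].CompressedBufferType = DXVA2_SliceControlBufferType;
    desc[3].DataSize             = sizeof(sliceInfo);

    DXVA2_DecodeExecuteParams exec = {};
    exec.NumCompBuffers     = 4;
    exec.pCompressedBuffers = desc;

    pDecoder->Execute(&exec);
    return pDecoder->EndFrame(nullptr);
}

The decoded picture ends up in pSurface, which on Windows 7 you would then share or copy into your D3D11 pipeline.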


Tiff Output is not as expected for Black and white 1200dpi LZW test file created using Universal Document Converter 6.7 & 6.8 versions

Respected Sir/Madam,
I have a question regarding LZW black-and-white 1200 dpi TIFF file creation using the UDC driver version 6.7/6.8.
If we disable 'Perform High-Quality Smoothing', the output data is not visible in the output files.
If we enable this option, it works fine.
It also works fine with UDC driver 6.4, whether 'Perform High-Quality Smoothing' is enabled or disabled.
We are using the following TIFF library version in our software:
/* Version number of package */
#define VERSION "4.0.3"
Could you please clarify the following doubts:
Should 'Perform High-Quality Smoothing' always be enabled for LZW output?
Was this issue introduced in UDC driver version 6.7?
https://www.print-driver.com/overview/version-history
Best Regards,
Shantala R
Please contact the developers of the software regarding their product.
https://www.print-driver.com/support

Transcoding Audio/Video/Image file in Android Device

I am working on a chat application like WhatsApp. I want to transcode media files before uploading them to the server. I have gone through many links but am not able to decide which method I should use. Is there a straightforward way to transcode on Android?
FFMPEG: I found it is a highly CPU-intensive process and will consume more battery power.
MediaCodec: I want to do the transcoding using MediaCodec but am not able to find proper steps to understand the process.
Best link to give an idea about transcoding
Library to transcode using MediaCodec (it has many bugs)
We used both implementations in our video editing app. Basically we use the MediaCodec implementation if the Android version is >= 4.3 and FFMPEG otherwise.
The problems with using FFMPEG:
As you said, it is a CPU-intensive process and thus consumes more battery.
The x264 encoder is licensed under the GPL, so you might want to use the OpenH264 encoder instead, which only supports Baseline Profile, so video quality is not the best.
Since it uses a software encoder, processing speed is relatively slow, at least compared to the MediaCodec implementation.
MediaCodec also has some cons, for example:
If you want to do transcoding, the Android version needs to be >= 4.3, unless you want to deal with color format conversion yourself, which is a complete mess, since each vendor may have its own color format implementation. (Since 4.3, MediaCodec supports encoding from an input surface.)
The hardware encoder may behave differently on different models. (For example, some encoders may produce B-frames, which are not yet supported by Android's MediaMuxer, so you may want to use FFmpeg for the muxing part.)
So I would say: if you only support newer Android versions, use MediaCodec; but if you want to be safe (it is easier to write code that works on all devices) and don't really mind the performance, use FFMPEG with OpenH264.
Android's MediaCodec is a relatively better way to transcode on the client since it uses its own low-level buffer processing, but it doesn't provide the elaborate tweaking freedom that FFmpeg does.
As for MediaCodec itself, it is also CPU intensive for holding and processing the buffers, but considerably less so than FFmpeg.
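For reference, here is a minimal sketch of the MediaCodec encoder setup and submit/drain loop, written against the NDK's AMediaCodec C API (available from API 21; the Java android.media.MediaCodec calls discussed above are analogous). Raw-frame color conversion and muxing (MediaMuxer or FFmpeg) are assumed to happen elsewhere.

// Sketch: H.264 encoder via the NDK AMediaCodec C API (API 21+).
#include <media/NdkMediaCodec.h>
#include <media/NdkMediaFormat.h>
#include <cstdio>
#include <cstring>
#include <cstdint>

AMediaCodec* CreateH264Encoder(int width, int height)
{
    AMediaFormat* format = AMediaFormat_new();
    AMediaFormat_setString(format, AMEDIAFORMAT_KEY_MIME, "video/avc");
    AMediaFormat_setInt32(format, AMEDIAFORMAT_KEY_WIDTH, width);
    AMediaFormat_setInt32(format, AMEDIAFORMAT_KEY_HEIGHT, height);
    AMediaFormat_setInt32(format, AMEDIAFORMAT_KEY_BIT_RATE, 2000000);
    AMediaFormat_setInt32(format, AMEDIAFORMAT_KEY_FRAME_RATE, 30);
    AMediaFormat_setInt32(format, AMEDIAFORMAT_KEY_I_FRAME_INTERVAL, 1);
    // 21 == COLOR_FormatYUV420SemiPlanar; with byte-buffer input you must match
    // whatever color format the vendor's encoder actually accepts.
    AMediaFormat_setInt32(format, AMEDIAFORMAT_KEY_COLOR_FORMAT, 21);

    AMediaCodec* codec = AMediaCodec_createEncoderByType("video/avc");
    AMediaCodec_configure(codec, format, nullptr, nullptr,
                          AMEDIACODEC_CONFIGURE_FLAG_ENCODE);
    AMediaCodec_start(codec);
    AMediaFormat_delete(format);
    return codec;
}

// Feed one raw frame and drain any encoded output that is ready.
void EncodeFrame(AMediaCodec* codec, const uint8_t* yuv, size_t yuvSize,
                 int64_t ptsUs, FILE* out)
{
    ssize_t inIdx = AMediaCodec_dequeueInputBuffer(codec, 10000 /*us*/);
    if (inIdx >= 0) {
        size_t capacity = 0;
        uint8_t* in = AMediaCodec_getInputBuffer(codec, inIdx, &capacity);
        memcpy(in, yuv, yuvSize);
        AMediaCodec_queueInputBuffer(codec, inIdx, 0, yuvSize, ptsUs, 0);
    }

    AMediaCodecBufferInfo info;
    ssize_t outIdx = AMediaCodec_dequeueOutputBuffer(codec, &info, 0);
    while (outIdx >= 0) {
        size_t capacity = 0;
        uint8_t* data = AMediaCodec_getOutputBuffer(codec, outIdx, &capacity);
        fwrite(data + info.offset, 1, info.size, out);   // or hand to a muxer
        AMediaCodec_releaseOutputBuffer(codec, outIdx, false);
        outIdx = AMediaCodec_dequeueOutputBuffer(codec, &info, 0);
    }
}

A real transcoder would pair this with a MediaCodec decoder (ideally decoding straight to the encoder's input surface on 4.3+) instead of hand-filled YUV buffers.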

How to apply the OpenGL function

I have a problem applying some OpenGL functions (e.g. glDeleteBuffers).
My computer's specs are:
Renderer: AMD Radeon HD 6800 Series
Operating system: Windows 7
Intel(R) Core(TM) i7-2600
I used OpenGL Extensions Viewer 4.4.3 to view information about OpenGL.
I updated the graphics card driver to the latest version and found that the OpenGL version is 4.4, as shown in the screenshot.
I am not sure what more I can do from here. I would like to use functions like glDeleteBuffers, glGenBuffers, glBindBuffer, glBufferData...
Can you give me some help?
The functions you've listed aren't loaded by default, even on systems whose hardware supports modern OpenGL. To get access to these functions you need to query for them and load them if available.
A guide for querying and loading the functions yourself can be found here.
If you simply want to load the functions associated with an OpenGL version (such as 4.1/3.1, etc.), you can use something like GLEW to handle the querying and loading for you, as in the sketch below.
If you want to use older functionality too, make sure to look at loading a compatibility profile to get the deprecated functionality, i.e. if you are using the fixed-function pipeline. Not that I'd advise it!
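For illustration, a minimal sketch of the GLEW route, assuming a valid OpenGL context has already been created by your windowing code (GLUT, GLFW, Win32, etc.):

// Sketch: loading modern OpenGL entry points with GLEW, then using the
// buffer-object functions from the question.
#include <GL/glew.h>
#include <cstdio>

bool InitBuffersDemo()
{
    // Must be called once after the context is current; it resolves
    // glGenBuffers, glBindBuffer, glBufferData, glDeleteBuffers, etc.
    if (glewInit() != GLEW_OK) {
        std::fprintf(stderr, "glewInit failed\n");
        return false;
    }

    const float vertices[] = { -0.5f, -0.5f, 0.5f, -0.5f, 0.0f, 0.5f };

    GLuint vbo = 0;
    glGenBuffers(1, &vbo);                       // now a valid function pointer
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, sizeof(vertices), vertices, GL_STATIC_DRAW);

    // ... draw calls would go here ...

    glDeleteBuffers(1, &vbo);
    return true;
}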

How to use Hardware MFT for encoding audio from PCM to AMR in windows phone 8.1?

I need to encode my raw PCM audio stream to AMR, but I haven't found much about that. I need to know whether it is possible to use a hardware MFT for encoding and decoding the audio stream. If yes, how? Any ideas, please.
Or is there any other way to encode an audio stream from raw PCM to the AMR codec?
I posted this question on the Microsoft forum and got the answer: Microsoft doesn't ship a built-in AMR codec, so to support AMR on Windows Phone we need to use a third-party codec library.
The response was:
Unfortunately it doesn't appear that we supply an AMR encoder in box. Because of this you will need to find a 3rd party encoder or use the in box low latency mp4 codec that we do provide.
and then also:
Let me try and clarify. We have limited AMR, 3GP and h.263 support for Windows Phone Silverlight apps. Encoding support for these codecs is not exposed in Runtime apps since we do not provide a default encoding profile implementation. These codecs should only be exposed in Silverlight. There doesn't appear to be a runtime activateable class associated with these codecs so there is no reflection into Media Foundation.
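As an illustration of the suggested fallback (the in-box MP4/AAC path rather than AMR), here is a rough C++/CX sketch using Windows.Media.Transcoding.MediaTranscoder on Windows Phone 8.1. It assumes the raw PCM has first been written to a WAV StorageFile; the file variables and function name are placeholders.

// Sketch: PCM (WAV file) -> AAC in an M4A container on Windows Phone 8.1.
// srcFile/dstFile are StorageFile objects obtained elsewhere (placeholders).
#include <ppltasks.h>
using namespace concurrency;
using namespace Windows::Media::MediaProperties;
using namespace Windows::Media::Transcoding;
using namespace Windows::Storage;

void TranscodeWavToM4a(StorageFile^ srcFile, StorageFile^ dstFile)
{
    // AAC-LC in an MPEG-4 audio container; AMR itself has no in-box encoder.
    MediaEncodingProfile^ profile =
        MediaEncodingProfile::CreateM4a(AudioEncodingQuality::Medium);

    auto transcoder = ref new MediaTranscoder();

    create_task(transcoder->PrepareFileTranscodeAsync(srcFile, dstFile, profile))
        .then([](PrepareTranscodeResult^ prep) -> task<void>
    {
        if (prep->CanTranscode)
        {
            // Runs the actual PCM -> AAC conversion asynchronously.
            return create_task(prep->TranscodeAsync());
        }
        // prep->FailureReason tells you why (e.g. unsupported source format).
        return create_task([]{});
    });
}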

Render Image(other than *.dds) on to the "DrawingSurface" using DirectX3D in Windows Phone 8

How can we render images, other than *.dds, on to the "DrawingSurface" using DirectX3D in Windows Phone 8?
"CreateXXXTextureFromFile" (where XXX is DDS or WIC) is available but
WIC is not supported for Windows Phone 8.
Any help will be highly appreciated.
WIC is in fact not supported on WP8. I'm not a DirectX expert (far from it), but as far as I understand you have two options:
Change your app to a mixed XAML+D3D app and use XAML to overlay images on top of your app. Obviously that has significant performance implications due to the additional intermediary surface required by the GPU.
Convert your images to a format that doesn't require WIC before compile time. The Texconv tool that ships with the DirectXTex project should be able to do that: http://directxtex.codeplex.com/wikipage?title=Texconv&referringTitle=Documentation
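For example, once an image has been converted to .dds offline with Texconv, it can be loaded at run time without WIC using the DDSTextureLoader that accompanies DirectXTex/DirectXTK. The file name and device in this sketch are placeholders.

// Sketch: loading a pre-converted .dds (no WIC needed) on Windows Phone 8
// using DDSTextureLoader from DirectXTex / DirectXTK.
#include <d3d11.h>
#include <wrl/client.h>
#include "DDSTextureLoader.h"

using Microsoft::WRL::ComPtr;

HRESULT LoadSprite(ID3D11Device* device,
                   ComPtr<ID3D11ShaderResourceView>& srv)
{
    ComPtr<ID3D11Resource> texture;
    // "Assets\\image.dds" is a placeholder produced offline with something like:
    //   texconv -f BC3_UNORM image.png
    return DirectX::CreateDDSTextureFromFile(
        device, L"Assets\\image.dds",
        texture.GetAddressOf(), srv.GetAddressOf());
}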