Access libstagefright.so directly to decode H.264 stream from JNI layer in Android - h.264

Is there a way to access libstagefright.so directly to decode H.264 stream from JNI layer on Android 2.3 or above?

If your objective is to decode an elementary H.264 stream, your code will have to extract the stream, provide the codec-specific data (primarily the SPS and PPS) to the codec, and then feed it the frame data along with timestamps. Across all Android versions, the most common interface is OMXCodec, which is an abstraction over an underlying OMX component.
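For an elementary (Annex B) stream, the SPS and PPS can be located by their NAL unit type before the remaining frame data is queued. The sketch below is only illustrative (the function names are made up for this example); it scans for 3-byte start codes, which also match the tail of the 4-byte form.

// Sketch: locating NAL units in an Annex B stream and classifying SPS (type 7)
// and PPS (type 8), which form the codec-specific data mentioned above.
#include <cstddef>
#include <cstdint>

enum NalKind { NAL_SPS, NAL_PPS, NAL_OTHER };

inline NalKind classifyNal(const uint8_t* nal) {
    // The byte right after the start code is the NAL header; its low 5 bits
    // carry nal_unit_type.
    switch (nal[0] & 0x1F) {
        case 7:  return NAL_SPS;
        case 8:  return NAL_PPS;
        default: return NAL_OTHER;
    }
}

// Returns the offset of the first byte after the next 00 00 01 start code at or
// beyond 'pos', or 'size' if no further start code exists.
inline size_t nextNalUnit(const uint8_t* data, size_t size, size_t pos) {
    for (; pos + 3 <= size; ++pos) {
        if (data[pos] == 0 && data[pos + 1] == 0 && data[pos + 2] == 1) {
            return pos + 3;
        }
    }
    return size;
}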
In Gingerbread (Android 2.3) and ICS (Android 4.0), the best method is to create an OMXCodec component and abstract your input behind a MediaSource interface, i.e. your wrapper code is modeled as a MediaSource from which OMXCodec reads and performs the decoding.
Link to Android 2.3 Video decoder creation: http://androidxref.com/2.3.6/xref/frameworks/base/media/libstagefright/AwesomePlayer.cpp#1094
Link to Android 4.0.0 Video decoder creation: http://androidxref.com/4.0.4/xref/frameworks/base/media/libstagefright/AwesomePlayer.cpp#1474
The main challenges would be the following:
Model the input as a MediaSource (a rough sketch follows below).
Write wrapper code that reads the decoded buffers from the codec, handles them, and releases them back to the codec.
For simplification, you could look at the stagefright command-line executable code, as in http://androidxref.com/4.0.4/xref/frameworks/base/cmds/stagefright/stagefright.cpp#233
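The sketch below shows roughly what such a wrapper MediaSource can look like on these releases. It relies on the private libstagefright headers, so the class names and metadata keys should be checked against the AOSP sources of the release you build against; fillOneAccessUnit() and the buffer size are placeholders invented for this example.

// Sketch of a MediaSource wrapper that feeds H.264 access units to OMXCodec.
// Private AOSP API - verify the names against the matching platform sources.
#include <media/stagefright/MediaBuffer.h>
#include <media/stagefright/MediaDefs.h>
#include <media/stagefright/MediaSource.h>
#include <media/stagefright/MetaData.h>

using namespace android;

struct StreamSource : public MediaSource {
    StreamSource(int32_t width, int32_t height) {
        mFormat = new MetaData;
        mFormat->setCString(kKeyMIMEType, MEDIA_MIMETYPE_VIDEO_AVC);
        mFormat->setInt32(kKeyWidth, width);
        mFormat->setInt32(kKeyHeight, height);
        // Depending on the decoder, the SPS/PPS may also have to be provided
        // here (e.g. via kKeyAVCC) or as the first buffers returned by read().
    }

    virtual status_t start(MetaData* /*params*/) { return OK; }
    virtual status_t stop() { return OK; }
    virtual sp<MetaData> getFormat() { return mFormat; }

    // OMXCodec pulls compressed input by calling read(); return one access unit
    // per call, with its timestamp (in microseconds) set under kKeyTime.
    virtual status_t read(MediaBuffer** out, const ReadOptions* /*options*/) {
        MediaBuffer* buffer = new MediaBuffer(kMaxAccessUnitSize);
        int64_t timeUs = 0;
        size_t size = fillOneAccessUnit(buffer->data(), &timeUs); // hypothetical helper
        buffer->set_range(0, size);
        buffer->meta_data()->setInt64(kKeyTime, timeUs);
        *out = buffer;
        return OK;
    }

private:
    static const size_t kMaxAccessUnitSize = 256 * 1024;       // arbitrary for this sketch
    size_t fillOneAccessUnit(void* dst, int64_t* timeUs);       // assumed to exist
    sp<MetaData> mFormat;
};

The decoder created from this source (via OMXCodec::Create(), as in the AwesomePlayer links above) is then driven by the second challenge: calling read() on it, consuming each decoded MediaBuffer, and releasing it back.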
However, if your program targets JellyBean (Android 4.1.x, 4.2.x) or later, things are somewhat simpler. From your JNI code, you can create a MediaCodec component and employ it for decoding. To integrate it into your program, you could refer to the SimplePlayer implementation as in http://androidxref.com/4.2.2_r1/xref/frameworks/av/cmds/stagefright/SimplePlayer.cpp#316
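Note that the MediaCodec class SimplePlayer uses is still a private framework class. If you can raise your minimum to API 21 (Lollipop), the NDK exposes equivalent functionality publicly as AMediaCodec, which is generally the safer route from JNI code. The snippet below is only a minimal sketch under that assumption: the caller is expected to supply access units (and SPS/PPS) from its own stream extraction, and the width/height values are placeholders.

// Minimal AMediaCodec (NDK, API 21+) H.264 decode loop - a sketch, not a full player.
#include <media/NdkMediaCodec.h>
#include <media/NdkMediaFormat.h>
#include <android/native_window.h>
#include <cstring>

AMediaCodec* createAvcDecoder(int32_t width, int32_t height, ANativeWindow* surface) {
    AMediaFormat* fmt = AMediaFormat_new();
    AMediaFormat_setString(fmt, AMEDIAFORMAT_KEY_MIME, "video/avc");
    AMediaFormat_setInt32(fmt, AMEDIAFORMAT_KEY_WIDTH, width);
    AMediaFormat_setInt32(fmt, AMEDIAFORMAT_KEY_HEIGHT, height);
    // SPS/PPS can alternatively be queued as the first input buffer with the
    // AMEDIACODEC_BUFFER_FLAG_CODEC_CONFIG flag.

    AMediaCodec* codec = AMediaCodec_createDecoderByType("video/avc");
    AMediaCodec_configure(codec, fmt, surface, nullptr /*crypto*/, 0 /*flags*/);
    AMediaCodec_start(codec);
    AMediaFormat_delete(fmt);
    return codec;
}

bool decodeOneAccessUnit(AMediaCodec* codec, const uint8_t* au, size_t auSize, int64_t ptsUs) {
    // Feed one compressed access unit.
    ssize_t inIdx = AMediaCodec_dequeueInputBuffer(codec, 10000 /*us*/);
    if (inIdx < 0) return false;                       // no input buffer free; caller retries
    size_t capacity = 0;
    uint8_t* in = AMediaCodec_getInputBuffer(codec, inIdx, &capacity);
    if (in == nullptr || auSize > capacity) return false;
    memcpy(in, au, auSize);
    AMediaCodec_queueInputBuffer(codec, inIdx, 0, auSize, ptsUs, 0);

    // Drain whatever decoded output is available.
    AMediaCodecBufferInfo info;
    ssize_t outIdx = AMediaCodec_dequeueOutputBuffer(codec, &info, 0);
    while (outIdx >= 0) {
        // Rendered to the surface passed to configure(); without a surface the
        // pixels could be read here via AMediaCodec_getOutputBuffer().
        AMediaCodec_releaseOutputBuffer(codec, outIdx, true /*render*/);
        outIdx = AMediaCodec_dequeueOutputBuffer(codec, &info, 0);
    }
    return true;
}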

Related

How to Store a BluetoothDevice Object to Shared Preferences in my Flutter App

I am using the flutter_blue package to get the Bluetooth device. After connecting, I am trying to store the device locally so that the device info can be seen even if the Bluetooth device is inactive.
The problem I am finding is that I can't seem to save the BluetoothDevice object along with its device info, since shared preferences doesn't support the BluetoothDevice object type and the package doesn't provide toJson or fromJson methods. I have heard some people say to serialize it, but I am still confused. Could someone please help me, since I don't understand what to do.
The BluetoothDevice object does not look like information you'd usually cache, since the identifiable features of a BLE device are usually masked by various privacy settings.
Still, if you want to store it in SharedPreferences, try making a map containing the fields from the BluetoothDevice class and use SharedPreferences to store that. I seem to remember that the serialization from a map to a JSON string is done internally by the library.
I personally use extension methods to extend the functionality of classes defined by third-party libraries. You can use an extension to define the toJson and fromJson methods that you mention.

MediaStream to C++ data type conversion

I am sending a video and audio stream from Google Chrome to C++, and I am not sure how to cast the data.
What is the data type of the following, and how do I convert it to a C++ type?
videoStream.getVideoTracks()[0]
I want to build an FFmpeg encoder in C++, but I cannot figure out how to cast this type.
I was also wondering whether there is a way to build a test case that mimics this data type.

Use Media Foundation H.264 encoder with live555

I want to create an H.264 RTSP stream using the live555 streaming library. For encoding the video frames, I want to use the H.264 encoder MFT. Encoding works using the basic processing model (I do not build a graph, but call the MFT manually). Streaming using a custom FramedSource also seems to work, in the sense that the program does not crash and the stream is stable in VLC player. However, the image is garbled - no colour, weird line patterns, etc.
I assume that I am passing the wrong data from the encoder into the streaming library, but I have not been able to find out what the library actually expects. I have read that the Microsoft H.264 encoder outputs more than one NAL in a sample. I further found that live555 requires a single NAL to be returned in doGetNextFrame. Therefore, I try to identify the individual NALs (What does this H264 NAL Header Mean? states that the start code can be 3 or 4 bytes - I do not know where to find out which one MF uses, but the memory view in the debugger suggests 4 bytes):
static const unsigned char startCode[] = { 0, 0, 0, 1 };
for (DWORD i = 0; i + sizeof(startCode) <= sampleLen; ++i) {
    // memcmp avoids an unaligned read, and the loop bound avoids reading
    // past the end of the sample.
    if (memcmp(sampleData + i, startCode, sizeof(startCode)) == 0) {
        nals.push_back(sampleData + i);   // position of the 4-byte start code
    }
}
This piece of code usually identifies more than one item in one output sample from the MFT. However, if I copy the ranges found by this loop into the fTo output buffer, VLC does not show anything and stops after a few seconds. I also read somewhere that live555 does not want the magic number 0x00000001, so I tried to skip it. The effect on the client side is the same.
Is there any documentation on what live555 expects me to copy into the output buffer?
Does the H.264 encoder in Media Foundation at all produce output samples which I can use for streaming?
Do I need to split the output samples? How much do I need to skip once I have found a magic number (How to write a Live555 FramedSource to allow me to stream H.264 live suggests that I might need to skip more than the magic number, because the accepted answer only passes the payload part of the NAL)?
Is there any way to test whether the samples returned by the H.264 MFT in basic processing mode form a valid H.264 stream?
Here's how I did it: MFWebCamRtp.
I was able to stream my webcam feed and view it in VLC. There was no need to dig into NALs or anything like that: each IMFSample from the Media Foundation H.264 encoder contains a single NAL unit that can be passed straight to live555.
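For reference, the live555 side can be as small as the sketch below: a hypothetical FramedSource subclass that copies one encoded unit into live555's buffer per doGetNextFrame() call. The getNextEncodedNal() helper is an assumption standing in for the MFT output path; note that if this source feeds an H264VideoStreamDiscreteFramer, the data handed over must not include the Annex B start code.

// Sketch of a live555 FramedSource delivering one encoded NAL unit per call.
#include "FramedSource.hh"
#include "GroupsockHelper.hh"   // for gettimeofday() on all platforms
#include <cstring>

class EncoderSource : public FramedSource {
public:
    static EncoderSource* createNew(UsageEnvironment& env) {
        return new EncoderSource(env);
    }

protected:
    EncoderSource(UsageEnvironment& env) : FramedSource(env) {}

    virtual void doGetNextFrame() {
        const unsigned char* nal = nullptr;
        unsigned nalSize = 0;
        getNextEncodedNal(&nal, &nalSize);   // hypothetical wrapper around the MFT output

        // Truncate if the downstream buffer is too small; live555 reports this upstream.
        if (nalSize > fMaxSize) {
            fNumTruncatedBytes = nalSize - fMaxSize;
            fFrameSize = fMaxSize;
        } else {
            fNumTruncatedBytes = 0;
            fFrameSize = nalSize;
        }
        memcpy(fTo, nal, fFrameSize);

        gettimeofday(&fPresentationTime, NULL);  // ideally derived from the IMFSample timestamp
        FramedSource::afterGetting(this);        // tell live555 this frame is ready
    }

private:
    void getNextEncodedNal(const unsigned char** nal, unsigned* size); // assumed to exist
};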

Is there a way to parse JSON data returned from a web service in ActionScript without using as3corelib

I am working on a method that needs to parse a JSON response. I looked around, and people use JSON.decode() from as3corelib. But I was wondering whether there is another way to do this without having to install any other libraries like as3corelib - for example, a native JSON method that comes with the ActionScript libraries.
Take a look at the top level JSON class, available since Flash Player 11, AIR 3.0.
The JSON class lets applications import and export data using JavaScript Object Notation (JSON) format.
Why do you have a problem with using an external library?
There is no installation required; you just download it.

How to decode RTP/MP4A-LATM audio payload

I am working on an implementation of RTSP in J2ME to connect to Wowza. I have the RTSP part working, as well as the extraction of RTP packets. I am able to decode and display the H.264 video stream.
I am having problems understanding how to create an appropriate audio stream to pass to a J2ME Player object.
As part of the RTSP SETUP exchange, I get the following information from the SDP:
m=audio 0 RTP/AVP 96
a=rtpmap:96 MP4A-LATM/24000/1
a=fmtp:96 profile-level-id=15;object=2;cpresent=0;config=400026103FC0
a=control:trackID=1
From this I know that I can expect RTP packets containing MP4A-LATM format audio and, most importantly, that the mux config data is not present in-line in the stream. The mux config data is 400026103FC0.
I just don't know how to interpret the config string, or how I might configure a J2ME Player with it.
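For reference, the config value is a StreamMuxConfig element as defined in ISO/IEC 14496-3, so interpreting it is a matter of reading it bit by bit; the same layout applies regardless of language. The sketch below (C++ for brevity, but straightforward to port to J2ME) reads the fields that a config like 400026103FC0 appears to carry - an AAC-LC, 24 kHz, mono AudioSpecificConfig plus the LATM framing fields - assuming audioMuxVersion is 0 and there is a single program and layer.

// Sketch: reading the decoder-relevant fields of an RTP MP4A-LATM 'config' string
// per the StreamMuxConfig syntax of ISO/IEC 14496-3. Assumes audioMuxVersion == 0
// and a single program/layer.
#include <cstdint>
#include <cstdio>

struct BitReader {
    const uint8_t* data;
    size_t bitPos;
    uint32_t read(unsigned n) {                   // read n bits, MSB first
        uint32_t v = 0;
        while (n--) {
            v = (v << 1) | ((data[bitPos >> 3] >> (7 - (bitPos & 7))) & 1);
            ++bitPos;
        }
        return v;
    }
};

int main() {
    const uint8_t cfg[] = { 0x40, 0x00, 0x26, 0x10, 0x3F, 0xC0 }; // config=400026103FC0
    BitReader br = { cfg, 0 };

    br.read(1);                                   // audioMuxVersion (expect 0)
    br.read(1);                                   // allStreamsSameTimeFraming
    br.read(6);                                   // numSubFrames
    br.read(4);                                   // numProgram (expect 0: one program)
    br.read(3);                                   // numLayer   (expect 0: one layer)

    // AudioSpecificConfig of that single layer.
    uint32_t audioObjectType = br.read(5);        // 2 -> AAC LC
    uint32_t samplingFrequencyIndex = br.read(4); // 6 -> 24000 Hz
    uint32_t channelConfiguration = br.read(4);   // 1 -> mono
    br.read(1);                                   // frameLengthFlag (0 -> 1024-sample frames)
    br.read(1);                                   // dependsOnCoreCoder
    br.read(1);                                   // extensionFlag

    uint32_t frameLengthType = br.read(3);        // 0 -> payload lengths signalled per frame

    printf("AOT=%u freqIndex=%u channels=%u frameLengthType=%u\n",
           audioObjectType, samplingFrequencyIndex, channelConfiguration, frameLengthType);
    return 0;
}

Those AudioSpecificConfig fields (object type, sampling frequency, channel configuration) are what an AAC decoder ultimately needs; whether and how a given J2ME Player can be configured with them is a separate question.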