Why did H.264, MPEG-4 HE AAC stop working on iPhone XS / XS Max?

Issue regarding NEW hardware
I have been investigating like crazy and haven't found any hints as to why my H.264-encoded videos have stopped working on these new devices.
Context: the original video is uploaded directly from the iOS device to S3, and AWS Elastic Transcoder then encodes the original into a more compressed H.264 preset. As of yesterday, a coworker was reporting all videos being "black"; now that deliveries to these devices are being fulfilled, I've gotten confirmation. I cannot reproduce this issue on the simulator. The EXIF data of the encoded videos that fail to play is listed below.
If anyone out there has domain expertise in codecs, can you weigh in on why a new device would fail to decode this H.264 video when the iPhone X and earlier devices have no problem with it?
➜ Downloads exiftool 30B3485D-24A3-4B6D-8B27-15B7C11FB864.mp4
ExifTool Version Number : 11.10
File Name : 30B3485D-24A3-4B6D-8B27-15B7C11FB864.mp4
Directory : .
File Size : 202 kB
File Modification Date/Time : 2018:09:24 20:35:47-07:00
File Access Date/Time : 2018:09:24 20:36:02-07:00
File Inode Change Date/Time : 2018:09:24 20:35:53-07:00
File Permissions : rw-r--r--
File Type : MP4
File Type Extension : mp4
MIME Type : video/mp4
Major Brand : MP4 Base Media v1 [IS0 14496-12:2003]
Minor Version : 0.2.0
Compatible Brands : isom, iso2, avc1, mp41
Movie Header Version : 0
Create Date : 0000:00:00 00:00:00
Modify Date : 0000:00:00 00:00:00
Time Scale : 1000
Duration : 4.12 s
Preferred Rate : 1
Preferred Volume : 100.00%
Preview Time : 0 s
Preview Duration : 0 s
Poster Time : 0 s
Selection Time : 0 s
Selection Duration : 0 s
Current Time : 0 s
Next Track ID : 3
Track Header Version : 0
Track Create Date : 0000:00:00 00:00:00
Track Modify Date : 0000:00:00 00:00:00
Track ID : 1
Track Duration : 4.12 s
Track Layer : 0
Track Volume : 100.00%
Balance : 0
Audio Format : mp4a
Audio Channels : 2
Audio Bits Per Sample : 16
Audio Sample Rate : 48000
Matrix Structure : 1 0 0 0 1 0 0 0 1
Image Width : 320
Image Height : 568
Media Header Version : 0
Media Create Date : 0000:00:00 00:00:00
Media Modify Date : 0000:00:00 00:00:00
Media Time Scale : 15360
Media Duration : 4.00 s
Media Language Code : und
Handler Description : VideoHandler
Graphics Mode : srcCopy
Op Color : 0 0 0
Compressor ID : avc1
Source Image Width : 320
Source Image Height : 568
X Resolution : 72
Y Resolution : 72
Bit Depth : 24
Pixel Aspect Ratio : 1:1
Video Frame Rate : 30
Handler Type : Metadata
Handler Vendor ID : Apple
Encoder : Lavf57.71.100
Movie Data Size : 202178
Movie Data Offset : 4545
Avg Bitrate : 393 kbps
Image Size : 320x568
Megapixels : 0.182
Rotation : 0

This bug resolved itself for me in the iOS 13 beta release. Apple got back to me and explained that the H.264 header said my video was level 4.0, but the first H.264 frame said it was 3.1, and iOS 12 will not allow that mismatch.
I was able to fix this in code by specifying the header level as 3.1.
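For reference, a minimal sketch of pinning the H.264 level in AVAssetWriter output settings so the container header and the encoded frames agree. Assumptions: an AVAssetWriter pipeline and the 320x568 output size from the EXIF dump above; this is not the poster's actual code.

import AVFoundation

// Pin the H.264 profile/level explicitly instead of letting the encoder
// pick one, so the header level matches the encoded frames (3.1 here).
let videoSettings: [String: Any] = [
    AVVideoCodecKey: AVVideoCodecType.h264,
    AVVideoWidthKey: 320,
    AVVideoHeightKey: 568,
    AVVideoCompressionPropertiesKey: [
        AVVideoProfileLevelKey: AVVideoProfileLevelH264Baseline31
    ]
]
let writerInput = AVAssetWriterInput(mediaType: .video, outputSettings: videoSettings)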

I had a similar problem with black video on the iPhone XS Max, and it turned out that I was setting the keys kCVPixelBufferCGImageCompatibilityKey and kCVPixelBufferCGBitmapContextCompatibilityKey to YES in the sourcePixelBufferAttributes dictionary when creating the AVAssetWriterInputPixelBufferAdaptor. Removing those two keys from the dictionary seems to have fixed the problem.
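A sketch of the working configuration (assuming a 32BGRA pipeline at the question's 320x568 size; this is illustrative, not the answerer's code), with the two compatibility keys deliberately omitted:

import AVFoundation
import CoreVideo

let writerInput = AVAssetWriterInput(
    mediaType: .video,
    outputSettings: [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: 320,
        AVVideoHeightKey: 568
    ]
)
// kCVPixelBufferCGImageCompatibilityKey and
// kCVPixelBufferCGBitmapContextCompatibilityKey are intentionally left out.
let adaptor = AVAssetWriterInputPixelBufferAdaptor(
    assetWriterInput: writerInput,
    sourcePixelBufferAttributes: [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA,
        kCVPixelBufferWidthKey as String: 320,
        kCVPixelBufferHeightKey as String: 568
    ]
)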

I've figured out the reason. The iPhone XS and newer support all of the H.264 resolutions and frame rates; however, certain combinations require HEVC:
1080p @ 240 fps
4K @ 60 fps
Thus, if you don't configure captureSession.sessionPreset to a lower resolution:
if isFullHDVideoEnabled && captureSession.canSetSessionPreset(AVCaptureSession.Preset.hd1920x1080) {
    captureSession.sessionPreset = AVCaptureSession.Preset.hd1920x1080
} else {
    captureSession.sessionPreset = AVCaptureSession.Preset.hd1280x720
}
the iPhone will capture the video in H.265, and .hevc will be the only option in movieFileOutput.availableVideoCodecTypes. With a lower preset, you can explicitly request H.264:
if #available(iOS 11.0, *) {
    if movieFileOutput.availableVideoCodecTypes.contains(.h264) {
        movieFileOutput.setOutputSettings([AVVideoCodecKey: AVVideoCodecType.h264], for: connection)
    } else if movieFileOutput.availableVideoCodecTypes.contains(.hevc) {
        movieFileOutput.setOutputSettings([AVVideoCodecKey: AVVideoCodecType.hevc], for: connection)
    }
}

Related

Is it possible to get H.264's Picture Parameter Set (PPS) from Android encoder?

I am encoding bitmaps to H.264. Please allow me to skip the code here, because other places such as this one have excellent descriptions and it would take up a lot of space. The main idea is configuring MediaCodec to do the encoding.
The encoding appears to work well. The output frames have the following H.264 NAL unit types:
7 (Sequence parameter set)
5 (Coded slice of an IDR picture)
1 (Coded slice of a non-IDR picture)
1
1
1
...
You can see it generates an SPS but not a PPS. My understanding is that the PPS is needed to produce a valid MP4 file.
Is there a way to obtain the PPS from the encoder?
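A hedged debugging note: with MediaCodec, the SPS and PPS typically arrive concatenated in the single output buffer flagged BUFFER_FLAG_CODEC_CONFIG (for AVC they also surface as csd-0/csd-1 in the output MediaFormat), so a check that only reads the first NAL unit of each buffer will report the SPS and miss the PPS (nal_unit_type 8) sitting right behind it. A minimal Annex B scanner, sketched in Swift purely for illustration; the parsing is identical in any language:

// Return the nal_unit_type of every NAL unit in an Annex B byte stream.
func nalUnitTypes(in stream: [UInt8]) -> [Int] {
    var types: [Int] = []
    var i = 0
    while i + 3 < stream.count {
        // Match a 3-byte (00 00 01) or 4-byte (00 00 00 01) start code.
        if stream[i] == 0 && stream[i + 1] == 0 &&
            (stream[i + 2] == 1 ||
             (stream[i + 2] == 0 && i + 4 < stream.count && stream[i + 3] == 1)) {
            let header = i + (stream[i + 2] == 1 ? 3 : 4)
            types.append(Int(stream[header] & 0x1F)) // low 5 bits = nal_unit_type
            i = header
        } else {
            i += 1
        }
    }
    return types
}

// A codec-config buffer usually holds an SPS (7) and a PPS (8) back to back.
let configBuffer: [UInt8] = [0, 0, 0, 1, 0x67, 0x42, 0x00, 0x1F,
                             0, 0, 0, 1, 0x68, 0xCE, 0x38, 0x80]
print(nalUnitTypes(in: configBuffer)) // [7, 8]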

Reverse engineering the APC UPS serial protocol

I have an APC SMC1000-2UC UPS with a serial port connection. The serial protocol seems to be based on the Microlink protocol, which is not documented. I monitored the communication between the UPS and a PC on which the UPS driver had been installed. I want to send commands to the UPS, such as the shutdown command, from a microcontroller-based device. Some of the information at https://sites.google.com/site/klaasdc/apc-smartups-decode matches what I observed, but the frame checksum calculation and the challenge string calculation don't pass.
The data length of the protocol is set to 32 bytes, so each frame is 35 bytes:
[Msg ID | 32-byte data | 2-byte checksum]
Regarding the challenge frame calculation: the UPS sends frame ID 0x65 and then frame ID 0x68, after which the PC responds with frame ID 0x65 and the UPS sends a confirmation, again with frame ID 0x65. Based on the calculation presented on that site, I think the format or position of Password_1, the header data, and its two bytes has changed because the protocol is configured for 32 data bytes. The following frames are a sample of this challenge:
Header frame: 0x00 0a206903fa27090001004000f802fe04fe0940fc1042fc1044fc20f80416fc10 32a6
UPS : 0x65 ffff00010000a0e80000 c0bbb4e1 000001040000001000000004000000000020 7350
UPS : 0x68 000000000000000000000008004c2943000000000966039a063b675601f30000 864f
PC : 0x65 0a 04 8afb65f1 bdf0
UPS : 0x65 ffff000100000eaf62d8 8afb65f1 000001040000001000000004000000000020 6227
How can I satisfy the challenge and figure out the checksum type? I have tried many checksum algorithms on this data, but none of them come out correct.
It may be a bit late, but have you looked at this:
https://github.com/klaasdc/apcups-serial-test
This looks like somebody got pretty far reverse engineering the MicroLink protocol, including the checksum part. The GitHub repo also contains a link to a web page with a protocol description.
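If the algorithm is still unidentified, a cheap first step is to brute-force the common CRC-16 parameter sets against a captured frame. A sketch (Swift purely for illustration; the frame is the PC's 0x65 response from the capture above, and the Microlink checksum may well not be a plain CRC-16 at all):

import Foundation

// Generic MSB-first CRC-16; reflected variants reverse each input byte
// and the final register.
func crc16(_ data: [UInt8], poly: UInt16, initial: UInt16,
           reflected: Bool, xorOut: UInt16) -> UInt16 {
    func reversed(_ value: UInt16, bits: Int) -> UInt16 {
        var v = value, r: UInt16 = 0
        for _ in 0..<bits { r = (r << 1) | (v & 1); v >>= 1 }
        return r
    }
    var crc = initial
    for byte in data {
        let b = reflected ? reversed(UInt16(byte), bits: 8) : UInt16(byte)
        crc ^= b << 8
        for _ in 0..<8 {
            crc = (crc & 0x8000) != 0 ? (crc << 1) ^ poly : crc << 1
        }
    }
    return (reflected ? reversed(crc, bits: 16) : crc) ^ xorOut
}

// PC frame from the capture above; whether the leading ID byte is included
// in the checksum is itself unknown, so it is worth trying both ways.
let frame: [UInt8] = [0x65, 0x0A, 0x04, 0x8A, 0xFB, 0x65, 0xF1]
let expected: UInt16 = 0xBDF0

let candidates: [(name: String, poly: UInt16, initial: UInt16,
                  reflected: Bool, xorOut: UInt16)] = [
    ("CRC-16/CCITT-FALSE", 0x1021, 0xFFFF, false, 0x0000),
    ("CRC-16/XMODEM",      0x1021, 0x0000, false, 0x0000),
    ("CRC-16/ARC",         0x8005, 0x0000, true,  0x0000),
    ("CRC-16/MODBUS",      0x8005, 0xFFFF, true,  0x0000)
]
for c in candidates {
    let crc = crc16(frame, poly: c.poly, initial: c.initial,
                    reflected: c.reflected, xorOut: c.xorOut)
    print(c.name, String(format: "%04X", crc), crc == expected ? "<- match" : "")
}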

Manually calculate h264 video bitrate

I'm currently trying to manually calculate the bitrate of a .mkv video I want to encode, so that it comes out at a specific file size which I can then use in my batch file.
Size I want the clip to be: 1900 MB
Duration: 2587 seconds
Audio bitrate: 1509 kbps
My current calculation is:
(1900 MB × 1024 KB/MB − 2587 s × (1509/8) KB/s) / 2587 s
(1945600 − (2587 × 188.625)) / 2587 ≈ 563.443 KB/s
563.443 × 8 ≈ 4507.544 kbps
I tried encoding with this bitrate, but the file size doesn't come out to 1900 MB, so I used a bitrate calculator. After entering my settings, it says the video needs a bitrate of 4647 kbps for 1900 MB (I encoded with this bitrate and the result was 1899 MB).
My question is, what did I miss in my calculation?
"kilo" is 1024 for data size, but 1000 for bitrate.
  1992294400 bytes for the whole file [1900 × 1024 × 1024]
−  487972875 bytes for audio [(1509/8) × 1000 × 2587]
= 1504321525 bytes for video [≈ (4652/8) × 1000 × 2587]
video bitrate: 4652 kbps
This result matches the calculator you used more closely than your own, although I can't explain the remaining discrepancy of about 5 kbps. Perhaps the calculator accounts for framing overhead, seek tables, or some other metadata.
I would trust the calculator, since using its value gave you results very close to your goal.
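A quick sketch of the same arithmetic (Swift; the variable names are mine). The only trick is that the file size uses 1024-based units while the bitrates use 1000-based units:

// Video bitrate needed to hit a target file size.
let targetMB = 1900.0      // desired size, 1024-based MB
let duration = 2587.0      // seconds
let audioKbps = 1509.0     // audio bitrate, 1000-based kbps

let totalBytes = targetMB * 1024 * 1024            // 1,992,294,400
let audioBytes = audioKbps * 1000 / 8 * duration   //   487,972,875
let videoBytes = totalBytes - audioBytes           // 1,504,321,525
let videoKbps = videoBytes * 8 / 1000 / duration
print(videoKbps)                                   // ≈ 4651.9 kbps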

x264 rate control modes

Recently I have been reading the x264 source code, mostly the rate-control (RC) part, and I am confused about the parameters --bitrate and --vbv-maxrate. When bitrate is set, rate control operates at the frame level. If you want to enable macroblock-level RC, the parameters bitrate, vbv-maxrate, and vbv-bufsize should all be set. But I don't understand the relationship between bitrate and vbv-maxrate: what determines the actual encoding result when both are set?
And what is the recommended value for bitrate? Equal to vbv-maxrate?
Also, what is the recommended value for vbv-bufsize? Half of vbv-maxrate?
Please give me some advice.
bitrate addresses the target file size of the encode. It is understandably confusing: it sets a "budget" of a certain size and then tries to apportion that budget across the frames, which is why the later parts of a movie can receive less data and therefore lower video quality. For example, 10 seconds of completely black frames followed by 10 seconds of natural video will produce a very different encode than the same content in the opposite order.
vbv-bufsize models the decoder-side buffer that has to be filled before transmission proceeds, say in a streaming scenario. Tying this to I-frames and P-frames: vbv-bufsize limits the size of any single encoded frame, most likely the I-frames.
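As a hedged illustration (the numbers are arbitrary examples, not recommendations): set bitrate to the long-run average you want, vbv-maxrate to the peak rate the client's connection can sustain, and vbv-bufsize to roughly one to two seconds' worth of vbv-maxrate, a common starting point for streaming:

x264 --bitrate 4000 --vbv-maxrate 5000 --vbv-bufsize 7500 -o output.264 input.y4m

With these settings the file as a whole averages about 4000 kbps, while the VBV model guarantees that a client draining the buffer at 5000 kbps with 7500 kbits of buffering never underruns; that local constraint is what caps the size of individual frames, I-frames most of all.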

What is the default quality for HTML5 Canvas.toDataURL?

According to Mozilla, the second parameter of canvas.toDataURL(type, quality) is:
If the requested type is image/jpeg or image/webp, then the second
argument, if it is between 0.0 and 1.0, is treated as indicating image
quality; if the second argument is anything else, the default value
for image quality is used. Other arguments are ignored.
But I can't find anywhere that tells me what the default value actually is.
The spec alludes to the default being browser-dependent:
The second argument, if it is a number in the range 0.0 to 1.0 inclusive, must be treated as the desired quality level. If it is not a number or is outside that range, the user agent must use its default value, as if the argument had been omitted.
Edit: According to one user the default for Firefox is 0.92.
You can specify the JPEG quality as the second parameter to the toDataURL function. The default quality in Firefox is 0.92 (92%).
And according to this WebKit bug report, Chrome uses the same:
...Adds a libjpeg-based image encoder for Skia bitmaps. Default encoding quality
is 92 to match Mozilla...
I tested saving canvas content as .webp in Chrome and Edge, with the same results in both, which suggests the quality parameter's default value is 0.8.
Here are results with quality on the left:
default -> 313.65 kB
1 -> 8.29 MB
0.9 -> 0.98 MB
0.8 -> 313.65 kB
0.7 -> 200.63 kB
0.6 -> 160.19 kB
0.5 -> 130.57 kB
0.4 -> 109.31 kB
0.3 -> 91.17 kB
0.2 -> 75.09 kB
0.1 -> 67.78 kB
0 -> 48.71 kB
This makes me think the default quality value may be chosen to trade off the biggest decrease in file size against image quality, and that it may differ for other image types.
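A sketch of that measurement in browser JavaScript (the gradient fill and canvas size are arbitrary, so the absolute numbers will differ; the byte count is estimated from the base64 payload length):

// Compare the default-quality encoding against explicit quality values.
const canvas = document.createElement("canvas");
canvas.width = 1920;
canvas.height = 1080;
const ctx = canvas.getContext("2d");
const gradient = ctx.createLinearGradient(0, 0, canvas.width, canvas.height);
gradient.addColorStop(0, "teal");
gradient.addColorStop(1, "orange");
ctx.fillStyle = gradient;
ctx.fillRect(0, 0, canvas.width, canvas.height);

// Rough decoded size in kB of a data URL's base64 payload.
const sizeKB = (url) => (url.length - url.indexOf(",") - 1) * 3 / 4 / 1024;

console.log("default:", sizeKB(canvas.toDataURL("image/webp")).toFixed(2), "kB");
for (const q of [1, 0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3, 0.2, 0.1, 0]) {
  console.log(q, sizeKB(canvas.toDataURL("image/webp", q)).toFixed(2), "kB");
}

If the "default" line matches the 0.8 line byte for byte, that supports 0.8 being the default for image/webp in that browser.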