What is the easiest way to wrap a raw .aac file into a .m4a container?

This question is overflow from the following question:
How do I programmatically convert mp3 to an itunes-playable aac/m4a file?
Anyway, I learned how to create an AAC file, and then I found out that an .aac file is not just an .m4a file with a different file extension. In fact, I need to somehow wrap the AAC stream in an M4A container. Ideally I'd be able to do this with a simple call to the command line.

ffmpeg is a general-purpose (de)muxer/transcoder. MP4Box is a (de)muxer/transcoder from GPAC, a package dedicated to MP4-related software tech. Right now it seems wiser to use MP4Box because it writes the moov atom at the beginning of the file, which is important for streaming and iPod playback.
Use ffmpeg like this:
ffmpeg -i input.aac -c copy output.m4a
Use MP4Box like this:
MP4Box -add input.aac#audio output.m4a

MP4Box, if you want a dedicated tool, is probably the easiest way to go; ffmpeg can do the job too.

avconv -i input.aac -acodec copy output.m4a
In my case, without the explicit -acodec copy flag, avconv re-encodes the audio for some odd reason.

Just use any MP4 muxer, like Yamb, to create an .mp4 file with only the AAC audio track in it, then change the file extension to .m4a.
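Since the linked follow-up question was about doing this programmatically, the command-line call can also be wrapped in a few lines of Python. This is a minimal sketch assuming ffmpeg is on the PATH; the helper name and file names are illustrative, not from the original answers:

```python
import subprocess

def wrap_aac_in_m4a(input_path: str, output_path: str) -> list:
    """Build the ffmpeg command that remuxes a raw .aac file into
    an .m4a container without re-encoding (-c copy)."""
    return [
        "ffmpeg",
        "-i", input_path,   # raw AAC (ADTS) input
        "-c", "copy",       # copy the audio stream, no transcode
        output_path,        # the .m4a extension selects the MP4 muxer
    ]

# To actually run it (requires ffmpeg installed):
# subprocess.run(wrap_aac_in_m4a("input.aac", "output.m4a"), check=True)
```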

Related

audio/mp4; codecs="mp4a.40.2" not playing in Chrome and Firefox

It seems I need to convert the audio files that I want to stream on my website to audio/mp4; codecs="mp4a.40.2".
Using ffmpeg-cli-wrapper, I am converting my uploaded audio files with this command:
ffmpeg -i /tmp/input.any -acodec aac -b:a 256000 /tmp/output.aac
On the client I am creating a SourceBuffer like this:
this.sourceBuffer = this.mediaSource.addSourceBuffer('audio/mp4; codecs="mp4a.40.2"');
The errors are:
Chrome:
NotSupportedError: Failed to load because no supported source was found.
Firefox:
NotSupportedError: The media resource indicated by the src attribute or assigned media provider object was not suitable.
Here comes the fun part:
If I create the SourceBuffer using audio/aac as mime-type:
this.sourceBuffer = this.mediaSource.addSourceBuffer('audio/aac');
the audio gets played correctly on Chrome but Firefox says:
MediaSource.addSourceBuffer: Type not supported in MediaSource
Update
After changing the output file in the command to .mp4:
ffmpeg -i /tmp/input.any -acodec aac -b:a 256000 /tmp/output.mp4
Chrome and Firefox no longer give an error when using audio/mp4; codecs="mp4a.40.2", but the audio is not being played.
See
https://stackoverflow.com/a/64432478/826983
In your ffmpeg command, use the extension .m4a (or .mp4), not .aac:
ffmpeg -i /tmp/input.any -acodec aac -b:a 256000 /tmp/output.m4a
The .aac extension is used for AAC in the ADTS container, which is not a format that is widely used or supported on the web. The .m4a or .mp4 extension is used for AAC in the MP4 container, which is far more common.
Or, if for some reason you really want the file to be named output.aac yet want it to contain AAC in MP4, then you can specify the output format explicitly instead of having ffmpeg guess the format from the output file extension:
ffmpeg -i /tmp/input.any -acodec aac -b:a 256000 -f mp4 /tmp/output.aac
But if you use a non-standard extension like that then you may also need to update your web server configuration so that it will report the correct media type.
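A quick way to check which container ffmpeg actually produced, regardless of the file extension, is to sniff the first few bytes: MP4/M4A files carry an ftyp box near the start, while ADTS streams begin with a 0xFFF syncword. A minimal sketch (the function name is made up for illustration):

```python
def sniff_audio_container(head: bytes) -> str:
    """Guess the container from the first bytes of a file."""
    # MP4/M4A: a 4-byte big-endian box size, then the 'ftyp' box type
    if len(head) >= 8 and head[4:8] == b"ftyp":
        return "mp4"
    # ADTS AAC: every frame starts with a 12-bit 0xFFF syncword
    if len(head) >= 2 and head[0] == 0xFF and (head[1] & 0xF0) == 0xF0:
        return "adts"
    return "unknown"

print(sniff_audio_container(b"\x00\x00\x00\x18ftypM4A "))  # mp4
print(sniff_audio_container(b"\xff\xf1\x50\x80\x01\x40\xfc"))  # adts
```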
I was having a similar issue where my fMP4 played fine when used directly as the src of a video tag, but didn't play when fed through Media Source Extensions. My problem was that the AAC frames in my fMP4 still had their ADTS headers, and it appears that MSE doesn't accept those. So I removed the ADTS header from each AAC frame and it worked fine. (I already had the AudioSpecificConfig in my esds box, so I could remove the ADTS headers without losing the decoder configuration.)
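To illustrate what that fix involves: each ADTS frame carries a 7-byte header (9 bytes when a CRC is present) whose aac_frame_length field counts header plus payload, so the raw AAC can be recovered by walking the frames. A hedged sketch of the idea, not the answerer's actual code:

```python
def strip_adts(data: bytes) -> bytes:
    """Remove ADTS headers from a stream of ADTS AAC frames,
    returning the concatenated raw AAC payloads."""
    out = bytearray()
    i = 0
    while i + 7 <= len(data):
        # every ADTS frame starts with a 12-bit 0xFFF syncword
        if data[i] != 0xFF or (data[i + 1] & 0xF0) != 0xF0:
            raise ValueError(f"lost ADTS sync at offset {i}")
        protection_absent = data[i + 1] & 0x01
        header_len = 7 if protection_absent else 9  # +2 bytes of CRC
        # aac_frame_length: 13 bits spanning bytes 3..5, header included
        frame_len = ((data[i + 3] & 0x03) << 11) \
                    | (data[i + 4] << 3) \
                    | (data[i + 5] >> 5)
        out += data[i + header_len : i + frame_len]
        i += frame_len
    return bytes(out)
```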

How does the HTML5 audio/video tag determine duration

The HTML5 <audio> and <video> tags support a duration property. I am curious how, without downloading the entire audio/video source file, the browser is able to determine the duration of the media.
I am asking because I would like to implement the same functionality in a backend service I am writing that will:
Accept a url for an mp3
Determine the length (in seconds) of the file, without downloading the entire file
Most video containers contain both the visual and audio elements and also a metadata block that describes things like the duration, the colorspace, the codecs used, and the offset for each frame (useful when seeking). In a typical video encoded for the web as MP4, this block (aka the moov atom) defaults to the end of the file (as the frame locations won't be known until the end) unless a second pass has been performed to move it to the front, e.g.:
ffmpeg -i source.mp4 -c:a copy -c:v copy -movflags faststart destination.mp4
(copies the audio and video unchanged, just moves the metadata to the start to enable faster access)
You might have experienced some web video where you can seek almost immediately with an MP4, and some where you can't accurately seek until the file has fully loaded. This is because the browser has to make guesses until it receives that metadata.
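You can see where the moov atom sits by walking the top-level boxes of an MP4: each box starts with a 4-byte big-endian size and a 4-byte type. A minimal sketch (it ignores 64-bit largesize boxes, and the toy file is hand-built for illustration):

```python
import struct

def top_level_boxes(data: bytes):
    """Yield (type, offset, size) for each top-level MP4 box."""
    i = 0
    while i + 8 <= len(data):
        size, box_type = struct.unpack_from(">I4s", data, i)
        if size < 8:  # size 0 (to end) / 1 (64-bit) not handled here
            break
        yield box_type.decode("ascii"), i, size
        i += size

# A toy file: ftyp, then mdat, then moov at the end ("non-faststart")
toy = (
    struct.pack(">I4s", 16, b"ftyp") + b"isom" + b"\x00\x00\x02\x00"
    + struct.pack(">I4s", 12, b"mdat") + b"data"
    + struct.pack(">I4s", 8, b"moov")
)
print([t for t, _, _ in top_level_boxes(toy)])  # ['ftyp', 'mdat', 'moov']
```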
For MP3 files specifically, you could use a ranged request so the server gives you just the ID3 tag data (the last 128 bytes for the ID3v1 tag, plus the 227 bytes before them for the extended tag) without having to download the whole file.
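Note that the ID3 tag alone won't give you the duration; for a constant-bitrate MP3 you can instead combine the file size (e.g. the Content-Length from a HEAD request) with the bitrate parsed from the first MPEG frame header. A rough sketch, valid only for CBR MPEG-1 Layer III files (VBR files need the Xing/VBRI header instead):

```python
# MPEG-1 Layer III bitrates in kbps, indexed by the header's bitrate field
BITRATE_KBPS = [0, 32, 40, 48, 56, 64, 80, 96, 112, 128,
                160, 192, 224, 256, 320, 0]

def estimate_mp3_duration(frame_header: bytes, file_size: int) -> float:
    """Estimate duration (seconds) of a CBR MPEG-1 Layer III file
    from its first 4-byte frame header and its total size in bytes."""
    b0, b1, b2 = frame_header[0], frame_header[1], frame_header[2]
    # 11-bit frame sync, then version MPEG-1 (11) and layer III (01);
    # the lowest bit of b1 is the CRC flag, which we ignore
    if b0 != 0xFF or (b1 & 0xFE) != 0xFA:
        raise ValueError("not an MPEG-1 Layer III frame header")
    bitrate = BITRATE_KBPS[b2 >> 4] * 1000  # bits per second
    if bitrate == 0:
        raise ValueError("free-format or invalid bitrate index")
    return file_size * 8 / bitrate

# 0xFF 0xFB = sync + MPEG-1 Layer III; 0x90 -> bitrate index 9 = 128 kbps
print(estimate_mp3_duration(b"\xff\xfb\x90\x00", 1_000_000))  # 62.5
```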

Piping out h.264 encoded video through avconv to vlc

My end game is to read a raw video from a file into avconv, encode it as H.264, and pipe it to VLC. However, I cannot seem to get it to work. Even just piping an already-encoded video to VLC does not work. Trying:
avconv -i test.mp4 -f h264 - | vlc -
appears to be encoding the video (the command-line output looks like it is processing frame by frame), but nothing is displayed in VLC. A similar test with an .avi works fine:
avconv -i test.avi -f avi - | vlc -
Is there something special about piping out H.264-encoded video?
Specify the demuxer:
cat test.h264 | vlc --demux h264 -
--demux=<string> Demux module
Demultiplexers are used to separate the "elementary" streams (like
audio and video streams). You can use it if the correct demuxer is
not automatically detected. You should not set this as a global
option unless you really know what you are doing.
VLC command line help

Reconstructing fragmented H.264 stream

I have an H.264 stream stored as a file. I am trying to create an MPEG-4 file by adding this stream to the mdat box. I have created the other headers required by the MPEG-4 standard, but I am unable to read the resulting MPEG-4 file.
I parsed the H.264 stream and I see that there are multiple I-frames in the file. It seems to me that this is a fragmented H.264 stream.
Is there any way in which this fragmented H.264 stream can be combined into a single I-frame?
I have gone through the link Problem to Decode H264 video over RTP with ffmpeg (libavcodec).
I implemented what was mentioned in the link, but I am still unable to play the MPEG-4 file thus created.
With the above technique, I get fragmentType = 5 and the following nalTypes (8, 2, 1, 0, 0, ...). I get the start bit as specified, and for the other fragments I get 00 (for startBit|endBit). I never get the end bit.
When I try using FFmpeg to reconvert the MPEG-4 file that was created, I get the following error: "header damaged". It looks like the reconstruction of IDR frames is not working properly.
Please let me know if the method that I am following has any issues.
The H.264 stream file is around 100 KB. When this file is converted to MP4 using FFmpeg, I get around 38 KB. Does this mean that FFmpeg is re-encoding the file in order to recreate the MP4?
With the above technique that was mentioned in the link, the MP4 that I created is around 100 KB.
Please let me know what I am doing wrong.
Thanks in advance.
It sounds like you'd like to wrap an H.264 elementary stream in an MP4 container so that you can play it back.
A tool like MP4Box (http://gpac.wp.mines-telecom.fr/mp4box/) will enable you to wrap your elementary stream into an MP4 file. For example:
MP4Box -add MySourceFile.h264 MyDestFile.mp4
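If you need to inspect the elementary stream yourself (for example to check the NAL unit types, as in the question), splitting an Annex B stream on start codes is straightforward. A sketch under the assumption that the stream uses standard start codes; the sample bytes are fabricated for illustration:

```python
def split_annexb_nals(data: bytes):
    """Split an Annex B H.264 elementary stream into NAL units,
    returning (nal_unit_type, payload) pairs. Handles both 3-byte
    (00 00 01) and 4-byte (00 00 00 01) start codes."""
    nals = []
    i = data.find(b"\x00\x00\x01")
    while i != -1:
        start = i + 3
        j = data.find(b"\x00\x00\x01", start)
        nal = data[start : j if j != -1 else len(data)]
        # a 4-byte start code leaves its leading 0x00 on the previous NAL
        if j != -1 and nal.endswith(b"\x00"):
            nal = nal[:-1]
        if nal:
            nals.append((nal[0] & 0x1F, nal))  # low 5 bits = nal_unit_type
        i = j
    return nals

stream = (b"\x00\x00\x00\x01\x67\xAA"   # SPS (type 7)
          b"\x00\x00\x01\x68\xBB"       # PPS (type 8)
          b"\x00\x00\x01\x65\xCC")      # IDR slice (type 5)
print([t for t, _ in split_annexb_nals(stream)])  # [7, 8, 5]
```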

Converting an MPEG file to FLV and simultaneously playing the converted file in Flash CS3 using AS3

I have an .mpg file which I want to convert to FLV format, but I have a requirement that while the file is being converted, I also have to simultaneously play the converted FLV in Flash CS3. How can I do this? I am using CS3 and AS3.
If you want to convert your files programmatically, use ffmpeg. This is a command-line tool which can convert video files to nearly any format. You have to execute ffmpeg with the correct parameters and wait until the video is ready. This works only on the server side: the Flash client uploads the video file to the server, where it gets converted. You can execute ffmpeg from any server-side language, like PHP.
Sadly, I have no idea whether it is possible to watch the video while it is still converting. I think not, but maybe someone else knows more.