
This is for HTTP Live Streaming (audio only).

I have two 10-second mp2 files that are continuous (they are encoded one after the other and, if spliced together, play as one seamless file). I would like to create two MPEG-2 transport streams from these files. Unfortunately, when I do the following:

ffmpeg -i 0.mp2 0.ts

The conversion succeeds, but I get the following warnings:

[mp3 @ 0x7fcc2a006600] max_analyze_duration 5000000 reached at 5015510 microseconds
[mp3 @ 0x7fcc2a006600] Estimating duration from bitrate, this may be inaccurate

Also, when I create the m3u8 manifest file with 0.ts and 1.ts, there is an audible gap between the two segments.
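For reference, the manifest I generate looks roughly like this (segment names and durations are mine, shown here as a finished playlist):

#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
0.ts
#EXTINF:10.0,
1.ts
#EXT-X-ENDLIST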

Because I am creating the playlist dynamically, I don't have access to both files at the same time; otherwise I would simply concatenate them and then segment the result after conversion.

What am I missing here? Is there a different way I should be encoding the mp2 files? I am using twolame and encoding 10 seconds at a time.


1 Answer


You should not be doing the segmenting yourself. By encoding the segments separately, you are inadvertently resetting the continuity counter and inserting new priming samples. This is what causes the pause.

You should start with a single audio file, then use ffmpeg to segment.

ffmpeg -i 0.mp2 out.m3u8

more docs here: http://www.ffmpeg.org/ffmpeg-formats.html#hls-1
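For example, to get ten-second segments and keep every segment in the playlist, something along these lines should work (input.mp2 stands for your full-length source file; option names are from the hls muxer docs above):

ffmpeg -i input.mp2 -c:a copy -f hls -hls_time 10 -hls_list_size 0 out.m3u8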

You can ignore the mp3 warnings. However, I highly recommend you use AAC for HLS. The support for mp3 is pretty poor.
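The same command with AAC instead of a stream copy, as a sketch (depending on your build you may need -strict experimental for the native AAC encoder, or libfdk_aac if it is compiled in):

ffmpeg -i input.mp2 -c:a aac -b:a 128k -f hls -hls_time 10 -hls_list_size 0 out.m3u8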

  • I am live streaming the audio as it is recorded and I have to do the encodings every ten seconds to upload a compressed file to the server. Is there a way to set the continuity counter? Also, I wish to use MPEG-TS because that is the only one that works out of the box on Android. Otherwise I would most likely be using AAC. – JonathanC Oct 02 '13 at 16:45
  • You are confused. mpegts is the container, AAC is the codec. You can put AAC frames into an mpeg-ts container (it is stream_type 0x0F in the PMT). – szatmary Oct 02 '13 at 17:50
  • I see. Unfortunately, according to this http://developer.android.com/guide/appendix/media-formats.html only the mpeg-2 ts is supported for android at the moment. Am I misreading this? – JonathanC Oct 02 '13 at 22:22
  • is there any way to do this? Or should I create my mpeg-ts file as I am recording (i.e instead of using twolame on the device to generate mp2 segments, I could use libavformat to create the MPEG TS files) so that I don't have to do any further processing? – JonathanC Oct 03 '13 at 19:34
  • twolame? mpeg2-ts is NOT the same thing as mpeg audio layer two. They are completely different. mpeg2-ts is the container. mpeg audio layer two is the codec. I would use the AAC codec (preferably with the hardware encoder), and use libavformat to create ts segments. – szatmary Oct 03 '13 at 19:40
  • I see. I was quite confused by the terminology. So an mpeg2-ts container could be used with the AAC codec and that will most likely be readable by the android HLS player. – JonathanC Oct 23 '13 at 19:46
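To illustrate the container/codec distinction from the comments: a hypothetical AAC-encoded segment (for example, one produced by the device's hardware encoder) can be wrapped into an MPEG-2 TS container without re-encoding:

ffmpeg -i segment.aac -c:a copy -f mpegts segment.ts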