
I am trying to capture video frames, encode them with MediaCodec, and save them to a file. The code I am using is:

import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.os.Environment;
import android.util.Log;

import java.io.BufferedOutputStream;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;

public class AvcEncoder {

    private static String TAG = AvcEncoder.class.getSimpleName();

    private MediaCodec mediaCodec;
    private BufferedOutputStream outputStream;


    public AvcEncoder(String fileDir) {

        Log.d(TAG, "Thread Id: " + Thread.currentThread().getId());

        File f = new File(Environment.getExternalStorageDirectory(), "Download/LiveCamera/video_encoded.h264");

        try {
            outputStream = new BufferedOutputStream(new FileOutputStream(f));
            Log.i("AvcEncoder", "outputStream initialized");
        } catch (Exception e){ 
            e.printStackTrace();
        }

        mediaCodec = MediaCodec.createEncoderByType("video/avc");
        MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc", 960, 720);
        mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 2000000);
        mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 15);
        //mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar);
        mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);

        mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
        mediaCodec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        mediaCodec.start();
    }

    public void close() throws IOException {
        mediaCodec.stop();
        mediaCodec.release();
        mediaCodec = null;

        //outputStream.flush();
        outputStream.close();
    }

    public void byteWriteTest(byte[] input) {
        try {
            outputStream.write(input, 0, input.length);
        } catch(Exception e) {
            Log.d("AvcEncoder", "Outputstream write failed");
            e.printStackTrace();
        }
        Log.i("AvcEncoder", input.length + " bytes written");
    }

    // called from Camera.setPreviewCallbackWithBuffer(...) in another class
    public void offerEncoder(byte[] input) {
        try {
            ByteBuffer[] inputBuffers = mediaCodec.getInputBuffers();
            ByteBuffer[] outputBuffers = mediaCodec.getOutputBuffers();
            int inputBufferIndex = mediaCodec.dequeueInputBuffer(-1);

            if (inputBufferIndex >= 0) {
                ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
                inputBuffer.clear();
                inputBuffer.put(input);
                mediaCodec.queueInputBuffer(inputBufferIndex, 0, input.length, 0, 0);
            }

            MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
            int outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo,0);

            while (outputBufferIndex >= 0) {
                ByteBuffer outputBuffer = outputBuffers[outputBufferIndex];
                byte[] outData = new byte[bufferInfo.size];
                outputBuffer.get(outData);
                try {
                    outputStream.write(outData, 0, outData.length);

                } catch(Exception e) {
                    Log.d("AvcEncoder", "Outputstream write failed");
                    e.printStackTrace();
                }
                //Log.i("AvcEncoder", outData.length + " bytes written");

                mediaCodec.releaseOutputBuffer(outputBufferIndex, false);
                outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, 0);

            }
        } catch (Throwable t) {
            t.printStackTrace();
        }

    }
}

For each frame that arrives in the SurfaceView's onPreviewFrame, AvcEncoder's offerEncoder() method is called, as follows:

public class CameraView extends SurfaceView implements Camera.PreviewCallback,
        SurfaceHolder.Callback {
    ...

    @Override
    public void onPreviewFrame(byte[] pData, Camera pCamera) {

        if (VIDEO_ENCODE) {
            avcEncoder.offerEncoder(pData);
        }
    }

}

The Problem

Now, the problem I am having is with writing the encoded frames to the file. It seems that after every N frames (roughly; it is not exactly the same every time), the statement outputStream.write(outData, 0, outData.length) in AvcEncoder's offerEncoder method takes much longer (a few hundred times longer than in other iterations). I assume this happens when the BufferedOutputStream flushes its internal buffer, i.e. when it actually writes to the file. (Please correct me if this assumption is wrong.)
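A simplified sketch of how the write can be timed to check this (the startNs/elapsedMs variables are only illustrative and are not part of the code above):

long startNs = System.nanoTime();
outputStream.write(outData, 0, outData.length);
long elapsedMs = (System.nanoTime() - startNs) / 1000000;
// Most iterations report a very small value; every N-th frame it is a few hundred times larger.
Log.d(TAG, "write(" + outData.length + " bytes) took " + elapsedMs + " ms");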

This results in dropping the frames that arrive during that time (again, this is my assumption, based on the resulting video), which in turn causes a pause in the recorded video after every N frames.

When I comment that statement out, the iterations of the offerEncoder method take roughly equal time.
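One thing that might change how often those long writes happen is the size of the BufferedOutputStream's internal buffer (the default is 8 KB); the constructor takes an explicit size. The 1 MB value below is only an example, and I have not verified whether it actually helps:

// Larger internal buffer -> fewer, though possibly longer, flushes to disk.
outputStream = new BufferedOutputStream(new FileOutputStream(f), 1024 * 1024);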

The Question

How can I solve this issue so that writing to the file is smooth? Has anyone else encountered this problem? I see that many people use this code, but so far no one has complained about this issue (or at least I could not find such a complaint).

Thanks.

Nazar Merza
  • It'd have less overhead if you moved the camera frames through a Surface rather than a byte[]. See http://bigflake.com/mediacodec/#CameraToMpegTest and the "Show + capture camera" activity in Grafika (https://github.com/google/grafika). If that's not an option, you can try buffering the encoded output internally and writing it to disk on a separate thread. It might be worth using systrace to see where the performance is going; see also http://stackoverflow.com/questions/19256953/buffering-surface-input-to-mediacodec – fadden Jan 16 '14 at 02:03
  • Thanks for the comment. Since using a Surface is not supported in older APIs, I can't use it. I am experimenting with separate threads; a rough sketch of that approach is below. – Nazar Merza Jan 16 '14 at 02:13
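
Following fadden's suggestion above, the idea is to only enqueue the encoded chunks in offerEncoder() and do the actual disk I/O on a separate thread. A minimal, untested sketch of that approach, assuming the existing offerEncoder() code otherwise stays as it is; the class name AsyncFrameWriter, the queue capacity, and the drop-on-full policy are all made up for illustration:

import android.util.Log;

import java.io.BufferedOutputStream;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Hypothetical helper: offerEncoder() would call write(outData) instead of writing to
// the BufferedOutputStream directly, so disk I/O never blocks the encoder/preview thread.
public class AsyncFrameWriter {

    private static final byte[] STOP = new byte[0]; // poison pill to end the writer thread

    private final BlockingQueue<byte[]> queue = new LinkedBlockingQueue<byte[]>(100); // capacity is arbitrary
    private final BufferedOutputStream out;
    private final Thread writerThread;

    public AsyncFrameWriter(File file) throws IOException {
        out = new BufferedOutputStream(new FileOutputStream(file));
        writerThread = new Thread(new Runnable() {
            @Override
            public void run() {
                try {
                    while (true) {
                        byte[] chunk = queue.take();        // blocks until a chunk is available
                        if (chunk == STOP) break;
                        out.write(chunk, 0, chunk.length);  // slow writes happen on this thread only
                    }
                    out.flush();
                    out.close();
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }
        }, "encoded-frame-writer");
        writerThread.start();
    }

    // Called from offerEncoder(); never blocks. If the queue is full the chunk is dropped
    // (alternatively one could block here, at the cost of stalling the encoder again).
    public void write(byte[] encodedChunk) {
        if (!queue.offer(encodedChunk)) {
            Log.w("AsyncFrameWriter", "queue full, dropping encoded chunk");
        }
    }

    // Signals the writer thread to finish and waits for it to flush and close the file.
    public void close() throws InterruptedException {
        queue.put(STOP);
        writerThread.join();
    }
}

Since offerEncoder() already allocates a new outData array for every chunk, handing that array to the queue without copying should be safe.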

0 Answers