
I am learning about Grafika's "Continuous Capture" activity, which records video with MediaCodec.

The activity source code is at https://github.com/google/grafika/blob/master/src/com/android/grafika/ContinuousCaptureActivity.java

The program uses a SurfaceTexture object to receive data from the camera and creates two EGLSurface objects from it: one EGLSurface feeds the frames to MediaCodec, and the other feeds them to a SurfaceView for the camera preview. MediaCodec encodes the frames to H.264, and a MediaMuxer object writes the H.264 data to an MP4 file.

But there is a problem: the preview sizes supported by the camera are landscape (width > height), such as 1920x1080, 1440x1080, 720x480, and so on. Usually we hold the phone in portrait orientation when recording a video, so we call Camera.setDisplayOrientation(90) to rotate the picture to portrait; then a portrait video is recorded.

But I want to record a landscape video while holding the phone in portrait orientation, so I have to crop every frame from the camera. My plan is to cut off the top and bottom of each frame and keep the middle, so that what remains is landscape.

But I am not familiar with OpenGL, and I do not know how to crop the SurfaceTexture data. Could anyone who knows OpenGL give me some help?

Stefan
dragonfly

1 Answer


Take a look at the "texture from camera" activity. Note it allows you to manipulate the image in various ways, notably "zoom". The "zoom" is done by modifying the texture coordinates.

The ScaledDrawable2D class does this; the setScale() call changes the "zoom", rather than scaling the rect itself. Texture coordinates range from 0.0 to 1.0 inclusive, and the getTexCoordArray() method modifies them to span a subset of the texture.
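As a rough sketch of what that texture-coordinate "zoom" amounts to (a hypothetical helper for illustration, not Grafika's actual `ScaledDrawable2D` code): shrinking the coordinate range symmetrically around the center makes the quad sample only a centered subset of the texture.

```java
// Hypothetical sketch of a ScaledDrawable2D-style "zoom": shrinking the
// texture-coordinate range samples a centered subset of the texture.
public class TexCoordZoom {
    /**
     * Returns (s, t) texture coordinates for a triangle-strip quad that
     * span a centered subset of the texture. scale = 1.0f covers the full
     * texture; scale = 0.5f covers the middle half in each axis.
     */
    public static float[] scaledTexCoords(float scale) {
        float lo = (1.0f - scale) / 2.0f;   // distance in from each edge
        float hi = 1.0f - lo;
        return new float[] {
            lo, lo,   // bottom left
            hi, lo,   // bottom right
            lo, hi,   // top left
            hi, hi,   // top right
        };
    }
}
```

With scale = 0.5f this yields coordinates from 0.25 to 0.75 in each axis, i.e. the middle of the texture.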

To clip the frames, you'd need to modify the texture coordinates proportionally. For example, if the input video is portrait 720x1280, and you want 720x720, you would change the coordinates from this:

[0.0, 0.0]  [1.0, 0.0]
[0.0, 1.0]  [1.0, 1.0]

to this:

[0.0, 280/1280.0]  [1.0, 280/1280.0]
[0.0, 1000/1280.0] [1.0, 1000/1280.0]

and then render that on a square rather than a rectangle.
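The vertical range above can be computed for any frame size. A minimal sketch, using a hypothetical helper name (a portrait source taller than it is wide is assumed):

```java
// Hypothetical helper illustrating the example above: crop a srcW x srcH
// portrait frame to a centered srcW x srcW square by narrowing the vertical
// (t) texture-coordinate range; the horizontal (s) range stays 0.0 - 1.0.
public class CenterCrop {
    /** Returns { tLow, tHigh } for a centered square crop of srcW x srcH. */
    public static float[] squareCropT(int srcW, int srcH) {
        float margin = (srcH - srcW) / (2.0f * srcH);  // e.g. 280/1280 for 720x1280
        return new float[] { margin, 1.0f - margin };  // e.g. 0.21875, 0.78125
    }
}
```

For a 720x1280 input this gives the 280/1280 and 1000/1280 values shown above.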

fadden
  • Thank you a lot! Your answer resolved my problem exactly! I have visited your Stack Overflow profile and learned that you worked at Google for many years on Android OS development. In a word, you are a godlike programmer. – dragonfly Jun 03 '15 at 12:59
  • I am excited to see that you are the author of Grafika and bigflake, because I have studied both of them and I have a question to ask. You wrote several demos about MediaCodec covering video and audio, but there is no demo that records both video and audio. I think we can encode video and audio simultaneously with MediaCodec and write them to an MP4 file with MediaMuxer, so why didn't you write such a demo? I am preparing to write one now, but I am new to this. Could you give me some advice? – dragonfly Jun 03 '15 at 13:10
  • I was on the "graphics" team, not the "media" team, so sending graphics to MediaCodec was part of my day job. I haven't worked with audio compression, and never got around to playing with it, so no A/V demo. Your plan is correct: create a second MediaCodec that handles the audio, and feed the output of each into a single MediaMuxer instance with two tracks. The muxer portion is tested in https://android.googlesource.com/platform/cts/+/lollipop-release/tests/tests/media/src/android/media/cts/MediaMuxerTest.java , but that exercises MediaExtractor rather than MediaCodec. – fadden Jun 03 '15 at 15:59
  • Thanks for your reply. I will implement A/V myself and tell you the result as soon as I finish the work. I am very happy to communicate with you! – dragonfly Jun 04 '15 at 02:48
  • I have written a demo with MediaCodec to test A/V recording, but I ran into some problems. I posted the questions at http://stackoverflow.com/questions/30668846/record-video-with-mediacodec-and-mediamuxer-but-the-bitrate-and-framerate-are-i Could you please have a look and give me some help? – dragonfly Jun 05 '15 at 14:28
  • In ContinuousCaptureActivity.java, you create the EGL objects in surfaceCreated(), which runs on the UI thread, and you call drawFrame() on the UI thread as well. drawFrame() does two things: it draws the frame to the screen and pushes data to the encoder. Can I create another thread to do the encoding work? I want the camera preview to be very smooth, and I am worried about OpenGL ES multithreading problems. – dragonfly Sep 23 '15 at 12:34
  • Please post new questions as separate questions. Otherwise it just gets muddied up. – fadden Sep 23 '15 at 15:43
  • I have a new question here, please take a look: http://stackoverflow.com/questions/32804950/use-mediacodec-to-record-720p-video-but-fps-is-too-low – dragonfly Sep 27 '15 at 05:42
  • How can I achieve this conversion from 480x720 to 480x480 if the matrix looks as follows? [0.0, -1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 1.0] Totally lost here :/ – dy_ Nov 17 '15 at 11:44
  • @dragonfly I am working on a similar requirement: I need to process a 640x480 video down to 480x480, so I am trying to apply fadden's suggestion (initially trying to get a 720x720 video from 1280x720). What I did was change RECTANGLE_TEX_COORDS in the Drawable2d class to the values suggested by fadden, and also change the parameter used when Drawable2d is initialized in the FullFrameRect class (line 26 in https://github.com/google/grafika/blob/master/src/com/android/grafika/gles/FullFrameRect.java) to Drawable2d.Prefab.RECTANGLE. This still returns a video in 1280x720. – human123 Jan 15 '16 at 08:40
  • @dragonfly The video recorded was distorted on the top and left portions. Can you please let me know what other changes you had to make in the code to get video in 720x720 resolution? – human123 Jan 15 '16 at 08:43
  • @datayeah You should modify the texture coordinates; the matrix has nothing to do with the crop operation. – dragonfly Jan 15 '16 at 11:19
  • @human123 I guess you did not change the glViewport before the drawFrame operation. GLES20.glViewport(0, 0, 720, 720) will be OK. – dragonfly Jan 15 '16 at 11:23
  • @fadden I understand that it was a pretty long time ago, but I'll give it a shot. Could you please elaborate a bit on "and then render that on a square rather than a rectangle"? I didn't really understand where and how I can do this. Thanks! – Roberto Artiles Astelarra May 04 '16 at 00:56
  • @RobertoArtilesAstelarra: start with Grafika's "texture from camera" class. `updateGeometry()` uses the camera width/height to set the size of the output rect to have a matching aspect ratio. That would need to be square. – fadden May 04 '16 at 05:37
  • @fadden Thanks for pointing me in the right direction. Just to make sure I understand you correctly: I'm trying to crop only the output video, without altering the displayed frames (I'm modifying the CameraCaptureActivity example for this purpose). So what you're suggesting is to use the technique in updateGeometry() to crop the output frames. Is this correct? – Roberto Artiles Astelarra May 04 '16 at 18:11
  • Take a square piece of the video frame and render it onto a square. If you want to record video that way, you configure the encoder to create a square video, and use FullFrameRect to fill the screen with it. – fadden May 04 '16 at 18:41
  • @fadden please have a look at this question about grafika: http://stackoverflow.com/questions/38101354/mediacodec-createinputsurface-failed-with-38 – dragonfly Jun 29 '16 at 13:44
  • @fadden Hi, some devices support USB cameras, and I can open two cameras at the same time. Can I use the MediaCodec API to encode two MP4 files simultaneously? – dragonfly Nov 27 '17 at 10:12