
I am building an image-processing program using the Android camera2 API. Since each captured frame arrives in the YUV_420_888 format, I need to convert it to RGB efficiently before processing. After a lot of searching and reading (especially the two links below), it looks like RenderScript may be the solution, but I don't know how to use the yuv2rgb script in my code.

http://werner-dittmann.blogspot.jp/2016/03/using-android-renderscript-to-convert.html

Convert android.media.Image (YUV_420_888) to Bitmap

Currently, I use a TextureView surface to show the preview and an ImageReader to capture each YUV_420_888 frame in the onImageAvailable callback.

    protected void createCameraPreview() {
        try {
            SurfaceTexture texture = textureView.getSurfaceTexture();
            assert texture != null;
            texture.setDefaultBufferSize(imageDimension.getWidth(), imageDimension.getHeight());
            Surface surface = new Surface(texture);
            Surface mImageSurface = mImageReader.getSurface();
            captureRequestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
            captureRequestBuilder.addTarget(surface);
            captureRequestBuilder.addTarget(mImageSurface);
            List<Surface> surfaces = new ArrayList<>();
            surfaces.add(surface);
            surfaces.add(mImageSurface);
            // The repeating request is issued in updatePreview() once the session is configured.

            cameraDevice.createCaptureSession(surfaces, new CameraCaptureSession.StateCallback() {
                    @Override
                    public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
                        //The camera is already closed
                        if (null == cameraDevice) {
                            return;
                        }
                        // When the session is ready, we start displaying the preview.
                        cameraCaptureSessions = cameraCaptureSession;
                        updatePreview();
                    }
                    @Override
                    public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {
                        Toast.makeText(MainActivity.this, "Configuration change", Toast.LENGTH_SHORT).show();
                    }
                }, null);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }


    private final ImageReader.OnImageAvailableListener mOnImageAvailableListener = new ImageReader.OnImageAvailableListener() {
        @Override
        public void onImageAvailable(ImageReader reader) {
            Image img = reader.acquireNextImage(); // we get a YUV_420_888 frame here
            // transform to RGB format here?
            // image processing
            img.close(); // release the buffer back to the ImageReader when done
        }
    };

How should I update my code to achieve this (e.g., using yuv2rgb.rs)? Thanks.

Jun Fang

1 Answer


The camera2 sample application HdrViewfinder, which uses RenderScript for some image processing, may be helpful for seeing how to connect the camera to RenderScript: https://github.com/googlesamples/android-HdrViewfinder
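
For reference, the wiring in that sample boils down to creating a YUV-typed Allocation whose Surface is handed to the camera as an output target. Below is a rough, untested sketch of that pattern; previewWidth, previewHeight and context are placeholders for values from your own setup, and the classes come from android.renderscript:

    RenderScript rs = RenderScript.create(context);

    Type.Builder yuvTypeBuilder = new Type.Builder(rs, Element.YUV(rs))
            .setX(previewWidth)
            .setY(previewHeight)
            .setYuvFormat(ImageFormat.YUV_420_888);
    Allocation yuvAllocation = Allocation.createTyped(rs, yuvTypeBuilder.create(),
            Allocation.USAGE_IO_INPUT | Allocation.USAGE_SCRIPT);

    // Add this Surface as a camera output target (in place of, or alongside,
    // the ImageReader surface in createCameraPreview()).
    Surface rsInputSurface = yuvAllocation.getSurface();

    yuvAllocation.setOnBufferAvailableListener(new Allocation.OnBufferAvailableListener() {
        @Override
        public void onBufferAvailable(Allocation a) {
            a.ioReceive(); // latch the newest camera frame into the Allocation
            // run a RenderScript kernel against the Allocation here
        }
    });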

It doesn't do YUV->RGB conversion, IIRC, and I think yuv2rgb.rs may be intended for a different YUV colorspace than what the camera produces (due to backwards-compatibility concerns - it existed before camera2). But it gets you to the point where you can write your own RS script to apply to camera data.
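
If the intrinsic's colorspace turns out to be acceptable for your use case, the conversion step itself is small. Here is a minimal, untested sketch using ScriptIntrinsicYuvToRGB, assuming the YUV_420_888 planes have already been packed into an NV21-ordered byte[] named nv21 (the links in the question cover that repacking) and that context, width and height come from your own code:

    RenderScript rs = RenderScript.create(context);
    ScriptIntrinsicYuvToRGB yuvToRgb = ScriptIntrinsicYuvToRGB.create(rs, Element.U8_4(rs));

    Type.Builder inType = new Type.Builder(rs, Element.U8(rs)).setX(nv21.length);
    Allocation in = Allocation.createTyped(rs, inType.create(), Allocation.USAGE_SCRIPT);

    Type.Builder outType = new Type.Builder(rs, Element.RGBA_8888(rs)).setX(width).setY(height);
    Allocation out = Allocation.createTyped(rs, outType.create(), Allocation.USAGE_SCRIPT);

    in.copyFrom(nv21);     // upload the NV21 bytes
    yuvToRgb.setInput(in);
    yuvToRgb.forEach(out); // run the conversion kernel

    Bitmap bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
    out.copyTo(bitmap);    // RGBA result, ready for image processing

In practice you would create the RenderScript context, the intrinsic, the Allocations and the Bitmap once and reuse them for every frame instead of rebuilding them per call.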

Eddy Talvala
  • Thanks, Eddy. I am using the code from the following link: http://stackoverflow.com/questions/36212904/yuv-420-888-interpretation-on-samsung-galaxy-s7-camera2. I am still testing it. Will provide the answer soon. – Jun Fang Feb 24 '17 at 01:51
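
For anyone following that link: the repacking it discusses amounts to copying the three YUV_420_888 planes into an NV21 buffer while respecting the pixel stride of the chroma planes. A simplified, hypothetical helper (it assumes no row padding, so production code should honour rowStride as well; uses android.media.Image and java.nio.ByteBuffer):

    private static byte[] yuv420888ToNv21(Image image) {
        int width = image.getWidth();
        int height = image.getHeight();
        byte[] nv21 = new byte[width * height * 3 / 2];

        // Y plane: copy as-is.
        image.getPlanes()[0].getBuffer().get(nv21, 0, width * height);

        // Chroma: NV21 expects interleaved V/U pairs. The pixelStride of the U/V planes
        // tells us whether the device stores them interleaved (2) or planar (1).
        ByteBuffer uBuffer = image.getPlanes()[1].getBuffer();
        ByteBuffer vBuffer = image.getPlanes()[2].getBuffer();
        int pixelStride = image.getPlanes()[1].getPixelStride();
        int pos = width * height;
        for (int i = 0; i < width * height / 4; i++) {
            nv21[pos++] = vBuffer.get(i * pixelStride);
            nv21[pos++] = uBuffer.get(i * pixelStride);
        }
        return nv21;
    }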