
This shader works perfectly on Android devices running API 21, where pixelStride is 1 and uvRowStride is 320 (preview resolution 640x480):

#ifdef GL_ES
precision highp float;
#endif

varying vec2 v_texCoord;
uniform sampler2D y_texture;
uniform sampler2D u_texture;
uniform sampler2D v_texture;

void main()
{
  float r, g, b, y, u, v;
  y = texture2D(y_texture, v_texCoord).r;
  u = texture2D(u_texture, v_texCoord).r;
  v = texture2D(v_texture, v_texCoord).r;

  // BT.601 narrow-range YUV to RGB
  y = 1.1643 * (y - 0.0625);
  u = u - 0.5;
  v = v - 0.5;

  r = y + 1.5958 * v;
  g = y - 0.39173 * u - 0.81290 * v;
  b = y + 2.017 * u;

  gl_FragColor = vec4(r, g, b, 1.0);
}

But on Android devices with API 22+ it always reports pixelStride = 2 and uvRowStride = 640 (preview resolution 640x480), and the colours come out wrong.

References:
  • https://github.com/wenxiaoming/YUVRender
  • YUV_420_888 interpretation on Samsung Galaxy S7 (Camera2)

The top part of the image shows the converted YUV output in a GLSurfaceView; the bottom part shows the actual colours in a normal TextureView.

(Image: normal camera preview and GLSurfaceView)

Edit:

A workaround that does not rely on the shader for the pixelStride calculation was to feed in the U and V byte buffers with the strides removed. Performance is lower than pure shader code.

How can I integrate the code below into the shader? (I am really bad with shaders.)

private static Buffer getByteBufferWithoutStrides(int pixelStride, ByteBuffer byteBuffer) {
    byteBuffer.position(0);
    byte[] bytes = new byte[byteBuffer.remaining()];
    byteBuffer.get(bytes, 0, bytes.length);

    // Keep every pixelStride-th byte, dropping the interleaved padding.
    // (The original hard-coded bytes.length / 2, which breaks for pixelStride == 1.)
    byte[] noStrides = new byte[(bytes.length + pixelStride - 1) / pixelStride];
    for (int i = 0, j = 0; i < noStrides.length; i++, j += pixelStride) {
        noStrides[i] = bytes[j];
    }
    byteBuffer.clear();

    return ByteBuffer.wrap(noStrides);
}
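For reference, one way to avoid this CPU copy entirely is to do the de-interleaving on the GPU. This is a sketch, assuming (as is common when pixelStride is 2) that the U and V planes alias one interleaved buffer: upload that interleaved plane once as a GL_LUMINANCE_ALPHA texture of size (width/2) x (height/2), so each texel carries the two interleaved bytes in .r and .a, and the shader needs no stride arithmetic at all. Whether .r is U or V depends on the device's plane order, so the two may need swapping.

```glsl
#ifdef GL_ES
precision highp float;
#endif

varying vec2 v_texCoord;
uniform sampler2D y_texture;   // GL_LUMINANCE, width x height
uniform sampler2D uv_texture;  // GL_LUMINANCE_ALPHA, (width/2) x (height/2), interleaved plane

void main()
{
  float y = texture2D(y_texture, v_texCoord).r;
  // .r = first interleaved byte, .a = second; swap if the plane order differs.
  float u = texture2D(uv_texture, v_texCoord).r - 0.5;
  float v = texture2D(uv_texture, v_texCoord).a - 0.5;

  // Same BT.601 narrow-range conversion as the original shader.
  y = 1.1643 * (y - 0.0625);
  float r = y + 1.5958 * v;
  float g = y - 0.39173 * u - 0.81290 * v;
  float b = y + 2.017 * u;

  gl_FragColor = vec4(r, g, b, 1.0);
}
```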
Atul Vasudev A
1 Answer

Do not perform YUV conversion in the shader, this is non-trivial since there are several kinds of YUV formats. Instead, use GL_TEXTURE_EXTERNAL_OES and sample from it like you would a traditional RGB texture. See the Android docs for SurfaceTexture.
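A minimal sketch of that approach, assuming the camera is already rendering into a SurfaceTexture bound to a GL_TEXTURE_EXTERNAL_OES texture: the fragment shader just declares a samplerExternalOES, and the driver performs whatever YUV-to-RGB conversion the vendor's format requires.

```glsl
#extension GL_OES_EGL_image_external : require
precision mediump float;

varying vec2 v_texCoord;
uniform samplerExternalOES s_texture;  // bound to the SurfaceTexture's texture id

void main()
{
  // The driver converts the vendor's YUV layout to RGB at sample time.
  gl_FragColor = texture2D(s_texture, v_texCoord);
}
```

On the Java side, bind a texture id with glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, texId), wrap it in a SurfaceTexture, and hand new Surface(surfaceTexture) to the camera session instead of an ImageReader surface; call updateTexImage() each frame before drawing.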

prideout
  • This has the added advantage that it is often also faster, because most mobile GPUs have native support for sampling YUV surfaces. – solidpixel Nov 13 '19 at 11:15
  • I was creating a camera preview delayed by 2 or 3 frames, by saving previous frames as byte[]. Let me try whether that is possible this way. For now I have solved the issue with a workaround that removes the stride from the byte[] before feeding it to the shader; conversion now takes around 2 - 6 ms. – Atul Vasudev A Nov 14 '19 at 06:09
  • Are there more than 2 types of YUV format when targeting devices above API 21? – Atul Vasudev A Nov 14 '19 at 06:13
  • There are *many* types of YUV format - both in terms of pixel layout and colour encoding (wide vs narrow gamut). Exactly what you get on any specific device is vendor dependent - but if you use `GL_TEXTURE_EXTERNAL_OES` you don't need to worry about it. – solidpixel Nov 17 '19 at 19:47