I used the basic CameraX example so I could use the ImageAnalysis routine and walk the complete bitmap pixel by pixel, detecting color similarity with a given base color. What I want is to display the preview in grayscale except for the chosen color. I get my bitmap with "Bitmap bMap = txView.getBitmap();" and, after going through the bitmap and changing each pixel, I send the bitmap to an ImageView called "img" using "img.setImageBitmap(bMap);". My TextureView is INVISIBLE and my ImageView sits on top of it to display the modified bitmap. If I ONLY get the bitmap and display it, I have no problem. But as soon as I add the nested loop to check each pixel, it just takes too long.
Here is my code:
private void startCamera() {
    // make sure there isn't another camera instance running before starting
    CameraX.unbindAll();

    /* start preview */
    int aspRatioW = 480; // txView.getWidth(); //get width of screen
    int aspRatioH = 640; // txView.getHeight(); //get height
    Rational asp = new Rational(aspRatioW, aspRatioH); // aspect ratio
    Size screen = new Size(aspRatioW, aspRatioH);      // size of the screen

    // config obj for preview/viewfinder
    PreviewConfig pConfig = new PreviewConfig.Builder()
            .setTargetAspectRatio(asp)
            .setTargetResolution(screen)
            .build();
    Preview preview = new Preview(pConfig); // let's build it

    preview.setOnPreviewOutputUpdateListener(
            new Preview.OnPreviewOutputUpdateListener() {
                // to update the surface texture we have to destroy it first, then re-add it
                @Override
                public void onUpdated(Preview.PreviewOutput output) {
                    ViewGroup parent = (ViewGroup) txView.getParent();
                    parent.removeView(txView);
                    parent.addView(txView, 0);
                    txView.setSurfaceTexture(output.getSurfaceTexture());
                    updateTransform();
                }
            });

    /* image capture */
    // config obj, selected capture mode
    ImageCaptureConfig imgCapConfig = new ImageCaptureConfig.Builder()
            .setCaptureMode(ImageCapture.CaptureMode.MIN_LATENCY)
            .setTargetRotation(getWindowManager().getDefaultDisplay().getRotation())
            .build();
    final ImageCapture imgCap = new ImageCapture(imgCapConfig);

    findViewById(R.id.capture_button).setOnClickListener(new View.OnClickListener() {
        @Override
        public void onClick(View v) {
            File file = new File(Environment.getExternalStorageDirectory() + "/" + System.currentTimeMillis() + ".jpg");
            imgCap.takePicture(file, new ImageCapture.OnImageSavedListener() {
                @Override
                public void onImageSaved(@NonNull File file) {
                    String msg = "Photo capture succeeded: " + file.getAbsolutePath();
                    Toast.makeText(getBaseContext(), msg, Toast.LENGTH_LONG).show();
                }

                @Override
                public void onError(@NonNull ImageCapture.UseCaseError useCaseError, @NonNull String message, @Nullable Throwable cause) {
                    String msg = "Photo capture failed: " + message;
                    Toast.makeText(getBaseContext(), msg, Toast.LENGTH_LONG).show();
                    if (cause != null) {
                        cause.printStackTrace();
                    }
                }
            });
        }
    });

    /* image analyser */
    ImageAnalysisConfig imgAConfig = new ImageAnalysisConfig.Builder()
            .setImageReaderMode(ImageAnalysis.ImageReaderMode.ACQUIRE_LATEST_IMAGE)
            .build();
    ImageAnalysis analysis = new ImageAnalysis(imgAConfig);

    analysis.setAnalyzer(
            new ImageAnalysis.Analyzer() {
                @Override
                public void analyze(ImageProxy image, int rotationDegrees) {
                    // https://www.codota.com/code/java/methods/android.media.Image/getPlanes
                    //ByteBuffer yBuffer = image.getPlanes()[0].getBuffer(); // planes[0].buffer // Y
                    //ByteBuffer uBuffer = image.getPlanes()[1].getBuffer(); // planes[1].buffer // U
                    //ByteBuffer vBuffer = image.getPlanes()[2].getBuffer(); // planes[2].buffer // V
                    //byte[] yBufferBytes = new byte[yBuffer.remaining()]; // .remaining() gives the size of the buffer
                    Bitmap bMap = txView.getBitmap();
                    if (bMap != null) {
                        int Hsize = bMap.getHeight();
                        int Wsize = bMap.getWidth();
                        int bmPixel, alpha, redValue, blueValue, greenValue;
                        int bt_alpha, bt_redValue, bt_blueValue, bt_greenValue;
                        bt_alpha = Color.alpha(mDefaultColor);
                        bt_redValue = Color.red(mDefaultColor);
                        bt_blueValue = Color.blue(mDefaultColor);
                        bt_greenValue = Color.green(mDefaultColor);
                        for (int h = 0; h < Hsize; h++) {
                            for (int w = 0; w < Wsize; w++) {
                                bmPixel = bMap.getPixel(w, h);
                                alpha = Color.alpha(bmPixel);
                                redValue = Color.red(bmPixel);
                                blueValue = Color.blue(bmPixel);
                                greenValue = Color.green(bmPixel);
                                // perceptually weighted distance to the chosen color
                                double d2 = Math.sqrt(0.3 * (bt_redValue - redValue) * (bt_redValue - redValue)
                                        + 0.59 * (bt_greenValue - greenValue) * (bt_greenValue - greenValue)
                                        + 0.11 * (bt_blueValue - blueValue) * (bt_blueValue - blueValue));
                                if (d2 > 10) {
                                    // too far from the chosen color: turn the pixel grey
                                    int bmGrey = (redValue + blueValue + greenValue) / 3;
                                    bMap.setPixel(w, h, Color.argb(alpha, bmGrey, bmGrey, bmGrey));
                                }
                            }
                        }
                        ImageView img = (ImageView) findViewById(R.id.image_Changed);
                        img.setImageBitmap(bMap);
                    }
                    image.close();
                    // https://stackoverflow.com/questions/36212904/yuv-420-888-interpretation-on-samsung-galaxy-s7-camera2
                    // y'all can add code to analyse stuff here idek go wild.
                }
            });

    // bind to lifecycle:
    CameraX.bindToLifecycle((LifecycleOwner) this, analysis, imgCap, preview);
}
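To double-check the color math off the device, I also ported the inner loop to plain Java so it works on an int[] of ARGB pixels (the kind you would get from Bitmap.getPixels()) instead of one getPixel()/setPixel() call per pixel. The bit-shift helpers stand in for android.graphics.Color, which is not available off-device, and the class/method names are just mine:

```java
public class SelectiveGray {
    // Stand-ins for android.graphics.Color (Android-only) on packed ARGB ints.
    static int alpha(int c) { return c >>> 24; }
    static int red(int c)   { return (c >> 16) & 0xFF; }
    static int green(int c) { return (c >> 8) & 0xFF; }
    static int blue(int c)  { return c & 0xFF; }
    static int argb(int a, int r, int g, int b) {
        return (a << 24) | (r << 16) | (g << 8) | b;
    }

    // Perceptually weighted distance between two ARGB colors (same 0.3/0.59/0.11 weights).
    static double distance(int c1, int c2) {
        int dr = red(c1) - red(c2);
        int dg = green(c1) - green(c2);
        int db = blue(c1) - blue(c2);
        return Math.sqrt(0.3 * dr * dr + 0.59 * dg * dg + 0.11 * db * db);
    }

    // Grays out, in place, every pixel farther than `threshold` from `keepColor`.
    static void filter(int[] pixels, int keepColor, double threshold) {
        for (int i = 0; i < pixels.length; i++) {
            int p = pixels[i];
            if (distance(p, keepColor) > threshold) {
                int grey = (red(p) + green(p) + blue(p)) / 3;
                pixels[i] = argb(alpha(p), grey, grey, grey);
            }
        }
    }
}
```

On the device the same loop would run on the array between bMap.getPixels(...) and bMap.setPixels(...), which should avoid one JNI round trip per pixel, but I have not profiled how much that actually saves.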
Any ideas on how to make it work faster?
PS: Sorry for having so many commented-out lines. I have been testing a lot and looking for help in many places.