We are working on a video streaming application. Currently we can capture video, process it frame by frame, and send/receive the data between two devices.
The problem is on the receiving side: since we send each frame as a fixed-size block of bytes, the receiver must also read exactly that many bytes, or it will not be able to reconstruct the frame.
I hope these diagrams clarify the problem.
Sender code:
public void process(@NonNull Frame frame) {
    byte[] data = frame.getData();
    frameWidth = frame.getSize().getWidth();
    frameHeight = frame.getSize().getHeight();
    // Compress the raw YUV frame to a JPEG at quality 25
    YuvImage yuv = new YuvImage(data, frame.getFormat(), frameWidth, frameHeight, null);
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    yuv.compressToJpeg(new Rect(0, 0, frameWidth, frameHeight), 25, out);
    final byte[] bytes = out.toByteArray();
    frameByteSize = bytes.length;
    OutputStream outputStream = StreamHandler.getOutputStream();
    if (outputStream != null) { // was mistakenly checking "out" instead of "outputStream"
        try {
            outputStream.write(bytes, 0, frameByteSize);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
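Note that each compressed JPEG frame actually has a different size, so the receiver cannot rely on one fixed frame size. One common fix (a sketch only; `FrameSender` and its method are hypothetical, not part of your `StreamHandler` API) is to prefix every frame with a 4-byte length header before writing the payload:

```java
import java.io.DataOutputStream;
import java.io.IOException;
import java.io.OutputStream;

public class FrameSender {
    // Writes a 4-byte big-endian length header followed by the JPEG payload,
    // so the receiver knows exactly how many bytes belong to this frame.
    public static void sendFrame(OutputStream out, byte[] jpegBytes) throws IOException {
        DataOutputStream dos = new DataOutputStream(out);
        dos.writeInt(jpegBytes.length); // frame length first
        dos.write(jpegBytes);           // then the frame bytes themselves
        dos.flush();
    }
}
```

With this scheme the sender no longer needs to share `frameByteSize` out of band; every frame carries its own size on the wire.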
Receiver code:
public void run() {
    int read = 0;
    int frameSize = StreamHandler.getFrameByteSize();
    try {
        Thread.sleep(10000); // give the sender time to start streaming
        InputStream inputStream = StreamHandler.getInputStream();
        if (inputStream != null) {
            BufferedInputStream bufferedInputStream = new BufferedInputStream(inputStream);
            while (microPhoneActive) {
                byte[] frame = new byte[frameSize];
                // read() may return fewer than frameSize bytes
                read = bufferedInputStream.read(frame, 0, frame.length);
                Bitmap bitmap = BitmapFactory.decodeByteArray(frame, 0, frame.length);
                if (bitmap != null) {
                    final Bitmap rotatedBitmap = rotateBitmap(bitmap, -90);
                    frameEvent.onNewFrame(rotatedBitmap);
                }
            }
        }
    } catch (IOException e) {
        e.printStackTrace();
    } catch (InterruptedException e) {
        e.printStackTrace();
    }
}
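On the receiving side, a matching sketch (again assuming a length-prefixed wire format; `FrameReceiver` is a hypothetical name) reads the 4-byte header first and then uses `DataInputStream.readFully`, which blocks until the whole frame has arrived, so partial reads can no longer split a frame:

```java
import java.io.DataInputStream;
import java.io.IOException;
import java.io.InputStream;

public class FrameReceiver {
    // Reads one length-prefixed frame: a 4-byte big-endian length,
    // then exactly that many payload bytes. readFully blocks until
    // the full frame is available or throws EOFException on a closed stream.
    public static byte[] readFrame(InputStream in) throws IOException {
        DataInputStream dis = new DataInputStream(in);
        int length = dis.readInt();   // how many bytes this frame has
        byte[] frame = new byte[length];
        dis.readFully(frame);         // blocks until all bytes arrive
        return frame;
    }
}
```

The returned byte array can then be passed straight to `BitmapFactory.decodeByteArray` without any guessing about frame boundaries.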
Currently the buffered stream is returning chunks of data with varying sizes, so the bitmap that gets built looks like the following:
The black part of the image has not been received yet, because the buffered read did not block and wait until the whole frame buffer was filled.
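This behavior is by contract: `InputStream.read(buf, 0, n)` returns as soon as any bytes are available, not after `n` bytes. If the wire format cannot be changed, a manual loop (a sketch; `ExactReader` is a hypothetical helper) can keep reading until the buffer is actually full:

```java
import java.io.EOFException;
import java.io.IOException;
import java.io.InputStream;

public class ExactReader {
    // Keeps calling read() until exactly buf.length bytes have been received;
    // a single read() call may legally return fewer bytes than requested.
    public static void readExact(InputStream in, byte[] buf) throws IOException {
        int offset = 0;
        while (offset < buf.length) {
            int n = in.read(buf, offset, buf.length - offset);
            if (n == -1) {
                throw new EOFException("stream closed before frame was complete");
            }
            offset += n;
        }
    }
}
```

Note that this only helps if every frame really is the same size; with JPEG-compressed frames of varying length, a per-frame length header is still the more robust option.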