
We are working on a video streaming application. Currently we are able to capture video, process it frame by frame, and send/receive the data between two devices.

The problem is on the receiving side: since we send frames with a fixed size, the receiver also has to process them with that same fixed size, otherwise it cannot reconstruct the frame.

I hope these diagrams clarify the problem.

Transmitting process: (diagram)

Code:

public void process(@NonNull Frame frame) {
    byte[] data = frame.getData();
    frameWidth = frame.getSize().getWidth();
    frameHieght = frame.getSize().getHeight();

    // Compress the raw YUV frame to a JPEG at quality 25
    YuvImage yuv = new YuvImage(data, frame.getFormat(), frameWidth, frameHieght, null);
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    yuv.compressToJpeg(new Rect(0, 0, frameWidth, frameHieght), 25, out);
    final byte[] bytes = out.toByteArray();
    frameByteSize = bytes.length;

    OutputStream outputStream = StreamHandler.getOutputStream();
    if (outputStream != null) {
        try {
            outputStream.write(bytes, 0, frameByteSize);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}

Receiving process: (diagram)

Code:

public void run() {
    int read = 0;
    int frameSize = StreamHandler.getFrameByteSize();
    try {
        Thread.sleep(10000);
        if (StreamHandler != null) {
            InputStream inputStream = StreamHandler.getInputStream();
            if (inputStream != null) {
                BufferedInputStream bufferedInputStream = new BufferedInputStream(inputStream);
                while (microPhoneActive) {
                    byte[] frame = new byte[frameSize];
                    // read() may return fewer bytes than frame.length
                    read = bufferedInputStream.read(frame, 0, frame.length);
                    Bitmap bitmap = BitmapFactory.decodeByteArray(frame, 0, frame.length);
                    if (bitmap != null) {
                        final Bitmap rotatedBitmap = rotateBitmap(bitmap, -90);
                        frameEvent.onNewFrame(rotatedBitmap);
                    }
                }
            }
        }
    } catch (IOException e) {
        e.printStackTrace();
    } catch (InterruptedException e) {
        e.printStackTrace();
    }
}

Currently the buffered reader reads chunks of data with different sizes, so the bitmap that gets built looks like this: (image of a partially received frame)

The black part of the image has not been received yet, because the buffered reader does not block and wait until the whole buffer has been filled.

  • You should collect the chunks until you have a complete frame. That bytes come in chunks is pretty normal using TCP/IP. – greenapps Feb 04 '18 at 08:03
  • You are doing nothing with the `read` integer. Start checking how many bytes are really read. – greenapps Feb 04 '18 at 08:07
  • @greenapps when I keep reading until I have a complete frame, in most cases I end up with a complete frame plus some bytes of the next frame, which corrupts the next frame and every frame after it. Is there a data structure that handles framing? As for the `read` value, by the time I have read the data and it is less than a frame, I have already corrupted the stream. – Ahmed na Feb 04 '18 at 08:09
  • You should not already read the bytes of the next frame, of course. You should only read the bytes of one frame first. Collect chunks until you have a frame. Use the link provided by Oleg. – greenapps Feb 04 '18 at 08:11
  • @greenapps that is exactly my problem right now. I know that the use of inputStream.available() is useless, and read is blocking but not for a fixed length. – Ahmed na Feb 04 '18 at 08:13
  • It is no problem. Everybody who does TCP/IP programming has to keep track of their 'frames'. You are #123456 who has to deal with it. Nothing special. – greenapps Feb 04 '18 at 08:15

2 Answers


"currently the buffered reader is reading chunks of data with different sizes"

Okay, what's wrong with that?

read doesn't always read all the data you asked for, for various reasons. You need to keep reading until you've read all the data you wanted to read.
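For example, such a loop keeps calling read() until the requested number of bytes has arrived. This is only a minimal sketch, assuming the expected frame length is already known on the receiving side; readExactly is an illustrative name, and DataInputStream.readFully() does essentially the same thing:

import java.io.IOException;
import java.io.InputStream;

// Keeps reading until exactly expectedLength bytes have been received,
// or throws if the stream ends before a full frame arrives.
static byte[] readExactly(InputStream in, int expectedLength) throws IOException {
    byte[] buffer = new byte[expectedLength];
    int offset = 0;
    while (offset < expectedLength) {
        int read = in.read(buffer, offset, expectedLength - offset);
        if (read == -1) {
            throw new IOException("Stream closed before a full frame was received");
        }
        offset += read;
    }
    return buffer;
}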


So the problem was the false assumption that the frame size is fixed.

The original logic was to take the size of the first frame provided by the camera, send it to the other side over a separate control channel, and then rely on that value for the rest of the stream. That would have worked if I were not applying lossy compression to the image:

yuv.compressToJpeg(new Rect(0, 0, frameWidth, frameHieght), 25, out);

which results in a different size for every frame. With the help of @Oleg's comment pointing to this post: How to read all of Inputstream in Server Socket JAVA,

the code now looks like this. Sending:

public void process(@NonNull Frame frame) {
    byte[] data = frame.getData();
    frameWidth = frame.getSize().getWidth();
    frameHieght = frame.getSize().getHeight();

    YuvImage yuv = new YuvImage(data, frame.getFormat(), frameWidth, frameHieght, null);
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    yuv.compressToJpeg(new Rect(0, 0, frameWidth, frameHieght), 25, out);
    final byte[] bytes = out.toByteArray();
    frameByteSize = bytes.length;

    if (StreamHandler != null) {
        OutputStream outputStream = StreamHandler.getOutputStream();
        if (outputStream != null) {
            try {
                // Prefix every frame with its length as two big-endian bytes,
                // so a single frame may be at most 65535 bytes.
                byte firstByte = (byte) ((bytes.length & 0xff00) >> 8);
                byte secondByte = (byte) (bytes.length & 0xFF);
                outputStream.write(firstByte);
                outputStream.write(secondByte);
                outputStream.write(bytes, 0, frameByteSize);
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
}

Receiving:

public void run() {
    try {
        if (StreamHandler != null) {
            InputStream inputStream = StreamHandler.getInputStream();
            if (inputStream != null) {
                DataInputStream dataInputStream = new DataInputStream(inputStream);
                byte[] frameInfo = new byte[2];
                while (microPhoneActive) {
                    // Read the two-byte length prefix first
                    frameInfo[0] = dataInputStream.readByte();
                    frameInfo[1] = dataInputStream.readByte();
                    ByteBuffer byteBuffer = ByteBuffer.wrap(frameInfo, 0, 2);
                    // Mask with 0xFFFF so sizes above 32767 are not read as negative
                    int bytesToRead = byteBuffer.getShort() & 0xFFFF;
                    byte[] frame = new byte[bytesToRead];
                    // readFully blocks until the whole frame has arrived
                    dataInputStream.readFully(frame, 0, bytesToRead);
                    Bitmap bitmap = BitmapFactory.decodeByteArray(frame, 0, frame.length);
                    if (bitmap != null) {
                        final Bitmap rotatedBitmap = rotateBitmap(bitmap, -90);
                        frameEvent.onNewFrame(rotatedBitmap);
                    }
                }
            }
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
}
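As a design note, the two-byte prefix caps a frame at 65535 bytes. If frames can grow larger, the same framing can be done with a four-byte prefix using DataOutputStream.writeInt and DataInputStream.readInt. A minimal sketch of that variant; sendFrame and receiveFrame are illustrative helper names, not part of the code above:

import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

// Writes a 4-byte big-endian length prefix followed by the frame bytes.
static void sendFrame(OutputStream out, byte[] frameBytes) throws IOException {
    DataOutputStream dataOut = new DataOutputStream(out);
    dataOut.writeInt(frameBytes.length);
    dataOut.write(frameBytes, 0, frameBytes.length);
    dataOut.flush();
}

// Reads the length prefix, then blocks until the whole frame has arrived.
static byte[] receiveFrame(InputStream in) throws IOException {
    DataInputStream dataIn = new DataInputStream(in);
    int frameLength = dataIn.readInt();
    byte[] frame = new byte[frameLength];
    dataIn.readFully(frame, 0, frameLength);
    return frame;
}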