
I want to display the streams of the front- and back-facing cameras of an iPad 2 in two UIViews next to each other. To stream the image of one device, I use the following code:

AVCaptureDeviceInput *captureInputFront = [AVCaptureDeviceInput deviceInputWithDevice:[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo] error:nil];

AVCaptureSession *session = [[AVCaptureSession alloc] init];
[session addInput:captureInputFront];
[session setSessionPreset:AVCaptureSessionPresetMedium];
[session startRunning];

AVCaptureVideoPreviewLayer *prevLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
prevLayer.frame = self.view.frame;
[self.view.layer addSublayer:prevLayer];

which works fine for either camera. To display the streams in parallel, I tried to create another session, but as soon as the second session starts running, the first one freezes.

Then I tried adding two AVCaptureDeviceInputs to a single session, but it seems that at most one input is supported at the moment.

Any helpful ideas on how to stream from both cameras?
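For clarity, the failing two-session attempt can be sketched like this (captureInputBack is an assumed second device input, obtained analogously to the front one):

```objc
// Sketch of the failing approach: one session per camera.
AVCaptureSession *frontSession = [[AVCaptureSession alloc] init];
[frontSession addInput:captureInputFront];
[frontSession startRunning];

AVCaptureSession *backSession = [[AVCaptureSession alloc] init];
[backSession addInput:captureInputBack]; // hypothetical back-camera input
[backSession startRunning]; // here the front preview freezes
```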

jfischer
  • possible duplicate of [How can I get autofocus to work in a second AVCaptureSession without recreating the sessions?](http://stackoverflow.com/questions/5427561/how-can-i-get-autofocus-to-work-in-a-second-avcapturesession-without-recreating) – Gabriele Petronella Oct 25 '13 at 15:50

1 Answer


It is possible to get CMSampleBufferRefs from multiple video devices on Mac OS X. You have to set up the AVCaptureConnection objects manually. For example, assume you have these objects:

AVCaptureSession *session;
AVCaptureInput *videoInput1;
AVCaptureInput *videoInput2;
AVCaptureVideoDataOutput *videoOutput1;
AVCaptureVideoDataOutput *videoOutput2;

Do NOT add the outputs like this:

[session addOutput:videoOutput1];
[session addOutput:videoOutput2];

Instead, add them and tell the session not to make any connections:

[session addOutputWithNoConnections:videoOutput1];
[session addOutputWithNoConnections:videoOutput2];

Then, for each input/output pair, make the connection from the input's video port to the output manually:

for (AVCaptureInputPort *port in [videoInput1 ports]) {
    if ([[port mediaType] isEqualToString:AVMediaTypeVideo]) {
        AVCaptureConnection* cxn = [AVCaptureConnection
            connectionWithInputPorts:[NSArray arrayWithObject:port]
            output:videoOutput1
        ];
        if ([session canAddConnection:cxn]) {
            [session addConnection:cxn];
        }
        break;
    }
}
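The same wiring is needed for the second pair (videoInput2 / videoOutput2). A small helper, sketched here as a hypothetical addition rather than part of the original answer, avoids repeating the loop:

```objc
// Hypothetical helper: connect the first video port of an input to an output.
static void ConnectVideoPair(AVCaptureSession *session,
                             AVCaptureInput *input,
                             AVCaptureVideoDataOutput *output)
{
    for (AVCaptureInputPort *port in [input ports]) {
        if ([[port mediaType] isEqualToString:AVMediaTypeVideo]) {
            AVCaptureConnection *cxn =
                [AVCaptureConnection connectionWithInputPorts:@[port]
                                                       output:output];
            if ([session canAddConnection:cxn]) {
                [session addConnection:cxn];
            }
            break;
        }
    }
}

// Usage:
// ConnectVideoPair(session, videoInput1, videoOutput1);
// ConnectVideoPair(session, videoInput2, videoOutput2);
```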

Finally, make sure to set sample buffer delegates for both outputs:

[videoOutput1 setSampleBufferDelegate:self queue:someDispatchQueue];
[videoOutput2 setSampleBufferDelegate:self queue:someDispatchQueue];

and now you should be able to process frames from both devices:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    if (captureOutput == videoOutput1)
    {
        // handle frames from first device
    }
    else if (captureOutput == videoOutput2)
    {
        // handle frames from second device
    }
}
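Putting the pieces together, the overall setup might look like the following sketch. Note the addInputWithNoConnections: calls, which one of the comments reports being necessary on some systems; the manual port-to-output wiring is elided here since it is shown in full above.

```objc
AVCaptureSession *session = [[AVCaptureSession alloc] init];

// Add inputs and outputs without letting the session form connections.
[session addInputWithNoConnections:videoInput1];
[session addInputWithNoConnections:videoInput2];
[session addOutputWithNoConnections:videoOutput1];
[session addOutputWithNoConnections:videoOutput2];

// ... connect each input's video port to its output manually ...

// Deliver sample buffers from both outputs to the same delegate.
[videoOutput1 setSampleBufferDelegate:self queue:someDispatchQueue];
[videoOutput2 setSampleBufferDelegate:self queue:someDispatchQueue];

[session startRunning];
```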

See also the AVVideoWall sample project for an example of combining live previews from multiple video devices.

Kevin Tonon
  • Thanks this worked for me with one addition. I also had to do: [session addInputWithNoConnections:videoInput1]; [session addInputWithNoConnections:videoInput2]; – Ken Aspeslagh Dec 09 '16 at 03:59
  • Doesn't work on iOS 10; adding the second input to the session fails: Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[AVCaptureSession addInputWithNoConnections:] Multiple audio/video AVCaptureInputs are not currently supported – Kartick Vaddadi Dec 18 '16 at 04:37