Questions tagged [cmsamplebufferref]
72 questions
26 votes · 5 answers
iOS - Scale and crop CMSampleBufferRef/CVImageBufferRef
I am using AVFoundation and getting the sample buffer from AVCaptureVideoDataOutput. I can write it directly to the videoWriter using:
- (void)writeBufferFrame:(CMSampleBufferRef)sampleBuffer {
CMTime lastSampleTime =…
vodkhang
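One common approach (a sketch, not necessarily any of the posted answers) is to route the frame through Core Image; the target-size `outputBuffer` and the reusable `CIContext` are assumed to be created elsewhere:

```objc
#import <AVFoundation/AVFoundation.h>
#import <CoreImage/CoreImage.h>

// Crop/scale a frame via Core Image. 'outputBuffer' must be a pre-created
// CVPixelBufferRef of the target size (creation omitted for brevity).
static void CropAndRender(CMSampleBufferRef sampleBuffer,
                          CGRect cropRect,
                          CVPixelBufferRef outputBuffer,
                          CIContext *context) {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *image = [CIImage imageWithCVPixelBuffer:imageBuffer];
    // Crop, then translate so the cropped region starts at the origin.
    image = [image imageByCroppingToRect:cropRect];
    image = [image imageByApplyingTransform:
                CGAffineTransformMakeTranslation(-cropRect.origin.x,
                                                 -cropRect.origin.y)];
    [context render:image toCVPixelBuffer:outputBuffer];
}
```

Working directly on the pixel bytes (as some answers do) avoids the Core Image dependency but ties the code to one pixel format.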
13 votes · 1 answer
Deep Copy of Audio CMSampleBuffer
I am trying to create a copy of a CMSampleBuffer as returned by captureOutput in an AVCaptureAudioDataOutputSampleBufferDelegate.
The problem I am having is that my frames coming from the delegate method…
Neil Galiaskarov
11 votes · 1 answer
How to convert CMSampleBufferRef to NSData
How do you convert CMSampleBufferRef to NSData?
I've managed to get the data for an MPMediaItem by following Erik Aigner's answer on this thread, however the data is of type CMSampleBufferRef.
I know CMSampleBufferRef is a struct and is defined in…
RyanM
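For sample buffers backed by a single block buffer, a minimal sketch (assuming the block buffer is contiguous; call `CMBlockBufferCreateContiguous` first if it may not be) looks like:

```objc
#import <CoreMedia/CoreMedia.h>
#import <Foundation/Foundation.h>

// Wrap the sample buffer's raw bytes in an NSData.
static NSData *DataFromSampleBuffer(CMSampleBufferRef sampleBuffer) {
    CMBlockBufferRef blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer);
    size_t totalLength = 0;
    char *dataPointer = NULL;
    OSStatus status = CMBlockBufferGetDataPointer(blockBuffer, 0, NULL,
                                                  &totalLength, &dataPointer);
    if (status != kCMBlockBufferNoErr) return nil;
    // Copies the bytes, so the NSData outlives the sample buffer.
    return [NSData dataWithBytes:dataPointer length:totalLength];
}
```

Note this captures only the payload bytes; timing and format description are lost.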
11 votes · 1 answer
How to get the current captured timestamp of Camera data from CMSampleBufferRef in iOS
I developed an iOS application which saves captured camera data into a file, and I used
(void) captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection…
Mr.G
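A minimal sketch of reading the capture timestamp inside the delegate callback; the presentation timestamp is on the capture session's clock:

```objc
#import <AVFoundation/AVFoundation.h>

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    // Presentation timestamp of this frame, relative to the session clock.
    CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    Float64 seconds = CMTimeGetSeconds(pts);
    NSLog(@"frame captured at %.6f s (session clock)", seconds);
}
```

Mapping that to wall-clock time requires relating the session clock to the host clock, which the snippet does not attempt.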
11 votes · 1 answer
Getting desired data from a CVPixelBuffer Reference
I have a program that views a camera input in real-time and gets the color value of the middle pixel. I use a captureOutput: method to grab the CMSampleBuffer from an AVCaptureSession output (which happens to be read as a CVPixelBuffer) and then I…
bbrownd
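A sketch of reading the center pixel, assuming the data output is configured for `kCVPixelFormatType_32BGRA` (4 bytes per pixel):

```objc
#import <CoreMedia/CoreMedia.h>
#import <CoreVideo/CoreVideo.h>
#import <Foundation/Foundation.h>

// Log the BGRA value of the middle pixel of a captured frame.
static void LogCenterPixel(CMSampleBufferRef sampleBuffer) {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
    size_t width = CVPixelBufferGetWidth(pixelBuffer);
    size_t height = CVPixelBufferGetHeight(pixelBuffer);
    // Rows may be padded, so always index with bytes-per-row, not width*4.
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
    uint8_t *base = CVPixelBufferGetBaseAddress(pixelBuffer);
    uint8_t *pixel = base + (height / 2) * bytesPerRow + (width / 2) * 4;
    NSLog(@"BGRA = %u %u %u %u", pixel[0], pixel[1], pixel[2], pixel[3]);
    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
}
```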
10 votes · 1 answer
error converting AudioBufferList to CMBlockBufferRef
I am trying to take a video file, read it in using AVAssetReader, and pass the audio off to Core Audio for processing (adding effects and stuff) before saving it back out to disk using AVAssetWriter. I would like to point out that if I set the…
odyth
9 votes · 1 answer
Using AVAssetWriter with raw NAL Units
I noticed in the iOS documentation for AVAssetWriterInput you can pass nil for the outputSettings dictionary to specify that the input data should not be re-encoded.
The settings used for encoding the media appended to the output. Pass nil to…
bsirang
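A minimal sketch of the pass-through setup the documentation describes; packaging raw NALUs into CMSampleBuffers with a proper format description (not shown) is the hard part:

```objc
#import <AVFoundation/AVFoundation.h>

// Pass-through input: nil outputSettings means "do not re-encode".
// Appended CMSampleBuffers must already carry a suitable
// CMVideoFormatDescription (e.g. one built with
// CMVideoFormatDescriptionCreateFromH264ParameterSets for H.264).
AVAssetWriterInput *input =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:nil];
input.expectsMediaDataInRealTime = YES;
```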
9 votes · 2 answers
How to set timestamp of CMSampleBuffer for AVWriter writing
I'm working with AVFoundation for capturing and recording audio. There are some issues I don't quite understand.
Basically I want to capture audio from AVCaptureSession and write it using AVWriter; however, I need some shifting in the timestamp of…
night_coder
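A sketch of one way to shift timestamps, using `CMSampleBufferCreateCopyWithNewTiming`; the offset itself is an assumption, computed elsewhere:

```objc
#import <CoreMedia/CoreMedia.h>
#include <stdlib.h>

// Return a retimed copy of the buffer with all timestamps shifted back
// by 'offset'. Caller releases the returned buffer.
static CMSampleBufferRef CreateOffsetBuffer(CMSampleBufferRef sampleBuffer,
                                            CMTime offset) {
    CMItemCount count = 0;
    CMSampleBufferGetSampleTimingInfoArray(sampleBuffer, 0, NULL, &count);
    CMSampleTimingInfo *timing = malloc(sizeof(CMSampleTimingInfo) * count);
    CMSampleBufferGetSampleTimingInfoArray(sampleBuffer, count, timing, &count);
    for (CMItemCount i = 0; i < count; i++) {
        timing[i].presentationTimeStamp =
            CMTimeSubtract(timing[i].presentationTimeStamp, offset);
        // An invalid decodeTimeStamp stays invalid, which the writer accepts.
        timing[i].decodeTimeStamp =
            CMTimeSubtract(timing[i].decodeTimeStamp, offset);
    }
    CMSampleBufferRef adjusted = NULL;
    CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault, sampleBuffer,
                                          count, timing, &adjusted);
    free(timing);
    return adjusted;
}
```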
6 votes · 2 answers
CMSampleBufferRef kCMSampleBufferAttachmentKey_TrimDurationAtStart crash
This has been bothering me for a while. I have a video converter that converts video into the “.mp4” format, but there is a crash that happens on some videos and not others.
Here is the crash log:
*** Terminating app due to uncaught exception…
Xu Yin
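For context, this is the attachment named in the crash; a sketch of setting it explicitly on the first audio buffer (the 1024/44100 priming duration is an illustrative assumption, not taken from the question):

```objc
#import <CoreMedia/CoreMedia.h>

// Attach a trim duration (e.g. AAC encoder priming) to a sample buffer.
// CMTimeCopyAsDictionary wraps the CMTime for use as an attachment value.
CFDictionaryRef trim = CMTimeCopyAsDictionary(CMTimeMake(1024, 44100),
                                              kCFAllocatorDefault);
CMSetAttachment(sampleBuffer,
                kCMSampleBufferAttachmentKey_TrimDurationAtStart,
                trim, kCMAttachmentMode_ShouldPropagate);
CFRelease(trim);
```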
6 votes · 0 answers
Retaining CMSampleBufferRef cause random crashes
I'm using captureOutput:didOutputSampleBuffer:fromConnection: in order to keep track of the frames. For my use case, I only need to store the last frame and use it in case the app goes to the background.
That's a sample from my code:
@property…
Rizon
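A sketch of the retain/release discipline such a property needs; the capture pipeline recycles a small pool of buffers, so holding one for long can stall capture, and copying the pixel data out is the safer alternative:

```objc
#import <AVFoundation/AVFoundation.h>

// The buffer passed to the delegate is only valid for the callback
// unless explicitly retained; release any previously kept one.
@property (nonatomic, assign) CMSampleBufferRef lastSampleBuffer;

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    if (self.lastSampleBuffer) {
        CFRelease(self.lastSampleBuffer);
    }
    self.lastSampleBuffer = (CMSampleBufferRef)CFRetain(sampleBuffer);
}
```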
5 votes · 0 answers
Render failed because a pixel format YCC420f is not supported
I'm trying to convert a CVPixelBufferRef into a UIImage using the following snippet:
UIImage *image = nil;
CMSampleBufferRef sampleBuffer = (CMSampleBufferRef)CMBufferQueueDequeueAndRetain(_queue);
if (sampleBuffer)
{
CVPixelBufferRef…
javidecas
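One route that avoids rendering the unsupported format directly is to go through Core Image (a sketch; the `CIContext` is assumed to be created once and reused, and requesting BGRA output from the capture output sidesteps the issue entirely):

```objc
#import <UIKit/UIKit.h>
#import <CoreImage/CoreImage.h>

// Convert a CVPixelBufferRef to a UIImage via Core Image, which handles
// biplanar YCbCr formats such as YCC420f internally.
static UIImage *ImageFromPixelBuffer(CVPixelBufferRef pixelBuffer,
                                     CIContext *context) {
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    CGImageRef cgImage =
        [context createCGImage:ciImage fromRect:ciImage.extent];
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return image;
}
```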
5 votes · 1 answer
Why AVSampleBufferDisplayLayer stops showing CMSampleBuffers taken from AVCaptureVideoDataOutput's delegate?
I want to display some CMSampleBuffers with the AVSampleBufferDisplayLayer, but it freezes after showing the first sample.
I get the sample buffers from the AVCaptureVideoDataOutputSampleBuffer delegate:
-(void)captureOutput:(AVCaptureOutput…
didReceiveMemoryWarning
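One commonly discussed workaround (a sketch, not necessarily the cause here) is to mark each buffer for immediate display, so the layer does not wait on a timebase it was never given:

```objc
#import <AVFoundation/AVFoundation.h>

// Flag the buffer so the layer displays it as soon as it is enqueued,
// ignoring its presentation timestamp.
CFArrayRef attachments =
    CMSampleBufferGetSampleAttachmentsArray(sampleBuffer, true);
CFMutableDictionaryRef dict =
    (CFMutableDictionaryRef)CFArrayGetValueAtIndex(attachments, 0);
CFDictionarySetValue(dict, kCMSampleAttachmentKey_DisplayImmediately,
                     kCFBooleanTrue);
[self.displayLayer enqueueSampleBuffer:sampleBuffer];
```

Checking the layer's `status` for `AVQueuedSampleBufferRenderingStatusFailed` and flushing it is the other usual diagnostic step.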
5 votes · 1 answer
Crop CMSampleBufferRef
I am trying to crop the image in a CMSampleBufferRef to a specific size. I am taking 5 steps: 1. Getting the PixelBuffer from the SampleBuffer; 2. Converting the PixelBuffer to a CIImage; 3. Cropping the CIImage; 4. Rendering the CIImage back to a PixelBuffer; 5. Attaching…
Laz
4 votes · 1 answer
iOS - Automatically resize CVPixelBufferRef
I am trying to crop and scale a CMSampleBufferRef based on the user's input ratio. The code below takes a CMSampleBufferRef, converts it into a CVImageBufferRef, and uses CVPixelBuffer to crop the internal image based on its bytes. The goal of this…
vodkhang
4 votes · 2 answers
Saving CMSampleBufferRef for later processing
I am trying to use AVFoundation framework to capture a 'series' of still images from AVCaptureStillImageOutput QUICKLY, like the burst mode in some cameras. I want to use the completion handler,
[stillImageOutput…
NSRover