I can't answer the specific question asked, but I've been successfully recording video and grabbing frames at the same time using:

- AVCaptureSession and AVCaptureVideoDataOutput to route frames into my own code
- AVAssetWriter, AVAssetWriterInput and AVAssetWriterInputPixelBufferAdaptor to write the frames out to an H.264-encoded movie file
That's without having investigated audio, though. I end up getting CMSampleBuffers out of the capture session and then pushing them into the pixel buffer adaptor.
EDIT: so my code comes out more or less like the following, with the bits you're not having problems with skimmed over and scope issues ignored:
/* to ensure I'm given incoming CMSampleBuffers */
AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
captureSession.sessionPreset = AVCaptureSessionPreset640x480; // or your preferred preset
AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:captureDevice error:nil];
[captureSession addInput:deviceInput];
// output 32BGRA pixel buffers, with me as the delegate on a suitable serial queue
AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
output.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                                   forKey:(id)kCVPixelBufferPixelFormatTypeKey];
[output setSampleBufferDelegate:self queue:dispatch_queue_create("video capture queue", NULL)];
[captureSession addOutput:output];
/* to prepare for output; I'll output 640x480 in H.264, via an asset writer */
NSDictionary *outputSettings =
    [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithInt:640], AVVideoWidthKey,
        [NSNumber numberWithInt:480], AVVideoHeightKey,
        AVVideoCodecH264, AVVideoCodecKey,
        nil];
AVAssetWriterInput *assetWriterInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:outputSettings];
/* I'm going to push pixel buffers to it, so will need an
   AVAssetWriterInputPixelBufferAdaptor, to expect the same 32BGRA input as I've
   asked the AVCaptureVideoDataOutput to supply */
AVAssetWriterInputPixelBufferAdaptor *pixelBufferAdaptor =
    [[AVAssetWriterInputPixelBufferAdaptor alloc]
        initWithAssetWriterInput:assetWriterInput
     sourcePixelBufferAttributes:
        [NSDictionary dictionaryWithObjectsAndKeys:
            [NSNumber numberWithInt:kCVPixelFormatType_32BGRA],
            kCVPixelBufferPixelFormatTypeKey,
            nil]];
/* that's going to go somewhere, I imagine you've got the URL for that sorted,
   so create a suitable asset writer; we'll put our H.264 within the normal
   MPEG4 container */
NSError *error = nil;
AVAssetWriter *assetWriter =
    [[AVAssetWriter alloc] initWithURL:URLFromSomewhere
                              fileType:AVFileTypeMPEG4
                                 error:&error];
// you need to check error here; this example is too lazy to do so
[assetWriter addInput:assetWriterInput];
/* we need to warn the input to expect real time data incoming, so that it tries
to avoid being unavailable at inopportune moments */
assetWriterInput.expectsMediaDataInRealTime = YES;
... eventually ...
[assetWriter startWriting];
[assetWriter startSessionAtSourceTime:kCMTimeZero];
[captureSession startRunning];
... elsewhere ...
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // a very dense way to keep track of the time at which this frame
    // occurs relative to the output stream, but it's just an example!
    static int64_t frameNumber = 0;
    if (assetWriterInput.readyForMoreMediaData)
        [pixelBufferAdaptor appendPixelBuffer:imageBuffer
                         withPresentationTime:CMTimeMake(frameNumber, 25)];
    frameNumber++;
}
... and, to stop, ensuring the output file is finished properly ...
[captureSession stopRunning];
[assetWriter finishWriting];
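Note that -finishWriting blocks until writing completes and was deprecated in iOS 6; if you can target iOS 6 or later, a minimal sketch of the asynchronous replacement:

[captureSession stopRunning];
[assetWriter finishWritingWithCompletionHandler:^{
    // invoked once the file has been completely written (or writing failed)
    if (assetWriter.status != AVAssetWriterStatusCompleted)
        NSLog(@"writing failed: %@", assetWriter.error);
}];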
_ "Es posible capturar vídeo directamente a presentar con AVCaptureMovieFileOutput Sin embargo, esta clase no tiene datos de visualización-poder y no puede ** ** puede utilizar simultáneamente con AVCaptureVideoDataOutput.." _ Encontrado aquí: [link] (https: // developer.xamarin.com/api/type/MonoTouch.AVFoundation.AVCaptureSession/) ... solo para aclarar la causa real del problema – Csharpest