2012-06-30

Performance problems when using AVCaptureVideoDataOutput and AVCaptureAudioDataOutput

I am having latency problems when recording audio and video using AVCaptureVideoDataOutput and AVCaptureAudioDataOutput. Sometimes the video freezes for a few milliseconds, and sometimes the audio is out of sync with the video.

I added some logging and noticed that I first get many video buffers in the captureOutput callback, and only after a while do I get the audio buffers (sometimes I never receive the audio buffers at all, and the resulting video has no sound). If I comment out the code that handles the video buffers, I get the audio buffers without any problem.

This is the code I am using:

-(void)initMovieOutput:(AVCaptureSession *)captureSessionLocal 
{ 
    AVCaptureVideoDataOutput *dataOutput = [[AVCaptureVideoDataOutput alloc] init]; 
    self._videoOutput = dataOutput; 
    [dataOutput release]; 

    self._videoOutput.alwaysDiscardsLateVideoFrames = NO; 
    self._videoOutput.videoSettings = [NSDictionary dictionaryWithObject: [NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange] 
                  forKey:(id)kCVPixelBufferPixelFormatTypeKey 
            ];  
    AVCaptureAudioDataOutput *audioOutput = [[AVCaptureAudioDataOutput alloc] init]; 
    self._audioOutput = audioOutput; 
    [audioOutput release]; 

    [captureSessionLocal addOutput:self._videoOutput]; 
    [captureSessionLocal addOutput:self._audioOutput]; 


    // Setup the queue 
    dispatch_queue_t queue = dispatch_queue_create("MyQueue", NULL); 
    [self._videoOutput setSampleBufferDelegate:self queue:queue]; 
    [self._audioOutput setSampleBufferDelegate:self queue:queue]; 
    dispatch_release(queue); 
} 

And here is where I set up the writer:

-(BOOL) setupWriter:(NSURL *)videoURL session:(AVCaptureSession *)captureSessionLocal 
{ 
    NSError *error = nil; 
    self._videoWriter = [[AVAssetWriter alloc] initWithURL:videoURL fileType:AVFileTypeQuickTimeMovie 
                 error:&error]; 
    NSParameterAssert(self._videoWriter); 


    // Add video input 
    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys: 
            AVVideoCodecH264, AVVideoCodecKey, 
            [NSNumber numberWithInt:640], AVVideoWidthKey, 
            [NSNumber numberWithInt:480], AVVideoHeightKey, 
            nil]; 

    self._videoWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo 
                     outputSettings:videoSettings]; 


    NSParameterAssert(self._videoWriterInput); 
    self._videoWriterInput.expectsMediaDataInRealTime = YES; 
    self._videoWriterInput.transform = [self returnOrientation]; 

    // Add the audio input 
    AudioChannelLayout acl; 
    bzero(&acl, sizeof(acl)); 
    acl.mChannelLayoutTag = kAudioChannelLayoutTag_Mono; 


    NSDictionary* audioOutputSettings = nil;   
    // Both types of audio settings cause the output video file to be corrupted. 

     // Apple Lossless: should work on any device, but requires more space. 
     audioOutputSettings = [ NSDictionary dictionaryWithObjectsAndKeys:      
           [ NSNumber numberWithInt: kAudioFormatAppleLossless ], AVFormatIDKey, 
           [ NSNumber numberWithInt: 16 ], AVEncoderBitDepthHintKey, 
           [ NSNumber numberWithFloat: 44100.0 ], AVSampleRateKey, 
           [ NSNumber numberWithInt: 1 ], AVNumberOfChannelsKey,          
           [ NSData dataWithBytes: &acl length: sizeof(acl) ], AVChannelLayoutKey, 
           nil ]; 

    self._audioWriterInput = [AVAssetWriterInput 
             assetWriterInputWithMediaType: AVMediaTypeAudio 
             outputSettings: audioOutputSettings ]; 

    self._audioWriterInput.expectsMediaDataInRealTime = YES;  

    // add input 
    [self._videoWriter addInput:_videoWriterInput]; 
    [self._videoWriter addInput:_audioWriterInput]; 

    return YES; 
} 

And here is the callback:

- (void)captureOutput:(AVCaptureOutput *)captureOutput 
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer 
     fromConnection:(AVCaptureConnection *)connection 
{ 

    if(!CMSampleBufferDataIsReady(sampleBuffer)) 
    { 
     NSLog(@"sample buffer is not ready. Skipping sample"); 
     return; 
    } 
    if(_videoWriter.status != AVAssetWriterStatusCompleted) 
    { 
     if(_videoWriter.status != AVAssetWriterStatusWriting ) 
     {    
      CMTime lastSampleTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer); 
      [_videoWriter startWriting]; 
      [_videoWriter startSessionAtSourceTime:lastSampleTime]; 
     } 

     if(captureOutput == _videoOutput) 
     { 
      if([self._videoWriterInput isReadyForMoreMediaData]) 
      { 

      [self newVideoSample:sampleBuffer]; 

      } 
     } 
     else if(captureOutput == _audioOutput) 
     { 
      if([self._audioWriterInput isReadyForMoreMediaData]) 
      { 

       [self newAudioSample:sampleBuffer]; 


      } 
     } 
    } 

} 

-(void) newAudioSample:(CMSampleBufferRef)sampleBuffer 
{ 

     if(_videoWriter.status > AVAssetWriterStatusWriting) 
     { 

      [self NSLogPrint:[NSString stringWithFormat:@"Audio:Warning: writer status is %d", _videoWriter.status]]; 
      if(_videoWriter.status == AVAssetWriterStatusFailed) 
       [self NSLogPrint:[NSString stringWithFormat:@"Audio:Error: %@", _videoWriter.error]]; 
      return; 
     } 

     if(![_audioWriterInput appendSampleBuffer:sampleBuffer]) 
      [self NSLogPrint:[NSString stringWithFormat:@"Unable to write to audio input"]]; 

} 

-(void) newVideoSample:(CMSampleBufferRef)sampleBuffer 
{ 
    if(_videoWriter.status > AVAssetWriterStatusWriting) 
    { 
     [self NSLogPrint:[NSString stringWithFormat:@"Video:Warning: writer status is %d", _videoWriter.status]]; 
     if(_videoWriter.status == AVAssetWriterStatusFailed) 
      [self NSLogPrint:[NSString stringWithFormat:@"Video:Error: %@", _videoWriter.error]]; 
     return; 
    } 


    if(![_videoWriterInput appendSampleBuffer:sampleBuffer]) 
     [self NSLogPrint:[NSString stringWithFormat:@"Unable to write to video input"]]; 
} 

Is there something wrong with my code? Why is the video lagging? (I am testing this on an iPhone 4 with iOS 4.2.1.)

Answer

It looks like you are using serial queues. The audio output queue is serviced right after the video output queue. Consider using concurrent queues.
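For illustration, here is a sketch of what separating the queues might look like in the asker's `initMovieOutput:` method. Note that Apple's documentation requires the queue passed to `setSampleBufferDelegate:queue:` to be a serial queue, so the usual approach is one serial queue per output rather than a single shared queue or a concurrent one; the queue labels below are placeholders:

```objectivec
// Sketch: give each output its own serial queue so that slow video
// processing does not starve the audio callbacks.
// (The delegate queue must be serial per the AVFoundation docs;
// a DISPATCH_QUEUE_CONCURRENT queue is not supported here.)
dispatch_queue_t videoQueue = dispatch_queue_create("com.example.videoQueue", DISPATCH_QUEUE_SERIAL);
dispatch_queue_t audioQueue = dispatch_queue_create("com.example.audioQueue", DISPATCH_QUEUE_SERIAL);

[self._videoOutput setSampleBufferDelegate:self queue:videoQueue];
[self._audioOutput setSampleBufferDelegate:self queue:audioQueue];

// Under MRC (as in the asker's code) the outputs retain their queues.
dispatch_release(videoQueue);
dispatch_release(audioQueue);
```

With two queues, the callbacks for the two outputs can run concurrently with each other, so any shared state touched in `captureOutput:didOutputSampleBuffer:fromConnection:` (in particular, the check-and-start of the AVAssetWriter session) would need to be protected against races.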


Thanks. This was a big help. – Liron


I know this answer is old, but can you give an example of how to do it? I tried separate (new) serial queues, which did not work, and I tried setting up a queue with DISPATCH_QUEUE_CONCURRENT, which did not help either. –


To elaborate on my last comment: when I use two separate queues, my asset writer fails. –
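A writer failure with two queues is consistent with a race in the writer-start logic from the question: both callbacks can reach the status check before either has called `startWriting`. One hedged sketch of guarding that section (the `@synchronized(self)` lock token is an illustrative assumption, not from the thread):

```objectivec
// Only one callback thread may start the writer and its session;
// guard the check-and-start so the second thread sees the updated status.
@synchronized(self)
{
    if (_videoWriter.status != AVAssetWriterStatusWriting &&
        _videoWriter.status != AVAssetWriterStatusCompleted)
    {
        CMTime startTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
        [_videoWriter startWriting];
        [_videoWriter startSessionAtSourceTime:startTime];
    }
}
```

Any other state shared between the audio and video callbacks (counters, flags, the writer inputs' ready checks) would need similar protection.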
