How do you correctly process iOS camera frames?

  • As I understand it, to turn the camera's output into a picture, we need this method:

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection

    I also learned that it is available only through a delegate, on a separate queue, and that part I don't understand. For example, I don't know where to put it in a project so that I can experiment with the image processing. In a number of projects on GitHub I saw this method simply declared in the header (.h) file, without an implementation (and in general, an implementation does not belong in a header).

    Question: How is this method used correctly? In which file should it be written, and how is it hooked up? If there is a small example project online where this method works, please share a link.
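    For what it's worth, the usual split is: the .h file only declares that the class conforms to the delegate protocol, while the method body lives in the .m file (the system calls it for you, so you never declare the method itself in the header). A minimal sketch; the class and property names here are assumptions, not taken from the question:

    ```objc
    // CameraViewController.h — declaration only, no implementation here.
    #import <UIKit/UIKit.h>
    #import <AVFoundation/AVFoundation.h>

    // Conforming to this protocol is what lets the capture output
    // deliver frames to this class.
    @interface CameraViewController : UIViewController
        <AVCaptureVideoDataOutputSampleBufferDelegate>

    @property (nonatomic, strong) UIImageView *imageView;

    @end
    ```

    The implementation of `captureOutput:didOutputSampleBuffer:fromConnection:` then goes into CameraViewController.m.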

  • What's wrong with Apple's sample code? Remove the extra release calls and drop the obsolete line output.minFrameDuration = CMTimeMake(1, 15);

     -(void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        // This method is called every time the camera writes data into the buffer.

        // Convert the received buffer into a picture:
        UIImage *image = [self imageFromSampleBuffer:sampleBuffer];

        // Apply some effect to the image (a filter, for example)
        // and put it on screen:
        self.imageView.image = [self applyMonoChromeWithRedColor:image];
    }
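    Since output.minFrameDuration is obsolete, the frame rate is nowadays configured on the capture device itself (available since iOS 7). A hedged sketch, assuming `device` is the AVCaptureDevice you added to the session:

    ```objc
    // Modern replacement for the obsolete output.minFrameDuration line.
    NSError *error = nil;
    if ([device lockForConfiguration:&error]) {
        // Min and max duration of 1/15 s pins the camera to ~15 fps.
        device.activeVideoMinFrameDuration = CMTimeMake(1, 15);
        device.activeVideoMaxFrameDuration = CMTimeMake(1, 15);
        [device unlockForConfiguration];
    }
    ```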


    When initializing the session, you declare the delegate protocol in your class and implement this method there. The session essentially connects two ends: to the input you add the camera, and to the output you attach your processing class as the delegate.
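    The wiring described above can be sketched roughly like this. A minimal, hedged example: the method name, the queue label, and the `session` property are my assumptions, not from the original answer:

    ```objc
    #import <AVFoundation/AVFoundation.h>

    - (void)setupCaptureSession {
        AVCaptureSession *session = [[AVCaptureSession alloc] init];
        session.sessionPreset = AVCaptureSessionPresetMedium;

        // Input end: the camera.
        AVCaptureDevice *camera =
            [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        NSError *error = nil;
        AVCaptureDeviceInput *input =
            [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
        if (input) [session addInput:input];

        // Output end: our class receives each frame through the delegate method.
        AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
        output.videoSettings =
            @{ (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
        dispatch_queue_t queue =
            dispatch_queue_create("sample.buffer.queue", DISPATCH_QUEUE_SERIAL);
        [output setSampleBufferDelegate:self queue:queue];
        [session addOutput:output];

        [session startRunning];
        self.session = session; // keep a strong reference (assumed property)
    }
    ```

    Note that the delegate method is called on the queue you pass to setSampleBufferDelegate:queue:, so any UIKit work (like setting self.imageView.image) should be dispatched back to the main queue.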
