iOS Still Image Capture Using AVCaptureSession

I had a request to show how to capture a still image of the live video feed in the AROverlayExample project. This is probably the simplest possible use of the output of an AVCaptureSession, so I have created a new project, based on AROverlayExample, that uses the scan button to capture an image and save it to your device’s photo album.

You can get the source code for AROverlayImageCapture here.

Here are the instructions, starting from the AROverlayExample project. If you haven’t already, it might be a good idea to read through my last post to understand how we got to where we are. I am using Xcode 4.0.

Add Frameworks

The first thing we need to do is add a few frameworks to the project. Click on your project in the Groups & Files pane, select your target on the right, and then choose the Build Phases tab. Expand Link Binary With Libraries, click the plus sign, and add the ImageIO, CoreMedia and CoreVideo frameworks to your target.

Modify CaptureManager

In your CaptureManager.h file, add the following #define at the top, and then these two properties and public method declarations:


#define kImageCapturedSuccessfully @"imageCapturedSuccessfully"
.
.
.
@property (retain) AVCaptureStillImageOutput *stillImageOutput;
@property (nonatomic, retain) UIImage *stillImage;

- (void)addStillImageOutput;
- (void)captureStillImage;

The first property is an AVCaptureStillImageOutput which, as its verbose name implies, is needed to capture a still image. The UIImage property will keep a reference to the image once we have captured it.

At the top of CaptureManager.m import the ImageIO framework with:

#import <ImageIO/ImageIO.h>

I’m not going to show it here (though it is in the source code), but remember to synthesize the two new properties and to release and nil them in the dealloc method. Then add the following two methods:

- (void)addStillImageOutput 
{
  [self setStillImageOutput:[[[AVCaptureStillImageOutput alloc] init] autorelease]];
  NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG,AVVideoCodecKey,nil];
  [[self stillImageOutput] setOutputSettings:outputSettings];
  [outputSettings release];
  
  [[self captureSession] addOutput:[self stillImageOutput]];
}

- (void)captureStillImage
{  
  AVCaptureConnection *videoConnection = nil;
  for (AVCaptureConnection *connection in [[self stillImageOutput] connections]) {
    for (AVCaptureInputPort *port in [connection inputPorts]) {
      if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
        videoConnection = connection;
        break;
      }
    }
    if (videoConnection) { 
      break; 
    }
  }
  
	NSLog(@"about to request a capture from: %@", [self stillImageOutput]);
	[[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:videoConnection 
                                                       completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) { 
                                                         CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
                                                         if (exifAttachments) {
                                                           NSLog(@"attachments: %@", exifAttachments);
                                                         } else { 
                                                           NSLog(@"no attachments");
                                                         }
                                                         NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];    
                                                         UIImage *image = [[UIImage alloc] initWithData:imageData];
                                                         [self setStillImage:image];
                                                         [image release];
                                                         [[NSNotificationCenter defaultCenter] postNotificationName:kImageCapturedSuccessfully object:nil];
                                                       }];
}

The first method just prepares the stillImageOutput and lets you specify the output settings. The JPEG settings I have used work fine for our sample app, but you may want to explore the other options; I haven’t really looked into them.
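If you do want to experiment, AVCaptureStillImageOutput can also deliver uncompressed pixel buffers instead of JPEG. A sketch (my own assumption, not something the sample project uses):

```objc
// Alternative output settings (a sketch, not used in the sample project):
// request uncompressed 32-bit BGRA frames instead of JPEG.
NSDictionary *bgraSettings = [[NSDictionary alloc] initWithObjectsAndKeys:
    [NSNumber numberWithInt:kCVPixelFormatType_32BGRA],
    (id)kCVPixelBufferPixelFormatTypeKey, nil];
[[self stillImageOutput] setOutputSettings:bgraSettings];
[bgraSettings release];
```

With these settings the completion handler receives a pixel buffer rather than JPEG data, so jpegStillImageNSDataRepresentation: would no longer apply.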

The second method captures a still image from the live video feed and stores it in the stillImage property we set up earlier. It then posts a notification that the image has been captured, which we will listen for in our AROverlayViewController, coming up next.

Modify AROverlayViewController

Now we need to add a few things to AROverlayViewController.m. The first thing to do is declare a new private method at the top of the file, like so:

@interface AROverlayViewController ()
- (void)image:(UIImage *)image didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo;
@end

Then add the following two methods to the file:

- (void)saveImageToPhotoAlbum 
{
  UIImageWriteToSavedPhotosAlbum([[self captureManager] stillImage], self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
}

- (void)image:(UIImage *)image didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo
{
  if (error) {
    UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Error!" message:@"Image couldn't be saved" delegate:self cancelButtonTitle:@"Ok" otherButtonTitles:nil];
    [alert show];
    [alert release];
  }
  else {
    [[self scanningLabel] setHidden:YES];
  }
}

The first method simply saves the captured image to your photo album, and the second is the callback invoked when the save completes. I have modified the project so that while the image is being saved the scanning label says “Saving…”. Once the save is complete the label is hidden. There is some basic error handling as well.

Now we need to wire all of these things together. First, let’s make sure we initialize the stillImageOutput by calling

[[self captureManager] addStillImageOutput];

in our viewDidLoad. Also add an observer for the notification that will be sent when the image has been captured with:

[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(saveImageToPhotoAlbum) name:kImageCapturedSuccessfully object:nil];
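Put together, the wiring in viewDidLoad might look like this (a sketch; the full method is in the project source):

```objc
// Sketch of the viewDidLoad wiring described above; the "...existing
// setup..." comment stands in for the AROverlayExample code already there.
- (void)viewDidLoad {
  [super viewDidLoad];
  // ...existing capture session / preview layer setup from AROverlayExample...
  [[self captureManager] addStillImageOutput];
  [[NSNotificationCenter defaultCenter] addObserver:self
                                           selector:@selector(saveImageToPhotoAlbum)
                                               name:kImageCapturedSuccessfully
                                             object:nil];
}
```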

Finally change your button method to be:

- (void)scanButtonPressed {
  [[self scanningLabel] setHidden:NO];
  [[self captureManager] captureStillImage];
}

You should now be able to run the app and when you tap the scan button the live video feed will be captured and saved to your photo album. Let me know if you have any issues with this implementation.


About jjob

I'm a technologist and a music producer. I make sound, write code and build things with electronics and microcontrollers.
This entry was posted in AR, code, iOS, iPad, iPhone. Bookmark the permalink.

114 Responses to iOS Still Image Capture Using AVCaptureSession

  1. Bill Brasky says:

    Best image capture tutorial on the web

  2. Tate Allen says:

    This is amazing! Thanks for the great work!

  3. MT says:

    Very good tutorial! I have looked for one for almost a week and found nothing! But thanks to you, I have now found the answer to my questions. I wonder if you have a suggestion: is it possible to capture a picture every 2 seconds, without the scan button? And is it free to use a bit of your code in my own project? What license?
    Thanks very much!

    • jjob says:

      You could capture a picture every few seconds using an NSTimer. Something like:

      [NSTimer scheduledTimerWithTimeInterval:2.0 target:self selector:@selector(takePicture:) userInfo:nil repeats:YES];

      There are no restrictions on using my code.
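      The timer’s selector would then just call through to the capture manager; something like this (a sketch — `takePicture:` is the hypothetical selector named in the NSTimer line above):

```objc
// Sketch of the timer callback for periodic capture; takePicture: is the
// hypothetical selector passed to NSTimer above.
- (void)takePicture:(NSTimer *)timer {
  [[self captureManager] captureStillImage];
}
```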

  4. Em says:

    Thanks for the tutorial, it is really great and very helpful:)
    I have a question when I capture the picture how can I also add the UIButton or other objects on the saved image.

    • jjob says:

      I can’t remember how you do that but have done it in the past and remember it being pretty easy. Try Googling for how to take a screenshot in an iOS application.

  5. shinrenpan says:

    Thanks for the tutorial
    but in iPhone4, it has front and back camera
    How can I switch the front and back camera with this code
    sorry my bad English

  6. abc says:

    is there any way to save the captures to a new folder/album?

  7. MT says:

    Hello!
    And thanks again for your tutorial! I have implemented your code in my project, and I want to see the taken image in an image view. Do you have a tip on how to do that? Should I make a new view controller and put the image view in that view? I want to see the image I took.
    My problem is that I don’t want to save the taken image, but use it right away and send it to another method. Is this possible with these classes/objects? I see that blocks are used here, and I wonder if they have to do with my problem of not being able to get the image and show it in a UIImageView?
    Thanks very much!

  8. muckz says:

    HI there,

    thank you for the great tutorial. I tried it in one of my projects and didn’t get the image to be saved to the photoalbum. When I try to I get the following error:

    Error while saving foto Error Domain=ALAssetsLibraryErrorDomain Code=-3304 "Failed to encode image for saved photos." UserInfo=0x5201f0 {NSLocalizedDescription=Failed to encode image for saved photos.

    Perhaps you know how to avoid this?

    Cheers and thanks again for the tutorial.

  9. muckz says:

    Hi,

    yes, with your demo I can save images. I should mention that I assigned [[self captureManager] stillImage] to another UIImage* variable. Perhaps this breaks the JPEG encoding? I changed this and then I was able to save the image. How can I assign the value of stillImage to another instance of UIImage? Doing it “normally” doesn’t work.

    Thanks for your help!

  10. jjob says:

    There is nothing special about the stillImage instance variable. Here is the code that creates it:

    NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
    UIImage *image = [[UIImage alloc] initWithData:imageData];
    [self setStillImage:image];
    [image release];

    It is just a regular UIImage and you should be able to pass it around as you would any other UIImage.

    To test I tried reassigning the stillImage and saving the new UIImage. This worked fine for me:

    UIImage *tempImage = [[self captureManager] stillImage];
    UIImageWriteToSavedPhotosAlbum(tempImage, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);

  11. muckz says:

    Hmm, I don’t know if the compiler had a bad day that day, but now everything works just fine. Perhaps I overlooked a line when going through your code…
    Thank you very much for help!!

  12. Jesse Stay says:

    Thanks so much for this – I wish more people would share their code like this! Do you know if there is a way to capture just the part of the image within the square brackets? For my app I’d like to focus just on that area and do stuff with it. Is this possible in XCode?

    • jjob says:

      Yes, this is possible, though it has nothing to do with Xcode. Xcode is an integrated development environment, or IDE, and a very good one at that, but it is Apple’s APIs (written mostly in C and Objective-C) that allow you to do things like what you are asking, not Xcode. If you look at the previous comments on this post you will find code to capture a specific area of the image, which should help you.

  13. Lawrence says:

    Thanks for this post; it saved my day to find a related topic and tutorial for my project. I have a concern: how can I incorporate this Objective-C code into the Sencha Touch mobile app framework, since I want to wrap it in Objective-C? Is that possible? How can I achieve it?
    Thanks in advance.

  14. Graeme says:

    Hey, thanks for the tutorial, this is the best one I’ve found for what I’m currently trying to accomplish. Just one and a bit questions though, is it possible to add “tap to focus” to this? What about a zoom?
    Again, many thanks :)

  15. Qr says:

    Hey, thanks for amazing tutorials.

    Little questions :)

    How can i save images to applicatons directory?

    • Qr says:


      NSData *jpegData = UIImageJPEGRepresentation([[self captureManager] stillImage], 0.9);
      NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
      NSString *documentsPath = [paths objectAtIndex:0]; //Get the docs directory
      NSString *filePath = [documentsPath stringByAppendingPathComponent:@"image.jpg"]; //Add the file name
      [jpegData writeToFile:filePath atomically:YES];

      Is this best way to do that? :)

      • Melisa says:

        If that’s the correct way, is there some additional code so that the images are named sequentially? Like image001, image002, etc.
        By the way, excellent tutorial! just what I was looking for!

  16. Bong says:

    Thanks you are like shining Light in the darkness….thanks to google also….

    Here’s some code so you can capture the AVCaptureStillImageOutput “with” an overlay image on it, based on Mr. jjob’s code and some CGRect stuff from Google. Remember that you need to
    add overlay.png to your supporting files… I hope Mr. jjob doesn’t mind me using his code…

    - (void)captureStillImage
    {
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in [[self stillImageOutput] connections]) {
    for (AVCaptureInputPort *port in [connection inputPorts]) {
    if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
    videoConnection = connection;
    break;
    }
    }
    if (videoConnection) {
    break;
    }
    }

    NSLog(@"about to request a capture from: %@", [self stillImageOutput]);
    [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:videoConnection
    completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
    CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
    if (exifAttachments) {
    NSLog(@"attachments: %@", exifAttachments);
    } else {
    NSLog(@”no attachments”);
    }
    NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
    UIImage *image = [[UIImage alloc] initWithData:imageData];
    UIImage *OverlayView = [UIImage imageNamed:@"overlay.png"];
    int width = image.size.width;
    int height = image.size.height;
    CGSize size = CGSizeMake(width, height);
    //create the rect zone that we draw from the image
    CGRect imageRect;

    if(image.imageOrientation==UIImageOrientationUp
    || image.imageOrientation==UIImageOrientationDown)
    {
    imageRect = CGRectMake(0, 0, width, height);
    }
    else
    {
    imageRect = CGRectMake(0, 0, height, width);
    }

    UIGraphicsBeginImageContext(size);
    CGContextRef context = UIGraphicsGetCurrentContext();
    //Save current status of graphics context
    CGContextSaveGState(context);

    //Do stupid stuff to draw the image correctly
    CGContextTranslateCTM(context, 0, height);
    CGContextScaleCTM(context, 1.0, -1.0);

    if(image.imageOrientation==UIImageOrientationLeft)
    {
    CGContextRotateCTM(context, M_PI / 2);
    CGContextTranslateCTM(context, 0, -width);
    }
    else if(image.imageOrientation==UIImageOrientationRight)
    {
    CGContextRotateCTM(context, -M_PI / 2);
    CGContextTranslateCTM(context, -height, 0);
    }
    else if(image.imageOrientation==UIImageOrientationUp)
    {

    }
    else if(image.imageOrientation==UIImageOrientationDown)
    {
    CGContextTranslateCTM(context, width, height);
    CGContextRotateCTM(context, M_PI);
    }
    CGContextDrawImage(context, imageRect, image.CGImage);
    //After drawing the image, roll back all transformation by restoring the
    //old context
    CGContextRestoreGState(context);

    //For viewImage
    CGRect OverlayViewRect;

    if(OverlayView.imageOrientation==UIImageOrientationUp
    || OverlayView.imageOrientation==UIImageOrientationDown)
    {
    OverlayViewRect = CGRectMake(0, 0, width, height);
    }
    else
    {
    OverlayViewRect = CGRectMake(0, 0, height, width);
    }
    CGContextSaveGState(context);
    CGContextTranslateCTM(context, 0, height);
    CGContextScaleCTM(context, 1.0, -1.0);

    if(OverlayView.imageOrientation==UIImageOrientationLeft)
    {
    CGContextRotateCTM(context, M_PI / 2);
    CGContextTranslateCTM(context, 0, -width);
    }
    else if(OverlayView.imageOrientation==UIImageOrientationRight)
    {
    CGContextRotateCTM(context, -M_PI / 2);
    CGContextTranslateCTM(context, -height, 0);
    }
    else if(OverlayView.imageOrientation==UIImageOrientationUp)
    {

    }
    else if(OverlayView.imageOrientation==UIImageOrientationDown)
    {
    CGContextTranslateCTM(context, width, height);
    CGContextRotateCTM(context, M_PI);
    }
    CGContextSetAlpha (context,0.5);
    CGContextDrawImage(context, OverlayViewRect, OverlayView.CGImage);
    //After drawing the image, roll back all transformation by restoring the
    //old context
    CGContextRestoreGState(context);

    //DO OTHER EFFECTS HERE
    //get the image from the graphic context
    UIImage *outputImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    [self setStillImage:outputImage];
    [image release];
    [[NSNotificationCenter defaultCenter] postNotificationName:kImageCapturedSuccessfully object:nil];
    }];
    }

    Only Problem is it doesn’t capture Exactly what is shown to AV screen….

    the Base image gets smaller …. still figuring how to fit it with the overlay image…

    Anyone Good with CGRect? help us all….

    Sorry for my wierd wierd English….though

    I am Korean …. I can’t help it… Blame Education of my nation:)

    • chipdet says:

      @Bong

      >the Base image gets smaller …. still figuring how to fit it with the overlay image…

      just add this line into method captureStillImage:

      - (void)captureStillImage
      {
      captureSession.sessionPreset = AVCaptureSessionPresetPhoto;

      }

    • Michael says:

      I copied and pasted this exactly, and I’m getting several errors.

    • Tony says:

      I know your post is old, but did you ever find an answer as to why the image is smaller? The image appears to be shifted up and right a little. Thanks for any help!

      • jjob says:

        I actually wrote a whole post about this at http://www.musicalgeometry.com/?p=1681

        In it I explain about the irregularity you guys were noticing and provide a solution.

        To correctly position the overlay you need to apply some scaling. This is due to the fact that the photos taken with the camera are 480×640 whereas the screen is 320×480 so if you don’t scale the position and size of your overlay it will not be positioned or sized the way it was on your iPhone screen.

    • clive dancey says:

      HI
      This is a great addition to Mr JJobs awesome program , but when i run it it always gives me a full screen of my overlay and i want it to display the overlay image on the background of the photo i take.
      Any help with this…it will be much appreciated
      clive

  17. Clark says:

    Great tutorial.
    I have some questions. For your image:(UIImage *)image didFinishSavingWithError…, when the error != NULL?
    What I am doing is entering the camera, taking the first image and saving it, and then re-enter the camera, taking the second image and saving it, but after I take the second picture, the alert will come out, and the image is not saved.
    Can you help me with this? Thank you~

  18. Icarus says:

    In addressing the front facing camera issue – implementing the following should do the trick. I’m not familiar enough with this project to do the implementation, myself ;). Thanks, jjob. Great work

    AVCaptureSession *session = ;
    [session beginConfiguration];

    [session removeInput:frontFacingCameraDeviceInput];
    [session addInput:backFacingCameraDeviceInput];

    [session commitConfiguration];

    • jjob says:

      @Icarus Thanks for the tip! If anyone tries this out please let us know.

    • jjob says:

      I just tested using the front facing camera successfully by changing my addVideoInput method to:

      - (void)addVideoInput {
        NSArray *devices = [AVCaptureDevice devices];
        AVCaptureDevice *frontCamera = nil;
        AVCaptureDevice *backCamera = nil;

        for (AVCaptureDevice *device in devices) {
          NSLog(@"Device name: %@", [device localizedName]);
          if ([device hasMediaType:AVMediaTypeVideo]) {
            if ([device position] == AVCaptureDevicePositionBack) {
              NSLog(@"Device position : back");
              backCamera = device;
            }
            else {
              NSLog(@"Device position : front");
              frontCamera = device;
            }
          }
        }

        NSError *error = nil;
        AVCaptureDeviceInput *frontFacingCameraDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:frontCamera error:&error];
        if (!error) {
          if ([[self captureSession] canAddInput:frontFacingCameraDeviceInput]) {
            [[self captureSession] addInput:frontFacingCameraDeviceInput];
          }
          else {
            NSLog(@"Couldn't add front facing video input");
          }
        }
      }

      Once things are rolling, switching back and forth between front and back should be accomplished as you described.

    • desmond says:

      Where do you add the above to ?

  19. Lucas says:

    Hi jjob, thanks for posting this great tutorial. You explain things well and should look into teaching at a university if you haven’t already.

    CGRect layerRect = [[[self view] layer] bounds];
    [[[self captureManager] previewLayer] setBounds:layerRect];
    [[[self captureManager] previewLayer] setPosition:CGPointMake(CGRectGetMidX(layerRect),
    CGRectGetMidY(layerRect))];

    I think we should change those three lines so that we can return YES for any interfaceOrientation instead of only UIInterfaceOrientationPortrait.

  20. AB says:

    Thanks for sharing this amazing tutorial! It is really a great way to learn and do an AROverlay.

    I have a related question for which I couldn’t find an answer elsewhere, so I thought this might be a good place to ask. I am trying to draw an overlay, say a touch view, using UIView (instead of using overlay.png), and this drawing by the user generates a CGRect (x, y, width, height) for the drawn rectangle on screen. Now I want to crop the image inside this rectangle, but there seem to be issues with the scale factor on iPad: I get a very small section of the image for my cropped area, and it is highly pixelated. Am I missing something here? Can somebody point me to some useful resources, or even better, some source code to look at? Thanks!

  21. Kedar says:

    It’s been more than 7-8 months since you wrote this post, but since I just used some of the concepts here, I wanted to leave a comment. I used a major portion of this tutorial, with one difference: instead of saving to the iPad album, I needed to make an HTTP request to upload the image to a server. When I did this, I found that the notification sent after capturing the image was being sent multiple times, each corresponding to one of the previous images captured, causing saveImageToPhotoAlbum to be called for each image. I couldn’t find a reason for this. Any idea? Anyway, I solved it by setting AROverlayViewController as captureManager’s delegate and calling saveImageToPhotoAlbum as the delegate method. This was an awesome, simple tutorial… just what I needed! :D

  22. Tim says:

    Hey, great tutorial!

    I have a question for jjob or readers of this blog.

    I have a MPMoviePlayerController, moviePlayer, playing video embedded in part of the screen. I have another view that I’d like to alternate between “playing the same video” as moviePlayer and not showing anything.

    I understand that I can’t have two MPMoviePlayerController objects playing at the same time. But can I somehow “pipe” thumbnails into an imageView, refreshing often, to make it look like a video is playing in my second view?

    Or is there a better solution?

    I was wondering if you have any thoughts on my problem.

  23. Vakul Saini says:

    JJob Sir,

    Can you send me the Xcode project for this code? Actually, I am new to iPhone development, so I could not understand how to capture a still image using AVCaptureSession, but I have completed your previous tutorial (the AROverlayExample project) correctly. Now I want to capture an image from the video.
    Please help me, sir.

    Thanks!
    Email- vkinlove13@gmail.com
    Chandigarh INDIA

  24. Vakul Saini says:

    One more question sir, please dont be angry on me :-)

    I want to add another pic on the AV capture video, and when we click the scan button both pics become one single pic; I mean the second pic is drawn over the first pic.
    For this I used the get-current-context method, but I don’t want to use a UIImagePicker, so please tell me how I can take a snapshot on the scan button’s click.

    Thanks !!!
    your fan :-)

    • jjob says:

      As far as taking a picture using the scan button and without using UIImagePicker, that is what this tutorial and example already show you how to do. :-)

      I think you are also asking how to include another image on top of the picture that you take. I have never done this but I am sure if you search for how to combine two images you should find what you need. I imagine you just need to add the second image to the same context as the picture you are taking before you save it.

  25. Gertjan says:

    Great tutorial, we are actually about to erect a statue of you, thanks!

  26. Itay says:

    This code doesn’t seem to work for me. I have managed to capture pictures from the video stream… but when trying to take a still image at a better resolution using this code, it doesn’t work and just hangs.

  27. Michael says:

    How do I set it so that the picture is taken in “Landscape” mode and not “Portrait”? By default, when the picture is taken, I have to go to the photo album, edit the picture, and turn it around so that it’s in landscape mode.

  28. Pingback: Progress Updates « iOS Application Development and Implementation

  29. bhargavi says:

    Great tutorial! I have one doubt, please clarify: I want to save audio to the photo library using the image picker. Is there any way? I found methods to save images and videos, but not audio.

    Thankyou

  30. Tony says:

    This tutorial was great. I integrated a lot of what was discussed into my app. I was wondering if anyone has saved the overlay image together with the camera image. I saw a post added by Bong… I added a slightly modified version of it, but I get the same offsetting of the overlaid image. Has anyone figured this out?

  31. ANDRE CHAGAS says:

    jason
    I know this post is late but it is a great example of capture.
    I downloaded your code and made the changes I want, but would like your help on one thing …

    I want to capture the picture with the camera image and the overlay image together; how do I do this?

    thank you

    • jjob says:

      I’ve been asked this so many times I decided to write a blog post about it today. :-)

      http://www.musicalgeometry.com/?p=1681

      • Tony says:

        In the article you have the dimensions for the iPhone, which you mention are 480 x 640.

        Is that the same for the iPad, iPad 3?

        Thanks for the help… article has been AWESOME!!!!

        • jjob says:

          I’m not really sure but if you just log the size of the image after you have taken it and have it as a UIImage then you will be able to find out.

          Given a UIImage named ‘photo’ I think this should do the trick:

          CGSize photoSize = [photo size];
          NSLog(@"%g x %g", photoSize.width, photoSize.height);

          • Tony says:

            LOL, yes, I just did that… sorry, dumb question. But I do have another: in your sample, you resize the overlay image to match the camera width and height, correct? There was no download for the images you specify in your sample, but are the images full screen, or can they be any size? If they can be any size, don’t you need to position the image as well?

          • jjob says:

            Just curious, what size was the photo taken with the iPad?

            All the images used in my sample are in the project so you can look at them. They are definitely not full screen images. And yes, you definitely need to adjust both the position and scale. That is what I say in the post and is what I do in the sample app.

            CGFloat xScaleFactor = imageSize.width / 320;
            CGFloat yScaleFactor = imageSize.height / 480;

            [overlay drawInRect:CGRectMake(30 * xScaleFactor, 100 * yScaleFactor, overlaySize.width * xScaleFactor, overlaySize.height * yScaleFactor)];

  32. Tony says:

    After more study of your code, I get the image placement, but the one thing I am unclear about is how the overlay image location is determined. Is it based on the camera resolution, 480 x 640, or the iPhone screen resolution, 320 x 460?

    • Tony says:

      The iPad dimensions were 720 x 1280. And yes, i saw that in the article, sometimes I am a bit quick to ask…. sorry -:).

      • jjob says:

        Thanks for the info! And no worries on the questions. :-)

        • Tony says:

          Weird thing to me is, when I take a picture with the iPad Camera app, the picture is 720 x 960 with the rear camera. But what is returned from your code is 720 x 1280 (rear). Do you have any idea why? Does Apple crop the image down to 960 to fill the screen?

          When I take pictures from my app and view them on the iPad they have a black band on the left and right because they are 720 x 1280, not 720 x 960, which apparently scales up on the iPad screen nicely.

  33. Tony says:

    Yes,that is the project I started to work with, but when I download it (zip) it does not contain the person.png or hat.png files from this post: http://www.musicalgeometry.com/?p=1681

    • Tony says:

      Ok, if I change the sessionPreset as follows:

      self.captureSession.sessionPreset = AVCaptureSessionPresetPhoto;

      in the addVideoInputFrontCamera method, then the returned image size is the same as the default camera, 720 x 960.

      • Tony says:

        Alright jjob, got it all working and I have a much better understanding of the AVCapture classes. Thanks for the tutorial, it was awesome and VERY helpful!!!

        LAST question… have you done anything with orientation? I was thinking I would need to set the AVCaptureConnection’s videoOrientation and then switch my width and height values… does that sound right?

    • jjob says:

      That was just a generic example, I never made an app that had those images.

      For a concrete example just use the code I provide at the bottom of the How To Combine Two Images In iOS post to modify the AROverlayImageCapture project as I instruct. With that modification it will save the overlay image with the photo.

      • Tony says:

        jjob, I am able to rotate the preview, but the saved image is not rotated correctly. I suspect there is more to do than just changing the previewLayer.orientation to match the device orientation. I saw some posts on this, but I could not get them to work.

        Have you done anything with orientation?

        • jjob says:

          Sorry, I haven’t played with rotating the image capture.

          • vikram says:

            Hey jjob, can you please help?

            Can anybody tell me how to start video recording using the capture manager?

            Please help me; there is one other question: how do I capture an image from a local video or a live video using the capture manager?

  34. vikram says:

    Can anybody tell me how to start video recording using the capture manager?

    Please help me; there is one other question: how do I capture an image from a local video or a live video using the capture manager?

  35. flashmob03 says:

    Thanks, your code helped me a lot.

  36. gosalyn says:

    Thanks for the tutorial, it helped me a lot :)

    I have a question about capturing a part of the output as a still image. You said it was answered in an older comment, but I couldn't find it, which is why I'm asking.

    Assuming that the whole screen’s rectangle is (0,0,320,480), I’m displaying the AV layer on (0,70, 320, 320). Then, after we create the output UIImage, I want to capture the rectangle within (0, 70, 320, 320) and get a square image. I mean, I want the same image as what I see while capturing, not any more parts.

    How can I achieve this? Thanks a lot in advance.

  37. konrad says:

    Hi, great tutorial!
    I have one question though: is there a way of capturing a square image, or one of any other specific size? I saw someone ask about that before, but there is no answer showing how to do it. It would be wonderful to get some help on that one!

  38. athi says:

    Thanks for your tutorial, jjob. It's very helpful and I really appreciate it.
    But I want to ask some questions. How do we display the image after we capture it?
    Is it possible to save the image in a specific folder? How would we implement that?

    Thanks,

  39. quitequick says:

    Thanks for the tutorial but… downloaded the code and ran it through Xcode 4 on a 3GS iOS 4.3 and it crashed immediately.

    2012-07-09 11:26:24.304 AROverlayExample[4891:707] Device name: Camera
    2012-07-09 11:26:24.317 AROverlayExample[4891:707] Device position : back
    2012-07-09 11:26:24.322 AROverlayExample[4891:707] Device name: iPhone Microphone
    2012-07-09 11:26:24.327 AROverlayExample[4891:707] -[__NSCFType supportsAVCaptureSessionPreset:]: unrecognized selector sent to instance 0x10b020
    2012-07-09 11:26:24.351 AROverlayExample[4891:707] *** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '-[__NSCFType supportsAVCaptureSessionPreset:]: unrecognized selector sent to instance 0x10b020'
    *** Call stack at first throw:
    (
    0 CoreFoundation 0x3387b64f __exceptionPreprocess + 114
    1 libobjc.A.dylib 0x309f8c5d objc_exception_throw + 24
    2 CoreFoundation 0x3387f1bf -[NSObject(NSObject) doesNotRecognizeSelector:] + 102
    3 CoreFoundation 0x3387e649 ___forwarding___ + 508
    4 CoreFoundation 0x337f5180 _CF_forwarding_prep_0 + 48
    5 AVFoundation 0x3133f2e3 -[AVCaptureSession _canAddInput:failureReason:] + 390
    6 AVFoundation 0x3133aa39 -[AVCaptureSession canAddInput:] + 16
    etc.

  40. Jaimin says:

    I have used your code, but my problem is that when I capture an image it comes out at only 640*480 (or sometimes 480*640), and when I show the captured image in an image view it gets stretched. How can I capture at a high resolution (ex. 2048*1986), and with image details (location and so on) like when we capture an image with the default iPhone camera? When the user captures an image I am giving the capture method some time to capture at high resolution, but it is not working. In "AROverlayViewController.m":

    -(void)btnCapture:(id)sender
    {
      [[self captureManager] captureStillImage];
      [[captureManager captureSession] stopRunning];
      [self performSelector:@selector(captureStillImage) withObject:nil afterDelay:2];
    }

    -(void)captureStillImage
    {
      UIImage *image = [[self captureManager] stillImage];
    }

  41. Jeff says:

    The quality of the pictures it takes doesn’t match up to the quality the iPhone 4 camera has. Is there a way for it to use the same quality as the built in camera?

    • Jeff says:

      Forgot to mention, the code is awesome :p, thank you for what you have provided.

    • jjob says:

      Sorry, I don’t know the answer to that. If you find out it would be great if you could share that info back here. :-)

      • Jeff says:

        Could it be because this code originated on an earlier iOS, and there might be some new documentation for iOS 5+?

      • Jeff says:

        OK, well, I got some things started, but I can't seem to get it to work just right; I can't get the finishing touch. This manages to do the same thing as before. I've seen another app using code like this and making it work. If you look, [captureSession setSessionPreset:AVCaptureSessionPresetHigh]; is the line that actually tells it to use the high-quality camera. However, it works in the other app but not in yours. :p

        There is a warning on one line: Sending 'CaptureSessionManager *const _strong' to parameter of incompatible type 'id'

        For: [videoOutput setSampleBufferDelegate:self queue:captureQueue];

        - (void)addVideoInputFrontCamera:(BOOL)front {

          AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
          NSError *error = nil;
          AVCaptureInput *cameraInput = [[AVCaptureDeviceInput alloc] initWithDevice:camera error:&error];

          // Set the output
          AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];

          // create a queue to run the capture on
          dispatch_queue_t captureQueue = dispatch_queue_create("captureQueue", NULL);

          // setup our delegate
          [videoOutput setSampleBufferDelegate:self queue:captureQueue];

          // configure the pixel format
          videoOutput.videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
            [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA], (id)kCVPixelBufferPixelFormatTypeKey,
            nil];

          [captureSession setSessionPreset:AVCaptureSessionPresetHigh];

          [captureSession addInput:cameraInput];
          [captureSession addOutput:videoOutput];

          // Start the session
          [captureSession startRunning];
        }

        • jjob says:

          I don’t recognize this method from my example. In my example this method looks like:

          - (void)addVideoInputFrontCamera:(BOOL)front {
            NSArray *devices = [AVCaptureDevice devices];
            AVCaptureDevice *frontCamera = nil;
            AVCaptureDevice *backCamera = nil;

            for (AVCaptureDevice *device in devices) {

              NSLog(@"Device name: %@", [device localizedName]);

              if ([device hasMediaType:AVMediaTypeVideo]) {

                if ([device position] == AVCaptureDevicePositionBack) {
                  NSLog(@"Device position : back");
                  backCamera = device;
                }
                else {
                  NSLog(@"Device position : front");
                  frontCamera = device;
                }
              }
            }

            NSError *error = nil;

            if (front) {
              AVCaptureDeviceInput *frontFacingCameraDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:frontCamera error:&error];
              if (!error) {
                if ([[self captureSession] canAddInput:frontFacingCameraDeviceInput]) {
                  [[self captureSession] addInput:frontFacingCameraDeviceInput];
                } else {
                  NSLog(@"Couldn't add front facing video input");
                }
              }
            } else {
              AVCaptureDeviceInput *backFacingCameraDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:backCamera error:&error];
              if (!error) {
                if ([[self captureSession] canAddInput:backFacingCameraDeviceInput]) {
                  [[self captureSession] addInput:backFacingCameraDeviceInput];
                } else {
                  NSLog(@"Couldn't add back facing video input");
                }
              }
            }
          }

          Even if I add [[self captureSession] setSessionPreset:AVCaptureSessionPresetHigh]; I get no errors or warnings.

          Not sure if the captured image is any higher quality or not, but regardless, from your example I don't think your warning has anything to do with the session preset. It is hard to say without seeing all of your code, but the warning is clearly about setting the sampleBufferDelegate.

  42. Jeff says:

    change

    - (void)addVideoPreviewLayer {
    [self setPreviewLayer:[[AVCaptureVideoPreviewLayer alloc] initWithSession:[self captureSession]] ];
    [[self previewLayer] setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    }

    to

    - (void)addVideoPreviewLayer {
    [self setPreviewLayer:[[AVCaptureVideoPreviewLayer alloc] initWithSession:[self captureSession]] ];
    [[self previewLayer] setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    [captureSession setSessionPreset:AVCaptureSessionPresetPhoto];
    }

    and you will get the highest quality photo. : )

  43. Jeff says:

    Hey I’m back : ). I added functionality to switch from front to back camera:

    - (void) rotatePressed{
    NSUserDefaults *prefs = [NSUserDefaults standardUserDefaults];
    if([rotateSetting isEqualToString:@"BACK"]){
    [prefs setObject:@"FRONT" forKey:@"rotateSetting"];
    }else{
    [prefs setObject:@"BACK" forKey:@"rotateSetting"];
    }
    [self viewDidLoad];
    }

    Simple setup, make a button for rotating, set it to this function… then save into the NSDictionary.

    And it works! Almost. What ends up happening is that the capture session is still loading from the previous run. So when you take pictures it takes multiples for however many times you rotate.

    I tried adding:
    if([[captureManager captureSession] isRunning]){
    [[captureManager captureSession] stopRunning];
    }

    at the top of viewDidLoad, but that doesn't do the trick. I'm probably missing something simple; I would appreciate the assistance. : ) Hey, if you get this working with all the stuff I've added you could probably do another version. :p

  44. MV says:

    Thanks very much – came in handy.

  46. Romy Ilano says:

    Wow! Thank you so much.
    As a beginner I was flailing around the internet searching for the code… this is great!

    I can only hope I get good enough to write such helpful tutorials in the future!

    I’m definitely checking out your music as well. Nice stuff!

  47. ATOzTOA says:

    Great tutorial, really helped me… thanks

  48. John says:

    Is it possible to capture a frame image from a recorded or streaming video (mp4, h264, etc) instead of from the live camera?

  49. Frank S says:

    This is an amazingly useful example – thank you.

    In the AR Overlay example, I'm seeing the image preview layer rotate when you rotate the phone (iPhone 5 running iOS 5).
    What is going on, and how do I stop that from happening?

    Thanks

  50. Manuel Payares says:

    This tutorial is really easy to follow. Thank you for taking the time to sit down and explain the process. I have a question: would this be compatible with iOS 6? If so, how could I make this app run? Currently it is crashing; it does not get to the camera when I run the application. I have sent it to my iPad to test.

  51. Michael says:

    Your code is running smoothly out in the field :)
    However, occasionally one or the other app crashes in
    [CaptureSessionManager captureStillImage]
    in the line with

    [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:videoConnection
    completionHandler:completionHandler];

    with

    [AVCaptureStillImageOutput captureStillImageAsynchronouslyFromConnection:completionHandler:] - inactive/invalid connection passed.

    Interestingly, all of the devices where the app is crashing are 3rd and 4th generation iPods and iPad 1s.

    Any ideas how to deal with that? Simply checking for
    videoConnection != nil avoids the crash but doesn’t solve the problem…

    • jjob says:

      Right, it is telling you that the videoConnection isn’t valid. You should look at where that videoConnection gets created. Look at what happens on a device that it does work on and then look on a device that it doesn’t work on. Hopefully you will be able to see where the creation of the videoConnection is failing and be able to find a solution.

  52. Khoa Pham says:

    Hi jjob, thanks for the good tutorial.
    But in CaptureSessionManager.m, the line
    AVCaptureDeviceInput *frontFacingCameraDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:frontCamera error:&error];

    can cause a crash if frontCamera is nil.

  53. karthi says:

    Why does landscape modes result in upside down still images?

  54. Leo says:

    Hi,

    When I try to take photo with this code, the orientation of the image is wrong.
    In fact, I need to take the photo in landscape mode (the device is in landscape but the camera lens should take it like in portrait) and I handle the orientation notification for that:
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(notificationHandle:)
                                                 name:UIDeviceOrientationDidChangeNotification
                                               object:nil];

    - (void)notificationHandle:(NSNotification *)notification {
      NSString *notificationName = [notification name];

      if ([notificationName isEqualToString:UIDeviceOrientationDidChangeNotification]) {
        UIDeviceOrientation deviceOrientation = (UIDeviceOrientation)[[UIDevice currentDevice] orientation];
        [self.captureManager setOrientationOfPreviewLayer:deviceOrientation];
      }
    }

    - (void)setOrientationOfPreviewLayer:(UIDeviceOrientation)orientation {
      AVCaptureVideoOrientation newOrientation;
      switch (orientation) {
        case UIDeviceOrientationPortrait:
          newOrientation = AVCaptureVideoOrientationPortrait;
          break;
        case UIDeviceOrientationPortraitUpsideDown:
          newOrientation = AVCaptureVideoOrientationPortraitUpsideDown;
          break;
        case UIDeviceOrientationLandscapeLeft:
          newOrientation = AVCaptureVideoOrientationLandscapeRight;
          break;
        case UIDeviceOrientationLandscapeRight:
          newOrientation = AVCaptureVideoOrientationLandscapeLeft;
          break;
        default:
          newOrientation = AVCaptureVideoOrientationPortrait;
      }

      if ([device isDeviceAnIPad]) {
        if (newOrientation != AVCaptureVideoOrientationLandscapeRight && newOrientation != AVCaptureVideoOrientationLandscapeLeft) {
          return;
        }
      }
      else {
        if (newOrientation == AVCaptureVideoOrientationLandscapeRight || newOrientation == AVCaptureVideoOrientationLandscapeLeft) {
          return;
        }
      }

      // use the mapped value here, not the raw device orientation
      [self.previewLayer.connection setVideoOrientation:newOrientation];
    }

    And I save the image to a local path; after that, I try to display it in a UIImageView, but the orientation of the displayed image is wrong.

    Sorry for my weird English.

    Thanks.

  55. naman says:

    First, I'll tell you it's a great tutorial. When I use your code it works fine, but there's one problem: when I push this view again it generates a memory warning and the application crashes. I don't understand what is happening.
