Face Detection

Face detection means detecting someone's face using your phone's camera. We will use Core Image, iOS's native framework and a very powerful API built into Cocoa Touch. In this tutorial, we'll use Core Image's face detection features and learn how to use them in our own iOS apps.

Note: As you all know, Xcode 8 and Swift 3 are officially out. Swift 3 brings major changes. We will update our tutorials to Swift 3, and all further Swift tutorials will also be in Swift 3.

Initiating the Project

We are using Xcode 8, where the UI has changed a little; you will notice new options like iMessage and Sticker Pack applications. We will start with a Single View Application. Name your app something like "Face Detection Demo".


You will notice a small change in the UI. Name the project and save it anywhere on your machine.

Start Coding

Drag and drop a UIImageView and a UIButton onto your storyboard view controller, and create outlets for both.

Add the CoreImage framework to your project and import it in your ViewController.swift file.
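A minimal sketch of the view controller skeleton at this point might look like the following. The outlet names `imageView` and `detectButton` are assumptions; use whatever names you gave them when connecting the storyboard.

```swift
import UIKit
import CoreImage

class ViewController: UIViewController {

    // Outlet names are assumptions; match them to your storyboard connections.
    @IBOutlet weak var imageView: UIImageView!
    @IBOutlet weak var detectButton: UIButton!

    override func viewDidLoad() {
        super.viewDidLoad()
    }
}
```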

Pick an image from the photo gallery; you can check out our tutorial for the details here. There is one more thing to do before accessing images: updating your Info.plist to request photo access. As you know, Apple is highly concerned about the privacy of its users, and iOS 10 adds some new privacy keys. You can update your plist as shown below.
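As a sketch of both steps: the Info.plist entry is the standard `NSPhotoLibraryUsageDescription` key (required on iOS 10 for photo-library access), and the picker code below uses the Swift 3 `UIImagePickerController` API. The action name `pickImageTapped` and the usage string are illustrative assumptions.

```swift
// Info.plist (iOS 10+): add the photo-library privacy key, e.g.
//   Key:   NSPhotoLibraryUsageDescription
//   Value: This app needs access to your photos to detect faces.

extension ViewController: UIImagePickerControllerDelegate, UINavigationControllerDelegate {

    // Hypothetical action name; connect it to your UIButton.
    @IBAction func pickImageTapped(_ sender: UIButton) {
        let picker = UIImagePickerController()
        picker.sourceType = .photoLibrary
        picker.delegate = self
        present(picker, animated: true, completion: nil)
    }

    // Swift 3 delegate signature.
    func imagePickerController(_ picker: UIImagePickerController,
                               didFinishPickingMediaWithInfo info: [String : Any]) {
        if let picked = info[UIImagePickerControllerOriginalImage] as? UIImage {
            imageView.image = picked
        }
        dismiss(animated: true, completion: nil)
    }
}
```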

Getting the Image Processed

After picking an image from the gallery, convert it to a CIImage so Core Image can process it. We can set the accuracy and type for the detector.
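A sketch of that conversion and detector setup, assuming the `imageView` outlet from earlier; `detectFaces` is an illustrative function name:

```swift
func detectFaces() {
    // Convert the picked UIImage into a CIImage for Core Image.
    guard let uiImage = imageView.image,
          let ciImage = CIImage(image: uiImage) else { return }

    // Configure a face detector with high accuracy.
    let options: [String: Any] = [CIDetectorAccuracy: CIDetectorAccuracyHigh]
    let detector = CIDetector(ofType: CIDetectorTypeFace,
                              context: nil,
                              options: options)

    // foundFaces holds one CIFeature per detected face.
    let foundFaces = detector?.features(in: ciImage) ?? []
    print("Found \(foundFaces.count) face(s)")
}
```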

The foundFaces array holds the detected faces. You can iterate through every face found, like so:

There are some predefined properties, like those above, which you can use directly.
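For example, `CIFaceFeature` exposes predefined properties such as eye and mouth positions. A sketch of iterating over the results from the detector above:

```swift
for case let face as CIFaceFeature in foundFaces {
    print("Found face at \(face.bounds)")

    if face.hasLeftEyePosition {
        print("Left eye at \(face.leftEyePosition)")
    }
    if face.hasRightEyePosition {
        print("Right eye at \(face.rightEyePosition)")
    }
    if face.hasMouthPosition {
        print("Mouth at \(face.mouthPosition)")
    }
    // Note: hasSmile is only populated if you pass
    // [CIDetectorSmile: true] to features(in:options:).
}
```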


Highlighting Detected Faces

We can draw a rectangle around each detected face:
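A naive first attempt, which adds a bordered UIView directly at each face's bounds (this is the version that will misplace the indicator, as explained next):

```swift
for case let face as CIFaceFeature in foundFaces {
    // Naive: use the Core Image bounds directly as a UIKit frame.
    let indicator = UIView(frame: face.bounds)
    indicator.layer.borderWidth = 2
    indicator.layer.borderColor = UIColor.red.cgColor
    indicator.backgroundColor = .clear
    imageView.addSubview(indicator)
}
```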


You will observe that the indicator we made covers the face but drifts to the wrong side. That's because Core Image and UIView use two different coordinate systems: Core Image's origin is at the bottom-left, while UIView's is at the top-left. We need to convert the CIImage coordinates into UIView coordinates.

Now we will calculate new bounds for our indicator.

Apply the newBounds to the indicator.
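The conversion and the corrected indicator can be sketched as follows. This assumes the image view uses an aspect-fit content mode; the flip transform moves the rect into UIKit's top-left coordinate space, and the scale/offset maps image coordinates onto the view.

```swift
// Core Image's origin is bottom-left; UIKit's is top-left.
// Flip the y-axis, then scale from image space to view space.
let ciImageSize = ciImage.extent.size
var transform = CGAffineTransform(scaleX: 1, y: -1)
transform = transform.translatedBy(x: 0, y: -ciImageSize.height)

for case let face as CIFaceFeature in foundFaces {
    // Flip the face rect into UIKit coordinates.
    var newBounds = face.bounds.applying(transform)

    // Scale and center to match an aspect-fit image view (assumption).
    let viewSize = imageView.bounds.size
    let scale = min(viewSize.width / ciImageSize.width,
                    viewSize.height / ciImageSize.height)
    let offsetX = (viewSize.width - ciImageSize.width * scale) / 2
    let offsetY = (viewSize.height - ciImageSize.height * scale) / 2

    newBounds = newBounds.applying(CGAffineTransform(scaleX: scale, y: scale))
    newBounds.origin.x += offsetX
    newBounds.origin.y += offsetY

    // Apply newBounds to the indicator.
    let indicator = UIView(frame: newBounds)
    indicator.layer.borderWidth = 2
    indicator.layer.borderColor = UIColor.red.cgColor
    indicator.backgroundColor = .clear
    imageView.addSubview(indicator)
}
```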

Now pick image again and see the difference.


You can also use the camera to capture an image and detect faces. Open the camera instead of the gallery with the following change in your code; the rest works the same.
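The only change is the picker's source type (plus the camera privacy key on iOS 10). The action name here is again an illustrative assumption:

```swift
// Info.plist (iOS 10+): also add NSCameraUsageDescription
// with a short explanation of why the app needs the camera.

@IBAction func openCameraTapped(_ sender: UIButton) {
    // Camera is unavailable on the simulator, so check first.
    guard UIImagePickerController.isSourceTypeAvailable(.camera) else { return }

    let picker = UIImagePickerController()
    picker.sourceType = .camera   // was .photoLibrary
    picker.delegate = self
    present(picker, animated: true, completion: nil)
}
```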

So we learned a little about face detection. You can modify it according to your requirements. Clone or download the complete project from here. Happy coding 🙂
