While the Camera app in iOS 8 is only getting a few new features, the Camera application programming interfaces (APIs) — what developers use to make App Store camera apps — are getting the most significant update in the history of the platform, including and especially manual controls for focus, exposure, and white balance. Not much will change for casual photographers, but for pros and enthusiasts, the best camera we have with us will be getting a whole lot better. So, how does it all work?
Automatic vs. manual
Nokia offers great lenses with optical image stabilization (OIS); they want to capture the best light possible right from the start. Google makes everything awesome on the servers; since they never know which device or what quality of camera the data will come from, they concentrate on finishing strong. Apple, however, has focused on building the best custom image signal processors (ISPs) in the business. They control not only the software but the hardware down to the chip, so they can optimize each part to get the best whole.
That's why, with the tap of a finger, the iPhone locks on the most obvious subject in the frame, exposes for the best balance of light and shadow, makes sure white is as close to white as technologically possible, and produces an image that, 9 times out of 10, looks as good as, if not better than, what phones with much better optics or server farms can provide.
But what about that 10th time? What about when the object you want to focus on isn't the most obvious? When you want to force a scene brighter or darker for artistic or practical effect? When you want to set a custom white balance?
Just like an automatic transmission is the quickest and most reliable way for most people to drive a car most of the time, automatic cameras are the quickest and most reliable way for most people to capture the memories that will matter to them most of the time. For the pros, however, for the artists and experimenters, for those who want to control every aspect themselves — nothing beats full-on manual, not on the road and not on the shoot.
And that's what Apple's providing with iOS 8. The built-in Camera app is getting time-lapse photography and a sun icon you can swipe to change exposure, but developers are getting more. They're getting complete manual control over focus, exposure, and white balance.
Manual focus
Focus means making sure the most important thing in your photo, whether it's as close as a flower petal or as far away as a sunset, is crisp and as sharply captured as possible. Apple has done a lot to make focus "just work" on the iPhone. There's auto-focus, tap to focus, and multiple face detection. All of that is designed so that the camera will sharply capture what it believes are the most important elements in the scene.
Manual focus control is for when you want to determine for yourself what, if anything, should be sharply captured. Maybe you want the entire photo to be blurry and dreamy, maybe you want to stack focus to avoid any blur, maybe you want to pull focus on a moving subject, or maybe you want to change focus over time.
The manual camera controls let you do just that. Instead of tap to focus, you can do something functionally akin to turning the focusing ring on a traditional camera lens. Hold your iPhone up and frame two objects, one super close, one some distance away. Manually change the focus and watch as the one in front goes from sharp to blurry, and the one in back goes from blurry to sharp.
Manual focus in iOS works on a scale of 0.0 to 1.0, with macro on one end and "infinity" on the other. Developers can lock focus at any lens position to achieve focus along any point of that scale. Apple chose to use a scale rather than distance measures because of the way focus is implemented on the iPhone and iPad.
To change focus, the lens is physically moved by a spring and a magnet. That means there's bounce when it moves, there's stretch that depends on its angle relative to gravity, and its behavior changes over time as the spring gets used more and more. So, telling the lens to move to a certain position can and will produce different results at different times. Telling it you want a certain point on the scale, however, will move the lens to whatever position is required to achieve it, regardless of what that position might be at any given time.
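For developers, locking the lens position looks something like this rough sketch. Modern Swift naming is shown for readability (the shipping iOS 8 API was Objective-C), and the helper function is mine, not Apple's:

```swift
import AVFoundation

// A rough sketch: lock focus at a point on the 0.0 (macro) to 1.0
// ("infinity") scale. Assumes `device` is the active camera.
func lockFocus(of device: AVCaptureDevice, at lensPosition: Float) {
    guard device.isFocusModeSupported(.locked) else { return }
    do {
        try device.lockForConfiguration()
        // The completion handler fires with the timestamp of the first
        // frame captured after the lens settles at the new position.
        device.setFocusModeLocked(lensPosition: lensPosition) { syncTime in
            print("Lens settled at \(lensPosition) as of \(syncTime.seconds)s")
        }
        device.unlockForConfiguration()
    } catch {
        print("Could not configure device: \(error)")
    }
}
```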
Because even a Retina screen isn't as high resolution as a photo capture — currently 1136x640 vs. 3264x2448 on an iPhone 5s with its 8-megapixel iSight camera — the preview image has to be scaled down. That can make manual focus more challenging. To help compensate, Apple is providing ways for developers to show zoomed-in previews, to compute their own focus scores, and to highlight sharp areas (focus peaking).
In other words, many of those fancy focusing tools you had on your DSLR are finding their way to your iPhone.
Manual exposure
To determine how bright or dark your image is, you "expose" your camera's sensor to greater and longer, or smaller and shorter, amounts of light. Normally, in automatic mode, the camera constantly calculates the best exposure for any given scene so you get the best exposed photo of that scene. Sometimes, however, you might want an image that's surrealistically bright or sullenly dark, an image with minimal motion blur or with a lot of it, an image with as little noise as possible, or one as bright as possible regardless of the noise that generates. Enter manual exposure.
Exposure is determined by shutter speed, ISO (light sensitivity), and lens aperture.
Shutter speed is the duration of exposure. The faster the shutter closes, the shorter the amount of time the sensor is exposed to light. That means the image will be darker but will also have less motion blur (because things won't have had much time to move). The slower the shutter closes, the longer the amount of time the sensor is exposed to light. That means the image will be brighter but will have more motion blur (because things will have had time to move more).
Generally you want shorter exposures/faster shutter speed for well-lit action shots, and longer exposures/slower shutter speed for low-light stills.
ISO (a film speed standard named for the International Organization for Standardization) originally measured how sensitive film stock was to light. Now it measures how sensitive a digital camera's sensor is to light. Low ISO is less sensitive to light, which makes for darker images but less noise. High ISO is more sensitive to light, which makes for brighter images but with more noise (the result of spikes that occur when amplifying the signal off the camera's CMOS chip).
Aperture is the size of the lens opening. If shutter speed is how long you sip from a straw, aperture is how big the straw is. The bigger the aperture, the more light you can take in while the shutter is open. To date, however, Apple has only shipped fixed-aperture cameras on the iPhone, iPod touch, and iPad. So, manual exposure controls are limited to shutter speed and ISO.
Automatic exposure on iOS tries to ensure a properly exposed image by dynamically changing shutter speed (duration of exposure) and ISO (light sensitivity) based on a constant stream of metering stats it receives from the scene being photographed.
Manual exposure lets you control all that yourself. You could, for example, choose to minimize noise in a low-light setting by cranking down the ISO and, if you're stable enough, cranking up the duration. That would give you a better lit, far less noisy image.
Developers can set duration and ISO together or can lock one and only let the other be set. iOS will continue to feed them the metering stats, and supply an offset value that they can use if they want to, but duration and ISO will no longer be bound to it.
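Here's a rough sketch of setting both at once, clamped to the device's supported ranges. Modern Swift naming is shown; the helper function and values are illustrative, not Apple's:

```swift
import AVFoundation

// A rough sketch: set shutter duration and ISO together, clamped to the
// active format's supported ranges.
func setCustomExposure(on device: AVCaptureDevice,
                       durationSeconds: Double, iso: Float) {
    let format = device.activeFormat
    let requested = CMTime(seconds: durationSeconds, preferredTimescale: 1_000_000)
    let duration = min(max(requested, format.minExposureDuration),
                       format.maxExposureDuration)
    let iso = min(max(iso, format.minISO), format.maxISO)
    do {
        try device.lockForConfiguration()
        // To adjust only one value, pass AVCaptureDevice.currentExposureDuration
        // or AVCaptureDevice.currentISO for the other.
        device.setExposureModeCustom(duration: duration, iso: iso) { _ in }
        device.unlockForConfiguration()
    } catch {
        print("Could not configure device: \(error)")
    }
}
```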
Exposure compensation
Sometimes you may want slightly more control than automatic exposure allows, but without the complexity of manual exposure controls. Instead of manipulating duration and ISO, you just want to make an image a little brighter or a little less bright. That's where exposure compensation, also known as exposure target bias, comes in.
With exposure compensation, Apple's automatic exposure algorithms still handle all the heavy lifting, but you get to bias them one way or another to get closer to the look you want. And it works in both continuous and locked modes. So, you can bias the exposure to make a scene brighter, move the camera, and exposure will keep adjusting to maintain that level of enhanced brightness. Or, you can lock exposure based on a particular scene and nothing will change unless you bias the exposure from there.
Exposure compensation is expressed in f-stops. +1 f-stop doubles the brightness, -1 f-stop halves the brightness.
Developers can currently set exposure target biases between -8 and +8 on all existing iOS devices. However, Apple warns that that range could change in the future.
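In code, it's about as simple as manual controls get. A rough sketch (modern Swift naming; the helper name is mine):

```swift
import AVFoundation

// A rough sketch: bias auto exposure brighter or darker in f-stops,
// clamped to the device's advertised range (currently -8 to +8).
func biasExposure(on device: AVCaptureDevice, by stops: Float) {
    let bias = min(max(stops, device.minExposureTargetBias),
                   device.maxExposureTargetBias)
    do {
        try device.lockForConfiguration()
        device.setExposureTargetBias(bias) { _ in }
        device.unlockForConfiguration()
    } catch {
        print("Could not configure device: \(error)")
    }
}
```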
Exposure compensation is also the basis for the new adjustable exposure in the iOS 8 Camera app. Tap to focus, get the sun icon, swipe it up to bias exposure and make the scene brighter, or swipe it down to bias exposure and make the scene darker.
Manual white balance
White balance is just what the name implies — making sure the whites (and grays) in your image are as close to white (and gray) as possible. Too cool, and everything looks bluish. Too warm, and everything looks yellowish. In other words, white balance is all about making the colors in your image look as realistic as possible. Why is that hard? Because different light sources give off cooler or warmer light. Incandescent lights are warm and yellow. Daylight is cooler and more blue.
Cameras need to adjust for light temperature by boosting other colors to compensate. For example, if the color temperature is tinting a scene blue, the camera software has to boost red and a little green. Under mixed lighting conditions, like a blue-tinted computer display and a yellow-tinted desk lamp, compensating becomes more complicated. (See Planckian locus if you're interested in how it works.)
The iOS Camera app handles all this automatically. Traditional cameras often offer automatic white balance as well, along with presets optimized for sunlight, cloudy outdoor light, shadowy conditions, incandescent bulbs, fluorescent lights, and flash photography, along with the ability to set custom white balances.
All of that, and more, is what manual white balance allows.
With iOS 8, Apple is giving developers full control of device red/green/blue (RGB) gains. That includes temperature casts between yellow and blue, and tints between green and magenta. Apple is also providing conversion routines to and from device-independent color spaces. That means developers can convert device-specific values to and from x,y chromaticity values or temperature and tint values. That's important because cameras, and the RGB gains coming off of them, vary from device to device, but apps have to work across all devices.
Developers set red, green, and blue gains all at once in a new struct. Currently, the maximum white balance gain a developer can set on any iOS device is 4, but Apple again cautions this might change in the future. x,y chromaticity and temperature/tint are also set in new structs. Chromaticity can range from 0 to 1. Temperature is a floating point value in Kelvin, and tint is a green/magenta offset from -150 to 150. The conversion routines don't take into account whether their results are legal color values (i.e. whether they can be seen by humans), so developers need to check for out-of-range values.
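Here's a rough sketch of going from temperature and tint to device gains and locking white balance there. Modern Swift naming is shown; the helper is illustrative:

```swift
import AVFoundation

// A rough sketch: convert a temperature/tint pair to device-specific RGB
// gains, clamp them to legal values, and lock white balance there.
func lockWhiteBalance(on device: AVCaptureDevice,
                      temperature: Float, tint: Float) {
    let values = AVCaptureDevice.WhiteBalanceTemperatureAndTintValues(
        temperature: temperature, tint: tint)
    var gains = device.deviceWhiteBalanceGains(for: values)
    // Conversion routines can return out-of-range results, so clamp each
    // gain to 1.0...maxWhiteBalanceGain (currently 4 on shipping devices).
    gains.redGain   = min(max(gains.redGain, 1), device.maxWhiteBalanceGain)
    gains.greenGain = min(max(gains.greenGain, 1), device.maxWhiteBalanceGain)
    gains.blueGain  = min(max(gains.blueGain, 1), device.maxWhiteBalanceGain)
    do {
        try device.lockForConfiguration()
        device.setWhiteBalanceModeLocked(with: gains) { _ in }
        device.unlockForConfiguration()
    } catch {
        print("Could not configure device: \(error)")
    }
}
```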
Custom white balances using gray cards are now possible as well. A longtime tool of traditional photographers, a gray card can be invaluable in setting the proper white balance for a scene with mixed or otherwise tricky lighting.
Gray cards are literally cards colored neutral gray; you hold one up so it fills the center 50 percent of the frame. That way the auto white balance can lock onto a known neutral gray value and ignore any colors or reflections that might otherwise bias or misinform it.
For example, if you wanted to take a picture of someone all dressed up in yellow sitting on top of a pile of bananas, it's possible the automatic white balance might mistake sunlight reflecting off all that yellow for incandescent light. So, it might boost the blues to compensate resulting in an image that looks sickly and wrong. Stick a gray card in there, however, lock white balance on the card, and the auto white balance will work to make that gray look gray regardless of any other colors or casts in the frame. You get great looking yellows without them messing up all the other colors in the photo.
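AVFoundation exposes that as a "gray world" property on the capture device. A rough sketch (modern Swift naming; the helper name is mine):

```swift
import AVFoundation

// A rough sketch: lock white balance using the "gray world" gains the
// device computes, assuming a neutral gray card fills the center of the
// frame while this runs.
func lockWhiteBalanceToGrayCard(on device: AVCaptureDevice) {
    var gains = device.grayWorldDeviceWhiteBalanceGains
    gains.redGain   = min(max(gains.redGain, 1), device.maxWhiteBalanceGain)
    gains.greenGain = min(max(gains.greenGain, 1), device.maxWhiteBalanceGain)
    gains.blueGain  = min(max(gains.blueGain, 1), device.maxWhiteBalanceGain)
    do {
        try device.lockForConfiguration()
        device.setWhiteBalanceModeLocked(with: gains) { _ in }
        device.unlockForConfiguration()
    } catch {
        print("Could not configure device: \(error)")
    }
}
```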
Bracketed capture
Bracketed capture allows bursts of images to be taken, with the option of changing camera settings from image to image.
Burst mode on the iPhone 5s is an example of a simple bracket where nothing changes between shots, but you make sure you capture all the action from a flip, a finish line, or even a baby with eyes wide open.
High dynamic range (HDR) is the classic example of a bracket with changes. Take photos with the exposure biased to -2, 0, and +2, then fuse them together to pull out detail in both light and shadow.
Combine bracketed capture with the new manual camera controls and developers can make apps that do both of those things, with the potential to do much, much more.
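Here's a rough sketch of the HDR case using the iOS 8-era AVCaptureStillImageOutput bracket API (since superseded by newer capture classes). Modern Swift naming is shown; the function is illustrative:

```swift
import AVFoundation

// A rough sketch: a three-shot auto-exposure bracket biased -2, 0, and +2
// stops, the classic HDR recipe.
func captureHDRBracket(from output: AVCaptureStillImageOutput) {
    guard let connection = output.connection(with: .video) else { return }
    let biases: [Float] = [-2, 0, 2]
    let settings = biases.map {
        AVCaptureAutoExposureBracketedStillImageSettings
            .autoExposureSettings(exposureTargetBias: $0)
    }
    output.captureStillImageBracketAsynchronously(
        from: connection, withSettingsArray: settings
    ) { sampleBuffer, bracketSettings, error in
        // Called once per frame in the bracket; fuse the resulting buffers
        // to pull detail out of both highlights and shadows.
        guard sampleBuffer != nil, error == nil else { return }
    }
}
```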
Bar codes, permission requesters, H.264 encoder, and PhotoKit
In addition to the manual controls, a few other features are coming to the iOS 8 camera and audio/video foundation as well.
The camera is gaining support for three new types of bar codes: Data Matrix, Interleaved 2 of 5, and ITF-14. There's also global support for camera and microphone permission requesters.
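Here's a rough sketch of both in one place: asking for camera access, then opting a metadata output into the new symbologies. Modern Swift naming; the helper is illustrative and assumes the session already has a camera input:

```swift
import AVFoundation

// A rough sketch: request camera permission, then configure a metadata
// output for the three new bar code types.
func enableNewBarcodes(on session: AVCaptureSession) {
    AVCaptureDevice.requestAccess(for: .video) { granted in
        guard granted else { return }
        let output = AVCaptureMetadataOutput()
        guard session.canAddOutput(output) else { return }
        session.addOutput(output)
        // The output must be attached to a session before the new types
        // appear in availableMetadataObjectTypes.
        output.metadataObjectTypes = [.dataMatrix, .interleaved2of5, .itf14]
    }
}
```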
Developers will also be getting direct access to the hardware H.264 video encoder for real-time capture. Yeah, we're in for some fun.
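That access comes by way of VideoToolbox. As a rough, hedged sketch, creating a hardware H.264 session and flagging it for real-time use looks something like this (modern Swift naming; frame submission and the output callback are omitted):

```swift
import VideoToolbox

// A rough sketch: create a hardware H.264 compression session and mark it
// for real-time encoding.
var session: VTCompressionSession?
let status = VTCompressionSessionCreate(
    allocator: kCFAllocatorDefault,
    width: 1920, height: 1080,
    codecType: kCMVideoCodecType_H264,
    encoderSpecification: nil,
    imageBufferAttributes: nil,
    compressedDataAllocator: nil,
    outputCallback: nil, // supply a VTCompressionOutputCallback to get frames
    refcon: nil,
    compressionSessionOut: &session)
if status == noErr, let session = session {
    VTSessionSetProperty(session,
                         key: kVTCompressionPropertyKey_RealTime,
                         value: kCFBooleanTrue)
}
```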
Then there's the new Photos app and PhotoKit, which tie into the new iCloud Photo Library and give developers faster performance, read and write access to the library, non-destructive edits, and the ability to delete photos (with permission). And there are photo extensions, which bring App Store filters and transformations into the main Photos app.
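As a tiny taste, here's a hedged sketch of that new delete capability through PhotoKit. The function name is mine, and PhotoKit puts a confirmation dialog in front of the user before anything is actually removed:

```swift
import Photos

// A rough sketch: fetch the newest image in the library and ask to delete it.
func deleteNewestPhoto() {
    let options = PHFetchOptions()
    options.sortDescriptors = [NSSortDescriptor(key: "creationDate",
                                                ascending: false)]
    guard let asset = PHAsset.fetchAssets(with: .image,
                                          options: options).firstObject
    else { return }
    PHPhotoLibrary.shared().performChanges({
        // The user sees a system confirmation before the delete happens.
        PHAssetChangeRequest.deleteAssets([asset] as NSArray)
    }) { success, error in
        print(success ? "Deleted." : "Failed: \(String(describing: error))")
    }
}
```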
In other words, there's a lot. A lot, a lot.
Bottom line
To say this is a major release for photography and photographers would be to greatly undersell it. With iOS 8, Apple is taking the best automatic camera on a smartphone and making a run at the title of best manual camera on a smartphone as well. That Apple isn't including all the new controls in its own Camera app but is leaving them for developers to implement might even allow for the best of both worlds.
Casual photographers can stay comfortable in the largely automatic, easy and simple to use confines of the Camera app, and developers can make App Store apps that offer those full-on manual controls. They can appeal to those pros, those artists, and those experimenters.