I Am More and More Intrigued by OS X Yosemite and Advances for Photographers

I have been absorbing more of the presentations from Apple WWDC 2014, and I'm getting more excited about the possibilities opened up with advances in the imaging subsystems of both iOS 8 and OS X Yosemite.

This isn't about Aperture vs. iPhoto vs. Photos vs. Lightroom. This is about the under-the-hood imaging goodies Apple has been working on: features that all applications can exploit. Let's face it, using multiple applications for cataloging and processing your photos is common. As vendors start to hook into these new APIs, it will greatly benefit photographers. If done correctly.

Apps can modify the RAW data stream before rendering

Advances in Core Image

For many years, Apple has had native RAW support in OS X; any application on the OS can render RAW images. At about 36 minutes into this WWDC presentation, one slide really caught my eye: the CIRAWFilter, a subclass of CIFilter.

This shows how 3rd parties can develop custom filters that snap into the existing RAW processing flow, before the image is rendered. Now, imagine 3rd party plug-ins as their own adjustment brick or pane in your native photo editor (be it Aperture or iPhoto or Photos). World-class adjustments from the likes of onOne, Google/Nik, etc. working on the RAW data stream non-destructively. I'll get even wilder and envision the user specifying the order in which custom 3rd party filters are applied, simply by dragging and dropping adjustment bricks or panes into different orders.
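To make the idea concrete, here's a minimal sketch of how a custom CIFilter can be injected into the RAW pipeline. The file path and exposure adjustment are placeholders; the RAW filter creation and the `kCIInputLinearSpaceFilter` hook are the pieces the session describes.

```swift
import CoreImage

// Hypothetical path to a RAW file on disk.
let rawURL = URL(fileURLWithPath: "/path/to/photo.CR2")

// The RAW filter is created directly from the file.
if let rawFilter = CIFilter(imageURL: rawURL, options: nil) {
    // A third-party adjustment, expressed as an ordinary CIFilter.
    // Here: a simple half-stop exposure push, purely for illustration.
    let custom = CIFilter(name: "CIExposureAdjust")
    custom?.setValue(0.5, forKey: kCIInputEVKey)

    // kCIInputLinearSpaceFilter asks the RAW pipeline to run this filter
    // on the scene-linear data -- i.e. on the RAW data stream, before
    // the final rendering stages.
    rawFilter.setValue(custom, forKey: kCIInputLinearSpaceFilter)

    // The rendered result, with the custom step baked into the RAW flow.
    let output = rawFilter.outputImage
}
```

Because the custom step is just a CIFilter, in principle any vendor's adjustment could slot into that same stage.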

If the redesigned Photos for Mac in 2015 is built with 3rd party plug-ins in mind, photographers can avoid round-tripping an image; the whole workflow stays on the RAW file. In practice, that concept is probably years away. Yet a round-trip that is lighter weight and non-destructive? Not so much.

Another bit of goodness from this presentation is enhanced RAW support with much better noise reduction and lens correction. Both are much needed, and the OS level is the right place to do it: every app benefits. And if any 3rd party wishes to improve upon it, they can snap in a CIFilter.

Apps can modify metadata, changes reflected in other apps

Introducing the Photos Framework

This is an iOS-centric presentation; however, with a complete redesign of Photos for Mac slated for 2015, I think a lot of what's in it will apply to the Mac as well.

About 14 minutes into the video, metadata changes are discussed. The slide is boring and basic, but the ramifications are intriguing. The presentation covers only the basics – setting a favorite photo and organizing into albums. The cool thing is that changes made in one application are reflected in all others, and in a way that doesn't require cross-application locking. Ok, a little geeky here, but in basic terms there are no races between applications; iOS handles it all.
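A rough sketch of what that looks like in the Photos framework: every edit goes through a change block, and the library then notifies registered observers, so other apps (and your own UI) pick up the change without any locking on your part. The function and class names here are my own.

```swift
import Photos

// Hypothetical helper: toggle the favorite flag on an asset.
// All mutations are funneled through performChanges; the library
// serializes them and broadcasts the result.
func toggleFavorite(_ asset: PHAsset) {
    PHPhotoLibrary.shared().performChanges({
        let request = PHAssetChangeRequest(for: asset)
        request.isFavorite = !asset.isFavorite
    }, completionHandler: { success, error in
        // success is true once the library has committed the change.
    })
}

// Seeing changes made by *other* apps: register an observer and the
// library calls back whenever the underlying data moves underneath you.
class LibraryObserver: NSObject, PHPhotoLibraryChangeObserver {
    func photoLibraryDidChange(_ changeInstance: PHChange) {
        // Inspect changeInstance for details and refresh the UI.
    }
}
```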

Now extrapolate this: more metadata like photo ratings, keywords, titles, captions, IPTC, and custom fields. Add in iCloud and sync this data to other iDevices – and, in 2015, to Photos on your Mac. The mobile workflow possibilities make me as giddy as a school girl. Ok, well, maybe not that giddy. But giddy. Or at least giddy-like qualities.

Did I say iCloud? At ~35 minutes in, iCloud is front and center – and not just for metadata, for adjustments, too. Adjustments are non-destructive, visible in all apps, and synced across devices. Even better, Photos is opened up to 3rd party adjustments and effects with the PhotosUI framework. From within iOS 8 Photos, 3rd party adjustments and filters are available directly in-app. At a high level, it's quite similar to the CIFilter approach discussed above.
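The piece that makes those 3rd party edits non-destructive is PHAdjustmentData: an editing extension saves a description of its edit alongside the rendered output, so the original survives and any app that recognizes the format can reconstruct and re-edit the adjustment. A minimal sketch, with a made-up identifier and payload:

```swift
import Photos

// Hypothetical edit settings for a third-party filter, serialized so
// they can be replayed later. The keys here are invented for the example.
let settings = try JSONSerialization.data(
    withJSONObject: ["filter": "sepia", "intensity": 0.8])

// PHAdjustmentData travels with the edited asset. Apps that recognize
// the formatIdentifier can decode the payload and resume editing;
// everyone else just shows the rendered result.
let adjustment = PHAdjustmentData(
    formatIdentifier: "com.example.myeditor",  // reverse-DNS, your own
    formatVersion: "1.0",
    data: settings)
```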

Apple – if you bring non-destructive editing across applications to the Mac with Photos in 2015 and keep the awesome organizational and workflow tools from Aperture, you've really got me.

An example of a mask and a motion blur effect in iOS 8

Developing Core Image Filters for iOS

This presentation is much deeper than the first two. Lots of code snippets. I've only skimmed through the PDF presentation. What I find interesting is the transforms and filters iOS is providing to application developers. 

There's everything from basic transforms, like flipping images vertically and horizontally, to more advanced techniques that apply effects through image masks. I look at these features in conjunction with the custom RAW filter discussed in "Advances in Core Image" and see a RAW-based, non-destructive workflow.
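To illustrate that range, here's a small sketch spanning both ends: a basic vertical flip via an affine transform, and a masked blur that blends a blurred copy with the original through a grayscale mask. The function name is mine; it assumes you already have a `CIImage` and a mask.

```swift
import CoreImage

// Hypothetical helper: flip an image vertically, then apply a blur
// only where the mask is white.
func flipAndMaskedBlur(input: CIImage, mask: CIImage) -> CIImage? {
    // Basic transform: flip vertically, keeping the image in place.
    let flipped = input.transformed(
        by: CGAffineTransform(scaleX: 1, y: -1)
            .translatedBy(x: 0, y: -input.extent.height))

    // More advanced: blur the flipped image...
    let blur = CIFilter(name: "CIGaussianBlur")!
    blur.setValue(flipped, forKey: kCIInputImageKey)
    blur.setValue(10.0, forKey: kCIInputRadiusKey)

    // ...then blend blurred and sharp versions through the mask
    // (white areas show the blur, black areas keep the original).
    let blend = CIFilter(name: "CIBlendWithMask")!
    blend.setValue(blur.outputImage, forKey: kCIInputImageKey)
    blend.setValue(flipped, forKey: kCIInputBackgroundImageKey)
    blend.setValue(mask, forKey: kCIInputMaskImageKey)
    return blend.outputImage
}
```

Chain a few of these and you get exactly the kind of layered, re-editable effect the mask-and-motion-blur example above hints at.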


iOS 8 is going to be very interesting, and I am even more intrigued now by the upcoming Photos for Mac. Photography is going mobile. I know I want more workflow capability when away from the studio. I want a better mesh between my iDevice photos and my Mac library. And I would be thrilled with an end-to-end non-destructive workflow that includes 3rd party applications.

Apple… keep innovating at the OS level and make rock solid image building blocks. Really engage the 3rd party developers – I'm talking the likes of onOne, Google/Nik, Topaz, and Macphun. Build a Photos for Mac with them in mind. And keep the DAM aspects of Aperture. Do that, and the future is exceedingly bright for you and photographers alike.