iOS 7 beta 2 launched on Monday, and 9to5Mac is reporting today that the release includes new APIs that let developers detect blinking and smiles in photos. Basic face detection has been available since iOS 5, but these additions allow apps to recognize more specific expressions. How will developers apply them, and what possibilities could they open up?
I can personally imagine an app that takes multiple photos at once – similar to a feature in the most recent Samsung Galaxy camera software – and flags the bad shots by checking whether someone isn't smiling or has their eyes closed. Overall, Apple exposing more of its camera capabilities through APIs means better third-party apps and more useful tools. What would you make?
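For developers curious how this surfaces in code: the expression options appear in Core Image's `CIDetector` as the `CIDetectorSmile` and `CIDetectorEyeBlink` keys, with results returned as `CIFaceFeature` objects. Here's a minimal sketch of the burst-filtering idea described above – the image loading and app plumbing are assumed, and `findBadShots` is a hypothetical helper name:

```swift
import CoreImage

// Sketch: scan a burst of images and return the indices of "bad" shots,
// where any detected face isn't smiling or has an eye closed.
func findBadShots(in images: [CIImage]) -> [Int] {
    // Face detector with high accuracy; nil context uses a default.
    let detector = CIDetector(ofType: CIDetectorTypeFace,
                              context: nil,
                              options: [CIDetectorAccuracy: CIDetectorAccuracyHigh])
    var badIndices: [Int] = []
    for (index, image) in images.enumerated() {
        // Ask the detector to also classify smiles and eye blinks.
        let features = detector?.features(in: image,
                                          options: [CIDetectorSmile: true,
                                                    CIDetectorEyeBlink: true]) ?? []
        for case let face as CIFaceFeature in features {
            if !face.hasSmile || face.leftEyeClosed || face.rightEyeClosed {
                badIndices.append(index)
                break // one bad face is enough to flag the shot
            }
        }
    }
    return badIndices
}
```

An app could then keep only the photos whose indices aren't flagged, which is essentially the Samsung-style best-shot picker built on Apple's new detection keys.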