When developers dug through Apple’s beta versions of iOS 5, they found what were deemed “highly sophisticated” APIs that let an iPhone automatically track eye and mouth positions (so the angle to the user, and possibly where their attention is directed, could be calculated) and pass key data on to a face-recognition algorithm accessible to all apps, not just Apple’s own. The tricks are definitely borrowed from a firm called Polar Rose, which Apple bought relatively recently…
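Face detection along these lines did end up shipping publicly in iOS 5, as Core Image’s `CIDetector`/`CIFaceFeature` API, which exposes exactly the eye- and mouth-position data described above. A minimal sketch (the `photo.jpg` path is a placeholder, and whether a given feature is reported depends on the image):

```swift
import Foundation
import CoreImage

// Build a face detector; CIDetectorAccuracyHigh trades speed for precision.
let context = CIContext()
let detector = CIDetector(ofType: CIDetectorTypeFace,
                          context: context,
                          options: [CIDetectorAccuracy: CIDetectorAccuracyHigh])!

// Load an image and ask for face features (placeholder file path).
let image = CIImage(contentsOf: URL(fileURLWithPath: "photo.jpg"))!
for case let face as CIFaceFeature in detector.features(in: image) {
    print("face bounds:", face.bounds)
    // Eye and mouth positions are only present when the detector finds them.
    if face.hasLeftEyePosition  { print("left eye:",  face.leftEyePosition) }
    if face.hasRightEyePosition { print("right eye:", face.rightEyePosition) }
    if face.hasMouthPosition    { print("mouth:",     face.mouthPosition) }
}
```

From the eye positions an app can estimate the face’s tilt relative to the camera, which is presumably how the attention-direction calculation mentioned above would work.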
