At Facebook's F8 conference, Mark Zuckerberg unveiled his vision for augmented reality (AR). To enable developers to build AR tools for the camera and bring people together in new ways, Facebook launched a platform for creating AR camera effects.
Many people already use their phone cameras to write text on images, add digital objects, and alter scenes with face filters and style transfers. "We're all about extending the physical world online… so AR is going to help us mix the digital and physical reality better," Zuckerberg said.
"We're going to make the camera the first mainstream AR platform," he added.
To help thousands of developers around the world build unique filters for the camera, Facebook has launched the platform in closed beta.
“You’re going to be able to swipe to your camera and swipe through the effects: face masks, art frames, style transfers,” he said. “But instead of a few options to choose from, you’ll have thousands to choose from.”
These filters will eventually become available across Facebook-owned properties, including Instagram, Messenger, and WhatsApp, and will be free to download.
The platform will allow developers to use precise location, object recognition, and depth detection to create their effects. Facebook's camera will be able to recognise specific objects, such as a coffee cup, and surface related effects to users, like steam rising off the cup or sharks swimming inside the coffee.
The Camera Effects Platform currently includes two products for developers: Frames Studio and AR Studio. Frames Studio is an online creative editor, now available globally, that lets people design frames for use either as profile picture frames or in the new Facebook camera. No coding knowledge is required: you simply upload an image, and your name appears on the frame's preview and on News Feed posts to give you credit.
AR Studio, now open for beta applications, can be used to create masks, scripted effects, animated frames and other AR technologies that react to movement, the environment or interactions during Live videos.
Developers can also use three inputs to trigger their augmented reality effects: the Face Tracker; sensor data, such as the gyroscope and location; and scripting APIs to pull in data from other apps and respond to user inputs in real time.
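To make the idea concrete, the sketch below shows how an effect might combine those three kinds of triggers. This is purely illustrative: the object and function names are invented stand-ins, not the real AR Studio scripting API.

```javascript
// Hypothetical sketch of combining the three trigger inputs described above.
// All names here are illustrative stubs, not actual AR Studio APIs.

// Stub inputs standing in for the platform's real data sources.
const faceTracker = { mouthOpen: true };         // Face Tracker input
const sensors = { gyroscope: { tiltDeg: 12 } };  // device sensor input
const liveComments = ["wow", "shark!"];          // external data via a scripting API

// Pick an effect based on the combined inputs, in priority order.
function chooseEffect(face, sensor, comments) {
  if (face.mouthOpen) return "steam";                               // react to the face
  if (Math.abs(sensor.gyroscope.tiltDeg) > 10) return "tilt-frame"; // react to motion
  if (comments.includes("shark!")) return "sharks";                 // react to live data
  return "none";
}

console.log(chooseEffect(faceTracker, sensors, liveComments)); // "steam"
```

In a real effect, each branch would swap an asset or animation in the scene rather than return a string; the point is simply that face, sensor, and scripted data can all drive the same effect logic.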
Zuckerberg’s grand vision sees augmented reality as an economic equaliser. He told TechCrunch, “The ability to just have glasses or eventually contact lenses where you can overlay whatever information you want [means] we could put a TV on the wall and have it be a $1 app instead of a $500 piece of hardware that a lot of people can’t afford. So if the glasses cost $500 or whatever it is, you’re saving a tonne of money compared to all the other hardware that you would have to buy.”
This vision also aligns with the themes of his 6,000-word manifesto.