Facebook is looking to move ahead of Snapchat’s augmented reality efforts by bringing developers into the loop with its new Camera Effects platform, which lets them build AR experiences and filters for the Facebook camera. For now, the social networking giant wants to use the platform to build these experiences for smartphones, with the intention of eventually porting them to futuristic smart glasses or contact lenses, whichever we manage to develop first.

Describing the platform in an official blog post, the company says:

The Camera Effects Platform turns smartphone cameras into the first AR platform, providing an opportunity for artists and developers to create effects for the Facebook camera.

Facebook CEO Mark Zuckerberg also addressed the criticism that the company has been copying Snapchat’s basic features as well as its AR features, saying those were just a stepping stone. He is no longer worried about that comparison and is now looking to developers to build apps and filters on top of the newly released platform.

He went on to describe how any augmented reality platform revolves around three basic capabilities: information, digital objects, and enhancements. The first step towards making this possible is a popular AI technique called SLAM (simultaneous localization and mapping), which lets the platform precisely track the position of an object in real time when you point your camera at it in the real world. SLAM also provides an estimate of depth, which comes in handy when placing virtual objects in your surroundings.
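To make the role of pose and depth a bit more concrete, here is a minimal sketch, in Python, of how the camera pose reported by a SLAM tracker and a depth estimate at a tapped pixel could be combined to anchor a virtual object in world coordinates. The intrinsics, pose values, and function names are assumptions for illustration only; none of this is Facebook's actual SDK.

import numpy as np

def back_project(u, v, depth, fx, fy, cx, cy):
    # Convert a tapped pixel (u, v) with a known depth into a 3-D point in camera coordinates.
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])

def camera_to_world(point_cam, rotation, translation):
    # Transform a camera-space point into world space using the pose reported by the SLAM tracker.
    return rotation @ point_cam + translation

# Assumed values for illustration only.
intrinsics = dict(fx=1000.0, fy=1000.0, cx=640.0, cy=360.0)   # assumed camera intrinsics
rotation = np.eye(3)                        # assumed camera orientation in the world frame
translation = np.array([0.0, 1.4, 0.0])     # assumed camera position, e.g. 1.4 m above the floor

anchor_cam = back_project(640, 360, 0.8, **intrinsics)        # user taps the centre pixel; surface is 0.8 m away
anchor_world = camera_to_world(anchor_cam, rotation, translation)
print("place virtual object at world coordinates:", anchor_world)

Once an anchor like this exists in world coordinates, the effect can stay pinned to the real object even as the camera moves, which is what makes the real-time mapping step so important.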

Another aspect would be capturing and reproducing 3D objects from the real world with high precision, which would let you add effects and tweak aspects of that scene in virtual space. This is only possible if Facebook has state-of-the-art object recognition technology, which it probably does. Zuckerberg says the Camera Effects platform will let you tap on objects in the real world and get effect suggestions based on what the camera recognizes.
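As a rough illustration of that tap-to-suggest idea, the sketch below pairs a stubbed object recognition step with a lookup table of candidate effects. The labels, effect names, and the classify_tapped_region function are hypothetical placeholders, not part of the Camera Effects platform.

# Assumed lookup table from recognized labels to candidate AR effects;
# labels and effect names are made up for illustration.
EFFECT_SUGGESTIONS = {
    "coffee_mug": ["rising_steam", "extra_coffee_cup"],
    "flower_pot": ["blooming_flowers", "virtual_rain"],
    "face": ["headband_overlay", "running_stats_banner"],
}

def classify_tapped_region(image_crop):
    # Stand-in for an object recognition model; a real system would run a trained classifier here.
    return "coffee_mug"

def suggest_effects(image_crop):
    label = classify_tapped_region(image_crop)
    return EFFECT_SUGGESTIONS.get(label, [])

print(suggest_effects(image_crop=None))  # -> ['rising_steam', 'extra_coffee_cup']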

For example, you could be sitting in a coffee shop, whiling your time away. You could pull out your smartphone and tap on your coffee mug to surface virtual effects such as rising steam or an extra coffee cup to enhance your photos. You could also tap on a flower pot whose plant has no flowers and watch flowers bloom through your camera lens, or call upon the rain gods to water the plant, virtually, of course. Nike is employing the same technique to map your face and place a headband and your running stats on the screen.

The Camera Effects platform is being made available to developers in closed beta starting today. It gives them access to SDKs and tools for recognizing objects, pinning down precise locations, and detecting the depth of the surroundings, all of which is necessary to seamlessly place virtual objects alongside real ones. Launch partners for the platform include GIPHY, Manchester United, Nike, and TripIt, among others.
