For many, technology seems inaccessible. But art makes technology more human.

These are the words the American artist Heather Day wrote in a letter to Facebook, hoping to combine her art with Facebook's Augmented Reality (AR) technology. She never sent the letter, but in a happy turn of events, Facebook contacted her weeks later with a suggestion of its own: an AR art project for Facebook's Menlo Park headquarters.

Both Virtual Reality (VR) and AR have made impressive progress over the past 12 months. Customers can already experience things that were only conceivable in dreams a short time ago.

Imagine watching extinct animals or dragons roam your local park. Or opening a portal in your room that turns it into a blooming cityscape, where your walls are suddenly covered with graffiti. Scenarios like these are no longer fantasy or science fiction. Instead, they are examples of what Augmented Reality can do on your smartphone today.

At its core, AR technology changes the way you see the world around you. Facebook is now bringing the latest advances in this technology to its smartphone app.

To do this, the app must first build a map of the environment as you explore it in real time, while also accurately estimating the position and orientation of the smartphone camera relative to that map. This ability to place and lock digital objects realistically onto real ones is called simultaneous localization and mapping (SLAM), and it remains a long-standing challenge in computer vision and robotics research.
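
The localization half of that loop can be illustrated in a few lines of OpenCV: given a handful of already-mapped 3D points and where they appear in the current camera image, the camera's pose can be recovered. This is only a toy sketch on synthetic data, not Facebook's implementation; the intrinsics, landmark positions and camera motion below are made up.

```python
# Toy localization step of SLAM: recover the camera pose from known
# 3D map points and their 2D observations (a PnP problem). Synthetic data.
import numpy as np
import cv2

# A tiny "map": 3D landmarks in world coordinates (made-up values).
map_points = np.array([[0, 0, 5], [1, 0, 6], [0, 1, 5.5],
                       [-1, 0.5, 4.5], [0.5, -1, 6.5], [1, 1, 5]], dtype=np.float64)

# Simple pinhole intrinsics: focal length and principal point in pixels.
K = np.array([[800, 0, 320],
              [0, 800, 240],
              [0, 0, 1]], dtype=np.float64)

# Ground-truth camera motion, used only to synthesize the 2D observations.
rvec_true = np.array([0.05, -0.02, 0.01])
tvec_true = np.array([0.1, -0.05, 0.2])
image_points, _ = cv2.projectPoints(map_points, rvec_true, tvec_true, K, None)

# Localization: estimate the camera pose from the 3D-2D correspondences.
ok, rvec_est, tvec_est = cv2.solvePnP(map_points, image_points, K, None)
print("estimated rotation:", rvec_est.ravel())
print("estimated translation:", tvec_est.ravel())
```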

The history of SLAM.

Bringing SLAM to mobile devices took more than 30 years of research. The first SLAM techniques were published in the 1980s and were originally developed for robot navigation in unknown environments.

Back then, SLAM relied on expensive or custom sensors such as LIDAR, SONAR or stereo cameras. But with technological progress, and with modern smartphones that almost all include at least one camera, a gyroscope and an accelerometer, SLAM is now available to everyone. Today it is used not only to place objects in a scene, but also in a variety of other applications, including self-driving cars, robotic vacuum cleaners and minimally invasive surgery.

Mobile SLAM on Facebook.

Our Applied Machine Learning (AML) team, which takes the latest advances in AI research and turns them into infrastructure for new products, built on earlier work by Oculus's Computer Vision group to develop and deploy SLAM at scale. There were three major technical challenges along the way.

An algorithm tailored to each device.

Facebook's SLAM library integrates techniques from multiple systems (ORB-SLAM, SVO and LSD-SLAM), but what really sets it apart is how thoroughly its performance is optimized for each application. Running a SLAM system at 60 Hz on mobile devices is hard: every 16 milliseconds, your phone must capture an image, find hundreds of interesting keypoints, match them to the same points in the previous frame, and then use trigonometry to work out where each of those points lies in 3D space. On top of that, many fine-grained optimizations are needed, and how these algorithms actually work has to be constantly reconsidered.
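
To get a feel for that 16-millisecond budget, here is a rough sketch of the detect-and-match step using OpenCV's ORB features on synthetic frames. Facebook's library is custom and not OpenCV-based, so this is purely illustrative; the image sizes, feature counts and fake camera motion are assumptions.

```python
# Rough sketch of per-frame keypoint detection and matching against the
# previous frame, timed against a ~16 ms budget (60 Hz). Illustrative only.
import time
import numpy as np
import cv2

rng = np.random.default_rng(0)
prev_frame = rng.integers(0, 255, (480, 640), dtype=np.uint8)   # fake previous frame
curr_frame = np.roll(prev_frame, 3, axis=1)                     # fake small camera motion

orb = cv2.ORB_create(nfeatures=500)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

start = time.perf_counter()
kp_prev, desc_prev = orb.detectAndCompute(prev_frame, None)     # find keypoints in old frame
kp_curr, desc_curr = orb.detectAndCompute(curr_frame, None)     # find keypoints in new frame
matches = matcher.match(desc_prev, desc_curr)                   # match them across frames
elapsed_ms = (time.perf_counter() - start) * 1000

print(f"{len(kp_curr)} keypoints, {len(matches)} matches in {elapsed_ms:.1f} ms "
      f"(budget at 60 Hz is about 16 ms)")
```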

In addition, a challenge of deploying mobile SLAM across the Facebook ecosystem is that the community uses a huge variety of mobile devices. Facebook wants to support as many users as possible, so part of the effort is making sure the SLAM implementation is backward compatible with older and less capable devices.

Take device calibration as an example. Both iOS and Android smartphone models have unique characteristics, but Android is particularly diverse: there are thousands of models with different hardware capabilities. Each model needs its own camera calibration of focal length, principal point and distortion parameters so that 3D points can be projected into the camera image with sub-pixel accuracy.
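
The sketch below shows why those per-model parameters matter: projecting the same 3D point into pixel coordinates gives different results under different calibrations. The projection uses a standard pinhole model with simple radial distortion; all numbers are made up and do not correspond to any real device.

```python
# Pinhole projection with radial distortion: the same 3D point lands on
# different pixels under two hypothetical device calibrations.
import numpy as np

def project(point_3d, fx, fy, cx, cy, k1, k2):
    """Project a 3D point to pixels using focal length (fx, fy),
    principal point (cx, cy) and radial distortion (k1, k2)."""
    x, y, z = point_3d
    xn, yn = x / z, y / z                      # normalized image coordinates
    r2 = xn ** 2 + yn ** 2
    d = 1 + k1 * r2 + k2 * r2 ** 2             # radial distortion factor
    u = fx * xn * d + cx                       # pixel column
    v = fy * yn * d + cy                       # pixel row
    return np.array([u, v])

p = (0.3, -0.2, 2.0)                           # a 3D point in camera coordinates
print(project(p, fx=820, fy=820, cx=320, cy=240, k1=0.05, k2=-0.01))
print(project(p, fx=790, fy=795, cx=318, cy=242, k1=0.12, k2=-0.03))
```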

Mobile devices also use rolling-shutter cameras with autofocus and auto-exposure, which must be taken into account as well. As the camera refocuses on things that are closer or farther away, this calibration changes. The IMU (the inertial measurement unit that tracks the device's acceleration and rotation) also needs to be calibrated, and the camera and IMU clocks need to be synchronized. Facebook starts with a rough calibration for each model and, over time, tunes it to users' specific devices.
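
One small piece of that camera/IMU alignment can be sketched as follows: IMU samples arrive at a few hundred hertz on their own clock, camera frames at 30 to 60 Hz on another, so gyroscope readings have to be resampled at each offset-corrected frame timestamp. The rates, signal and clock offset below are purely illustrative, not measured values.

```python
# Resample a gyroscope signal at camera frame timestamps, after correcting
# for an assumed clock offset between the two sensors. Synthetic data.
import numpy as np

imu_t = np.arange(0.0, 1.0, 0.005)             # 200 Hz IMU timestamps (seconds)
gyro_x = np.sin(2 * np.pi * imu_t)             # fake angular velocity around x

frame_t = np.arange(0.0, 1.0, 1 / 30)          # 30 Hz camera frame timestamps
clock_offset = 0.012                           # assumed camera-vs-IMU offset (s)

# Gyro value at each (offset-corrected) camera timestamp.
gyro_at_frames = np.interp(frame_t + clock_offset, imu_t, gyro_x)
print(gyro_at_frames[:5])
```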

Keeping the binary size down.

The Facebook app is already one of the more complex apps in the Android and iOS app stores, and Facebook is constantly working to add exciting new features while keeping the overall size as small as possible. The original SLAM library was developed at Oculus for a different use case and was about 40 MB in size because it relied on several large open-source libraries. Facebook extracted the minimal SLAM functionality needed for this work and reworked it to use popular Facebook libraries, resulting in a library of less than 1 MB.

Making the technology convincing.

Compelling mobile AR requires more than just SLAM. Facebook began researching the first prototype last November, placing 3D art on surfaces reconstructed by SLAM. Since then, Facebook's UX research into the most intuitive gestures has been in full swing: placing and replacing art, changing it, and rotating, moving and scaling it after placement so that people can precisely frame their compositions with their mobile devices. Facebook also explored how to recognize specific locations for AR content and how to analyze scene geometry so that virtual objects adhere to real surfaces.
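
One common way to make virtual objects "adhere to real surfaces" is to fit a dominant plane to the reconstructed 3D points with RANSAC and anchor content to that plane. The following is a minimal sketch of that idea on synthetic points, not Facebook's implementation; thresholds and iteration counts are arbitrary.

```python
# Fit a dominant plane to a noisy 3D point cloud with a simple RANSAC loop.
import numpy as np

rng = np.random.default_rng(1)
# Fake reconstruction: mostly floor points (z near 0) plus some clutter above.
floor = np.column_stack([rng.uniform(-1, 1, 300), rng.uniform(-1, 1, 300),
                         rng.normal(0, 0.005, 300)])
clutter = rng.uniform(-1, 1, (60, 3))
points = np.vstack([floor, clutter])

best_inliers, best_plane = 0, None
for _ in range(200):                                   # RANSAC iterations
    sample = points[rng.choice(len(points), 3, replace=False)]
    normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
    if np.linalg.norm(normal) < 1e-8:                  # degenerate sample
        continue
    normal /= np.linalg.norm(normal)
    d = -normal @ sample[0]
    dist = np.abs(points @ normal + d)                 # point-to-plane distance
    inliers = (dist < 0.01).sum()
    if inliers > best_inliers:
        best_inliers, best_plane = inliers, (normal, d)

print("plane normal:", best_plane[0], "inliers:", best_inliers)
```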

In order to create a better user experience, Facebook also had to take into account each technology's failure modes and develop fallback solutions. To this end, Facebook developed the WorldTracker API, a comprehensive interface that combines SLAM with other tracking technologies to “place things in the world”. The current version of WorldTracker switches between SLAM and a gyroscope-enhanced, image-based tracker that keeps placing things in the world when SLAM is not sure where they are.
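
The internals of WorldTracker are not public, so the following is only a hypothetical sketch of the fallback idea described above; the class and method names are invented for illustration and are not the actual API.

```python
# Hypothetical fallback tracker: prefer full SLAM poses when tracking is
# confident, fall back to a rotation-only (gyro-based) tracker otherwise.
class FallbackWorldTracker:
    def __init__(self, slam_tracker, gyro_tracker, confidence_threshold=0.5):
        self.slam = slam_tracker            # full 6-DoF SLAM tracker (assumed interface)
        self.gyro = gyro_tracker            # gyro-enhanced image tracker (assumed interface)
        self.threshold = confidence_threshold

    def get_pose(self, frame, imu_samples):
        pose, confidence = self.slam.track(frame)
        if confidence >= self.threshold:
            # SLAM knows where the world is: anchor content in 3D.
            return pose
        # SLAM is lost (fast motion, textureless wall, etc.): keep content
        # roughly stable using orientation from the gyroscope instead.
        return self.gyro.track(imu_samples)
```

The key design choice is that the user never sees tracking fail outright; the experience degrades gracefully from full world-locked placement to orientation-only stabilization.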

Facebook's first AR-driven art project with Heather Day.

After Facebook had developed these basic tools, it was time to work with an artist to learn new techniques that would help make AR authentic and part of everyday life. Facebook invited Heather Day to the Menlo Park campus, where her artwork would be virtually installed. Every time she poured paint, made a brush stroke, or made another movement, the AML team recorded it on camera and added it to a digital library.

The AML team worked with Heather to determine which images should be handed to the animators and how they should move in the living, breathing AR installation. Within two weeks, the team built technology that recognizes the specific location of the artwork and analyzes the scene geometry so that Heather's virtual installation adheres to real surfaces.

At Facebook's F8 developer conference this year, the audience saw Heather's art flow in rhythm from the walls to the floor like a waterfall. Through SLAM technology and Heather's creative know-how, the boundaries between the virtual and the real were blurred, offering a glimpse of how technology and art can interweave. This is our vision: to enrich everyday life with the possibilities of the virtual, digital ecosphere.

Outlook.

AR opens up unlimited possibilities for engaging with and experiencing the world. While Facebook has come incredibly far in improving AR technology, there is more to do. The next step is to create even more geolocalized and persistent experiences, like the one already built for Heather's AR installation in Menlo Park. Facebook is also exploring how the power of deep neural networks and Caffe2 can be harnessed to build more complete SLAM maps, handle dynamic objects, add semantic information and create persistent AR experiences that are deeply integrated into the Facebook ecosystem.

Thank you for reading.