At its Snap Partner Summit, the company behind Snapchat announced that it would be strengthening its camera function by adding new augmented reality (AR) features along with AR-driven search experiences. For Snap, which has called itself a 'camera company' for some time now, the doubling down on its core function comes as it faces tough competition from bigger rivals like Facebook and Instagram, which have copied most of Snapchat's core functions to their advantage.
Snapchat's big advantage still remains the camera, especially its Lens feature, which has the edge in providing an accurate and rather whimsical take on AR.
For one, Snapchat will start showcasing and promoting Lenses created by developers in the community, with a new dedicated Creators profile as well. In 2017, Snap had announced its desktop tool called Lens Studio, which lets outside creators release their own Lenses for the platform. Lens Studio was a big change from Snapchat's tightly controlled in-house approach, and the new camera features continue in this direction.
Close to 400,000 Lenses have been created using the tool, and these third-party Lenses have been viewed close to 15 billion times on the app, revealed Snap's co-founder and CTO Bobby Murphy during the keynote session. Lens Studio works on desktop on both Windows and Mac to let developers create AR experiences.
With the new camera, Snap is adding an AR bar to the bottom of the screen below the Lens Carousel, which will start rolling out slowly. This bar will showcase all the new AR features, including Lenses from these developers, the new Creators profile, and the Scan feature, which adds more search functions to the camera.
In Lens Studio, Snapchat is introducing accurate hand and body tracking as well as pet tracking, which will let developers create AR experiences for these kinds of scenarios. Hand tracking, for instance, will work with both the front and rear cameras. A user could point the camera at their hand, and an AR experience will appear, depending on the Lens.
While Snapchat Lenses have traditionally needed a face to create the AR experience, a user could soon point the camera at their hand or at a person to get a similar kind of experience. It will also be extended to pets, a feature that has been in high demand from users, given that photographing pets is a big part of user-generated social media content.
Lens Studio will also support a new Landmarker Lens experience. These Lenses will give users the option of experiencing AR on iconic landmarks around the world. The locations chosen for now are Buckingham Palace (London), the United States Capitol Building (Washington, DC), the Eiffel Tower (Paris), the Flatiron Building (New York City), and the TCL Chinese Theatre (Los Angeles). Snap Inc plans to add more locations in the future.
When asked whether it would control these Lens experiences for landmarks, given the risk that someone could put up, say, a Nazi Lens on the Eiffel Tower, Snap executives said this would be a highly curated and carefully vetted feature.
Snapchat is not stopping at adding more AR experiences in the form of funny filters; it is also introducing more utility functions to the Scan and search feature.
Right now, Snapchat users can press and hold on the camera screen to scan and unlock relevant experiences. For instance, scanning a Snapcode will unlock new Lenses or special Filters, or add friends. Scanning a physical barcode can open product results on Amazon in the US, and while listening to music, a user can scan to get more information about the song via an integration with Shazam.
In addition to this, Snap is adding two new partners to the Scan function. One is Giphy, which will show relevant GIFs for users to add to their Stories on a long press. The other is Photomath, which will help solve math equations when the camera is pointed at them. Considering Snapchat's core user base is still younger audiences in countries like the US, UK, and France, the math feature will find favour with many high school and college students.
Snapchat may have popularised the idea of ephemeral messaging, with its disappearing chats, photos and Stories, but the company has understood that its AR-driven Lenses are not a passive mode of consumption.
According to Snap executives, users are active participants in how they engage with these Lenses, which explains why the company is adding more features to them. But what remains unclear is exactly how revenues will work out for all the developers who create Lenses on the platform.
Snapchat has had branded Lenses as well, where a particular brand can pay for an AR-driven Lens. The company says it will encourage brands to reach out to external developers, or even partner with its own in-house team of AR experts. But whether that will be enough to turn the funny Lenses into a serious revenue model for Snapchat and external creators is still unclear.
Disclaimer: The author is in Los Angeles at the invite of Snap Inc.