As Snapchat has grown, the way AR lenses and filters are used has changed. The app initially featured only basic AR filters, but it quickly became popular because it enabled new forms of social media engagement.
However, Snapchat did not introduce AR lenses as we know them until 2015. Continuing the trend, Snapchat announced Lens Studio in December 2017, allowing users and marketers to build custom filters and apply them to personal snaps as well as sponsored content.
Two years later, in 2019, Snapchat expanded its AR lens capabilities, letting users apply augmented reality to dogs, hands, bodies, and famous landmarks around the globe.
For example, Snapchat users can now use the “Ground Transformation” feature to turn their floors into lava. According to Forbes, this feature is likely to be adopted by businesses next, since any terrain can now be transformed into a branded environment.
Snapchat lenses have progressed from a simple form of amusement into a full-fledged social environment and digital economy in which users communicate and engage with one another.
Users can now make money by designing bespoke filters and selling them to the Snapchat community and to companies; some designers charge up to $30,000 for a custom AR lens.
The technology behind Snapchat filters
Snapchat’s AR filters first gained traction when the company paid $150 million for Looksery in 2015. The acquisition gave Snapchat an in-house team that went on to create over 3,000 AR filters for users to choose from.
Looksery was a Ukrainian company focused on machine vision that had built an app letting users alter their facial appearance during video chats. Looksery’s technology led to the development of modern-day Snapchat AR lenses and filters.
Snapchat lenses became widespread less than a year after the purchase, with numerous celebrities, including Jessica Alba and Ariana Grande, experimenting with AR effects such as dog masks, bread faces, and golden goddess glasses.
During the 2016 Oscars, celebrities used Snapchat’s “Face Swap” filter to trade faces with Leonardo DiCaprio, resulting in one of the most famous viral campaigns.
The technology underlying the filters and lenses has evolved along with the Snapchat app. These features are now powered by a variety of AR mechanics, advanced computer vision algorithms, and neural networks.
Face recognition and tracking technologies are used in the majority of filters. Let’s look at how it all works in more detail.
Snapchat face recognition and tracking technology
Computer vision technology handles face detection and tracking. But what exactly does that mean? Computers represent our faces as a series of 1s and 0s that correspond to various facial features such as brows, noses, and foreheads.
Some parts of our faces are darker, such as the eyes, while others, such as the cheeks, are lighter.
Computers can distinguish faces from other objects when recurring combinations of 1s and 0s (signifying brighter and darker regions) are consistently detected at the same coordinates while the camera scans a picture.
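The idea of comparing brighter and darker regions can be sketched in a few lines of code. The toy example below is a simplified version of the classic Haar-like features used in Viola-Jones-style detectors (Snapchat’s actual pipeline is proprietary); it checks a single face-like cue: the eye band of an image patch being darker than the cheek band.

```python
# Toy sketch of a Haar-like brightness feature, the idea behind classic
# Viola-Jones face detection. Real detectors evaluate thousands of such
# features; this illustrates just one: "eyes darker than cheeks".

def integral_image(img):
    """Summed-area table: ii[y][x] = sum of img[0..y][0..x]."""
    h, w = len(img), len(img[0])
    ii = [[0] * w for _ in range(h)]
    for y in range(h):
        row_sum = 0
        for x in range(w):
            row_sum += img[y][x]
            ii[y][x] = row_sum + (ii[y - 1][x] if y > 0 else 0)
    return ii

def rect_sum(ii, x, y, w, h):
    """Sum of pixels in the rectangle (x, y, w, h), in O(1) via the table."""
    a = ii[y + h - 1][x + w - 1]
    b = ii[y - 1][x + w - 1] if y > 0 else 0
    c = ii[y + h - 1][x - 1] if x > 0 else 0
    d = ii[y - 1][x - 1] if x > 0 and y > 0 else 0
    return a - b - c + d

def eyes_vs_cheeks(img):
    """Positive value = top half darker than bottom half (a face-like cue)."""
    ii = integral_image(img)
    h, w = len(img), len(img[0])
    top = rect_sum(ii, 0, 0, w, h // 2)          # eye band
    bottom = rect_sum(ii, 0, h // 2, w, h // 2)  # cheek band
    return bottom - top

# A 4x4 grayscale patch: dark top rows (eyes), bright bottom rows (cheeks).
patch = [
    [20, 20, 20, 20],
    [30, 30, 30, 30],
    [200, 200, 200, 200],
    [210, 210, 210, 210],
]
print(eyes_vs_cheeks(patch))  # → 1440 (positive: face-like pattern)
```

A real detector slides many such rectangle comparisons over the image at multiple scales and only reports a face when enough of them agree.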
See the Chad filter below for an example. You can unlock this lens for your Snapchat account in two ways.
To unlock the lens on your device, open Snapchat on your phone, point the Snapchat camera at the snapcode image above, and press and hold on the camera screen.
Alternatively, if you are on a mobile device, you can unlock the lens in your Snapchat app by scanning the snapcode below.
How Snapchat adds filters to our faces
We now understand how a computer can detect a face in a photograph. But how does it add a canine nose, and how does the filter stay on our faces as we move through a video? To apply a face filter, a computer must perform more complex computations based on the Active Shape Model (ASM).
To train computers in facial recognition based on this statistical model, people had to manually mark facial boundaries on numerous photographs, producing a dataset the computers could use to draw inferences.
In the end, a computer considers all of the points on your face to figure out which parts of a picture belong to your ears, eyes, nose, and other facial features.
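The training step described above can be sketched in code. This is a minimal, illustrative first step of an Active Shape Model: averaging hand-labeled landmark coordinates into a mean face shape. The landmark data below are made up, and a real ASM also learns how shapes vary around the mean.

```python
# Minimal sketch of the first step of an Active Shape Model: averaging
# hand-labeled landmark points into a mean face shape. Real ASMs also
# learn the principal modes of variation around this mean; the landmark
# coordinates here are invented for illustration.

def mean_shape(labeled_faces):
    """Average each (x, y) landmark across all labeled training faces."""
    n_faces = len(labeled_faces)
    n_points = len(labeled_faces[0])
    mean = []
    for i in range(n_points):
        mx = sum(face[i][0] for face in labeled_faces) / n_faces
        my = sum(face[i][1] for face in labeled_faces) / n_faces
        mean.append((mx, my))
    return mean

# Three hand-labeled faces, each with landmarks for
# [left eye, right eye, nose tip, mouth center] (toy coordinates).
faces = [
    [(30, 40), (70, 40), (50, 60), (50, 80)],
    [(32, 42), (68, 42), (50, 62), (50, 82)],
    [(28, 38), (72, 38), (50, 58), (50, 78)],
]
print(mean_shape(faces))
# → [(30.0, 40.0), (70.0, 40.0), (50.0, 60.0), (50.0, 80.0)]
```

At runtime, the model starts from this mean shape and nudges each point toward matching image features, which is how it decides where your nose ends and your cheek begins.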
Using these landmark points, a computer can generate a 3D mask of your face that can be resized, rotated, and manipulated as new data arrives from your camera. Similarly, our 3D face tracking technology can instantly detect a real face in order to apply AR filters, enable product try-ons, virtual cosmetics, and much more.
Unlike other face detection and tracking algorithms, ours supports 90-degree face rotation, head tilts, and partial facial occlusion in addition to frontal face recognition. As users move through a video stream, the filters and lenses follow their faces.
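One common way to keep a lens glued to a moving face is to blend each frame’s fresh detection with the previous estimate. The sketch below uses simple exponential smoothing on a single landmark; the smoothing factor and coordinates are illustrative assumptions, not Snapchat’s (or any vendor’s) actual parameters.

```python
# Toy sketch of frame-to-frame landmark tracking: blend each new
# detection with the previous estimate (exponential smoothing) so the
# lens follows the face without jitter. The alpha value and landmark
# coordinates are illustrative, not real production parameters.

def smooth_landmarks(prev, detected, alpha=0.5):
    """Blend previous and newly detected points; alpha=1 trusts only the new frame."""
    return [
        (alpha * dx + (1 - alpha) * px, alpha * dy + (1 - alpha) * py)
        for (px, py), (dx, dy) in zip(prev, detected)
    ]

# A nose-tip landmark drifting right as the head turns, with detector noise.
detections = [(54.0, 60.0), (58.0, 60.0), (57.0, 60.0)]
points = [(50.0, 60.0)]          # initial estimate
track = [points[0]]
for det in detections:
    points = smooth_landmarks(points, [det])
    track.append(points[0])
print(track)
# → [(50.0, 60.0), (52.0, 60.0), (55.0, 60.0), (56.0, 60.0)]
```

Note how the tracked point lags slightly behind the raw detections: that lag is the price paid for suppressing jitter, and production trackers tune this trade-off per landmark.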
Some of the best Snapchat filters explained
Let’s take a look at some of the most popular Snapchat filters to understand what they accomplish and how the technology behind them works.
Beauty filters
This filter combines a number of Beauty AR technologies and features to blur your skin, enlarge your eyes, and make your features look smoother and more delicate. It is popular with users because it encourages people to create and share content.
Virtual try-on
According to Forbes, 70% of shoppers say it’s difficult to purchase clothing online, and retailers lose $550 million to returns. That’s why physics-based try-on technology, which allows for accurate item depiction, has become popular with both businesses and customers.
Farfetch, Prada, Dior, and others use Snapchat as a consumer engagement tool, offering filters that let users virtually try on items such as cosmetics, eyewear, jewelry, and clothing. Dior’s shoe try-on lenses garnered 2.3 million views and a 6.2x return on ad spend during one advertising campaign.
Face swap
When you snap photos or record videos with your smartphone, our technology recognizes faces automatically and lets you swap them.
Once the faces are exchanged, all of your expressions instantly display on the swapped face. Faces are turned into filters, which means they are applied using the same Active Shape Model as the rest of the lenses.
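The geometric part of a face swap can be illustrated with landmark points alone. The sketch below fits one face’s landmarks onto another face’s position and size using bounding boxes; this is a simplification (real swaps also warp and blend pixel textures), and all coordinates here are hypothetical.

```python
# Toy sketch of the geometry behind a face swap: fit one face's landmark
# shape onto the other face's position and size (scale + translation
# derived from bounding boxes). Real swaps also warp and blend pixel
# textures; the landmarks here are invented for illustration.

def bbox(points):
    """Bounding box (x, y, width, height) of a set of landmark points."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return min(xs), min(ys), max(xs) - min(xs), max(ys) - min(ys)

def fit_shape(src, dst):
    """Scale and translate src landmarks into dst's bounding box."""
    sx, sy, sw, sh = bbox(src)
    dx, dy, dw, dh = bbox(dst)
    return [
        (dx + (x - sx) * dw / sw, dy + (y - sy) * dh / sh)
        for x, y in src
    ]

# Face A's landmarks, mapped onto face B's (larger, shifted) region.
face_a = [(10, 10), (30, 10), (20, 30)]       # eyes + mouth, toy coords
face_b = [(100, 100), (140, 100), (120, 140)]
print(fit_shape(face_a, face_b))
# → [(100.0, 100.0), (140.0, 100.0), (120.0, 140.0)]
```

Because the same landmark model drives both faces, A’s expressions (moving mouth corners, raised brows) are carried along by the mapped points, which is why the swapped face keeps emoting.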
Time Machine
The “Time Machine” feature employs a neural network trained to age faces. By dragging a slider left or right, users can make themselves look younger or older. When it was released in 2019, it went viral, and it remains one of the most popular Snapchat filters today.