Has the stale selfie that’s served as your profile picture gone a little too long without a refresh? You’ve likely seen friends using the Lensa AI app to create colorful, custom cartoon images of themselves as ethereal fairies or stern astronauts. Prisma Labs, the company behind Lensa, went viral back in 2016 with a similar (albeit less powerful) app that turned smartphone pics into paintings.
The release of Lensa’s “magic avatars” feature has been a global hit for the company. Recent advancements in generative artificial intelligence allow the app to produce more impressive and varied results than its predecessor. According to preliminary estimates provided by Sensor Tower, more than 4 million people worldwide downloaded the app during the first five days of December. In that same time period, users spent over $8 million in the app.
AI-generated profile pictures raise questions about digital privacy, however. If you’re curious about whether it’s a good idea to use Lensa, here’s what you should consider before spending money and uploading your selfies.
Before diving in, take a minute to browse through the privacy policy and the terms of use to get a better understanding of what the app does with your data. “We always have to be aware when our biometric data is being used for any purpose,” says David Leslie, director of ethics and responsible innovation research at The Alan Turing Institute and a professor at Queen Mary University of London. “This is sensitive data. We should be extra cautious with how that data is being used.”
Andrey Usoltsev, the CEO and cofounder of Prisma Labs, said in an email to Wired that the company is working to update the privacy policy. “Lensa uses a copy of the Stable Diffusion model and teaches it to recognize the face on the uploaded images in each particular case,” writes Usoltsev. “This means there is a separate model for each individual user. The user’s photos are deleted from our servers as soon as the avatars are generated. The servers are located in the US.”
Although it’s impossible to know exactly how a company is using and storing your data without an independent assessment, this statement is a move in the right direction. With that in mind, however, uploads are only a small part of the larger equation.
While biometrics might be your initial concern, it’s also crucial to understand just how much additional data is automatically collected from your smartphone. Lensa may use third-party analytics, log file information, device identifiers, and registered user information to gather data on you. Go to section three of the privacy policy to check it out in detail.
Any user can opt out of that data collection by contacting the company at privacy@lensa-ai.com. If you use an iOS device, you have the option to opt out by going into your privacy settings. To be fair, it’s not just Lensa: Every app on your phone is probably collecting more data than you realize. Even if you decide to trust Lensa with your personal data, it’s quite possible for the data to change hands if the company is acquired in the future. “That especially happens when it goes into bigger companies that are much more adept at bullshitting around how they talk about it,” says Ben Winters, a lead on the AI and human rights projects at the Electronic Privacy Information Center.
Lensa’s terms of use prohibit uploading images of children or nudity to generate images. But even if you don’t upload nudes, women may receive hypersexualized results. “The app not only generates nudes but also ascribes cartoonishly sexualized features, like sultry poses and gigantic breasts, to their images,” writes Olivia Snow. “I, for example, received several fully nude results despite uploading only headshots.” The app generated disturbing imagery when Snow uploaded her childhood photos, turning what would have been stylized mementos into dehumanizing imagery. “Since the feature is not designed for minors, we advise against using any images of children,” writes Usoltsev.
Lensa can also produce sexual images of adults without their consent. “It’s a potential instance where insufficient forethought has been put into protecting the dignity of individuals,” says Leslie. “When technologies can harm, we’re on the hook to do all that we can to anticipate those impacts.”
It’s not just funky fingers and second heads: you may also receive results that are racist or sexist when you interact with generative AI. “The internet is filled with a lot of images that will push AI image generators toward topics that might not be the most comfortable, whether it’s sexually explicit images or images that might shift people’s AI portraits toward racial caricatures,” says Grant Fergusson, an Equal Justice Works fellow at the Electronic Privacy Information Center.
Some artists are embracing the potential for generative AI to produce fascinating results. Others are much more hesitant about the technology’s potential repercussions. “The commercialization of these image generators will have an impact on the ability for artists to keep sort of sustaining themselves in the long term,” says Leslie.
Although it’s a more expensive route, those who are able to should consider commissioning smaller artists to create digital pieces for their new profile picture, phone wallpaper, or portrait. Instagram and Twitter are full of artists who work in a variety of styles that you could never get from generative AI, and many of them are eager for commissions. For a nominal fee, you can get something truly unique and personal. You could even ask around at your community art center and support a local artist.
Let’s say you’ve considered everything above and still decided to spend a couple of bucks to get a pack of “magic avatars.” After you’ve saved the colorful creations, check out the photo- and video-editing capabilities on Lensa. Is this something you want to use often or is it just another app that’ll collect digital dust on your smartphone? “Control the settings, delete after use, and exercise any and all rights that they offer you,” recommends Winters to anyone worried about data collection.
This post was originally published on Wired.