Social media consultant Matt Navarra posted screenshots of Instagram asking for a video selfie, showing multiple angles of the user’s face, as a way to verify the person’s identity.
Bots have long plagued Instagram, leaving spam messages and harassing people — or fabricating likes and followers — and Meta (formerly Facebook, Instagram’s parent company) may be utilizing this verification feature to tackle the bot menace.
XDA Developers reports that the company initially tested the feature last year, but reportedly encountered technical problems.
Multiple users have reported being asked to take a video selfie to verify their existing accounts.
In his tweet, Matt Navarra criticized Instagram for breaking its promise to users that it would not collect biometric information.
Bettina Makalintal, another Twitter user, posted a screenshot of the instructions for taking the video selfie, which remind you to show “all angles of your face” to verify your identity, confirming that multiple users are seeing the verification screen.
“why the fuck is Instagram making me take a video selfie in order to access my account,” Makalintal wrote.
There’s no official information available about whether this feature is a test or is being rolled out gradually.
In a tweet, Instagram said that accounts with suspicious behavior (like following a lot of accounts in a short time) might be forced to do a video selfie.
Instagram did not comment on whether everyone would eventually have to do a video selfie.
Instagram also added that the feature does not utilize facial recognition and that the team reviews the videos submitted.
“One of the ways we use video selfies is when we think an account could be a bot. For example, if the account likes lots of posts or follows a ton of accounts in a matter of seconds, video selfies help us determine if there’s a real person behind the account or not,” Instagram stated.
Since Meta recently announced it was shutting down a Facebook face recognition feature, the move may come as a surprise to some.
In reality, however, Meta has not abandoned facial recognition as a whole; it only shut down one particular feature in Facebook’s AI recognition system.
For users who have concerns, Instagram says the video will be deleted after 30 days and that no face recognition is applied to it.
Some users who are already distrustful of Facebook may not be reassured by Meta’s promise not to store or post the data.
Instagram users may recall that in September, attackers could access private information such as birthdays with just a message.
Instagram made no guarantee to delete that birthday information, as it has with the video selfies.
People might feel uncomfortable providing that information, and it’s hard to fault them.