Meta used back-to-school photos of schoolgirls to promote its social media platform Threads to a 37-year-old man, a move that parents called “outrageous” and “upsetting.”
The man noticed that ads encouraging him to “get Threads”—Mark Zuckerberg’s rival to Elon Musk’s X—were appearing in his Instagram feed. These ads featured embedded posts of girls as young as 13 in school uniforms, with their faces visible and, in most cases, their names included.
The images were originally posted by parents on Instagram to celebrate their children’s return to school. The parents were unaware that Meta’s settings allowed the company to use these photos in this way. One mother said her account was set to private, but her posts were automatically cross-posted to Threads, where they became visible. Another parent acknowledged posting the image to a public Instagram account. The photos of their children were highlighted to the man as “suggested threads.”
The recipient told The Guardian that the ads felt “deliberately provocative and ultimately exploitative of the children and families involved.”
The father of a 13-year-old girl whose photo was used said the situation was “absolutely outrageous.” The images showed schoolgirls in short skirts, with either bare legs or stockings. He added, “When I found out an image of her has been exploited in what felt like a sexualized way by a massive company like that to market their product, it left me feeling quite disgusted.”
Meta, the $2 trillion company based in California, stated that the images did not violate its policies. It explained that it recommends Threads by showing publicly shared photos that comply with its community standards and guidelines. The company’s systems are designed not to recommend content shared by teenagers, but these posts came from adult accounts set to public viewing.
The man who received the ads noted that he was only shown promotional posts featuring schoolgirls—there were no boys in uniform—which he felt suggested “an aspect of sexualization.”
The mother of a 15-year-old girl whose photo was used in an ad with a large “Get Threads” button said, “For me, it was a picture of my daughter going to school. I had no idea Instagram had picked it up and was using it as a promotion. It’s absolutely disgusting. She is a minor.” She stated she would never have consented and “not for any money in the world would I let them use a girl dressed in a school uniform to get people onto their platform.”
Her Instagram account, which has 267 followers, usually has a modest reach, but the post of her child attracted nearly 7,000 views—90% from non-followers, half of whom were over 44, and 90% of whom were men.
Another mother, whose post of her 13-year-old was used, said, “Meta did all of this on purpose, not informing us, as they want to generate content. It’s despicable. And who is responsible for creating that Threads ad using children’s photos to promote the platform for older men?”
Meta referred to these posts as “recommendation tools” and stated that public posts can be used for this purpose. A company spokesperson said, “The images shared do not violate our policies and are back-to-school photos posted publicly by parents. We have systems in place to help make sure we don’t recommend Threads shared by teens, or that go against our recommendation guidelines, and users can control whether Meta suggests their public posts on Instagram.”
The 37-year-old Instagram user from London, who received the ads and asked to remain anonymous, said, “Over several days, I was repeatedly served Meta ads for Threads that exclusively featured parents’ images of their daughters in school uniform, some revealing their names. As a father, I find it deeply inappropriate for Meta to repurpose these posts in targeted promotion to adults.”
He added that he had never posted or liked similar images before being shown the schoolgirl pictures.
“To me, featuring this kind of content as trending or popular seems intentionally provocative and ultimately exploits the children and families involved, putting their online safety in danger.”
Beeban Kidron, a crossbench peer and advocate for children’s online rights, commented: “Using school-age girls as bait to promote a commercial service marks a new low, even for Meta.
“At every turn, Meta prioritizes profit over safety and company growth over children’s privacy. That’s the only explanation for why they thought it was acceptable to send images of schoolgirls to a 37-year-old man—as bait. Meta is a deliberately negligent company.”
She urged the regulator Ofcom to assess whether the measures introduced this summer, aimed at preventing unknown adults from contacting children, clearly state that “companies cannot use sexualized images of children as bait for unknown men.”
Ofcom’s illegal harms codes, designed to combat online grooming, specify that “children’s profiles, locations, friends, and connections should not be visible to other users.”
Under Meta’s system, if a Threads profile is public, posts from adult profiles may be recommended on Facebook or Instagram “to help people discover, follow, and interact with you.” Users can disable these suggestions or switch their Threads profile to private.
Frequently Asked Questions
General Beginner Questions
1. What exactly happened with Meta and the schoolgirls’ photos?
Meta’s recommendation systems used publicly posted back-to-school photos of schoolgirls, without their parents’ knowledge, in “Get Threads” promotions shown to an adult man on Instagram.
2. Why are parents so angry about this?
Parents are furious because their children’s images were used without permission to promote a commercial platform, violating their privacy and potentially putting the children at risk.
3. Was this done on purpose by Meta employees?
No. According to reports, this was not a deliberate human action; it was the result of Meta’s automated recommendation systems surfacing publicly shared photos as “suggested threads.”
4. Which platforms were involved?
Instagram and Threads, both owned by Meta. The promotions appeared in an Instagram feed and featured posts that had been shared publicly on Instagram or cross-posted to Threads.
Intermediate Impact Questions
5. How did Meta’s systems get these photos?
The photos were posted to Instagram accounts set to public, or were automatically cross-posted from Instagram to Threads. If an account is public, its content can be surfaced by Meta’s recommendation systems.
6. Is this illegal?
It depends on local laws. Using someone’s likeness, especially a minor’s, for promotion without consent can violate privacy and advertising regulations, and campaigners have urged the regulator Ofcom to examine whether its codes cover the practice.
7. What has Meta said in response?
Meta stated that the images did not violate its policies, noting they were back-to-school photos posted publicly by parents, and said users can control whether Meta suggests their public posts.
8. What are the potential dangers of this happening?
It can lead to the exploitation and harassment of minors, damage to their online reputation, and significant emotional distress for both the children and their families.
Advanced Technical Questions
9. How does Meta’s system decide to use a photo in a promotion?
Meta says it recommends Threads by showing publicly shared photos that comply with its community standards and recommendation guidelines. Its systems are designed not to recommend content shared by teenagers, but the posts in question came from adult accounts set to public.
10. Does this mean any public photo I post can be used in a promotion?
Technically, yes. Under Meta’s terms of service for platforms like Instagram, you grant the company a broad license to use your content, and public posts can be surfaced as recommendations unless you opt out in your settings.