


01/10/2025

Identity, Privacy, and Power — the Unseen Costs of Age‑Verification
When protecting children is the law, very few people are willing to stand in the way. That's the driving rationale behind the UK's new Online Safety Act, which from mid‑2025 requires what Ofcom calls "robust" age‑verification systems for websites that host adult content — be they pornographic sites or social media platforms that allow user‑generated sexual or mature material. Checking a box that says "I am over 18" no longer suffices; users must prove their age via credit card checks, photo ID, facial recognition, or selfie verification. For adult creators, this new regime changes the game in profound ways. (BBC)
It is, on its face, a worthy goal. Research shows that many minors are exposed to explicit content online well before they turn 18. Some data suggest exposure begins around age 13, and even earlier in many cases. (BBC) If one believes that early exposure to explicit material can be harmful, imposing age checks seems a reasonable safeguard.
But digging deeper, several tensions emerge—between protection and privacy, regulation and free expression, safety and surveillance. For adult creators, these tensions are not theoretical: they implicate livelihood, creativity, autonomy.
What’s Actually Changing
Here are some of the concrete provisions:
- Providers of pornography (including platforms hosting user content) must implement measures such as photo‑ID matching, credit card or bank checks, mobile operator authentication, or digital identity services. (BBC)
- Self‑declaration (simply clicking a button) is no longer considered acceptable or sufficient. (BBC)
- Adult content must be blocked or blurred until age is verified. No more homepage previews or "just one click" access without verification. (SEXTECHGUIDE)
- Platforms such as Reddit now use third‑party verification services (e.g. Persona) to process selfies or IDs, promising limited data retention. (BBC Feeds)
What This Means for Adult Creators
1. Barriers to Entry and Increased Costs
Creators of adult content may face higher friction, both in being allowed to distribute content and in their audience being able to access it. Platforms will need to invest in verification technology, vet identities, and ensure compliance with data protection laws. These costs are likely to be passed on — whether as a reduced platform share, higher fees for creators, or fewer small creators being able to participate at all.
2. Privacy Risks and Data Exposure
Even though many platforms promise that IDs or selfies will be handled securely and retained only briefly, the risks remain: leaked data, misuse, or scope creep. Creators often value anonymity or minimal exposure, especially in more conservative social or legal contexts. These rules force many to choose: comply and expose sensitive data, or accept reduced visibility or outright exclusion.
3. Chilling Effect on Expression
Defining what is “adult content” or “mature content” isn’t always clear. Some creators might tread close to the line — erotic art, nudity, sexuality — which may get caught up in broader “mature content” filters. The new regime could discourage creators from pushing boundaries or exploring themes that previously fell in legal grey zones out of fear of being shut out or having to verify every piece of content.
4. Disadvantage to Smaller Platforms and Creators
Big platforms have more resources to comply; small or independent creators (or smaller platforms) may struggle. If compliance is expensive or technically complex, it may consolidate power in larger platforms — the ones that can afford secure ID services, legal teams, data privacy infrastructure. Creators working outside those platforms may be marginalized.
5. Impact on Audience Reach
Even if creators comply, their audiences may be deterred by the requirement to provide ID or use verification services. Some consumers will refuse on privacy grounds; others may be blocked by technical or socioeconomic barriers (lack of credit card, no acceptable ID, etc.). That in turn reduces potential revenue, engagement, or visibility for creators.
Broader Concerns and Ethical Trade‑Offs
- Privacy vs Protection: The state has a legitimate interest in protecting minors, but at what cost to civil liberties? Biometric data and ID checks aren't neutral: they carry risks of surveillance, data breaches, and misuse.
- Effectiveness vs Evasion: No system is foolproof. Young people may use fake IDs, VPNs, or other workarounds. Regulation can reduce accidental exposure, but determined access may still happen. Is the good done worth the burden imposed on all users and creators?
- Slippery Slope toward Digital Identity: Once proving identity becomes normalized for accessing certain content, it may expand to broader surveillance or identity demands in other contexts. Some fear such laws normalize a "you must show papers" culture online.
- Equity and Access: Who has valid ID? Who feels comfortable handing personal documents to third parties (especially foreign or private companies)? There is a class and global divide here: migrants, people in poverty, and members of marginalized groups may lack stable IDs, credit services, or trust in verification platforms.
Where the Balance Should Lie
If these rules are here to stay, how might they be shaped to better protect both children and creators?
- Strong oversight of third‑party verification services: transparency about data use, limits on what is stored, and regular audits.
- Clear, narrow definitions of adult or mature content, with room for creators producing art, education, or commentary to work without blanket restriction.
- Accessible alternatives: a creator or user with limited ID should have options, such as non‑document‑based verification or other low‑barrier paths.
- Support or subsidies so that smaller platforms and creators can comply without being swamped by cost.
- Regular review of whether the rules are working as intended: whether harms are reduced, whether privacy incidents are rising, and whether creators are being unfairly penalized.
Final Thoughts
In many ways, the UK's Online Safety Act closes a gap that had been left open for far too long. That children can easily stumble into adult content is clearly a problem, and one society has a duty to address. But how we protect matters as much as what we protect.
These new age verification rules are not benign. They reshape the relationship between creator, platform, and audience; they impose identity demands in spaces where anonymity or pseudonymity might have been central; they risk privileging large platforms over small ones; they risk normalizing state/third‑party access to personal biometric or identity data.
For adult creators, the message is clear: if you want to work in this space, the terrain has shifted. It is no longer enough to produce content; you must navigate compliance, identity, privacy, and trust.
The greater hope is that the regulation finds the right balance — one in which the safety of minors is upheld without unnecessarily infringing on the privacy, creativity, and agency of adults. If that balance is missed, we may protect some from harm only to inflict it on others in the name of safety.
Editor & Photographer
Eugene Struthers




