BIMA x ORN: Online Safety & Digital Trust in the Age of AI

11 Feb 2026

When the internet first began, it was full of optimism.

As Andrea reflected in her opening remarks, many believed digital would make the world better by default. Instead, the sector has “sleepwalked into regulation” — responding to harm rather than designing for responsibility from the start.

At our recent BIMA x Online Responsibility Network event, industry leaders, academics and trust & safety specialists came together to explore a pressing question:

What does a trustworthy digital future actually look like in the age of AI?


From Compliance to Collaboration

One of the clearest themes of the afternoon was this:

Regulation alone is not the answer.

Following the rules is necessary. But it is not the same as building trust.

Andrea emphasised that Online Responsibility Network’s ambition is to become a recognisable, customer-facing brand — one that signals to consumers that they are choosing services that are actively working to make digital better.

Trust cannot live in policy documents alone. It must be visible.


The Industry’s Responsibility

The panel challenged a key assumption:

Should regulators define digital trust — or should industry help shape it?

Paul Spreadbury highlighted the need for brands and agencies to work together to create content that is genuinely trustworthy. Where brands invest their media spend matters. As one panellist noted:

“You are essentially partnering with the platforms where you put your money.”

Emily Conyard pointed to student research showing digital fatigue among younger generations:

Community, not just scale, is becoming the focus.

Meanwhile, Natalia Greene reminded the room that Trust & Safety professionals are deeply passionate — but often not heard. Showing what good practice looks like, and amplifying it, is critical.


The AI Reality

AI was not framed as the villain of the story.

As Natalia noted:

“AI is a mirror on society. The modelling is done by people.”

But the risks are real.

Brands were urged to take greater ownership of how AI is used in their ecosystems — and to communicate clearly and accessibly about how algorithms work. Examples like Monzo and Octopus Energy were referenced as brands leading with transparency.


Education, Transparency and Shared Responsibility

Looking ahead five years, the conversation moved beyond technology to culture.

What does trustworthy digital mean?

Students expressed a desire not to disconnect from digital entirely — but to feel confident navigating it.

To understand how AI works.
To spot misinformation.
To know what is real.

That confidence gap may be one of the most urgent challenges ahead.


One Change Today?

When asked what single change they would implement immediately, the panel's answers pointed in one direction: transparency that empowers.


Preserving the Best Bits of Digital

Despite the serious themes, the tone of the event was not cynical.

Digital still offers enormous opportunity. Connection. Access. Community.

But preserving those benefits requires intent.

As Andrea closed the session, the call was clear:

Digital trust must be a shared responsibility — designed in, not retrofitted.


What Happens Next?

If you believe industry has a role to play in shaping a safer, more transparent digital future, this is the moment to step forward.

Learn more about the Online Responsibility Network and register your interest in getting involved.

Because building a trustworthy digital ecosystem won’t happen by compliance alone. It will happen when industry chooses to lead.
