December 13, 2023

Interview: What Does Tech Justice Look Like In The UK And Beyond?


What opportunities are there for shifting power towards the most racially marginalized?

In a world as increasingly digitized as ours, there are urgent questions arising about centralized power, corporate accountability, and the impact on individual freedoms. 

An upcoming research report, “What Does Tech Justice Look Like In The UK?” explores Tech Justice and opportunities to empower the most racially marginalized.

The research, funded by Catalyst and supported by the Engine Room, was carried out by a team of British women of global majority descent. 

Their backgrounds span West and East Africa and South East and East Asia, and they all have lived experiences of racial marginalization.

Siana Bangura, who authored the report, spoke to POCIT about Tech Justice.

Bangura is a writer, performer, campaigner, and community organizer, co-creating networks and ecosystems out in the field.

The dark side of digital

A digital enthusiast and early adopter of many social media platforms, Bangura used these platforms to amplify her voice as a young Black woman.

However, as she delved into campaigning, it became clearer that there was a darker side to the digital world. 

“I really got to see how certain voices were punished punitively for having certain views not aligned with those in power. I started to see censorship and those kinds of things,” said Bangura.

“It came to the point where it can often make you pull away from engaging, but you can’t run from engaging with digital. So it’s very much like how do you use it to your advantage?”

After leaving the campaigning organization Campaign Against Arms Trade (CAAT), she was drawn to work at Catalyst due to their ambitions of using digital to make systemic change.

While a producer at Catalyst, Bangura noticed what felt like a mismatch between the intentions to do this systemic work and how they were actually going about doing it.

“I think a lot of that was to do with just the structure that we were in. So I had to shift out of that particular structure in order to do the research.”

What is Tech Justice?

Tech Justice in the UK is part of a global conversation that is happening in the US, Europe, and the Global South.

“The US, I think, by necessity because of the way that social justice issues manifest, and the resistance that has also manifested in response to that, is a bit more ahead in this conversation,” Bangura told POCIT.

Bangura and her team aim to connect the dots and understand what this conversation, if it is happening at all, looks like in a British context.

While there isn’t a singular definition of Tech Justice, Bangura detailed its core elements.

“When we talk about tech justice, we are talking about the digital landscape, technology, and social justice.” 

“We are talking about technology being designed, birthed by, and used by particularly global majority people, folks who are marginalized and then having a proactive role in design from start to finish, rather than having it be done on to them.”

Tech Justice may also be commonly referred to as Algorithmic Justice, Data Justice, Digital Rights, Data Ethics, AI Ethics, AI for Good, or Tech for Good.

It challenges the norms of whiteness and how technology replicates its biases against marginalized peoples.

The idea is to ensure that the digital world and the digital landscape aren’t just in service to big corporations but also to the people. 

Who Holds The Power?

The research by Bangura and her team concluded that the UK has a long history of power being concentrated in the hands of the few and not equally or equitably distributed amongst and across society.

The digital world reflects this, holding an online mirror to our offline lives.

“It’s all about power, isn’t it?” said Bangura.

“All of these spaces are spaces of power, and more often than not, we see white men in those spaces. Digital, like anything, can be used for good, wickedness, and evil. And it’s all about who does it and who designs it.”

Bangura stated that it’s important to consider what biases and isms are built into it, noting that the design often does not serve racially marginalized groups.

Tech replicates what humans do and think, as exemplified by biases in generative AI. However, Bangura said this doesn’t mean these tools are inherently bad; the problem is who has access to them and who they are working for.

“The digital world and the offline world are not separate. I think sometimes we’re encouraged to see them as like these two separate things,” she said.

“But we know that the digital world has a real impact on the real world and vice versa.”

Why is Tech Justice needed? 

The state’s use of digital tools for monitoring and tracking migrants and police surveillance in public spaces and in schools has also raised concerns about how this data is used and by whom. 

“We are basically seeing states be able to commit acts of violence on other groups of people with seemingly no consequence,” said Bangura.

“We are seeing surveillance and the harm of that. We are seeing that in real-time, particularly what that looks like on protesters, for example.”

This technology use is happening without clear evidence of benefits or comparisons with existing systems, the report states.

Minoritized groups – primarily by race and those in lower socioeconomic circumstances – bear the brunt of the impacts of emerging technologies, a lack of regulations, and limited accountability, Bangura explained.

“The biggest thing is people in power acting with no consequences, so people feel like there is no way to hold them accountable for the abuses,” added Bangura. “And, of course, there are ways to hold people accountable.”

For example, there have been calls for tech companies to address the online abuse facing Black women in digital spaces and for lawmakers to create tech policies that center racial justice. 

There have also been several lawsuits seeking to hold tech companies accountable for their role in violent racist attacks, like the Buffalo massacre and the treatment of their ‘ghost workers’.

In the UK, dozens of politicians, race equality groups, and human rights organizations joined forces to call for an “immediate stop” to live facial recognition surveillance.

While there doesn’t appear to be a Tech Justice “movement” in the UK yet, it is clear there is a growing concern about whether technologies are being used justly. 

A Global Conversation

Ultimately, the question of Tech Justice is a global one, the report found, due to the interconnected nature of our offline and online worlds.

“You can’t quite divorce it. Anything that really applies to the UK has a lot of application elsewhere. Ultimately, we’re thinking about the fact that all of this is connected,” said Bangura.

“Even though it’s very important for there to be a specificity when it comes to the UK, this work is international and global. Just like our struggles are interconnected, so is tech.”

Since Brexit, the UK has taken what can be described as a sector-based approach to regulating technologies such as AI instead of having a dedicated piece of legislation.

How this will impact the deployment of AI in the UK compared to elsewhere, like the EU, is yet to be seen.

“The next steps are making sure we’re connecting the dots always with our siblings in places like the US and across Europe and folks in the Global South who are already thinking about this stuff, who are really living it and making sure that whatever they’re learning is practically applicable to our context,” Bangura concluded.

The “What Does Tech Justice Look Like In The UK?” report will be available at

Sara Keenan

Tech Reporter at POCIT. Following her master's degree in journalism, Sara cultivated a deep passion for writing and driving positive change for Black and Brown individuals across all areas of life. This passion expanded to include the experiences of Black and Brown people in tech thanks to her internship experience as an editorial assistant at a tech startup.