ARTICLE | 3 min read
Ethical AI
Turning intent into impact

AI promises efficiency and innovation, but without ethical and sustainable practices, it risks amplifying bias and environmental harm. Organisations must move beyond principles to practical governance, ensuring fairness, transparency and resource stewardship.

Published: 19 January 2026

Author: Max Finney

Our recent AI governance survey of 200 GCs and senior executives, produced in collaboration with FT Longitude, revealed a critical issue: while AI adoption accelerates, strategy and safeguards are lagging.

That gap matters. In this article we take a closer look at the ethical implications of AI, as many businesses grapple with how to adopt AI responsibly and without risking revenue, reputation or compliance.

The state of play: Where are we now?

AI is no longer a future concept – it’s here, shaping decisions and operations. Yet governance hasn’t kept pace. Our AI governance research shows only a third of organisations rate their AI accountability as advanced, and fewer than one in five have mature ethical foundations. Environmental impact is climbing the risk agenda, with energy and water use from large-scale models now firmly on the radar. Despite this, compliance and privacy still dominate governance priorities, leaving ethics and sustainability trailing.

The one group that does recognise AI’s environmental cost is the C-suite: these respondents were twice as likely as GCs and those below them to identify it as a key risk. Yet this awareness among organisational leaders is clearly not being translated into practice. This gap between ambition and action matters: without robust frameworks, AI risks amplifying bias, eroding trust and harming people and the planet.

What’s holding organisations back?

The barriers are familiar: limited expertise, resource constraints, and the relentless pace of tech change. Many organisations articulate ethical principles but fail to embed them in day-to-day operations. Confidence in compliance is high, yet without independent audits and transparent reporting, it’s often misplaced. This mirrors challenges in supply chain transparency and carbon footprint reporting – complexity and opacity make it hard to verify reality. Moving beyond policy statements to measurable, accountable practice is essential.

Why it matters

Responsible AI isn’t just about compliance – it’s about shaping outcomes that matter. Algorithmic bias has the potential to entrench existing inequalities, while disparities in access to AI tools risk deepening digital divides. Within organisations, uneven distribution of AI expertise is common, and without inclusive training initiatives, these gaps can become further embedded. Those already disadvantaged may be left even further behind, undermining social mobility and cohesion. To close these divides, organisations must invest in accessible, equitable training programmes that bring everyone along on the AI journey.

Design choices, such as voice, tone and UX, can reinforce stereotypes if unchecked. On the environmental front, data centres powering AI could account for 8% of global emissions by 2040. Responsible AI practices are essential not only to keep organisations future-ready for a rapidly evolving legal landscape, but also to demonstrate, beyond compliance, that AI is being used to drive progress, not harm.

Turning principles into practice

Here are some ways that organisations can deliver AI responsibly:

More information

To find out more about the state of AI governance, read our full report or contact one of our team.

Alternatively, join us at our Clean Currents event to explore the evolving intersection of energy development and data centre infrastructure.