
Conscious use of AI: Practical tips and thoughts from environmental researchers
Artificial Intelligence (AI) is becoming more common in everyday life, and there are growing concerns about its impacts on our changing climate.
We asked scientists at the National Centre for Atmospheric Science their thoughts on how to use AI in conscious ways for scientific research. Scroll to the end of the article for quick tips and recommended resources for conscious AI use.
It is an exciting time to do atmospheric science with AI, yet it is happening at an overwhelming pace, alongside major ethical questions. We don’t have all the answers, but we wanted to share our current understanding and some good practice tips to help others.
– Dr Ioana Colfescu, machine learning climate scientist at the National Centre for Atmospheric Science and University of St Andrews
Why think consciously about AI use?
AI tools like ChatGPT, Copilot, and others can significantly boost productivity – helping with everything from writing code and summarising reports to drafting presentations. But behind their convenience lies a substantial environmental footprint.
Early studies estimated that training some large language models emitted hundreds of tonnes of carbon dioxide equivalent, comparable to several cars’ lifetime emissions, though actual impacts now vary widely with hardware, data centre efficiency, and the renewable mix of the local grid. Even everyday use, known as “inference”, adds up quickly when deployed at scale. Alongside energy, AI systems use water to cool data centres – sometimes hundreds of thousands of litres for very large training runs, and up to millions of litres per day at campus scale. These centres are often located in regions already facing water stress.
– Dr Kieran Hunt, research fellow in tropical meteorology and AI at NCAS and the University of Reading
These hidden costs are rarely visible to users but are critical to consider when thinking about AI’s overall impact.
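The rough arithmetic behind such estimates is straightforward: hardware power draw, runtime, data-centre overhead (PUE), and grid carbon intensity multiply together. The sketch below illustrates this; every input value is an illustrative assumption, not a measurement of any real training run.

```python
# Rough operational-emissions estimate for an AI training run.
# All inputs are illustrative assumptions, not measured values.

def training_emissions_kg(gpu_count, gpu_power_kw, hours, pue, grid_kgco2_per_kwh):
    """Estimate operational CO2e (kg) from hardware power, runtime,
    data-centre overhead (PUE), and grid carbon intensity."""
    energy_kwh = gpu_count * gpu_power_kw * hours * pue
    return energy_kwh * grid_kgco2_per_kwh

# Hypothetical run: 512 GPUs at 0.4 kW each for 30 days,
# PUE of 1.2, grid intensity of 0.4 kgCO2e/kWh.
kg = training_emissions_kg(512, 0.4, 30 * 24, 1.2, 0.4)
print(f"{kg / 1000:.0f} tonnes CO2e")  # prints "71 tonnes CO2e"
```

This only covers operational energy; embodied emissions from manufacturing hardware, and water use, sit on top of it. Tools like the Machine Learning Emissions Calculator (listed in the resources below) apply the same idea with real hardware and regional grid data.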
That said, AI can still deliver environmental benefits if used intentionally. When it replaces more carbon-intensive processes or reduces the time spent on repetitive tasks, the net impact can be positive. As Ag Stephens at the Centre for Environmental Data Analysis and National Centre for Atmospheric Science puts it: “Using AI can be like using a super-efficient assistant. If it means spending less time and energy on routine work, the environmental impact might actually be lower overall.”
However, conscious use of AI also means grappling with complex ethical and social implications. The true environmental cost may never be fully measurable in simple terms, but thoughtful use hinges on trade-offs, transparency, and purpose.
“Because AI models learn patterns from their training data, any biases in those data, such as under-representation, historical prejudice, or culture-specific norms, can end up embedded in model output. Even if developers are careful, bias can arise at other points in the full machine learning pipeline, e.g. data collection or labelling,” explains Dr Kieran Hunt.
For example, a study of commercial vision models found error rates of 34.7% for darker-skinned women compared with only 0.8% for lighter-skinned men. The effect in large language models, which are trained on huge but somewhat opaque web corpora, is that they reproduce dominant viewpoints and marginalise low-resource languages and communities. This matters for our research, because biased systems can miss relevant sources while over-privileging others.
Beyond climate concerns, there are questions about fairness, accountability, and the future of work. For example, what happens when AI is used to write job applications, make hiring decisions, or assess performance? As automation evolves, how do we ensure we’re supporting the people whose roles may be affected?
“The working landscape will look different in 5–10 years. Research organisations like NCAS should attempt to plan for this in a way that looks after and supports its people,” adds Ag Stephens.
Systemic considerations: sustainability starts with policy
While individual actions matter, systemic change is essential to reduce the environmental impact of AI at scale. One key area is procurement. Universities and research labs can push for accountability and transparency in how they purchase cloud computing and AI services.
Sustainability criteria – including energy use, carbon intensity, and hardware lifecycle – should be part of procurement frameworks. By favouring providers who use renewable energy, publish location-based emissions, or offer transparent reporting, institutions can help drive industry-wide change.
It’s also worth reframing how we adopt AI: is it genuinely the best solution to the problem at hand, or is it adding unnecessary complexity and emissions without a clear benefit? These are important questions to ask before defaulting to an AI-based approach.
Using AI responsibly
Only use AI when it clearly improves the quality or efficiency of your work. Ask whether the task truly benefits from automation, or whether using AI is simply the path of least resistance. Conscious use means choosing AI purposefully, not by default.
Where possible, choose smaller or open-source tools that are designed to be more energy-efficient. Many lightweight models can handle routine tasks like summarising or categorising with far less environmental impact than large-scale language models.
Always log what you use and how you use it. Just as with any other scientific tool or method, keeping a clear record helps ensure transparency, reproducibility, and reflection on your choices.
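One minimal way to keep such a record is a simple append-only log. The sketch below uses a CSV file; the filename and field names are just one possible layout, not a prescribed standard.

```python
import csv
from datetime import datetime, timezone
from pathlib import Path

LOG_FILE = Path("ai_usage_log.csv")  # hypothetical filename

def log_ai_use(tool, task, prompt_summary, output_checked):
    """Append one row describing an AI interaction, supporting
    transparency, reproducibility, and later reflection."""
    is_new = not LOG_FILE.exists()
    with LOG_FILE.open("a", newline="") as f:
        writer = csv.writer(f)
        if is_new:
            writer.writerow(["timestamp", "tool", "task",
                            "prompt_summary", "output_checked"])
        writer.writerow([datetime.now(timezone.utc).isoformat(),
                         tool, task, prompt_summary, output_checked])

# Example entry after using a chatbot to draft analysis code
log_ai_use("ChatGPT", "draft plotting code",
           "asked for a contour plot of surface temperature", True)
```

Even a log this simple makes it possible to report AI use in a methods section, or to review after a project how often AI actually saved time.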
Cross-check AI-generated results, especially when working with facts, data, or scientific content. Even advanced models can produce confident-sounding but incorrect information, so it’s essential to verify outputs using trusted sources.
Rather than asking about a topic, I find it more reliable to ask AI for sources. For most scientific searches, I still prefer Web of Science because it feels more reproducible.
– Professor Martin Juckes, senior climate data research scientist at the National Centre for Atmospheric Science and University of Oxford
Don’t forget to consider hidden costs, such as the energy and water used in data centres, and broader ethical implications like labour, bias, and job automation. These may not show up in your workflow, but they are part of the real-world impact of AI.
Doing science is expensive (in £ and Carbon), but we invest those resources because we believe the return on investment is a net good. Using AI can be the same. Deciding what is acceptable use is non-trivial.
– Professor Bryan Lawrence, weather and climate computing researcher at the National Centre for Atmospheric Science and University of Reading
Finally, a simple but powerful question to ask yourself is: “Would I do this differently without AI?” If the answer reveals shortcuts that compromise quality or accuracy, it may be worth rethinking your approach.
Green AI and time-sensitive computing
A growing movement known as Green AI advocates for transparency and efficiency in the energy use of artificial intelligence. The aim is to shift away from “performance at any cost” toward research and tools that prioritise sustainability and fairness.
One practical idea is to include energy and environmental impacts in AI ethics checklists or project approval processes, alongside privacy, bias, and data security. This ensures sustainability becomes part of the conversation from the start.
A practical and easy way to reduce the environmental impact of your personal use of AI is to use effective prompts. Poorly constructed prompts that then require you to resubmit additional detail are particularly wasteful. Prompt engineering is the skill of crafting clear, effective inputs to get the desired outputs from AI – we have to learn how to ask the right question in the right way. Developed as part of the open access CoDesignS AI Framework (2024), ROCKS is a practical, easy-to-remember method for writing effective prompts:
- Role: Identify your role
- Objective: State your objective
- Community: Specify your audience
- Key: Describe the tone or style, and any related parameters
- Shape: Note the desired format of the output
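As a sketch, the five ROCKS elements can be assembled into a single, well-structured prompt. The wording of each section below is an illustrative choice, not part of the framework itself.

```python
def rocks_prompt(role, objective, community, key, shape):
    """Assemble a prompt from the five ROCKS elements:
    Role, Objective, Community, Key, Shape."""
    return (
        f"Role: I am {role}.\n"
        f"Objective: {objective}\n"
        f"Community: The audience is {community}.\n"
        f"Key: Use a {key} tone.\n"
        f"Shape: Format the output as {shape}."
    )

prompt = rocks_prompt(
    role="an atmospheric scientist",
    objective="Summarise the attached model evaluation report.",
    community="non-specialist policy advisers",
    key="clear, plain-English",
    shape="five bullet points with a one-sentence takeaway",
)
print(prompt)
```

A prompt like this usually gets a usable answer in one round trip, rather than the back-and-forth of refinements that wastes both time and compute.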
Another emerging idea is time-sensitive use. If you’re running compute-intensive tasks, such as training a model or processing large datasets, doing so at times when renewable energy is more available (for example, in the middle of the day) can reduce carbon emissions. Some data centres and cloud platforms even offer real-time carbon intensity data to help time your jobs more efficiently.
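At its simplest, time-sensitive scheduling means choosing the lowest-carbon window from an intensity forecast. The forecast values below are made up for illustration; in practice the numbers would come from a service such as a grid carbon intensity API, or a scheduler like CATS (listed in the resources below) would handle this for you.

```python
def best_start_slot(forecast, duration_slots):
    """Return the index of the start slot whose window of
    `duration_slots` consecutive slots has the lowest mean
    carbon intensity (gCO2/kWh)."""
    best_i, best_mean = 0, float("inf")
    for i in range(len(forecast) - duration_slots + 1):
        mean = sum(forecast[i:i + duration_slots]) / duration_slots
        if mean < best_mean:
            best_i, best_mean = i, mean
    return best_i

# Hypothetical half-hourly forecast: intensity dips mid-day
forecast = [310, 290, 250, 180, 120, 110, 130, 220, 300]
start = best_start_slot(forecast, 3)  # job needs 3 slots
print(start)  # prints 4 (slots 4-6 average 120 gCO2/kWh)
```

For a job that runs for hours, shifting its start into a low-carbon window like this can meaningfully reduce emissions without changing the job itself.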
How to choose the right AI tool?
When picking a tool, it’s helpful to reflect on a few key questions.
Consider whether the tool is generative – is it designed to create content like text, images, or code? Or is it analytical, focused on finding patterns, identifying trends, or summarising data? This distinction matters because the purpose and complexity of the task influence both how useful the tool will be and how much energy it might consume.
Think about data privacy. How is your information handled? Some tools store queries to improve their models, while others allow more control over what is saved. For research or sensitive work, it is important to know what happens to your data.
You should also consider the environmental footprint of the tool. Larger models require more computing power and energy, especially when used repeatedly. Are you using a heavyweight tool when a lighter one would do? Lightweight models (e.g. DistilBERT), or retrieval-augmented tools (e.g. Elicit) are designed to reduce compute time and emissions.
Reflect on the tool’s transparency. Can it explain how it arrived at an answer? Tools with more explainability can be easier to trust, especially in scientific work where evidence and reproducibility matter.
An ongoing and complex conversation about AI
Many NCAS staff are already exploring and using AI to improve their work. “It is absolutely incredible how AI is changing the research landscape. For me, it felt like having a colleague next to me, chatting about science and technical problems,” shares Ioana.
I’ve used Copilot to help me with so many tasks – bulk adding text to slides, producing meeting summaries from transcripts, refining text that I’m struggling to condense, expanding bullet points into full sentences, summarising long documents into key points (with references back to the source), brainstorming ideas, planning meeting structures, … the list goes on! For me, it’s about aiding me to do the same (usually tedious and repetitive) tasks but with less procrastination – so I can spend my time adding value and exploring tasks I actually enjoy.
– Poppy Townsend, senior data scientist at the Centre for Environmental Data Analysis
Dr Dan Hodson, a research scientist at the National Centre for Atmospheric Science and University of Reading, explains how he has been using AI to write code more quickly, and the open questions this raises about carbon emissions: “In some cases, using ChatGPT is ~10x faster than writing the initial draft code myself (with many internet searches). But it does depend on how precisely I word the prompt. If I produce more analysis faster, is that a good thing? Does it produce less emissions than when I spend longer researching it myself with internet searches?”
I’ve been exploring ways to ‘turn off’ AI tools when they are not useful for me. For example, you can turn off Google’s AI summaries by adding ‘-ai’ to the end of your search term.
– Dr Charlotte Pascoe, senior data scientist at the Centre for Environmental Data Analysis
This article is part of an ongoing complex conversation. If you’d like to share how you’re using AI in your work, or how you’re thinking about its impact, we’d love to hear from you via email at comms@ncas.ac.uk. For transparency, this article was proofread by AI and suggestions were made to improve the flow and structure of the information we are sharing.
Quick tips for conscious AI use
- Use AI when it clearly improves quality or efficiency
- Choose smaller or open-source tools when possible
- Write effective prompts using methods like ‘ROCKS’
- Log what you use and how
- Cross-check results, especially factual ones
- Consider hidden costs: energy, water, ethics
- Ask: “Would I do this differently without AI?”
- Encourage others to use AI consciously – if you spot a use case that doesn’t feel conscious or ethical, talk to them about it!
Recommended resources for conscious AI use
- UK Government AI Playbook – accessible technical guidance on the safe and effective use of AI for public sector organisations
- IBM: What is Human-Centred AI? – explores how the future of data science work will be a collaboration between humans and AI systems, in which both automation and human expertise are indispensable
- Machine Learning Emissions Calculator – a tool that helps you to estimate the carbon footprint of your machine learning activities
- Climate Aware Task Scheduler (CATS) – a tool which lets researchers schedule computing when low-carbon electricity is available