“I’ll just use ChatGPT”: What now for government guidance?
By Angela Moore
Angela is a content designer and long-term Scrollie. Over the last decade, she’s designed content for GOV.UK about everything from Universal Credit to farm subsidies, legal aid and school data.
AI is shifting user behaviour
If you’re a content designer, you might hear a lot at the moment about AI: the AI skills we need, AI impacts on government teams, AI impacts on jobs and so on and so endlessly forth.
But there is something really interesting going on that we don’t hear enough about - the shift in user behaviour.
People are rapidly adopting AI tools to find out about services and understand how to interact with government. This was previously very much GOV.UK territory.
This shift is a real change and a real challenge. How do we make sure that users can get accurate, trustworthy, accessible government information, when this is first mediated by AI?
What we’re seeing in user research
Over the past year we’ve run 4 rounds of user research to inform some guidance on GOV.UK.
We start each session by asking, “How would you find out about this?” In the first 3 rounds, most users said they would start on Google, or by asking a friend, or phoning a charity - expected behaviour.
But in the latest round, half the participants said some variation of:
“I’d just ask ChatGPT.”
“I’d use Gemini.”
“I’d probably ask an AI tool.”
I guess we all knew this was coming, but it seems to have happened quite suddenly and definitively. We’re not in Kansas any more.
What AI tools surfaced and what they missed
The results of user conversations with AI tools were interesting and sometimes alarming.
In several cases, AI tools surfaced information that users would previously have struggled to find.
In one example, the AI answer gave the helpline phone number for a service. That number technically exists in the guidance, but it’s buried and easy to miss. This is fairly common practice: departments might think users should try digital routes first, or be concerned that helpline numbers would be swamped if openly available.
The AI just neatly solved this. This is jam for users. But it will definitely increase demand on that phone number. This is a good (if trivial) example of what Tom Loosemore writes about in his post on removing friction in government services with AI.
But the AI also surfaced some things that did not best meet user need.
For example, one result linked to a separate service. That service is analogous to an appeal or complaint - it is not suitable at the start of a user journey. The AI presented both options as if they were equivalent starting points. The context explaining when and why to use each had disappeared.
Wild overconfidence, even when wrong
Alongside these structural issues, we also saw more familiar AI limitations. In some cases, responses included:
inaccuracies
over-confident statements about things that are normally conditional
For example, because they tend to be complicated and conditional, government eligibility rules are often expressed carefully: “If you earn more than £X, you usually will not be able to claim Y.”
The AI might simplify this to: “If you earn more than £X, you cannot claim Y.”
Government rules depend on individual circumstances, and guidance must reflect that nuance.
Why users find this so appealing
Despite these issues, participants themselves felt pretty confident about the answers they received.
And no wonder. We know that users don’t do things with government for fun. When they come to us, they are stressed, confused, unsure what they are looking for, about to run out of data, dealing with a crying baby.
And it’s so damn effortful. Navigating guidance online is a rabbit hole: click click click, which site is trustworthy, which guidance is right, what does this even mean, where was I?
So of course you’re going to love that AI tool tone. It feels like a shortcut through complexity, with answers that sound so certain, so concise, so authoritative.
What this means for government guidance
We’re already behind the curve here. Users are adopting these tools quickly and the way they access government information is changing.
This does probably mean that it’s more important than ever for departments to sort out their content estates. Sort out all that poorly structured, duplicated, outdated information sprawl. Actually apply the cross-government principles and standards for content design. (Content designers everywhere are inwardly shouting, “WE TOLD YOU SO!”)
If nothing else, doing this work means that we will be providing a proper, trustworthy, single source of truth on GOV.UK. Content is infrastructure. And if AI tools increasingly become part of the way people access public services, the quality of that infrastructure will matter more than ever.
We don’t know how this will evolve. But it does feel like we need to have a think, sharpish, across government, about how we continue to provide good government information in this new world.
Feeling the need to get on top of your GOV.UK content estate? Scroll can help. Just get in touch.

