A few weeks ago, I came across a metaphor from one of my favorite authors, Brené Brown, that has stayed with me. She describes trust-building in human relationships as filling a marble jar: every trustworthy action we take for each other—keeping a promise, remembering personal details, showing up when it matters—is like adding a marble to the jar.
Over time, these small moments stack into a pattern of reliability, confidence, and ultimately trust. And just as trustworthy moments fill the jar, breaches of trust remove marbles. What I like about this metaphor is the reminder that trust isn’t built through declarations. It’s built through small, consistent, and transparent actions.
Lately, I’ve been thinking about how this applies to digital transformation journeys and especially to the building of digital public infrastructure (DPI). Around the world, governments and multilateral organizations are making massive investments in digital ID systems, payment infrastructures, and data exchange layers. In parallel, Artificial Intelligence (AI) is increasingly being embedded within these infrastructures to improve targeting, automate processes, and expand access.
But even as these technologies advance, a persistent gap limits their impact: the trust gap. As we detailed in our piece on governing Africa’s DPI, technology alone is insufficient; without a foundation of legitimacy, adoption stalls.
Because the truth is that trust in digital public systems is not something you announce. It’s something you demonstrate, iterate on, and earn through many small actions over time.
A framework for legitimacy
There are three major steps that need to stack together to build trust in digital systems: awareness, access, and adoption (see Figure 1 below).
- Awareness is the outer circle: whether people know that a digital service exists
- Access is the middle layer: whether people have the connectivity, devices, and digital literacy to use it
- Adoption is the inner circle: the point where people actually choose to use a service because they find it useful and trust it
Figure 1: The three levels of the trust issue
Globally, we’ve made significant progress in awareness and access. Governments invest heavily in digital literacy campaigns and broadband infrastructure expansion. While many are still excluded, more people than ever have smartphones and Internet access. But the final step of adopting and using the tools remains elusive. This is because trust cannot be built through connectivity alone. Adoption requires:
- Addressing concerns such as privacy, exclusion, misuse, or politicization.
- Testing technologies openly and iteratively rather than unveiling them at scale without validation.
- Involving communities as co-creators, not just passive recipients of digital services.
A national payments system or data exchange layer may exist, but if people worry it will expose them to scams, surveillance, exclusion, or political manipulation, they will not use it. And in a world saturated with misinformation, those fears spread faster than any government awareness campaign.
This becomes even more urgent as we introduce AI-powered DPI. If DPI depends on trust, layering in AI (for example, through automated decisions or algorithmic targeting) makes trust both more essential and more fragile. People want to understand: Who is making decisions about me? How does this system work? What happens when it gets something wrong? Where and how can I challenge or correct it?
These questions are not just technical, but emotional, relational, and deeply tied to people’s lived experiences. We need to be bold enough to face and answer them, not only to ensure that investments are effective, but also to guarantee that these digital systems create real, concrete value for people everywhere.
Building the TrustStack
Borrowing the concept of a “stack” often used in tech policy framing, what if we thought of a stack for trust? A TrustStack is made up of the consistent, repeatable actions that governments, technologists, and institutions must take to ensure digital transformation journeys, including DPI and AI-powered DPI, earn public confidence.
If trust is a marble jar, then people, iteration, and transparency are the marbles that fill it.
1. Co-creating with people
Trust starts with people. Not as users at the end of the process, but as co-creators from the beginning. It means including all the relevant voices, especially those we often overlook.
Young people play a unique role here. They are early adopters. They are experimenters. And they often become “trust spreaders” within their families and communities. A grandmother trying a digital payments service for the first time usually does so because a young person guided her, not because she saw a formal campaign.
Engaging youth and other key groups through workshops, labs, and co-design processes is a strategic investment. When people shape a system, they are more willing to trust and use it.
2. Iterative testing
Trust grows through systems that evolve, not systems that arrive fully formed. Instead of launching a full system all at once, starting small, testing, and adapting can foster more trust and drive adoption.
Iteration ensures digital systems match the local context and actual user needs. It reduces risks before scaling and shows people that their feedback matters.
Tools like sandboxes, prototypes, and phased pilots make iteration visible. They allow governments and communities to experiment together, identify issues early, and fix them openly. Every cycle of testing and improvement adds another marble to the jar.
3. Intentional transparency
People trust what they understand. Transparency means explaining not only the benefits of digital systems but also how they work, how they use data, and what the risks are.
This communication must be ongoing. It must be simple, accessible, and honest. People want clarity, not slogans. They want to know what is being tested, what is changing, and why it matters.
When governments communicate clearly and consistently, even hesitant users can find confidence in the system.
Thoughts for the future
Small actions, repeated over time, create the foundation for digital systems that people can truly trust and choose to use.
This is the essence of the TrustStack: a deliberate, structured way of building trust into every layer of digital transformation. It is a governance architecture and a set of human-centered practices that allow DPI and AI-powered systems to earn legitimacy. It helps us close the trust gap through demonstrated care.
As AI becomes more deeply woven into DPI, from eligibility decisions to service delivery, trust will be decisive. Without it, people will opt out, resist, or turn to alternatives they perceive as safer. With it, digital systems can become tools for inclusion, resilience, and shared opportunity.
So the task ahead is clear: we must build the TrustStack with the same ambition, investment, and creativity that we bring to building the technology itself. As we look toward the future of AI-powered DPI, the challenge is not only technological, but deeply human. And trust, once again, is the deciding factor.
