
Redefining trust in the age of AI
April 10, 2025

Privacy is no longer the sole driver; trust is also about the value exchange
Google recently announced an update to its Gemini AI assistant that says a lot about where conversational AI is headed. The update allows Gemini to deliver more tailored responses by using personal data, starting with users’ search histories.
This change signals a shift in how AI builds consumer trust. Trust is about more than protecting privacy; it’s also about the value AI provides. And this evolution is not limited to AI.
Think about it. How often do you closely review the consumer privacy caveats when you buy the latest iPhone or update your operating system? We share sensitive personal data with Apple without much consideration because we trust Apple to deliver consistent performance personalized to our needs. And we share location data with wayfinding apps without balking because they need to know where we are to give us accurate instructions.
Conversational AI can gain our trust by offering a value exchange: personalization paired with empowerment and transparency that brings consumers along for the ride.
Trust as a value exchange
The old playbook says trust hinges on privacy. Keep data safe, and consumers will feel secure. But reality tells a more nuanced story. The 2024 KPMG Generative AI Consumer Trust Survey found that 70% of U.S. consumers believe AI’s benefits outweigh its risks, with over half trusting it for personalized recommendations or customer service.
Privacy is part of a larger mosaic of trust. Take Amazon’s AI-powered recommendation engine, which drives 35% of its sales. Shoppers know their browsing history fuels those suggestions, yet they keep clicking because the payoff (finding exactly what they need) feels worth it.
This doesn’t mean privacy is irrelevant. Privacy breaches can shatter trust. But privacy is a baseline expectation. When AI delivers, consumers trade data for outcomes. Conversational AI tools such as ChatGPT, Gemini, and Perplexity must lean into this exchange.
Personalization and empowerment
If performance opens the door to trust, personalization and empowerment deepen it. According to Twilio, 57% of consumers will spend more on brands that personalize experiences, and 66% will quit a brand that does not.
Personalization is where AI excels, which is why Gemini's move toward personalization is so significant for conversational AI in particular. AI's ability to tailor experiences through conversational tools, predictive analytics, and curated content makes interactions seamless and relevant.
Consider how Netflix builds engagement through personalized recommendations and how Spotify uses AI to enrich music discovery in ways that improve the listening experience. As a result, both have attracted and kept hundreds of millions of subscribers.
Empowerment, on the other hand, is about AI giving people agency. Apple's Clean Up feature in Photos (available on more recent iPhone models) helps photographers edit their work by removing unwanted objects from photos, like holding Photoshop in your hands.
More generally, AI empowers users every day in ways that are less obvious but still important. For example, in automobiles, adaptive cruise control uses AI with radar and cameras to adjust speed based on traffic. The driver sets the speed and distance preferences; AI handles microadjustments, freeing up the driver to focus on steering and decisions, like a co-pilot.
Bringing consumers along for the ride
Conversational AI has an opportunity to earn our trust with more transparency. As it stands, consumer trust in generative AI is a mixed bag, shaped by a blend of enthusiasm for its capabilities and skepticism about its accuracy. A 2024 SEMRush survey reported widespread user distrust in the accuracy of generative AI search results. Some 60% of surveyed users wanted greater assurance in the form of more accurate data and cited sources included in search results. In other words, conversational AI has a challenge in the "value" component of trust.
Consumers don’t need AI to be perfect, but they do need to understand its imperfections, and they need to know what conversational AI providers are doing to improve accuracy and inject rigor. Perplexity footnotes its AI-generated answers, allowing users to see where information is sourced. But even so, Perplexity makes mistakes by drawing wrong conclusions from those sources. AI providers could build trust by offering real-time error tracking or public-facing reports that document both progress and challenges.
Beyond transparency, trust flourishes when consumers feel they have a seat at the table. Providers of generative AI apps can earn trust by showing how they empower people to get better at what they do rather than replace their work. AI companies could also involve users in co-creation, whether through feedback loops, beta testing, or personalized customization. An AI assistant that periodically checks in with a "Did I get that right?" is the kind of participatory trust-building that turns AI from a black box into a collaborator.
A new foundation for trust
Trust in the AI era isn't static, nor is it rooted in fear. It's dynamic, tied to value, personalization, empowerment, and openness. Privacy is no longer the sole driver of trust; conversational AI can build trust by doubling down on value.
Featured in AdWeek