Americans May Use AI, But That Doesn’t Mean They Trust AI
In A Nutshell
- Environmental concerns are rising: 57% express worry about the energy consumption required to power AI systems
- 79% of Americans support government regulation for AI answer engines, with only 12% believing no additional oversight is needed
- Trust remains extremely low: Only 16% trust AI answer engines “a great deal,” despite one-third using them daily
- Privacy is the top concern at 48%, followed by accuracy (36%) and transparency (32%), with 81% worried about data access
Americans have decided: artificial intelligence needs a referee. A new survey of over 1,400 people reveals that 79% now favor some level of government regulation for AI answer engines, with more than one-third calling for strong oversight. Only 12% believe no additional regulation is needed.
Nearly one-third of survey respondents use AI every day, according to Shift Browser’s 2026 AI Consumer Insights Survey. More than half say these tools improve their online experience. But growing familiarity hasn’t bred trust, instead sharpening concerns about how these systems operate behind the scenes.
Support for regulation cuts across the usual divides. Of the 79% who favor oversight, 35% want strong regulation while the remainder prefer a moderate approach, pointing to a public that wants guardrails without stifling innovation.
Privacy and Trust Drive Regulatory Demands
Eighty-one percent of respondents worry about AI systems accessing personal data or private conversations. Privacy tops the list of concerns at 48%, followed by accuracy at 36% and lack of transparency at 32%. These aren’t abstract worries. People fear their search histories, messages, and documents are being fed into systems they don’t understand and can’t audit.
“AI is moving quickly and so are user expectations for transparency and control,” said Michael Foucher, Vice President of Product and Customer Success at Shift. “Consumers clearly see value in AI tools, yet they also want greater clarity and control over how those systems operate.”
Only 16% of respondents trust AI answer engines “a great deal.” Even among daily users, trust remains tentative. Sixty percent say they trust these engines at least somewhat, a qualified endorsement suggesting people rely on AI out of convenience rather than confidence. The gap between usage and trust explains much of the regulatory appetite: when a technology becomes widespread but is deeply trusted by only a small minority, the public looks to external oversight rather than corporate self-regulation.

AI’s Growing Influence on Opinion
Fifty-eight percent said AI-generated answers have influenced their opinions at least occasionally. These tools have moved from novelty to influencer, shaping decisions and beliefs without the traditional editorial oversight applied to news outlets or published research.
When asked what bothers them most about AI systems, 32% cited an inability to understand how answers are generated. People want to know not just what AI tells them, but how it arrived at those conclusions and what data it used.
Nearly half of respondents, 48%, reported comfort with autonomous AI features when oversight is present. Acceptance depends less on the technology itself and more on the degree of visibility and control users maintain. Forty-four percent worry about AI taking actions without approval.
Twenty-six percent report difficulty managing or turning off AI features once enabled, pointing to a design problem as much as a policy one. Tools that bury opt-out mechanisms in layers of settings feed distrust even among users who appreciate the benefits.
Energy Consumption Becomes Part of the Debate
Fifty-seven percent of respondents expressed concern about the energy required to power AI systems. Data centers running large language models consume massive amounts of electricity, and as awareness grows, environmental impact may increasingly factor into how people evaluate AI platforms.
This concern represents a newer front in the AI debate. While privacy and accuracy have dominated discussions for years, sustainability is emerging as a third pillar of evaluation. People want to know not just whether AI works and whether it respects their data, but also what it costs the planet.
Oversight could extend beyond privacy and accuracy to include operational responsibility, requiring companies to disclose energy usage or meet efficiency standards. Such requirements would give consumers the information needed to choose platforms based on environmental performance, not just features.
What Users Actually Want
When asked about desired functionality, 54% prioritized research assistance, 34% wanted article summarization, and 32% sought task automation. People value AI most for augmenting their own work rather than replacing it. They want tools that help them sift through information, not systems that make decisions on their behalf without input.
Fifty-one percent said the ability to customize or limit AI features is important. People don’t want to abandon AI; they want to decide when and how it operates.
Daily engagement with AI is highest among 25- to 34-year-olds and working professionals, while adults 65 and older are least likely to use these tools. Twenty percent of respondents said they never use AI.
For many, AI improves digital workflows but hasn’t delivered transformational time savings. Practical, task-oriented applications dominate usage patterns, meaning the next phase of adoption may hinge on whether these tools can move from helpful to indispensable.
Support for regulation may reflect this pragmatic relationship. People use AI because it’s useful, not because they’re enthusiastic about it. They see its flaws clearly and want safeguards in place before it becomes more deeply embedded in critical systems like healthcare, finance, or education.
Public opinion appears to be settling on a middle path: embrace the technology, but don’t let it run unchecked. The question now is whether policymakers will move fast enough to meet that demand.
Survey Methodology
The survey was conducted among 1,448 respondents and weighted to be nationally representative by income, ethnicity, age, gender, and region. Shift Browser, which commissioned the research, produces a customizable browser designed for professionals managing multiple accounts and apps.