Public trust in institutions and national leaders has declined across the globe for decades with alarming consistency.
This trend is troubling because trust acts as a bellwether for many critical components of everyday life: the perceived moral quality of society, the perceived strength of the country’s government, belief in the competence of national leadership, belief in the possibilities of the future, and confidence in the country’s economic potential.
When trust is low, nations collapse, religions collapse, democracies collapse. The youngest two generations of our current workforce, Gen Z and Millennials, are especially untrusting – unsurprising for two generations navigating a world without the sense of security that characterized the 1950s and 1960s (and, an argument can be made, the 1990s), eras of family stability and relative prosperity against which rebellion against authority and a strong sense of individuality could flourish.
In contrast, the Gen Z and Millennial experience is marked by precariousness: by the time baby boomers hit the age of 35 in 1990, they collectively owned 21 percent of American wealth; Millennials turning 35 this year own just 3.2 percent. These economic indicators are unsettling for trust researchers because they signal that further declines in trust are coming, and with that decline come further constraints on economic growth.
So imagine the surprise when one group bucked the trust-decline trend: trust in business has increased for Americans. And only Americans. (Source: 2023 Edelman Trust Barometer.)
This finding confirms a suspicion that has been brewing, but to fully flesh out my thinking, we have to go back to a seminal contribution to trust research: the Trust Game. The Trust Game is a game-theory exercise (related to the Dictator Game) that simulates a trust end-state based on the moves of two individuals, each given a small sum of money (one coin).
If Player 1 puts their one coin into a machine, Player 2 receives 3 coins. Player 2 can then decide whether to put in one coin of their own, which turns into 3 coins for Player 1, or to give zero coins and leave with 4 coins while Player 1 leaves with none. The creators of the game complicate its parameters by adding strategy variations: some players “always cooperate”, some players “always cheat”, and some players simply repeat the moves of the other player – the “copycat”.
If Player 1 cheats, the copycat cheats; if Player 1 cooperates, the copycat cooperates. These strategies shake out like this: in a game that pairs a player who “always cheats” with a player who “always cooperates”, the cheater does not win big – everyone might leave the table with a little something, but only a little. In a game of only “always cooperate” players, the result is a big win for everyone. Across these games, all players, including the ones who always cheat, are better off because of the players who always cooperate.
But if we introduce copycats (players who do as their neighbor does) into the mix, the cheaters will eventually eliminate the cooperators, producing a society entirely of copycats or entirely of cheaters, depending on the number of interactions: fewer interactions produce a society of cheaters; more interactions produce a society of copycats.
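The dynamics above can be sketched in a short simulation. The payoffs come from the game description (cooperating costs you 1 coin and delivers 3 coins to the other player; cheating costs and delivers nothing); the assumption that a copycat opens by cooperating is mine, following the standard tit-for-tat convention, and the strategy names are illustrative.

```python
# A minimal sketch of the iterated Trust Game, assuming copycats
# cooperate on the first move (tit-for-tat convention).

def play(strategy_a, strategy_b, rounds):
    """Play `rounds` rounds and return the pair (score_a, score_b)."""
    score_a = score_b = 0
    last_a = last_b = "cooperate"  # assumed opening move seen by copycats
    for _ in range(rounds):
        move_a = strategy_a(last_b)  # each strategy sees the opponent's last move
        move_b = strategy_b(last_a)
        if move_a == "cooperate":    # cooperating costs 1 coin, gives the other 3
            score_a -= 1
            score_b += 3
        if move_b == "cooperate":
            score_b -= 1
            score_a += 3
        last_a, last_b = move_a, move_b
    return score_a, score_b

always_cooperate = lambda opponent_last: "cooperate"
always_cheat = lambda opponent_last: "cheat"
copycat = lambda opponent_last: opponent_last  # repeat the opponent's last move

# Two cooperators: each nets +2 per round -- the "big win for everyone".
print(play(always_cooperate, always_cooperate, 10))  # (20, 20)

# Cheater vs. copycat: the cheater profits only once, then both stall at zero.
print(play(always_cheat, copycat, 10))  # (3, -1)
```

The cheater-versus-copycat result illustrates why the number of interactions matters: the cheater’s one-round windfall looms large in a short game but becomes negligible as rounds accumulate and mutual cooperation would have paid more.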
Now, if fewer interactions produce a society of cheaters, where are we headed when we can have groceries delivered to our doorstep without speaking to a single person, order pizza through an app, and connect with potential mates outside the communities where we live and work using dating apps? These questions bring us back to the hypothesis and to the rising trust in business among American respondents: for some individuals, interactions with large organizations ARE their most frequent interactions and thus have the potential to be their most trusted interactions.
I might never speak to my next-door neighbor, but I will interact with Google, Amazon, or Microsoft one or two (or forty) times a day. I might talk to my brother on the phone every couple of weeks but will likely shop at Target, Walmart, or Costco – whatever the big-box store of choice – every week.
Trust forms through repeated interactions. If there is no possibility of a repeated interaction, there is no real need for trust. It’s for this reason that tourist traps are, in fact, traps: the majority of visitors to the area are unlikely to come back, at least in the short term, so the goods can be shoddier, the food poorer, and the overall experience somewhat lacking. Contrast that with your neighborhood bakery or cafe, whose success depends on the repeat patronage of neighborhood residents – the coffee and pastries are excellent, or the variety of goods perfectly meets your needs; something desirable compels you to return. That bakery or cafe has built its long-term viability on the trust its customers have in the business.
The IDC Future of Trust program is focused on what engenders and maintains trust in business, and what security, privacy, compliance, and ESG offerings lend themselves most to trust and trustworthiness. We also focus research on what breaks trust, as in the recently published survey spotlight on what was perceived to be the greatest “trust-breaking” event for businesses.
Research has long shown that trust, compared with direct oversight, facilitates more efficient operations, mitigates the adverse outcomes of negative events such as data breaches, and is a prerequisite to getting individuals to share high-quality personal information. IDC research has shown that trust confers benefits on key business outcomes as well, namely in the areas of business resilience, operational efficiency, and sustainability.
In the coming months, the IDC Future of Trust program will offer greater insight into the nature and key features of trust, with empirical research applied to trusted AI, trusted security and privacy, and trust by industry. Stay tuned.