As the sharing economy continues to gain momentum, the importance of security and trust between users is becoming increasingly apparent. Events such as the Airbnb incident of June 2011 and the shutdown of the luxury car-sharing company HiGear have shown that peer-to-peer (P2P) marketplaces involve higher risks than business-to-consumer e-commerce. Key advocates of the sharing economy, Rachel Botsman and Lisa Gansky, have also identified “trust between strangers” as a necessary foundation for functioning P2P asset-sharing marketplaces. While existing reputation systems such as eBay’s rating system may have been sufficient for e-commerce, newer P2P platforms, such as car or flat sharing, require more sophisticated trust systems. And since acting anonymously is far easier on the Web than in real life, P2P transactions also call for some form of identity verification that confirms users are who they say they are. Having recognized these issues, entrepreneurs in several countries have begun to build portable cross-platform trust and identity systems meant to facilitate the sharing of assets between individuals, among them TrustCloud, Briiefly, Legit and Peertrust.
In my bachelor thesis, I took a closer look at these trust systems and examined the trust issue in P2P marketplaces more generally. Thanks to Skype, I was able to interview eleven individuals from across the globe involved in the collaborative consumption movement (researchers, social innovators, P2P platforms and companies attempting to create trust systems) and gained some valuable insights.
Here are some takeaways from my research:
Trust is complex.
When I first started working on the topic of trust, one of the things I discovered was that trust is difficult to define. If you ask different people to describe trust, you will probably hear a variety of keywords: honesty, credibility, reliability, empathy, belief or faith. None of these is right or wrong; they are merely different dimensions of trust. Many emotional and cognitive factors influence our decision to trust or not, and they are highly personal, since everyone perceives trustworthiness differently. Someone may make a trustworthy impression on one person but not on another. Trust also varies by situation: you may trust a friend to return your borrowed vacuum cleaner in good condition, but not your car. All these characteristics make it challenging to find a universal solution for building trust in P2P marketplaces.
Transparency is key.
On the Web, vast amounts of data are created every day. Most of the companies I examined in my thesis are looking for ways to make this data available and useful to users, for instance by using algorithms to calculate so-called “trust scores”. These scores, which are based on data from social networks and other sources (providing things like damage reports, peer reviews and transaction history), are supposed to help strangers judge each other’s trustworthiness. This information facilitates and accelerates the process of building trust between strangers on the Web. And since your trust score follows you to whatever platform you are on, it also encourages good behavior: a person who has worked hard to build up an online reputation will not want to jeopardize it.
My research also showed that it is crucial for companies offering these systems to remain as transparent as possible about how their trust scores are derived. Since trust is complex and every platform requires different dimensions of trust, every user should be able to understand the score and decide for themselves whether or not to trust a person. Being a good driver, for example, is very different from being a friendly and reliable CouchSurfing host.