Report from the EU H2020 Research Project Ps2Share:
Participation, Privacy, and Power in the Sharing Economy
Privacy in the Sharing Economy
Giulia Ranzini, VU Free University Amsterdam
Michael Etter, Copenhagen Business School
Christoph Lutz, BI Norwegian Business School
Ivar Vermeulen, VU Free University Amsterdam
This project has received funding from the European Union's Horizon 2020
research and innovation programme under grant agreement No. 732117
1. Introduction: On Privacy and the Sharing Economy
The emergence of the sharing economy has brought a subtle but substantial redefinition of the key
actors within economic exchanges. Not only have consumers taken a far more substantial role in
the sharing of their goods and time, but also their participation in the economy has grown to the
point of redefining the meaning of ownership, personal goods, and spaces. This phenomenon has
been defined as emerging from a true cultural shift in consumer preferences: having temporary,
even shared, access to assets has become more attractive than singularly owning them (Bardhi &
Eckhardt, 2012; Belk, 2013; John, 2013).
Within the sharing economy, however, different underlying processes are to be found. Consumers are drawn towards more social and sustainable alternatives to traditional accommodation,
workplaces, and transport. Platform organizations within the sharing economy offer consumers
the opportunity to experience what is often perceived as a more personal and unique service
(McNamara, 2015), with both high affective and high participative value (Liu & Mattila, 2017). The
act of sharing makes consumers feel like they are somehow contributing positively to the community (Hellwig, Morhart, Girardin, & Hauser, 2015). At the same time, however, some consumers
participate in the economy by providing their own goods and accompanying services. This is also
often motivated by a willingness to create and sustain a community. However, because of the
facilitation offered by online platforms, participating in the sharing economy as a provider can
become an additional source of income and an opportunity to profit from networks of knowledge
and value (Ikkala & Lampinen, 2015).
As the meaning of ownership shifts, with users sharing goods, spaces, and information, questions arise regarding perceptions of privacy in relation to organizations and in relation to their
peers. Indeed, the platform-mediated sharing process typically involves the exchange of personal
information. Addresses, credit card information, as well as geo-location, travel habits, photos of
personal items, individual preferences related to the use of various goods, and personal spaces are
exchanged or made public and therewith require some implicit or explicit privacy considerations
by providers, consumers, and platform organizations. With regard to the new ways in which personal information is disclosed and exchanged, it is interesting to address what privacy trade-offs take place in exchange for users’ participation in the economy, how these trade-offs affect users’ willingness to participate in the sharing economy, and the benefits users derive from their participation. Indeed, privacy concerns have been identified as among the main factors determining whether individuals participate in the sharing economy.
Privacy has been defined as a fundamental human right. However, its meaning and limits
have evolved together with society (Solove, 2008). As individuals began to interact online, what
used to be defined as their ‘right to be left alone’ (Warren & Brandeis, 1890) has evolved into a
more nuanced trade-off, in which the risks related to user data are evaluated against the benefits
of participating in the interaction (Egelman, 2013). Talking about privacy online is, in fact, often a
consideration of the trade-offs occurring with regard to user data. The costs related to the risk of
privacy loss are weighed against the advantages that the digital world offers, for example, in terms
of personalization (Chellappa & Sin, 2005), self-expression (Trepte & Reinecke, 2011), social capital
(Stutzman, Vitak, Ellison, Gray & Lampe, 2012), economic gains (Teubner & Flath, 2016), or even,
more simply, the convenience of staying connected (Egelman, 2013).
Conceptually, the mere existence of a sharing economy brings about questions of privacy as it
involves the simultaneous sharing of consumer data (in exchange for participation on sharing platforms) and consumer-owned goods, spaces, and services. For both categories of users involved,
both consumers and providers, optimal privacy is achieved when they reach a solution that allows them both to take part in the sharing economy and to maintain a desired level of exposure to
peers and organizations. Additionally, when the sharing system involves a monetary exchange,
different motivations and expectations can alter the privacy calculus of all the users involved.
The complex exchange that takes place within the sharing economy, which involves data, goods,
and services, makes privacy a particularly multifaceted concept, which can be approached through
different perspectives. In the following paragraphs we will highlight a number of important opportunities and challenges related to privacy within the sharing economy.
This report forms one part of a European Union Horizon 2020 Research Project on the sharing
economy: Ps2Share ‘Participation, Privacy, and Power in the Sharing Economy’
(www.ps2share.eu). We aim to foster better awareness of the consequences which the sharing
economy has on the way people behave, think, interact, and socialize across Europe. Our overarching objective is to identify key challenges of the sharing economy and improve Europe’s digital
services through providing recommendations to Europe’s institutions.
The initial stage of this Research Project involves a set of three literature reviews of the state of
research on three core topics in relation to the sharing economy: participation (Andreotti, Anselmi, Eichhorn, Hoffmann, & Micheli, 2017), privacy (this report), and power (Newlands, Lutz, &
Fieseler, 2017).
2. Privacy and the Exchange Process
Privacy in the context of the sharing economy concerns the sharing of data but also the sharing of
other resources, such as objects and owned spaces. Privacy in data sharing is two-fold: data exchanges take place between users and platform-organizations, and between users and peer-users.
Sharing of data by consumers with the platform-organization is done in exchange for the option to
participate on the sharing platform. Sharing of data by consumers with providers is done in order
to access a service or good. Similarly, providers share data with organizations in exchange for the
option to provide a service on the sharing platform and with consumers for revenue.
Figure 1: Summary of the data exchange for consumers and providers
Participation in the sharing economy resembles interactions on Social Network Sites (SNS) with
a relational purpose, such as dating sites. Similar to dating sites (and now dating apps), users on
sharing platforms tend to disclose information that allows them to present themselves in a way
that is both desirable (Peterson & Siek, 2009) and attractive for the type of user they consider to
be a good match (Tussyadiah, 2016). Furthermore, as on dating sites, users will utilize the private
information available to evaluate the trustworthiness of other users, regardless of whether they
are participating in the sharing economy as consumers or as providers (Ert, Fleischer, & Magen,
2016; Ma, Hancock, Mingjie, & Naaman, 2017).
Privacy in the sharing of physical spaces and goods becomes an issue when users either invite
other users within their ‘private spaces’, such as apartments, offices, or cars, or give other users
access to their goods, such as books, power drills, or clothes. When doing so, users are effectively re-negotiating their private boundaries (Lampinen, 2015). They are achieving a new “contextually desirable degree of social interaction” (Lampinen, 2015, p. 26). In many ways, by participating in the sharing economy, users blur the boundaries between online and offline sharing. In fact, it
is from the online exchange of information that users determine whether to share their spaces
and physical goods. Because the desirability of their assets determines their ability to match with
others, the disclosure of personal information becomes essential for the process to take place.
This process requires what Tan (2010) defines as ‘the leap of faith’, i.e. the achievement of a degree of trust that can compensate for the risk related to missing information in the transition from
online to offline interaction.
3. Privacy and Users
The negotiation of boundaries in the sharing economy differs depending on the role of users, as
well as on their needs and expectations. Historically, privacy is conceptualized as a flexible, rather
than a fixed, need. It tends to reach optimal levels in the trade-off against responses to other individual needs such as security (Altman, 1975; Prabhakar, Pankanti, & Jain, 2003; Solove, 2011),
trust (Pearson, 2013; Raya, Shokri, & Hubaux, 2010), and utility (Egelman, 2013; Guo & Chen,
2012; Krause & Horvitz, 2008; Li & Li, 2009). The maintenance of relationships and social capital
can also be something that individuals decide to negotiate privacy against (Ben-Ze’ev, 2003; Ellison, Vitak, Steinfield, Gray, & Lampe, 2011).
In order to participate in the sharing economy, both providers and consumers must disclose a
certain amount of information to the platform organization in exchange for access to the platform
on which the exchange takes place. This information, which is typically disclosed upon registration,
typically includes users’ names, phone numbers, and email addresses. Platforms such as Airbnb
and Couchsurfing also require user profiles to feature a personal picture (Fagerstrøm, Pawar, Sigurdsson, Foxall, & Yani-de-Soriano, 2017), while others, such as Uber, only offer it as an option.
This information, which acts as a de facto gateway to platform access, also serves to ‘anchor’ the
online identity to a real, existing offline person (cf. Zhao, Grasmuck & Martin, 2008).
In addition to accessing the platform, consumers disclose information in order to receive a service or good from providers. This information, generally, has the purpose of making a consumer’s
profile attractive enough to providers, who offer a service or good matching their interest (Lampinen, 2015). Consumers will disclose information that makes them appear trustworthy and recommendable (Rosen, Lafontaine, & Hendrickson, 2011), thereby minimizing the uncertainty of providers (Lampinen & Cheshire, 2016). In this sense, impression management takes place in the kind of information disclosed (Yang, Quan-Haase, Nevin, & Chen, 2017) but also in the amount, as studies have demonstrated that giving providers a complete profile can be crucial for minorities to
receive the shared service (Ma et al., 2017).
Providers, unlike consumers, are asked to disclose information about who they are, as well as about the resources that they share. This is done strategically, so as to be attractive to consumers. In a study of Airbnb, Tussyadiah (2016) refers to this as a process of ‘branding’. Accordingly, hosts typically adopt one of five roles (The Global Citizen, The Local Expert, The Personable, The Established, or The Creative) in order to attract guests who could be compatible with their offer.
Impression management takes place with respect to the person offering the service or good,
especially when the service or good provided puts providers in direct contact with consumers
(Fagerstrøm et al., 2017; Molz, 2012b). Similar to SNS, users will tend to share positive and intimate information to feel more connected to other users (Utz, 2015), and to generate positive impressions (Lee-Won, Shim, Joo, & Park, 2014). Beyond what happens on SNS, however, the potential revenue involved increases users’ need to achieve desirability with the information they share (Ma et al., 2017).
Figure 2: Three levels of information disclosure.
• Self-presentation / voluntary data: information disclosed to increase the likelihood of participation in the sharing economy (profile personalization; additional photos, personal descriptions, details on the service requested or provided).
• Mandatory data: information disclosed to receive or provide goods and services (real names, e-mail address, phone number, personal photo).
• Technical data: information available to the organization (location and navigation data; interaction data).
Impression management also takes place with respect to the service or good offered, especially
when the contact between providers and consumers is limited. The objects and spaces represented will have to be desirable for the consumer (Ikkala & Lampinen, 2015) and project a positive
light on their owners (Festila & Müller, 2017). The identities of the provider and of their shared
resource are intertwined; they both influence how the provider’s reputation is assessed. Therefore, information about a shared apartment or a shared car is shared strategically.
4. Privacy and Motivation to Share
The disclosure of private information is considered to be a gateway to accessing the sharing economy, to the extent that personal details become an “integrated part of the service that is delivered” (Fagerstrøm et al., 2017, p. 124). The relationship between individuals’ privacy concerns and
their willingness to share private information online has been at the center of substantial research,
highlighting a relationship of surprising complexity.
In fact, earlier research on SNS emphasized a somewhat paradoxical lack of relationship between users’ privacy concerns and their self-disclosure online (Barnes, 2006). SNS users, despite being concerned about and somewhat aware of privacy risks, make very private and intimate information
publicly accessible on social networks. More recent approaches have introduced the concept of a
‘privacy calculus’, i.e. an evaluation of privacy risks against the benefits of disclosing personal information, taking place at all times while users interact online (Dienlin & Trepte, 2015). This perspective suggests that users may be weighing their privacy concerns against other motivations,
such as impression management (Utz & Krämer, 2009) or convenience (Sun, Wang, Shen, & Zhang,
2015). When privacy risks are too pressing to be offset by perceived benefits, users limit their self-disclosure or engage in self-withdrawing behavior (Dienlin & Metzger, 2016).
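The calculus described above can be sketched as a simple benefit-risk comparison. This is a purely illustrative model under our own assumptions: the factor names and weights below are hypothetical, and the literature treats the privacy calculus as a conceptual rather than a formal model.

```python
# Illustrative sketch of the privacy calculus: a user discloses when
# perceived benefits outweigh perceived privacy risks. Factor names and
# weights are hypothetical, chosen only to mirror the benefit categories
# discussed in this report (economic, social capital, convenience).

def would_disclose(benefits: dict[str, float], risks: dict[str, float]) -> bool:
    """Return True if perceived benefits outweigh perceived privacy risks."""
    return sum(benefits.values()) > sum(risks.values())

# A hypothetical consumer weighing participation on a sharing platform:
benefits = {"economic": 0.6, "social_capital": 0.3, "convenience": 0.4}  # total 1.3
risks = {"data_misuse": 0.5, "peer_exposure": 0.3}                       # total 0.8

print(would_disclose(benefits, risks))  # prints True: 1.3 > 0.8
```

The additive weighing is the simplest possible operationalization; empirical work suggests the actual evaluation is context-dependent and far less mechanical.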
When users interact online to engage in activities with a more strictly defined purpose, such as
dating sites or e-commerce platforms, their privacy calculus becomes more pronounced (Dinev &
Hart, 2006). In fact, if privacy concerns can become an obstacle for the willingness to participate in
an online transaction (Dinev & Hart, 2006) or in the disclosure of information with an e-marketer
(Morosan & DeFranco, 2015), they can also be offset by the perceived usefulness of information
disclosure (Li, Sarathi, & Xu, 2010; Morosan & DeFranco, 2015). Users are also more willing to disclose information in order to obtain a service when they feel like they are capable of protecting
their data (Li et al., 2010).
In the context of the sharing economy, both providers and consumers weigh their privacy concerns against the benefits they receive from participating. Benefits of the sharing economy have
been identified as belonging to three essential categories: economic (Bucher, Fieseler, & Lutz,
2016; Lampinen & Cheshire, 2016), social capital (Hamari, Sjöklint, & Ukkonen, 2016; Lampinen &
Cheshire, 2016), and reputation (Hamari et al., 2016).
Economic benefits in the context of the sharing economy can be thought of in terms of earnings
for providers and savings for consumers (compared to non-sharing economy options). Hui, Teo,
and Lee (2007) identify a category of users (‘information sellers’) as systematically valuing economic gains over their personal information. More generally, economic benefits will likely drive
users to disclose private information to an organization if they perceive that the exchange is fair (Li
et al., 2015). Within the sharing economy, this fairness can be enhanced by the shared perception
of the market as being more sustainable than traditional alternatives (Bucher et al., 2016). This can
raise the perceived value of the economic benefits from participating in the sharing economy and
increase the user willingness to participate.
Social capital can be understood as a benefit provided by the sharing economy when exchanges
rely more strongly on network structure. Previous research on SNS has highlighted how the perception of social capital moderates the relationship between disposition for self-disclosure and
participation in SNS (Ellison et al., 2011; Trepte & Reinecke, 2011). Rosen and colleagues (2011),
investigating the role of social capital within Couchsurfing, highlight how information exchange
determines the formation of ties among participants and grants them access to resources.
The option to improve one’s reputation has been established as both a motivation for users to
interact within a community (Wasko & Faraj, 2005) and as a signal of trustworthiness towards
other users, which can determine further engagement (Utz, Kerkhof, & van den Bos, 2012). Participants within a community improve their reputation by sharing information (Park, Gu, Leung, &
Konana, 2014). This mechanism can be incentivized if a higher reputation provides financial benefits (Cabral & Hortacsu, 2010; Ert et al., 2016).
5. Privacy in Shared Goods and Spaces
As individuals share their personal goods with others, privacy concerns and new privacy management practices may emerge in two interrelated ways: In relation to the physical use of these goods
by others and in relation to the online exposure of these goods. Both privacy concerns are related
to the idea of extended self as in its original form (Belk, 1988) and in its new form related to digital
consumption (Belk, 2013).
First, the use of goods by others (Teubner & Flath, 2016) may be seen as an intrusion into private and personal physical spheres (Bialski, 2012a; 2012b; Lampinen, 2015), when other people
literally enter one’s own home or use one’s own car (Buchberger, 2012; Tan, 2010; Zuev, 2012).
Furthermore, personal goods enable “inferences on personal styles and preferences, and often –
more delicately – also on life situation and personality traits” (Teubner & Flath, 2016, p. 1).
Privacy concerns might evolve as individuals “knowingly, intentionally or unintentionally” regard
possessions as parts of themselves (Belk, 1988, p. 139). It has been argued that things to which
one feels attached will become “parts of the extended self” (Belk, 1988, p. 141). In other words,
the self is seen as embodied in objects, which give cues to others about one’s person (Belk, 1991;
Goffman, 1959).
Research suggests that privacy management might be enacted through interaction between users and providers (Bialski, 2012; Molz, 2012a; 2014), by establishing preventive privacy rules that
are communicated (Lampinen, 2016), and through temporal and physical structures (Lampinen,
2016).
Secondly, the exposure of one’s goods by others (Teubner & Flath, 2016) may be seen as an intrusion into one’s private and personal digital sphere, when other people assess one’s own goods
online, either through photographs and videos or through evaluations, judgments, and recommendations. This exposure ‘on behalf of others’ (Lampinen, 2015), i.e. conducted by third parties,
occurs without the prior negotiation or consent of the provider.
In an era of ‘self-portraiture’ (Schwarz, 2010), others influence how we present our extended self and our idealized self-image, which might also impact the way one’s past is constructed
(Van Dijck, 2008). It is through photos that we post online of our “house, the kind of car we drive,
and our stock portfolio” (Belk, 2013) that we display ourselves for the whole world to see. Furthermore, it is not only ourselves that influence this process, but it is also a co-construction of the
digital self which occurs by validation and affirmation through others (Belk, 2013; Drenton, 2012).
The judgment of one’s goods impacts this co-construction (Belk, 2013; Solove, 2008).
Because it is harder to control all of our digital self-representations (Belk, 2013), the loss of control over the exposure of one’s things might create privacy concerns and new coping mechanisms. Indeed, while we may exercise self-control, it is far harder to control our digital self-representations when others share photos of our things with unintended consequences (Teng, Lauterbach, & Adamic, 2010).
The interesting questions are, then, what kind of boundary management techniques providers
apply in order to manage privacy, whether there are any co-operative processes (Lampinen, 2015)
in boundary management, and how technology and policies impact these boundary management
processes.
6. Privacy and Trust
Any instance of sharing between individuals, whether it might concern physical or virtual goods,
presumes a certain level of trust. Trust can be broadly defined as “a psychological state comprising
the intention to accept vulnerability based upon positive expectations of the intentions or behaviors of another” (Rousseau, Sitkin, Burt, & Camerer, 1998, p. 395). As such, it can be believed to
play a major role in the privacy calculus of individuals who operate online (Dinev & Hart, 2006).
Whenever users evaluate an organization or another user as trustworthy, they use their trust as
a way to overcome uncertainty. It is therefore not surprising that the study of trust online, especially in conjunction with privacy, has mainly been carried out in the context of e-commerce.
According to McKnight, Choudhury, and Kacmar (2002), four forms of trust exist in an online context: disposition to trust, institution-based trust, trusting beliefs, and trusting intentions. Disposition to trust describes individuals’ generalized trust and can be interpreted as a cultural trait
(some cultures are more trusting than others). Institution-based trust is based on the idea that
once “structural conditions are present (e.g., on the Internet)” there is an increased probability
that an exchange, or a transaction is going to take place as expected (McKnight et al., 2002, p.
339). Trusting beliefs refer to the user’s perception that the trustee might have beneficial attributes. Different trusting beliefs have been proposed, of which three recur most often: competence, benevolence, and integrity (Bhattacherjee, 2002). Finally, trusting
intentions signal users’ willingness to depend on the trustee. Trusting intentions lead to trusting behavior. In an e-commerce context, trusting intentions could lead a buyer to anticipate payment
to a seller before having received the exchanged good.
An important element of the sharing economy is its reliance on networks of peers. Interpersonal trust within a community of peers has been found to promote information sharing and altruism
(Fang & Chiu, 2010). In the context of SNS, the relationship of trust and information disclosure has
been found to be substantially more complex. On the one hand, a certain degree of interpersonal
trust is necessary for users to decide to disclose their information (Dwyer, Hiltz & Passerini, 2007).
On the other, however, some information disclosure needs to take place in order for users to be
able to trust other users (Sheldon, 2009). In a context where users need to interact with others in
order to purchase something, or perform an exchange of goods, users’ trust towards one another
can motivate interactions when privacy concerns are high (Eastlick, Lotz, & Warrington, 2006).
Considering trust and its interaction with privacy, in the context of the sharing economy, means
considering not only the trust users have towards other users, but also the trust users have towards organizations and institutions. Previous research has established that trust towards an
online institution can be significantly lowered by users’ perceived privacy risks (Belanger, Hiller, &
Smith, 2002; Büttner & Göritz, 2008; Jarvenpaa, Tractinsky, & Vitale, 2000). On the other hand,
however, trust in an organization, or even trust in the Internet, has been shown to have a positive
effect on users’ attitudes towards transactions (McCole, Ramsey & Williams, 2010). This effect
appears to be particularly strong for individuals with high privacy concerns (McCole et al., 2010),
suggesting that trust might indeed enter the privacy calculus (Dinev & Hart, 2006) and provide a
stronger motivation for the most concerned users. For organizations within the sharing economy,
this suggests that privacy concerns might stand in the way of participation for users, especially if
the company does not provide enough information for users to perceive it as trustworthy (e.g. Liu,
Marchewka, Lu, & Yu, 2005).
In the context of the sharing economy, trust is essential for the proper functioning of the relationships between consumers and providers, as well as between both categories of users and the
platform organization (Lauterbach, Truong, Shah, & Adamic, 2009; Rosen et al., 2011). However,
some elements on which user trust towards an organization could be founded, such as the functioning of matching algorithms, remain largely unknown (Marr, 2016) and internal review systems
have been proven to be substantially inflated, leading users to make decisions based on other
cues, such as photographs (Ert et al., 2016). This might have important consequences for the way
in which users represent themselves and their assets and, consequently, on their information disclosure and privacy. Unfortunately, to this day, research connecting trust and privacy in the sharing economy is substantially lacking. We wish for future research to better identify the link between privacy and user trust in the sharing economy, both towards peers and towards organizations.
7. Propositions for Future Research on Privacy and Sharing
Overall, privacy emerges as both an important and a relatively understudied element within the
sharing economy. In fact, the mere act of sharing goods and services, with or without a payment
involved, redefines the boundaries between what is one’s own and what belongs to someone else.
This shift to accessing from owning goods has implications for the kind of personal data that is
exposed and exchanged, such as preferences in the use of goods or travel patterns. Furthermore,
as sharing happens simultaneously online and offline, user boundaries are renegotiated when it
comes to their tangible and intangible assets, goods, and identities.
In this report, we have focused on the exchange happening behind the act of sharing data,
goods, and services. Using the privacy calculus framework, existing literature on sharing economy
platforms, as well as SNS and dating sites, we attempted to investigate what advantages are available for both providers and consumers in exchange for their data. Within this framework, we have
also explored the role of impression management and the consequences from sharing each type
of information. Consequently, we have investigated the motivations leading both categories of
users to participate in the sharing economy. We have separately addressed the issues of boundary
management relating to the sharing of goods and spaces. Finally, we have covered the role of trust
in connection with privacy within the sharing economy and how it impacts the relationships between peers, as well as between users and organizations.
As the sharing economy phenomenon increases in size and popularity, it appears clear that several areas of research could be reinforced in the future:
• Impression Management and Privacy Calculus of Consumers: The role of strategic information disclosure in order to participate in the sharing economy has so far largely concerned providers, as their self-presentation is instrumental to the offer of their
goods/services (Ert et al., 2016; Lampinen, 2016). However, the right typology of shared
information can determine whether a consumer can access a shared service or be refused
participation (Fagerstrøm et al., 2017). In particular, as studies discuss instances of consumer discrimination (Edelman, Luca, & Svirsky, 2017), research should dedicate attention
to information sharing behaviors and on the strategies put in place to ensure participation
in the economy.
• Institutional vs Peer Trust: As user information is diffused to both peers and organizations
within the sharing economy, some academic attention has been dedicated to trust, especially in combination with user reputations and the internal review systems (Ert et al.,
2016). However, as organizations grow in popularity and expose themselves with sometimes questionable conduct, it will be interesting to investigate whether and how their
reputations influence users’ willingness to be involved. At the same time, as previous research on SNS has shown significant differences in users’ perception of privacy risks deriving from institutional versus peer interactions (Young & Quan-Haase, 2013), it will be interesting to test whether such observations also apply within the sharing economy.
• The Sharing of Spaces and the Extended Self: As the sharing of spaces, objects, and private possessions becomes more and more popular, some questions are raised in terms of
the shift this generates in users’ perception of their identities. In fact, using Belk’s theory
of the ‘Extended Self’ (1988), users might still be assigning an identity value to objects they own. This can inform their willingness to share them. More research should be dedicated to understanding whether this is the case.
Implications for platforms
Over recent years, we have witnessed the development of different platform organizations with
varying degrees of maturation and commercialization. While some platform-organizations are run
as profit oriented corporations with a need to satisfy investor and shareholder interests, other
platforms with a strong community orientation are genuinely dedicated to more altruistic goals.
Consequently, the resources allocated to the development of platforms and their maturation may
vary among platform organizations, with possible implications for privacy related matters. Depending on the degree of commercialization and maturation, it can be expected that certain platforms might have a stronger interest in monetizing data and, therefore, a more strategic approach
to user privacy.
This can become problematic, especially as governments and regulators turn their attention to the consequences of data sharing. Regulatory developments could strongly impact how platforms handle and manage
privacy issues. Less restrictive regulations may lead platform organizations to extensively use user
data for market research and further commercial use. While such use may improve sharing platforms’ offerings and support their economic survival, it may also trigger users’ privacy concerns.
Recent media attention towards platforms’ questionable use of private data has increased the
awareness and sensitivity of users. With this growing user awareness of privacy and platforms’ use
of data, there might be a trade-off for platforms’ privacy management: On the one hand, the use
of private data can improve algorithms, offerings, and platforms’ economic survival. On the other
hand, this extended use of user data may trigger a loss of trust with detrimental effects for platform organizations. Similar to users, organization platforms will need to balance the trade-off between use of user data, within the limitations of legal possibilities, and the social approval of this
use by providers and customers.
Implications for providers
Whatever the motivations that lead providers to share their goods and services on sharing platforms, privacy is a crucial factor that users will, more or less consciously, take into account when deciding to participate in the sharing economy. The privacy calculus model, originally developed for SNSs, provides a viable cognitive model for understanding the trade-offs users evaluate when participating in the sharing economy. For providers especially, these trade-offs involve not only data privacy but also the physical privacy that is compromised when houses and private possessions (such as cars or other objects) are shared with strangers. This extends privacy concerns beyond fears about the use of one's data (e.g., credit card information, financials) to social concerns, such as social identity and boundary management. The profitability of participation in the sharing economy is also likely to shape how providers perceive their own privacy, an aspect that should not be forgotten when addressing providers' information-sharing practices and the value of their data. Additionally, when providers participate in the sharing economy, they do not act only as economic actors: they also interact socially through the creation of an identity and are thus exposed to social judgments related to status, reputation, and stigma. Platform organizations should address this by offering providers an ample choice of degrees of data sharing, so as to shield providers from the risks that might be associated with their exposure.
Implications for consumers
Consumers participate in the sharing economy for various reasons and motives. As for providers, the privacy calculus model helps to understand how the benefits gained from participation in the sharing economy can outweigh fears of privacy exposure. Privacy concerns include not only the use of data but also social relations and approval, such as status, reputation, and stigma. These different privacy concerns may need further elaboration, as they seem to be of different qualities. On the one hand, concerns relate to the use and misuse of personal data, such as credit card information, addresses, and travel patterns. These concerns are mostly associated with platform organizations and their use of this kind of data; consumers' awareness of and trust in platform organizations are hence crucial, while the use and misuse of data also become a matter of regulation and corporate ethics. On the other hand, privacy concerns of a social nature are more associated with the community around a platform, other consumers, and providers. Users care about the degree to which they are able to control and manage boundaries, as well as how they are seen, portrayed, and evaluated. These concerns are less a matter of trust, regulation, and ethics related to corporate conduct, and more a matter of affordances that enable users to control their social relations and identities.
8. References
Altman, I. (1975). The environment and social behavior: privacy, personal space, territory, crowding.
Monterey, CA: Brooks/Cole.
Andreotti, A., Anselmi, G., Eichhorn, T., Hoffmann, C. P., & Micheli, M. (2017). Participation in the
sharing economy. Report for the EU Horizon 2020 project Ps2Share: Participation, Privacy, and
Power in the Sharing Economy. Retrieved from www.ps2share.eu/documentation
Bardhi, F., & Eckhardt, G. M. (2012). Access-based consumption: The case of car sharing. Journal of
Consumer Research, 39(4), 881-898.
Barnes, S. B. (2006). A privacy paradox: Social networking in the United States. First Monday, 11(9).
Belanger, F., Hiller, J. S., & Smith, W. J. (2002). Trustworthiness in electronic commerce: the role of
privacy, security, and site attributes. The Journal of Strategic Information Systems, 11(3-4), 245–
270.
Belk, R. W. (1988). Possessions and the extended self. Journal of Consumer Research, 15(2), 139-168.
Belk, R. W. (1991). Possessions and the sense of past. SV-Highways and Buyways: Naturalistic Research from the Consumer Behavior Odyssey.
Belk, R. W. (2013). Extended self in a digital world. Journal of Consumer Research, 40(3), 477-500.
Ben-Ze’ev, A. (2003). Privacy, emotional closeness, and openness in cyberspace. Computers in Human Behavior, 19(4), 451-467.
Bhattacherjee, A. (2002). Individual trust in online firms: Scale development and initial test. Journal
of Management Information Systems, 19(1), 211-241.
Bialski, P. (2012a). Becoming intimately mobile. Frankfurt, Germany: Peter Lang.
Bialski, P. (2012b). Technologies of hospitality: How planned encounters develop between
strangers. Hospitality & Society, 1(3), 245–260.
Buchberger, S. (2012). Hospitality, secrecy and gossip in Morocco: Hosting CouchSurfers against
great odds. Hospitality & Society, 1(3), 299–315.
Bucher, E., Fieseler, C., & Lutz, C. (2016). What's mine is yours (for a nominal fee)–Exploring the
spectrum of utilitarian to altruistic motives for Internet-mediated sharing. Computers in Human
Behavior, 62, 316-326.
Büttner, O. B., & Göritz, A. S. (2008). Perceived trustworthiness of online shops. Journal of Consumer Behaviour, 7(1), 35–50.
Cabral, L., & Hortacsu, A. (2010). The dynamics of seller reputation: Evidence from eBay. The Journal of Industrial Economics, 58(1), 54-78.
Chellappa, R. K., & Sin, R. G. (2005). Personalization versus privacy: An empirical examination of the
online consumer’s dilemma. Information Technology and Management, 6(2-3), 181-202.
Dienlin, T., & Metzger, M. J. (2016). An extended privacy calculus model for SNSs: Analyzing self-disclosure and self-withdrawal in a representative US sample. Journal of Computer-Mediated Communication, 21(5), 368-383.
Dienlin, T., & Trepte, S. (2015). Is the privacy paradox a relic of the past? An in-depth analysis of
privacy attitudes and privacy behaviors. European Journal of Social Psychology, 45(3), 285-297.
Dinev, T., & Hart, P. (2006). An extended privacy calculus model for e-commerce transactions. Information Systems Research, 17(1), 61-80.
Drenten, J. (2012). Snapshots of the self. Online consumer behavior: Theory and research in social
media, advertising, and e-tail, 3-34.
Dwyer, C., Hiltz, S., & Passerini, K. (2007). Trust and privacy concern within social networking sites: A
comparison of Facebook and MySpace. AMCIS 2007 proceedings, 339.
Eastlick, M. A., Lotz, S. L., & Warrington, P. (2006). Understanding online B-to-C relationships: An
integrated model of privacy concerns, trust, and commitment. Journal of Business Research, 59(8), 877-886.
Edelman, B.G., Luca, M., & Svirsky, D. (2017). Racial discrimination in the sharing economy: Evidence from a field experiment. American Economic Journal: Applied Economics, 9(2), 1-22.
Egelman, S. (2013). My profile is my password, verify me!: The privacy/convenience tradeoff of Facebook Connect. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 2369-2378). New York, NY: ACM.
Ellison, N. B., Vitak, J., Steinfield, C., Gray, R., & Lampe, C. (2011). Negotiating privacy concerns and
social capital needs in a social media environment. In S. Trepte, & L. Reinecke (Eds.), Privacy Online (pp. 19-32). Berlin/Heidelberg: Springer.
Ert, E., Fleischer, A., & Magen, N. (2016). Trust and reputation in the sharing economy: The role of
personal photos in Airbnb. Tourism Management, 55, 62-73.
Fagerstrøm, A., Pawar, S., Sigurdsson, V., Foxall, G. R., & Yani-de-Soriano, M. (2017). That personal
profile image might jeopardize your rental opportunity! On the relative impact of the seller’s facial expressions upon buying behavior on Airbnb™. Computers in Human Behavior, 72, 123-131.
Fang, Y. H., & Chiu, C. M. (2010). In justice we trust: Exploring knowledge-sharing continuance intentions in virtual communities of practice. Computers in Human Behavior, 26(2), 235-246.
Festila, M., & Müller, S. (2017). The impact of technology-mediated consumption on identity: The
case of Airbnb. In Proceedings of the 50th Hawaii International Conference on System Sciences
(pp. 54-63). New York, NY: IEEE.
Goffman, E. (1959). The moral career of the mental patient. Psychiatry, 22(2), 123-142.
Guo, S., & Chen, K. (2012). Mining privacy settings to find optimal privacy-utility tradeoffs for social
network services. In Privacy, Security, Risk and Trust (PASSAT), 2012 International Conference on
and 2012 International Conference on Social Computing (SocialCom) (pp. 656-665). New York, NY:
IEEE.
Hamari, J., Sjöklint, M., & Ukkonen, A. (2016). The sharing economy: Why people participate in collaborative consumption. Journal of the Association for Information Science and Technology, 67(9),
2047-2059.
Hellwig, K., Morhart, F., Girardin, F., & Hauser, M. (2015). Exploring different types of sharing: A
proposed segmentation of the market for “sharing” businesses. Psychology & Marketing, 32(9),
891-906.
Hui, K. L., Teo, H. H., & Lee, S. Y. T. (2007). The value of privacy assurance: An exploratory field experiment. MIS Quarterly, 31(3), 19-33.
Ikkala, T., & Lampinen, A. (2015). Monetizing network hospitality: Hospitality and sociability in the
context of Airbnb. In Proceedings of the 18th ACM Conference on Computer Supported Cooperative Work & Social Computing (pp. 1033-1044). New York, NY: ACM.
Jarvenpaa, S. L., Tractinsky, N., & Vitale, M. (2000). Consumer trust in an Internet store. Information
Technology and Management, 1(1-2), 45–71.
John, N. A. (2013). The social logics of sharing. The Communication Review, 16(3), 113-131.
Krause, A., & Horvitz, E. (2008). A Utility-Theoretic Approach to Privacy and Personalization. In Proceedings of the Twenty-Third AAAI Conference on Artificial Intelligence (pp. 1181-1188). Menlo
Park, CA: AAAI.
Lampinen, A. (2015). Networked privacy beyond the individual: Four perspectives to 'sharing'. In
Proceedings of The Fifth Decennial Aarhus Conference on Critical Alternatives (pp. 25-28). Aarhus,
DK: Aarhus University Press.
Lampinen, A. (2016). Hosting together via Couchsurfing: Privacy management in the context of
network hospitality. International Journal of Communication, 10, 1581-1600.
Lampinen, A., & Cheshire, C. (2016). Hosting via Airbnb: Motivations and financial assurances in
monetized network hospitality. In Proceedings of the 2016 CHI Conference on Human Factors in
Computing Systems (pp. 1669-1680). New York, NY: ACM.
Lauterbach, D., Truong, H., Shah, T., & Adamic, L. (2009). Surfing a web of trust: Reputation and
reciprocity on couchsurfing.com. In CSE'09: International Conference on Computational Science
and Engineering (pp. 346-353). New York, NY: IEEE.
Lee-Won, R. J., Shim, M., Joo, Y. K., & Park, S. G. (2014). Who puts the best “face” forward on Facebook? Positive self-presentation in online social networking and the role of self-consciousness,
actual-to-total Friends ratio, and culture. Computers in Human Behavior, 39, 413-423.
Li, H., Sarathy, R., & Xu, H. (2010). Understanding situational online information disclosure as a privacy calculus. Journal of Computer Information Systems, 51(1), 62-71.
Li, T., & Li, N. (2009). On the tradeoff between privacy and utility in data publishing. In Proceedings
of the 15th ACM SIGKDD international conference on Knowledge discovery and data mining (pp.
517-526). New York, NY: ACM.
Liu, C., Marchewka, J. T., Lu, J., & Yu, C. S. (2005). Beyond concern—a privacy-trust-behavioral intention model of electronic commerce. Information & Management, 42(2), 289-304.
Liu, S. Q., & Mattila, A. S. (2017). Airbnb: Online targeted advertising, sense of power, and consumer
decisions. International Journal of Hospitality Management, 60, 33-41.
Ma, X., Hancock, J. T., Mingjie, K. L., & Naaman, M. (2017). Self-Disclosure and perceived trustworthiness of Airbnb host profiles. In Proceedings of the 2017 ACM Conference on Computer Supported Cooperative Work and Social Computing (pp. 2397-2409). New York, NY: ACM.
Marr, B. (2016). The sharing economy - What it is, examples, and how big data, platforms and algorithms fuel it. Forbes Online, 21 October. Retrieved from https://www.forbes.com/sites/bernardmarr/2016/10/21/the-sharing-economy-what-it-is-examples-and-how-big-data-platforms-and-algorithms-fuel/
McCole, P., Ramsey, E., & Williams, J. (2010). Trust considerations on attitudes towards online purchasing: The moderating effect of privacy and security concerns. Journal of Business Research, 63(9), 1018-1024.
McKnight, D. H., Choudhury, V., & Kacmar, C. (2002). Developing and validating trust measures for
e-commerce: An integrative typology. Information Systems Research, 13(3), 334-359.
McNamara, B. (2015). Airbnb: A not-so-safe resting place. Journal on Telecommunications & High
Technology Law, 13(1), 149-170.
Molz, J. (2012a). CouchSurfing and network hospitality: “It’s not just about the furniture.” Hospitality & Society, 1(3), 215–225.
Molz, J. G. (2012b). Travel connections: Tourism, technology, and togetherness in a mobile world.
London, UK: Routledge.
Molz, J. G. (2014). Toward a network hospitality. First Monday, 19(3). Retrieved from http://ojsprod-lib.cc.uic.edu/ojs/index.php/fm/article/view/4824
Morosan, C., & DeFranco, A. (2015). Disclosing personal information via hotel apps: A privacy calculus perspective. International Journal of Hospitality Management, 47, 120-130.
Newlands, G., Lutz, C., & Fieseler, C. (2017). Power in the sharing economy. Report for the EU Horizon 2020 project Ps2Share: Participation, Privacy, and Power in the Sharing Economy. Retrieved
from www.ps2share.eu/documentation
Park, J. H., Gu, B., Leung, A. C. M., & Konana, P. (2014). An investigation of information sharing and
seeking behaviors in online investment communities. Computers in Human Behavior, 31, 1-12.
Pearson, S. (2013). Privacy, security and trust in cloud computing. In Privacy and Security for Cloud
Computing (pp. 3-42). Berlin/Heidelberg: Springer.
Peterson, K., & Siek, K. A. (2009). Analysis of information disclosure on a social networking site. In
International Conference on Online Communities and Social Computing (pp. 256-264). Berlin/Heidelberg: Springer.
Prabhakar, S., Pankanti, S., & Jain, A. K. (2003). Biometric recognition: Security and privacy concerns. IEEE Security & Privacy, 99(2), 33-42.
Raya, M., Shokri, R., & Hubaux, J. P. (2010). On the tradeoff between trust and privacy in wireless ad hoc networks. In Proceedings of the third ACM conference on Wireless network security (pp. 75-80). New York, NY: ACM.
Rosen, D., Lafontaine, P. R., & Hendrickson, B. (2011). CouchSurfing: Belonging and trust in a globally cooperative online social network. New Media & Society, 13(6), 981-998.
Rousseau, D. M., Sitkin, S. B., Burt, R. S., & Camerer, C. (1998). Not so different after all: A cross-discipline view of trust. Academy of Management Review, 23(3), 393-404.
Schwarz, O. (2010). On friendship, boobs and the logic of the catalogue: Online self-portraits as a
means for the exchange of capital. Convergence, 16(2), 163-183.
Sheldon, P. (2009). "I'll poke you. You'll poke me!" Self-disclosure, social attraction, predictability
and trust as important predictors of Facebook relationships. Cyberpsychology: Journal of Psychosocial Research on Cyberspace, 3(2). Retrieved from
https://journals.muni.cz/cyberpsychology/article/view/4225
Solove, D. J. (2008). Understanding privacy. Cambridge, MA: Harvard University Press.
Solove, D. J. (2011). Nothing to hide: The false tradeoff between privacy and security. New Haven,
CT: Yale University Press.
Stutzman, F., Vitak, J., Ellison, N., Gray, R., & Lampe, C. (2012). Privacy in interaction: Exploring disclosure and social capital in Facebook. In Proceedings of the 6th annual International Conference
on Weblogs and Social Media (ICWSM ’12). Menlo Park, CA: AAAI Press.
Sun, Y., Wang, N., Shen, X. L., & Zhang, J. X. (2015). Location information disclosure in location-based social network services: Privacy calculus, benefit structure, and gender differences. Computers in Human Behavior, 52, 278-292.
Tan, J. E. (2010). The leap of faith from online to offline: An exploratory study of Couchsurfing.org.
In International Conference on Trust and Trustworthy Computing (pp. 367-380). Berlin/Heidelberg: Springer.
Teng, C. Y., Lauterbach, D., & Adamic, L. A. (2010). I rate you. You rate me. Should we do so publicly? In Proceedings of the 3rd Conference on Online Social Networks (pp. 12-12). Berkeley, CA:
USENIX Association.
Teubner, T., & Flath, C. M. (2016). Privacy in the sharing economy. Working Paper.
Trepte, S., & Reinecke, L. (2011). The social web as a shelter for privacy and authentic living. In S.
Trepte, & L. Reinecke (Eds.), Privacy Online (pp. 61-73). Berlin/Heidelberg: Springer.
Tussyadiah, I. P. (2016). The influence of innovativeness on on-site smartphone use among American travelers: Implications for context-based push marketing. Journal of Travel & Tourism Marketing, 33(6), 806-823.
Utz, S. (2015). The function of self-disclosure on social network sites: Not only intimate, but also
positive and entertaining self-disclosures increase the feeling of connection. Computers in Human Behavior, 45, 1-10.
Utz, S., & Krämer, N. C. (2009). The privacy paradox on social network sites revisited: The role of
individual characteristics and group norms. Cyberpsychology: Journal of Psychosocial Research on
Cyberspace, 3(2). Retrieved from https://journals.muni.cz/cyberpsychology/article/view/4223
Utz, S., Kerkhof, P., & van den Bos, J. (2012). Consumers rule: How consumer reviews influence
perceived trustworthiness of online stores. Electronic Commerce Research and Applications, 11(1), 49-58.
Van Dijck, J. (2008). Digital photography: communication, identity, memory. Visual Communication,
7(1), 57-76.
Warren, S. D., & Brandeis, L. D. (1890). The right to privacy. Harvard Law Review, 4(5), 193-220.
Wasko, M. M., & Faraj, S. (2005). Why should I share? Examining social capital and knowledge contribution in electronic networks of practice. MIS Quarterly, 29(1), 35-57.
Yang, S., Quan-Haase, A., Nevin, A. D., & Chen, Y. (2017). The role of online reputation management, trolling, and personality traits in the crafting of the virtual self on social media. In L. Sloan, & A. Quan-Haase (Eds.), The SAGE Handbook of Social Media Research Methods (pp. 74-89). London, UK: Sage.
Young, A. L., & Quan-Haase, A. (2013). Privacy protection strategies on Facebook: The Internet privacy paradox revisited. Information, Communication & Society, 16(4), 479-500.
Zhao, S., Grasmuck, S., & Martin, J. (2008). Identity construction on Facebook: Digital empowerment in anchored relationships. Computers in Human Behavior, 24(5), 1816-1836.
Zuev, D. (2012). CouchSurfing as a spatial practice: Accessing and producing xenotopos. Hospitality & Society, 1(3), 227-244.