
The Drive for Privacy and the Difficulty of Achieving It in the Digital Age

Technologies, interfaces, and market forces can all influence human behavior. But probably, and hopefully, they cannot alter human nature. If privacy is both culturally specific and culturally universal, chances are that people's quest for privacy will not dissipate.

Published 2 August 2021

Alessandro Acquisti

Professor of Information Technology and Public Policy at the Heinz College, Carnegie Mellon University

Laura Brandimarte

Assistant Professor of Management Information Systems, University of Arizona

George Loewenstein

Herbert A. Simon University Professor of Economics and Psychology, Carnegie Mellon University


Our goal in this piece is to review different streams of social science literature on privacy to understand consumer privacy decision making and derive implications for policy. Specifically, we examine both psychological and economic factors influencing consumers’ desire and ability to protect their privacy.

Do consumers in fact care about, and act to protect, their privacy? The evidence for elevated privacy concerns in Western societies is ample and enduring. In the case of the United States, it can be found in Alan Westin's seminal surveys from the last century, in which an overwhelming majority of Americans reported feeling troubled by potential misuses of their information (Westin, 2001). And it can be found in recent polls showing that a majority of Americans are concerned about data harvesting by corporations (Pew, 2019). Yet despite robust evidence of ongoing concern about privacy, a common thread in the privacy literature highlights the claim that, although people say that they care, they do not act accordingly—as supposedly proven by seemingly careless online disclosures (Johnson, 2010; Miller, 2014). Privacy is no longer a social norm, the narrative goes, or, in fact, privacy is dead (Sprenger, 1999).

We disagree. Surveys, field studies, and experiments—as well as common sense—show that consumers engage in privacy-regulating behaviors continually, in both online and offline scenarios, spanning the many diverse dimensions and definitions of privacy, from spatial to informational. The drive for privacy is under-appreciated in part because individual actions to protect privacy are so ubiquitous and second nature that they go unnoticed, or are not construed as privacy behaviors.


The Evidence for Privacy-Seeking Behavior

In our everyday lives in the offline world, we instinctively and continually engage in privacy behaviors without even thinking: lowering our voice during intimate conversations; leaving a group of people to take a personal call; tilting a document we are reading so it is protected from prying eyes; and drawing curtains to ensure privacy in our bedrooms. The American social psychologist Irwin Altman (1975) argued that privacy behaviors are so ubiquitous and common that they occur with little conscious awareness. Writing decades before the Internet age—and so focusing on personal space rather than information—he noted that protection of personal space is instinctive and universal, across time and geography: Transgressions of personal space invoke a variety of reactions, including a shifting of gaze, breaking eye contact, turning the body, or adopting protective postures (Altman, 1977).

Altman’s insights apply also to how people interact on the Internet. Online, too, we engage continuously in behaviors that delimit the contours of our closeness or openness to others. Multiple times per day, we alternate between different email accounts or online personae to separate personal from professional spheres; pick privacy settings to manage the visibility of our social media posts; reply privately to group messages, carefully selecting recipients for our responses; enter (or rely on previously stored) passwords to keep information in our online accounts private; set “I am busy!” notices on instant messaging profiles to tell people not to contact us, right now; and turn on and off camera or audio on conference calls, depending on what we want to (or must) show to others. The precise motivations behind these behaviors are diverse; their common trait is the individual’s attempt to regulate the boundaries of their interactions with others.

Observations of online privacy-seeking behaviors go well beyond the anecdotal. Here are just a few of many examples coming from contemporary scholars and their empirical research.

Consider a seminal survey by Pew (2013): It found that an overwhelming majority (86%) of surveyed US adults reported having taken steps online to remove or mask their digital footprints. The steps were diverse, from less to more sophisticated, including clearing cookies, encrypting email, avoiding using their name, and using virtual private networks.

Self-reports do not always reflect actual behavior. Yet, observational field studies of consumer choice also provide evidence of privacy-seeking behavior. The behaviors encompass the many diverse scenarios in which privacy (and privacy invasions) plays a role in consumers’ lives: telemarketing annoyance, government surveillance, online tracking, social media intrusions, and so on.

For instance, by 2007, after the Federal Trade Commission had opened the National Do Not Call Registry (a database with the telephone numbers of individuals who do not want telemarketers to contact them), 72% of Americans had registered on the list (Bush, 2009). And following the revelations Edward Snowden made in 2013 regarding secret government surveillance programs, US consumers became less likely to read Wikipedia articles perceived as privacy sensitive (Penney, 2016). Or consider the following: Facebook users who were members of the Carnegie Mellon University network in 2005 progressively transitioned toward less public sharing of personal information over time: While 86% of those users were publicly sharing their date of birth in 2005, the percentage of them doing so decreased year by year, down to 22% in 2009 (Stutzman et al., 2013). Additionally, while only a minuscule proportion of Facebook users on that same Facebook network had altered their (highly visible) default search and visibility settings in 2005 (Gross & Acquisti, 2005), just about 7% of Facebook users studied by Fiesler et al. (2017) had not changed their default privacy settings by 2017. While the two samples are different, and self-selection cannot be excluded, the disparity in choices over time is stark.

Evidence of privacy-seeking behavior arises from laboratory and field experiments as well. Many of these experiments focus on trade-offs between data privacy and monetary rewards. For instance, a field experiment testing for a gap between willingness to pay and willingness to accept for privacy showed that over 50% of participants were not willing to exchange a $10 anonymous gift card for a $12 trackable one—essentially refusing a 20% increase in rewards to give away information on their purchasing decisions (Acquisti, John, & Loewenstein, 2013).

Acknowledging Complexities, and Reconciling the Evidence

Although we have deliberately highlighted evidence of privacy-seeking behaviors to counter a prevailing narrative of disappearing concerns, there exists ample evidence of people not bothering to protect information, or engaging publicly in behaviors that only a short time ago were considered highly private. There is evidence of consumers being unwilling to pay for data protection (Beresford et al., 2012; Preibusch et al., 2013), and choosing to give up personal data in exchange for small conveniences and rewards (Athey et al., 2017). Although by now it is well known that mobile apps collect and share sensitive information with third parties (Almuhimedi et al., 2015), the number of app downloads increases every year (Statista, 2016). Major data breaches have become common (Fiegerman, 2017), yet most people seem willing to trust companies with personal information in exchange for relatively small benefits (Ghose, 2017). And a plethora of widespread, readily observable, everyday online behaviors seem to bespeak an overall lack of concern.

Evidence of disclosure-seeking behavior on its own, however, does not contradict the argument that consumers care for privacy. First, the work of Altman (1975, 1977) serves as an antidote to simple notions of privacy as a static condition of withdrawal, protection, or concealment of information (e.g., Posner, 1978). Altman construed privacy as a dialectic and dynamic process of boundary regulation. Privacy regulation encompasses both the opening and the closing of the self to others. By balancing and alternating the two, individuals manage interpersonal boundaries to achieve desired levels of privacy—an optimal amount of social interaction, or an optimal balance of both disclosing and protecting personal information.

Privacy regulation is thus dynamic—a process highly dependent on and responsive to changes in context and a process that applies equally to the many diverse dimensions of privacy the literature has explored (a diversity this manuscript embraces, as evidenced by the disparate consumer scenarios it covers). Consistent with this account (see also Petronio 2002), the seemingly contrasting examples of protection- and disclosure-seeking behaviors illustrate how, while we manage privacy all the time, we do not continuously protect our data. It would be undesirable (and unfeasible) for any individual to do so.

Second, the evidence for seemingly privacy-neglecting behaviors highlights a deeper issue: Privacy is extraordinarily difficult to manage, or regulate, in the Internet age. Consider, again, some of the examples of protective behaviors presented earlier in this section; each has a second side. Although Facebook users on the Carnegie Mellon University network did become less likely to share their personal information with strangers between 2005 and 2009, changes in the Facebook interface in late 2009 and early 2010, affecting the visibility of certain profile fields, abruptly reversed that protective trend, making public again, for a significant portion of users, personal information those users had attempted to keep private (Stutzman et al., 2013). Likewise, although a DuckDuckGo (2017) survey did find that a substantial proportion of Internet users had tried to use private browsing, it also found that two-thirds of those users misunderstood (in fact, overestimated) the degree of protection that private browsing provided. And, while an Instamotor survey found that 89% of respondents had taken at least one step to protect data, the percentage of respondents taking all of those steps was exceedingly small; in fact, some of the more protective steps (such as using VPNs) were adopted by a small minority (Lewis, 2017). Not coincidentally, those steps were also the ones less familiar to average users, and costlier to adopt.

This more nuanced look at the evidence suggests that claims concerning privacy being “dead” too often confuse wants with opportunities—what people want and what they can actually achieve. The desire for privacy may well be virtually universal; consumers, offline as well as online, continually attempt to regulate boundaries between public and private. Yet, the opportunities and ability to do so effectively—to achieve desired levels of privacy—may be shrinking. As the above examples suggest, the reasons are both psychological and economic.

Can Individuals Effectively Manage Privacy Online? Behavioral Hurdles

Economists use the term “revealed preferences” to refer to how consumers’ true valuations can be revealed by their behavior, such as their choices in the marketplace. Applied to the privacy domain, a revealed preference argument would posit that consumers protect and share precisely what they desire—from disclosing personal information on social media to covering online footprints using privacy-enhancing technologies. The choices they make, according to this perspective, should be optimal for them personally and for society as a whole: If privacy behaviors express true preference for privacy, market outcomes based on consumers’ choices would, in the absence of externalities, maximize aggregate utility.
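To make this logic concrete, the revealed-preference claim can be written in a stylized form (our notation, offered only as an illustrative sketch rather than a standard model from the privacy literature):

$$
d_i^{*} \;=\; \arg\max_{d \in [0,1]} \; \big[\, b_i(d) - c_i(d) \,\big],
$$

where $d$ is consumer $i$'s level of disclosure, $b_i(d)$ the benefits of sharing (convenience, personalization, social rewards), and $c_i(d)$ the privacy costs. The revealed-preference view equates observed disclosure with $d_i^{*}$, so that, absent externalities, market outcomes maximize aggregate utility $\sum_i [\, b_i(d_i^{*}) - c_i(d_i^{*}) \,]$. The rest of this article questions both identifications: observed behavior may diverge from $d_i^{*}$ (the behavioral hurdles below), and the market may not even offer options close to $d_i^{*}$ (the supply-side factors that follow).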

On the surface, such reasoning appears consistent with the Altmanian notion of privacy as a process of boundary regulation, according to which the individual deliberately chooses when and what to protect or to share. In reality, regulating one’s privacy is aspirational: In Altman’s terms, desired privacy may not be matched by achieved privacy, and market behaviors may not always necessarily capture underlying preferences for privacy. We focus, in this section, on some psychological factors causing the discrepancy.

The privacy literature has increasingly drawn from research in psychology and behavioral economics to provide empirical evidence of numerous processes affecting, and sometimes impairing, privacy-related decision making (Margulis, 2003). These factors range from privacy “calculus” to emotions; from asymmetric information to bounded rationality; and from resignation and learned helplessness to cognitive and behavioral biases (Acquisti, Brandimarte, & Loewenstein, 2015). Over the past two decades, much behavioral research has examined the psychological factors that affect privacy decision making. The lesson from this literature is that “non-normative factors” (factors that arguably do not affect the true costs and benefits of sharing) can easily sway observed privacy choices.

Some of these factors are summarized in Table 1. Together, they help explain when privacy-related behaviors capture actual preferences, and when they may not.

Table 1: Some psychological factors affecting privacy decision making

| Psychological factor | Description | Representative consequence | Firms' response |
| --- | --- | --- | --- |
| Information asymmetries | Consumers are unaware of the diverse ways firms collect and use their data | Consumers cannot respond to risks they are unaware of | Increases firms' ability to collect and use consumer information |
| Bounded rationality | Consumers lack the processing capacity to make sense of the complexities of the information environment | Few read, or even could make sense of, privacy policies | Writing policies using sophisticated, legalistic terms that obscure the central issues |
| Present bias | Overemphasizing immediate, and under-weighing delayed, costs and benefits | Consumers will incur long-term costs (e.g., intrusive profiling and advertising) in exchange for small immediate benefits (e.g., online discounts) | Offering small benefits in exchange for consumer data sharing |
| Intangibility | Putting little weight on outcomes that are intangible (difficult to isolate or quantify) | Consequences of privacy violations are often diffuse and difficult to connect with specific actions | Making it difficult for consumers to draw connections between specific acts of data sharing and specific privacy violations (e.g., price discrimination) |
| Constructed preferences | Uncertainty about one's preferences leads people to rely on crude decision heuristics that often run counter to considerations of objective costs and benefits | Sticking with default privacy settings | Setting defaults that are advantageous to the firm rather than to the consumer |
| Illusory control | The feeling (often illusory) that one is in control of a situation leads to increased risk-taking | Consumers share more when given more granular control over privacy settings | Providing consumers with more granular privacy controls to encourage disclosure |
| Herding | The tendency to imitate the behavior of other people | Consumers share more when they see others sharing more on social media | Providing social media feeds that convey a maximal sense of others' sharing |
| Adaptation | The tendency to get used to risks that are unchanged over time or that increase gradually | Despite ever-increasing violations of privacy, consumers adapt to them and accept them | Changing data usage practices gradually |
| The drive to share | The powerful drive to share information, including personal information | Sharing of highly private, or even incriminating, information (e.g., on social media) | Working behavioral levers that elicit the motive to share (e.g., recommending photographs to share) |

The Supply Side of Privacy

Psychological factors affect, and to some degree distort, observed market demand for privacy. Economic factors affect its supply.

A supply of privacy does exist in the market. Techniques and protocols have been developed to protect data in nearly every imaginable online activity—from email to instant messaging, and from online advertising to online social media services. Some of those tools have been incorporated into products now available to consumers, such as VPN services, user-friendly encrypted email, nontracking search engines and maps, anonymous browsers, and secure messaging platforms. Some major technology companies have started trying to leverage their privacy features as a source of competitive advantage (Panzarino, 2019). And most online services offer, to varying degrees, privacy controls such as opt-outs, visibility settings, and so forth.

At the same time, living in the modern world means being subject to continuous and ubiquitous tracking. Whether we are aware of it or not, both our online and offline behaviors are constantly tracked: by surveillance cameras (Satariano, 2019), face-recognition technologies (Feng, 2019), rental cars activating GPS tracking by default (Mapon, 2017), multitudes of apps that share personal data from our phones with an opaque ecosystem of intermediaries (Almuhimedi et al., 2015), and trackers used by companies to learn and predict our browsing behavior (Federal Trade Commission, 2016). Even the boundaries between our online and offline existences are eroding: Statistical tools are used to match sets of data belonging to the same individual across domains and services, endangering the very notion of anonymity and permitting personal information to be inferred from nonsensitive public data (Narayanan & Shmatikov, 2008).
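The risk of such cross-domain matching is easy to underestimate. The sketch below (ours, purely illustrative; the attack in Narayanan & Shmatikov, 2008, relies on statistical similarity over sparse data rather than exact joins, and all names and fields here are hypothetical) shows how a handful of individually innocuous attributes can re-identify a record in a "de-identified" dataset:

```python
# Minimal illustration: linking "anonymous" records to identified ones
# through quasi-identifiers. Each attribute is rarely unique on its own;
# their combination often is.

anonymous_records = [  # hypothetical "de-identified" dataset
    {"zip": "15213", "birth_year": 1984, "gender": "F", "purchases": 42},
]
public_profiles = [  # hypothetical identified dataset (e.g., a public registry)
    {"zip": "15213", "birth_year": 1984, "gender": "F", "name": "Alice"},
]

def quasi_id(record):
    """Combine a few attributes into a linkage key."""
    return (record["zip"], record["birth_year"], record["gender"])

# Index the identified dataset by quasi-identifier, then look up each
# supposedly anonymous record.
index = {quasi_id(p): p["name"] for p in public_profiles}
for record in anonymous_records:
    name = index.get(quasi_id(record))
    if name:
        print(f"Re-identified as {name}: {record['purchases']} purchases")
```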

The forces that drive this relentless encroachment of surveillance into every facet of our lives are in part behavioral (our increasing adaptation and habituation to tracking) but in greater part economic: The reduction in the costs of both surveillance and data storage, as well as the increasing value of (and success in) monetizing personal data, increases firms’ demand for tracking, thus driving down the supply of privacy. Such an outcome had been predicted by Hirshleifer (1978), who showed how private firms faced incentives to overinvest in gathering information: The resources used to acquire and disseminate that information would be wasteful from a societal point of view.

The rise of surveillance as an economic model (Zuboff, 2015) has implications other than firms’ overinvestments. First, markets with significant information asymmetries (such as, surely, the market for privacy) can lead to economic inefficiency and even market failures (Akerlof, 1970). Furthermore, market forces lose, in part, their ability to restrain firms’ data usage practices if network effects (especially powerful in two-sided market platforms such as search engines, advertising networks, and social media) lead to quasi-monopoly incumbents.

While the debate around competition, antitrust, and data regulation is nuanced and evolving, reduced competition and network effects reinforce each other. They enable incumbents to accumulate more data and user attention (say, from search behavior), then to enter markets for adjacent services (say, navigation maps), which in turn allows more data collection and more user attention, making it possible for the incumbent to supply even more services than any entrant or competitor could. Increased data collection—in terms of both an increasing share of a consumer market and increasingly detailed inferences about each consumer—can thus lead to better, more valuable services, but also to fewer available outside options for privacy-seeking consumers.

Data-based network effects also tend to create powerful lock-ins into current products—such as online social networks, whose value resides precisely in the continued engagement of a growing user base—strengthening incumbents. This creates what has been referred to as a privacy externality (Acquisti, Taylor, & Wagman, 2016): Other people’s usage of privacy-intrusive services increases the cost, for privacy-conscious consumers, of not using them. At the extreme, the cost of choosing protective options becomes impractically large. Once no privacy becomes the default, or the social norm, privacy options risk disappearing from the market altogether.
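This externality admits a simple stylized formalization (again our notation, offered as a sketch): let $n$ be the number of adopters of a privacy-intrusive service, $b_i$ consumer $i$'s direct benefit from using it, $p_i$ the privacy cost of using it, and $c_i(n)$ the cost of abstaining (missed communications, social exclusion), increasing in $n$. Consumer $i$ joins whenever

$$
b_i + c_i(n) > p_i, \qquad c_i'(n) > 0 .
$$

As $n$ grows, $c_i(n)$ eventually dominates, so even consumers with a high privacy cost $p_i$ join; each new adoption, in turn, raises the abstention cost for everyone else.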

Making matters worse, consumers’ marginal costs of protection increase rapidly with the degree of protection sought. Because so much of what we do is now tracked, there is simply too much to learn about and to protect. Everyone can, with little effort, refrain from publishing their social security number on their Facebook profile. Using a VPN is more costly—in cognitive, usability, and economic terms. Attempting to hide most of one’s digital footprints from third-party monitoring is nearly incalculably demanding—and, in light of the continuously evolving technologies for surveillance, ultimately futile.

The Interaction of Economics and Psychology

While some firms may actively promote privacy, an almost surely greater fraction respond to, and to a great extent exploit, the psychological characteristics of consumers discussed above for their own ends. Before the term “dark patterns” started being popularized (Gray et al., 2018), behavioral research on privacy had highlighted how platform providers can leverage interface changes to influence privacy choices (Hartzog, 2010). Firms, for example, have a deep appreciation of the impact of defaults (Acquisti, John, & Loewenstein, 2013), and, joining a long tradition (e.g., of enrolling consumers in magazine subscriptions that continue unless proactively terminated), they set privacy defaults in a liberal fashion, banking on consumers’ laziness and inattention (see Table 1 for a summary of different ways that firms can take advantage of consumer psychology with respect to privacy).

Even transparency and control can be used to nudge consumers toward higher disclosures (Acquisti, Adjerid, & Brandimarte, 2013). Transparency and control are important components of privacy management. For instance, they reduce customers’ feelings of emotional violation and distrust in cases where personal data shared with firms are subject to vulnerabilities (Martin et al., 2017). But, due to the behavioral hurdles we have highlighted above, they are not sufficient conditions for privacy protection. In fact, economic factors such as network effects and lock-in can exacerbate behavioral hurdles, giving some firms more data, more user attention, more control, and ultimately more ability to influence consumer behavior. Ultimately, consumer “responsibilization” (Giesler & Veresiu, 2014) approaches to privacy, predicated on the so-called notice-and-consent regime—that is, on reliance on consumer “choice” (Solove, 2012)—have not made it affordable or even feasible for consumers to achieve desired levels of privacy.

Consider the historical evolution of online tracking, which started with browser “cookies.” As users started adopting cookie managers to prevent cross-site tracking, the online advertising industry developed so-called “flash” cookies to circumvent consumers’ defensive strategies. And, as users started acquiring tools to defend against this new form of tracking, the industry switched to even more intrusive—and harder to hide from—tracking strategies: device fingerprinting, deep packet inspection, and so on. The privacy-seeking consumer in the digital age cannot rely on the mutually shared social norms and intuitive behaviors Altman described, which worked in an offline world. He or she is a modern Sisyphus, constantly forced to learn new strategies, to little avail.
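Device fingerprinting illustrates why these later techniques are so hard to evade. The sketch below (ours, deliberately simplified; real trackers combine dozens of signals collected via JavaScript, and the fields here are hypothetical) shows the core idea: no identifier is stored on the device, so there is nothing for the user to delete:

```python
# Minimal illustration of device fingerprinting: hash a stable
# combination of device attributes into a pseudo-identifier that can
# be recomputed on every visit. Clearing cookies does not remove it.
import hashlib

def fingerprint(attributes: dict) -> str:
    """Derive a pseudo-ID from device attributes (order-independent)."""
    canonical = "|".join(f"{key}={attributes[key]}" for key in sorted(attributes))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

visitor = {  # hypothetical signals a tracking script might read
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64)",
    "screen": "2560x1440",
    "timezone": "Europe/Rome",
    "language": "en-US",
    "fonts": "Arial,Calibri,Helvetica",
}
print(fingerprint(visitor))  # same device, same ID, across sites
```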

Desired vs. Desirable Privacy

A valid counterpoint to the arguments we just considered is that desired levels of privacy, albeit unachievable, may in fact exceed what would actually be optimal for consumers. If, in the age of analytics, the collection and analysis of data can be a source of great technological and economic advancement, then the loss of privacy, far from being a threat to societal well-being, may be a necessary price for increased consumer and societal welfare. Thus, desirable amounts of privacy may be less than what consumers claim to want. They may, in fact, be precisely the levels that markets already produce.

The fact that great utility can be extracted from data is undeniable and not questioned here. Rather, we scrutinize from a number of angles the premise that market outcomes strike an optimal balance between data use and data protection. What does privacy economics actually tell us about the trade-offs associated with data sharing, and how are the benefits from data allocated?

Who Benefits from Consumer Data Collection?

Market equilibria may not be advantageous for consumer welfare in relative terms, if most of the benefits from the collection and usage of consumer data accrue to other stakeholders, such as data intermediaries that, thanks to a combination of market power and supremacy in surveillance technology, have nearly unchallenged control over the data.

Consider, for instance, online targeted advertising. According to industry insiders, collecting and using consumer data to target ads create economic win-wins for all parties involved—merchants, consumers, publishers, and intermediaries (AdExchanger, 2011).

In reality, on theoretical grounds, targeting can either increase or decrease the welfare of stakeholders other than the data intermediaries (Bergemann & Bonatti, 2011; De Corniere & De Nijs, 2016). And available empirical research tells us very little about how the benefits of targeted ads are allocated to stakeholders other than merchants and the intermediaries themselves. Data from ad networks suggest that opting out of behavioral targeting costs publishers and ad exchanges approximately $8.50 per consumer (Johnson et al., 2020). Yet, a 2019 Digiday poll of publisher executives found that for 45% of respondents, behavioral ad targeting had “not produced any notable benefit,” and 23% claimed it had “actually caused their ad revenues to decline” (Weiss, 2019). And what about consumers themselves? Do consumer search costs go down because of targeted ads? Are the prices they pay for advertised products on average higher, or lower, than those they would have paid for products found via search? What about the quality of those products? Much more research is needed in this area.

Noneconomic Ramifications: Privacy “Dark Matter”

Our arguments have, so far, only focused on economically quantifiable implications of privacy choices in specific contexts—such as targeted advertising. Two critical considerations arise when we try to look at a broader picture.

First, the costs of privacy across scenarios are arguably impossible to combine into a definitive, aggregate estimation of the “loss” caused by a generalized lack of privacy. And this is not because privacy costs are rare and few—but for the opposite reason: They are very common, but highly diverse in form, heterogeneous in likelihood, and varying in magnitude. They range from identity theft to price discrimination; from attention and time waste to psychological harm; from discrimination in targeted advertisement to filter bubbles; and from stigma to rare but catastrophic personal consequences (see, e.g., Acquisti, Taylor, & Wagman, 2016; Calo, 2011; Solove, 2005, 2007). Hence the aggregation and estimation problem. For instance, to the extent that data surreptitiously collected through privacy-intrusive apps can have an effect on a country’s election, how can we quantify (or even demonstrate) that impact, and its plethora of downstream ramifications?

Second, and relatedly, we have not even considered many of the most consequential ramifications of the loss of privacy. We call this economic “dark matter”: We know it is there, but cannot quantify it. The ramifications include the collective value of privacy (Regan, 1995); its role in preserving room for individual subjectivity (Cohen, 2010); and its role in protecting freedom (Westin, 1967), dignity (Schoeman, 1984), and fairness (Jagadish, 2016), or the very integrity of social life (Nissenbaum, 2009). If market outcomes respond primarily to economic incentives, market equilibria may not account for these intricate, indirect, less tangible, and yet arguably even more critical implications of data protection.

So, What Should Be Done?

If market outcomes are unlikely to produce not just the levels of privacy consumers desire, but also the levels that would be desirable for them, what—if anything—can be done to correct privacy imbalances? We consider in the last section of this article a number of strategies that have been proposed: nudges, data propertization schemes, privacy-enhancing technologies, and regulation.

Some of the behavioral and psychological hurdles we considered above may be countered, or ameliorated, through behavioral interventions that align privacy outcomes with ex ante preferences (Acquisti, 2009). Numerous privacy nudges have been explored in the literature, from changing social media default visibility settings to making the consequences of privacy choices more salient to users (Acquisti, Adjerid, Balebako, et al., 2017). Unfortunately, while nudges have proven somewhat effective in experiments and field trials (for instance, Zhang & Xu, 2016), it is unclear that localized behavioral interventions alone can correct the enormous imbalance consumers encounter online between their own ability to manage personal data and platform providers’ ability to collect it. By controlling user interfaces, the providers remain in control of the choice architecture.

Data propertization schemes have been proposed in the literature since the mid-1990s. Laudon (1996) proposed the creation of personal data markets, where consumers would trade rights with organizations over the collection and usage of their information, thereby “monetizing” their data. Over time, technological barriers to Laudon’s proposal have vanished. Data monetization startups have emerged, and politicians have incorporated data propertization or “data dividends” into their platforms (Daniels, 2019). While appealing on some levels (Arrieta-Ibarra et al., 2018), data propertization schemes face hurdles in practice (Acquisti, Taylor, & Wagman, 2016). One issue is whether consumers, who under such a scheme would face decisions about whom to sell their data to and for how much, are able to assign fair, reasonable valuations to their own data, considering the informational and behavioral hurdles we have highlighted in the previous sections. A second issue is that schemes that monetize privacy run the risk of exacerbating inequality, creating a world in which only the affluent can have privacy, or in which the already rich get more for their data than anyone else. Finally, considering the consequential noneconomic dimensions of privacy, some might find abhorrent the notion of putting a price on it, or question the propriety of allowing people to sell it, much as many question whether people should be allowed to sell their own organs for transplant.

Furthermore, market-based data propertization schemes suffer from a nearly insurmountable economic challenge. The most valuable data are contextual and dynamic; they are created in the interaction between incumbent service providers and consumers; and—absent regulatory intervention establishing baseline protections—those providers are unlikely to relinquish ownership of those data to others. Hence, data propertization schemes may simply add themselves to an ecosystem of widespread surveillance, rather than replace it.

Because they allow both data protection and data analytics, so-called Privacy-Enhancing Technologies (or PETs) offer significant potential individual and societal benefits. Yet, many consumers are unlikely to take advantage of them, due to unawareness of PETs’ existence, distrust, or perceived (and actual) costs. Thus, barriers to the success of PETs are both psychological and economic in nature. Pushing the responsibility for their usage to individuals—that is, expecting them to navigate a universe of disparate, heterogeneous self-defense solutions across an ever-increasing range of scenarios in which data are tracked—would once again shift exorbitant costs onto consumers in usability, cognitive, and economic terms (such as the opportunity costs arising from loss of features in services when PETs are deployed). In any case, much as is the case for nudges and data propertization schemes, deployment of PETs is an inherently individualist solution: In the absence of a wide-ranging regulatory intervention supporting their deployment by making privacy the default (Cavoukian, 2009), it is hard to see how a patchwork approach of localized solutions—only working under specific circumstances, on specific apps or systems, in specific scenarios—could go far toward addressing what is inherently a systemic problem of privacy.
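Differential privacy offers a concrete illustration of how a PET can reconcile analytics with protection (our example; the article discusses PETs only in general terms). The sketch below adds calibrated noise to a counting query, so an analyst learns a useful aggregate while any single individual's contribution stays plausibly deniable:

```python
# Minimal sketch of a differentially private count. A counting query
# has sensitivity 1 (one person changes the count by at most 1), so
# Laplace noise with scale 1/epsilon provides epsilon-differential privacy.
import random

def dp_count(values, predicate, epsilon=0.5):
    """Return a noisy count of the values satisfying `predicate`."""
    true_count = sum(1 for v in values if predicate(v))
    # Difference of two Exp(epsilon) draws is Laplace noise with scale 1/epsilon.
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise

ages = [31, 44, 27, 52, 39, 61, 23, 48]  # hypothetical user data
print(dp_count(ages, lambda age: age > 40))  # useful aggregate, noisy individual signal
```

The parameter epsilon embodies the trade-off the article describes: a smaller epsilon means more noise, hence more protection and less analytic accuracy.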

In short: Psychologically informed interventions, data propertization schemes, and (especially) privacy-enhancing technologies may be useful tools for privacy management. However, due to their barriers and limitations, we also conclude that none is likely to work as intended in the absence of substantive, comprehensive regulation. Regulatory policies should mandate a framework of baseline privacy protection, addressing both the consumer-side hurdles and the supply-side factors we have considered in this article.

Conclusion

The ultimate conclusion of this paper may appear pessimistic. We showed that people care about and act to manage their privacy, but face steep psychological and economic hurdles that make not just desired, but also desirable, privacy nearly unattainable. We conclude that approaches to privacy management that rely purely on market forces and consumer responsibilization have failed. Comprehensive policy intervention is needed if a society’s goal is to allow its citizens to be in a position to manage privacy effectively and to their best advantage.

Although our conclusions, as we noted, may appear pessimistic, some of the very evidence we discussed in this article provides a glimmer of hope. Turning back to where we started, pronouncements that privacy is dead, we argued, confuse opportunities with wants. People’s opportunities for privacy are notably shrinking. And yet, across history, individuals—from East Germans under the Stasi (Betts, 2010; Sloan & Warner, 2016) to teenagers on social media (boyd & Marwick, 2011)—have revealed a remarkable tenacity in their attempts to carve out private spaces for themselves and their groups in the face of all odds—even while in public, and even under surveillance. Technologies, interfaces, and market forces can all influence human behavior. But probably, and hopefully, they cannot alter human nature. If privacy, as Altman proposed, is both culturally specific and culturally universal, chances are that people’s quest for privacy will not dissipate.

References

Abowd, J. M., & Schmutte, I. M. (2019). An economic analysis of privacy protection and statistical accuracy as social choices. American Economic Review, 109(1), 171–202.

Acquisti, A. (2004). Privacy in electronic commerce and the economics of immediate gratification. In Proceedings of the 5th ACM conference on electronic commerce.

Acquisti, A. (2009). Nudging privacy: The behavioral economics of personal information. IEEE Security & Privacy, 7(6), 82–85.

Acquisti, A., Adjerid, I., Balebako, R., Brandimarte, L., Cranor, L. F., Komanduri, S., Leon, P. G., Sadeh, N., Schaub, F., Sleeper, M., & Wang, Y. (2017). Nudges for privacy and security: Understanding and assisting users’ choices online. ACM Computing Surveys (CSUR), 50(3), 1–41.

Acquisti, A., Adjerid, I., & Brandimarte, L. (2013). Gone in 15 seconds: The limits of privacy transparency and control. IEEE Security & Privacy, 11(4), 72–74.

Acquisti, A., Brandimarte, L., & Loewenstein, G. (2015). Privacy and human behavior in the age of information. Science, 347(6221), 509–514.

Acquisti, A., & Gross, R. (2006). Imagined communities: Awareness, information sharing, and privacy on the Facebook. In International workshop on privacy enhancing technologies. Springer.

Acquisti, A., John, L. K., & Loewenstein, G. (2012). The impact of relative standards on the propensity to disclose. Journal of Marketing Research, 49(2), 160–174.

Acquisti, A., John, L. K., & Loewenstein, G. (2013). What is privacy worth? The Journal of Legal Studies, 42(2), 249–274.

Acquisti, A., Taylor, C., & Wagman, L. (2016). The economics of privacy. Journal of Economic Literature, 54(2), 442–492.

Adjerid, I., Acquisti, A., Telang, R., Padman, R., & Adler-Milstein, J. (2016). The impact of privacy regulation and technology incentives: The case of health information exchanges. Management Science, 62(4), 1042–1063.

AdExchanger. (2011, October 28). If a consumer asked you, “Why is tracking good?”, what would you say? https://adexchanger.com/online-advertising/why-is-tracking-good/

Adjerid, I., Pe’er E., & Acquisti, A. (2018). Beyond the privacy paradox: Objective versus relative risk in privacy decision making. MIS Quarterly, 42(2), 465–488.

Ajzen, I., & Fishbein, M. (1977). Attitude-behavior relations: A theoretical analysis and review of empirical research. Psychological Bulletin, 84(5), 888–918.

Akerlof, G. A. (1970). The market for “lemons”: Quality uncertainty and the market mechanism. The Quarterly Journal of Economics, 84(3), 488–500.

Almuhimedi, H., Schaub, F., Sadeh, N., Adjerid, I., Acquisti, A., Gluck, J., Cranor, L., & Agarwal, Y. (2015, April). Your location has been shared 5,398 times! A field study on mobile app privacy nudging. In Proceedings of the 33rd annual ACM Conference on Human Factors in Computing Systems (pp. 787–796).

Altman, I. (1975). The environment and social behavior. Brooks/Cole Pub. Co.

Altman, I. (1977). Privacy regulation: Culturally universal or culturally specific? Journal of Social Issues, 33(3), 66–84.

Arrieta-Ibarra, I., Goff, L., Jiménez-Hernández, D., Lanier, J., & Weyl, E. G. (2018). Should we treat data as labor? Moving beyond “free”. AEA Papers and Proceedings, 108, 38–42.

Athey, S., Catalini, C., & Tucker, C. (2017). The digital privacy paradox: Small money, small costs, small talk (No. w23488). National Bureau of Economic Research.

Bamberger, K. A., & Mulligan, D. K. (2019). Privacy law: On the books and on the ground. In The handbook of privacy studies: An interdisciplinary introduction (p. 349).

Barassi, V. (2019). Datafied citizens in the age of coerced digital participation. Sociological Research Online, 24(3), 414–429.

Barnes, S. B. (2006). A privacy paradox: Social networking in the United States. First Monday, 11(9).

Barth, S., de Jong, M. D., Junger, M., Hartel, P. H., & Roppelt, J. C. (2019). Putting the privacy paradox to the test: Online privacy and security behaviors among users with technical knowledge, privacy awareness, and financial resources. Telematics and informatics, 41, 55–69.

Bergemann, D., & Bonatti, A. (2011). Targeting in advertising markets: Implications for offline versus online media. The RAND Journal of Economics, 42(3), 417–443.

Beresford, A. R., Kübler, D., & Preibusch, S. (2012). Unwillingness to pay for privacy: A field experiment. Economics Letters, 117(1), 25–27.

BERR (Department for Business, Enterprise, and Regulatory Reform) (2008). Regulation and innovation: evidence and policy implications. BERR Economics Paper n. 4, United Kingdom.

Bettinger, E. P., Long, B. T., Oreopoulos, P., & Sanbonmatsu, L. (2012). The role of application assistance and information in college decisions: Results from the H&R Block FAFSA experiment. The Quarterly Journal of Economics, 127(3), 1205–1242.

Betts, P. (2010). Within walls: Private life in the German Democratic Republic. Oxford University Press.

Boyd, D., & Marwick, A. E. (2011). Social privacy in networked publics: Teens’ attitudes, practices, and strategies. In A decade in Internet time: Symposium on the dynamics of the Internet and society.

Brandimarte, L., Acquisti, A., & Loewenstein, G. (2013). Misplaced confidences: Privacy and the control paradox. Social Psychological and Personality Science, 4(3), 340–347.

Brandom, R. (2018, October 24). Tim Cook wants a federal privacy law—But so do Facebook and Google. The Verge. https://www.theverge.com/2018/10/24/18018686/tim-cook-apple-privacy-law-facebook-google-gdpr

Burtch, G., Ghose, A., & Wattal, S. (2015). The hidden cost of accommodating crowdfunder privacy preferences: A randomized field experiment. Management Science, 61(5), 949–962.

Bush, G. W. (2009). Economic regulation. Chapter 9. White House Archives. https://georgewbush-whitehouse.archives.gov/cea/ERP_2009_Ch9.pdf

Calo, R. (2011). The boundaries of privacy harm. Indiana Law Journal, 86, 1131.

Carbone, E., & Loewenstein, G. (2020). Dying to divulge: The determinants of, and relationship between, desired and actual disclosure. https://ssrn.com/abstract=3613232.

Cavoukian, A. (2009). Privacy by design: The 7 foundational principles. Information and Privacy Commissioner of Ontario, Canada.

Chen, B. X. (2018, March 21). Want to #DeleteFacebook? You can try. New York Times. https://www.nytimes.com/2018/03/21/technology/personaltech/delete-facebook.html

CISCO (2019). Consumer Privacy Survey: The growing imperative of getting data privacy right. CISCO Cybersecurity Series 2019. https://www.cisco.com/c/dam/en/us/products/collateral/security/cybersecurity-series-2019-cps.pdf

Cohen, J. E. (2010). What privacy is for. Harvard Law Review, 126, 1904.

Daniels, J. (2019, February 12). California governor proposes ‘new data dividend’ that could call on Facebook and Google to pay users. CNBC. https://www.cnbc.com/2019/02/12/california-gov-newsom-calls-for-new-data-dividend-for-consumers.html

De Corniere, A., & De Nijs, R. (2016). Online advertising and privacy. The RAND Journal of Economics, 47(1), 48–72.

Dienlin, T., & Trepte, S. (2015). Is the privacy paradox a relic of the past? An in‐depth analysis of privacy attitudes and privacy behaviors. European Journal of Social Psychology, 45(3), 285–297.

Downs, A. (1957). An economic theory of democracy. Harper & Row.

Draper, N. A., & Turow, J. (2019). The corporate cultivation of digital resignation. New Media & Society, 21(8), 1824–1839.

DuckDuckGo. (2017, January). A study on private browsing: Consumer usage, knowledge, and thoughts. Technical report. https://duckduckgo.com/download/Private_Browsing.pdf

Federal Trade Commission. (2016, June). Online tracking. https://www.consumer.ftc.gov/articles/0042-online-tracking

Feng, E. (2019, December 16). How China is using facial recognition technology. NPR. https://www.npr.org/2019/12/16/788597818/how-china-is-using-facial-recognition-technology

Fiegerman, S. (2017, September 7). The biggest data breaches ever. CNN Business. http://money.cnn.com/2017/09/07/technology/business/biggest-breaches-ever/index.html

Fiesler, C., Dye, M., Feuston, J. L., Hiruncharoenvate, C., Hutto, C. J., Morrison, S., Roshan, P. K., Pavalanathan, U., Bruckman, A. S., De Choudhury, M., & Gilbert, E. (2017). What (or who) is public? Privacy settings and social media content sharing. In Proceedings of the 2017 ACM conference on computer supported cooperative work and social computing (pp. 567–580).

Gerber, N., Gerber, P., & Volkamer, M. (2018). Explaining the privacy paradox: A systematic review of literature investigating privacy attitude and behavior. Computers & Security, 77, 226–261.

Ghose, A. (2017). Tap: Unlocking the mobile economy. MIT Press.

Giesler, M., & Veresiu, E. (2014). Creating the responsible consumer: Moralistic governance regimes and consumer subjectivity. Journal of Consumer Research, 41(3), 840–857.

Godinho de Matos, M., & Adjerid, I. (2019). Consumer behavior and firm targeting after GDPR: The case of a telecom provider in Europe. NBER Summer Institute on IT and Digitization.

Goldberg, I. (2002). Privacy-enhancing technologies for the Internet, II: Five years later. In International workshop on privacy enhancing technologies. Springer.

Goldfarb, A., & Tucker, C. E. (2011). Privacy regulation and online advertising. Management Science, 57(1), 57–71.

Goldfarb, A., & Tucker, C. (2012). Privacy and innovation. Innovation Policy and the Economy, 12(1), 65–90.

Gray, C. M., Kou, Y., Battles, B., Hoggatt, J., & Toombs, A. L. (2018). The dark (patterns) side of UX design. In Proceedings of the 2018 CHI conference on human factors in computing systems (pp. 1–14).

Gross, R., & Acquisti, A. (2005). Information revelation and privacy in online social networks (The Facebook case). In Proceedings of the 2005 ACM workshop on privacy in the electronic society (pp. 71–80).

Habib, H., Colnago, J., Gopalakrishnan, V., Pearman, S., Thomas, J., Acquisti, A., Christin, N., & Cranor, L. F. (2018). Away from prying eyes: Analyzing usage and understanding of private browsing. In Fourteenth symposium on usable privacy and security (SOUPS 2018) (pp. 159–175).

Hartzog, W. (2010). Website design as contract. American University Law Review, 60, 1635.

Henrich, J., Heine, S. J., & Norenzayan, A. (2010). The weirdest people in the world?. Behavioral and Brain Sciences, 33(2–3), 61–83.

Hirsch, D. D. (2013). Going Dutch: Collaborative Dutch privacy regulation and the lessons it holds for US privacy law. Michigan State Law Review, 83.

Hirshleifer, J. (1978). The private and social value of information and the reward to inventive activity. In Uncertainty in economics (pp. 541–556). Academic Press.

Hoofnagle, C. J., & Urban, J. M. (2014). Alan Westin’s privacy homo economicus. Wake Forest Law Review, 49, 261.

Jagadish, H. V. (2016). The values challenge for Big Data. In Bulletin of the IEEE computer society technical committee on data engineering (pp. 77–84).

Jentzsch, N., Preibusch, S., & Harasser, A. (2012). Study on monetising privacy. an economic model for pricing personal information. European Network and information Security Agency (ENISA).

John, L., Acquisti, A., & Loewenstein, G. (2011). Strangers on a plane: Context-dependent willingness to divulge sensitive information. Journal of Consumer Research, 37(5), 858–873.

Johnson, B. (2010, January 11). Privacy no longer a social norm, says Facebook founder. The Guardian. https://www.theguardian.com/technology/2010/jan/11/facebook-privacy

Johnson, G. A., Shriver, S. K., & Du, S. (2020). Consumer privacy choice in online advertising: Who opts out and at what cost to industry?. Marketing Science, 39(1), 33–51.

Kang, R., Dabbish, L., Fruchter, N., & Kiesler, S. (2015). “My data just goes everywhere”: User mental models of the Internet and implications for privacy and security. In Eleventh symposium on usable privacy and security (SOUPS 2015) (pp. 39–52).

KFF. (2020, April). Coronavirus, social distancing, and contact tracing. Health tracking poll. https://www.kff.org/global-health-policy/issue-brief/kff-health-tracking-poll-late-april-2020/

Kokolakis, S. (2017). Privacy attitudes and privacy behaviour: A review of current research on the privacy paradox phenomenon. Computers & Security, 64, 122–134.

Kunreuther, H., Ginsberg, R., Miller, L., Sagi, P., Slovic, P., Borkan, B., & Katz, N. (1978). Disaster insurance protection: Public policy lessons. Wiley.

Laudon, K. C. (1996). Markets and privacy. Communications of the ACM, 39(9), 92–104.

Lewis, B. (2017, November 7). Americans say data privacy is important, but few take steps to protect themselves. Instamotor. https://instamotor.com/blog/online-data-privacy-survey

Madejski, M., Johnson, M., & Bellovin, S. M. (2012, March). A study of privacy settings errors in an online social network. In 2012 IEEE international conference on pervasive computing and communications workshops (pp. 340–345). IEEE.

Mapon. (2017, June 9). GPS tracking for rental cars: How to break from the mold. https://www.mapon.com/us-en/blog/2017/06/gps-tracking-for-rental-cars-how-to-break-from-the-mold

Margulis, S. T. (2003). Privacy as a social issue and behavioral concept. Journal of Social Issues, 59(2), 243–261.

Marreiros, H., Tonin, M., Vlassopoulos, M., & Schraefel, M. C. (2017). “Now that you mention it”: A survey experiment on information, inattention and online privacy. Journal of Economic Behavior & Organization, 140, 1–17.

Martin, K. (2020). Breaking the privacy paradox: the value of privacy and associated duty of firms. Business Ethics Quarterly, 30(1), 65–96.

Martin, K. D., Borah, A., & Palmatier, R. W. (2017). Data privacy: Effects on customer and firm performance. Journal of Marketing, 81(1), 36–58.

Martin, K., & Nissenbaum, H. (2016). Measuring privacy: An empirical test using context to expose confounding variables. Columbia Science and Technology Law Review, 18(1), 176–218.

McDonald, A. M., & Cranor, L. F. (2008). The cost of reading privacy policies. I/S: A Journal of Law and Policy for the Information Society, 4, 543.

McGeveran, W. (2016). Friending the privacy regulators. Arizona Law Review, 58, 959.

Melumad, S., & Meyer, R. (2020). Full disclosure: How smartphones enhance consumer self-disclosure. Journal of Marketing, 84(3), 28–45.

Miller, C. C. (2014, November 12). Americans say they want privacy, but act as if they don’t. New York Times. https://www.nytimes.com/2014/11/13/upshot/americans-say-they-want-privacy-but-act-as-if-they-dont.html

Miller, A. R., & Tucker, C. (2009). Privacy protection and technology diffusion: The case of electronic medical records. Management Science, 55(7), 1077–1093.

Moore Jr, B. (1984). Privacy: Studies in social and cultural history. Routledge.

Murphy, R. F. (1964). Social distance and the veil. American Anthropologist, 66(6), 1257–1274.

Narayanan, A., & Shmatikov, V. (2008). Robust de-anonymization of large sparse datasets. In 2008 IEEE symposium on security and privacy. IEEE.

Nissenbaum, H. (2009). Privacy in context: Technology, policy, and the integrity of social life. Stanford University Press.

Noam, E. M. (1997). Privacy and self-regulation: Markets for electronic privacy. Privacy and Self-Regulation in the Information Age, 21–33.

Norberg, P. A., Horne, D. R., & Horne, D. A. (2007). The privacy paradox: Personal information disclosure intentions versus behaviors. Journal of Consumer Affairs, 41(1), 100–126.

Olson, M. (1965). The Logic of Collective Action. Cambridge University Press.

Palen, L., & Dourish, P. (2003). Unpacking “privacy” for a networked world. In Proceedings of the SIGCHI conference on human factors in computing systems (pp. 129–136).

Panzarino, M. (2019, March 19). Apple ad focuses on iPhone’s most marketable feature: Privacy. TechCrunch. https://techcrunch.com/2019/03/14/apple-ad-focuses-on-iphones-most-marketable-feature-privacy/

Pennebaker, J. W. (1997). Opening up: The healing power of emotional expression. Guilford.

Penney, J. W. (2016). Chilling effects: Online surveillance and Wikipedia use. Berkeley Technology Law Journal, 31, 117.

Petronio, S. (2002). Boundaries of privacy: Dialectics of disclosure. Suny Press.

Pew Research Center. (2012). Privacy management on social media sites. https://www.pewresearch.org/internet/2012/02/24/privacy-management-on-social-media-sites/.

Pew Research Center. (2013). Anonymity, privacy and security online. https://www.pewresearch.org/internet/2013/09/05/anonymity-privacy-and-security-online/.

Pew Research Center. (2015). Americans’ privacy strategies post-Snowden. https://www.pewresearch.org/wp-content/uploads/sites/9/2015/03/PI_AmericansPrivacyStrategies_0316151.pdf.

Pew Research Center. (2019). Americans and privacy: Concerned, confused and feeling lack of control over their personal information. https://www.pewresearch.org/internet/wp-content/uploads/sites/9/2019/11/Pew-Research-Center_PI_2019.11.15_Privacy_FINAL.pdf.

Posner, R. A. (1978). Economic theory of privacy. Regulation, 2, 19–26.

Preibusch, S., Kübler, D., & Beresford, A. R. (2013). Price versus privacy: An experiment into the competitive advantage of collecting less personal information. Electronic Commerce Research, 13(4), 423–455.

Quine, W. V. (1976). The ways of paradox. Harvard University Press.

Regan, P. M. (1995). Legislating privacy: Technology, social values, and public policy. The University of North Carolina Press.

Romanosky, S., Telang, R., & Acquisti, A. (2011). Do data breach disclosure laws reduce identity theft? Journal of Policy Analysis and Management 30(2), 256–286.

Rosenberg, M., Confessore, N., & Cadwalladr, C. (2018, March 17). How Trump consultants exploited the Facebook data of millions. New York Times. https://www.nytimes.com/2018/03/17/us/politics/cambridge-analytica-trump-campaign.html

Sanchez-Rola, I., Dell’Amico, M., Kotzias, P., Balzarotti, D., Bilge, L., Vervier, P. A., & Santos, I. (2019, July). Can I opt out yet? GDPR and the global illusion of cookie control. In Proceedings of the 2019 ACM Asia conference on computer and communications security (pp. 340–351).

Satariano, A. (2019, September 15). Real-time surveillance will test the British tolerance for cameras. New York Times. https://www.nytimes.com/2019/09/15/technology/britain-surveillance-privacy.html

Savage, S. J. & Waldman, D. M. (2015). Privacy tradeoffs in smartphone applications. Economics Letters, 137, 171–175.

Schoeman, F. (1984). Privacy: Philosophical dimensions. American Philosophical Quarterly, 21(3), 199–213.

Sloan, R. H., & Warner, R. (2016). The self, the Stasi, and NSA: Privacy, knowledge, and complicity in the surveillance state. Minnesota Journal of Law, Science and Technology, 17, 347.

Slovic, P. (1995). The construction of preference. American Psychologist, 50(5), 364.

Solove, D. J. (2005). A taxonomy of privacy. University of Pennsylvania Law Review, 154, 477.

Solove, D. J. (2007). I’ve got nothing to hide and other misunderstandings of privacy. San Diego Law Review, 44, 745.

Solove, D. J. (2012). Introduction: Privacy self-management and the consent dilemma. Harvard Law Review, 126, 1880.

Solove, D. (2021, forthcoming). The myth of the privacy paradox. George Washington Law Review, 89.

Spiekermann, S., Grossklags, J., & Berendt, B. (2001, October). E-privacy in 2nd generation e-commerce: Privacy preferences versus actual behavior. In Proceedings of the 3rd ACM conference on electronic commerce (pp. 38–47).

Sprenger, P. (1999, January 26). Sun on privacy: “Get over it.” Wired. https://www.wired.com/1999/01/sun-on-privacy-get-over-it/

Statista. (2016, November 23). Number of mobile phone users worldwide from 2015 to 2020. https://www.statista.com/statistics/274774/forecast-of-mobile-phone-users-worldwide/

Stutzman, F. D., Gross, R., & Acquisti, A. (2013). Silent listeners: The evolution of privacy and disclosure on Facebook. Journal of Privacy and Confidentiality, 4(2), 7–41.

Svirsky, D. (2019). Three experiments about human behavior and legal regulation [Doctoral dissertation, Harvard University, Graduate School of Arts & Sciences].

Taylor, H. (2001). Testimony on “Opinion surveys: What consumers have to say about information privacy.” Hearing before the Subcommittee on Commerce, Trade and Consumer Protection. Serial No. 107-35. https://www.govinfo.gov/content/pkg/CHRG-107hhrg72825/html/CHRG-107hhrg72825.htm

Tamir, D. I., & Mitchell, J. P. (2012). Disclosing information about the self is intrinsically rewarding. Proceedings of the National Academy of Sciences, 109(21), 8038–8043.

Tsai, J. Y., Egelman, S., Cranor, L., & Acquisti, A. (2011). The effect of online privacy information on purchasing behavior: An experimental study. Information Systems Research, 22(2), 254–268.

Varian, H. R. (1996). Economic aspects of personal privacy. In Privacy and self-regulation in the information age. National Telecommunications and Information Administration, US Department of Commerce.

Vitak, J., & Ellison, N. B. (2013). “There’s a network out there you might as well tap”: Exploring the benefits of and barriers to exchanging informational and support-based resources on Facebook. New Media & Society, 15(2), 243–259.

Vitak, J., & Kim, J. (2014). “You can’t block people offline”: Examining how Facebook’s affordances shape the disclosure process. In Proceedings of the 17th ACM conference on computer supported cooperative work & social computing (pp. 461–474).

Weiss, M. (2019, June 5). Digiday research: Most publishers don’t benefit from behavioral ad targeting. Digiday. https://digiday.com/media/digiday-research-most-publishers-dont-benefit-from-behavioral-ad-targeting/

Westin, A. (1967). Privacy and freedom. Atheneum.

Westin, A. (2001). Testimony on “Opinion surveys: What consumers have to say about information privacy.” Hearing before the Subcommittee on Commerce, Trade and Consumer Protection. Serial No. 107-35. https://www.govinfo.gov/content/pkg/CHRG-107hhrg72825/html/CHRG-107hhrg72825.htm

White, T. B., Novak, T. P., & Hoffman, D. L. (2014). No strings attached: When giving it away versus making them pay reduces consumer information disclosure. Journal of Interactive Marketing, 28(3), 184–195.

Wong, J. C. (2019, March 18). The Cambridge Analytica scandal changed the world: But it didn’t change Facebook. The Guardian. https://www.theguardian.com/technology/2019/mar/17/the-cambridge-analytica-scandal-changed-the-world-but-it-didnt-change-facebook

Wong, R. Y., & Mulligan, D. K. (2019). Bringing design to the privacy table: Broadening “design” in “privacy by design” through the lens of HCI. In Proceedings of the 2019 CHI conference on human factors in computing systems.

Zhang, B., & Xu, H. (2016). Privacy nudges for mobile applications: Effects on the creepiness emotion and privacy attitudes. In Proceedings of the 19th ACM conference on computer-supported cooperative work & social computing.

Zuboff, S. (2015). Big other: Surveillance capitalism and the prospects of an information civilization. Journal of Information Technology, 30(1), 75–89.
