==Privacy as contextual integrity==
The theory of contextual integrity, developed by Helen Nissenbaum, defines privacy as appropriate information flow, where appropriateness, in turn, is defined as conformance with legitimate informational norms specific to social contexts.
==Right to be let alone==
In 1890, the United States jurists Samuel D. Warren and Louis Brandeis wrote "The Right to Privacy", an article in which they argued for the "right to be let alone", using that phrase as a definition of privacy. This concept relies on the theory of natural rights and focuses on protecting individuals. The article was a response to recent technological developments, such as photography, and to sensationalist journalism, also known as yellow journalism. There is extensive commentary over the meaning of being "let alone", and among other ways, it has been interpreted to mean the right of a person to choose seclusion from the attention of others if they wish to do so, and the right to be immune from scrutiny or observation in private settings, such as one's own home. Although this early legal concept was too vague to support the design of broad legal protections of privacy, it strengthened the notion of privacy rights for individuals and began a legacy of discussion on those rights in the US.
==Limited access==
Limited access refers to a person's ability to participate in society without having other individuals and organizations collect information about them. Various theorists have imagined privacy as a system for limiting access to one's personal information. Edwin Lawrence Godkin wrote in the late 19th century that "nothing is better worthy of legal protection than private life, or, in other words, the right of every man to keep his affairs to himself, and to decide for himself to what extent they shall be the subject of public observation and discussion." Adopting an approach similar to the one presented by Ruth Gavison nine years earlier, Sissela Bok said that privacy is "the condition of being protected from unwanted access by others—either physical access, personal information, or attention."
==Control over information==
Control over one's personal information is the concept that "privacy is the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to others." Generally, a person who has consensually formed an interpersonal relationship with another person is not considered "protected" by privacy rights with respect to the person they are in the relationship with. Charles Fried said that "Privacy is not simply an absence of information about us in the minds of others; rather it is the control we have over information about ourselves." Nevertheless, in the era of big data, control over information is under pressure.
==States of privacy==
Alan Westin defined four states—or experiences—of privacy: solitude, intimacy, anonymity, and reserve. Solitude is a physical separation from others. Intimacy is a "close, relaxed, and frank relationship between two or more individuals" that results from the seclusion of a pair or small group of individuals. Anonymity is the freedom from identification while in public settings, and reserve is the creation of a psychological barrier against unwanted intrusion. (In this sense, "accessing" an individual includes accessing personal information about them.)
Johnson examined personal control along two dimensions, primary and secondary control: primary control describes behaviour directly causing outcomes, while secondary control is behaviour indirectly causing outcomes. Privacy is then described as "behaviors falling at specific locations on these two dimensions". Johnson identified four stages to categorize where people exercise personal control: outcome choice control is the selection between various outcomes; behaviour selection control is the selection between behavioural strategies to apply to attain selected outcomes; outcome effectance describes the fulfillment of selected behaviour to achieve chosen outcomes; and outcome realization control is the personal interpretation of one's achieved outcome. Johnson explores the concept that privacy is a behaviour that has secondary control over outcomes. Acknowledging other conceptions of privacy while arguing that the fundamental concern of privacy is behaviour selection control, Johnson engages with other interpretations, including those of Maxine Wolfe, Robert S. Laufer, and Irwin Altman. He clarifies the continuous relationship between privacy and personal control: outlined behaviours not only depend on privacy, but one's conception of privacy also depends on one's defined behaviour-outcome relationships.
Lorenzo Magnani expands on this concept by highlighting how privacy is essential in maintaining personal control over one's identity and consciousness. He argues that consciousness is partly formed by external representations of ourselves, such as narratives and data, which are stored outside the body. However, much of our consciousness consists of internal representations that remain private and are rarely externalized. This internal privacy, which Magnani refers to as a form of "information property" or "moral capital," is crucial for preserving free choice and personal agency. According to Magnani, when too much of our identity and data is externalized and subjected to scrutiny, it can lead to a loss of personal control, dignity, and responsibility. The protection of privacy, therefore, safeguards our ability to develop and pursue personal projects in our own way, free from intrusive external forces.
==Secrecy==
Privacy is sometimes defined as an option to have secrecy. Richard Posner said that privacy is the right of people to "conceal information about themselves that others might use to their disadvantage". In various legal contexts, when privacy is described as secrecy, a conclusion follows: if privacy is secrecy, then rights to privacy do not apply to any information which has already been publicly disclosed. When privacy-as-secrecy is discussed, it is usually imagined as a selective kind of secrecy in which individuals keep some information secret and private while choosing to make other information public and not private.
==Personhood and autonomy==
Privacy may be understood as a necessary precondition for the development and preservation of personhood. Jeffrey Reiman defined privacy in terms of a recognition of one's ownership of one's physical and mental reality and a moral right to self-determination. Through the "social ritual" of privacy, or the social practice of respecting an individual's privacy barriers, the social group communicates to developing children that they have exclusive moral rights to their bodies—in other words, moral ownership of their body. Privacy is required to exercise choice, and this control primarily entails the ability to regulate contact with others. Furthermore, others must acknowledge and respect the self's boundaries—in other words, they must respect the individual's privacy. It has been suggested that systemic and routinized deprivations or violations of privacy deteriorate one's sense of autonomy over time. Protecting intimacy is at the core of the concept of sexual privacy, which law professor Danielle Citron argues should be protected as a unique form of privacy.
==Physical privacy==
Physical privacy could be defined as preventing "intrusions into one's physical space or solitude." An example of the legal basis for the right to physical privacy is the U.S. Fourth Amendment, which guarantees "the right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures". Physical privacy may be a matter of cultural sensitivity, personal dignity, and/or shyness. There may also be concerns about safety, if, for example, one is wary of becoming the victim of crime or stalking. Protecting one's physical privacy can involve preventing others from watching (even through recorded images) one's intimate behaviours or intimate parts, and preventing unauthorized access to one's personal possessions or places. Examples of possible measures used to avoid the former, especially for modesty reasons, are clothes, walls, fences, privacy screens, cathedral glass, window coverings, etc.
==Organizational==
Government agencies, corporations, groups/societies and other organizations may desire to keep their activities or secrets from being revealed to other organizations or individuals, adopting various security practices and controls in order to keep private information confidential. Organizations may seek legal protection for their secrets. For example, a government administration may be able to invoke executive privilege or declare certain information to be classified, or a corporation might attempt to protect valuable proprietary information as trade secrets.
==An individual right==
David Flaherty believes networked computer databases pose threats to privacy. He develops 'data protection' as an aspect of privacy, which involves "the collection, use, and dissemination of personal information". This concept forms the foundation for fair information practices used by governments globally. Flaherty advances an idea of privacy as information control: "[i]ndividuals want to be left alone and to exercise some control over how information about them is used".
Richard Posner and Lawrence Lessig focus on the economic aspects of personal information control. Posner criticizes privacy for concealing information, which reduces market efficiency. For Posner, employment is selling oneself in the labour market, which he believes is like selling a product; any 'defect' in the 'product' that is not reported is fraud. For Lessig, privacy breaches online can be regulated through code and law. Lessig claims that "the protection of privacy would be stronger if people conceived of the right as a property right", and that "individuals should be able to control information about themselves".
==A collective value and a human right==
There have been attempts to establish privacy as one of the fundamental human rights, whose social value is an essential component in the functioning of democratic societies. Priscilla Regan believes that individual concepts of privacy have failed philosophically and in policy. She supports a social value of privacy with three dimensions: shared perceptions, public values, and collective components. Shared ideas about privacy allow freedom of conscience and diversity in thought; public values guarantee democratic participation, including freedoms of speech and association, and limit government power; and collective elements describe privacy as a collective good that cannot be divided. Regan's goal is to strengthen privacy claims in policy making: "if we did recognize the collective or public-good value of privacy, as well as the common and public value of privacy, those advocating privacy protections would have a stronger basis upon which to argue for its protection". Leslie Regan Shade argues that the human right to privacy is necessary for meaningful democratic participation, and ensures human dignity and autonomy. Privacy depends on norms for how information is distributed, and on whether that distribution is appropriate; violations of privacy therefore depend on context. The human right to privacy has precedent in the United Nations Declaration of Human Rights: "Everyone has the right to freedom of opinion and expression; this right includes freedom to hold opinions without interference and to seek, receive and impart information and ideas through any media and regardless of frontiers." Shade believes that privacy must be approached from a people-centered perspective, and not through the marketplace. Eliza Watt, of Westminster Law School, University of Westminster in London, UK, proposes applying the International Human Rights Law (IHRL) concept of "virtual control" as an approach to dealing with extraterritorial mass surveillance by state intelligence agencies. Watt envisions the "virtual control" test, understood as remote control over the individual's right to privacy of communications, where privacy is recognized under Article 17 of the ICCPR. This, she contends, may help to close the normative gap that nation states are exploiting.
==Privacy paradox and economic valuation==
The privacy paradox is a phenomenon in which online users state that they are concerned about their privacy but behave as if they were not. While this term was coined as early as 1998, it was not used in its current popular sense until the year 2000. When compared to adults, young people tend to disclose more information on social media. However, this does not mean that they are not concerned about their privacy. Susan B. Barnes described a case in her article: in a television interview about Facebook, a student voiced her concerns about disclosing personal information online. However, when the reporter asked to see her Facebook page, the page displayed her home address, phone numbers, and pictures of her young son. The privacy paradox has been studied and documented in different research settings. Several studies have shown this inconsistency between privacy attitudes and behavior among online users. However, an increasing number of studies have also shown that there are significant and at times large correlations between privacy concerns and information-sharing behavior, which speaks against the privacy paradox. A meta-analysis of 166 studies published on the topic reported an overall small but significant relation between privacy concerns and information sharing or the use of privacy protection measures. So although there are individual instances or anecdotes where behavior appears paradoxical, on average privacy concerns and privacy behaviors seem to be related, and several findings question the general existence of the privacy paradox. The relationship between concerns and behavior is nevertheless likely to be small, and there are several arguments that can explain why. According to the attitude-behavior gap, attitudes and behaviors are in most cases not closely related. A main explanation for the partial mismatch in the context of privacy specifically is that users lack awareness of the risks and of the degree of protection available. Users may underestimate the harm of disclosing information online. For example, users may not know how to change their default settings even though they care about their privacy. Psychologists Sonja Utz and Nicole C. Krämer pointed out that the privacy paradox can occur when users must trade off between their privacy concerns and impression management.
==Research on irrational decision making==
A study conducted by Susanne Barth and Menno D.T. de Jong demonstrates that decision making takes place on an irrational level, especially when it comes to mobile computing. Mobile applications in particular are often designed in a way that spurs fast and automatic decision making without assessment of risk factors. Protection measures against these unconscious mechanisms are often difficult to access while downloading and installing apps. Even with mechanisms in place to protect user privacy, users may not have the knowledge or experience to enable these mechanisms. Users of mobile applications generally have very little knowledge of how their personal data are used. When they decide which application to download, they typically are not able to effectively interpret the information provided by application vendors regarding the collection and use of personal data. Other research finds that because of this lack of interpretability, users are much more likely to be swayed by cost, functionality, design, ratings, reviews, and number of downloads than by the permissions requested for the use of their personal data.
==The economic valuation of privacy==
The willingness to incur a privacy risk is suspected to be driven by a complex array of factors, including risk attitudes, the personal value placed on private information, and general attitudes to privacy (which are typically measured using surveys). One experiment aiming to determine the monetary value of several types of personal information indicated relatively low valuations of personal information. Despite claims that ascertaining the value of data would require a "stock-market for personal information", surveillance capitalism and the mass surveillance industry regularly place price tags on this form of data as it is shared between corporations and governments.
==Information asymmetry==
Users are not always given the tools to live up to their professed privacy concerns, and they are sometimes willing to trade private information for convenience, functionality, or financial gain, even when the gains are very small. One study suggests that people value their browser history at the equivalent of a cheap meal. Another finds that attitudes to privacy risk do not appear to depend on whether privacy is already under threat or not. By now, the privacy calculus (the idea that users weigh the expected benefits of disclosure against its privacy costs) has been supported by several studies.

==Actions which reduce privacy==