Privacy

Full Title or Meme

  1. Privacy is the right to be let alone, as defined in late 19th-century law.[1]
  2. Privacy more recently has come to mean the right to hide User Private Information or Behavior.

Context

From the Latin privatus, meaning set apart.

People want to know things. But people want to hide things. The search for Knowledge is always running up against the right to privacy, as we are told by Jill Lepore, an eminent historian.[2]

A nice review of privacy and its foundations can be found in the New Yorker.[3] Privacy rights protect personal autonomy and shield survivors of abuse. They also conceal abuse and safeguard the powerful. Is the concept coherent?

A law review article[4] tried to define what the modern meaning of privacy might be. It created a taxonomy, a scorecard if you will, but nothing absolute or clear. Its fourth taxon is Invasion, with a subspecies of Intrusion corresponding to the Justice Douglas quote below.

Everyone wants some privacy, but no one can explain it in a way that matches anyone else's desire for privacy. Privacy is associated with liberty, but also with privilege (private roads, private schools), with confidentiality (private conversations), with nonconformity, with dissent, with shame, with embarrassment, with the deviant, with the taboo, with subterfuge and concealment.[5]

In his dissent to a 1952 Supreme Court decision[6] Justice Douglas issued this stirring defense of the right to be let alone:

"If liberty is to flourish, government should never be allowed to force people to listen to any radio program. The right of privacy should include the right to pick and choose from competing entertainments, competing propaganda, competing political philosophies. If people are let alone in those choices, the right of privacy will pay dividends in character and integrity. The strength of our system is in the dignity, the resourcefulness, and the independence of our people. Our confidence is in their ability as individuals to make the wisest choice. That system cannot flourish if regimentation takes hold. The right of privacy, today violated, is a powerful deterrent to any one who would control men's minds."

The above statement seems to say that the people's character and integrity will be enhanced by the right to pick their own sources of information. There is ample reason to doubt that statement in light of the environment in 2018, where populist demagogues are again persuading people to vote against their own interests by manipulating news sources that appeal to people's prurient interests.[7] But the statement does seem to clarify that privacy is about liberty, the right to do what you want with yourself, with your attention and with your belongings. The "right to be let alone" was codified by the US Supreme Court in 1965.[8] Here we see that the valuable commodity in commerce is not User Information, it is getting the right user's Attention. User Information is, however, useful in targeting the "right" users.

The privacy context that is most often discussed for computer systems is the protection of users' data rather than intrusions into users' attention. Bill Gates put it this way:[9]

[At Microsoft we don't generally] distinguish among the types of data being collected - the kind of shoes you like to buy versus which diseases you're genetically predisposed to - or who is gathering it, or how they're using it. Your shopping history and our medical history aren't collected by the same people, protected by the same safeguards or used for the same purposes. Recognizing this distinction would [make our] discussions more enlightening.

But the subject of most interest to users is identity theft and unexpected (creepy is the word most often used) knowledge about what users have been viewing and how that impacts the messages that are targeted to a user. So the context reported in the technical press is typically more about a computer site's legal liability than about the users' experience. In other words, the challenge understood by computer sites today is the user's rights, rather than the user's privacy expectations.

Here's a definition from Proton: The way Google defines privacy is, “Nobody can exploit your data, except for us.” [Proton's] definition is cleaner, more simple, and more authentic: Nobody can exploit your data—period.[10] It is not at all clear that this definition of Privacy matches users' expectations. Google is dependent on advertising to make its products available for "free". When Neeva offered the same functionality as Google without the ads but with a small monthly fee, it failed and shut down.[11]

Personal Context

Privacy is a very loosely defined word; what it means depends on both what you intend by it and what a person expects in a given context. My context for privacy when I'm out on the street or in the entertainment district on a Friday evening is a lot different than if I'm going to an Alcoholics Anonymous meeting in the basement of a church. My expectations in those contexts are radically different. That hasn't yet translated effectively into the digital realm, where we are usually speaking more specifically about personally identifiable information and how it gets collected, used, and disclosed by various entities. In that space the context really should be defined by my personal approach to explaining privacy to people: a person has privacy when they're able to determine what information they share about themselves, with whom, and under what terms. When you phrase it like that, it becomes an "operationalizable" way to approach your functional and non-functional requirements for designing a system that's going to process personal information.
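Read as a requirement, that phrasing can be turned into a data structure: a record of what a person has agreed to share, with whom, and under what terms, consulted before any disclosure. The sketch below is illustrative only; the names (SharingGrant, may_disclose) are invented here and do not come from any standard or product.

    # Minimal sketch: privacy as "what I share, with whom, under what terms".
    # All names here are illustrative; no standard or library is implied.
    from dataclasses import dataclass
    from datetime import datetime, timezone

    @dataclass(frozen=True)
    class SharingGrant:
        data_category: str        # e.g. "email", "location", "purchase_history"
        recipient: str            # who may receive it
        purpose: str              # the terms under which it may be used
        expires: datetime         # grants are time-limited, not open-ended

    def may_disclose(grants, data_category, recipient, purpose, now=None):
        """Return True only if the person has granted this exact combination."""
        now = now or datetime.now(timezone.utc)
        return any(
            g.data_category == data_category
            and g.recipient == recipient
            and g.purpose == purpose
            and g.expires > now
            for g in grants
        )

    # Usage: disclosure is denied unless an unexpired grant matches all three terms.
    grants = [SharingGrant("email", "newsletter-service", "weekly digest",
                           datetime(2026, 1, 1, tzinfo=timezone.utc))]
    print(may_disclose(grants, "email", "newsletter-service", "weekly digest"))  # True
    print(may_disclose(grants, "email", "ad-broker", "targeting"))               # False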

The best way to describe privacy is by the reactions against data released to the public. Lately we're seeing a growing "Privacy First" movement that recognizes that many of America's ills start with the country's poor privacy protections (Congress last updated consumer privacy law in 1988, when it banned video-store clerks from telling newspapers which VHS cassettes you rented, a reaction to the rental records released during the Robert Bork confirmation hearings). Whether you're worried about Big Tech brainwashing, AI deepfake porn, red state Attorneys General following teens into abortion clinics, or cops rounding up the identities of protesters with "reverse-warrants" served on Google, the common thread is weak privacy protection. But the first legal definition of privacy was the Warren and Brandeis article[1] published after some newspaper gossip column reporting in Boston about the Warren family. It seems that most attempts to control privacy have come as a reaction to an unflattering (or dangerous) exposure of "private" information.

Government Awareness

Perhaps the earliest attention to privacy in the US Federal government came during Congressional hearings on a proposal for a giant US national data center. On the last day of hearings, 1966-07-20, Paul Baran of the Rand Corporation[12] told the committee that the national data center could destroy Privacy as we know it, but that it did not make the slightest bit of difference whether the data center were built; data collection didn't require a building, as the data was certain to get linked up anyway. It didn't matter whether the data was put into a single database or spread around the country, the result would be the same. We cannot stop technology, but we can provide safeguards if we take action now, before it is too late. If the government does not set up regulation now, we will be left with a world where anyone can collect and use the data for any purpose at all.[2] The subcommittee was adjourned and no action was taken.

Identifiers and Privacy

  • The very human need to create labels for things is as old as language itself. That is what nouns and adjectives are designed to do.
  • In computer science, an entity is an object with an identifier.
  • But then people do not like to be objectified. They want to be individuals, which is what they really are.
  • Taxonomies, that create buckets (called taxa) and assign every object to one of the taxa always run into ambiguities and start creating branches to accommodate the ambiguity.
  • Privacy and Identity are antithetical concepts: in any environment where things are labeled in a taxonomy, privacy will be destroyed completely if the taxonomy is taken to extremes.
  • But then this wiki is all about Identity Management.
  • So this page is all about having your cake and eating it too.

The Data Cult

Technologists view all technology problems as something that can be fixed by more technology. If web sites are collecting too much data, let's control the data that flows into them. This is not working now and probably can never work well. Lowry Pressly sees the ideology of information lurking in a less likely place—among privacy advocates trying to defend us from digital intrusions. This is because the standard view of privacy assumes there is “some information that already exists,” and what matters is keeping it out of the wrong hands. Such an assumption, for Pressly, is fatal. It “misses privacy’s true value and unwittingly aids the forces it takes itself to be resisting,” he writes. To be clear, Pressly is not opposed to reforms that would give us more power over our data—but it is a mistake “to think that this is what privacy is for.” “Privacy is valuable not because it empowers us to exercise control over our information,” he argues, “but because it protects against the creation of such information in the first place.”[13]

The Alignment Problem

Alignment of human needs with technological solutions has not been good.[14] Privacy-preserving data ideas do not translate into hard mathematical constructs. Cynthia Dwork thought that when people were asking for privacy they were actually worried about somebody using their data in the wrong way. It wasn't so much about hiding the data at all costs, but preventing harm from the way that data was used. The conversation has shifted from privacy to fairness. Fairness through blindness does not work. All personal data provides information that can be used to categorize the individual, in other words, to discriminate.

User and Web Site Acceptance for Privacy Seals

Back in 2000 the FTC had the following to say.[15] There is no evidence to date that things have changed for the better.

The 2000 Survey also examined the extent to which industry's primary self-regulatory enforcement initiatives, online privacy seal programs, have been adopted. These programs, which require companies to implement certain fair information practices and monitor their Compliance, promise an efficient way to implement privacy protection. However, the 2000 Survey revealed that although the number of sites enrolled in these programs has increased over the past year, the seal programs have yet to establish a significant presence on the Web. The Survey found that less than one-tenth, or approximately 8%, of sites in the Random Sample, and 45% of sites in the Most Popular Group, display a privacy seal.

Still, hope springs eternal: the ID2020 Alliance folk have created a Certification Mark[16] as it attempts "to improve lives through digital identity by creating a trustmark for digital identity technologies that meet ID2020's technical requirements. The Certification Mark gives companies developing digital identity technologies in line with ID2020's core principles of portability, persistence, privacy, and user-control a way to demonstrate their commitment to "good" digital identity in the market, incentivizing a race to the top. It draws inspiration from efforts like the Trustable Tech Mark to develop a "badge of honor" for companies and organizations designing technology with user privacy and rights top-of-mind."

Terminologies

Several efforts are underway to collect the terms that are used in privacy standards, regulation and documentations.

  • W3C Data Privacy Vocabulary is a community (non-standards) group in the W3C collecting other sources. Preliminary draft (V0.1) on 2019-07-26.

Problems

  • See also Attacks on Privacy.
  • Privacy, as a right, has no real basis in history or law before the Warren and Brandeis article[1] in 1890, as it does not exist in the tribal societies which predated civilizations, nor in the autocracies that dominated until the enlightenment of the mid 1800s, when access to the posting of private mail became available to every citizen in the developed countries of the world. Since Hebb's experiments starting in 1951 we have known exactly how complete isolation can drive anyone into insanity,[17] so extreme Privacy should not be considered healthy, even if it were possible. Nor can Privacy be considered an absolute right, since we also want to support the state in bringing malefactors to justice. It will most likely be many years before we learn to strike the right balance, particularly in view of the many other changes still underway in social structures.
  • Note, however, that Supreme court justice Douglas went back to common law to say that, for example, actions in the privacy of one's bedroom, could not be disclosed even if true.
  • The biggest problem raised in Why the “Privacy” Wars Rage On[18] is the use of privacy by the rich and powerful to hide their actions, even when illegal.
  • Users or Consumers have always been a real asset to any business and access that provides user attention (aka eyeballs) to a message is now a commodity to be bought and sold on the internet.
  • The great amount of attention currently focused on Privacy is having the unfortunate effect of accreting concerns which are not related to the right to be let alone.[19] The following are some of the topics listed by the New York Times as a part of that accretion to the term: (1) employers use Facebook's advertising platform to show certain job ads only to men, to young folk, or to any other category selected by the advertiser, (2) Google collects the whereabouts of its users - even after they deliberately turn off location history, (3) AT&T shares its mobile customers' locations with data brokers. Regardless of whether these behaviors are good or bad, these are social, not privacy, issues.
  • Technology, and social media in particular,[20] record nearly every transaction that we make, and can retrieve it for any purpose not explicitly disallowed by regulation. The amount of information that Facebook has accumulated about people is far more than most people understand.[21] The constantly evolving problem is to keep the laws protecting citizens from depredations by large organization up-to-date in the face of rapidly accelerating change.
  • Researchers have shown that users' stated desires do not match their behavior.
    We find that many users want more control over tracking and think that controlled tracking has benefits, but are unwilling to put in the effort to control tracking or distrust current tools. We confirm previous findings that users’ general attitudes about tracking are often at odds with their comfort in specific situations.
  • Steven Levy tried to answer a question about privacy, which basically goes to the point that everyone seems to know what they mean by privacy until someone asks them what Privacy really is (with apologies to St. Augustine talking about time). It does, however, finally get to the point that privacy is a subject to be debated by the least competent commentators, lawyers and legislators.
    Sergio asks, “The word ‘privacy’ has become a buzzword that everyone uses nowadays whenever talking about the risks of Big Tech. However, I believe its definition has morphed and evolved during the last 10-15 years into something new. What would be your definition of ‘privacy’ in this hyperconnected, non-stop, data-mining, automated and personalized world?”
    I agree that the challenges of privacy have changed in the past decade or two, particularly as more information about us gets routinely collected and used to monitor us, sell us things, and sometimes even to steal from us. But I don’t think that’s changed the definition of privacy, at least as I see it. Privacy is about keeping what’s personal to us out of the hands of people who don’t need to see or access it. It’s so much harder to do that in this hyperconnected, non-stop, data-mining, automated and personalized world. But it would certainly help if we had stronger laws to help us. For instance, I would like a clear opt-in before anyone tracks my movements on the web or shares any information they gather about me with anyone else. And I would like to see huge fines and maybe even criminal penalties for those who violate such a law.
  • See the page Identity Theft for financial issues created by release of User Private Information.
  • See the page Privacy Risk for more details on how an Enterprise could mitigate risks associated with holding User Private Information.

Privacy is NOT an Absolute Right

  • There are documented cases where unrestricted privacy has killed 200 people, including the subject. It is a right that must be balanced with all the others.
  • Some activists try to make Privacy into a means to avoid letting the government know anything about you. For example, Moxie Marlinspike created the Signal messaging app to enable private communications.[22] Some would argue that private conversations have been available since recorded time began. Law enforcement is always trying to pierce the veil.
  • This is a fight that will never be concluded.

Avoidance Patterns

Given any set of governmental regulations supporting user privacy, Web Sites will go to any length to avoid the intent of the regulation to suit their own needs:

  1. The favorite dodge is exemptions. The GDPR has dozens, drawn up by expensive lawyers when the regulation was written, and available now to twist in any possible way to help the Web Site avoid the plain meaning of the law.
  2. Technology companies can create Dark Patterns in their web sites that make it very difficult for users to exercise their rights to privacy. California bans ‘dark patterns’ that trick users into giving away their personal data (2021-03-16)
  3. Dark Patterns: UChicago/Princeton Research Reveals the Dirty Tricks of Online Shopping

Privacy Through Obscurity

Several organizations have tried to obscure collected data in some way that will allow the data to be used for making decisions, but not allow anyone to separate out any personal attributes or behaviors of any one individual. This has not worked out so well. In particular, the Decentralized ID community extols the concept of herd privacy in decentralized technologies at the same time that the central bankers are all assisting the various governments in disentangling the transactions on blockchains as the digital currency is turned into real-world wealth.

The ability to collect, store, and process large quantities of data has allowed for the better design of products and services using machine learning and data science tools, but it can also create privacy risks for individuals and legal risks for organizations, if data is not properly managed. In "Federated Learning and Privacy,"[23] there is a discussion of the efforts to build privacy-preserving systems for machine learning and data science on decentralized data. Specifically, the authors explore federated learning and federated analytics, two approaches that allow data scientists to generate analytical insights from the combined information of multiple clients in decentralized datasets. These privacy technologies may be combined in real-world systems with minimized risk to both individuals and the custodians of the data. According to the authors, federated learning and federated analytics leverage the principle of data minimization, which includes the objective of collecting only the data needed for the specific computation, as well as the principle of data anonymization, in which the final released output of the computation does not reveal anything unique to an individual.
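As a concrete illustration of the federated-learning idea described above, the sketch below shows federated averaging for a toy linear model: each client fits its own private samples and only the resulting model updates are averaged by the server, so raw records never leave the clients. This is a minimal, made-up example, not the cited authors' system.

    # Minimal federated-averaging sketch for a linear model y ~ w*x + b.
    # Raw client data never leaves the client; only model updates are shared.
    import numpy as np

    rng = np.random.default_rng(0)

    def local_update(weights, x, y, lr=0.1, steps=20):
        """One client's gradient steps on its own data; returns updated weights."""
        w, b = weights
        for _ in range(steps):
            err = (w * x + b) - y
            w -= lr * np.mean(err * x)
            b -= lr * np.mean(err)
        return np.array([w, b])

    def make_client(n=50):
        x = rng.uniform(0, 1, n)
        y = 2.0 * x + 1.0 + rng.normal(0, 0.05, n)   # private local data
        return x, y

    clients = [make_client() for _ in range(3)]
    global_weights = np.array([0.0, 0.0])

    for _ in range(30):
        # Each client trains locally, starting from the current global model.
        updates = [local_update(global_weights.copy(), x, y) for x, y in clients]
        # The server only ever sees the averaged update.
        global_weights = np.mean(updates, axis=0)

    print("learned [w, b]:", np.round(global_weights, 2))  # approaches [2.0, 1.0]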

Abuses

Solutions

The number of governments that have proposed "Solutions" to the invasions of "The Right to Privacy"[1] is large,[24] but the results are meager, and law enforcement agencies are often in court trying to use any loophole to strip away privacy, especially in cases where products have been specifically designed to maintain privacy.[25] The same laws that protect privacy also protect criminal behavior. Not that privacy laws always make sense; US federal law prohibits the National Tracing Center from using a searchable database to identify the owners of guns seized at crime scenes. No privacy advocate has yet recommended that law be repealed.

Security is the responsibility of a person in an organization to protect the organization, whereas Privacy is the responsibility of a person or people in the organization to respect the privacy of the individuals whose data they process.

The solutions shown below are basically targeted at users' rights; privacy figures in most of these, but the connection is often a stretch. It might be appropriate at this time to change the name of the topic "privacy" to "user rights".

Educate the User

THIS IS NOT A SOLUTION. Don't misunderstand; educating computer users in being safe on their computers is a fine idea. Just don't rely on education to solve a poorly designed system. If you don't think users can be private on your solution, DON'T SHIP IT till it is ready for the user out-of-the-box. And test that product with real users to ensure that it can be used safely.

In what must be the most extreme example of trying to educate the user, an immersive theater experience was tried in Pittsburgh.[26]

Differential Privacy

There is an idea currently abroad[27] that privacy can be guaranteed by inserting a little noise into the data. This is claimed to give good broad statistical studies of large databases without loss of any personal anonymity. It would be good if this concept were compared to Claude Shannon's paper,[28] which described the effect of noise on data transmission. Or consider the ability of AI to extract data from noisy channels. While this technique might have some limited applicability, it is not a broad solution to anything other than professors' tenure.
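For readers who want to see what "inserting a little noise" means in practice, the sketch below shows the textbook Laplace mechanism for a counting query, the canonical differential-privacy construction; it is an illustration with fabricated data, not code from the cited article.

    # Minimal differential-privacy sketch: Laplace mechanism for a count query.
    # A count changes by at most 1 when one person is added or removed
    # (sensitivity 1), so Laplace noise with scale 1/epsilon gives
    # epsilon-differential privacy for this query.
    import numpy as np

    rng = np.random.default_rng(42)

    def private_count(records, predicate, epsilon):
        """Noisy count of records matching predicate; smaller epsilon = more noise."""
        true_count = sum(1 for r in records if predicate(r))
        noise = rng.laplace(loc=0.0, scale=1.0 / epsilon)
        return true_count + noise

    ages = rng.integers(18, 90, size=10_000)
    over_65 = lambda age: age >= 65

    print("true count:", int(np.sum(ages >= 65)))
    print("eps = 1.0 :", round(private_count(ages, over_65, 1.0), 1))
    print("eps = 0.1 :", round(private_count(ages, over_65, 0.1), 1))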

Technology Solutions

Privacy Enhancing Technology Providers have been proposed and are actively available from Microsoft and IBM. Their uptake has been negligible because of two factors: they are clumsy for users and they don't provide much privacy. The real problem facing users today is how to stop digital agents from using your information once they acquire it. The GDPR is designed to control the movement of users' private data from one enterprise to another. The State of California now has an initiative scheduled for the ballot that focuses more on the places where your data exists. But neither of them addresses the problem of protecting your data in some third-world country, or island nation, from misuse. In some ways the GDPR is a scam in that it is more about penalizing corporations in California for the benefit of bureaucrats in Europe than helping with the existing loss of privacy.

Global Privacy Regulations Compliance: a new OAuth 3.0 design informed by global privacy regulations, ensuring compliance and facilitating international operations. "OAuth 3.0 is crafted with global privacy regulations in mind, helping Organizations maintain compliance across different jurisdictions." From Aaron Parecki 2024-06-02 - Yet another attempt to bolt privacy onto existing paradigms rather than look for new solutions.

Privacy Gateways

The principal portal to the internet on your computer and smartphone is the browser. There is a tug-of-war between browsers to see which can win the user's confidence and bag the largest collection of users, which they can then target with ads and tracking. The conflict in that dichotomy should be obvious. While at first it was the smaller browsers, like Opera, that offered privacy, soon Firefox was in the game and doing well at it. The big guys, Apple and Google, then stepped up their game and are now dominant. But that does not mean that things are stable, as reported by the interesting chat in Wired where the staff tried to understand the tug between privacy and anti-trust. Since the two behemoths have different goals, the tension should be interesting. How User Data Privacy and Antitrust Law Got All Tangled Up 2021-04-30.

The Right to be Let Alone

Most of the legislation and press is actually about Information Sharing, which is not the original Warren and Brandeis definition. User attention is actually what advertisers want; Information Sharing just helps the advertiser to target the audience. Ad Blockers do address user attention, as does Native App Privacy, which focuses on apps that may demand user attention by sending notification messages to the screen at any time they wish. Fortunately the Smart Phone operating systems all allow fine-grained user control of notifications from apps. But this is one area that received little attention until an article from MIT[29] that describes the de-anonymization of location data which is sold to companies that want to see where traffic patterns concentrate. It turns out that if I know where you go, I can figure out who you are.
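The "if I know where you go, I can figure out who you are" point can be illustrated with a toy example: an anonymized trace is matched to a named person just from its two most-visited places, a crude home/work signature. All names and locations below are fabricated.

    # Toy re-identification: an "anonymous" trace is matched to a person
    # purely by its two most-visited cells (roughly home and work).
    from collections import Counter

    # Side information: where known people live and work (fabricated).
    known_people = {
        "alice": ("cell_12", "cell_88"),
        "bob":   ("cell_12", "cell_40"),
        "carol": ("cell_77", "cell_88"),
    }

    def top_two_cells(trace):
        """The two locations that dominate a trace: a crude home/work signature."""
        return tuple(cell for cell, _ in Counter(trace).most_common(2))

    # An "anonymized" trace with identifiers stripped, but locations intact.
    anonymous_trace = ["cell_12"] * 40 + ["cell_88"] * 35 + ["cell_03"] * 2

    signature = set(top_two_cells(anonymous_trace))
    matches = [name for name, (home, work) in known_people.items()
               if {home, work} == signature]
    print("re-identified as:", matches)   # ['alice']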

Personal Recommendations

Enterprises, like many people, have a tendency to project their own failures elsewhere and proclaim their own innocence. Privacy concerns are just another case where they try to blame the user for not solving the problem. Here are some ideas that Users can take to protect their privacy, not that it should be their obligation, but because Web Sites continue to be so poor at helping. Some of the ideas come from the past president of the Direct Marketing Association,[30] who seems to think that any solution "has to start with consumers". Other commentators are more forceful, like Ellen Pao in Wired,[31] who thinks "they don’t care about their users and how their platforms work to harm many, and so they don’t bother to understand the interactions and amplification that result. They’ve been trained to not care." In any case, you are the only person who really cares about your privacy.

  • Don't use Facebook to buy things. As a part of their effort to stop "Fake Identifiers" Facebook will abruptly terminate accounts, so any place you use a Facebook sign-in might suddenly become unavailable.
  • Don't stay logged into Web Sites like Google Maps or Facebook that track your every move.
  • Don't take surveys or tell Web Sites about your preferences. The pollster may be selling your opinions to those who want to market to you.
  • Run an "Ad blocker" that stops third-party sites, present as ads on your screen, from loading or from storing cookies on your device. This will also speed up your download times, often by huge amounts.
  • Help efforts like the citizen initiative[32] filed with the secretary of state in California to limit the sale of user data.[33] This one did not even need to go to the ballot because the CA legislature acted first.
  • Pay attention to security and privacy notifications on your computer; often you will be able to "just say no".
  • Freeze your credit ratings as described on this site.
  • Never click on a link or call a phone number that you cannot validate as legitimate, most especially in an email supposedly from "a friend" who may have provided access to their contacts lists.
  • Be kind to your friends by never giving any but your trusted email app access to your contacts list.

The general slant of this section is that you should be part of the solution and not part of the problem. This is different from most recommendations for personal privacy, which focus on the reader and not on society. The general problem of privacy erosion cannot be solved if we all focus on our own privacy rather than the social and economic systems that give rise to that erosion. Still, the advice in many of the articles is good, if not particularly forward looking. Some examples appear in New York Times articles.[34][35]

Poisoning Data

In an era of growing Artificial Intelligence with its vast ability to correlate data, one means of keeping some control of data is to alter it in some way to avoid future matching.[36] This method works best for large data items like audio and video. While Identifiers cannot be changed, other methods like tokenizing might be available.
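A rough sketch of both ideas follows: a large media item is perturbed with small noise so that an exact fingerprint no longer matches, and a fixed identifier is swapped for a random token held in a private lookup table. Real data-poisoning techniques of the kind the cited article describes are far more sophisticated than adding noise; treat this purely as an illustration of the goal.

    # Illustrative only: (1) perturb media so an exact fingerprint no longer
    # matches, (2) tokenize a fixed identifier instead of altering it.
    import hashlib
    import secrets
    import numpy as np

    rng = np.random.default_rng(7)

    def fingerprint(samples):
        """Exact-match fingerprint of quantized audio/image samples."""
        return hashlib.sha256(np.round(samples, 3).tobytes()).hexdigest()[:16]

    audio = rng.normal(0, 1, 48_000)                      # one second of fake audio
    poisoned = audio + rng.normal(0, 1e-2, audio.shape)   # barely audible change

    print(fingerprint(audio) == fingerprint(poisoned))    # False: matching broken

    # Identifiers cannot be altered, but they can be swapped for random tokens.
    token_vault = {}                                      # private mapping, kept by the owner
    def tokenize(identifier):
        token = token_vault.get(identifier) or secrets.token_hex(8)
        token_vault[identifier] = token
        return token

    print(tokenize("ssn:123-45-6789"))                    # opaque token leaves the system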

International Regulations

US Federal Regulations

  • The US Congress has a bill before it that is so popular that it was voted out of committee by 53-2.[37] The two negative votes came from California Democrats who wanted existing California law to remain in effect, as it was more restrictive. Since the leader of the House, Nancy Pelosi, is also from California, action on the bill is not likely.
  • Gramm Leach Bliley Act (Reg P) - Banks covered by this Act must tell their customers about their privacy practices and explain to them their right to "opt out" if they don't want their information shared with third parties.
  • Endgame On: The narrowing path ahead for privacy legislation[38]
    After almost four years of effort and in a highly polarized political environment, any bipartisan agreement is a huge deal. That’s why the June 3, 2022 release of a bipartisan “discussion draft” from party leaders on both sides of the Congressional committees that oversee consumer protection and technology issues have galvanized the national privacy debate.
  • ACM's U.S. Technology Policy Committee (USACM) this week (2018-07-02) submitted recommendations to the Senate Commerce Committee's Subcommittee on Consumer Protection, Product Safety, Insurance, and Data Security that focus on protecting personal privacy in the wake of the Facebook and Cambridge Analytica scandal. USACM urged Congress to immediately act to protect the public good and the integrity of the democratic process by addressing technical and ethical issues raised by the data breach. Specifically, USACM wants Congress to draft and adopt comprehensive personal privacy protection legislation extending beyond social media to meet nine critical goals, including limiting collection/minimizing retention of personal data; clarifying and simplifying User Consent processes and maximizing user control of data, and simplifying data-sharing policies and assuring data-flow Transparency. USACM chair Stuart Shapiro says the organization has significant expertise in the technical and ethical aspects of the case, and looks forward to serving as an apolitical resource for Congress in pursuit of solutions.[39]
  • The tech industry lobbying group "Information Technology Industry Council" is aggressively "lobbying officials of the Trump administration and elsewhere to start outlining a federal privacy law" in order to "overrule the California law and instead put into place a kinder set of rules that would give the companies wide leeway over how personal digital information was handled."[40] According to reviews of their proposals,[41] it seems that this group is trying to weaken the privacy laws in comparison to the ones in California or Europe.
  • HIPAA and COPPA offer legislative protections respectively to the sectors of Health Care patients and children under the age of 13.
  • The financial sector heavyweights have banded together with the law firm Venable and former NIST senior executive Jeremy Grant to form the "Better Identity Coalition", which focuses (in their own words) "on developing and advancing consensus-driven, cross-sector policy solutions that promote the development and adoption of better solutions for identity verification and authentication." It seems like every sector wants to offer their own sector's version of a cross-sector solution.
  • The Federal Trade Commission’s report on mobile privacy: Mobile Privacy Disclosures: Building Trust Through Transparency
  • Legislation was signed by the president (2018-09-14) that does more to make your data your property by forcing credit agencies to allow free control by the consumer.
  • National Telecommunications and Information Administration is conducting a comprehensive review of the nexus between privacy policy and innovation in the Internet economy which is updated nearly every month.
  • NIST has updated the Risk Management Framework to include privacy considerations (2018-05-11).
  • NIST has published draft 5 of the SP 800-53 publication on Security and Privacy Controls for Information Systems and Organizations (2018-08) which nominally applies only to federal non-defense systems, but is often used in commercial systems as well.
  • NIST has started (2018-11-29) a year long effort to establish a Privacy Framework. This is basically a risk analysis for enterprises that handle user private data.

US State Regulations

  • First-in-the-nation consumer privacy rights are the law of the land in California![42] I guess they mean first among the 50 states.
  • PING published a review of the California legislation.
  • The California State Attorney General’s recommendations for mobile privacy: Privacy on the Go: Recommendations for the Mobile Ecosystem
  • California state attorney general Xavier Becerra outlined on 2019-10-11[43] what companies will have to do to comply with the state’s landmark privacy law, the California Consumer Privacy Act (CCPA). The law will go into effect Jan. 1, 2020. The draft regulations instruct companies on how quickly they should respond to customer requests about their data and how they should verify customers requesting to see or delete their data.
  • Washington State is in its third attempt to pass a regulation which sounds a lot like the GDPR. The version of 2020-02 would not take effect until mid 2021 and would be delayed for 3 more years for non-profits. The ACLU is trying to block its passage as it is written by the technology companies themselves.

User Rights

In general, discussions about privacy are focused on the Privacy Risk experienced by data collectors and their efforts to avoid legal liability. Here we try to focus on the rights that users have in the digital world.

  1. Legal rights are discussed on this site.
  2. The Universal Declaration of Human Rights (UDHR) states in article 12 - "No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honor and reputation. Everyone has the right to the protection of the law against such interference or attacks."
  3. While the UDHR has no legal basis in most countries, it is followed at one level or another in nearly all of them.

GDPR grants these rights

One legal firm was bold enough to describe the GDPR in the following bullet points. What they left out was the dozens of exclusions, which mean that the GDPR only applies to organizations that are either (1) based in the US, or (2) unwilling to pay $500/hour for a lawyer to find an exclusion that applies to them. See the wiki page GDPR is a scam for more details.

  • The right to be informed
    • You must provide data subjects with various pieces of information about the data processing activities you carry out.
    • This information is usually provided in a Privacy Notice or Privacy Statement.
    • The information must be:
      • concise, transparent, intelligible and easily accessible;
      • written in clear and plain language; and
      • free of charge.
  • The right of access
    • You must provide data subjects with:
      • confirmation their data is being processed;
      • access to their personal data; and
      • other supplementary information.
    • You must comply with any subject access request within one month of receipt.
    • You cannot charge a fee unless the request is “manifestly unfounded or excessive”.
    • Where you process a large quantity of information you can ask the data subject to specify the information they want access to.
    • You may refuse to comply with a subject access request where this is “manifestly unfounded or excessive”.
  • The right to rectification (called redress in this wiki)
    • Data subjects can have their personal data rectified if it is inaccurate or incomplete.
    • You must comply with any request to rectify within one month of receipt. This can be extended to 2 months where the request is complex.
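As a worked illustration of those deadlines (one month to respond, extendable to two months for a complex request), a data controller could track them with a small calculation like the one below. How "one month" is counted at month ends is an assumption here, not legal advice.

    # Sketch: computing GDPR response deadlines for access/rectification requests.
    # The month arithmetic at month ends is an assumption, not legal advice.
    from datetime import date
    import calendar

    def add_months(d, months):
        """Same day N months later, clamped to the last day of the target month."""
        month_index = d.month - 1 + months
        year = d.year + month_index // 12
        month = month_index % 12 + 1
        day = min(d.day, calendar.monthrange(year, month)[1])
        return date(year, month, day)

    def response_deadline(received, complex_request=False):
        """One month to comply, extendable to two months for complex requests."""
        return add_months(received, 2 if complex_request else 1)

    received = date(2024, 1, 31)
    print(response_deadline(received))                        # 2024-02-29
    print(response_deadline(received, complex_request=True))  # 2024-03-31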

Conclusions

At the end of any discussion about privacy, the harsh fact emerges that for all that people talk about privacy, they seem to have little interest in doing anything to preserve it. Oliver Sacks had this to say:[44]
Everything is public now, potentially: one’s thoughts, one’s photos, one’s movements, one’s purchases. There is no privacy and apparently little desire for it in a world devoted to non-stop use of social media. Every minute, every second, has to be spent with one’s device clutched in one’s hand. Those trapped in this virtual world are never alone, never able to concentrate and appreciate in their own way, silently. They have given up, to a great extent, the amenities and achievements of civilization: solitude and leisure, the sanction to be oneself, truly absorbed, whether in contemplating a work of art, a scientific theory, a sunset, or the face of one’s beloved.

References

  1. 1.0 1.1 1.2 1.3 Warren and Brandeis The Right to Privacy (1890-12-15) Harvard Law Review http://groups.csail.mit.edu/mac/classes/6.805/articles/privacy/Privacy_brand_warr2.html
  2. 2.0 2.1 Jill Lepore, Who Killed Truth (2023-06-13) ASIN B0C6BY479C
  3. Jeannie Suk Gersen, Why the “Privacy” Wars Rage On New Yorker (2022-06-27) https://www.newyorker.com/magazine/2022/06/27/why-the-privacy-wars-rage-on-amy-gajda-seek-and-hide-brian-hochman-the-listeners?utm_source=nl&utm_brand=tny&utm_mailing=TNY_Daily_062222&utm_campaign=aud-dev&utm_medium=email&utm_term=tny_daily_digest
  4. Daniel J. Solove, A Taxonomy of Privacy University of Pennsylvania Law Review (2006-01) https://scholarship.law.upenn.edu/cgi/viewcontent.cgi?article=1376&context=penn_law_review
  5. Louis Menand Nowhere to Hide: Why Do We Care So Much About Privacy? The New Yorker June 18, 2018 https://www.newyorker.com/magazine/2018/06/18/why-do-we-care-so-much-about-privacy
  6. Justice Douglas (Dissent), Public Utilities Comm'n v. Pollak, 343 U.S. 451 (1952) https://supreme.justia.com/cases/federal/us/343/451/case.html
  7. Eduardo Porter, Is the Populist Revolt Over? Not if Robots Have Their Way https://www.nytimes.com/2018/01/30/business/economy/populist-politics-globalization.html
  8. Justice Douglas 381 U.S. 479 Griswold v. Connecticut (No. 496) (1965-06-07) US Supreme Court https://www.law.cornell.edu/supremecourt/text/381/479
  9. Bill Gates, Thinking Big (2018-09-09) The New York Times Book Review p1. ff
  10. Gilead Edelman, Proton Is Trying to Become Google—Without Your Data Wired (2022-05-26) https://www.wired.com/story/proton-mail-calendar-drive-vpn
  11. J. R. Raphael, Google killer, killed: Neeva and the limits of privacy as a philosophy Computer World (2023-05-23) https://www.computerworld.com/article/3697309/google-killer-neeva-privacy.html
  12. Paul Baran and the Origins of the Internet https://www.rand.org/about/history/baran.html
  13. Lowry Pressly, The Right to Oblivion: Privacy and the Good Life Harvard UP (2024-10-08) ISBN 9780674260528
  14. Brian Christian, The Alignment Problem: Machine Learning and Human Values 2020-10-06 ISBN 978-0393635829
  15. FTC PRIVACY ONLINE: FAIR INFORMATION PRACTICES IN THE ELECTRONIC MARKETPLACE A REPORT TO CONGRESS May 2000 https://www.ftc.gov/sites/default/files/documents/reports/privacy-online-fair-information-practices-electronic-marketplace-federal-trade-commission-report/privacy2000text.pdf
  16. ID2020 Alliance accelerates towards "good" digital identity through launch of Certification Mark and new Alliance member Cision News (2019-01-24) https://www.prnewswire.com/news-releases/id2020-alliance-accelerates-towards-good-digital-identity-through-launch-of-certification-mark-and-new-alliance-member-300783721.html
  17. Michael Mechanic, What Extreme Isolation Does to Your Mind. (2012-10-18) Mother Jones https://www.motherjones.com/politics/2012/10/donald-o-hebb-effects-extreme-isolation/
  18. Jeannie Suk Gersen, Why the “Privacy” Wars Rage On, The New Yorker (2022-06-27) https://www.newyorker.com/magazine/2022/06/27/why-the-privacy-wars-rage-on-amy-gajda-seek-and-hide-brian-hochman-the-listeners
  19. Natasha Singer, Just Don't Call it Privacy. (2018-09-23) New York Times p. SR 4
  20. Sarah Igo, The Known Citizen: A History of Privacy in Modern America 2018 ISBN 978-0674737501
  21. Brian X. Chen, I Downloaded the Information That Facebook Has on Me. Yikes. (2018-04-11) New York Times https://www.nytimes.com/2018/04/11/technology/personaltech/i-downloaded-the-information-that-facebook-has-on-me-yikes.html
  22. Anna Wiener, "Privacy Settings" The New Yorker p. 32ff (2020-10-26)
  23. Kallista Bonawitz, Peter Kairouz, Brendan McMahan, and Daniel Ramage. ACM Queue Volume 19, issue 5 (2021-12-12) https://queue.acm.org/detail.cfm?id=3501293
  24. https://en.wikipedia.org/wiki/Privacy#Privacy_law
  25. Jack Nicas Apple to Close iPhone Security Hole That Law Enforcement Uses to Crack Devices https://www.nytimes.com/2018/06/13/technology/apple-iphone-police.html
  26. Michael Skirpan + 4, Is a Privacy Crisis Experienced, a Privacy Crisis Avoided? CACM 65 no 3 p 26ff (2022-03)
  27. Miguel Guevara +3, Differential Privacy: The Pursuit of Protections by Default CACM 64 no2 (2021-02) P 36ff
  28. Claude E. Shannon +1, The Mathematical Theory of Communication (1963) ISBN 0252725484
  29. Rob Matheson, The privacy risks of compiling mobility data. (2018-12-07) MIT News Office http://news.mit.edu/2018/privacy-risks-mobility-data-1207
  30. Linda Woolley, Letter to the Editor. (2018-09-02) New York Times Magazine p. 7
  31. Ellen Pao, Let's Stop Pretending Facebook and Twitter's CEOs Can't Fix This Mess. (2018-08-28) https://www.wired.com/story/ellen-pao-facebook-twitter-ceos-can-fix-abuse-mark-zuckerberg-jack-dorsey
  32. Mary Ross, Alastair Mactaggart, The Consumer Right to Privacy Act of 2018 https://oag.ca.gov/system/files/initiatives/pdfs/17-0039%20%28Consumer%20Privacy%20V2%29.pdf
  33. Daisuke Wakabayashi Silicon Valley Faces Regulatory Fight on Its Home Turf May 13, 2018 https://www.nytimes.com/2018/05/13/business/california-data-privacy-ballot-measure.html
  34. Thorin Klosowski, How to Protect Your Digital Privacy New York Times (2019 undated) https://www.nytimes.com/guides/privacy-project/how-to-protect-your-digital-privacy
  35. David Pogue, 10 Tips to Avoid Leaving Tracks Around the Internet New York Times (2019-10-14 print) p. B6 and (2019-10-08 internet) https://www.nytimes.com/2019/10/04/smarter-living/10-tips-internet-privacy-crowdwise.html?searchResultPosition=1
  36. Gregory Mone, Poisoning Data to Protect it CACM 67 No 6 (2024-06)
  37. Matt Laslo, The Shaky Future of a Post-Roe Federal Privacy Law Wired (2022-09-15) https://www.wired.com/story/adppa-roe-democrats-congress/
  38. Cameron F. Kerry, Endgame On - The narrowing path ahead for privacy legislation Brookings (2022-06-22) https://www.brookings.edu/blog/techtank/2022/06/22/endgame-on-the-narrowing-path-ahead-for-privacy-legislation/
  39. USACM Calls on Congress to Enact Comprehensive Consumer Privacy Protections (2018-07-02) https://www.acm.org/media-center/2018/july/media-advisory-us-tech-policy-ctte-letter-on-data-privacy
  40. Cecilia Kang, Pursuing Law on Privacy, With Caveats. (2018-08-27) New York Times p. B1,ff
  41. Bernadette Tansey, Tech Industry Lobby Proposes Data Privacy Laws; Critics Call Them Weak. (2018-10-22) Xconomy https://www.xconomy.com/san-francisco/2018/10/22/tech-industry-lobby-proposes-data-privacy-laws-critics-call-them-weak/
  42. CA Privacy Web Site https://www.caprivacy.org/
  43. Lauren Feiner, California AG tells businesses like Facebook and Google how they must comply with the state’s new landmark privacy law CNBC (2019-10-11) https://www.cnbc.com/2019/10/11/california-attorney-general-outlines-rules-for-state-privacy-law-ccpa.html
  44. Oliver Sacks The Machine Stops - The neurologist on steam engines, smartphones, and fearing the future. (2019-02-04) The New Yorker https://www.newyorker.com/magazine/2019/02/11/the-machine-stops