User Private Information

Full Title or Meme

User Information consists of two parts, Private and Public. Follow the User Information link to see more about User Information in general.

Context

There is an interesting story about Samuel D. Warren II, one of the authors of "The Right to Privacy"[1]. He was a very private person who had a very public scandal that resulted in lewd newspaper articles about the reason for his wife's death.[2] Shortly afterwards he co-wrote the law review article that was the first to sharply define the right.

Most of the discussion of user authentication is based on NIST Special Publication SP 800-63-2 and its Levels of Assurance (LOA). (A new draft, SP 800-63-3, has just been approved, which proposes Identity Assurance Levels (IAL) 1, 2 and 3. Both the LOA and the IAL are listed below.) This page looks at user authentication from the point of view of the user and what they understand about their own exposure of personal information in a system. That exposure can be made visible by showing the user their authentication status, which follows from their need to give the session the authorization required to achieve their intentions.

The only function of the internet is to move information from one point to another. The internet protocols structure data and provide the context that allows the systems communicating over the internet to infer meaning. It is the action that information enables when it moves from one site to another that unlocks the full potential of the internet. This page addresses the visibility, to the user, of the amount of information they need to provide to a relying party (RP) web site to achieve their goals.

Note that the risk status directly addresses the issues raised at the December 2016 Plenary about the acceptance by the IDESG of NIST SP 800-63-2 assurance levels 4 and 5. Specifically, these levels of assurance are claimed not to meet the pseudonymity requirements of the IDEF, but they are required for some secure web sites. Displaying high risk levels on the web page should help address these concerns while still allowing sites to be IDEF compliant.

Goal

Web sites, acting as an RP (or IdP), are more easily transparent about what personal information from users they plan to use and reference.

The cognitive load on the user should be reduced by this effort.

Status

The progress on this topic is tracked on this spreadsheet: "LOAs vs. User Understanding of their exposure to a system".

Problems

  1. Users move between privacy contexts on a single display device and are not sure what level of personal information they have agreed to share in the current context. The HTTP "DO NOT TRACK" (DNT) header is key to making this work well (see the sketch after this list).
  2. Users under the age of 13, or with diminished capabilities, need special protections. While this page does not yet reflect these special needs, it would be helpful if some enhancement addressed them. An HTTP header like "DO NOT TRACK" is needed to trigger such a user experience. A solution will require linking dependents with their guardians, as happens in a family circle.
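
A minimal sketch of how a site might honor the "DO NOT TRACK" signal, assuming Python and the standard WSGI convention that request headers appear as HTTP_* keys; the application and the tracking_allowed helper are illustrative, not part of any IDEF requirement:

    # Honor the DNT header before setting any tracking state (illustrative sketch).
    def tracking_allowed(environ):
        # Browsers send "DNT: 1" when the user asks not to be tracked;
        # WSGI exposes that header as HTTP_DNT.
        return environ.get("HTTP_DNT") != "1"

    def application(environ, start_response):
        if tracking_allowed(environ):
            pass  # only here would an analytics cookie or similar linkage be created
        start_response("200 OK", [("Content-Type", "text/plain")])
        return [b"ok\n"]

A comparable, as yet undefined, header value would be needed to signal the guardian relationship described in item 2.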

Solution

When to use this Solution

The user is on an internet-connected display device using a browser or other HTML display. This solution is based on a display to the user by a Relying Party (RP); it is expected that the same values could be used by other entities, such as an IdP. A PIER logo is expected to appear in the header of every page on a compliant site, in addition to the IDEF logo at the bottom of the page.

User Private Information Exposure Risk (PIER) Levels

These levels are created specifically to give the user feedback on the risk of exposure of their private information.

Pier = (1) Private Information Exposure Risk, (2) a structure (as a breakwater) extending into navigable water for use as a landing place or promenade or to protect or form a safe harbor

0=Anonymous

The user provides no information and the RP does not try to link the user by placing cookies or other tracking methods.

Connection identifiers may still be used; for example, an HTTPS connection always creates an identifier. Kerberos defines an anonymous connection in RFC 6112.
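
As a concrete illustration of that point, the following sketch (Python standard library only; the host name example.com is just a placeholder) prints the session identifier that an ordinary HTTPS handshake leaves behind even though no user data has been sent:

    import socket, ssl

    ctx = ssl.create_default_context()
    with socket.create_connection(("example.com", 443)) as sock:
        with ctx.wrap_socket(sock, server_hostname="example.com") as tls:
            session = tls.session  # populated by the handshake; may be None under TLS 1.3
            if session is not None:
                # The connection itself already carries a reusable identifier
                # that the server could correlate across visits.
                print(session.id.hex())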

1=Pseudonymous

The user provides or permits some identifier that can be reused over time to link a continued presence. The RP does not try to link the user to a human. The user implicitly intends to assert DO NOT TRACK.

IAL 1 ≃ LOA 1 or 2 if provided by the user. Note that the identifier may be supplied by the RP.

2=Weakly Authenticated

The user provides or permits an identifier that may be linked from one site to another under limited conditions permitted by the user's expressed intent.

IAL 1 ≃ LOA 1 or 2 if provided by the user. Note that the identifier may be supplied by the RP.

3=Strongly Authenticated

The RP has some specific requirement to know the identity of the user, such as the Know Your Customer (KYC) regulations for a bank.

IAL 2 ≃ LOA 3

4=User Identity Proofed in Person

The user has some legal obligation to be well-known to the system in order to be authorized to perform their duties. The user will have some physical authentication factor like a CAC or PIV card and some binding between the current interchange and the physical factor (see Channel Binding).

IAL 3 ≃ LOA 4
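
The levels above can also be carried directly in software so that an RP can compute and display its current PIER value next to the PIER logo. This is only a sketch, assuming Python's IntEnum is an acceptable carrier; the class, the member names and the badge helper are illustrative, not part of the PIER definition:

    from enum import IntEnum

    class PierLevel(IntEnum):
        # User Private Information Exposure Risk (PIER) levels as defined above.
        ANONYMOUS = 0               # no identifier, no tracking
        PSEUDONYMOUS = 1            # reusable identifier, not linked to a human
        WEAKLY_AUTHENTICATED = 2    # identifier may be linked between sites as permitted
        STRONGLY_AUTHENTICATED = 3  # RP must know the user's identity (e.g. KYC)
        IDENTITY_PROOFED = 4        # in-person proofing with a physical factor (CAC/PIV)

    def badge(level):
        # Illustrative text an RP might place in its page header next to the PIER logo.
        return "PIER {}: {}".format(int(level), level.name.replace("_", " ").title())

    print(badge(PierLevel.PSEUDONYMOUS))  # PIER 1: Pseudonymous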

Open Issues

Data exposed by a federated IdP may be in excess of what a user intends. The problem can be described in terms of OpenID Connect, a federated authentication protocol built on OAuth 2.0, which is an authorization protocol. The process is this: a user navigates to an RP; the RP bounces a request off the user to the IdP, which verifies the user's intent; the IdP replies through the user to the RP with an authorization token that allows the RP to request the authorized attributes from the IdP. The question is then whether the RP really needed all of the attributes requested and whether the user actually understood what attributes she agreed to release.
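
How much the user is asked to release is largely fixed by the scopes (and claims) the RP puts into its authorization request. Below is a minimal sketch of such a request, assuming the standard openid, profile and email scopes of OpenID Connect; the endpoint, client_id, redirect_uri and state values are illustrative placeholders:

    from urllib.parse import urlencode

    # Illustrative placeholders; a real RP would use its registered values.
    authorize_endpoint = "https://idp.example/authorize"
    params = {
        "response_type": "code",
        "client_id": "example-rp",
        "redirect_uri": "https://rp.example/callback",
        # Each scope is a bundle of attributes the user will be asked to release.
        # Dropping "email" or "profile" is how an RP practices data minimization.
        "scope": "openid profile email",
        "state": "af0ifjsldkj",  # anti-CSRF value, illustrative
    }
    print(authorize_endpoint + "?" + urlencode(params))

Whether every scope in that list is truly needed, and whether the consent screen makes the resulting attribute release clear to the user, is exactly the open issue described above.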

Do Not Track (DNT) has not been approved for release by the World Wide Web Consortium (W3C), even though it is widely implemented in common browsers. This HTTP header field can carry multiple values and could include an indication that the user is restricted in some way.

Assurance and Privacy are at odds with each other, and there are no easy solutions to that conflict. See the page Right to be Forgotten for more discussion.

References and Coordination

User Identity Management

  • The Best Practice and Example Relying Party document describes how a relying party can construct an IDEF compliant site.
  • The Identity Model page describes a new model which integrates the customer relationship management (CRM) and vendor relationship management (VRM) into a model that can be used by any system.

Engineering Risk for Private Information

There is a substantial literature on how an enterprise can manage the risk of exposing private information; two of those works, designed to help federal agencies apply privacy regulations, are listed here:

There is also an evolving effort around "Conduct Risk Assessment" at the UK FCA (Financial Conduct Authority) that has been getting a positive reception in the US and other countries. The following quote expresses this quite well.

The FCA's key aim in relation to Conduct Risk is to ensure that firms do the right thing for their customers whilst keeping them, and the integrity of the markets in which they operate, at the heart of everything that they do. Firms should seek to promote good behaviour across all aspects of their organisation and develop a Culture in which it is clear that there is no room for misconduct. Although "treating customers fairly" has long been part of the retail regulatory framework, Conduct Risk should not be seen as merely an extension of this.

External Definitions

  1. GDPR Definition: personal data means any information relating to an identified or identifiable natural person ('data subject'); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person.
  2. ISO 29100: personally identifiable information PII [is] any information that (a) can be used to identify the PII principal to whom such information relates, or (b) is or might be directly or indirectly linked to a PII principal. (To determine whether a PII principal is identifiable, account should be taken of all the means which can reasonably be used by the privacy stakeholder holding the data, or by any other party, to identify that natural person.)
  3. FHIR defines protection given to "Protected Health Information" (PHI) or electronic PHI (ePHI or EHI).

  1. Samuel D. Warren and Louis D. Brandeis, "The Right to Privacy", Harvard Law Review IV (5), pp. 193-220 (1890)
  2. Amy Gajda, Seek and Hide, ISBN 978-1984880741