Trust

From MgmtWiki

Full Title or Meme

Trust, simply put, is just the projection into the future of past behavior by a Subject (person or organization) that has a consistent Identity over time.

Author

Tom Jones 2018-07-25

Goals and Scope

The content on this page is intended to describe the tools to evaluate what can be trusted by an individual in a digital age.

There is also a section on bootstrapping trust from Attested hardware. More detail is on the Attested page.

The primary interaction of interest here is consumer to business (C2B), although comparisons will be made to business to business (B2B) and consumer to government (C2G).

Context

Creation of trust in a digital ecosystem is both very hard and very easy. It is very easy in the sense that we have great examples of trustworthy ecosystems in eBay and Amazon.[1] It is very hard in the sense that creating and maintaining such an ecosystem takes an awful lot of real, continuing effort that must be diligently and faithfully nurtured. Andrei Hagiu's theory can explain why platforms seem to steer consumers towards established products and sellers.[2] The major problems are Recovery and Redress, which see. Much of the current tumult about trust on the internet is created by legislatures and technologists trying to convince the world that a problem exists that they are uniquely capable of resolving. Caveat Emptor. The following looks at some of the open issues.
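The definition above, trust as the projection of past behavior into the future, can be sketched numerically. The Python fragment below (the feedback counts and the two subjects are hypothetical) estimates the chance that a Subject's next interaction goes well from their history, and suggests one reason platforms steer consumers toward established sellers:

```python
# A minimal sketch of trust as "projection of past behavior":
# estimate the probability that a subject's next transaction goes well
# from their feedback history, using a Laplace-smoothed success rate.
# The subjects and counts below are hypothetical.

def trust_score(positive: int, negative: int) -> float:
    """Smoothed estimate of the chance the next interaction is positive."""
    return (positive + 1) / (positive + negative + 2)

established = trust_score(positive=980, negative=20)  # long, mostly good history
newcomer = trust_score(positive=3, negative=0)        # short, spotless history

# A long consistent history dominates a short spotless one.
assert established > newcomer
print(f"established: {established:.3f}, newcomer: {newcomer:.3f}")
```

The smoothing keeps a brand-new Subject from scoring a perfect 1.0 on a handful of interactions; only a consistent Identity over time can earn a high score.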

There are two ways to approach the problem of trust in networked digital systems. Each approach creates its own distinct context.

Computer Engineering

The scientific approach looks for a set of laws that can be formulated and tested to provide the desired trust.

In B2B interactions, studies have shown an 86% correlation between level of assurance and trust.[3]

When this approach does not work in C2B interactions, computer technical people always seem to start by blaming the users and proposing to educate them. That approach never winds up meeting the goals.

Social Engineering

The social approach to trust is exploited every day by con men attacking our human weaknesses. Trustworthy computer systems alone cannot overcome this sort of attack.

Only a good user experience is available to a Web Site to start the process of gaining user trust. The site needs to look attractive and as though it were built by a solid company. The section below on Public Relations explores that aspect of trust. The other tool is branding, which is simply a means to link the current experience to the company's past behavior. In the long run only branding will work, provided that the Conduct Risk problem described below does not tarnish the brand's good name.

"Individual men may be moral in the sense that they are able to consider interests other than their own in determining problems of conduct, and are capable, on occasion, of preferring the advantages of others to their own . . . But all these achievements are more difficult, if not impossible, for human societies and social groups. In every human group there is less reason to guide and to check impulse, less capacity for self-transcendence, less ability to comprehend the need of others and therefore more unrestrained egoism than the individuals, who compose the groups, reveal in their personal relationships."[4] This problem is the source of much of the unpleasantness of the current internet experience.

Related Contexts

  • Federated Trust is a page that describes another way to address trust by the creation of a trusted sub-net where all members have been vetted by a mutually trusted third party.
  • Reliance is “the act of relying, or the condition or quality of being reliant; dependence; confidence; trust; repose of mind upon what is deemed sufficient support or authority.”[5] In this case trust that the actors will perform as expected.
  • Relying Party is a web site that trusts the assurances of some identifier providers or attribute verifiers in making its own trust decision to authorize some access by a user.
  • User Trust of a Web Site is a page that drills into the user experience at a web site that might engender trust.

Problems

The Psychological Problem

The philosopher Karl Popper defined the psychological problem as one of human effort to discover the truth of a statement.[6]

It is not possible to fully test any Framework until it has been fully specified and perhaps even fully implemented.

Therefore the selection of elements to be included in any new Framework, or of changes to an existing Framework, is dependent on the feelings of the decision maker. That is not to say that we lack stratagems to help make a good guess, but rather that we cannot really know the outcome of the implementation of the Framework in advance.

The Technological Problem

It is easy to set up a Web Site that promises to solve the Trust Problem. Several have done so. The challenge is that there is no particular reason to trust a web site just because it claims to provide trust. In general, if some entity needs to tell you about some attribute on the site, the validity of the assertion should immediately be suspect. Some examples are: (1) the Wells Fargo assertions of trustworthiness immediately after receiving a billion dollar fine,[7] or (2) the release of privacy data by LifeLock after claiming for years to help you protect your User Private Information.[8] Also, several new technology initiatives seem to claim to be trust anchors, when all they provide is a means to time-stamp a document (see the page on Distributed Identity for details). In general, be very wary of any claim made for a technological solution to a Trust problem.

The Philosophical Problem

What does it even mean to Trust that some outcome will occur? Part of the problem is that Trust is context dependent. I am content to trust my money to my banker, but not my brother-in-law. On the other hand, I am content to trust my children to my brother-in-law, but not my banker. In his discussion of logic, Karl Popper[9] determines that any logical proof of a statement of fact requires some specialized language in which statements can be made in a manner that does not allow for any ambiguity of meaning. On the internet we see an example of this sort of language in the X.509 Certificate chains that bind the name of a web site to a root certificate of known trustworthiness. A set of rules issued by the CA/Browser Forum[10] specifies the conditions under which a web browser can accept a web site as trustworthy. At the end of 2015, when Google discovered that Symantec was not following these rules, it dropped the offending root certificate from the list accepted by the Chrome browser.[11] The strict meaning of the rules was tested, and the rules won in this case. Before any other logically acceptable trust metric is instituted on the internet, a similarly strict definition of the context and the language will be needed.

The Ethical Conduct Problem

Ethics and trust are inextricably linked. We are interested in ethics in large part because we are concerned, even obsessed, with the question of who we can trust in a world where there is risk and uncertainty. In our relationships, we humans are much more concerned about assessing the trustworthiness of others than we are in trying to figure out how ethical they are. So what is trust and what is trustworthiness? Clearly putting our own well-being into the hands of others requires that all the parties to an interaction share the same traditions, which means sharing some sort of framework in which an ethical decision can be made.[12]

Our lives are embedded in human networks; to assess trust at a given place and time, we need to understand the context, or framework, that applies in that place and time. The Decision to Trust Model (DTM Model) was developed to help us make better decisions about discerning trustworthiness and even repairing trust.

Trustworthiness relates directly to ethics on two specific dimensions: integrity and benevolence. In brief: ‘‘A trustworthy party is one that will not unfairly exploit the vulnerabilities of the other party in the relationship.’’[13]

Trust relationships exist at many levels: between two people, among members of a team, between teams, within an organization, between workers and management, and even within an entire system, like the financial system or the air traffic control system. The further removed individuals are from the locus of the relationship, the more complicated it becomes to assess trustworthiness. For example, how do you judge the trustworthiness of a bank or a financial system that is holding your savings? We would like to use a combination of personal and impersonal cues. For example, if we were making a trust judgment about a doctor for surgery, we might assess not only the doctor but also the doctor's hospital.

Conduct Risk is a new field of auditing that is a response to the huge loss in value that companies like Arthur Andersen, Wells Fargo, or Equifax[14] suffered as a result of the manner in which they conducted their business.

Solutions

  • Trust is earned. Where Ethics are lacking, Trust cannot survive. They are reciprocal concepts,[15] as high-trust environments will encourage good ethics among participants, and good ethical values in on-line interchanges will build trust.
  • Wittgenstein writes: "Must I not begin to trust somewhere? … somewhere I must begin with non-doubting; and that is not, so to speak, hasty but excusable: it is part of judging."[16] We call that beginning the Framework, and further expect that it will also be subject to revision as it is criticized for some failure in the future. The best minds, like Claude Shannon, had a method of stripping away the inessentials to get to the core of a problem, and then solving that problem.[17] Trust is that core problem for building a Framework. The following solutions have been tried in the past.

Bootstrapping Trust from the User's Device

How can a User know when Web Site Security is sufficient for them to trust it? While that page discusses the issue from the perspective of what the web site can do, this topic is what a user can do to create a source of trust that they can use not only to show that they are trustworthy, but also to test the trustworthiness of the web sites that they visit. This all starts with a trusted User Agent running on a trusted User Device, a computer or cell phone that can express its own level of security. This is a problem that has been solved many times over the past decades. In 2012 Bryan Parno published an article titled "Trust Extension for Commodity Computers"[18] that described a technology solution based on the TPM chip, which had already been installed in 350 million computers. That article describes some of the challenges of deploying such a secure solution, which requires a protected execution environment such as could be obtained from a virtual machine. While that was prohibitively expensive at the time, current computer systems embed the modern equivalent of the TPM in every Intel or ARM computer chip, and Windows 10 laptop computers now run protected code in a virtual machine environment.[19] So the technology exists to bootstrap trust from the user's personal device.

The operational assumption is that the security of any trust statement is determined by the protection of, and access to, the key, which is only accessible in a Trusted Execution Environment. More about trusted hardware will be posted to that page; see the page Bootstrapping Identity and Consent for a protocol to use after the user device is known to be trustworthy, and the page Attested for more about hardware trust attestation.
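The attestation flow described above can be sketched in a few lines. This is a toy model, not the real TPM protocol: Python's standard library has no asymmetric crypto, so an HMAC key stands in for the asymmetric attestation key that a real TPM holds and never releases, and the measured boot state is reduced to a single hash. All names are illustrative.

```python
# Toy attestation flow: verifier sends a fresh nonce, the device signs it
# together with its measured state using a key that never leaves the TEE,
# and the verifier checks the result. HMAC stands in for the TPM's
# asymmetric quote signature; names and the key are hypothetical.
import hashlib
import hmac
import os

DEVICE_KEY = os.urandom(32)  # in reality, sealed inside the TPM/TEE

def device_quote(nonce: bytes, pcr_state: bytes) -> bytes:
    """Inside the TEE: sign the verifier's nonce plus the measured state."""
    return hmac.new(DEVICE_KEY, nonce + pcr_state, hashlib.sha256).digest()

def verifier_check(nonce: bytes, claimed_state: bytes, quote: bytes) -> bool:
    """Verifier side: recompute with the registered key, constant-time compare."""
    expected = hmac.new(DEVICE_KEY, nonce + claimed_state, hashlib.sha256).digest()
    return hmac.compare_digest(expected, quote)

nonce = os.urandom(16)  # freshness: a replayed old quote will not verify
state = hashlib.sha256(b"boot measurements").digest()
quote = device_quote(nonce, state)
assert verifier_check(nonce, state, quote)            # honest device passes
assert not verifier_check(nonce, b"tampered", quote)  # altered state fails
```

The nonce is what makes the quote a statement about the device now, rather than a recording of some earlier, possibly compromised, state.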

Chain of Trust

For certificates, and most particularly for CCITT X.509 Certificates, the trust of the certificate can chain back to a root of trust certificate. If a Site puts that root of trust certificate into their trust repository, then all of the certificates which chain off of it are also trusted to the extent listed in the certificate itself. The implication of this is that all self-signed certificates must be explicitly included in the trust repository if they are to be trusted.[20] The assumption behind all chains of trust is that every signing key, including the subject's, is well protected as described above.
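The chaining rule above can be illustrated with a toy model. HMAC stands in for real X.509 signatures, and all subject and issuer names are hypothetical; the point is only the structure: every link's signature must verify, and the chain must terminate at a root already present in the trust repository, which is why a self-signed certificate is rejected unless explicitly installed.

```python
# Toy model of certificate-chain validation. HMAC stands in for X.509
# signatures; names and keys are hypothetical.
import hashlib
import hmac

def sign(key: bytes, data: bytes) -> bytes:
    return hmac.new(key, data, hashlib.sha256).digest()

# Issuer signing keys (stand-ins for CA private keys).
keys = {"root-ca": b"root-key", "intermediate": b"int-key", "lonely": b"self-key"}

trust_repository = {"root-ca"}  # roots installed out of band

# Each certificate records its subject, its issuer, and the issuer's
# signature over the subject.
chain = [
    {"subject": "www.example.com", "issuer": "intermediate",
     "sig": sign(keys["intermediate"], b"www.example.com")},
    {"subject": "intermediate", "issuer": "root-ca",
     "sig": sign(keys["root-ca"], b"intermediate")},
]

self_signed = [
    {"subject": "lonely", "issuer": "lonely",
     "sig": sign(keys["lonely"], b"lonely")},
]

def chain_is_trusted(chain) -> bool:
    for cert in chain:
        expected = sign(keys[cert["issuer"]], cert["subject"].encode())
        if not hmac.compare_digest(expected, cert["sig"]):
            return False  # a signature in the chain does not verify
    # The chain must terminate at a root already in the trust repository.
    return chain[-1]["issuer"] in trust_repository

assert chain_is_trusted(chain)            # chains back to root-ca
assert not chain_is_trusted(self_signed)  # self-signed, not in repository
```

Adding "lonely" to the trust repository would make the self-signed chain pass, which is exactly the explicit step the text above requires.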

Transitive Trust

Since most of us do not have the capability to evaluate even a small fraction of the requests that we receive on a daily basis, we are compelled to trust things that are vouched for by others as trustworthy. As one example, Microsoft created a Trustworthy Computing initiative in 2002.[21] "Fundamental to that decision was the understanding that a company's greatest asset is customer trust." It sounds like they want us to trust them with our computing decisions. Even for the most trusted decision of all, what software to run in kernel mode on our personal computers, they don't seem able to perform without error.[22] Still, the only real decision to be made, if we want to use the modern conveniences of a digital lifestyle, is whether we could make a better decision than companies like Microsoft. It seems clear that most of us will need to find a collection of organizations that we, individually, decide are best capable of looking after our interests, and trust whom they tell us to trust.
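The "trust whom they tell us to trust" pattern is just reachability in a vouching graph: starting from the roots we chose personally, we extend trust to whatever those parties vouch for. A minimal sketch, with an entirely hypothetical graph:

```python
# Transitive trust as graph reachability: we trust everything reachable
# from our personally chosen roots by following "vouches for" edges.
# The graph below is hypothetical.
from collections import deque

vouches = {
    "me": ["os-vendor", "my-bank"],
    "os-vendor": ["driver-signer"],
    "my-bank": ["payment-network"],
}

def trusted_set(start: str) -> set:
    """Breadth-first walk of the vouching graph from a starting party."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for nxt in vouches.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

assert "driver-signer" in trusted_set("me")   # trusted only transitively
assert "random-site" not in trusted_set("me")
```

The weakness is visible in the model: one bad edge from a vouching organization silently extends trust to everything behind it, which is why the choice of roots matters so much.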

Root of Trust

In the above two cases there was a place where the user could start the process of trust, the Root of Trust. The well-known root of trust in the browser works for establishing a secure connection, and the EV Cert can build on that to provide more Assurance as to the physical reality of the Enterprise behind the Web Site. A Federation Service can provide a similar root of trust based on adherence to the principles of the Federation. The challenge is two-fold: (1) how to get Web Sites to prove Compliance with the principles, and (2) how to get Users to trust and look for the TrustMark of that Federation?

Complexity Trust

The proponents of Block-chain and Distributed Ledger technologies are making some fantastical assertions about how trust can be inferred from the complexity of calculations made by computers distributed all around the world, but mainly in China. Caveat Emptor.
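The "complexity" in question is proof of work: a hash puzzle that is expensive to solve but cheap to verify. A toy version, with a deliberately tiny difficulty (real chains use enormously larger ones), shows the asymmetry without endorsing the trust claims built on it:

```python
# Toy proof of work: find a nonce whose SHA-256 hash, combined with the
# data, falls below a target. Solving takes many hash attempts on average;
# verifying takes exactly one. Difficulty here is tiny for illustration.
import hashlib

def pow_hash(data: bytes, nonce: int) -> int:
    digest = hashlib.sha256(data + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big")

def solve(data: bytes, difficulty: int = 12) -> int:
    """Search for a nonce giving `difficulty` leading zero bits."""
    target = 2 ** (256 - difficulty)
    nonce = 0
    while pow_hash(data, nonce) >= target:
        nonce += 1
    return nonce

def verify(data: bytes, nonce: int, difficulty: int = 12) -> bool:
    """One hash suffices to check the claimed work."""
    return pow_hash(data, nonce) < 2 ** (256 - difficulty)

nonce = solve(b"block header")
assert verify(b"block header", nonce)
```

Note what the puzzle actually proves: that someone spent computation, nothing more. Whether that expense should translate into trust in the parties operating the hardware is exactly the assertion the text above questions.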

Public Relations

The most common response to any lack of trust is to hire a public relations expert to generate some for you with a warm and fuzzy User Experience. Consider the case of Purdue Pharma after its behavior promoting opioid dependency became public knowledge.[23] They began an advertising campaign to try to convince the world how much they really care about solving a problem that they created. Or consider how Facebook manipulates its User Experience to seem like a warm and friendly company.[24] The problem is that appearance matters, no matter how much we try to say it doesn't; but once trust is lost, it is a long slow road to build it back.

References

  1. John Herrman, Want to understand all that ails the modern internet? Look to eBay, its first megaplatform and the blueprint for everything that followed. (2018-06-24) The New York Times Magazine p. 14ff
  2. Andrei Hagiu +1, Platforms and the exploration of new products (2018-06-18) http://andreihagiu.com/wp-content/uploads/2018/06/Exploration-new-sellers-and-products-06142018.pdf
  3. Muneesh Kumar, Trust and Technology in B2B E-Commerce: Practices and Strategies for Assurance Google Books IGI ISBN 978-1613503539
  4. Reinhold Niebuhr, Moral Man and Immoral Society. Scribner’s, (1933)
  5. Merriam Webster 3rd International Dictionary
  6. Karl R. Popper, The Logic of Scientific Discovery English Edition (1959) ISBN 0-415-07892-X
  7. Donna Borak, Wells Fargo fined $1 billion for insurance and mortgage abuses. (2018-04-20) CNN https://money.cnn.com/2018/04/20/news/companies/wells-fargo-regulators-auto-lending-fine/index.html/
  8. Kim Zetter, Lifelock Once Again Failed at Its One Job: Protecting Data (2015-07-21) Wired https://www.wired.com/2015/07/lifelock-failed-one-job-protecting-data/
  9. Karl Popper, Conjectures and Refutations. (1963) Routledge chapter 9 ISBN 0-415-28594-1
  10. CAB Forum, Baseline Requirements https://cabforum.org/baseline-requirements-documents/
  11. Larry Loeb, Google No Longer Trusts Symantec’s Root Certificate. https://securityintelligence.com/news/google-no-longer-trusts-symantecs-root-certificate/
  12. Trust http://www.ethicalsystems.org/content/trust
  13. Banerjee, Bowie and Pavone An Ethical Analysis of the Trust Relationship page 308 in Bachmann and Zaheer eds. Handbook of Trust Research
  14. GAO DATA PROTECTION, Actions Taken by Equifax and Federal Agencies in Response to the 2017 Breach. (2018-08) https://www.gao.gov/assets/700/694158.pdf
  15. Linda Fisher Thornton Ethics and Trust are Reciprocal https://leadingincontext.com/2014/06/18/ethics-and-trust/
  16. Ludwig Wittgenstein On Certainty. p. 150 (1972-09-06) ISBN 0061316865
  17. Sat Rana, Claude Shannon: How a Genius Solves Problems. https://medium.com/personal-growth/claude-shannon-how-a-real-genius-solves-problems-15b434aeb2b6
  18. Bryan Parno, Trust Extension for Commodity Computers CACM 55 No 6 (2012-06) p 76ff
  19. Microsoft. Windows Defender Application Guard overview (2018-07-09) https://docs.microsoft.com/en-us/windows/security/threat-protection/windows-defender-application-guard/wd-app-guard-overview
  20. Microsoft. Certificates [in Widows]. https://technet.microsoft.com/en-us/library/cc700805.aspx
  21. Scott Charney https://news.microsoft.com/2012/01/12/at-10-year-milestone-microsofts-trustworthy-computing-initiative-more-important-than-ever/
  22. CEO (unnamed) of Paramount Defenses, Alarming! : Windows Update Automatically Downloaded and Installed an Untrusted Self-Signed Kernel-mode Lenovo Driver on New Surface Device http://www.cyber-security-blog.com/2018/06/windows-update-installed-an-untrusted-lenovo-driver-on-a-microsoft-surface-device.html?m=1
  23. Barry Meier, Origins of an Epidemic: Purdue Pharma Knew Its Opioids Were Widely Abused (2018-05-29) New York Times https://www.nytimes.com/2018/05/29/health/purdue-opioids-oxycontin.html
  24. Evan Selinger, Prof. Philosophy at RIT, Facebook Fabricates Trust Through Fake Intimacy. (2018-06-04) https://medium.com/s/trustissues/facebook-fabricates-trust-through-fake-intimacy-b381e60d32f9