Trust

Full Title or Meme

Trust, simply put, is just the projection into the future of past behavior by a Subject (person or organization) that has a consistent Identity over time.

Author

Tom Jones 2018-07-25 updated 2024-10-22

Goals and Scope

  • Humans depend on cooperative action to meet their goals, and trust is needed to make cooperative action work.
  • The goal of trust in the digital world is for an action to be authorized or completed. If two parties do not trust each other, they may need a third party to mediate the decision process.
  • The content on this page is intended to describe the tools to evaluate what can be trusted by an individual in a digital age.
  • There is also a section on bootstrapping trust from Attested hardware. More detail is on the Attested page.
  • The primary interaction of interest here is consumer to business (C2B); although a comparison will be made to business to business (B2B) and consumer to government (C2G).
  • Trust is always contextual. I might pay a water bill of $100 without thinking about trusting the water company to accurately meter my water use. On the other hand a bill for $1000 would cause me to question the source of the water usage attributed to me.

Traditional Definitions for Trust

Trust, confidence, reliance, dependence, faith come into comparison when they mean the fact of feeling sure, or the state of mind of one who feels sure, that a person or thing will not fail him.[1]

  • Trust implies an absolute and assured resting on that which is its object: it often suggests a basis upon other (not necessarily weaker) grounds than experience or sensible proofs. It is the most frequent term in religious use (as, "O God...in thee is my trust"-Psalms cxli. 8) but it occurs also in secular use, especially when an intimate knowledge of, or a deep affection for, the person or persons of whom one is assured is implied (as, "He was a gentleman on whom I built an absolute trust"-Shak.) or when there has been no cause for changing an instinctive or intuitive judgment respecting a person's or thing's reliability (as, he has always gained the trust of his associates; he takes every word of his father on trust).
  • Confidence may or may not imply definite grounds for one's assurances, such as the support of experience or of convincing evidence: when it does, it carries less suggestion of emotional factors than trust and a stronger implication of an assurance based upon the evidence of one's senses; as, "those in whom we had no confidence, and who reposed no confidence in us" (Burke); "the note of easy confidence that our London had become what Rome had been, the Capital city" (Quiller-Couch). When it does not imply such grounds, it usually suggests even less reliable grounds for that feeling than does trust; as, “But several expressions in Darwin’s writings leave us in no doubt that he shared the confidence in progress which, arising from very unscientific sources, dominated the minds of his generation.”
  • Reliance implies not only an attitude or feeling, but also an objective expression of it as in an act or action; as, he had such reliance on the doctor's skill that he allowed himself to be operated upon at once; "His diffidence had prevented his depending on his own judgment in so anxious a case, but his reliance on mine made everything easy" (Austen); "Mark had written out his Christmas sermon with a good deal of care and an excessive reliance on what preachers had said before him" (C. Mackenzie).
  • Dependence differs from reliance chiefly in suggesting greater subordination of self; as, her dependence on her husband must sometimes be an annoyance to him; “affectionate dependence on the Creator” (T. Erskine).
  • Faith (as here compared: using the most common sense of the word "belief") implies confidence, but it often suggests a degree of credulity or the acceptance of something capable of being proved without considering the evidence or seeking the facts: it, therefore, is often used when the person or thing in which one has faith is open to question or suspicion; as, he has great faith in a popular patent medicine; he takes everything the newspapers say on faith; it was a long time before he lost faith in the doctor’s powers.

Terminology

For consistency with related NIST reports, this page uses the following definitions for trust-related terms:

  • Trust: “The confidence one Entity has in another that the second Entity will behave as expected.”
  • Trusted: An Entity that another Entity relies upon to fulfill critical requirements on its behalf.
  • Trusted boot: A system boot where aspects of the hardware and firmware are measured and compared against known good values to verify their integrity and thus their trustworthiness (a minimal sketch of this comparison follows this list).
  • Trustworthy: Worthy of being trusted to fulfill whatever critical requirements may be needed.[2]
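
The "Trusted boot" entry above can be illustrated with a minimal sketch, assuming made-up component images and golden values. A real implementation would extend and read measurements from a TPM's platform configuration registers rather than hashing byte strings directly, but the comparison against known good values is the same idea.

import hashlib

# Hypothetical golden (known good) SHA-256 values for each boot component.
GOLDEN = {
    "firmware": hashlib.sha256(b"firmware v1.2 image").hexdigest(),
    "bootloader": hashlib.sha256(b"bootloader v3.4 image").hexdigest(),
}

def measure(image: bytes) -> str:
    """Stand-in for a TPM measurement: hash the component image."""
    return hashlib.sha256(image).hexdigest()

def verify_boot(images: dict[str, bytes]) -> bool:
    """A boot is trusted only if every measured component matches its golden value."""
    return all(measure(img) == GOLDEN.get(name) for name, img in images.items())

# The unmodified images verify; a tampered bootloader does not.
print(verify_boot({"firmware": b"firmware v1.2 image",
                   "bootloader": b"bootloader v3.4 image"}))   # True
print(verify_boot({"firmware": b"firmware v1.2 image",
                   "bootloader": b"tampered image"}))          # False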

Starts at the Top

Davos is a bubble. The World Economic Forum’s annual meeting in Switzerland in January is routinely reviled as elitist. But the obvious criticisms aside, the gathering serves an important purpose. It’s useful to have the world’s top CEOs, politicians, and representatives from civil society in the same space at the start of a new year debating ideas.

The theme for the 2024 conference was “Rebuilding Trust” — an acknowledgment that the world is increasingly fragmented and public faith in institutions is low. The biggest topics of discussion last week centered on conflict, elections, climate change, and how new technologies like artificial intelligence might impact each of those things.

Artificial Intelligence cropped up in almost every discussion. I moderated an interesting panel about the impact of technology on elections in 2024, which, as you know, is the subject of our latest print issue, “The Year the World Votes.” My guests included Jan Lipavský, the Czech Republic’s foreign minister; Smriti Zubin Irani, India’s minister of women and child development; André Kudelski, the CEO of the Swiss-based Kudelski Group; Alexandra Reeve Givens, the CEO of the Center for Democracy & Technology; and Matthew Prince, the co-founder and CEO of CloudFlare. Watch the full discussion here: I promise it’ll make you a little bit smarter about how the public and private sectors can get together to handle deepfake videos and troll farms in this crucial year for democracy.

I also had a fascinating one-on-one conversation on stage with Kyriakos Mitsotakis, the prime minister of Greece. As incumbent leaders the world over struggle in the polls, Mitsotakis returned to power in 2023 with a resounding victory. How did he do it? What lessons can centrist leaders elsewhere learn? You can watch our full discussion here.[3]

Context

  • We generally trust the water that comes out of the tap. From time to time that trust is broken, but generally we trust that we can get a glass of tap water and drink it. We do not run any sort of analysis as to the safety of the water in that glass. The same is not true of the computer connected to the internet. If we are not careful of every message that appears on the screen before we click it, our computer could be rendered into a villain living in our own house, or our bank account could be depleted. Why is that tolerated? The water company tests the water that they give us, but the internet service provider is somehow absolved from responsibility.
  • Trust exists in the space between hope and fear. We hope for the best, but fear more that the worst will happen. Unfortunately the overriding challenge is lack of attention. The internet would be paralyzed if every transaction had to be fully examined. But this constant attention seems to be what Web Site developers expect of their customers. You can hear that expectation whenever you hear someone asking for more "education of the consumers." Somehow there needs to be an easier way to create a trusted ecosystem for the internet.
  • Creation of trust in a digital ecosystem is both very hard and very easy. It is very easy in the sense that we have great examples of trustworthy ecosystems in eBay and Amazon.[4] It is very hard in the sense that creating and maintaining such an ecosystem is an awful lot of real continuing effort that must be diligently and faithfully nurtured. Andrei Hagiu's theory can explain why platforms seem to steer consumers towards established products and sellers.[5] The major problems are Recovery and Redress, which are further described on those wiki pages. All of the current tumult about trust on the internet is created by a bunch of legislators and technologists who are trying to convince the world that a problem exists that they are uniquely capable of resolving. Caveat Emptor. The following looks at some of the open issues.
  • Cheat me once, shame on you. Cheat me twice, shame on me.
  • Those who do not trust are not trusted.

There are two ways to approach the problem of trust in networked digital systems. Each approach creates its own distinct context.

Computer Engineering

The scientific approach looks for a set of laws that can be formulated and tested to provide the desired trust.

In B2B interactions, studies have shown an 86% correlation between level of assurance and trust.[6]

When this approach does not work in C2B interactions, computer technical people always seem to start by blaming the users and proposing more user education. That approach never winds up meeting the goals.

Social Engineering

The social approach to trust is exploited by con men attacking our human weaknesses every day. Creating trustworthy computer systems cannot, by itself, overcome this sort of attack. One good example

Only a good user experience is available to a Web Site to start the process of gaining user trust. The site needs to look attractive and as though it were built by a solid company. The section below on Public Relations explores that aspect of trust. The other is branding, which is simply a means to link the current experience to the company's past behavior. In the long run only branding will work, provided that the Conduct Risk problem described below does not tarnish the brand's good name.

Individual men may be moral in the sense that they are able to consider interests other than their own in determining problems of conduct, and are capable, on occasion, of preferring the advantages of others to their own . . . But all these achievements are more difficult, if not impossible, for human societies and social groups. In every human group there is less reason to guide and to check impulse, less capacity for self transcendence, less ability to comprehend the need of others and therefore more unrestrained egoism than the individuals, who compose the groups, reveal in their personal relationships.[7] This problem is the source of much of the unpleasantness of the current internet experience.

Related Contexts

  • Federated Trust is a page that describes another way to address trust by the creation of a trusted sub-net where all members have been vetted by a mutually trusted third party.
  • Reliance is “the act of relying, or the condition or quality of being reliant; dependence; confidence; trust; repose of mind upon what is deemed sufficient support or authority.”[8] In this case trust that the actors will perform as expected.
  • Relying Party is a web site that trusts the assurances of some identifier providers or attribute verifiers in making its own trust decision to authorize some access by a user.
  • User Trust of a Web Site is a page that drills into the user experience at a web site that might engender trust.
  • A Trust Service is some Web Site like a Framework Membership Validation service that can report on details of some other Entity or document.

Problems

The Psychological Problem

The philosopher Karl Popper defined the psychological problem as one of human effort to discover the truth of a statement.[9] The biggest challenge with that on the internet is that most users do not have the analytical skills to determine whether a headline (which is all they typically view) is true or false according to Professor David Rand.[10]

It is not possible to fully test any Framework until it has been fully specified and perhaps even fully implemented. Therefore the selection of elements to be included in any new Framework, or changes to an existing Framework, is dependent on the feelings of the decision maker. That’s not to say that we lack stratagems to help make a good guess, but rather that we cannot really know the outcome of the implementation of the Framework in advance.

The Technological Problem

It is easy to set up a Web Site that promises to solve the Trust Problem. Several have done so. The challenge is that there is no particular reason to trust a web site just because they claim to provide trust. In general, if some entity needs to tell you about some attribute on the site, the validity of the assertion should immediately be suspect. Some examples are: (1) the Wells Fargo assertions of trustworthiness immediately after receiving a billion dollar fine,[11] or (2) the release of privacy data by LifeLock after claiming for years to help you protect your User Private Information.[12] Also several new technology initiatives seem to claim to be trust anchors, when all they provide is a means to time-stamp a document. (See the page on Distributed Identity for details.) In general be very wary of any claim made for a technological solution to a Trust problem.

The Philosophical Problem

What does it even mean to Trust that some outcome will occur? Part of the problem is that Trust is context dependent. I am content to trust my money to my banker, but not my brother-in-law. On the other hand I am content to trust my children to my brother-in-law, but not my banker. In his discussion on logic, Karl Popper[13] determines that any logical proof of a statement of fact requires some specialized language where the statements can be made in a manner that does not allow for any ambiguity of meaning. On the internet we see an example of this sort of language in the X.509 Certificate chains that bind the name of a web site to a root certificate of known trustworthiness. There is a set of rules issued by the CA/Browser (CAB) Forum[14] that specify the conditions under which a web browser can accept a web site as trustworthy. At the end of 2015, when Google discovered that Symantec was not following these rules, they dropped the offending root certificate from the list accepted by the Chrome browser.[15] The strict meaning of the rules was tested, and the rules won in this case. Before any other logically acceptable trust metric is instituted on the internet, a similar strict definition of the context and the language will be needed.

The Ethical Conduct Problem

Ethics and trust are inextricably linked. We are interested in ethics in large part because we are concerned, even obsessed, with the question of whom we can trust in a world where there is risk and uncertainty. In our relationships, we humans are much more concerned about assessing the trustworthiness of others than we are in trying to figure out how ethical they are. So what is trust and what is trustworthiness? Clearly putting our own well-being into the hands of others requires that all the parties to an interaction share the same traditions, which means sharing some sort of framework in which an ethical decision can be made.[16]

Our lives are embedded in human networks; to assess trust at a given place and time, we need to understand the context, or the framework, that applies in that place and time. The Decision to Trust Model (DTM) was developed to help us make better decisions about discerning trustworthiness and even repairing trust.

Trustworthiness relates directly to ethics on two specific dimensions: integrity and benevolence. In brief: "A trustworthy party is one that will not unfairly exploit the vulnerabilities of the other party in the relationship."[17]

Trust relationships exist at many levels: between two people, among members of a team, between teams, within an organization, between workers and management, and even within an entire system, like the financial system or the air traffic control system. The further removed individuals are from the locus of the relationship, the more complicated it becomes to assess trustworthiness. For example, how do you judge the trustworthiness of a bank or a financial system that is saving your money? We would like to use a combination of personal and impersonal cues. For example, if we were making a trust judgment about a doctor for surgery, we might assess not only the doctor but also the doctor's hospital.

Conduct Risk is a new field of auditing that is a response to the huge loss in value that companies like Arthur Andersen, Wells Fargo, or Equifax[18] suffered as a result of the manner in which they conducted their business.

The Internet Problem

Quoting Timothy Ruff: sign everything, trust nothing. The point is that a signature from an unknown signer is pointless. Even when the signer has a cert, that still doesn't get to the core problem. In 2024-10 Apple proposed that all certs expire in 45 days, which at least gets rid of bogus certs a little faster - but it does NOT supply the core missing piece. Just claiming a trust model does not build trust. Only experience builds trust. Somehow the EU thinks it can legislate that "A qualified electronic signature with an eIDAS 2.0 qualified certificate carries the same weight as a handwritten signature in court." We have experience from the Netherlands (DigiNotar) to show that approach is certain to fail to deliver either trust or justice.
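
A minimal sketch of the "sign everything, trust nothing" point, using the Python cryptography package and hypothetical names: the signature below verifies perfectly, yet that fact alone tells the verifier nothing unless the signing key is already anchored in a set of signers it has learned, through experience, to trust.

from cryptography.hazmat.primitives.asymmetric import ed25519
from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat
from cryptography.exceptions import InvalidSignature

message = b"please wire $10,000 to account 123"
signer = ed25519.Ed25519PrivateKey.generate()   # an unknown party's key
public_key = signer.public_key()
signature = signer.sign(message)

def signature_is_valid(pub, msg: bytes, sig: bytes) -> bool:
    """Check only the cryptographic validity of the signature."""
    try:
        pub.verify(sig, msg)
        return True
    except InvalidSignature:
        return False

# Hypothetical registry of keys we have learned to trust through experience.
TRUSTED_SIGNERS: set[bytes] = set()

raw = public_key.public_bytes(Encoding.Raw, PublicFormat.Raw)
print(signature_is_valid(public_key, message, signature))  # True: the math checks out
print(raw in TRUSTED_SIGNERS)                              # False: still no reason to act on it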

The Business Problem

There was a time after WWII when confidence and trust seemed to be at a high point. Business was good, Labor had a say in their working conditions, and most of the population had a rosy view of the future. As explained in a companion page on the Common Good, that all started to fall apart, first with the Korean and Vietnam wars, and then with the age of business take-overs. Now both business groups and identity groups have all decided to look out just for their own interests. Given a dog-eat-dog ecosystem, is it any wonder that people have little trust in businesses that push technology that reduces the need for people as though it were a good for society? There is some indication that companies, like GE and Boeing, with too great a concern for the bottom line, now realize that short term profit is not the way to long term sustainability. Even the DealBook of the New York Times[19] tells us
There's an old saying, "Things move at the speed of Trust". ... How could you possibly trust something on a platform if you don't even know if it's true? If there is one singular issue that defines the intersection of business and policy at this moment, it is a deepening trust deficit. ... The words "Trust" and "responsibility" were peppered throughout. ... [Brian Chesky] thinks many of us in this [technology] industry over the last 10 years are going from a hands-off model, where the internet is an immune system, to realizing that's not really enough; that we have to take more responsibility for the stuff on our platform.

The Friedman doctrine, also called shareholder theory or stockholder theory, is a normative theory of business ethics advanced by economist Milton Friedman which holds that a firm's sole responsibility is to its shareholders.[20] But on 2019-08-19 the Business Roundtable redefined the Purpose of a Corporation to promote "An Economy That Serves All Americans"[21] in an updated Statement that moves away from Shareholder Primacy to include Commitments to All Stakeholders.

Trusting the Machine

The device of most interest to the human user of any computer system is likely to be the smartphone in their pocket. The next section will consider how this device fits into a network Ecosystem. It can be a challenge for the user to separate the actions of the Smartphone and its operating system from those of the apps running on that phone, in part because the phone is shipped with a variety of apps from: (1) the manufacturer of the phone, (2) the operating system on the phone and (3) the telco that sold the phone to the user. While in some cases the function performed by the device is mechanical, such as taking a picture and displaying it back to the user, the phone has become an Artificial Intelligence in its own right. In a 2022 article Sue Halpern[22] describes the process that enables US military warfighters and commanders to trust that a device in the field will perform as expected by the people that sent that soldier into combat. SoarTech has been working with DoD agencies to create a trust model of an AI in the hands of a warfighter to understand how the warfighter develops and describes trust in the devices that they employ.

Above and beyond the trust given to the phone by the user, any Relying Party that acquires information from the user needs to be able to trust the device in the user's hands. Several issues arise there, from the interpretation of the user's intent into bits on the wire to the details about how the user Credentials on the phone are protected from compromise. But the most important information to the Relying Party is the identification of the user of the phone. There are levels of Assurance that range from identification of the owner of the phone to the actual presence of the user holding the phone at the time the transaction is made, also known as liveness proofing. Inferring all of this Trust from bits on the wire is the subject of much of this wiki.

Trusting the Ecosystem

Humans are predisposed to trusting the natural ecosystem where they live. They expect the sun to rise every day and are not disappointed, even though the very idea of Induction of truths from facts was disparaged by philosophers starting from David Hume.

Trusting the Technology

Cathy O'Neil worked in finance and then in tech, where she found that "people didn't really understand that the algorithms weren't predicting but classifying ... and that this wasn't a math problem but a political problem. A trust problem."[23] She noted that every algorithm is optimized for a particular notion of success and is trained on historical data to recognize patterns. That is, they were judging people based on the success or failure of others who shared some characteristics with them. O'Neil had this to say about an attempt to use recidivism statistics to predict the behavior of people who were up for parole: "We are not really ever defining success for the prison system. We are simply predicting that we will continue to profile such people in the future because that's what we've done in the past. It's very sad and, unfortunately, speaks to the fact that we have a history of shifting responsibility for society's scourges to the victims of those scourges." More examples can be found on the page on Artificial Intelligence.

Solutions

  • Trust is earned. Where Ethics are lacking Trust cannot survive. They are reciprocal concepts[24] as high trust environments will encourage good ethics of participants and good ethical values in on-line interchanges will build trust.
  • Wittgenstein writes: "Must I not begin to trust somewhere? … somewhere I must begin with non-doubting; and that is not, so to speak, hasty but excusable: it is part of judging."[25] We call that beginning the Framework and further expect that it will also be subject to revision as it is criticized for some failure in the future. The best minds, like Claude Shannon, had a method of stripping away the inessentials to get to the core of the problem, and then solving that problem.[26] Trust is that core problem for building a Framework. The following solutions have been tried in the past.

Bootstrapping Trust from the User's Device

How can a User know when Web Site Security is sufficient for them to trust? While that wiki page discusses the issue from the perspective of what the web site can do, this page addresses what a user can do to create a source of trust that they can use not only to show that they are trustworthy, but also to test the trustworthiness of the web sites that they visit. This all starts with a trusted User Agent running on a trusted User Device, a computer or Smart Phone that can express its own level of security. This is a problem that has been solved many times over the past decades. In 2012 Bryan Parno published an article titled "Trust Extension for Commodity Computers"[27] that described a technology solution based on the TPM chip that had already been installed in 350 million computers. This page describes some of the challenges of deploying such a secure solution as requiring some protected execution environment, as could be obtained from a virtual machine environment. While that was prohibitively expensive at the time, current computer systems embed the modern equivalent of the TPM in every Intel or ARM computer chip, and Windows 10 laptop computers now run protected code in a virtual machine environment.[28] So the technology exists to bootstrap trust from the user's personal device.

The operational assumption is that the security of any trust statement is determined by the protection and access to the key, which is only accessible in a Trusted Execution Environment. More about trusted hardware will be posted to that page and the page Bootstrapping Identity and Consent for a protocol to use after the user device is known to be trustworthy, and the page Attested for more about hardware trust attestation.
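
A minimal sketch of that operational assumption, with hypothetical names: a key held by the device signs a relying party's freshness challenge together with a claim about device state, and the relying party verifies it against the public key enrolled for that device. In real hardware the private key would live inside the TPM or Trusted Execution Environment and the claim would come from an attestation protocol; here a software key stands in for both.

import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.exceptions import InvalidSignature

# Stand-in for a key generated inside a TPM/TEE; in real hardware the private
# key would never be visible to ordinary software.
device_key = ec.generate_private_key(ec.SECP256R1())
registered_public_key = device_key.public_key()  # enrolled with the relying party

def sign_statement(nonce: bytes, claim: bytes) -> bytes:
    """Device side: sign the relying party's nonce plus a claim about device state."""
    return device_key.sign(nonce + claim, ec.ECDSA(hashes.SHA256()))

def verify_statement(nonce: bytes, claim: bytes, signature: bytes) -> bool:
    """Relying party side: accept the claim only if the signature checks out."""
    try:
        registered_public_key.verify(signature, nonce + claim, ec.ECDSA(hashes.SHA256()))
        return True
    except InvalidSignature:
        return False

nonce = os.urandom(16)                      # freshness challenge from the relying party
claim = b"secure-boot=on;os-version=10.0"   # hypothetical device-state claim
sig = sign_statement(nonce, claim)
print(verify_statement(nonce, claim, sig))  # True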

Chain of Trust

For certificates, and most particularly for CCITT X.509 Certificates, the trust of the certificate can chain back to a root of trust certificate. If a Site puts that root of trust certificate into its trust repository, then all of the certificates which chain off of it are also trusted to the extent listed in the certificate itself. The implication of this is that all self-signed certificates must be explicitly included in the trust repository if they are to be trusted.[29] The assumption behind all chains of trust is that every signing key, including the subject's, is well protected as described above.
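
A minimal sketch of walking such a chain with the Python cryptography package, assuming RSA-signed certificates that have already been parsed: each certificate must be signed by the one above it, and the top of the chain must be signed by some certificate in the site's trust repository. A real validator would also check validity periods, revocation, name constraints, and key usage, all of which this sketch omits.

from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import padding

def _signed_by(cert: x509.Certificate, issuer: x509.Certificate) -> bool:
    """True if `cert` carries a valid signature made by `issuer` (RSA assumed)."""
    if cert.issuer != issuer.subject:
        return False
    try:
        issuer.public_key().verify(cert.signature, cert.tbs_certificate_bytes,
                                   padding.PKCS1v15(), cert.signature_hash_algorithm)
        return True
    except Exception:
        return False

def chain_is_trusted(chain: list[x509.Certificate],
                     trust_repository: list[x509.Certificate]) -> bool:
    """chain is ordered leaf first; trust_repository holds the roots the site trusts."""
    # Every certificate must be signed by the one above it in the chain.
    for cert, issuer in zip(chain, chain[1:]):
        if not _signed_by(cert, issuer):
            return False
    # The top of the chain must be signed by something in the trust repository.
    return any(_signed_by(chain[-1], root) for root in trust_repository)

Note that a self-signed certificate passes this check only if that certificate has itself been placed in the trust repository, matching the rule stated above.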

Transitive Trust

Since most of us do not have the capability to evaluate even a small fraction of the requests that we receive on a daily basis, we are compelled to trust things that are vouched for by others as trustworthy. As one example, Microsoft created a trustworthy computing initiative in 2002.[30] "Fundamental to that decision was the understanding that a company’s greatest asset is customer trust." It sounds like they want us to trust them with our computing decisions. Even for the most trusted decision of all, what software to run in kernel mode on our personal computers, they don't seem to be able to perform without error.[31] Still, the only possible decision to be made, if we want to use the modern conveniences of a digital lifestyle, is whether we could make a better decision than companies like Microsoft. It seems clear that most of us will need to find a collection of organizations that we, individually, decide are best capable of looking after our interests and trust whom they tell us to trust.
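
Transitive trust can be sketched as reachability in a directed graph of who vouches for whom, with a hop limit because confidence usually decays with each introduction. The names and the hop limit below are illustrative only.

from collections import deque

# Hypothetical "vouches for" edges: A -> B means A asserts that B is trustworthy.
VOUCHES = {
    "me":            ["my bank", "Microsoft"],
    "my bank":       ["payment processor"],
    "Microsoft":     ["driver vendor"],
    "driver vendor": ["obscure startup"],
}

def trusts(start: str, target: str, max_hops: int = 2) -> bool:
    """Breadth-first search over the vouching graph, limited to max_hops introductions."""
    queue = deque([(start, 0)])
    seen = {start}
    while queue:
        node, hops = queue.popleft()
        if node == target:
            return True
        if hops == max_hops:
            continue
        for nxt in VOUCHES.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, hops + 1))
    return False

print(trusts("me", "payment processor"))  # True: one introduction away
print(trusts("me", "obscure startup"))    # False: too many hops for comfort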

Root of Trust

Cryptographic Trust

The above solutions to Trust rely on creation of a central point of Trust and a protocol to communicate that trust to other participants in the network. In 1982 explorations began using cryptographic solutions to the lack of a central source of trust. The example used was a distributed group of generals who needed to decide on a plan of action while not trusting that the other generals would not renege on the plan for their own benefit.[32] Block Chain solutions with a distributed ledger can be used to provide a distributed source of trust by ensuring that a majority of participants agree on the current state of facts and by randomizing which participant gets to assert the current state, as used in Bitcoin to prevent double spending of the coins. Bitcoin used a computationally expensive process which allows the random selection of the participant with the best solution. All of the other participants could validate that first solution to decide if they agreed that it was a valid source of trust. Now there are proposed solutions that are less expensive than proof-of-work.[33] Any well constructed distributed ledger system can be used to establish trust where no one central authority can be trusted.
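
A minimal sketch of the proof-of-work idea: a participant searches for a nonce that drives the block hash below a difficulty target, and every other participant can validate that solution with a single hash. The difficulty here is deliberately tiny so the example runs quickly; real networks require vastly more work.

import hashlib

DIFFICULTY = 16  # number of leading zero bits required; real networks use far more

def block_hash(data: bytes, nonce: int) -> int:
    return int.from_bytes(hashlib.sha256(data + nonce.to_bytes(8, "big")).digest(), "big")

def mine(data: bytes) -> int:
    """Expensive: try nonces until the hash has DIFFICULTY leading zero bits."""
    target = 1 << (256 - DIFFICULTY)
    nonce = 0
    while block_hash(data, nonce) >= target:
        nonce += 1
    return nonce

def verify(data: bytes, nonce: int) -> bool:
    """Cheap: anyone can check the claimed solution with one hash."""
    return block_hash(data, nonce) < (1 << (256 - DIFFICULTY))

block = b"Alice pays Bob 5 coins"
nonce = mine(block)                 # hard to produce ...
print(nonce, verify(block, nonce))  # ... easy to check: True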

Sovereign Trust

Governments have the advantage of being immune to any suit where they have not agreed to accept responsibility. That solves one of the major issues with liability that other Identifier or Attribute Providers face: crippling law suits. On the other hand, they are also able to write laws that allow them to ignore individuals' privacy rights in the interests of "national security", whatever it is that they define that to be. Just looking at government issued fiat currency we can see that trust can have a significant impact on the value of the currency, and conversely that the value of the currency can indicate the trust that individual investors have in that currency. It would be nice if there were some metric of trust in identifier credentials like there is for currency, but that is harder to measure. In some cases, like national health insurance, there is no choice but to use the government issued identifier credentials. Fortunately most national governments prevent the use of those credentials for commercial purposes. The eID cards issued in many countries are overcoming that reluctance, possibly provoking severe allergic reactions from populaces that have had bad experiences with such credentials.

See the wiki page Self-Sovereign Identity for a discussion on how to give users sovereignty over their Identifiers.

Public Relations

The most common response to any lack of trust is to hire a public relations expert to generate some for you with a warm and fuzzy User Experience. Consider the case of Purdue Pharma after their behavior promoting opioid dependency became public knowledge.[34] They began an advertising campaign to try to convince the world how they really care about solving a problem that they created. Or how Facebook manipulates their User Experience to make them seem like a warm and friendly company.[35] The problem is that appearance matters, no matter how much we try to say it doesn't, but once trust is lost, it is a long slow road to build it back.

Zero Trust

If it is so hard to establish trust, why not just try to run society where every transaction begins with zero trust and acquires sufficient trust to complete the transaction?

This is the notion of a Zero Trust Architecture where we drop the very human experience of trusting everyone in our village because everyone knows everyone else.
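
A minimal sketch of that per-request posture, with entirely hypothetical signals and weights: no request is trusted by default, and each one must present enough verified signals (identity, device posture, context) to clear a threshold set by the sensitivity of the resource.

from dataclasses import dataclass

@dataclass
class Request:
    user_authenticated: bool   # e.g. fresh multi-factor authentication
    device_attested: bool      # e.g. device passed a boot measurement like the sketch above
    network_expected: bool     # e.g. request comes from a usual location
    resource_sensitivity: int  # 1 (low) .. 3 (high), set by the resource owner

def authorize(req: Request) -> bool:
    """Start from zero trust and add points only for verified signals."""
    score = 0
    score += 2 if req.user_authenticated else 0
    score += 2 if req.device_attested else 0
    score += 1 if req.network_expected else 0
    return score >= 1 + req.resource_sensitivity  # higher sensitivity demands more signals

print(authorize(Request(True, True, False, 3)))   # True: strong identity plus attested device
print(authorize(Request(True, False, True, 3)))   # False: not enough for a sensitive resource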

Digital Trust Networks

If we can no longer rely on proximity for trust, as was possible at the beginning of the network age when perimeters enclosed everyone that had been vetted, we now create a virtual version of that called a Digital Trust Network. In other words, we have created a virtual perimeter for trust networks to replace the perimeter of physical connections.

  • You can listen to some "experts" in the field try to make some sense of this here.
  • One limitation of any network is that we cannot Trust that it will always be available. For those cases we still need some residual means of making trust decisions. For example, if a federal official is trying to use federal funds to mitigate the effects of a disaster, there will be no means to fully eliminate waste, fraud and abuse.

Academic Views of Trust

In 2021 a neuro-psychoeconomic (NPE) model of trust was proposed that synthesizes the neural findings into three types of trust:[36]

  1. calculus-based trust (i.e., derived by calculating the costs and benefits of entering or discontinuing a trust relationship; a toy sketch of this calculation follows this list)
  2. knowledge-based trust (i.e., grounded in information processing about the other’s predictability over current knowledge)
  3. identification-based trust (i.e., based on the identification of the other’s intentions and acts over repeated interactions and mutual understanding)
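
As a toy illustration of the first, calculus-based type, with made-up probabilities and payoffs: enter (or continue) the trust relationship only when the expected benefit outweighs the expected loss. The numbers loosely echo the water-bill example under Goals and Scope.

def calculus_based_trust(p_honor: float, benefit: float, loss: float) -> bool:
    """Enter (or stay in) the relationship if the expected benefit of trusting
    exceeds the expected loss; p_honor is the estimated probability that the
    other party behaves as expected."""
    return p_honor * benefit > (1 - p_honor) * loss

# Hypothetical numbers: a routine $100 bill is paid without thought, but a
# $1000 anomaly lowers p_honor and raises the amount at stake.
print(calculus_based_trust(p_honor=0.99, benefit=50, loss=100))   # True: just pay it
print(calculus_based_trust(p_honor=0.60, benefit=50, loss=1000))  # False: question the bill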

Governance Frameworks

A new proposal gained adherents in the years after 2020. Somehow it became popular to say that all trust problems could be solved if only society were able to establish frameworks that would show enterprises how to build trust.[37] But somehow there are companies today that seem to be trustworthy and others who are not. Here is a list from 2023 (the Facebook data does not seem to be complete, but has been estimated at 2.2 billion[38]): It is really unclear how any governance framework would have any influence on companies that have experienced such breaches. Experience teaches us that some people (and some enterprises) are trustworthy and others are not. See the wiki page on Conduct Risk.

  • Yahoo – 3 billion records lost in 2013
  • River City Media – 1.37 billion records lost in 2017
  • Aadhaar – 1.1 billion records lost in 2018
  • Spambot – 711 million records lost in 2017
  • Facebook – 533 million records lost in 2019
  • Syniverse – 500 million records lost in 2021
  • MySpace – 427 million records lost in 2016
  • CAM4 – 11 billion records leaked in 2020 (but not stolen by cyber criminals)
  • T-Mobile - seems to be on track to have the most breaches with two more in 2023 releasing data on 27 and then 836 million by May.[39]

Accountability

The solutions that deal with governance have become popular since 2020 with organizations like the Trust-over-IP (ToIP) foundations leading the attempt to "reboot the web of trust".[40] The original Web-of-Trust was created for the PGP email program in the 1990's. It relied on transitive trust. I will trust the people that you trust. That never gained much traction and the who PGP project slowly faded from memory. With the ascendance of Artificial Intelligence in 2023, the question takes on more urgency. How can any algorithm be trusted to deliver truth, when the internet itself has become a cesspool of misinformation?[41]
Uncertainty about how, whether, and when algorithmic harms come to pass is not ever going away, at least not completely. So, we remain in need of mechanisms to address both whom to hold accountable and how to hold them accountable, rather than to rely on them to make themselves more trustworthy. ... One reason it is so difficult to develop mechanisms of accountability is that such mechanisms speak to one of two complementary aspects of accountability without considering the other. One of these aspects of accountability is the role inhabited by those who ought to be accountable. In this role, accountable actors - engineers, corporate officers, or licensing bodies - must conduct themselves with a willingness to accept blame and with the capacity to see themselves as responsible to others for doing so. They must make themselves trustworthy and accountable. The other aspect of accountability is the ability of others - harmed parties or those acting on their behalf - to hold responsible parties accountable, regardless of the willingness to accept blame. The second aspect of accountability requires social, institutional relations - between individuals, the organizations they might work for or in, and the institutions that are themselves held accountable to the public - for accountability to exist, particularly when those who ought to be trustworthy fall short.

Perhaps when every cell phone comes with Artificial Intelligence, an agent will become available to determine whether an offered transaction is trustworthy.

When Trust Fades

All that is left is hope -- and fear.

References

  1. Walther W. Skeat, An Etymological Dictionary of the English Language. Oxford (1882)
  2. NIST Special Publication (SP) 800-160 Volume 2 Revision 1 https://doi.org/10.6028/NIST.SP.800-160v2r1
  3. Ravi Agrawal, Yes, Davos is a bubble (2024-01-23) Foreign Policy Email
  4. John Herrman Want to understand all that ails the modern internet? Look to eBay its first megaplatform - and the blueprint for everything that followed. (2018-06-24) The New York Times Magazine p. 14ff
  5. Andrei Hagiu +1, Platforms and the exploration of new products (2018-06-18) http://andreihagiu.com/wp-content/uploads/2018/06/Exploration-new-sellers-and-products-06142018.pdf
  6. Muneesh Kumar, Trust and Technology in B2B E-Commerce: Practices and Strategies for Assurance Google Books IGI ISBN 978-1613503539
  7. Reinhold Niebuhr, Moral Man and Immoral Society. Scribner’s, (1933)
  8. Merriam Webster 3rd International Dictionary
  9. Karl R. Popper, The Logic of Scientific Discovery English Edition (1959) ISBN 0-415-07892-X
  10. Peter Dizikes, Truth, Lies and Tribal Voters. (2018-11,12) MIT News, part of Technology Review
  11. Donna Borak, Wells Fargo fined $1 billion for insurance and mortgage abuses. (2018-04-20) CNN https://money.cnn.com/2018/04/20/news/companies/wells-fargo-regulators-auto-lending-fine/index.html/
  12. Kim Zetter, Lifelock Once Again Failed at Its One Job: Protecting Data (2015-07-21) Wired https://www.wired.com/2015/07/lifelock-failed-one-job-protecting-data/
  13. Karl Popper, Conjectures and Refutations. (1963) Routledge chapter 9 ISBN 0-415-28594-1
  14. CAB Forum, Baseline Requirements https://cabforum.org/baseline-requirements-documents/
  15. Larry Loeb, Google No Longer Trusts Symantec’s Root Certificate. https://securityintelligence.com/news/google-no-longer-trusts-symantecs-root-certificate/
  16. Trust http://www.ethicalsystems.org/content/trust
  17. Banerjee, Bowie and Pavone An Ethical Analysis of the Trust Relationship page 308 in Bachmann and Zaheer eds. Handbook of Trust Research
  18. GAO DATA PROTECTION, Actions Taken by Equifax and Federal Agencies in Response to the 2017 Breach. (2018-08) https://www.gao.gov/assets/700/694158.pdf
  19. Andrew Ross Sorkin, Seeking a Path to Trust. (2019-11-12) New York Times p. F1ff
  20. Jeff Smith, The Shareholders vs. Stakeholders Debate MIT Sloan Management Review (2003-07-15) https://sloanreview.mit.edu/article/the-shareholders-vs-stakeholders-debate/
  21. Corporate Governance, Business Roundtable Redefines the Purpose of a Corporation to Promote An Economy That Serves All Americans (2019-08-19) https://www.businessroundtable.org/business-roundtable-redefines-the-purpose-of-a-corporation-to-promote-an-economy-that-serves-all-americans
  22. Sue Halpern, Flying Aces New Yorker (2022-01-24) p 18 ff https://www.newyorker.com/magazine/2022/01/24/the-rise-of-ai-fighter-pilots
  23. Cathy O'Neil, The Shame Machine: Who Profits in the New Age of Humiliation Crown (2022-03-22) ISBN 978-1984825452
  24. Linda Fisher Thornton Ethics and Trust are Reciprocal https://leadingincontext.com/2014/06/18/ethics-and-trust/
  25. Ludwig Wittgenstein On Certainty. p. 150 (1972-09-06) ISBN 0061316865
  26. Sat Rana, Claude Shannon: How a Genius Solves Problems. https://medium.com/personal-growth/claude-shannon-how-a-real-genius-solves-problems-15b434aeb2b6
  27. Bryan Parno, Trust Extension for Commodity Computers CACM 55 No 6 (2012-06) p 76ff
  28. Microsoft. Windows Defender Application Guard overview (2018-07-09) https://docs.microsoft.com/en-us/windows/security/threat-protection/windows-defender-application-guard/wd-app-guard-overview
  29. Microsoft. Certificates [in Windows]. https://technet.microsoft.com/en-us/library/cc700805.aspx
  30. Scott Charney https://news.microsoft.com/2012/01/12/at-10-year-milestone-microsofts-trustworthy-computing-initiative-more-important-than-ever/
  31. CEO (unnamed) of Paramount Defenses, Alarming! : Windows Update Automatically Downloaded and Installed an Untrusted Self-Signed Kernel-mode Lenovo Driver on New Surface Device http://www.cyber-security-blog.com/2018/06/windows-update-installed-an-untrusted-lenovo-driver-on-a-microsoft-surface-device.html?m=1
  32. Lamport, L.; Shostak, R.; Pease, M., The Byzantine Generals Problem. (1982) ACM Transactions on Programming Languages and Systems. 4 (3): 387-389. doi:10.1145/357172.357176
  33. Zubin Koticha, Proof of Stake and the History of Distributed Consensus: Part 1, Nakamoto Consensus, Byzantine Fault Tolerance, Hybrid Consensus, Thunderella. (2018-09-04) Thunder https://medium.com/thunderofficial/proof-of-stake-and-the-history-of-distributed-consensus-part-1-nakamoto-consensus-byzantine-176e0156316e
  34. Barry Meier, Origins of an Epidemic: Purdue Pharma Knew Its Opioids Were Widely Abused (2018-05-29) New York Times https://www.nytimes.com/2018/05/29/health/purdue-opioids-oxycontin.html
  35. Evan Selinger, Prof. Philosophy at RIT, Facebook Fabricates Trust Through Fake Intimacy. (2018-06-04) https://medium.com/s/trustissues/facebook-fabricates-trust-through-fake-intimacy-b381e60d32f9
  36. Yan Wu + 6, Understanding identification-based trust in the light of affiliative bonding: Meta-analytic neuroimaging evidence (2021-12) Pages 627-64 https://www.sciencedirect.com/science/article/abs/pii/S0149763421004413?via%3Dihub
  37. ACM Policy Council, The Data Trust Deficit (2023-05) https://dl.acm.org/doi/pdf/10.1145/3605240
  38. Jenny Gross, How to Claim Your Share of Facebook’s $725 Million Privacy Settlement New York Times (2023-04-20) https://www.nytimes.com/2023/04/20/business/facebook-settlement-apply.html
  39. Rachelle Blair-Frasier, T-Mobile confirms second data breach in 2023 Security (2023-05-04) https://www.securitymagazine.com/articles/99300-t-mobile-confirms-second-data-breach-in-2023
  40. Rebooting the Web of Trust https://www.weboftrust.info/
  41. Emanuel Moss, Trust is not Enough: Accuracy, Error, Randomness, and Accountability in an Algorithmic Society CACM 66 No. 6 p. 42ff (2023-06)

Other Material

  • LIGHTest is a project that is partially funded by the European Commission as an Innovation Action as part of the Horizon2020 program under grant agreement number 700321. LIGHTest's objective is to create a Lightweight Infrastructure for Global Heterogeneous Trust management in support of an open Ecosystem of Stakeholders and Trust schemes. The project materials show the supported scenarios, motivate the necessity for global trust management, discuss related work, and present how LIGHTest addresses the challenges of global trust management, its reference architecture, and the pilot applications.
  • Do you Trust me? 2021-05-19