Risk Assessment
Risk assessment can take many forms depending on the principal that bears the risk of loss and the metric that is to be minimized.
While this topic focuses only on the risks of Identity Management, there are still many principals that can suffer loss in any digital interchange:
- Identifier and Attribute Providers
- Relying Parties
- Trusted Third Parties
There are several metrics that a risk assessment might seek to minimize:
- Liability to tort actions
- Fines from Government bodies
- Loss of reputation
- Loss of Privacy
- Loss of the appearance of fairness in the court of public opinion.
For example, it has been shown that predictions about the likelihood of future events cannot be made without a series of assumptions, and that those assumptions are unlikely to all be simultaneously true:
- the base rate statistics need to be the same for the different sub-populations of the sample.
- no algorithm can be constructed that classifies a mixed population with the same error rates for each sub-population (say by race or gender) when the base rates differ.
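The tension between these assumptions can be shown with a short calculation (all numbers below are hypothetical, chosen only for illustration): if two sub-populations have different base rates, then a classifier that is equally calibrated (same precision) and equally sensitive (same true positive rate) for both groups is forced, by Bayes' rule, to have different false positive rates.

```python
# Sketch with hypothetical numbers: equal calibration (PPV) and equal
# sensitivity (TPR) across two groups with different base rates forces
# unequal false positive rates.

def required_fpr(base_rate, tpr, ppv):
    """Solve PPV = TPR*p / (TPR*p + FPR*(1-p)) for FPR."""
    return tpr * base_rate * (1 - ppv) / (ppv * (1 - base_rate))

TPR = 0.7   # assumed sensitivity, identical for both groups
PPV = 0.6   # assumed precision (calibration), identical for both groups

fpr_a = required_fpr(0.5, TPR, PPV)  # group A: 50% base rate
fpr_b = required_fpr(0.2, TPR, PPV)  # group B: 20% base rate

print(f"FPR group A: {fpr_a:.3f}")  # about 0.467
print(f"FPR group B: {fpr_b:.3f}")  # about 0.117
```

Members of the lower base-rate group who will never reoffend are flagged far less often, and members of the higher base-rate group far more often, even though the classifier is "unbiased" by the calibration metric alone.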
Mark Twain had it right when he declared that there are liars, there are damned liars, and there are statisticians. It isn't so much that the statisticians are lying as that they are not capable of explaining the mathematics to the people who are reading the statistics.
ProPublica Article
A good example of the problem with statistics was highlighted in the ProPublica article cited below. Sentencing guidelines were created to help judges hand down unbiased sentences based on a criminal's likelihood of reoffending if released. The company that created the software tested the results using statistics that did not consider the offender's race, and so claimed that it was unbiased. The problem was that when race was considered, the software was shown to be wildly wrong about the recidivism of blacks versus whites. The question was then asked how these results could be reconciled. The answer was that they cannot. If statistics are used in a place where they have not been validated, nothing can be said about their validity. As a result, blacks were unfairly sentenced to longer terms than were whites.
- Julia Angwin et al., "Machine Bias: There's software used across the country to predict future criminals. And it's biased against blacks." (2016-05-23) ProPublica https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing