Verification

From MgmtWiki
Revision as of 18:58, 1 December 2022

Full Title or Meme

Verification is the process of comparing an assertion against a rule set to ensure that the assertion complies with the rule set.
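The definition above can be sketched in a few lines of Python. This is a hypothetical illustration, not anything from the article: the rule set, its attribute names, and the sample assertions are all invented for the example.

```python
# Hypothetical sketch of verification as defined above: an assertion
# (a dictionary of claimed attributes) is compared against a rule set,
# and it verifies only if every rule is satisfied.

rules = {
    "age": lambda v: isinstance(v, int) and v >= 18,   # invented rule
    "country": lambda v: v in {"US", "CA"},            # invented rule
}

def verify_assertion(assertion: dict) -> bool:
    """Return True only if the assertion satisfies every rule."""
    return all(key in assertion and check(assertion[key])
               for key, check in rules.items())

print(verify_assertion({"age": 21, "country": "US"}))  # True
print(verify_assertion({"age": 16, "country": "US"}))  # False (age rule fails)
```

Note that the hard part, as the rest of this page argues, is not the comparison itself but deciding what the rule set should say.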

Context

Formal Verification of Software

In the context of Identity Management, Formal Verification of software means the inspection of software to ascertain whether it will produce the results given in its specification.

  • The article Formal Software Verification Measures Up seems to claim that we will be able to prove that programs deliver only correct results "real soon now," in spite of the fact that for most systems humans are not capable of defining what a correct result would be. The book The Alignment Problem[1] explains in great detail why setting goals for computer systems is not even close to a solved problem.
    The disconnect between intention and results―between what mathematician Norbert Wiener described as “the purpose put into the machine” and “the purpose we really desire”―defines the essence of “the alignment problem.”
  • How can we control intelligent systems no one fully understands? is an article from 2016-05-16 that makes the point that complex systems are never predictable.
  • Yet academics are quite sure that no one should ever run a program that produces any result that is not part of its original design. (Samuel Greengard, "Formal Software Verification Measures Up," CACM 64, no. 7, July 2021.)

References

  1. Brian Christian, The Alignment Problem, ISBN 978-0393635829