Verified Information


Full Title or Meme

A collection of Verified Claims or data together with the context that can provide valuable information through the provenance, meaning and accessibility of that data.

Context

While a great deal of ink has been expended on the subject (and captured on the wiki page Information), at its core information is most useful when viewed as input to a decision process. That decision process must be able to evaluate all of the available data, understand its reliability and then act on it. The better the information content of data, the more valuable it is. This wiki page is about creating packets of data together with the Provenance of the data, the Meaning of the data and the Authorized Access to the data. Let's look at each of those three categories:

Provenance of Data

Basically just: who, what, where and why.
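
To make this concrete, a provenance record can travel as structured metadata alongside the data itself. The following Python sketch is only an illustration of that idea; the field names are hypothetical assumptions, not a format defined by this wiki:

  from dataclasses import dataclass, field, asdict
  from datetime import datetime, timezone

  @dataclass
  class Provenance:
      """Hypothetical provenance record: who, what, where and why."""
      who: str    # the party that created or attested the data
      what: str   # what the data element describes
      where: str  # the system or jurisdiction in which it was collected
      why: str    # the purpose for which it was collected
      recorded_at: str = field(
          default_factory=lambda: datetime.now(timezone.utc).isoformat())

  # Example: provenance attached to a verified e-mail address
  prov = Provenance(
      who="registrar.example.com",
      what="subscriber e-mail address",
      where="account-signup service, US-East",
      why="contact verification at enrollment")
  print(asdict(prov))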

Meaning of the Data

Information can be useful when it is collected in one enterprise that has its own taxonomy of data elements covering the intent of each element, where it is collected and how it is used. The context for all three of these functions is embedded in the very structure of the enterprise's policies and procedures. When the data is moved from one enterprise to another, that context is stripped away, and even if it were available to the receiver of the data, it might be opaque or worse in the enterprise that acquires the data. A solution to this problem arose early in the transport of goods from one country to another with the rise of international trade. One particular use case, the Berlin Airlift, brought all of the challenges of international traffic in goods to a head and forced an enterprising solution from U.S. Air Force Colonel Edward Guilbert[1], starting with paper forms and transitioning to a standardized data schema for teletype machines in 1968. This effort evolved into the work of the Transportation Data Coordinating Committee, which organized ANSI Committee X12 for Electronic Data Interchange (EDI). Even in 1975 it was recognized that the source of data needed to be validated, the meaning of the data standardized and the destination of the data discoverable.

The common solution for all meaning in an information economy, as it was for Linnaeus in biology[2], is the creation of a standard taxonomy of terms with standardized meaning. In typical database technology this is a data dictionary together with a data schema, which describes the Entities, the Attributes and the Relationships (EAR). Berners-Lee (the originator of the Web we all use today) created a vision for the Semantic Web, a common buzzword of the day, which is simply a way to send the EAR data over the internet. He was apparently not aware of the EDI effort that had been in progress since 1975. Like many of the current batch of internet entrepreneurs, he thought he had discovered something completely new.
I have a dream for the Web [in which computers] become capable of analyzing all the data on the Web – the content, links, and transactions between people and computers. A "Semantic Web", which makes this possible, has yet to emerge, but when it does, the day-to-day mechanisms of trade, bureaucracy and our daily lives will be handled by machines talking to machines. The "intelligent agents" people have touted for ages will finally materialize.[3]
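
To make the EAR idea concrete, here is a minimal sketch of a shared data dictionary expressed in Python. Everything in it is an illustrative assumption; real dictionaries, such as the X12 element directories, are far larger and are governed by a standards body:

  # A minimal, hypothetical data dictionary in the EAR style:
  # each Entity lists its Attributes, and Relationships link entities.
  data_dictionary = {
      "entities": {
          "Shipment": {
              "attributes": {
                  "shipment_id": {"type": "string", "meaning": "unique waybill number"},
                  "weight_kg": {"type": "number", "meaning": "gross weight in kilograms"},
              },
          },
          "Carrier": {
              "attributes": {
                  "carrier_id": {"type": "string", "meaning": "registered carrier code"},
              },
          },
      },
      "relationships": [
          {"from": "Shipment", "to": "Carrier", "meaning": "is transported by"},
      ],
  }

Because sender and receiver hold the same dictionary, a record such as {"shipment_id": "WB-4471", "weight_kg": 112.5} keeps its meaning when it crosses an enterprise boundary; the particular encoding (EDI, RDF for the Semantic Web, or JSON) matters less than the shared taxonomy.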

It does appear that Artificial Intelligence can be used to create meaning from data based on historical evidence. While this may change the pool of information that can be verified, the AI is still just a means for creation of the context that enables data to become information.

Access to the Data

  • Information (aka contextualized data) is valuable.

Problem

It goes without saying that we live in an information age. The threats to any enterprise in the current digital ecosystem might differ in means, but the motive remains the same. Bad actors are out there to steal your:

  1. money
  2. influence
  3. clients or customers.

Whether the bad actor is Facebook or some Bulgarian troll, the means is universal access to the internet.

Solution

This wiki discusses one specific solution to the acquisition, interpretation and Protected storage of Data. Specifically, it looks at packets of data together with the associated metadata.
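
As a sketch of what such a packet might look like, the hypothetical structure below bundles the data with the three kinds of metadata discussed above. The field names and the integrity digest are assumptions made for illustration, not a format defined by this wiki; a real deployment would use a cryptographic signature rather than a bare hash:

  import hashlib
  import json

  def make_packet(data, provenance, schema_url, authorized_audience):
      """Bundle data with provenance, meaning and access metadata.

      Hypothetical format: 'meaning' points at a shared schema or data
      dictionary, and 'digest' lets a receiver detect tampering.
      """
      body = {
          "data": data,
          "provenance": provenance,
          "meaning": {"schema": schema_url},
          "access": {"audience": authorized_audience},
      }
      # Hash a canonical serialization so a receiver can verify integrity.
      body["digest"] = hashlib.sha256(
          json.dumps(body, sort_keys=True).encode()).hexdigest()
      return body

  packet = make_packet(
      data={"email": "pat@example.com"},
      provenance={"who": "registrar.example.com", "why": "enrollment"},
      schema_url="https://example.com/schemas/contact#email",
      authorized_audience=["relying-party.example.net"])
  print(json.dumps(packet, indent=2))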

References

  1. J. Naomi Skinner, EDI – 50 years old and getting stronger, OpenText blog (2018-04-24). https://blogs.opentext.com/edi-50-years-old-getting-stronger/
  2. M. A. Donk, Typification and Later Starting-Points. https://www.iapt-taxon.org/historic/Congress/IBC_1959/Prop018-019.pdf
  3. Tim Berners-Lee and Mark Fischetti, Weaving the Web, Harper (1999), chapter 12. ISBN 978-0-06-251587-2.

Other Material