Distinguished Adjunct Professor, Golden Gate University, San Francisco, CA 94105
Associate Adjunct Professor, Dominican University, San Rafael, CA 94901
The questions posed in the title of this blog are becoming extremely relevant today, given the current political debate and the decision by some large companies to become the arbiters of which information is true and which is not.
As Prof. Burgin points out, discerning true information from false is a risky business.
“In his lectures on optics, Newton developed a corpuscular theory of light. According to this theory, light consists of small moving particles. Approximately at the same time, Huygens and Hook built a wave theory of light. According to their theory, light is a wave phenomenon. Thus, somebody would like to ask the question who, i.e., Newton or Huygens and Hook, gave genuine information and who gave false information. For a long time, both theories were competing. As a result, the answer to our question depended whether the respondent was an adherent of the Newton’s theory or of the theory of Huygens and Hook. However, for the majority of people who lived at that time both theories provided pseudo-information because those people did not understand physics. A modern physicist believes that both theories contain genuine information. So, distinction between genuine and false information in some bulk of knowledge depends on the person who estimates this knowledge. Thus, we have seen that the problem of false information is an important part of information studies and we need more developed scientific methods to treat these problems in an adequate manner.”
We face a similar situation today, both in the sciences and in politics. Classical computer scientists believe in the Church-Turing thesis and are reluctant to consider alternatives that push its boundaries. The general theory of information and the work of Prof. Burgin are largely ignored, even though implementations have been demonstrated that offer solutions going beyond the Church-Turing thesis by addressing the function, structure, and fluctuations of digital computing [2,3].
There are big companies deciding what is true and what should be censored or forbidden. Politics is so divided that some are questioning the truth of mathematics, and the difference between “white” mathematics and “black” mathematics is being discussed. Conspiracy theories about conspiracy theories are thrown around with ease and without consequence.
It is time to analyze the true nature of information using a firm theoretical framework and find ways to use it wisely, just as all physical, chemical, and biological systems do in nature.
What is Information?
According to the Merriam-Webster dictionary, information is:
- Knowledge obtained from investigation, study, or instruction
- INTELLIGENCE, NEWS
- FACTS, DATA
According to Prof. Mark Burgin, information is:
- The attribute inherent in and communicated by one of two or more alternative sequences or arrangements of something (such as nucleotides in DNA or binary digits in a computer program) that produce specific effects;
- A signal or character (as in a communication system or computer) representing data
- Something (such as a message, experimental data, or a picture) which justifies change in a construct (such as a plan or theory) that represents physical or mental experience or another construct
- A quantitative measure of the content of information. Specifically, a numerical quantity that measures the uncertainty in the outcome of an experiment to be performed.
- The communication or reception of knowledge or intelligence
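The fourth sense above, a quantitative measure of uncertainty, is the one Shannon formalized. As a concrete illustration (standard Shannon entropy, not part of the definitions quoted above), a minimal sketch:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits.

    Measures the uncertainty in the outcome of an experiment
    whose outcomes occur with the given probabilities.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # 1.0 -- a fair coin carries one bit of uncertainty
print(shannon_entropy([0.9, 0.1]))  # ~0.469 -- a biased coin is less uncertain
print(shannon_entropy([1.0]))       # 0.0 -- a certain outcome carries no uncertainty
```

The point of the sketch is that this sense of "information" is purely quantitative: it measures how much uncertainty a message removes, saying nothing about whether the message is true.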
While we all know intuitively what information is, and use information services routinely, there seems to be no consensus among computer scientists and information technology practitioners on what information really is. Some say it is data; others say it is knowledge. As long ago as 2005, Prof. Burgin pointed out that “information has become the most precious resource of society. However, there is no consensus on the meaning of the term “information,” and many researchers have considered problems of information definition… However, to have a valid and efficient theory of information, it is not enough to have a correct definition of information. We need properties of information and information processes.”
In his books on the general theory of information (GTI) [4-6], Prof. Burgin concludes: “Information is not merely an indispensable adjunct to personal, social, and organizational functioning, a body of facts, data, and knowledge applied to solutions of problems or to support actions. Rather it is a central defining characteristic of all life-forms, manifested in genetic transfer, in stimulus-response mechanisms, in the communication of signals and messages and, in the case of humans, in the intelligent acquisition of knowledge, understanding and achieving wisdom.”
He introduces three faces of information:
- Information in a broad sense, which is a potential for change and includes different kinds of energy. For example, in the physical world, information is energy; in the mental world of concepts, it is mental energy. In the structural world, we have information per se, or information in the strict sense, such as cognitive information.
- Information as a relative concept, which depends on the infological system (suggesting that information is the joint function of data and knowledge);
- Information in a strict sense, which acts on structures, such as knowledge, data, beliefs and ideas.
This theory explains clearly how data, knowledge, and information are related, using the metaphor “Knowledge is to Information as Matter is to Energy.” This relation can be called the KIME structure (see figure above). Unfortunately, this theory and its application using structural machines, named sets, and the theory of oracles remain a well-kept secret from classical computer scientists and information technology practitioners, who still believe in the old Data – Information – Knowledge – Wisdom pyramid (the DIKW pyramid, also shown in the figure above) and are therefore unable to leverage the new theory to build a new class of information systems in the digital world that mimics biological systems in the real world in all their detail. The DIKW pyramid is like an ancient Egyptian pyramid next to the modern skyscraper of the KIME structure.
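The KIME metaphor can be made concrete in a toy sketch (this is an illustration of the idea, not Burgin's formalism, and all names are hypothetical): in the GTI view, information is not a layer stored between data and knowledge, as in the DIKW pyramid, but an action that transforms a knowledge structure, just as energy is not matter but what changes the state of matter.

```python
class KnowledgeStructure:
    """A hypothetical minimal knowledge base: named propositions."""
    def __init__(self):
        self.propositions = {}

    def apply(self, information):
        """Information 'acts on' the structure, changing it."""
        information(self.propositions)

# Two portions of information, modelled as transformations of knowledge:
def learn_light_is_wave(props):
    props["light"] = "wave"

def learn_light_is_quantum(props):
    props["light"] = "wave-particle duality"

kb = KnowledgeStructure()
kb.apply(learn_light_is_wave)     # Huygens' message changes the structure
kb.apply(learn_light_is_quantum)  # a later message changes it again
print(kb.propositions["light"])   # wave-particle duality
```

The design choice mirrors the metaphor: knowledge is the stateful "matter," and information is modelled as the operator that changes it, rather than as another kind of stored content.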
Purpose of this Blog
In this blog, we will discuss how to absorb this theory and apply it to:
- Better understand how biology uses information processing structures (in the form of genes and neurons) to execute, monitor, and manage the “life processes.” The resulting autopoiesis, which uses physical and chemical processes to convert matter and energy, points to a way to design a new class of digital autopoietic machines.
- Discuss how to design and build a new class of digital information processing structures using both symbolic and sub-symbolic computing structures, such as digital genes and digital neurons. These systems allow us to configure, monitor, and proactively manage distributed independent computing structures that communicate with each other in real time to maintain their equilibrium state even in the face of very rapid fluctuations in the demand for, and availability of, their resources.
- Build a new class of real-time risk management systems that use knowledge and the history of its evolution (a non-Markovian process, in which the conditional probability of a future state depends not only on the present state but also on its prior history) in a particular situation. In essence, acknowledge and use the fact that complex adaptive systems are beholden to their past. This augments current deep learning systems with a model-based reasoning system utilizing domain knowledge, providing the transparency that deep learning systems currently lack.
- Discuss a “measure” of information veracity that allows us to formally design methods for reasoning and making rational decisions with the available knowledge and its history.
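The non-Markovian point in the risk management item above can be illustrated with a toy sketch (all names and thresholds here are hypothetical, chosen only to show the contrast): a Markovian predictor looks only at the present state, while a history-aware predictor can give different answers for the same present state depending on the path that led to it.

```python
# Markov predictor: the next state depends only on the current state.
def markov_next(state):
    return "alarm" if state == "warning" else "normal"

# Non-Markovian predictor: the same current state yields different
# predictions depending on the history that produced it. A system that
# has repeatedly cycled through warnings is treated as higher risk.
def history_aware_next(history):
    state = history[-1]
    prior_warnings = history[:-1].count("warning")
    if state == "warning" and prior_warnings >= 2:
        return "alarm"   # repeated warnings: escalate
    if state == "warning":
        return "watch"   # first warning: observe
    return "normal"

h1 = ["normal", "warning"]
h2 = ["warning", "normal", "warning", "normal", "warning"]

# Same present state ("warning"), different predictions:
print(history_aware_next(h1))  # watch
print(history_aware_next(h2))  # alarm
```

This is the sense in which a complex adaptive system is "beholden to its past": collapsing `h1` and `h2` into their final state, as a purely Markovian model must, discards exactly the evidence that distinguishes a first incident from a recurring one.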
From Classical Computer Science to New Science of Information Processing Structures
Recent advances in various disciplines of learning all point to a new understanding of how information processing structures in nature operate. Combining this knowledge with the general theory of information may yet help us not only solve the age-old philosophical question of “mind-body dualism” but also pave a path to design and build self-regulating digital automata with a high degree of sentience, resilience, and intelligence.
Classical computer science, with its origins in John von Neumann’s stored-program computer, which implemented the structure of a universal Turing machine, has given us tools to decipher the mysteries of physical, chemical, and biological systems in nature. Both symbolic computing and sub-symbolic computing with neural network implementations have allowed us to model and analyze various observations (including both mental and physical processes) and to use information to optimize our interactions with each other and with our environment. In turn, our understanding of the nature of information processing structures in nature, gained through both physical and computer experiments, is pointing us in a new direction in computer science, going beyond the Church-Turing thesis boundaries of classical computer science.
Our understanding of information processing structures, and of the internal and external behaviors that cause their evolution, in all physical, chemical, and biological systems in nature, and in human-made digital systems in particular, suggests the need for a common framework in which function, structure, and fluctuations shape systems composed of many autonomous components interacting with each other under the influence of external forces in their environment. As Stanislas Dehaene points out in “Consciousness and the Brain: Deciphering How the Brain Codes Our Thoughts” (Penguin Books, New York, 2014, p. 162): “What is required is an overreaching theoretical framework, a set of bridging laws that thoroughly explain how mental events relate to brain activity patterns. The enigmas that baffle contemporary neuroscientists are not so different from the ones that physicists resolved in the nineteenth and twentieth centuries. How, they wondered, do the macroscopic properties of ordinary matter arise from a mere arrangement of atoms? Whence the solidity of a table, if it consists almost entirely of a void, sparsely populated by a few atoms of carbon, oxygen, and hydrogen? What is a liquid? A solid? A crystal? A gas? A burning flame? How do their shapes and other tangible features arise from a loose cloth of atoms? Answering these questions required an acute dissection of the components of matter, but this bottom-up analysis was not enough; a synthetic mathematical theory was needed.”
Fortunately, our understanding of the theory of information processing structures and their evolution in nature points the way to a theoretical framework that allows us to:
- Explain the information processing architecture gleaned from our studies of physical, chemical, and biological systems, and articulate how to model and represent the cognitive processes that bind brain-mind-body behaviors; and
- Design and develop a new class of digital information processing systems, which are autopoietic. An autopoietic machine is capable of “regenerating, reproducing and maintaining itself by production, transformation and destruction of its components and the networks of processes downstream contained in them.”
All living systems are autopoietic and have figured out a way to create information processing structures that exploit physical and chemical processes to manage not only their own internal behaviors but also their interactions with their environment, assuring their survival in the face of rapidly changing circumstances. Cognition is an important part of living systems and is the ability to process information through perception using different sensors. Cognitive neuroscience has progressed in “cracking open the black box of consciousness” to discern how cognition works in managing information with neuronal activity. Functional magnetic resonance imaging, used very cleverly to understand the “function of consciousness, its cortical architecture, its molecular basis, and even its diseases,” now allows us to model the information processing structures that relate cognitive behaviors and consciousness.
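The self-maintenance at the heart of the autopoiesis definition quoted above can be sketched in miniature. This is a toy illustration under stated assumptions (all class and component names are hypothetical, and real autopoietic machines involve far more than component replacement): an overlay supervisor that maintains the system by destroying failed components and producing fresh ones, without interfering with their normal local operation.

```python
class Component:
    """A hypothetical autonomous component with local behavior."""
    def __init__(self, name):
        self.name = name
        self.healthy = True

    def do_work(self):
        return f"{self.name}: working" if self.healthy else f"{self.name}: failed"

class AutopoieticSupervisor:
    """Overlay that maintains the system: 'production, transformation
    and destruction of its components' in miniature."""
    def __init__(self, components):
        self.components = components

    def maintain(self):
        for i, c in enumerate(self.components):
            if not c.healthy:
                # destroy the failed component and produce a fresh one
                self.components[i] = Component(c.name)

system = [Component("db"), Component("cache")]
supervisor = AutopoieticSupervisor(system)

system[1].healthy = False  # the cache component fails
supervisor.maintain()      # the overlay regenerates it
print(all(c.healthy for c in supervisor.components))  # True
```

Note the separation of concerns: components know nothing about the supervisor, which mirrors the overlay idea discussed later in this blog, global regulation layered on top of locally autonomous parts.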
In parallel, our understanding of the genome provides insight into information processing structures with autopoietic behavior. The gene encodes the processes of “life” in an executable form, and a neural network encodes various processes to interact with the environment in real time. Together, they provide a variety of complex adaptive structures. All of these advances throw different light on the information processing architectures in nature.
Fortunately, a major advance in a new mathematical framework allows us to model information processing structures and push the boundaries of classical computer science, just as relativity pushed the boundaries of classical Newtonian physics and statistical mechanics pushed the boundaries of thermodynamics by addressing function, structure, and fluctuations in the components constituting physical and chemical systems. Here are some of the questions we need to answer in the pursuit of designing and implementing an autopoietic machine with digital consciousness:
- What is Classical Computer Science?
- What are the Boundaries of Classical Computer Science?
- What do we learn from Cognitive Neuroscience about The Brain and Consciousness?
- What do we Learn from the Mathematics of Named Sets, Knowledge Structures, Cognizing Oracles and Structural Machines?
- What are Autopoietic Machines and How do they Help in Modeling Information Processing Structures in Nature?
- What are the Applications of Autopoietic Digital Automata and how are they different from the Classical Digital Automata?
- Why do we need to go beyond classical computer science to address autopoietic digital automata?
- What are knowledge structures and how are they different from data structures in classical computer science?
- How do the operations on the schemas representing data structures and knowledge structures differ?
- How do “Triadic Automata” help us implement hierarchical intelligence?
- How does an Autopoietic Machine move us to Go Beyond Deep Learning to Deep Reasoning Based on Experience and Model-based Reasoning?
- What is the relationship between information processing structures in nature and the digital information processing structures?
- What are the limitations of digital autopoietic automata in developing the same capabilities of learning and reasoning as biological information processing structures?
- How do the information processing structures explain consciousness in living systems and can we infuse similar processes in the digital autopoietic automata?
In a series of blogs, we will attempt to search for the answers to these questions, and in the process we hope to understand the new science of information processing structures, which will help us build a new class of autopoietic machines with digital consciousness. We invite scholars who have spent time understanding information processing structures to contribute to this discussion.
However, as interesting as the new science is, even more interesting are the new understanding and the opportunity to transform current-generation information technologies, without disturbing them, through an overlay architecture, just as biological systems evolved an overlay cognitive structure that provides global regulation while keeping local component autonomy intact and coping with rapid fluctuations in real time. We need to address the following questions:
- How are knowledge structures different from current data structures, and how will database technologies benefit from autopoiesis to create a higher degree of sentience, resilience, and hierarchical intelligence at scale while reducing current complexity?
- Will the operations on knowledge structure schemas improve on current database schema operations and provide a higher degree of flexibility and efficiency?
- Today, most databases manage their own resources (memory management, network performance management, availability constraints, etc.), which increases complexity and decreases efficiency. Will autopoiesis simplify distributed database resource management and allow application workloads to become PaaS- and IaaS-agnostic and location-independent?
- Can we implement autopoiesis without disturbing current operation and management of information processing structures?
- What are the measures of information?
- What is the relationship of Shannon’s theory to GTI?
- How will we use GTI to discern false information from true information?
- How will we estimate risk based on a history of events when a new event changes the behavior of the system? Can this be done with autopoietic machines in real time to make decisions and act, as all biological systems do?
Stay tuned and participate in the journey.
References

1. Burgin, M. Is Information Some Kind of Data? FIS 2005, 2005.
2. Mikkilineni, R. Going beyond Church–Turing Thesis Boundaries: Digital Genes, Digital Neurons and the Future of AI. Proceedings 2020, 47, 15. https://doi.org/10.3390/proceedings2020047015
3. Mikkilineni, R. Information Processing, Information Networking, Cognitive Apparatuses and Sentient Software Systems. Proceedings 2020, 47, 27. https://doi.org/10.3390/proceedings2020047027
4. Burgin, M. The General Theory of Information as a Unifying Factor for Information Studies: The Noble Eight-Fold Path. Proceedings 2017, 1, 164. https://doi.org/10.3390/IS4SI-2017-04044
5. Burgin, M. Theory of Information: Fundamentality, Diversity and Unification; World Scientific: New York, NY, USA; London, UK; Singapore, 2010.
6. Burgin, M. Data, Information, and Knowledge. Information 2004, 7, 47–57.
2 thoughts on “What is Information? How do We Produce it, and How do We Consume it? How Do We Know Whether Some Information is True or False?”
Dear Dr. Rao Mikkilineni, the way you are bringing a meaningful commentary to the General Theory of Information by correlating the foundational theories connecting matter and energy is amazing. I have been inspired by your work in the area of self-healing and self-supporting information systems, for which I have been trying to find solutions since around 2014. I feel so fortunate to have known you. I am sure the world will recognize your Turing Award-worthy work in these areas! – Soma Sekhar Gidugu
Thank you, @Soma Sekhar Gidugu. I am only a student learning to understand the true nature of information processing structures. Others have done the truly pioneering work in this area. First, the prolific writings of Prof. Mark Burgin clearly articulate the theory; unfortunately, classical computer scientists seem to have paid them no attention. His first book in 2005 already lays the foundations, and his recent books elaborate the full theory. His work on the general theory of information is a masterpiece and truly deserves credit. Others, such as Gordana Dodig-Crnkovic and Marcin Schroeder, who organize IS4SI, are all working toward the same goal of unraveling the mysteries of information processing structures. More recent work on the brain using functional magnetic resonance imaging throws new light on the subject, and Dr. Stanislas Dehaene has clearly articulated the need for a theory, as I mentioned in the blog. Hopefully, others will pay attention to this area, which is very fertile ground for moving us toward strong AI. I wrote some papers on expert systems and their implementation in the late 1980s when I was at Bell Labs, and on neural networks when I was at US WEST in the early 1990s. We have come a long way, and after reading all the writings of these masters, I sincerely believe we are close to building autopoietic machines, even if classical computer scientists and deep learning pundits do not pay attention. This is a great opportunity for a new generation of scientists and engineers, and all I am doing is disseminating the knowledge of the masters, hopefully to new audiences such as graduate students, who are being taught by an older generation of computer scientists and engineers trained in classical computer science who probably had no time or patience to learn the new science of information processing structures.