Figure 1: Selected carriers of information about human and machine intelligence, and their relationship.
What is Natural Intelligence?
Here is a dialogue between the host of the 60 Minutes TV show and the CEO of Google, the company that created one of the AI programs that can be used either to create fake information that causes deliberate harm or to provide deep insights from data and support decision-making better than any single human being can. In both cases, the machine has been taught to convert information into a huge pool of knowledge that can be used by the humans or other machines that interact with it.
Google CEO on AI: “There is an aspect of this which all of us in the field call a ‘black box.’ You know, you don’t fully understand, and you can’t quite tell why it said this or why it got it wrong. We have some ideas, and our ability to understand this gets better over time, but that is where the state of the art is.”
60 Minutes: “You don’t fully understand how it works and yet you’ve turned it loose on society?”
Google CEO: “Let me put it this way. I don’t think we fully understand how the human mind works either.”
This is the old school of classical computer science. However, we have come a long way from the old school. The new science of information processing structures, derived from our knowledge of genomics, neuroscience, and the general theory of information, tells us that human intelligence stems from the knowledge encapsulated in the genome of biological systems and is transmitted from survivors to their successors.
(See Mikkilineni, R. The Science of Information Processing Structures and the Design of a New Class of Distributed Computing Structures. Proceedings 2022, 81, 53. https://doi.org/10.3390/proceedings2022081053)
The genome provides the operational knowledge to implement biological life processes using 30+ trillion cells that are autonomous, collaborate in groups, process information, create knowledge, and use metabolism (the conversion of matter and energy). The specification provides full knowledge of the functional requirements and their fulfillment, the non-functional requirements and their fulfillment, and the best practices passed on from survivors to their successors, including how to fight a virus that afflicted their ancestors in the past.
“The development of a single fertilized egg cell into a full human being is achieved without a construction manager or architect. The responsibility for the necessary close coordination is shared among the cells as they come into being. It is as though each brick, wire, and pipe in a building knows the entire structure and consults with the neighboring bricks to decide where to place itself.”
Yanai, Itai, and Martin Lercher. The Society of Genes. Harvard University Press, Kindle Edition, pp. 11-13.
Without the genome, there is no natural intelligence. The genome enables both the autopoietic and cognitive behaviors exhibited by biological systems. For a detailed discussion of autopoietic and cognitive behaviors, please refer to A New Class of Autopoietic and Cognitive Machines and the references cited therein.
The Crux of the Problem of Current AI:
The use of machine intelligence, which Alan Turing described in his prophetic 1948 paper “Intelligent Machinery,” has proven to exceed his imagination, and its use with both symbolic computing and machine learning has contributed to business process automation and data-driven, insight-based decision-making. Recent advances using large language models and generative AI, while providing an order-of-magnitude improvement in knowledge acquisition and its use, have also created an opportunity for bad actors to abuse the technology by creating fake information and synthetic media. This raises the question of ethics in using AI and the need for some kind of regulation with checks and balances.
The original thinkers of machine intelligence, such as John von Neumann, Alan Turing, and many others, first tried to guess how the mind works and then to build a machine that mimics the aspects of the mind they could decipher. Symbolic computing was proposed by Alan Turing with this statement: “A man in the process of computing a real number [is] replaced by a machine which is only capable of a finite number of conditions.” The Turing machine led to computers that automate process execution and improve the efficiency, resiliency, and scalability of human operations. Sub-symbolic computing was proposed by several scientists and mathematicians, including McCulloch, Pitts, John von Neumann, and Alan Turing, who guessed how neurons and neural networks work in the brain and mimicked their function using an algorithm executed on a symbolic computing structure. Thus, current AI and process-automation algorithms run on an infrastructure that uses sequences of symbols to represent information and other sequences of symbols to represent the operations on that information, executing processes defined as algorithms, or sequences of tasks.
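To make concrete what “sequences of symbols and operations on them” means, here is a minimal Turing-machine sketch in Python; the bit-flipping task, state names, and tape alphabet are illustrative assumptions, not drawn from Turing’s papers:

```python
# Minimal Turing machine: a finite control operating on a tape of symbols.
# Everything -- both the data and the program -- is a sequence of symbols.

def run_turing_machine(tape, rules, state="start", head=0, blank="_"):
    """rules maps (state, symbol) -> (new_state, write_symbol, move)."""
    tape = list(tape)
    while state != "halt":
        if head >= len(tape):
            tape.append(blank)          # extend the tape on demand
        symbol = tape[head]
        state, write, move = rules[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape).rstrip(blank)

# Example: flip every bit, then halt when the blank cell is reached.
flip_rules = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}
print(run_turing_machine("1011", flip_rules))  # -> 0100
```

The machine has only a finite number of conditions (states and rules), yet by reading and writing symbols one cell at a time it can carry out any algorithmic process, which is exactly the foundation current computing infrastructure rests on.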
For a detailed account of the evolution of machine intelligence and the relationship between human and machine intelligence, please see the video “Evolution of Machine Intelligence.”
The long and short of the lesson from the science of information processing structures is that current symbolic and sub-symbolic computing structures are limited in their ability to mimic human intelligence, which has evolved over billions of years and is transmitted by survivors to their successors through the genome. The genome is a specification of operational knowledge with the ability to create a “self” using 30+ trillion process-executing cells, each with the knowledge to function autonomously and to collaborate with other cells in executing collaborative processes with shared knowledge, using matter and energy transformations.
We now know much more about how the mind operates with the body and brain, thanks to great advances in genomics, neuroscience, and our understanding of the general theory of information, and about how biological systems use genome-derived intelligence to interact with the world. Information is the bridge between the material world and the mental world of the biological system, made possible through knowledge from the genome. Thus, matter, energy, information, and knowledge all play important roles in the development of human intelligence, and therefore also play a deeper role in building machines that truly mimic the intelligence of biological systems.
Current tools used to build intelligent systems fall short of mimicking genome-derived intelligence in two key aspects, discussed in the video mentioned above:
- Sub-symbolic and symbolic computing structures are not adequate to define the functional requirements, the non-functional requirements, and the best practices needed to execute the life processes of the computer and the computed. The reason lies in the inadequacy of representing information as sequences of symbols and operating on them with other sequences of symbols. Alan Turing captured only a part of how human intelligence works, and sub-symbolic computing is also an algorithm that uses sequences of symbols.
- Emergence is a property of complex adaptive systems, where complex structures under the influence of large fluctuations undergo a phase transition that is not under the control of the system; the structure itself has no control over the outcome. Biological systems have devised the genome to avoid emergence and define their own destiny, based on the specification and execution of life processes using the relationships between information, knowledge, matter, and energy. The general theory of information provides a framework, through the theory of ideal structures, for understanding the relationship between the material world and the mental world created by the genome.
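The first point above, that sub-symbolic computing ultimately reduces to sequences of symbols operated on by an algorithm, can be seen in a minimal artificial-neuron sketch; the weights, bias, and the logical-AND task are illustrative values chosen for this example:

```python
# A single artificial neuron (McCulloch-Pitts style): despite the "neural"
# name, it is an ordinary algorithm over sequences of numbers (symbols).

def neuron(inputs, weights, bias):
    # Weighted sum followed by a threshold: pure symbol manipulation.
    activation = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 if activation > 0 else 0

# Hand-chosen weights implementing logical AND (illustrative values).
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", neuron([a, b], weights=[1.0, 1.0], bias=-1.5))
```

Modern neural networks differ only in scale and in how the weights are learned from data; the computation itself remains a symbolic algorithm executed on conventional hardware.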
The general theory of information provides a path to infuse both cognitive and autopoietic behaviors using a digital genome. See Information Theoretical Principles of Software Development.
A digital genome-based information processing structure is composed of both symbolic and sub-symbolic computing structures; it provides model-based, transparent cognitive processes and addresses the black-box problem. It allows us to create a digital replica of a domain-specific model of physical reality, establish real-time synchronization between the digital and material worlds, and use digital information processing to manage our interactions with the material world using data based on objective reality. Information is the bridge between the two worlds.
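The idea of a digital replica synchronized with the material world can be sketched minimally as follows; the class name, the temperature domain, and the expected range are hypothetical choices for illustration, not part of any published digital-genome design:

```python
# Hypothetical sketch of a digital replica kept in sync with the material
# world: the model holds expected behavior (the "knowledge"); incoming
# sensor data (the "information") is checked against it in real time.

class DigitalReplica:
    def __init__(self, expected_range):
        self.expected_range = expected_range  # domain model: (low, high)
        self.history = []                     # synchronized observations

    def synchronize(self, reading):
        """Ingest a reading from the material world; report deviations."""
        self.history.append(reading)
        low, high = self.expected_range
        if not (low <= reading <= high):
            return f"deviation: {reading} outside [{low}, {high}]"
        return "nominal"

# Illustrative use: a temperature model for some monitored process.
replica = DigitalReplica(expected_range=(20.0, 80.0))
print(replica.synchronize(45.0))   # within the model: nominal
print(replica.synchronize(95.0))   # outside the model: deviation reported
```

Because the replica carries an explicit model of what it expects, its reasoning is inspectable, which is the transparency property the black-box discussion above calls for.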
The transparency of the model-based reasoning and its connection to physical reality reduces the knowledge gap between the various actors involved in the real world and reduces the mistakes often caused by the self-referential circularity of human logic not moored to external reality. (See Life After Google for a discussion of Gödel’s theorem and the impact of self-referential circularity not moored to external reality.)
The digital genome approach is currently being applied to specific business problems, such as self-regulating distributed software applications and medical-knowledge-based digital assistants that aim to reduce the knowledge gap between patients and healthcare service providers by giving the right information, at the right time, in the right format, to the parties that need it to make the right decision.
Perhaps if classical computer scientists studied the science of information processing structures, we could make progress in building predictable, stable, safe, and secure machine intelligence that assists humans in making real-time life-and-death decisions based on transparent and open knowledge. It is all about reducing the knowledge gap between the various actors making decisions, with transparent and timely information access that reduces the impact of human self-referential circularity not moored to external reality. Machines are better at processing information than humans if they are endowed with well-designed digital genomes to perform specific functions. Humans are ultimately responsible for how machines are used. Good people use them wisely; evil people use them to subvert language, freedom, democracy, and ultimately civilization itself. Therefore, it is imperative for good people to develop antidotes with better technology that transcends today’s classical computer science and its limitations. The digital genome provides the architecture for transparency, verification of facts, and insights based on external reality.
The jazz metaphor here is apt. The thesis of classical computer science is met with the antithesis of machine intelligence based on current half-knowledge about the nature of human intelligence. The synthesis is perhaps provided by new knowledge about the role of the genome and by our understanding, from the general theory of information, of how to infuse autopoietic and cognitive behaviors into digital automata to improve their sentience, resilience, and intelligence.
We need to graduate from the old school of classical computer science to the new school of the science of information processing structures by transitioning from data structures to knowledge structures and from Turing Machines to Structural Machines.
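The contrast between a data structure and a knowledge structure can be sketched, very loosely, as the difference between storing values and storing entities that carry their own behavior and relationships; the node names and behaviors below are invented purely for illustration and do not reproduce any published structural-machine design:

```python
# Illustrative contrast: a data structure stores values; a knowledge
# structure (as sketched here) stores entities together with the
# relationships and local behaviors that connect them.

class KnowledgeNode:
    def __init__(self, name, behavior):
        self.name = name
        self.behavior = behavior      # local operational knowledge
        self.links = {}               # named relationships to other nodes

    def connect(self, relation, other):
        self.links[relation] = other

    def act(self, message):
        return self.behavior(message)

# Two hypothetical nodes, each with its own knowledge of what to do.
sensor = KnowledgeNode("sensor", lambda m: f"observed {m}")
manager = KnowledgeNode("manager", lambda m: f"decided on {m}")
sensor.connect("reports_to", manager)

# A request flows along the relationship, each node applying its knowledge.
report = sensor.act("temperature spike")
print(sensor.links["reports_to"].act(report))
```

The point of the sketch is that each node knows how to act and whom to collaborate with, rather than being a passive value awaiting an external algorithm, which is the shift the text describes from data structures to knowledge structures.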
Food for thought.