ARTIFICIAL INTELLIGENCE AND SOME ASPECTS OF HUMAN THINKING WHEN SOLVING NON-COMPUTABLE PROBLEMS
G.A. Sheroziya, M.G. Sheroziya
The article compares the hypothesis of Strong Artificial Intelligence, which states that all aspects of human intellect can be reproduced with the help of direct programming, and the hypothesis of Weak Artificial Intelligence, which maintains that it is basically impossible to reproduce human creativity and the ability of a person to create and discover new information using direct programming.
The article provides evidence to support the hypothesis of Weak Artificial Intelligence. The proofs are formulated as theorems and are based on information theory, complexity theory, and the theory of non-computable problems. The authors often resort to the notion of a non-computable problem, which, by definition, cannot be solved by a finite algorithm.
The authors underline the difference between Kolmogorov's and Shannon's approaches to information. Shannon's formulas enable one to determine the potential information capacity of a signal or text, but fail to define its real semantic content. Kolmogorov's approach requires, in order to determine the amount of information in a text, the solution of a non-computable problem, which in turn requires a real understanding of the content of the text.
Therefore, unlike Shannon's approach, Kolmogorov's approach to information is associated with human thinking and enables one to employ mathematical principles to explain some peculiarities of human thinking.
The article shows that new information appears only under experimental conditions or when non-computable problems are solved. Algorithmic processing of information does not create new information.
The authors conclude that artificial intelligence which is creative and capable of generating new information cannot be created through direct programming and is a non-computable problem itself.
artificial intelligence; programming; algorithm; information; Kolmogorov's approach; Shannon's approach
In almost all countries of the world, work is underway to extend computer technology into all spheres of human activity: the Internet, banking, chess, communication, and even driving. The development of advanced software technologies suggests that we will eventually create software and hardware capable of replicating human behavior and thinking, including intuition, creativity, and the ability of a person to create and discover new information in the course of intellectual activity. This ideology is called the theory of Strong Artificial Intelligence [1]. However, some scientists [2] believe that human consciousness cannot be reduced to formal logic and algorithms, even though it includes them. This is the theory of Weak Artificial Intelligence, according to which human thinking can never be replicated by means of direct programming.
There is another area of scientific research where the problem of choosing between Strong and Weak Artificial Intelligence is also very relevant. In some countries, scientists attempt to model virtual neural networks and, in effect, to create an artificial cortex [3-5]. There is reason to suppose that in the next twenty years scientists will manage to create a human-like virtual brain. Hence the issue of whether a virtual brain should be taught to think through direct programming or through education in a human environment.
The discussion has been going on for several decades. If there were a mathematical theory of human consciousness, it could be relied upon in deciding in favor of Strong or Weak Artificial Intelligence, but, unfortunately, no such theory exists.
However, in the twentieth century, due to the efforts of a number of outstanding mathematicians (Gödel, Turing, Shannon, von Neumann, Kolmogorov), the outlines of mathematical information theory, complexity theory, and the theory of non-computable problems were established. For the purposes of this article, the concept of a non-computable problem, which, by definition, cannot be solved with the help of a finite algorithm, is particularly interesting.
The impetus for the development of the theory of non-computable problems was Turing's proof concerning the so-called halting problem. Turing proved that it is impossible to create a program that can decide, for an arbitrary source program, whether it will halt after completing its calculations or run forever. Later, a large number of non-computable problems were discovered, many of which reduce to the halting problem. No program can decide whether another program halts, yet human programmers do this kind of work every day.
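To make Turing's diagonal argument concrete, here is a minimal sketch in Python (the function names halts and paradox are ours, and halts is precisely the hypothetical decider whose existence the theorem refutes, so it can only be stubbed):

```python
def halts(program, argument):
    # Hypothetical decider: True if program(argument) eventually halts.
    # Turing's theorem says no such function can exist; this stub only
    # marks the assumption being refuted.
    raise NotImplementedError("no such decider can exist")

def paradox(program):
    # Do the opposite of whatever halts() predicts for program(program).
    if halts(program, program):
        while True:   # predicted to halt, so loop forever
            pass
    else:
        return        # predicted to loop, so halt immediately

# paradox(paradox) halts if and only if it does not halt: a contradiction,
# so the assumed decider halts() cannot be implemented.
```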
Another impetus for the development of ideas about how science and human thinking work came from Gödel's theorems. Gödel proved that any formal theory containing arithmetic cannot be simultaneously consistent and complete. That is, if a theory is consistent, it is incomplete, and if it is complete, it inevitably contains contradictions. When Gödel proved his theorems, it became clear why the great mathematician Hilbert had failed to unite all of mathematics into a single complete and consistent system, and why the great physicist Einstein had failed to create a unified theory of gravitational and electromagnetic fields. But the most important thing is that, despite the fundamental impossibility of creating a single theory of the outside world, people easily use the existing independent pieces of scientific knowledge, somehow uniting them in their minds.
With the emergence of the theory of non-computable problems, a clear, mathematically rigorous criterion appeared for the first time that allows one to distinguish computer intelligence from human intelligence. That is, with the development of the listed branches of mathematics, there is hope for the creation of a mathematical theory of consciousness in the future. A computer that acts strictly on the basis of a finite algorithm cannot solve a non-computable problem. If it could be proved that people can solve non-computable problems, a mathematically precise difference between human intelligence and artificial intelligence would appear, and with it a criterion for choosing between the Strong Artificial Intelligence hypothesis and the Weak Artificial Intelligence hypothesis.
This article is an attempt to prove that people are capable of solving some non-computable problems.
Any attempt to formalize the idea of human intelligence requires such concepts as information, creativity, and cognitive processes. Below, the three most well-known approaches to information theory [6-8] are compiled in chronological order. In 1928, Hartley proposed an approach to information as a measure of changes in the diversity of a given set (the combinatorial approach). In this case, the information contained in any text equals I = n log2 m, where n is the number of letters in the message and m is the number of letters in the alphabet. Accordingly, if a text is written in a binary code, m = 2 and I = n.
In 1948, Shannon proposed a probabilistic approach to information. The amount of information in a message consisting of non-equiprobable elements is, by Shannon,

I = - n Σ pi log2 pi,

where pi is the probability of the i-th letter of the alphabet and the sum is taken over all m letters.
However, Shannon himself and other scientists [8] emphasized that Shannon's formula gives the amount of information that could potentially be contained in a signal or text consisting of a fixed number of ones and zeros. The formula says nothing about the actual amount of meaningful information in the signal; some other approach is needed to determine the meaning contained in it.
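For illustration, the two estimates can be computed directly; the following is a minimal Python sketch (the sample message is invented):

```python
from collections import Counter
from math import log2

def hartley_information(text, alphabet_size):
    # I = n * log2(m): every symbol carries log2(m) bits regardless of content.
    return len(text) * log2(alphabet_size)

def shannon_information(text):
    # I = -n * sum(pi * log2(pi)), with the probabilities pi estimated
    # from the symbol frequencies of the text itself.
    n = len(text)
    return -n * sum((c / n) * log2(c / n) for c in Counter(text).values())

message = "0010001001000010"            # an invented binary signal
print(hartley_information(message, 2))  # 16.0 bits
print(shannon_information(message))     # about 13 bits: "0" and "1" are not equiprobable
```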
In 1965, Kolmogorov proposed an algorithmic approach to the understanding of information. He introduced the concept of the "relative complexity" of an object "y" for a given "x" as the minimum length L(P) of a "program" P for obtaining "y" from "x". The definition thus depends on the "programming method", that is, on the function φ(P, x) = y, which puts the object "y" in correspondence with the program P and the object "x". The relative complexity by Kolmogorov is therefore

Kφ(y|x) = min L(P) over all programs P such that φ(P, x) = y,

and Kφ(y|x) is taken to be infinite if no such program exists.
If Kφ(y) = Kφ(y|1) is regarded as simply the complexity of the object "y", then the amount of information in "x" relative to "y" according to Kolmogorov is determined by the formula

Iφ(x : y) = Kφ(y) - Kφ(y|x).
Four provisions important for the understanding of this article have been proved:
1. The Kolmogorov complexity is not computable; that is, there is no algorithm by which the Kolmogorov complexity, or the minimum length of a message's description, can be determined. The absence of such an algorithm means that the algorithmic complexity of this task is infinite.
2. There is no computable lower bound for the Kolmogorov complexity. In general, the Kolmogorov complexity does not exceed the Shannon entropy (Shannon's information theory) [9].
3. Under an algorithmic transformation of the studied objects, the amount of information in them does not increase (more precisely, it increases by no more than a constant depending on the transformation algorithm). That is, if there is an object "y", the Kolmogorov complexity of its transform satisfies

K(A(y)) ≤ K(y) + CA,

where A(y) is the object "y" transformed by the algorithm A, and CA is a constant depending only on A (see the sketch just after this list). Other scientists have likewise pointed out that algorithmic processing of information arrays does not create new information; the same conclusion is drawn in the book by L. Brillouin [8].
4. The Rice-Uspensky theorem states that "all nontrivial statements about programs are algorithmically unsolvable". This theorem includes Turing's halting theorem as a special case.
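Provision No. 3 can be illustrated empirically. In the Python sketch below, the zlib-compressed length serves as a crude, computable stand-in for the uncomputable Kolmogorov complexity (it is only an upper bound), and an algorithmic transformation shifts that bound by no more than roughly the cost of describing the transformation itself:

```python
import zlib

def k_upper_bound(data: bytes) -> int:
    # Compressed length is only an upper bound on the Kolmogorov
    # complexity; the true K(y) is not computable.
    return len(zlib.compress(data, 9))

y = bytes(range(256)) * 40    # an arbitrary 10240-byte object
a_of_y = y[::-1]              # an algorithmic transformation A: reversal

print(k_upper_bound(y))       # estimate of K(y)
print(k_upper_bound(a_of_y))  # differs by at most roughly the constant
                              # CA needed to describe "reverse"
```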
The difference between the approaches of Hartley, Shannon and Kolmogorov is easy to illustrate by calculating the amount of information in some text by each of the methods. For simplicity, we will not use optimal encoding methods, but will simply enumerate letters and punctuation marks, assigning them consecutive numbers and converting the latter to binary. Each letter, sign or space between words is then written in a six-digit binary code. It is obvious that, according to Hartley, the amount of information in the text will simply be equal to the number of its letters, spaces and punctuation marks multiplied by six. According to Shannon, the amount of information will be smaller, since the probabilities of meeting "0" or "1" in the text will most likely differ. According to Kolmogorov, the amount of information may be smaller still. However, it is difficult to predict what specific amount will be determined, since there is no algorithm for calculating the Kolmogorov complexity; the amount of extracted information will depend on the reader's ability to cut off the noise and to synthesize the information received.
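A rough numerical comparison of the three measures can also be sketched in Python (the text is invented and deliberately repetitive; the byte-level entropy and the zlib-compressed length are only convenient approximations of the Shannon estimate and of the uncomputable Kolmogorov complexity, respectively):

```python
import zlib
from collections import Counter
from math import log2

text = "the theory of non-computable problems " * 20  # a repetitive test text
data = text.encode("ascii")
n = len(data)

# Hartley: six bits per letter, space or punctuation mark, as in the scheme above.
hartley_bits = n * 6

# Shannon: empirical symbol entropy times the number of symbols
# (an approximation of the formula from the previous section).
shannon_bits = -n * sum((c / n) * log2(c / n) for c in Counter(data).values())

# Kolmogorov: uncomputable; the compressed length gives an upper bound,
# and the repetitiveness of the text makes that bound far smaller.
kolmogorov_bound_bits = len(zlib.compress(data, 9)) * 8

print(hartley_bits, round(shannon_bits), kolmogorov_bound_bits)
# Expected ordering: Hartley >= Shannon >= Kolmogorov upper bound.
```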
Whether people can solve non-computable problems is an issue of fundamental importance, though a difficult one to prove.
Consider whether it is possible to create a program that will be able to distinguish between non-computable and computable problems.
Let us prove Theorem No. 1: "The task of distinguishing computable problems from non-computable problems is a non-computable problem". We prove it by contradiction. Suppose this problem is computable. Then there is an algorithm (Selector-1) that can recognize whether a problem is computable or non-computable. But then there is another algorithm (Selector-2), which can analyze the first algorithm (Selector-1) to see whether it presents a computable or a non-computable problem. Therefore, the second algorithm (Selector-2) can decide a non-trivial property of the first algorithm (Selector-1). However, this contradicts the Rice-Uspensky theorem. Thus, the task of distinguishing computable from non-computable problems is a non-computable problem.
Consider whether it is possible to create a program that will be able to invent non-computable problems.
Theorem No. 2 follows directly from Theorem No. 1: "The task of inventing non-computable problems is a non-computable problem". This follows from the obvious thesis that the skill of inventing non-computable problems must include the skill of recognizing non-computable problems. But according to Theorem No. 1, the latter skill is a non-computable problem.
Consider whether it is possible to create a program that will be able to create new information while processing information arrays.
This issue is closely related to Theorems No. 1 and No. 2, and the answer to it can be formulated as follows. Theorem No. 3: "The creation of new information or of a new algorithm while processing information arrays is a non-computable problem". This statement follows automatically from provision No. 3 on the Kolmogorov complexity and can easily be proved by contradiction.
There is an equally important question of whether the reverse statement is true: does the solution of a non-computable problem inevitably lead to the creation of new information? To answer this question, it is necessary to consider three possible options: 1. The solution of a non-computable problem can lead to the creation of new information; 2. The solution of a non-computable problem can lead to a decrease in the initial information; 3. The solution of a non-computable problem can create zero information.
The answer to the first question is positive and is given by Theorem No. 3. The answer to the second question is negative, because the source data known before the solution of the problem remains known after it; therefore, solving a problem does not decrease the amount of initial information. The third question has a negative answer, too. Suppose there is a non-computable problem whose solution creates zero new information. However, this fact itself is new information; therefore, the initial assumption leads to a contradiction. Thus, only one sensible option remains: solving a non-computable problem always leads to the creation of new information. This allows us to formulate the following theorem.
Theorem No 4. A non-computable problem is a problem whose solution leads to the creation of new information.
On the basis of this theorem, the classical definition of a non-computable problem as a problem unsolvable with the help of algorithms becomes perfectly understandable, since problems solved with the help of algorithms do not create new information. Another important fact follows from Theorem No 4. As is known, determining the Kolmogorov complexity of an object (text) is a non-computable problem, solved by finding the shortest length of the description of the object (text) under study. According to Theorem No 4, something new must be found in the process, and this new thing will evidently be knowledge about the real information content of the object (text) under study.
Consider now the task of solving a problem in the absence of sufficient initial data.
People have to deal with such problems both in scientific research and in other areas of human activity. To begin with, consider the simplest case of a single message function, based on the Shannon-Kotelnikov theorem. As is known, if there is a signal p(t) such that p(t) = 0 at t < 0 and at t > T, and the spectrum of p(t) contains no frequencies above fm, then the sampling theorem states that

p(t) = Σ Pi · sin(2π fm (t - i Δt)) / (2π fm (t - i Δt)),

where the sum is taken over the samples i. Thus, a continuous signal limited in time and in spectrum is completely determined by the discrete series of N of its values (N = T × 2fm + 1 = Nm + 1). Moreover, these values Pi (samples) are taken at regular intervals determined by the maximum frequency of the signal spectrum (Δt = 1/(2fm)).
Thus, according to the Shannon-Kotelnikov theorem, the information contained in the N samples completely determines all the parameters of the signal p(t) and allows it to be reproduced using the above algorithm. Therefore, this is a common computable problem.
Suppose now that one of the samples, namely Pk, is not known. Obviously, the simplest approximation in this case will be to take Pk ≈ Pk-1; if N >> 1, the error in determining p(t) at this point will be within a few tens of percent. Such accuracy may be acceptable for a number of specific problems. However, if two, three or more samples are not known, the accuracy will fall further and further for any approximation algorithm, and at some point it will become impossible to determine p(t) with the help of any algorithm. The problem becomes non-computable, since its solution requires finding out new information: where the lost samples are located. And the problem is not that there is no solution, but that there is an infinite number of possible solutions.
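The computable case and the crude Pk ≈ Pk-1 patch can be sketched in Python with numpy (all signal parameters here are invented for illustration):

```python
import numpy as np

f_m = 5.0                      # highest frequency in the spectrum, Hz (invented)
dt = 1.0 / (2.0 * f_m)         # sampling interval from the theorem
t_samples = np.arange(0.0, 2.0, dt)
samples = np.sin(2 * np.pi * 3.0 * t_samples)   # a band-limited test signal

def reconstruct(t, values, times):
    # Whittaker-Shannon interpolation: a sum of sinc kernels centred on
    # the samples; np.sinc(x) = sin(pi x) / (pi x).
    return sum(p * np.sinc(2 * f_m * (t - ti)) for p, ti in zip(values, times))

print(reconstruct(0.37, samples, t_samples))    # close to sin(2*pi*3*0.37)

# If sample k is lost, the Pk ~ Pk-1 patch keeps the reconstruction usable;
# with many lost samples, infinitely many band-limited signals remain
# consistent with the surviving data, and no algorithm can pick one out.
k = 10
patched = samples.copy()
patched[k] = patched[k - 1]
print(reconstruct(0.37, patched, t_samples))
```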
Another task similar to the task with lost samples can be specified: a task with the simplest linear (or branched) logical chain, where each element is associated with one previous element and one (or several) subsequent ones. It is not difficult to develop an algorithm for assembling the whole chain if the links between element number i and elements number i ± 1 are known and unambiguous. However, if some elements are lost, the problems described above arise.
In real science, such chains are rarely found, and more often, the individual facts or blocks of knowledge are connected to other such blocks not linearly, but in several dimensions. The most obvious and simple analogy is the famous children's game “Lego”. Therefore, we can call this connection of elements the Lego logic.
Obviously, revealing the laws governing the construction of such a structure will require some effort, involving, in particular, Gödel's theorem. However, one fact can be indicated immediately. Suppose there are N elements and it is known that M of them can be combined according to some rules into a single block. Obviously, if N is finite and the join rules are known, then a finite algorithm for assembling the M elements from the existing N elements can be developed. That is, such a problem is an ordinary computable problem.
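The computable case admits a direct sketch: with a finite set of elements and known join rules, an ordinary graph search decides whether the block assembles into a single whole. The following Python example (the toy chain of elements is invented) also shows how deletions can split the block:

```python
from collections import defaultdict, deque

def is_single_block(elements, join_rules):
    # Breadth-first check that the surviving elements still form one
    # connected block; join_rules is a list of joinable pairs (a, b).
    neighbours = defaultdict(set)
    for a, b in join_rules:
        if a in elements and b in elements:
            neighbours[a].add(b)
            neighbours[b].add(a)
    start = next(iter(elements))
    seen, queue = {start}, deque([start])
    while queue:
        for nxt in neighbours[queue.popleft()]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen == set(elements)

# A toy chain of six elements joined as 0-1-2-3-4-5.
rules = [(i, i + 1) for i in range(5)]
print(is_single_block(set(range(6)), rules))        # True: assembly is computable
print(is_single_block(set(range(6)) - {3}, rules))  # False: one deletion splits the block
```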
Suppose now that some element of M is missing. Apparently, the existing program can still basically restore the block of M elements. However, if a large number of elements are missing and, for example, M splits into two or more sub-blocks, then the program will not be able to restore the integrity of the block M. Let us now prove the following theorem:
The Lego Theorem (Theorem No 5). Let there be N lego elements (N >> 1). Let M of them (M >> 1) be combinable into a single block. Let X elements be deleted from M. The problem of determining the minimum X for which the assembly of the single block M becomes a non-computable problem is itself a non-computable problem.
Let us prove it by contradiction. Suppose there is a Lego algorithm that can determine X. But then the Lego algorithm is able to distinguish a computable problem from a non-computable one, which has been proved above to be impossible. Can people solve the problems described above? They can. Moreover, science largely develops precisely by restoring a holistic picture from individual experimental facts (points, samples). Let us give some classic examples on this topic.
Maxwell discovered the equations of the electromagnetic field on the basis of the experiments of Faraday and other experimenters. However, to determine the correct form of the equations, Maxwell had to postulate the presence of a displacement current, and this allowed him to make a number of discoveries. Thereby Maxwell was able to predict the unified nature of electromagnetic and optical phenomena.
The history of the discovery of the Mendeleev periodic table of chemical elements is no less well known. At the time of the discovery, some of the elements were unknown, and some had inaccurately measured parameters. Nevertheless, the correct form of Mendeleev's table of elements was determined.
The most striking example of solving a problem in the absence of sufficient data seems to be Einstein's creation of the general theory of relativity. Since Newton's time, it had been known that the inertial mass (F = ma) and the gravitational mass were equal, and this surprised no one. However, this experimental fact alone was enough for Einstein to understand the equivalence of gravitational and inertial phenomena and to create the general theory of relativity.
The four examples of non-computable problems given above can be solved by people. People can create new information, distinguish between computable and non-computable problems, invent non-computable problems, and solve problems in the absence of sufficient source data. These four skills shape the human ability to solve non-computable problems. Therefore, it can be considered proven that people can solve non-computable problems.
In mathematics, problems that have no solution algorithm are called non-computable; in the humanities, the ability of a person to solve problems that do not fit the framework of logic is called intuition.
Together with the emergence in mathematics of the theory of non-computable functions, i.e., functions that cannot be reduced to algorithms, there appeared an objective criterion allowing us to distinguish between computer intelligence and human intelligence. Computer intelligence is limited to algorithms and is not capable of solving non-computable problems. A person is able to go beyond algorithms and can solve non-computable problems, whose algorithmic complexity is infinite. The Hartley and Shannon approaches to information are fully algorithmic, which makes them very convenient in technical applications. The Kolmogorov definition of information cannot be reduced to algorithms and is essentially related to the peculiarities of human thinking. That is why Kolmogorov's approach to information is important when attempting to model the creative thinking of a human being.
Kolmogorov argues that the complexity of the information contained in an object changes by no more than a constant during algorithmic information processing. That is, computer-based information processing does not lead to the emergence of new information that was not originally contained in the source database. It follows that new information can be obtained either in an experiment, that is, through interaction with the external environment, or through information processing performed by a thinker acting outside the algorithms and solving non-computable problems in the course of that processing (the task in the absence of sufficient data, the task of establishing the complexity of an object (text)).
In this article, we implicitly relied on Kolmogorov's definition of information when formulating the theorems. This choice is explained by a number of considerations.
As has already been mentioned, Shannon and many other scientists emphasized that Shannon information (mathematical information) does not allow one to calculate the amount of semantic information (semantics) in a signal (text). That is, the value calculated using Shannon's formulas is an information parameter, but it does not describe the real semantic content of the information. In [10] it is proposed to label Shannon information the "capacity of information packaging"; the authors of this article believe that "information capacity" is the better term for the phenomenon. The Shannon formulas therefore allow us to determine how much semantic information can potentially be written in a given set of zeros and ones, while it is quite possible that nothing is written there at all. That is, the algorithms of Shannon determine the information capacity of a text or signal, a capacity which can be filled with semantic information. Whether this capacity is actually used must be determined with the help of other approaches, approaches that can detect the presence of semantic information within the information volume, and this presupposes such a skill as "Understanding". Obviously, any text (object) includes not only meaningful information but also informational noise, and the first can be distinguished from the second only by understanding the meaning of the text.
“Understanding” the content of an object (text) obviously implies the possibility of presenting this content in a brief form. And this brief record should be understood as meaningful information in the original information object. Thus, an attempt to formalize the term “meaningful (semantic) information” automatically leads to the requirement of “Understanding”, and understanding makes it possible to single out a brief, semantic, informationally meaningful content of a text. The amount of brief semantic content of a text, expressed in a binary code, can be defined as a quantitative parameter determining the content of semantic information in a text. Thus, in order to determine the amount of semantic information in a text, it is necessary to understand the content of this text and then put it in a short form, retaining the original meaning. The length of this extremely brief presentation will correspond to the amount of semantic information in the original text. Since there are no algorithms for “Understanding”, it is obvious that the determination of the amount of semantic information in a text is a non-computable problem.
Comparing this way of quantifying the semantic information in a text with the calculation of Kolmogorov information shows that they are identical. The calculation of Kolmogorov information proceeds from the shortest possible presentation of a text, and this is a non-computable problem.
The calculation of semantic, meaningful information likewise proceeds from the minimal presentation of a text, and this requires "Understanding". Thus, once the equivalence of the concepts "non-computable problem" and "Understanding" is recognized, it is clear that Kolmogorov's approach to information is an approach that provides for the study of semantic information. If we try to consider "Understanding" in terms of algorithms, it is obvious that "Understanding" cannot be reduced to finite algorithms, since it is "Understanding" that makes it possible to distinguish non-computable problems from computable ones, to invent new non-computable problems, to solve problems in the absence of sufficient data, and to establish the amount of semantic information when finding the Kolmogorov complexity of a text. Thus, in Kolmogorov's terms, the Kolmogorov complexity of "Understanding" is infinite.
Treating Shannon information as the information capacity of a text (object), and Kolmogorov information as the semantic information contained in the same text, we clearly see why Shannon information is an upper estimate for Kolmogorov information.
The recording of scientific information becomes more and more compact over time while including an increasing amount of information. That is, science in its development closely follows the precept of the monk Occam, who proposed "not to multiply entities without necessity" ("Occam's razor").
It can be said that a scientific discovery, in its formulated form, is a brief record of the information describing a phenomenon. On the other hand, Kolmogorov complexity is determined by the length of the shortest description of an object or phenomenon. Comparing these two approximate definitions, we can say that the formula of a discovery is a Kolmogorov record. But then the problems caused by the non-computability of Kolmogorov complexity extend to any such formula. Thus, to make a scientific discovery, there must be a thinker able to summarize a huge amount of scattered information, understand what data is missing, and formulate the shortest possible record of this information. This record will be the discovery formula and at the same time a Kolmogorov record, and the thinker will thereby prove his ability to solve non-computable problems.
It is then obvious that in order to make scientific discoveries, or to invent new algorithms in programming, business, politics or art, we need a thinker, a creator, who is able to use not only algorithmic but also non-algorithmic methods. Most often these methods are called intuition. How can such a thinker be created? We formulate a fairly obvious idea, which we will call the theorem on the creation of a thinker.
Theorem No. 6. "A thinker capable of solving non-computable problems cannot be created without using non-algorithmic methods. Therefore, the task of creating a thinker capable of solving non-computable problems is in itself a non-computable problem".
Let us prove it by contradiction. Suppose there are algorithms that allow one to create a thinker who can solve non-computable problems. But then, in the end, these source algorithms provide a solution to a non-computable problem. That is, the assumption that such algorithms exist leads to a contradiction.
Consequently, there are no algorithms for creating such a thinker, which was to be demonstrated. It is therefore impossible to create human-like artificial intelligence using direct programming methods.
There are no finite algorithms for making creative thinkers; however, such thinkers regularly arise in the human environment. The authors of [5] offer a number of suppositions as to why a child growing up in a complex human environment can learn to think creatively. The enabling conditions are formulated as follows: a complex mind, a developed body, developed sense organs, instinctively motivated behaviors, a complex social environment, the ability to acquire and process information, and openness to external information and sensory flows.
Only an experiment can show whether the listed conditions are sufficient for creating artificial intelligence that is not inferior to human intelligence.
Only when artificial intelligence whose creativity is not inferior to human creativity is created will it be possible to truly investigate the mechanisms of “Understanding” and say that we know how human thinking works.
The authors express their sincere gratitude to N. G. Gatina and A. E. Denisov for their numerous and helpful comments on the article.
REFERENCES
1. Russell S. J., Norvig P. Instructor's Manual: Exercise Solutions for Artificial Intelligence: A Modern Approach. Ed. 2. New Jersey, 2003.
2. Penrose R. Shadows of the Mind: An Approach to the Missing Science of Consciousness. Oxford, Oxford University Press, 1994.
3. Markram H. The Blue Brain Project. Nature Reviews Neuroscience. 2006, vol. 7, pp. 153-160.
4. Frye J., Ananthanarayanan R., Modha D. S. Towards Real-Time, Mouse-Scale Cortical Simulations. CoSyNe: Computational and Systems Neuroscience. Salt Lake City (Utah), 2007, Feb. 22-25 [also appears as IBM Research Report RJ 10404, 2/5/2007].
5. Sheroziya G. A., Sheroziya M. G. Chelovecheskij razum, rozhdennyj v setyakh iskusstvennykh logicheskikh ehlementov: vvedenie v proekt sozdaniya novogo cheloveka [The Human Mind Originating from Networks of Artificial Logical Elements: Introduction into the Project of Creating the New Man]. Ryazan, Priz Publ., 2013, 280 p. (In Russian).
6. Shannon C. E. A Mathematical Theory of Communication. Bell Syst. Techn. Journ. 1948, vol. 27, pp. 379-423, 623-656.
7. Kolmogorov A. N. Three Approaches to the Definition of the "Amount of Information". Problemy peredachi informatsii [Problems of Information Transmission]. 1965, vol. 1, iss. 1, pp. 3-11.
8. Brillouin L. Science and Information Theory. New York, Academic Press Inc. Publishers, 1956.
9. Vereshchagin N. K., Uspensky V. A., Shen A. Kolmogorovskaya slozhnost' i algoritmicheskaya sluchajnost' [Kolmogorov Complexity and Algorithmic Randomness]. Moscow, MTSNMO Publ., 2013, 576 p.
10. Korogodin V. I., Korogodina V. L. Informatsiya kak osnova zhizni [Information as the Basis of Life]. Dubna, Phoenix Publ., 208 p.