Professor Hawking's view on the apocalypse
N. Ostrovets, T. Matvieieva, M. Chursanova
National Technical University of Ukraine «Igor Sikorsky Kyiv Polytechnic Institute»,
Kyiv, Ukraine
Abstract
The paper presents an overview of probable apocalypse scenarios from the point of view of modern scientists. According to their predictions, the ancient myth may become a reality even before the end of this millennium if humanity does not act to prevent a possible catastrophe. The purpose of our work is to analyze those world tendencies and areas of modern technology that, if developed without control, can pose a threat to the existence of life on Earth. Stephen Hawking, the world-renowned theoretical physicist and cosmologist, was the first to raise the issue of the end of the world on a scientific level. The paper reviews materials from his public speeches, interviews, publications and projects that identify the potential dangers of the apocalypse, as well as the initiatives taken by the scientist to save humanity. According to Hawking, one of the greatest threats is man-made artificial intelligence (AI). The theory of technological singularity states that AI develops much faster than the pace of biological evolution, can overtake humans within the next 100 years, and will be able to create and improve its own kind. The main problem today is to design a reliable system of control over AI. Hawking, along with tens of thousands of scientists and experts, signed an open letter declaring it unacceptable to equip military weapons with artificial intelligence. The use of autonomous AI arms poses a threat of uncontrollable warfare and is potentially more dangerous than nuclear warheads. Another threat Hawking warned about is an intrusion by a more advanced extraterrestrial civilization. He took part in the launch of the Breakthrough Listen project, which monitors space objects with powerful radio telescopes, but no intelligent alien signals have been detected so far. A more immediate problem is climate change on Earth, with the unceasing trend of global warming evidenced by reports of the World Meteorological Organization. The accelerating emission of greenhouse gases can make the Earth's climate similar to that of Venus. Therefore, the most urgent requirement for all branches of science and engineering is the search for new environmental technologies. No less dangerous are genetically engineered viruses. Development of biological weapons and genetic modification of viruses can get out of control and destroy human immunity. Viruses evolve even without human influence, forming new incurable strains, and nowadays we are witnessing the reality of such a threat during the COVID-19 pandemic. Twenty years ago Hawking warned that a virus, rather than an atomic bomb, could destroy humanity. Thus, the catastrophes considered here are the result of humankind's development and may become the cause of its annihilation. The most important responsibility of the scientific community is to prevent such scenarios. Moreover, Hawking suggested that the key to the survival of human civilization is the development of colonies in outer space.
Keywords: apocalypse, artificial intelligence, global warming, virus, Stephen Hawking.
Annotation
N. O. Ostrovets, T. V. Matvieieva, M. V. Chursanova
National Technical University of Ukraine «Igor Sikorsky Kyiv Polytechnic Institute», Kyiv, Ukraine
THE APOCALYPSE FROM PROFESSOR HAWKING'S POINT OF VIEW
The paper presents an overview of the apocalypse threats that, according to scientists' predictions, may be realized already within this millennium. Materials from speeches, interviews and publications of S. Hawking, the world-famous theoretical physicist and cosmologist who voiced the problem of the end of the world on a scientific level, are reviewed. According to Hawking, one of the greatest threats is artificial intelligence. It develops much faster than the pace of biological evolution and will soon be able to reproduce its own kind on its own. Scientists have signed an open letter calling on no account to equip military hardware with artificial intelligence, since, once out of human control, it would be more dangerous than nuclear weapons. Among other possible apocalypse scenarios Hawking singled out an invasion by an extraterrestrial civilization, climate change, and the spread of genetically modified viruses. The growing rate of global warming caused by the greenhouse effect can make the Earth's climate similar to that of Venus, while the development of biological weapons by means of genetic engineering may release a virus that destroys human immunity. Viruses evolve even without human intervention, and we are observing the reality of such a threat now, during the COVID-19 pandemic. The catastrophes considered in the paper are the result of the development of humankind and may become the cause of its destruction. The task of the scientific community is to prevent such scenarios. Moreover, Hawking considered the creation of colonies in space to be the key to the survival of human civilization.
Keywords: apocalypse, artificial intelligence, global warming, virus, Stephen Hawking.
Introduction
The question of the apocalypse has always worried the minds of mankind. Every century has its hundred prophets convinced of their rightness, and numerous science fiction writers have offered their own versions of events. First of all, the end of the world is understood as a global catastrophe on a planetary or above-planetary scale that leads to the destruction of human civilization. In the scientific sense, this phrase often means the destruction of all existing things, that is, of the whole material world. One can recall several thousand announced dates of the end of the world. The last time humanity prepared for the apocalypse was in 2012, based on the Mayan calendar, which caused mass panic in many countries. However, superstition aside, scientists also have their say about a probable apocalypse and are actively exploring this problem, and their opinions should be listened to.
Stephen Hawking, the world-renowned British scientist, theoretical physicist and cosmologist, was the first to raise the issue of the end of the world on a scientific level. He alarmed the whole world with his statement about when the world may collapse. He first announced it at a student meeting in Oxford, and then for many years he kept warning about possible risks for the future of the human race. The aim of our work is to analyze those world tendencies and areas of modern technology that, according to Hawking's predictions, can pose a threat to the existence of life on Earth in case of their uncontrolled development. The paper reviews materials from Hawking's public speeches and interviews, publications and projects that identify the potential dangers of the apocalypse, as well as initiatives taken by the scientist to save humanity. Our paper also overviews the current state of progress of these threatening trends. The modern scientific community should pay attention to these problems and prevent a situation where the future of human civilization is trapped by a reckless turn in the course of scientific and technological progress. "We are not going to stop making progress, or reverse it, so we must recognize the dangers and control them. I'm an optimist, and I believe we can do it," Hawking said in an interview with the BBC.
Stephen Hawking is famous for his revolutionary approach in cosmology, which unites the general theory of relativity and quantum mechanics. His scientific works in theoretical physics concern the gravitational singularity theorems in the framework of general relativity; he theoretically predicted radiation from black holes, explained by quantum effects near the black hole event horizon, which leads to the idea of a singularity reactor with black hole evaporation serving as the energy source. Hawking noted that over the past 50 years science has developed rapidly and humanity has come closer to understanding the universe. He expressed hope that in the near future researchers will be able to use gravitational waves to see how the Big Bang happened.
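As a simple numerical illustration of the black hole evaporation mentioned above (our own sketch, not part of Hawking's publications), the Hawking temperature T = ħc³/(8πGMk_B) can be evaluated for bodies of different mass; all constants below are standard SI values.

```python
import math

hbar = 1.054571817e-34   # reduced Planck constant, J*s
c = 2.99792458e8         # speed of light, m/s
G = 6.67430e-11          # gravitational constant, m^3 kg^-1 s^-2
k_B = 1.380649e-23       # Boltzmann constant, J/K
M_SUN = 1.989e30         # solar mass, kg

def hawking_temperature(mass_kg):
    """Temperature (K) of the Hawking radiation of a black hole of given mass."""
    return hbar * c**3 / (8 * math.pi * G * mass_kg * k_B)

for m in (M_SUN, 1e12):  # a solar-mass black hole and a small (primordial-scale) one
    print(f"M = {m:.3e} kg  ->  T_H = {hawking_temperature(m):.3e} K")

# A solar-mass black hole radiates at ~6e-8 K, colder than the cosmic microwave
# background, so only very light black holes evaporate appreciably -- which is why
# the "singularity reactor" idea relies on small, hot, rapidly evaporating holes.
```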
Stephen Hawking also contributed a lot to the popularization of high science and to making it accessible to people [14]. His books «A Brief History of Time: From the Big Bang to Black Holes», «The Universe in a Nutshell», «Brief Answers to the Big Questions» and others discuss the basic concepts of modern physics and the theory of evolution of the Universe. They also touch upon the question of how the world will end, and in particular the dangers for the future of life on our planet.
The present study covers a chronological period from the early 2000s, when Stephen Hawking publicly raised the question of the extinction of human civilization, up to the present day, when even after the scientist's death in 2018 we can see how his predictions are proceeding. Professor Hawking believed that the human race must find a way to travel in space and colonize other planets before an apocalypse scenario happens on Earth. «I am not sure that the human race will sustain for at least another thousand years if it does not find the opportunity to escape into space. There are many scenarios of how all living things on a small planet can die. But I am an optimist. We will definitely reach the stars,» the scientist stated in his interview with the «Daily Telegraph» back in 2001. Professor Hawking suggested and argued for several possible scenarios of the Earth apocalypse, urging scientists around the world to explore space actively. After all, in his opinion, the colonization of space is the only chance to save humanity.
Let us consider some of the most threatening options for the future of mankind according to Stephen Hawking.
Artificial Intelligence (AI). «I think that computer viruses should be considered as a new form of life. Maybe it says something about human nature, that the only form of life we have created so far is purely destructive. Talk about creating life in our own image,» Stephen Hawking said in his book «A Brief History of Time» [4]. Many people agree with this opinion, believing that humanity, despite its high technological development, lags behind in its moral principles.
Back in 2014, in an interview with the British daily «The Independent», Stephen Hawking stated: «Success in creating AI would be the biggest event in human history. Unfortunately, it might also be the last, unless we learn how to avoid the risks». By risk, the scientist meant the impossibility of competition between robot and human. Artificial intelligence is dangerous because it develops much faster than the pace of biological evolution of humans (Fig. 1). In addition, the use of AI in military equipment makes that equipment more autonomous. The result may be more brutal warfare and the mass extermination of civilians.
Proponents of the technological singularity theory (a hypothetical explosive increase in the speed of scientific and technological progress, likely to occur due to the creation of artificial intelligence and machines capable of self-reproduction) believe that artificial intelligence will be able to create and improve its own kind by itself, and humanity will not be able to control it [13]. «Computers will overtake humans with AI at some point within the next 100 years. When that happens, we need to make sure the computers have goals aligned with ours,» Hawking said. Elon Musk, the co-founder of the rocket company SpaceX and of the electric vehicle and clean energy company Tesla, who serves on the board of advisors of the Future of Life Institute, shares Stephen Hawking's concern about AI. In his opinion, «uncontrolled artificial intelligence is potentially more dangerous than nuclear warheads.»
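The singularity argument can be illustrated with a toy calculation (our own, not taken from [13]): if each new generation of a self-improving system arrives after an interval that is a fixed fraction of the previous one, the intervals form a geometric series, so an unbounded number of "generations" fits into a finite time. The parameters below (a 30-year first interval, a ratio of 0.7) are arbitrary illustrative values.

```python
def time_to_generation(n, first_interval=30.0, r=0.7):
    """Years until the n-th improvement if each interval is r times the previous one."""
    return sum(first_interval * r**k for k in range(n))

# Closed form of the geometric series: the total time never exceeds t0 / (1 - r).
upper_bound = 30.0 / (1 - 0.7)

for n in (1, 5, 10, 50):
    print(f"generation {n:>2}: {time_to_generation(n):6.1f} years")
print(f"upper bound on total time: {upper_bound:.1f} years")

# Biological evolution, by contrast, keeps its intervals roughly constant,
# which is the widening gap Hawking warned about.
```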
Here are some quotes from Elon Musk, where he clearly expresses his views on the dangers of artificial intelligence and suggests that the electronic mind can go out of human control: «With artificial intelligence we are summoning the demon. You know all those stories where there's the guy with the pentagram and the holy water and he's like... yeah, he's sure he can control the demon, but it doesn't work out. I think we should be extremely careful with artificial intelligence. If I were to guess at what our biggest existential threat is, it's probably that. I'm increasingly inclined to think there should be some regulatory oversight, maybe at the national and international level, just to make sure that we don't do something very foolish». Earlier, E. Musk stated that people simply do not understand how fast artificial intelligence progresses and what threat it can pose.
However, there are opponents of this theory. Steve Wozniak, the co-founder of the Apple corporation, who previously supported the technological singularity theory, believes that robots will forever remain only a small addition to humans, because they are not able to think intuitively.
Creating effective AI systems is not a problem nowadays; there are plenty of such developments in the world. The problem is the absence of new approaches to building a system of control over artificial intelligence, which would have to be primarily ethical.
Human aggression. Professor Hawking believed that, with such rapid technical development, human aggression can contribute to the collapse of human civilization. Aggression is instinctive behavior of humans and animals that arises in order to protect themselves and their species. «The human failing I would most like to correct is aggression. It may have had survival advantage in caveman days, to get more food, territory or partner with whom to reproduce, but now it threatens to destroy us all,» Hawking told «The Independent».
Figure 1. Reduction of time intervals between evolutionary events (biological and technological evolution) [13]
This vague and seemingly abstract warning has a strong scientific basis behind it. In the last century, the Austrian biologist Konrad Lorenz scientifically explained the nature of human aggression, for which he received the Nobel Prize. His theory states that aggression is directed against members of the same species. It is aimed at the struggle for survival and the destruction of competitors, and it is fueled by primitive instincts [8]. Thus, the essence and purpose of all conflicts and wars have not changed over the centuries; only the weapons and methods of warfare have changed.
At the International Joint Conference on Artificial Intelligence 2015, Stephen Hawking, along with thousands of scientists and AI researchers, initiated an open letter urging that artificial intelligence not be given to military weapons [1]. Considering the current rate of AI development, it may take just several years to construct autonomous weapons, like quadcopters or drones, that are able to select and engage targets on their own according to pre-defined criteria. Unlike nuclear weapons, AI arms do not require expensive or hard-to-obtain raw materials, so their mass production would be cheap and fast to establish. If such weapons go out of human control, reproducing their own kind, chasing and selectively killing particular groups of people, or if they get into the hands of terrorists or aggressive dictators, that would start a catastrophe on a worldwide scale.
Autonomous weapons bring the third revolution in warfare, after gunpowder and nuclear arms. If «dead» machines come to the battlefield instead of human soldiers, all the principles of warfare will be turned upside down. Fighting will become maximally fierce, because the moral factor will disappear. Military commanders will lose any fear and will be tempted to use AI arms to the maximum, for example, for destroying certain ethnic groups or rebellious populations, while possible human losses on their own side will be completely ignored [1].
At some point, military AI technology may come to the conclusion that it does not need human control. In such a case many science fiction writers will be reckoned as prophets. There will be killer robots that choose targets on their own, without direct human commands, and later they may turn against all mankind. The game «Humans vs Robots» has a chance of becoming a cruel reality.
That is why tens of thousands of the world's leading experts in AI and robotics technology have already signed the open letter and declared that they are not interested in building AI weapons and that a military AI arms race must be prevented from starting. AI technology has great potential to benefit humanity in many peaceful areas if it is directed in the right way and not allowed to go beyond meaningful human control [1].
Advanced extraterrestrial civilization. «An advanced extraterrestrial civilization can destroy humanity, but let's keep looking for it. It would be better if we find them first,» Hawking said after joining the «Breakthrough Listen» program (Yuri Milner's project to search for intelligent extraterrestrial communications in the universe). «The older I get, the more I believe that we are not alone in the universe. If so, we need to think about the future. Most likely, the aliens are more advanced, and our first contact with an advanced civilization could be equivalent to when Native Americans first encountered Christopher Columbus. You remember, things didn't turn out so well for those who greeted the guests,» Hawking warned [14].
Hawking's warning has caused a storm of emotions. The Astronomer Royal, Lord Rees, said in an interview with «The Sunday Times» that he does not exclude the possibility that aliens are already among us without our realizing it. «I suspect that extraterrestrial life and intelligence can take incomprehensible forms,» he said. «Just as a chimpanzee can't understand quantum theory, it could be that there are aspects of reality that are beyond the capacity of our brains» [4].
Nowadays scientists have no idea how such contact could happen. However, Hawking reminded us of the way humans treat less developed beings (e.g. animals), and of the history of clashes between more and less developed civilizations (conquistadors and local aborigines) [5]. An extraterrestrial civilization that discovered Earth would probably be billions of years ahead of humanity and would claim our planet as another colony.
However, Jill Tarter, director of the SETI Research Center, which is also searching for extraterrestrial civilizations, considers Hawking's fears unfounded. «If aliens were able to visit Earth, that would mean they would have technological capabilities sophisticated enough not to need slaves, food or other planets. I hope that extraterrestrial civilizations are not only more technologically advanced than we are, but also more aware of the value and rarity of life in space,» Tarter said.
In the fall of 2016, Yuri Milner, Stephen Hawking and Mark Zuckerberg announced that the «Breakthrough Listen» project, which searches for alien radio communications, would start listening to a potentially inhabited planet orbiting Proxima Centauri, the third star of the Alpha Centauri system and the nearest neighbor of the Sun. Nowadays the project uses radio telescopes to monitor over 1,000,000 stars and galaxy centers in search of artificial radio or laser signals. In 2020 the detection of a narrowband radio signal from the direction of Proxima Centauri was reported, the first candidate for an intelligent alien communication. However, it has not been proven that the signal really has an extraterrestrial origin, and scientists are working to exclude terrestrial interference [10].
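For illustration only (this is not the actual Breakthrough Listen pipeline), the sketch below shows why narrowband signals are treated as technosignature candidates: natural radio emission spreads over many frequencies, so even a weak carrier concentrated in a single spectral bin stands out above the noise in a power spectrum. The sampling rate, carrier frequency and detection threshold are arbitrary illustrative values.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 1.0e6                                      # sample rate, Hz (arbitrary)
t = np.arange(2**16) / fs
noise = rng.normal(scale=1.0, size=t.size)      # broadband background
carrier = 0.05 * np.sin(2 * np.pi * 2.5e5 * t)  # weak narrowband "signal"

spectrum = np.abs(np.fft.rfft(noise + carrier)) ** 2
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)

# Flag bins whose power exceeds the median by a large factor (a crude detection rule).
threshold = 20 * np.median(spectrum)
candidates = freqs[spectrum > threshold]
print("candidate narrowband frequencies, Hz:", candidates)
```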
It should be mentioned that the greatest dream of Stephen Hawking was to travel into space, and in 2015 Richard Branson, the founder of the Virgin company, offered him a free flight with Virgin Galactic. Unfortunately, that dream did not come true before Hawking's death, but he put a lot of effort into supporting the program of commercial space flights, considering space programs important for collaboration between different countries of the world and inspiring for future generations.
Venus-like Earth with an average temperature of 250 degrees Celsius. These days you can hardly find someone who has not heard about global warming. The problem of the global average temperature increasing due to greenhouse gases accumulating in the atmosphere is constantly raised at various meetings on environmental problems. Since the Industrial Revolution, emissions of such gases as carbon dioxide, methane, nitrous oxide and ozone have grown every year. Accumulating in the atmosphere, they are transparent to visible light but absorb the infrared heat radiation emitted by the Earth's surface and re-emit part of it back to Earth, acting like greenhouse walls. That leads to an increase in the atmospheric temperature near the Earth's surface [12]. According to the annual reports of the World Meteorological Organization (WMO), the past six years, including 2020, have been the six warmest years on record [15]. In 2016 the global mean temperature was about 1.1 °C above the pre-industrial level, and by 2020 it had risen to about 1.2 °C, approaching the 1.5 °C to 2 °C limits set by the Paris Agreement (see Fig. 2). In order to stabilize the situation, an essential reduction of greenhouse gas emissions is required, because even slight increases in the average global temperature can have huge negative effects.
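The greenhouse mechanism described above can be captured by a back-of-the-envelope, zero-dimensional energy balance (our own illustration, not a model from the WMO reports): the larger the fraction of outgoing infrared radiation that the atmosphere absorbs and re-radiates back to the surface, the higher the equilibrium surface temperature.

```python
SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0              # solar constant, W m^-2
ALBEDO = 0.3             # fraction of sunlight reflected back to space

def surface_temperature(ir_absorption):
    """Equilibrium surface temperature (K) in a one-layer greenhouse model, where
    the atmosphere absorbs the fraction 'ir_absorption' of the surface's infrared
    emission and re-radiates half of it back down."""
    absorbed_solar = S0 * (1 - ALBEDO) / 4   # averaged over the whole sphere
    return (absorbed_solar / (SIGMA * (1 - ir_absorption / 2))) ** 0.25

for f in (0.0, 0.77, 1.0):
    print(f"IR absorption {f:.2f} -> surface temperature {surface_temperature(f):.1f} K")

# f = 0 gives the airless value of ~255 K; f near 0.77 reproduces the observed
# ~288 K; stronger infrared absorption pushes the temperature higher still, and a
# runaway of this effect is what keeps Venus so hot.
```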
When in 2017 US President Trump announced the decision to withdraw from the Paris climate agreement to reduce CO2 levels, Stephen Hawking could not stand aside. «We are close to the tipping point where global warming becomes irreversible. Trump's action could push the Earth over the brink, to become like Venus, with a temperature of 250 degrees, and raining sulphuric acid,» he said in an interview with BBC News.
The most dangerous effect of global warming is the accelerating rise of the ocean temperature, which leads to the melting of glaciers and the rise of the sea level. The ocean absorbs the majority of the excessive heat energy accumulated on Earth due to the greenhouse effect. The growth of the ocean heat content results in the thermal expansion of water and, through the accelerated melting of the ice sheets, adds extra water volume; both effects raise the global mean sea level. Currently it continues to rise with an average trend of 3.3 ± 0.3 mm per year, pushing coastal territories under water. According to recent WMO reports, Antarctica loses approximately 175 to 225 Gt of glacier ice per year and is the main cause of the accelerated sea-level rise. Moreover, as the concentration of CO2 in the atmosphere rises, so does the concentration of CO2 in the oceans, causing acidification of the water [15].
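A quick arithmetic check of the quoted Antarctic figures can be made with the commonly used conversion that roughly 362 Gt of ice correspond to 1 mm of global mean sea level (the conversion factor is ours, not taken from the WMO report):

```python
GT_PER_MM = 361.8   # gigatonnes of ice per millimetre of global mean sea level

for ice_loss_gt_per_year in (175, 225):
    rise_mm_per_year = ice_loss_gt_per_year / GT_PER_MM
    print(f"{ice_loss_gt_per_year} Gt/yr of ice loss ~ {rise_mm_per_year:.2f} mm/yr of sea-level rise")

# Roughly 0.5-0.6 mm/yr from Antarctica alone -- a substantial share of the
# observed 3.3 mm/yr trend once thermal expansion and other ice sheets are added.
```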
The accompanying permafrost thaw should also be included in climate change models. According to experts, subsea permafrost contains large stocks of organic carbon and billions of tons of carbon dioxide and methane. If ocean temperatures keep rising at the current rate, the emission of these gases from marine permafrost may reach a hundred billion tons per year. That is about four times greater than the typical average annual CO2 emissions from industry, and such a multiplication of the global warming tendency would be catastrophic [11].
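As a rough sanity check of the comparison above (the industrial figure of about 37 Gt of CO2 per year is our own order-of-magnitude assumption, not a value from [11]), emissions of the order of a hundred billion tons per year would indeed exceed the industrial total several times over:

```python
INDUSTRIAL_CO2_GT_PER_YEAR = 37.0   # assumed order-of-magnitude industrial emissions

for permafrost_gt_per_year in (100.0, 150.0):
    ratio = permafrost_gt_per_year / INDUSTRIAL_CO2_GT_PER_YEAR
    print(f"{permafrost_gt_per_year:.0f} Gt/yr from permafrost ~ {ratio:.1f}x industrial emissions")
```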
So, we are now on a completely unexplored climate platform, which leaves no room for an optimistic attitude and requires the fastest and most rational actions to stop the catastrophe. «Climate change is one of the great dangers we face, and it's one we can prevent if we act now,» Hawking said. Therefore, the most urgent requirement set for all branches of science and engineering is the search for new, primarily environmental technologies. The current ecological crisis keeps globalizing, taking on a worldwide scale, and poses an extraordinary threat to the existence of the natural environment and humanity in general. That threat is the factor that should unite people to prevent their own self-destruction [3]. Ignoring the problem does not help to solve it.
Genetically engineered virus. Extinction of the human race because of a virus is another possible apocalypse scenario. Despite the bans set by the international Biological Weapons Convention, research on genetically engineered biological weapons is still carried out in laboratories throughout the world. «In the long term, I am more worried about biology... genetic engineering can be done in a small lab. You can't regulate every lab in the world. The danger is that either by accident or design, we create a virus that destroys us,» Stephen Hawking said in an interview with the «Telegraph».
Modern advances in biotechnology can be used to create and genetically enhance dangerous pathogens for hostile purposes, while combining them with AI and robotics technologies can deliver them to specific targets, such as particular groups of people and even individuals. That is, AI could enable the automation of development or production steps in the design of pathogens that would affect only specific individuals or groups of people, while nanorobots could deliver biological agents to specific cells in the human body [2]. «Quick victories at minimal cost» is the dream of many states with an aggressive policy. The rate of improvement of biological weapons is proportional to the improvement of the defense capabilities of the most developed countries. Genetic engineering can work for positive purposes, such as fighting cancer cells, but at the same time it can create a virus that destroys the human immune system, create a gene that controls an addict's desire to receive another dose of drugs, or even provoke certain actions.
Viruses evolve even without artificial influence, going out of human control and forming new incurable strains, and nowadays we are witnessing the reality of such a threat during the COVID-19 pandemic. Back in 2001, Stephen Hawking warned about a similar scenario in an interview with the «Telegraph»: «The human race will disappear before the end of this millennium probably because of a virus and not an atomic bomb if no human colonies are set up in outer space.» According to Hawking, genetic engineering should rather be used to make humans more suitable for space travel and more adaptable to living in space.
Discussion. Analyzing the prevailing tendencies in modern science and engineering, one can conclude that the world community is starting to listen to the apocalyptic warnings of scientists, and the great work begun by S. Hawking is bearing fruit. It is the problem of saving life on our planet that sets the most topical challenges and trends in the development of today's science.
Agreements have been reached at the international level to ban the building of AI weapons, to ban biological weapons and to establish new environmental standards; the most important thing now is to comply with these decisions. For example, there are emission standards accepted in the USA, Europe, Japan and Korea. The European Commission is already developing the next emission standard, Euro 7/VII, for vehicles, which is expected to come into force in 2025. According to Euro 7/VII, it is planned to cut emissions of nitrogen oxides (NOx) by more than half and of carbon dioxide (CO2) by more than two thirds [6]. Such strict rules are necessary to save the ecology of the Earth, but they are impossible to meet with internal combustion engines alone. That poses an urgent challenge to science and engineering to bring automotive technologies to a revolutionary new level, with total hybridization/electrification and the use of innovative approaches.
Elon Musk, who was like-minded with Stephen Hawking, continues to promote the same ideas. In November 2020 his company Tesla, together with other American companies, founded the Zero Emission Transportation Association (ZETA), which aims for zero pollution: a vehicle must operate with no emissions in any driving situation [16]. The «zero pollution ambition» should be a model for other industries as well. At the same time, the SpaceX company is working to make space flights low-cost and widely available, and Elon Musk is planning major investments to build the first autonomous city on Mars [9].
Conclusions
The catastrophes considered in the present paper are the result of humankind's development and may become the cause of its annihilation. The current stage of the global crisis is associated with the rise of the technical power of mankind, and that very power makes it destructive as never before [6]. Given the pace of development of industrial and military technologies, humanity is already operating with factors that could lead to a global catastrophe, and the most important responsibility of the scientific community is to prevent the apocalypse scenarios for as long as possible.
Moreover, in order to save the human race, Professor Hawking strongly recommended going into outer space and colonizing one of the nearest star systems, such as Proxima Centauri. The project of colonizing Mars has already been started by the SpaceX company. Mankind cannot stop scientific and technological progress, but we can do everything to lead it in the right direction, for the protection of life on Earth and the future of human civilization.
References
1. Autonomous weapons: an open letter from AI & robotics researchers (2015). Retrieved from: https://futureoflife.org/open-letter-autonomous-weapons.
2. Brockmann, K., Bauer, S., Boulanin, V. (2019). BIO PLUS X: Arms Control and the Convergence of Biology and Emerging Technologies. Stockholm International Peace Research Institute (SIPRI), 44 p.
3. Goncharenko, M. M. (2014). Ekokryza v strukturi hlobal'nykh problem suchasnosti [Ecocrisis in the structure of global problems of today]. Fylosovskoe spysanye, No 2, pp. 14-18 (in Ukrainian).
4. Hawking, S. (2015). A Brief History of Time: From the big bang to black holes. Kiev: K.I.S, 201 p. (in Ukrainian).
5. Hawking, S., Mlodinov, L. (2016). The shortest history of time. Kharkiv: Klub simeynoho dozvillya, 160 p. (in Ukrainian).
6. ICCT's comments and technical recommendations on future EURO 7/VII emission standards (2021). The International Council on Clean Transportation, Berlin, May 7, 2021, 28 p. Retrieved from: https://theicct.org/sites/default/files/eu-commission-euro-7-and-VI-may2021.pdf.