Research and development of an automated video surveillance system to perform special functions



University of Technology

Research and development of an automated video surveillance system to perform special functions

B. Moroz, Doctor of Technical Sciences, Professor;

A. Shcherbakov, graduate student.

Dnipro, Ukraine

Annotation

Analysis of research and development of an automated video surveillance system for performing special functions

B. I. Moroz, A. G. Shcherbakov

Algorithms for creating an automated system for recording and displaying information from aircraft, with observation in interactive operator-control mode, are considered. The VR180 architecture for encrypted transmission of streaming video from several cameras on an aircraft, with in-flight video stabilization and projection onto a virtual-reality helmet in a 360-degree viewing perspective, is analyzed.

Keywords: FPV system, AVC/SVC coding system, VR180, image tracing, video compositing algorithm, FoV, VR system, interval modeling.


Abstract

Research and development of an automated video surveillance system to perform special functions

B. Moroz, A. Shcherbakov

A comprehensive algorithm for creating an automated system for recording and displaying information from aircraft, with observation in interactive operator-control mode, is presented. An architecture for encrypted transmission of streaming video from several cameras on an aircraft, with in-flight video stabilization and projection onto a virtual-reality helmet in a 360-degree viewing perspective, is proposed.

Keywords: FPV-system, video registration, streaming video, algorithm, video camera, Diffie-Hellman algorithm, YOLO algorithm, neural network.

Introduction and statement of the research problem

Virtual-reality components make it possible to simulate the interaction of objects with a synthetic environment and also to receive high-quality imagery remotely from a distant point.

For these purposes, VR glasses are used: they allow the wearer to turn the head in all directions, horizontally and vertically, while viewing and to receive an up-to-date image without significant loss of quality.

The information carriers are flying objects (drones) equipped with cameras. If the virtual-reality system works correctly, a person wearing VR glasses sees almost the same picture as that captured by the camera of a quadcopter or other aircraft.

Thus, our task is to understand two important aspects:

1. What the VR headset projects and which image is subsequently processed by the software.

2. How wide-format shooting and the subsequent combining are performed.

Goal. To propose a comprehensive algorithm for obtaining an up-to-date image from multiple cameras, paying attention to the further processing of files and the combining of the received data into a single picture.

To describe the principle of obtaining information from video cameras, the overlapping of fragments, and solutions to potential problems caused by camera shake and unexpected panorama changes.

Analysis of recent research and publications. At present, there are several scientific developments related to this topic:

1. Google has proposed a new format for shooting VR video, namely VR180, which makes it easier to deal with the redundant information obtained when shooting with a panoramic view [1].

2. Researchers from Indiana University, together with AT&T Labs Research, have proposed a number of innovative ideas that include integration algorithms, i.e. a holistic system that exploits effects at the system level [2].

3. Nick Lievendag has proposed using real-time rendering as a solution to problems associated with stereoscopic 3D: the distance between two virtual cameras can be linked to variable control of the interpupillary distance on the headset [3].

Statement of the main material

With the development of modern technology, the demand for panoramic (360-degree) video has increased. Virtual-reality headsets further fuel viewer interest and also allow such software to be used for professional purposes. For consumer entertainment and education, interest in 360-degree video has tripled since 2015 [4].

Demand for VR support is expected to grow rapidly: closer to 2020, sales of headsets and virtual-reality systems were forecast to increase by 70% [5]. The dynamics of interest in 360-degree video are shown in Figure 1.

Fig. 1. Comparison of the dynamics of search queries for 360-degree video (2015-2017)

An FPV system, used as a way of interacting with the drone, makes it possible to obtain a panoramic image. The end result, however, is still far from optimal, since 360-degree streaming video is "flat," with noticeable artifacts where fragments are combined [6].

The Indiana University development team, with the support of specialists from AT&T Labs Research, concluded that a number of problems associated with panoramic streaming video, which is transmitted to VR glasses from the video cameras of flying objects (drones), can be solved through experimental studies and a detailed examination of the current difficulties (a minimal rate-adaptation sketch is given after the list below):

1. Rate (speed) adaptation.

2. The study and use of QoE metrics.

3. Achieving high-quality cross-layer interactions (TCP, web protocols, etc.).
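For the rate-adaptation item, a common approach is to choose, for each segment of the stream, the highest encoding whose bitrate fits within the measured network throughput. The sketch below is only a minimal illustration of that idea; the bitrate ladder and safety margin are assumed values, not parameters from the cited works.

# Minimal throughput-based rate-adaptation sketch (illustrative only).
# The bitrate ladder and safety margin are assumed values, not taken from the cited works.

BITRATE_LADDER_KBPS = [1_000, 2_500, 5_000, 10_000, 20_000]  # hypothetical encodings of one panoramic stream

def pick_bitrate(measured_throughput_kbps: float, safety_margin: float = 0.8) -> int:
    """Return the highest ladder bitrate that fits within a fraction of the measured throughput."""
    budget = measured_throughput_kbps * safety_margin
    feasible = [b for b in BITRATE_LADDER_KBPS if b <= budget]
    return max(feasible) if feasible else BITRATE_LADDER_KBPS[0]

if __name__ == "__main__":
    for throughput in (800, 4_000, 30_000):
        print(throughput, "kbps measured ->", pick_bitrate(throughput), "kbps selected")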

Matteo Varvello and his colleagues continue to work in this area, currently offering a number of solutions that can prevent the loss and obsolescence of information, visible artifacts when combining fragments, and so on. One of the concepts that should be taken into account is fragmentation by field of view (FoV) [7]. The horizontal field of view is presented in Figure 2.

Fig. 2. Horizontal view model

According to the innovative coding scheme presented in Figure 3, selective, step-by-step updating of the fragments that users actually see is achieved. A two-level coding system makes it possible to create a multi-layer incremental block, using a fundamental building block for streaming video files based on a 360° mosaic; a minimal tile-selection sketch is given after Figure 3.

Fig. 3. The scheme of two-level video coding
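As a rough illustration of the two-level idea, the sketch below transmits every tile of the 360° mosaic at a low-quality base layer and adds the enhancement layer only for tiles that intersect the viewer's horizontal field of view. The tile width, FoV width, and layer names are assumptions made for illustration; the actual scheme described in the cited works is more elaborate.

# Illustrative two-level tile selection for a 360-degree mosaic (assumed parameters).
# Every tile gets the base layer; tiles inside the viewer's horizontal FoV also get the enhancement layer.

TILE_WIDTH_DEG = 30          # assumed tile width: 12 tiles cover 360 degrees
HORIZONTAL_FOV_DEG = 100     # assumed headset horizontal field of view

def angular_distance(a: float, b: float) -> float:
    """Smallest absolute angular difference between two yaw angles, in degrees."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)

def layers_to_send(view_yaw_deg: float) -> dict[int, list[str]]:
    """Map tile index -> list of layers to transmit for the current viewing direction."""
    plan = {}
    half_span = (HORIZONTAL_FOV_DEG + TILE_WIDTH_DEG) / 2.0  # tile is visible if its center is this close to the gaze
    for tile in range(360 // TILE_WIDTH_DEG):
        center = tile * TILE_WIDTH_DEG + TILE_WIDTH_DEG / 2.0
        layers = ["base"]
        if angular_distance(center, view_yaw_deg) <= half_span:
            layers.append("enhancement")
        plan[tile] = layers
    return plan

if __name__ == "__main__":
    for tile, layers in layers_to_send(view_yaw_deg=90.0).items():
        print(f"tile {tile:2d}: {'+'.join(layers)}")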

The team also used multilevel solutions to obtain the best image quality: a single video stream is formed from data obtained from several sources at the same time [8]. However, the problem of combining fragments remains relevant to this day. To avoid the difficulty of forming an adequate image in real time, Nick Lievendag proposed using the concept of real-time rendering [9]; it is shown in Figure 4.

Fig. 4. Rendering scheme using real-time ray tracing as an example.

According to this approach, the distance between the two virtual cameras can be linked to variable control of the interpupillary distance on the headset. In practice, however, image transmission delays and visible artifacts may still be encountered [10]. An analogue of 360-degree video files that can be viewed using VR glasses and a properly configured FPV system is the 180-degree shooting mode proposed by Google developers.
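The link between the virtual camera separation and the headset's interpupillary distance (IPD) can be illustrated by a simple proportional rule. The default IPD and baseline values in the sketch below are assumptions chosen only to make the rule concrete; they are not figures from [3].

# Illustrative link between headset interpupillary distance (IPD) and the virtual stereo baseline.
# Default values are assumptions chosen only to make the proportional rule concrete.

DEFAULT_IPD_MM = 63.0        # assumed "average" IPD the content was authored for
DEFAULT_BASELINE_M = 0.063   # assumed default distance between the two virtual cameras

def virtual_baseline_m(measured_ipd_mm: float) -> float:
    """Scale the virtual camera separation in proportion to the user's measured IPD."""
    return DEFAULT_BASELINE_M * (measured_ipd_mm / DEFAULT_IPD_MM)

if __name__ == "__main__":
    for ipd in (58.0, 63.0, 70.0):
        print(f"IPD {ipd} mm -> virtual baseline {virtual_baseline_m(ipd):.4f} m")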

The average person covers a sector of about 120 degrees with binocular vision, which is achieved through the synchronization of the images from both eyes. With peripheral vision, a person can see a sector of about 180°, thanks to an additional 30° to the left and to the right of the binocular zone. Human vision zones are shown in Fig. 5.


Fig. 5. Scheme of zones of peripheral and binocular vision of a person

Based on the zones of human vision, it is clear that there is no pressing need to capture data over a 360° sector; 180° is enough. For this task, far fewer cameras are needed than in the case of 360° shooting (a rough camera-count estimate is sketched after the list below), and the result is better due to the larger overlap area: the picture presented to the user is richer, with a stronger immersion effect. The benefits of shooting in VR180 mode are:

1. A better immersion effect for the user compared to 360-degree video.

2. The cameras are cheaper, and fewer cameras are needed.

3. Fewer artifacts and problems, in particular with stitching (combining).

4. Shooting in this format places lower demands on both the equipment and the software-based processing.
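A back-of-the-envelope estimate of the required camera count follows from the coverage angle, the per-camera horizontal field of view, and the desired overlap between neighbouring views. The per-camera FoV and overlap below are assumed values used only to illustrate the comparison.

# Rough camera-count estimate for panoramic rigs (assumed per-camera FoV and overlap).
import math

def cameras_needed(coverage_deg: float, camera_fov_deg: float, overlap_deg: float) -> int:
    """Minimum number of cameras so that adjacent views overlap by at least overlap_deg."""
    effective = camera_fov_deg - overlap_deg   # new angle each extra camera contributes
    if effective <= 0:
        raise ValueError("overlap must be smaller than the camera FoV")
    return math.ceil(coverage_deg / effective)

if __name__ == "__main__":
    FOV, OVERLAP = 120.0, 30.0   # assumed lens FoV and stitching overlap, degrees
    print("VR180:", cameras_needed(180.0, FOV, OVERLAP), "cameras")
    print("360° :", cameras_needed(360.0, FOV, OVERLAP), "cameras")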

Let us take a closer look at the 180-degree shooting option for a virtual-reality headset. As mentioned above, combining fragments is largely unnecessary in VR180 mode, which removes the need to work through visible artifacts and edit other inaccuracies in the image assembled from multiple sources. The reduced need for stitching (combining) is also explained by the fact that only two cameras are used to shoot the streaming video [11].

For comparison, drones use 8, 17, or more cameras to obtain a full panoramic image. Projection from the drone's shooting equipment to the VR system occurs by forming the overall picture from ready-made blocks of fragments. The principle is to minimize errors at the input (when shooting) in order to reduce the divergence in the image at the output (during broadcast). In addition, it is possible to avoid the obsolescence of information that is characteristic of the 360° shooting mode. The user is able to take in the maximum amount of data offered by VR180. An example of fragment processing in VR180 mode is shown in Figure 6.

Fig. 6. An example of processing fragments in VR180 mode

Robert Anderson, a leading virtual-reality specialist at Google, claims that shooting errors are minimal. The development team has proposed an optimal format for saving and broadcasting streaming video files: ODS (omnidirectional stereo), which achieves good image compression and prepares the file for further editing (changing contrast, color, and black balance) [12].

An algorithm for transmitting information from a drone to the VR system can be organized as in Figure 7. The automated system calibrates the received data, taking into account all the details when calibrating the files from all cameras. The streaming video is processed, and errors caused by external influences on the aircraft are corrected. The final step is the creation of a common picture, with the final image output in ODS format to a virtual-reality headset (VR glasses) [13].
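A minimal skeleton of such a calibrate-stabilize-stitch-display pipeline, with each stage reduced to a stub, might look as follows. The stage names and data types are assumptions for illustration; the real system described in [13] is far more involved.

# Skeleton of the drone-to-headset pipeline described above (stage names and types are assumptions).
from dataclasses import dataclass

@dataclass
class Frame:
    camera_id: int
    pixels: bytes          # raw frame payload (placeholder)
    timestamp_ms: int

def calibrate(frames: list[Frame]) -> list[Frame]:
    """Apply per-camera calibration (lens, alignment) to every incoming frame."""
    return frames  # stub: real code would undistort and align

def stabilize(frames: list[Frame]) -> list[Frame]:
    """Compensate for in-flight shake and sudden course changes."""
    return frames  # stub: real code would use IMU data or feature tracking

def stitch_to_ods(frames: list[Frame]) -> bytes:
    """Combine frames from all cameras into a single omnidirectional-stereo (ODS) picture."""
    return b"".join(f.pixels for f in sorted(frames, key=lambda f: f.camera_id))  # stub

def send_to_headset(ods_frame: bytes) -> None:
    """Hand the assembled frame to the VR headset for display."""
    print(f"sending {len(ods_frame)} bytes to the headset")

def process_tick(frames: list[Frame]) -> None:
    send_to_headset(stitch_to_ods(stabilize(calibrate(frames))))

if __name__ == "__main__":
    demo = [Frame(camera_id=i, pixels=b"\x00" * 16, timestamp_ms=0) for i in range(2)]
    process_tick(demo)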

To handle scenes with large exposure variations, which are common in panoramic capture, each camera in the rig is exposed independently of the other information transmitters. This means that adjacent cameras can have very different settings, so the differences in exposure are compensated for; this helps to avoid visible differences when watching live video with VR glasses [14].
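Exposure compensation between independently exposed cameras can be illustrated with a simple per-camera gain that brings every frame to a common reference brightness. Matching mean brightness is an assumed, simplified criterion, not the method actually used in the cited work.

# Illustrative exposure (gain) compensation between independently exposed cameras.
# Bringing every frame to a shared reference brightness is an assumed, simplified criterion.

def mean_brightness(pixels: list[float]) -> float:
    """Average pixel intensity of a frame, on a 0.0-1.0 scale."""
    return sum(pixels) / len(pixels)

def compensate(frames: dict[int, list[float]]) -> dict[int, list[float]]:
    """Scale each camera's frame so all frames share the same mean brightness."""
    reference = sum(mean_brightness(f) for f in frames.values()) / len(frames)
    result = {}
    for cam, pixels in frames.items():
        gain = reference / max(mean_brightness(pixels), 1e-6)
        result[cam] = [min(p * gain, 1.0) for p in pixels]  # clip to the valid range
    return result

if __name__ == "__main__":
    frames = {0: [0.2, 0.25, 0.3], 1: [0.6, 0.7, 0.65]}   # two adjacent cameras, different exposures
    for cam, pixels in compensate(frames).items():
        print(cam, [round(p, 3) for p in pixels])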

Fig. 7. The algorithm for transmitting information from the aircraft to the user

To achieve an optimal signal between all carriers and transmitters of the FPV system, an interval-based compositing algorithm is used, in which each fragment is assigned a disparity range, as presented in Figure 8 [13].

Fig. 8. Schematic representation of the compositing algorithm

The computing system automatically turns each fragment into a three-dimensional object, where the disparity range models the uncertainty of the optical flow. If the signal from one camera becomes weaker, the missing fragments are seamlessly replaced with clear blocks in order to avoid persistent imbalances in size [13].
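The interval idea can be caricatured as follows: each fragment carries a disparity interval, fragments with narrower intervals (less uncertainty) receive larger blending weights, and a fragment whose interval is too wide is effectively dropped in favour of its neighbours. This is an assumed simplification for illustration only, not the algorithm from [13].

# Caricature of interval-based compositing: each fragment carries a disparity interval,
# and narrower intervals (lower uncertainty) get more weight in the blend. Assumed simplification.
from dataclasses import dataclass

@dataclass
class Fragment:
    camera_id: int
    value: float            # fragment intensity / colour channel (placeholder)
    d_min: float            # lower bound of the disparity interval
    d_max: float            # upper bound of the disparity interval

def blend(fragments: list[Fragment], max_uncertainty: float = 5.0) -> float:
    """Weighted blend of overlapping fragments; weight falls as the disparity interval widens."""
    usable = [f for f in fragments if (f.d_max - f.d_min) <= max_uncertainty]
    if not usable:                      # every camera too uncertain: fall back to the least uncertain one
        usable = [min(fragments, key=lambda f: f.d_max - f.d_min)]
    weights = [1.0 / (1.0 + (f.d_max - f.d_min)) for f in usable]
    return sum(w * f.value for w, f in zip(weights, usable)) / sum(weights)

if __name__ == "__main__":
    overlap = [
        Fragment(camera_id=0, value=0.80, d_min=2.0, d_max=3.0),   # confident camera
        Fragment(camera_id=1, value=0.55, d_min=1.0, d_max=9.0),   # weak signal, wide interval -> dropped
    ]
    print(round(blend(overlap), 3))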

Thanks to interval modeling, the picture is transmitted to the user in real time without distortion and without significant delays. According to Google researchers, large object distances are not the cause of artifacts; the most common sources of difficulty are a different set of problems:

1. Objects that are too close: the camera cannot integrate video at a minimal distance to objects.

2. A blurred object structure: rendering fails if the camera cannot recognize a clear structure of the object, the landscape, or other details.

3. An overly complex panorama: complex scenes can cause flashbacks.

The above causes of FPV-system malfunctions can also affect the focus quality of the drone's camera. Google developers have provided a number of recommendations for users in the form of a list (Fig. 9) [15].

Fig. 9. Google Developer's Guidelines for a clear focus of objects

At the moment, this block of recommendations on virtual reality and on viewing a video file with a VR headset is available only in English. The list describes the following points:

1. The camera should be moved smoothly so that objects are captured adequately and broadcast without flaws.

2. Sudden movements and dramatic changes in the aircraft's course should be avoided.

3. A certain distance from the object should be maintained.

4. The subject of interest should be placed closer to the center of the frame for quick focusing.

5. Do not get too close to the object of interest.

An example of ways to edit a streaming file was also given above. In addition to the standard tools for changing the resulting picture, there are additional programs for editing the received streaming video.

Thus, the user can create his or her own video, interactive content, and more. If the VR glasses support the augmented reality (AR) format, the two functions can be used in parallel.

To date, a number of companies are working on lines of VR headsets that would provide the ability to shoot streaming video and to continuously edit the received information, with the necessary data then overlaid onto the finished video file or applied in real time.

Existing models have a viewing angle of only 110 degrees. Manufacturers claim that, with the help of additional cameras and mounting systems, the overall field of view can be increased by 40 degrees.

Fig. 10. Table of models of VR-headsets with the possibility of mixed reality

Figure 10 shows a list of universal helmets that support augmented reality in parallel. As can be seen, the maximum viewing angle is 110 degrees [16].


Conclusion and prospects for further development

The 21st century is a period of rapid development of modern technology. Today, users have the opportunity to use special equipment to receive a three-dimensional image or to watch live video broadcasts.

The article addressed FPV systems aimed at providing a real-time image from aircraft (drone) cameras, broadcast through a VR headset in 180- to 360-degree viewing modes. This area of research was found to be relevant, as demand for virtual-reality technology is growing. Algorithms and techniques used in innovative VR models were proposed, and the difficulties that remain to be solved were analyzed.

References

1. Official YouTube Blog: The world as you see it with VR180 [Electronic resource]. URL: https://youtube.googleblog.com/2017/06/the-world-as-you-see-it-with-vr180.html

2. Voit A., Mayer S., Schwind V., Henze N. Online, VR, AR, Lab, and In-Situ: Comparison of Research Methods to Evaluate Smart Artifacts // CHI Conference on Human Factors in Computing Systems Proceedings, 2019. P. 1-10.

3. The 360° VR Paradox: Why 360° video is both a problem and a necessity for the success of Virtual Reality [Electronic resource]. URL: http://nicklievendag.com/the-vr-paradox/

4. VR dynamic to growth [Electronic resource]. URL: https://www.gamesindustry.biz/articles/2018-12-06-ar-vr-spending-to-jump-69-percent-in-2019-idc

5. The future of mobile video is virtual reality [Electronic resource]. URL: https://techcrunch.com/2016/08/30/the-future-of-mobile-video-is-virtual-reality

6. Liu X., Xiao Q., Gopalakrishnan V., Varvello M. 360° Innovations for Panoramic Video Streaming. ACM, 2017. P. 50-55.

7. Field of View [Electronic resource]. URL: https://en.wikipedia.org/wiki/Field_of_view

8. Han B., Qian F., Ji L., Gopalakrishnan V. MP-DASH: Adaptive Video Streaming Over Preference-Aware Multipath // Proceedings of the 12th International Conference on Emerging Networking Experiments and Technologies. ACM, 2016. P. 129-143.

9. What is rendering? [Electronic resource]. URL: https://www.aion.pro/post/chto-takoe-rendering-v-realnom-vremeni-i-zachem-on-nuzhen

10. Video 360°: a qualitatively new viewing experience [Electronic resource]. URL: https://www.thinkwithgoogle.com/intl/ru-ru/insights-trends/user-insights/video-360-kachestvenno-novyi-zritelskii-opyt/

11. Gluckman J., Nayar S. K. Real-time omnidirectional and panoramic stereo, 1998. P. 121-122.

12. ODS format [Electronic resource]. URL: https://whatis.techtarget.com/fileformat/ODS-OpenDocument-Spreadsheet

13. Anderson R., Gallup D., Jonathan T. Virtual Reality Video, 2019. P. 1-15.

14. Evaluation Methodology [Electronic resource]. URL: http://videomatting.com

15. Tips for taking VR footage - VR180 Help [Electronic resource]. URL: https://support.google.com/vr180/answer/76871747hUen

16. The best all-in-one headsets [Electronic resource]. URL: https://www.aniwaa.com/best-of/vr-ar/best-standalone-vr-headset/
