
Bulletin of the Abai KazNPU, the series of "Physical and Mathematical Sciences"

MODELING OF LARGE VOLUMES OF DATA WITH THE USE OF NoSQL

Published June 2021
Abstract

In the modern world, specialists and the information systems they build increasingly need to store, process, and move huge amounts of data. The term Big Data denotes technologies for storing and analyzing such data that demand high processing speed and real-time decision making. Big data is characterized by large volume, a high rate of accumulation, and the absence of a strict internal structure, which means that classic relational databases are poorly suited for storing it. In this article we present solutions for processing large amounts of data for pharmacy chains using NoSQL.
The paper describes technologies for modeling large amounts of data with NoSQL, including MongoDB, and analyzes possible solutions as well as the limitations that prevent them from being applied effectively. It gives an overview of three modern approaches to working with big data: NoSQL, data mining, and real-time processing of event streams. As an implementation of the studied methods and technologies, we consider a pharmacy database used for processing, searching, analyzing, and forecasting big data. Using NoSQL, we also show how structured and poorly structured data can be handled in parallel in different aspects, and we present a comparative analysis of the newly developed application for pharmacy workers.
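The abstract does not publish the schema or code used in the study; the following is a minimal sketch of the general idea of keeping structured and poorly structured pharmacy records side by side in MongoDB and querying them. The connection URI, database name "pharmacy_chain", collection "drugs", and all field names are illustrative assumptions, not the authors' actual design.

    # Minimal sketch, assuming a local MongoDB instance and an illustrative schema.
    from pymongo import MongoClient

    client = MongoClient("mongodb://localhost:27017")
    db = client["pharmacy_chain"]   # hypothetical database name
    drugs = db["drugs"]             # hypothetical collection

    # Well-structured and partially structured documents coexist in one collection,
    # which is the schema flexibility the article attributes to NoSQL storage.
    drugs.insert_many([
        {"name": "Paracetamol", "form": "tablet", "price": 350, "stock": {"branch_01": 120}},
        {"name": "Ibuprofen", "price": 420},                           # missing fields
        {"name": "Vitamin C", "tags": ["otc", "seasonal"], "notes": "supplier change pending"},
    ])

    # Index the search field, then filter and aggregate without a fixed relational schema.
    drugs.create_index("name")
    print(list(drugs.find({"price": {"$gte": 400}}, {"_id": 0, "name": 1, "price": 1})))

    avg_price = drugs.aggregate([
        {"$match": {"price": {"$exists": True}}},
        {"$group": {"_id": None, "avg_price": {"$avg": "$price"}}},
    ])
    print(list(avg_price))

Documents lacking a price are simply skipped by the aggregation stage rather than breaking a rigid table schema, which illustrates the parallel handling of structured and poorly structured data mentioned above.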

Language

Eng

How to Cite

[1] Zhapsarbek, N. 2021. Modeling of large volumes of data with the use of NoSQL. Bulletin of the Abai KazNPU, the series of "Physical and Mathematical Sciences". 69, 1 (Jun. 2021), 323–326.