Open Access
Development of a master data consolidation system model (on the example of the banking sector)
Author(s) -
Igor Prokhorov,
Nikolai Kolesnik
Publication year - 2018
Publication title -
Procedia Computer Science
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.334
H-Index - 76
ISSN - 1877-0509
DOI - 10.1016/j.procs.2018.11.093
Subject(s) - master data , computer science , data quality , consolidation (business) , vendor , data virtualization , data consistency , data management , enterprise data management , loan , data governance , automation , business process , process management , data science , service (business) , knowledge management , database , cloud computing , business , finance , enterprise information system , marketing , engineering , mechanical engineering , virtualization , work in process , operating system
One of the most critical issues faced when building integration solutions for the end-to-end automation of enterprise business processes is the management of so-called master data. Master data management is a set of processes and tools for the ongoing definition and governance of a company's core data (including reference data); the practice is also known as reference data management [1]. Master data carries the information most important for running a business: customers, products, services, personnel, technologies, materials, and so on. Such data changes relatively rarely and is not transactional [2]. The purpose of master data management is to ensure that no duplicated, incomplete, or inconsistent data exists across the organization's various areas of activity. An example of poor master data management is a bank that keeps offering a loan product to a client who already uses it; the cause of this error is the absence of up-to-date customer data in the customer service department. The master data management approach encompasses processes such as data collection, accumulation, cleansing, comparison, consolidation, quality control, and distribution of data across the organization, ensuring its subsequent consistency and controlled use in various operational and analytical applications.
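The comparison and consolidation steps listed above can be illustrated with a minimal sketch. The sketch below is not the system described in the paper; it assumes a hypothetical `CustomerRecord` type arriving from several source systems, matches duplicates with a simple deterministic key (real systems typically use fuzzy matching), and merges each group into a single "golden record", preferring the most recently updated non-empty field values:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

# Hypothetical customer record as it might arrive from one source system.
@dataclass
class CustomerRecord:
    full_name: str
    phone: str
    email: Optional[str]
    updated: date

def match_key(rec: CustomerRecord) -> tuple:
    """Comparison step: build a deterministic matching key by
    normalising the name and keeping the last 10 phone digits."""
    digits = "".join(ch for ch in rec.phone if ch.isdigit())
    return (rec.full_name.strip().lower(), digits[-10:])

def consolidate(records: list[CustomerRecord]) -> dict[tuple, CustomerRecord]:
    """Consolidation step: merge duplicates into one golden record per key.
    Records are processed oldest-first, so newer non-empty values win
    while older values survive where the newer record is missing data."""
    golden: dict[tuple, CustomerRecord] = {}
    for rec in sorted(records, key=lambda r: r.updated):
        key = match_key(rec)
        if key not in golden:
            golden[key] = rec
        else:
            cur = golden[key]
            golden[key] = CustomerRecord(
                full_name=rec.full_name or cur.full_name,
                phone=rec.phone or cur.phone,
                email=rec.email or cur.email,  # keep the older email if the newer record lacks one
                updated=rec.updated,
            )
    return golden
```

A bank could run such a consolidation before a marketing campaign: two source records for the same client with differently formatted phone numbers collapse into one golden record, so the client is not treated as two different people by the customer service department.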

