Business Intelligence Terms, Terminology & Abbreviations
BI – Business Intelligence
BSD – Basic System Diagram
BPM – Business Performance Management (synonymous with CPM, BPO, EPM and OPM); the leading term
BPO – Business Performance Optimisation
CPM – Corporate Performance Management
CRM – Customer Relationship Management
DM – Data Mining
DSS – Decision Support System
DWH – Data Warehouse
ECM – Enterprise Content Management
EOP – End of Period
EOY – End of Year
EPM – Enterprise Performance Management
ERP – Enterprise Resource Planning
ESS – Executive Support System
FCBI – Financial and Controlling Business Intelligence
HRM – Human Resource Management
IS – Information System (production system); transactional systems for data gathering and processing
IVAS – Integrated Value Add Simulation
KWS – Knowledge Work System
MIS – Management Information System
RACI matrix – Responsible, Accountable, Consulted, Informed
OAS – Office Automation System
ODBC – Open Database Connectivity
OLAP – Online Analytical Processing; an approach designed to answer multi-dimensional analytical queries rapidly
OPM – Operational Performance Management
POS – Point of Sale
RDBMS – Relational Database Management System
SaaS – Software as a Service
SEMS – Strategic Enterprise Management Software
SFA – Sales Force Automation
TCC – Total Controlling Concept
TPS – Transaction Processing System
YTD – Year to Date
WCM – Web Content Management
Administration of data
a set of processes and procedures to ensure the integrity and accuracy of data in the warehouse.
Aggregation of data
A type of data transformation in which information is calculated from other data of the same type at a lower level of granularity. For example, annual sales are calculated from the monthly figures.
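In code, this kind of roll-up is just a summation over the lower-granularity values; a minimal Python sketch with purely illustrative monthly figures:

```python
# Hypothetical monthly sales figures for one product line (illustrative only).
monthly_sales = {"Jan": 2500, "Feb": 1800, "Mar": 2100}

def aggregate_annual(monthly):
    """Roll the monthly figures up to a single annual total."""
    return sum(monthly.values())

annual_sales = aggregate_annual(monthly_sales)  # 6400
```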
Cleansing of data
The process of detecting and resolving inconsistencies in the source data before it is transferred to the data warehouse. An example of data cleansing is correcting dates that could not possibly have been entered (for example, dates in the year 1900 and the like); cleansing can, however, also involve quite complex processes.
Data dictionary
A set of definitions and specifications of data and their interdependencies; a database that describes the structure of the database containing the data.
Data extraction – the process of copying data from production databases and preparing it for transfer to the warehouse.
Data mining
The process of searching for hidden patterns in databases. For example, data mining software can help a sales company find groups of customers with similar interests, so that additional products can be offered to them on the basis of the newly acquired knowledge about that group's preferences.
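As a toy illustration of the idea (not a real data mining algorithm), one can find customers whose purchase histories overlap; all names and products below are invented:

```python
# Hypothetical purchase histories; names and products are illustrative only.
purchases = {
    "ana":  {"printer", "paper", "toner"},
    "ivan": {"printer", "toner", "usb"},
    "mira": {"notebook", "pen"},
}

def similar_customers(customer, data, min_overlap=2):
    """Customers sharing at least min_overlap products with `customer`."""
    mine = data[customer]
    return [name for name, items in data.items()
            if name != customer and len(items & mine) >= min_overlap]

matches = similar_customers("ana", purchases)  # ["ivan"]
```

A real data mining tool would apply statistical or machine-learning methods over far larger data sets; the sketch only shows the "similar interests" idea.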
Data model
A logical model that represents data and their relationships in a highly visual way, independent of the technological platform (hardware and software).
Data propagation
The process of distributing data from a source to a target database while preserving data consistency.
Data transformation
The process of filtering, merging, denormalising and translating source data to create data suitable for entry into the data warehouse. For example, a numerical code representing a county is replaced by the county's name.
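The county-code example can be sketched as a lookup performed during transformation; the codes and names below are invented:

```python
# Hypothetical lookup table mapping numeric county codes to county names.
COUNTY_NAMES = {1: "Zagreb County", 2: "Split-Dalmatia County"}

def transform_row(row):
    """Replace the numeric county code with a readable county name."""
    out = dict(row)
    out["county"] = COUNTY_NAMES.get(row["county"], "Unknown")
    return out

clean = transform_row({"county": 1, "sales": 2500})
# {"county": "Zagreb County", "sales": 2500}
```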
Data warehouse
A database with a distinctive data structure which, as already noted, allows complex queries over large amounts of data to be answered relatively quickly and easily.
Unlike a production system, which can be bought ready-made on the market and customised to your company's business, a data warehouse can almost never be bought off the shelf; it is always built on top of the work of the production systems.
Recently, tools intended for building data warehouses have appeared; they should not be confused with the concept of a data warehouse itself. One example of such a tool is Oracle Warehouse Builder.
Decision Support System (DSS)
Systems that give decision makers in an enterprise integrated access to all the relevant information needed for decision making. Such a system is characterised by a small number of users (compared with operational applications into which a large number of people enter data daily). The data are often summarised and cover the entire business of the enterprise.
Derived data
Data in the warehouse that result from transformations applied to the source data. Derived data do not arrive in the original form held by the source from which the extraction was made; they are created along the way and hence add something new to the overall knowledge stored in the database (i.e. the data warehouse).
Drill down
A method of exploring data by moving “down” to a lower level of data granularity, for example moving from sales data “by month” to sales data “by day”.
In the example below, drilling is done in the time dimension, down to months. After the drill, the quarterly figures are broken down by month:
Quarter   Month     Sales
I/2000    January   2500
II/2000   April     260
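The same drill-down can be sketched in Python: the warehouse stores the finest granularity (days here), the monthly view is a roll-up, and drilling re-expands the detail. All figures are illustrative:

```python
from collections import defaultdict

# Hypothetical daily sales rows: (month, day) -> amount.
daily_sales = {("January", 5): 1200, ("January", 20): 1300, ("April", 2): 260}

def by_month(rows):
    """The summarised (rolled-up) monthly view."""
    totals = defaultdict(int)
    for (month, _day), amount in rows.items():
        totals[month] += amount
    return dict(totals)

def drill_down(rows, month):
    """Drilling into one month exposes the underlying daily rows."""
    return {day: amount for (m, day), amount in rows.items() if m == month}

monthly = by_month(daily_sales)               # {"January": 2500, "April": 260}
january = drill_down(daily_sales, "January")  # {5: 1200, 20: 1300}
```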
Data in the data warehouse that have undergone cleansing, transformation and various summarisations.
Executive Information System (EIS)
A set of tools that provides enterprise management with business data. The tools offer easy and intuitive report creation, drilling into data (drill down), ad-hoc queries over multidimensional presentations of data, and other analytical functions.
Extract (extraction) – data extraction involves copying data sets from one source (the source database) to another, target database.
Enterprise Data Model
A blueprint for all data used across the enterprise. The enterprise data model resolves potential inconsistencies and ambiguous interpretations of the data in use, and presents a consistent view of the definitions of corporate data that is understandable and acceptable to everyone.
Extraction (transfer) is a concept covering the retrieval, transformation and entry of data into the warehouse, i.e. the full path of the data from the production systems to the warehouse.
Data transfer is defined as a process that moves data from the source to the data warehouse at agreed time intervals. Depending on the interval between transfers we can speak of daily, weekly or monthly transfer; data can also be transferred as needed, at irregular intervals. For example, if the plan is revised twice a year, then planning data are transferred three times a year: after the initial entry of the planned values and after each revision of the plan.
The transfer can be divided into:
1. Retrieval of source data (extraction in the strict sense of the word)
2. Transport of the source data
3. Acceptance of the source data into the target database
4. Transformation of the source data
5. Load of the data into the warehouse
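The five steps above can be sketched as a minimal pipeline; the sample rows and the transformation rule (title-casing customer names, dropping rows without an amount) are illustrative assumptions, not a real warehouse API:

```python
def extract(source):
    # Steps 1-3: retrieve and transport the source rows into a staging list.
    return list(source)

def transform(rows):
    # Step 4: normalise customer names and drop rows without an amount.
    return [{"customer": r["customer"].title(), "amount": r["amount"]}
            for r in rows if r.get("amount") is not None]

def load(rows, warehouse):
    # Step 5: write the prepared rows into the warehouse table.
    warehouse.extend(rows)

source_rows = [{"customer": "acme ltd", "amount": 100},
               {"customer": "noname", "amount": None}]
warehouse = []
load(transform(extract(source_rows)), warehouse)
# warehouse now holds [{"customer": "Acme Ltd", "amount": 100}]
```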
Information Warehouse™
IBM’s approach to data warehouse implementation, which allows for a functional, centralised or decentralised data warehouse.
Load (Record) and retrieval
The source-data retrieval phase deals with collecting data from the source systems. Retrieval of source data is governed by an exact algorithm and requires a quiescent state of the sources. A quiescent state means that entering and modifying data in the source is prohibited while the data are being retrieved; this guarantees the consistency of the retrieved data.
The simplest way to ensure the sources are idle is to run this process outside working hours.
Data are entered into the warehouse fully automatically, using the data that the extraction and transformation steps have prepared for entry.
Metadata
Data about data: for example, information about how and where the data are stored, how often extraction into the warehouse takes place, whether it succeeded and whether errors occurred, who is responsible for the data and their validity, and so on. In contrast to the data model of the warehouse itself, which usually applies to only one line of business such as production or sales, the meta model is more generic and valid in most cases, so it is worth devoting attention to it. The store of data about data is thus independent of the type of business the warehouse serves.
Seven common components of the meta model can be identified, as shown in the figure.
The logical model describes the entities and their relationships, while the physical model describes the way they are implemented and may vary depending on the database on which the data warehouse is implemented; this is the only part that is in some way platform-dependent. The source model supplies the information about how source data are mapped into the data warehouse. Usage statistics, on the other hand, record which queries are run, how long they last and how often they execute, so that the data warehouse administrator can try to optimise them and keep the warehouse's response time to queries as low as possible. ETL statistics refer to the amounts of data and the time needed for the steps the data pass through on their way from the production database to their place in the data warehouse.
The following table shows where the components of the meta model reside when the warehouse is realised with Oracle technology.
Element of the meta model       Implementation in Oracle
Logical warehouse model         Oracle Designer / Oracle Warehouse Builder
Physical data model             –
Mapping information             Oracle Warehouse Builder (if used); with no ETL tool it is not implemented and can be seen only in the source code
Data rights (user rights)       User rights created within Oracle
ETL statistics                  Oracle Warehouse Builder; if it is not used, tables created by the warehouse designers
Middleware
A communication layer that allows applications to run across a variety of network and computing platforms.
Mining of data
The process of extracting previously unknown, comprehensible and actionable information from any source, including transactions, documents, e-mail, web pages, etc., and of using that information in making key business decisions.
Multidimensional analysis
Data analysis that takes into account the interdependence among the different data that represent the dimensions, each dimension representing some attribute of the information. Multidimensional analysis is often referred to as OLAP (Online Analytical Processing). To illustrate it, we can take a cube representing three dimensions (region, time and product). The cube lets us look at different slices, such as the sales of one product by region and time; a highlighted region of the cube would then represent the sales of a product in one region over time.
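A small cube can be imitated with a dictionary keyed by the three dimensions; slicing then fixes one dimension. All figures below are illustrative:

```python
# Hypothetical three-dimensional cube: (region, month, product) -> sales.
cube = {
    ("North", "Jan", "printer"): 10,
    ("North", "Feb", "printer"): 12,
    ("South", "Jan", "printer"): 7,
    ("North", "Jan", "toner"):   5,
}

def slice_by_product(cube, product):
    """Fix the product dimension: sales of one product by (region, month)."""
    return {(r, m): v for (r, m, p), v in cube.items() if p == product}

printer_sales = slice_by_product(cube, "printer")
# {("North", "Jan"): 10, ("North", "Feb"): 12, ("South", "Jan"): 7}
```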
OLAP = Online Analytical Processing.
OLAP is a technology that gives fast, interactive access to different views of one and the same information. OLAP technology transforms the data to reflect the actual multidimensionality of the business as the user sees it.
OLAP is a technology that provides answers to business questions such as “What is happening with new customers?”, “How are our sales in the period …?”, or “Have we seen any increase in profit from our investment in the internet, and how much?”. OLAP tools such as Oracle Express are adapted to providing answers to such questions; a typical data source for them is a data warehouse.
The difference between OLAP technology and a data warehouse is that the data warehouse provides answers to questions such as Who? and How?, while OLAP technology can extend these with what-if analysis (“What if …?”).
OLAP is a technology that transforms data from a data warehouse into strategic business information; OLAP and the data warehouse thus complement each other in the decision-making process.
OLAP glossary of terms: http://www.olapcouncil.org/research/glossaryly.htm
Query – any request for information that a user or an application poses to the data warehouse environment in general.
Production system (sometimes still called the information system) is an information system used in daily business. There are two types of production systems: those developed for a specific user, and those that can be bought ready-made and adapted to the user. Examples of the latter are SAP or the Oracle Applications product group (e.g. Financials, etc.).
Relational Database Management System (RDBMS)
A database built according to the relational model, with tables linked by relationships which together form a picture of the real world.
Replication is the process of maintaining copies of data. Most often these copies (a replicated database is itself sometimes simply called a replica) are database files from an earlier point in time. A replicated database is essential for data warehousing in the following case: if the source database must be available 24 hours a day (say, a database behind an internet service), then loading the warehouse places an excessive burden on it, because large amounts of data are requested from the source database in a short time, slowing its responses to customer requests. The replicated database can instead serve as the source for loading the data warehouse, without disturbing the “main” production database. Data replication techniques also burden the production database less than warehouse-style extraction does, mainly by copying small amounts of data, but frequently, e.g. over the course of 24 hours.
Source database
The database from which the data are retrieved, transformed and transferred into the target database.
Star schema (star chart, star data model)
A data model in which a single table sits in the middle with several related tables arranged radially around it. The central table holds data, such as sales amounts, that quantify the relationships to the surrounding tables, which we call dimensions (for example time or geographical data).
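The shape of a star-schema query can be sketched with plain dictionaries: a central fact table holds measures plus foreign keys, and lookups into the dimension tables resolve them. All tables and ids below are invented:

```python
# Hypothetical dimension tables keyed by surrogate ids.
dim_region = {1: "North", 2: "South"}
dim_time   = {1: "2000-Q1", 2: "2000-Q2"}

# Central fact table: measures plus foreign keys into the dimensions.
fact_sales = [
    {"region_id": 1, "time_id": 1, "amount": 2500},
    {"region_id": 2, "time_id": 2, "amount": 260},
]

def resolve(fact):
    """Join one fact row to its dimensions, as a star-schema query would."""
    return {"region": dim_region[fact["region_id"]],
            "quarter": dim_time[fact["time_id"]],
            "amount": fact["amount"]}

report = [resolve(f) for f in fact_sales]
# first row: {"region": "North", "quarter": "2000-Q1", "amount": 2500}
```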
Data transformation is a process of converting data from the form in which it exists in the production system into a form suitable for entry into the data warehouse.