
An Integrated Business Intelligence Framework

Closing the Gap Between IT Support for Management and for Production.

Globalization, scarcity of natural resources, complexity, and the powerful rise of the BRICS economies are the biggest challenges for the leading industrialized countries.

For these nations, the major tasks for the next 20 years will be securing versatile production capabilities, resource-efficient engineering environments, and consistently short time-to-market delivery of highly sophisticated industrial products.

In order to cope with these challenges, engineers are concentrating their research activities on complex concepts like the “Digital Factory” or “Intelligent Production Systems” as well as on introducing a variety of systems for steering and controlling their specific, production-oriented operational processes. The main objective of these measures is to fully digitalize and integrate all processes of the product lifecycle and across supply chains.

In these contexts, large volumes of data are generated and stored within the IT infrastructures that support engineering, production, and logistics. The integration of this technically oriented data with management support information, however, is still unsatisfactory.

Integrated strategic, administrative, and operational control and comprehensive managerial decision support still promise relevant untapped business potential. This article focuses on this topic.

It extends and adapts the BI framework that has been introduced in Chap. 1 and derives an integrated framework for closing the gap between management- and production-oriented IT support.


Reshaping the BI Toolset 

The more comprehensive BI-based decision support becomes and the closer it is linked to the actual (and in the realm of manufacturing: physical) business processes, the more questions arise regarding requirements for an augmentation of classical BI systems. Required are pertinent components and concepts for defining the interplay between the evolving BI landscape and existing operational application systems. Additionally, striving for a detailed understanding of processes leads not only to an ever-increasing volume of data of both structured and unstructured nature but also to volatile use profiles and workloads.

In the following, existing concepts dealing with these developments are introduced. These are later contrasted with available systems for the support of the product lifecycle in the manufacturing sector.

Operational BI and Business Process Management 

The diffusion of BI into operational and tactical management layers has been discussed under the label “Operational BI” (OpBI). The term OpBI is problematic because it does not clearly distinguish between the realm of BI and that of operational systems. In fact, some examples given by vendors appear to be manifestations of insufficient operational support rather than of an innate need for new BI applications.

If there already is a mature IT landscape in place—as in the manufacturing industry—the claim of better operational decision support needs to be thoroughly substantiated. This does not mean that OpBI is without merits. BI technologies come into play where they can exert their strengths: integrating large volumes of data from various sources, refining them for the purposes of decision support, and presenting the results in a comprehensive fashion.

This is also why OpBI is so closely related to the connection between business process management and BI—an area where the aspect of integration clearly comes into focus. There are various facets of this, which are covered in different, partly overlapping concepts. A widespread example of viable OpBI is the area of Business Activity Monitoring (BAM). In this case, data from various sources is combined in near real time into process-level key performance indicators (KPIs) and visualized via operational dashboards (e.g. on the status of open orders, delivery processes, etc.).
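The combination step behind such a BAM dashboard can be sketched in a few lines. The following Python example is purely illustrative: the event records, field names, and KPI definitions are assumptions, not part of any particular BAM product.

```python
from datetime import datetime, timedelta

# Hypothetical order events as they might arrive from an ERP and a
# logistics system (all field names and values are illustrative).
events = [
    {"order_id": 1, "status": "open",      "ts": datetime(2024, 1, 1, 8, 0)},
    {"order_id": 1, "status": "delivered", "ts": datetime(2024, 1, 1, 14, 0)},
    {"order_id": 2, "status": "open",      "ts": datetime(2024, 1, 1, 9, 0)},
    {"order_id": 3, "status": "open",      "ts": datetime(2024, 1, 1, 10, 0)},
    {"order_id": 3, "status": "delivered", "ts": datetime(2024, 1, 1, 13, 0)},
]

def process_kpis(events):
    """Combine raw events into process-level KPIs for a dashboard."""
    opened = {e["order_id"]: e["ts"] for e in events if e["status"] == "open"}
    closed = {e["order_id"]: e["ts"] for e in events if e["status"] == "delivered"}
    open_orders = set(opened) - set(closed)          # orders still in flight
    cycle_times = [closed[o] - opened[o] for o in closed if o in opened]
    avg_cycle = sum(cycle_times, timedelta()) / len(cycle_times) if cycle_times else None
    return {
        "open_orders": len(open_orders),
        "avg_cycle_hours": avg_cycle.total_seconds() / 3600 if avg_cycle else None,
    }

print(process_kpis(events))  # {'open_orders': 1, 'avg_cycle_hours': 4.5}
```

In a real BAM setting the same refinement logic would run continuously over an event stream and feed a visualization layer rather than a print statement.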

BAM applications are often embedded in broader concepts for Business Process and Business Performance Management, which aim at providing a consistent base of indicators across process steps and managerial levels. An approach that goes beyond the mere presentation of refined data is “Process-centric BI”. Here, next to data, analytic functionality is embedded into operational systems in order to enable operational staff to conduct analyses on operational data.

The term “Embedded BI” goes even further. It denotes the application of BI functionality to process data from local repositories. In this case, however, the specific contribution of a BI system is not obvious. While the discussed OpBI concepts are directed towards an inclusion into running processes, Business Process Intelligence (BPI) has a more strategic momentum. BPI is concerned with the analysis of data on process instances for purposes of uncovering and optimizing the underlying process structures and models.

An example of a BPI application is process mining, where operational log files are used for the extraction, enrichment, and evaluation of as-is process structures. Another option for BPI is tailoring existing BI analysis tools (OLAP, reporting). This, however, makes it necessary to extract, store, and handle data on the process logic rather than just on the process results, i.e. the order of activities and the related constraints need to be traceable. The concepts developed for this include the introduction of a dedicated “Process DWH” that is designed for such analysis. Examples of relevant sources of process data in the realm of manufacturing are Manufacturing Execution Systems (MES) or systems that allow an automatic tracing of objects, e.g. based on RFID technology.
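The core of extracting as-is process structures from a log can be illustrated with a directly-follows count, a common first step in process mining. The event log below is invented for illustration; real logs from an MES or RFID tracing would carry timestamps and many more attributes.

```python
from collections import Counter

# Illustrative event log: (case id, activity) pairs in temporal order,
# as they might be extracted from MES log files (all names assumed).
log = [
    (1, "receive"), (1, "check"), (1, "produce"), (1, "ship"),
    (2, "receive"), (2, "produce"), (2, "ship"),
    (3, "receive"), (3, "check"), (3, "produce"), (3, "ship"),
]

def directly_follows(log):
    """Count how often activity b directly follows activity a per case."""
    traces = {}
    for case, activity in log:
        traces.setdefault(case, []).append(activity)
    pairs = Counter()
    for trace in traces.values():
        for a, b in zip(trace, trace[1:]):
            pairs[(a, b)] += 1
    return pairs

for (a, b), count in sorted(directly_follows(log).items()):
    print(f"{a} -> {b}: {count}")
```

From such pair counts, process mining algorithms derive a model of the as-is process, e.g. revealing here that the "check" step is skipped in one of the three cases.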

Big Data, Cloud BI, and In-Memory BI 

Collecting relevant data for the in-depth analysis of processes and activities on the operational layer leads to data repositories with sizes beyond those of ordinary DWHs.

The relevant data can come in various forms—structured machine and sensor data, semi-structured reports from quality testing, feedback e-mails from customers, product evaluations on web pages, discussions in social networks, etc. Performance bottlenecks have always been an issue in BI and have required an arsenal of strategies on multiple levels. Nowadays, however, data volumes reach a level that classical relational technologies cannot handle efficiently anymore.

This topic is currently summarized under the rather unspecific term Big Data. It can be dealt with in various ways, which can in part also be applied in combination. One approach, particularly suited for conglomerates of semi- and unstructured data (“polystructured data”), is to apply database technologies that relax the strict schema requirements of relational databases as a trade-off for a better distribution of data processing tasks and a higher query performance (“NoSQL”—not only SQL). 

Examples include key-value stores, document stores, and extensible record stores. Contemporary NoSQL Big Data repositories are particularly suited for parallelizing data aggregation and analysis tasks and for utilizing large clusters of computing infrastructure. While their eventual role in the domain of BI remains unclear, Big Data stores seem particularly interesting as data sources and as components for pre-processing the semi- or unstructured contents residing within those sources.
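The schema relaxation and the map/reduce style of aggregation that make such stores easy to parallelize can be sketched in plain Python. The records and field names below are invented; a real document store would distribute the map phase across many cluster nodes.

```python
from collections import defaultdict
from itertools import chain

# Schema-less records as a document store might hold them: each record
# may carry different fields (all field names here are assumptions).
records = [
    {"type": "sensor", "machine": "M1", "temp": 71.0},
    {"type": "review", "product": "P9", "text": "works fine"},
    {"type": "sensor", "machine": "M1", "temp": 75.0},
    {"type": "sensor", "machine": "M2", "temp": 68.0},
]

def map_phase(record):
    # Emit (key, value) pairs; documents without the expected
    # fields simply emit nothing instead of violating a schema.
    if record.get("type") == "sensor":
        yield record["machine"], record["temp"]

def reduce_phase(pairs):
    # Group emitted values by key and aggregate (here: average).
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return {k: sum(v) / len(v) for k, v in grouped.items()}

avg_temp = reduce_phase(chain.from_iterable(map_phase(r) for r in records))
print(avg_temp)  # {'M1': 73.0, 'M2': 68.0}
```

Because each record is mapped independently, the first phase parallelizes trivially, which is exactly the property that lets NoSQL stores exploit large clusters.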

Their applicability as full-scale replacements for a business-oriented DWH is limited, however, as they are by design not meant to guarantee full consistency at all points in time (“BASE” model—Basically Available, Soft state, Eventual consistency).

A second, intensively discussed strategy for dealing with large data sets is to apply “in-memory database” solutions. In-memory solutions are tailored for handling larger volumes of data in the higher layers of the memory hierarchy, i.e. Random Access Memory (RAM), processor cache, and processor registers. Combined with pertinent data structures (e.g. a column-based instead of a row-based storage of database tables), this can lead to significant gains in query performance, e.g. in OLAP solutions.
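The difference between the two storage layouts can be made concrete with a toy table. The following Python sketch (table and field names are invented) shows the same OLAP-style aggregation over a row-oriented and a column-oriented representation; real in-memory column stores add compression and vectorized scans on top of this basic idea.

```python
# A toy order table, once row-oriented and once column-oriented.
n = 8
rows = [{"order_id": i, "region": i % 2, "revenue": float(i)} for i in range(n)]
columns = {
    "order_id": list(range(n)),
    "region":   [i % 2 for i in range(n)],
    "revenue":  [float(i) for i in range(n)],
}

def revenue_by_region_rows(rows):
    # Row store: every row object is touched, so unused
    # attributes (here order_id) are loaded along the way.
    totals = {}
    for r in rows:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["revenue"]
    return totals

def revenue_by_region_columns(columns):
    # Column store: only the two relevant attribute vectors are
    # scanned, which keeps the accessed data small and contiguous.
    totals = {}
    for region, revenue in zip(columns["region"], columns["revenue"]):
        totals[region] = totals.get(region, 0.0) + revenue
    return totals

print(revenue_by_region_rows(rows) == revenue_by_region_columns(columns))  # True
```

Both layouts yield identical results; the columnar one wins on analytic queries precisely because an aggregation like this reads only the columns it needs.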

Implementations can particularly be found in specific DWH and/or OLAP appliances.
The suitability for OpBI solutions is palpable—which leads some authors to the conclusion that in the future managerial and operational enterprise systems will rest upon a (re)unified data socket that is realized in an in-memory fashion. While such a scenario is most probably only viable in a limited set of environments, the assumption illustrates the increasing overlap between operational and managerial systems and the relation to questions of performance. An alternative to a high-end in-house BI infrastructure is the procurement of services based on Cloud Computing approaches, i.e. internet-based services that can ideally be deployed in an ad-hoc manner, scaled dynamically with changing demand based on virtualization technologies, and used in a pay-per-use model.

The application of Cloud Computing approaches to the domain of BI (“Cloud BI”) can be an answer to issues of volatile workloads and of unpredictable requirements on the information generation and access layer. One source of such requirements is the unpredictable demand for BI on mobile devices (“Mobile BI”)—where the rapid succession of innovation cycles quickly renders investments in specific components worthless (among others: mobile clients for various platforms, user and device management, security settings, etc.).
The subject of Mobile BI also gains relevance with the trend towards OpBI—an in-process decision often goes along with the need for an on-site decision, e.g. on the premises of the customer, in a distribution center, or on the shop floor.

Source Systems for BI in the Manufacturing Sector—Developments 

IT support in manufacturing is currently being taken to a new level. This development can be broken down into three interdependent trends: First, activities across the product lifecycle are increasingly connected via digital networks. Second, identification and sensor technology is increasingly embedded into the physical environment and attached to objects ranging from transportation equipment, material, and Work-In-Progress (WIP) up to machines, vehicles, and buildings.
Third, there is an increasing amount of semi- and unstructured data available for analysis.
All this provides a growing foundation of interrelated data that can be utilized for decision and management support. Injecting this data into BI systems is fruitful from two perspectives: First, integrating data on technical processes and business outcomes enables more purposeful planning and steering at the operational and tactical levels (OpBI). Second, it allows for the provision of in-depth insights that can be used for strategic decisions. The following sections detail the developments regarding IT support within the product lifecycle, the relevance of sensor and identification technologies, and the role of semi- and unstructured data sources.

