WHAT IS BIG DATA ANALYTICS?

Before Hadoop, limited storage and compute forced a long and rigid analytics process. First, IT goes through a lengthy process (commonly known as ETL) to get every new data source ready to be stored. Once the data is ready, IT loads it into a database or data warehouse, and into a static data model. The problem with that approach is that IT designs the data model today with the knowledge of yesterday, and you have to hope it will be good enough for tomorrow. But nobody can predict the perfect schema. On top of that sits a business intelligence tool which, because of the static schemas underneath, is optimized to answer KNOWN QUESTIONS.

TDWI reports that this three-part process takes 18 months to implement or change, and that integrating a new data source takes three months on average. The business is telling us it cannot operate at this speed anymore. And there is a better way.

