Event Details
Overview

Classic data warehouse architectures consist of a chain of databases, such as the staging area, the central data warehouse, and several data marts, plus the countless ETL programs needed to pump data through the chain. This architecture has served many organizations well. But is it still adequate for all the new user requirements, and can new technology be used optimally for data analysis and storage?

Integrating self-service BI products with this architecture is not easy, and certainly not if users want to access the source systems directly. Delivering 100% up-to-date data to support operational BI is difficult to implement. And how do we embed new storage technologies, such as Hadoop and NoSQL, into the architecture?

It is time to migrate gradually to a more flexible architecture: one in which new data sources can be hooked up to the data warehouse more quickly, self-service BI can be supported properly, operational BI is easy to implement, new technology such as Hadoop and NoSQL is easy to adopt, and the processing of big data is not a technological revolution but an evolution.

The architecture that fulfills all these needs is called the logical data warehouse architecture. This architecture, introduced by Gartner, is based on a decoupling of reporting and analyses on the one hand, and data sources on the other hand.
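The decoupling idea can be illustrated with a small sketch. Instead of copying source data through a chain of databases, a virtualization layer defines virtual tables that query the underlying sources on demand at the moment a report asks for data. Below is a minimal, hypothetical illustration in Python, using two in-memory SQLite databases as stand-ins for independent source systems (a CRM and an ERP); all table and function names are invented for this sketch and are not taken from any product:

```python
import sqlite3

# Two independent "source systems" (stand-ins for, e.g., a CRM and an ERP).
crm = sqlite3.connect(":memory:")
crm.execute("CREATE TABLE customer (id INTEGER, name TEXT)")
crm.executemany("INSERT INTO customer VALUES (?, ?)", [(1, "Alice"), (2, "Bob")])

erp = sqlite3.connect(":memory:")
erp.execute("CREATE TABLE sales (customer_id INTEGER, amount REAL)")
erp.executemany("INSERT INTO sales VALUES (?, ?)",
                [(1, 100.0), (1, 50.0), (2, 75.0)])

def customer_sales():
    """A 'virtual table': no copy of the data is stored anywhere; both
    sources are queried on demand and joined in the virtualization layer."""
    totals = dict(erp.execute(
        "SELECT customer_id, SUM(amount) FROM sales GROUP BY customer_id"))
    return [(name, totals.get(cid, 0.0))
            for cid, name in crm.execute("SELECT id, name FROM customer")]

print(customer_sales())  # [('Alice', 150.0), ('Bob', 75.0)]
```

Reports see only the virtual table; whether the data behind it lives in a CRM, an ERP, Hadoop, or a physical data warehouse is invisible to them, which is exactly what makes sources replaceable.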

The technology to create a logical data warehouse is available, and many organizations have already successfully completed the migration; a migration that is based on a step-by-step process and not on a full rip-and-replace approach.

In this practical course, the architecture is explained and relevant products are discussed. The course shows how organizations can migrate their existing architecture to this new one, with tips and design guidelines to make that migration as efficient as possible.

Training Outline

Challenges for the Classic Data Warehouse
• Integrating big data with existing data and making it available for reporting and analytics
• Supporting self-service BI and self-service data preparation
• Faster time-to-market for reports
• Polyglot persistency – processing data stored in Hadoop and NoSQL systems
• Operational Business Intelligence, or analyzing 100% up-to-date data

The Logical Data Warehouse
• The essence: decoupling of reporting and data sources
• From batch-integration to on-demand integration of data
• The impact on flexibility and productivity – an improved time-to-market for reports
• Examples of organizations operating a logical data warehouse
• Can a logical data warehouse really work without a physical data warehouse?

Implementing a Logical Data Warehouse with Data Virtualization Servers
• Why data virtualization?
• Market overview: AtScale, Cirro Data Hub, Cisco Information Server, Data Virtuality, UltraWrap, Denodo Platform, Red Hat JBoss Data Virtualization, Rocket DV, and Stone Bond Enterprise Enabler
• Importing non-relational data, such as XML and JSON documents, web services, NoSQL, and Hadoop data
• The importance of an integrated business glossary and centralization of metadata specifications

Improving the Query Performance of Data Virtualization Servers
• How does caching really work?
• Which virtual tables should be cached?
• Query optimization techniques and the explain feature
• Smart drivers/connectors can help improve query performance
• How can SQL-on-Hadoop engines speed up query performance?
• Working with multiple data virtualization servers in a distributed environment to minimize network traffic
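The caching idea from the outline above can be sketched in a few lines. This is a simplified, hypothetical model (real data virtualization servers manage caches declaratively, per virtual table): the cached result set is served for repeat queries and refreshed from the source only when it has become stale.

```python
import time

class CachedVirtualTable:
    """Minimal sketch: cache a virtual table's result set with a
    time-to-live, refreshing from the source only when the copy is stale."""
    def __init__(self, fetch_fn, ttl_seconds=60):
        self.fetch_fn = fetch_fn   # function that queries the underlying source(s)
        self.ttl = ttl_seconds
        self._rows = None
        self._loaded_at = 0.0

    def query(self):
        if self._rows is None or time.time() - self._loaded_at > self.ttl:
            self._rows = self.fetch_fn()   # cache miss: hit the source
            self._loaded_at = time.time()
        return self._rows                  # cache hit: no source access

# Count how often the (hypothetical) source system is actually queried.
calls = []
def fetch_from_source():
    calls.append(1)
    return [("Q1", 1200), ("Q2", 1350)]

table = CachedVirtualTable(fetch_from_source, ttl_seconds=60)
table.query()
table.query()      # second call is served from the cache
print(len(calls))  # 1: the source was queried only once
```

The trade-off this sketch makes visible is the one the course bullet points at: caching removes load from the source systems and speeds up repeat queries, at the price of data that may be up to one TTL out of date, which is why choosing *which* virtual tables to cache matters.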

Migrating to a Logical Data Warehouse
• An A to Z roadmap
• Guidelines for the development of a logical data warehouse
• Three different methods for modelling: outside-in, inside-out, and middle-out
• The value of a canonical data model
• Considerations for security aspects
• Step-by-step dismantling of the existing architecture
• The focus on sharing of metadata specifications for integration, transformation, and cleansing

Self-Service BI and the Logical Data Warehouse
• Why self-service BI can lead to “report chaos”
• Centralizing and reusing metadata specifications with a logical data warehouse
• Upgrading self-service BI into managed self-service BI
• Implementing Gartner’s bimodal BI environment

Big Data and the Logical Data Warehouse
• New data storage technologies for big data, including Hadoop, MongoDB, Cassandra
• The emergence of the polyglot persistence environment, or: each application its own optimal database technology
• Design rules to integrate big data and the data warehouse seamlessly
• Big data is too “big” to copy
• Offloading cold data with a logical data warehouse

Physical Data Lakes or Virtual Data Lakes?
• What is a Data Lake?
• Is developing a physical Data Lake realistic when working with Big Data?
• Developing a virtual Data Lake with data virtualization servers
• Can the logical Data Warehouse and the virtual Data Lake be combined?

Implementing Operational BI with a Logical Data Warehouse
• Examples of operational reporting and operational analytics
• Extending a logical data warehouse with operational data for real-time analytics
• “Streaming” data in a logical data warehouse
• The coupling of data replication and data virtualization

Making Data Vault more Flexible with a Logical Data Warehouse
• What exactly is Data Vault?
• Using a Logical Data Warehouse to make data in a Data Vault available for reporting and analytics
• The structured SuperNova design technique to develop virtual data marts
• SuperNova turns a Data Vault into a flexible database

The Logical Data Warehouse and the Environment
• Design principles to define data quality rules in a logical data warehouse
• How data preparation can be integrated with a logical data warehouse
• Shifting of tasks in the BICC
• Which new development and design skills are important?
• The impact on the entire design and development process

Concluding Remarks

Objectives
  • Understand the practical benefits of the logical data warehouse architecture and how it differs from the classic architecture.
  • Learn how organizations can successfully migrate, step by step, to this flexible logical data warehouse architecture.
  • Understand the possibilities and limitations of the various available products.
  • Learn how data virtualization products work.
  • Discover how big data can be added transparently to the existing BI environment.
  • Understand how self-service BI can be integrated with the classic forms of BI.
  • Learn how users can be granted access to 100% up-to-date data without disrupting the operational systems.
  • Hear the real-life experiences of organizations that have already implemented a logical data warehouse.
Who Is It For?

This course is intended for everyone who needs to be aware of developments in the field of business intelligence and data warehousing, such as:

  • BI Architects
  • Business Analysts
  • Data Warehouse Managers
  • System Analysts
  • Consultants
  • Technology Planners
  • Project Managers
  • Database Designers & Database Experts
Speaker
Rick van der Lans
Founder of R20/Consultancy BV, Ambassador of Axians Business Analytics Laren
Rick van der Lans is a highly respected independent analyst, consultant, author, and internationally acclaimed lecturer specializing in data architectures, data warehousing, business intelligence, big data, data virtualization, and database technology. He works for R20/Consultancy, which he founded in 1987. In 2018 he was selected as the sixth most influential BI analyst worldwide by onalytica.com. He has presented countless seminars, webinars, and keynotes at industry-leading conferences, and for many years served as chairman of the annual European Enterprise Data and Business Intelligence Conference in London and the annual Data Warehousing and Business Intelligence Summit in The Netherlands. He presents seminars, keynotes, and in-house sessions on modern data architectures, big data and analytics, data virtualization, the logical data warehouse, and data warehousing and business intelligence. Rick helps clients worldwide design their data warehouse, big data, business intelligence, and streaming architectures and solutions, and assists them with selecting the right products. He is ambassador of Axians Business Analytics Laren (formerly Kadenza), an international consultancy company specializing in business intelligence, data management, big data, data warehousing, data virtualization, and analytics. He is the author of several books on computing, some of which are available in multiple languages. He wrote the first available book on SQL, the popular "Introduction to SQL", which is available in English, Dutch, Italian, Chinese, and German, with more than 100,000 copies sold worldwide. More recently, he published two books on data virtualization: Data Virtualization: Selected Writings and Data Virtualization for Business Intelligence Systems.