Agenda Highlights

Plenary Keynote: Data Science for Grown Ups: How to Get Machine Learning out of the Lab to Scale it Across the Enterprise
Dr. Alexander Borek, Global Head of Data & Analytics, Volkswagen Financial Services

Plenary Keynote: Digital Business: Tomorrow is Already Here
Andreas Bitterer, Chief Analytics Evangelist EMEA, SAP

Plenary Keynote: It’s Not about You, It’s About Them: Helping Others Take Action Based on Data
Lori Silverman, Partners for Progress

Plenary Keynote: Challenges of Developing an Enterprise Data Marketplace
Rick van der Lans, Independent Analyst, Consultant, Author and Lecturer, R20/Consultancy

Plenary Keynote: Ethics Schmethics: Hype or Hope?
Daragh O Brien, Leading Consultant, Educator and Author, Castlebridge

Enterprise Data Keynote: Using Enterprise Information Management at the International Criminal Court to End Impunity
Dr Jones Lukose, Information Management Officer, The International Criminal Court

Enterprise Data Keynote: The Producer, the Consumer, the Owner and the Rest of the World: Governing Big Data
Jan Henderyckx, Managing Partner, Inpuls

Why Training the Organisation and not just the Data Team is Vital
Phil Yeoman, Head of Data Governance, The Pensions Regulator

Mind your Language: The Criticality of Common Data Definitions in Managing Complex Data
Becky Russell, National Lead for Data Standards, Environment Agency & Nigel Turner, Principal Information Management Consultant EMEA, Global Data Strategy

Making Data Mainstream: Establishing a Data Function and Selling the Opportunities it Brings to a Commercial Organisation
Amy Balmain, Head of Data Exploitation, Southern Water

Make Insights a Team Sport with Data and AI
Lena Woolf, Senior Technical Staff Member, IBM

Data Management in Manufacturing
Felix Streichert, Chief Data Manager, Bosch

Fees

  • 4 Days: £1,945 + VAT (£389) = £2,334
  • 3 Days: £1,595 + VAT (£319) = £1,914
  • 2 Days: £1,245 + VAT (£249) = £1,494
  • 1 Day: £795 + VAT (£159) = £954
Group Booking Discounts
  • 2-3 delegates: 10% discount
  • 4-5 delegates: 20% discount
  • 6+ delegates: 25% discount
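
For anyone budgeting a group booking, here is a small illustrative Python sketch (not an official calculator) that combines the published day rates, the group discount tiers and 20% VAT; it assumes the discount applies to the net fee per delegate, which should be confirmed with the organisers.

    # Hedged sketch: estimate a group booking cost from the published fees.
    # Assumptions (not confirmed): the discount applies to the net (pre-VAT) fee
    # per delegate, and VAT is charged at 20%.
    NET_FEES = {4: 1945, 3: 1595, 2: 1245, 1: 795}       # GBP per delegate, by days attended
    DISCOUNT_TIERS = [(6, 0.25), (4, 0.20), (2, 0.10)]   # minimum delegates -> discount
    VAT_RATE = 0.20

    def group_cost(delegates: int, days: int) -> float:
        net = NET_FEES[days] * delegates
        discount = next((d for threshold, d in DISCOUNT_TIERS if delegates >= threshold), 0.0)
        return round(net * (1 - discount) * (1 + VAT_RATE), 2)

    # Example: 4 delegates attending 3 days -> 20% discount on 1,595 each, plus VAT.
    print(group_cost(4, 3))   # 4 * 1595 * 0.80 * 1.20 = 6124.8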

Venue

  • Radisson Blu Portman Hotel
  • 22 Portman Square
  • London W1H 7BG
  • UK

Agenda

Monday, 19 November 2018, Pre-Conference Workshops
Morning Workshops
Data Virtualization From A to Z

Rick van der Lans, R20/Consultancy

Data is increasingly becoming a crucial asset for organizations to survive in today’s fast-moving business world. In addition, data becomes more valuable if it is enriched and/or fused with other data. Unfortunately, in most organizations enterprise data is dispersed over numerous systems, all using different technologies. Bringing all that data together is, and has always been, a major technological challenge. In addition, more and more data is available outside the traditional enterprise systems: it is stored in spreadsheets, simple file systems, cloud applications, weblogs, social media systems, and so on.

This is where data virtualization comes to the rescue. Data virtualization is a technology that makes a heterogeneous set of databases and files look like one integrated database. When used in business intelligence systems, it can make the architectures dramatically simpler, cheaper, and, most importantly, more agile. New reporting and analytical needs can be implemented faster and existing systems can be changed more easily. This half-day workshop explains in detail what data virtualization is and how the products work, discusses their advantages and disadvantages, compares products, and focuses on aspects such as query performance, caching, data security and data integration.

  • Under the hood of a data virtualization server
  • Importing non-relational data, such as XML and JSON documents, web services, NoSQL, and Hadoop data
  • Query optimization techniques
  • Caching for performance and scalability
  • Securing access to data in virtual tables
  • Design guidelines and tips and tricks
  • Market overview, including AtScale, DataVirtuality, Denodo Platform, Dremio, FraXses, IBM Data Virtualization Manager for z/OS, Red Hat JBoss Data Virtualization (Teiid), Stone Bond Enterprise Enabler Virtuoso, Tibco Data Virtualization
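
As a rough, product-agnostic sketch of the idea (and not of how any of the products listed above is implemented), the Python snippet below joins a CSV extract and a JSON document behind a single function so that the consumer sees one integrated "virtual" view; the sources and fields are invented for illustration.

    # Minimal illustration of the data virtualization idea (not a specific product):
    # two heterogeneous sources are exposed to the consumer as one integrated view.
    import io
    import json
    import pandas as pd

    # Source 1: a relational-style CSV extract (hypothetical customer table)
    csv_source = "customer_id,name,country\n1,Acme Ltd,UK\n2,Globex,DE\n"

    # Source 2: a JSON document, e.g. from a cloud application (hypothetical orders)
    json_source = json.dumps([
        {"customer_id": 1, "order_total": 120.0},
        {"customer_id": 1, "order_total": 80.0},
        {"customer_id": 2, "order_total": 200.0},
    ])

    def virtual_customer_orders() -> pd.DataFrame:
        """Behaves like a single 'virtual table' that joins both sources on demand."""
        customers = pd.read_csv(io.StringIO(csv_source))
        orders = pd.DataFrame(json.loads(json_source))
        return customers.merge(orders, on="customer_id", how="left")

    # The consumer queries one integrated view and never sees the source formats.
    print(virtual_customer_orders().groupby("country")["order_total"].sum())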

Principles for the New BI

Donald Farmer, Principal, TreeHive Strategy

Whether we work in business or IT, it sometimes feels like we are bombarded with advice about which tools or platforms to choose for data analytics. But business insights don’t arise from features and functions and choosing a good platform is only the start of your analytics journey.

Too often we overlook the need for some basic principles – ways of thinking and evaluating data, technologies and organizational needs that act as landmarks on our path to a culture of analytics. For example, it’s critical to remember that the heart of business analytics is still decision-support – losing sight of that principle can waste a lot of time and money! What about the relationship between data preparation and data analysis? We need to keep in mind how closely entwined these techniques are, if we are to be effective in either.

In addition to well-established practices, as new technologies emerge we need new, relevant, principles to guide us through machine learning and AI. In this course, Donald Farmer will set out 10 fundamental principles of modern data analytics—perceptive, provocative ideas about how data really works in our businesses. These principles provide valuable starting points for planning, evaluating, and promoting business intelligence projects.

You Will Learn …

  • Why understanding the human element is essential to good information design
  • How machine learning and AI are already changing the nature of business knowledge
  • Who makes decisions in a world of automation
  • Why good governance does not necessarily result in good decisions
  • The significance of bias in machine learning, but also in everyday analytics.

This workshop is suitable for:

  • BI and analytics architects designing and developing analytic systems
  • Business leaders trying to guide teams through the important changes happening in analytics and machine learning
  • Data analysts and data scientists who need to work and communicate with business users
  • IT leaders investing in platforms, tools and training in analytics and machine learning.

Artificial Intelligence for Mere Mortals

Jos van Dongen, Principal Consultant, Tholis Consulting

The evolution in attention from Big Data to Machine Learning to Artificial Intelligence completes the transition from mostly technology focused to human centric computing. The net effect however is that while the applications of the technology are well known to many, the technology itself is becoming less transparent and understood by only a few. As a result, many people shy away from adopting new algorithms and use cases simply because it looks too complex or even scary. This workshop will break down the latest advances in AI like chatbots, decisioning engines and image recognition into the different components that make up the technology. Using a use case driven approach, you’ll get deeper insights and a good understanding of how AI actually works. You will learn:

  • How organizations in healthcare, insurance, retail and transportation are using AI
  • What the power and limitations of AI are
  • How to select the AI building blocks you can use in your own projects

Getting Started with Data Quality – A Primer

Jon Evans, Equillian

Today, more than ever, the quality of data, underpinned by a robust approach to Data Quality Management, is critical to the success of every organisation. Unfortunately, it is a topic that remains impenetrable to many because of unfamiliar jargon and too much emphasis on technology.

In this half-day workshop, Equillian’s Jon Evans will seek to redress the balance, by taking the audience on a journey from first principles right through to advice on establishing a Data Quality Programme. Along the way, both beginners and those already familiar with the topic will benefit from a business-focused approach, based on industry best practice coupled with many years of experience helping organisations tackle their Data Quality challenges.

The session will be structured around 4 key topics:

  • Why should I care about data quality?
  • Monitoring data quality
  • Improving data quality
  • Developing a DQ Programme
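
To give a flavour of what monitoring data quality can look like in practice, here is a minimal, hypothetical Python sketch (not part of the workshop materials) of rule-based completeness and validity checks whose scores could feed a DQ dashboard or threshold-based alerts.

    # Hypothetical sketch of simple data quality monitoring: rule-based
    # completeness and validity checks over an invented customer dataset.
    import pandas as pd

    customers = pd.DataFrame({
        "customer_id": [1, 2, 3, 4],
        "email": ["a@example.com", None, "not-an-email", "d@example.com"],
        "postcode": ["W1H 7BG", "SW1A 1AA", None, "EC1A 1BB"],
    })

    def completeness(series: pd.Series) -> float:
        """Share of non-missing values (0..1)."""
        return float(1 - series.isna().mean())

    def validity(series: pd.Series, pattern: str) -> float:
        """Share of non-missing values matching a regex rule (0..1)."""
        non_missing = series.dropna()
        return float(non_missing.str.match(pattern).mean()) if len(non_missing) else 1.0

    report = {
        "email_completeness": completeness(customers["email"]),
        "email_validity": validity(customers["email"], r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
        "postcode_completeness": completeness(customers["postcode"]),
    }
    print(report)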

Beyond GDPR Compliance to Ethical Data: A Practical Introduction to Ethical Information Management

Katherine O’Keefe, Lead Consultant / Chief Ethics Officer, Castlebridge

Your organization is already doing “ethics” whether you explicitly consider it or not. The question is whether it is a “best efforts” production without the clear guidance of principles. Increasingly, data ethics is being recognized as important not just as a “fuzzy” concept of corporate social responsibility, but as a commercial differentiator, a real influence on our day-to-day experience, and a key management risk. Have you taken the time to consider what it is you’re doing? Is your organization’s leadership setting the tone from the top? Do you have appropriate systems of management in place to support ethical decisions and actions? This course is designed for information management professionals and provides a detailed framework and practical tools and techniques for implementing an ethical information management strategy.

Key takeaways for this session include:

  • An overview of fundamental Ethical Concepts as related to Information Management
  • Risk management, Information management practices
  • Using GDPR Compliance to focus Ethical Data use beyond compliance
  • Methods to align ethics with Information Governance
  • Practical tools for implementing an ethical information strategy

Fast Data and Edge Analytics – the Next Frontier in Smart Business

Mike Ferguson, Managing Director, Intelligent Business Strategies

The demand for lower latency streaming data is growing rapidly in many enterprises. This may be data from sources like financial markets, weather data, clickstream data in web logs, or Internet of Things (IoT) sensors deployed in operational areas like manufacturing production-line equipment and logistics. IoT data, in particular, is growing rapidly.

The reason this data is needed is to provide near real-time insights into business operations, helping organisations make informed decisions, often on a near real-time basis. However, the problem with streaming data is that while many companies are capturing it, either on-premises or in the cloud, they are not doing much with it. One of the reasons for this is that traditional architectures are based on the assumption that data always needs to be processed and analysed in the data centre or the cloud, and not at the edge. Also, tools to process and analyse this type of data are not in place. This session explores what is different about streaming data and analytics. It then looks at what’s needed to prepare and analyse it, and at the option to analyse streaming data at the edge of the network, closer to where the data is created. It considers what is needed to get ready for streaming data, including edge analytics versus a streaming analytics platform and how this affects existing architectures, data preparation, model management and decision management. Finally, it looks at what you need to consider to integrate data-in-motion with enterprise data if you want to analyse data at the edge.

  • What is fast data?
  • Types of fast data and what is different about it?
  • Processing options – prepare and analyse in the centre or at the edge or both?
  • What is edge analytics?
  • Do you need a streaming analytics platform or something else?
  • How will your architecture need to change to accommodate edge analytics?
  • How do you ingest and analyse high velocity fast data at scale?
  • How do you integrate fast data with enterprise data at rest if analysing at the edge versus the cloud or the data centre?
  • What about model management?
  • Decision management – automating analysis and action-taking
  • Getting started
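
As a small, hypothetical illustration of the "analyse at the edge" option discussed above, the Python sketch below filters and aggregates a stream of sensor readings locally and forwards only summaries and threshold breaches to the centre; the readings and thresholds are invented.

    # Hypothetical edge analytics sketch: process sensor readings locally and
    # forward only aggregates and threshold breaches to the central platform.
    from statistics import mean
    from typing import Iterable, Iterator, Tuple

    TEMP_ALERT_THRESHOLD = 90.0   # invented threshold for illustration
    WINDOW_SIZE = 5               # readings per aggregation window

    def edge_process(readings: Iterable[float]) -> Iterator[Tuple[str, float]]:
        """Yield (kind, payload) messages destined for the central platform."""
        window = []
        for value in readings:
            if value > TEMP_ALERT_THRESHOLD:
                yield ("alert", value)                     # forward anomalies immediately
            window.append(value)
            if len(window) == WINDOW_SIZE:
                yield ("summary", round(mean(window), 2))  # forward only the aggregate
                window = []

    sensor_stream = [71.2, 70.8, 93.5, 72.0, 71.5, 70.1, 69.8, 70.4, 70.9, 71.1]
    for message in edge_process(sensor_stream):
        print(message)   # in practice these would be published to the centre, e.g. over MQTT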

09:30 - 12:45
Data Virtualization From A to Z
Principles for the New BI
Artificial Intelligence for Mere Mortals

Jos van Dongen, Principal Consultant, Tholis Consulting

Jos van Dongen is a consultant, trainer, analyst and author. Jos has been involved in software development, business intelligence (BI) and data warehousing since 1991 and is the (co)author of three highly acclaimed BI books. He’s currently working as analytics advisor and technology evangelist at SAS Netherlands and speaks regularly about advances in data management, data science & data visualization at national and international conferences.

Getting Started with Data Quality – A Primer

Jon Evans, Information Strategist & Founder, Equillian

Jon Evans is an Information Strategist, self-confessed data quality geek and the founder of Equillian, an independent UK consultancy practice specialising in Enterprise Information Management. For the past two decades, he has been helping organisations harness their information and transform it into a strategic business asset. His wealth of experience covers all the key disciplines that help define, manage and exploit enterprise information, from putting in place effective Data Governance to delivering insight through Business Intelligence. In the field of Data Quality, he contributes expert knowledge and thought-leadership, drawing upon a track record of successfully delivering DQ initiatives to a wide range of organisations, including a key role in advancing the statistical analysis of health data. As a regular speaker and panellist at industry events, Jon enjoys bridging the gap between the business and IT domains, bringing fresh understanding and clarity – the same approach he adopts as a respected Information Management coach and mentor. Follow Jon on Twitter: @MadAboutData.

Beyond GDPR Compliance to Ethical Data: A Practical Introduction to Ethical Information Management

Katherine O'Keefe, Lead Data Governance & Privacy Consultant, Castlebridge

Dr Katherine O’Keefe is a lead Information Governance and Privacy consultant, trainer, and Chief Ethicist with Castlebridge. She has worked with clients in a variety of sectors, from telco to healthcare to charities, on consulting and training engagements. She has represented Castlebridge at Data Governance conferences internationally and regularly writes about Data Governance topics on the Castlebridge website and in other industry publications. Katherine lectures on Data Ethics and Privacy at the Law Society of Ireland and is a member of an international Data Ethics roundtable. She is also a leading expert on the fairy tales of Oscar Wilde and leads Castlebridge’s Gamification team, exploring ways to use games, storytelling, and non-traditional training activities to help change how people in organisations think about information. Follow Katherine @okeefekat.

Fast Data and Edge Analytics – the Next Frontier in Smart Business
11:00 - 11:15
Morning Break
12:45 - 14:00
Lunch
Afternoon Workshops
Designing A Logical Data Warehouse

Rick van der Lans, R20/Consultancy 

Business intelligence has changed dramatically over the last few years. The time-to-market for new reports and analyses has to be shortened, new data sources have to be made available to business users more quickly, self-service BI and data science must be supported, more and more users want to work with zero-latency data, the adoption of new technologies, such as Hadoop, Spark, and NoSQL, must be easy, and analysis of streaming data and big data is required.

The classic data warehouse architecture has served many organizations well. But it’s not the right architecture for this new world of BI. It’s time for organizations to migrate gradually to a more flexible architecture: the logical data warehouse architecture. This architecture, introduced by Gartner, is based on a decoupling of reporting and analysis on the one hand, and data sources on the other. With the logical data warehouse architecture, new data sources can be hooked up to the data warehouse more quickly, self-service BI can be supported correctly, operational BI is easy to implement, the adoption of new technology is much easier, and processing of big data is not a technological revolution but an evolution. And most importantly, the technology to create logical data warehouses is available: data virtualization servers. In this practical tutorial, the architecture is explained in detail. Tips and design guidelines are given to help make this migration as efficient as possible.

  • The benefits of the logical data warehouse architecture and differences with the classic data warehouse architecture
  • How easily new data sources can be made available for analytics and data science
  • How self-service analytics can be supported by a logical data warehouse, and how it helps to share specifications across different analytics tools
  • How your organization can successfully migrate to a flexible logical data warehouse architecture in a step-by-step fashion
  • How logical data warehouses help integrate self-service analytics with classic forms of business intelligence
  • The real-life experiences of organizations that have implemented a logical data warehouse

The Art of AI

Jan W Veldsink, Lead AI at Rabobank Compliance and Core Teacher, Nyenrode / Rabobank

You will not have failed to notice the rise of Artificial Intelligence, emerging from a long winter into a sunny spring. It is now time to find out what the fields of BI and AI have to offer each other. In this session, Jan Veldsink will look at what AI, now that it is being used more and more, can deliver to organizations in a BI context.

AI currently focuses on Deep Learning, a layered variant of Machine Learning based on Neural Networks. By training a network with examples, it develops a competence, usually classification of, for example, images or events. The training examples have to meet certain requirements, and we will see how BI can play a role in producing suitable training material.

Or, as a Data Scientist, you can be assisted by an already-trained AI! We will see several examples of AI-based systems that can take over part of the work of Data Scientists.

And for the real connoisseur there is the possibility to train an AI with your own examples and then have it do part of the BI work. Typical examples of this are anomaly detection in fraud investigation and determining when machinery maintenance should take place. We will see various platforms that make this possible.
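
To make the anomaly detection example concrete, here is a minimal Python sketch (not one of the platforms shown in the workshop) that trains an unsupervised model on invented transaction data and flags outliers.

    # Hypothetical sketch: unsupervised anomaly detection for fraud-style use cases.
    import numpy as np
    from sklearn.ensemble import IsolationForest

    rng = np.random.default_rng(42)
    # Invented training data: 500 "normal" transactions (amount, hour of day)
    normal = np.column_stack([rng.normal(50, 15, 500), rng.integers(8, 20, 500)])

    model = IsolationForest(contamination=0.01, random_state=42).fit(normal)

    # Score new events: -1 means flagged as anomalous, 1 means normal
    new_events = np.array([[55.0, 14], [4800.0, 3], [48.0, 11]])
    print(model.predict(new_events))   # the large 3am transaction should be flagged as -1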

Governing the Data Lake - The Critical Importance of An Information Catalogue

Mike Ferguson, Intelligent Business Strategies

With so much new data being captured across the enterprise and multiple self-service and data science initiatives being undertaken, something has to know and track what’s going on and what’s available in an increasingly complex data landscape.  In addition, people need the ability to publish what data and what artefacts (ETL jobs, data preparation jobs, analytical models, dashboards, etc.) currently exist, to encourage re-use and prevent re-invention.  This session shows how information catalogue software can be used to publish data, artefacts and policies in order to manage, organise and govern a multi-platform data and analytical environment.  This session will cover:

  • What is an information catalogue?
  • Information catalogue capabilities, e.g. business glossary, automatic data profiling, automatic tagging and data classification, automatic sensitive data discovery, automatic data indexing, faceted search, data marketplaces, artefact publishing
  • How does an information catalogue help govern a data lake?
    • Data meaning
    • Data Quality
    • Data access security
    • Data Privacy
    • Data Lifecycle management
    • Data Lineage
  • Information Catalog technology offerings
  • Creating a governed information value chain using an information catalogue
  • Key roles and responsibilities – Information producers, information consumers and governance
  • Publishing data and analytics as a service
  • Integrating disparate metadata via Open Metadata and Governance
  • Policy management and policy enforcement across multi-platform via an information catalog
  • Integrating the catalog with data management, data science, and BI technologies
  • Consumer trust – Accessing business glossaries and metadata lineage
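
As a rough, product-agnostic sketch of the kind of metadata an information catalogue publishes, the Python snippet below models a catalogue entry with a glossary term, owner, sensitivity classification, tags and lineage, and shows that a faceted search is then simply a filter over that published metadata; all names are invented.

    # Product-agnostic sketch of an information catalogue entry (illustrative only).
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class CatalogueEntry:
        name: str                      # technical dataset name
        business_term: str             # glossary term the dataset maps to
        owner: str                     # accountable information producer
        sensitivity: str               # e.g. "public", "internal", "personal data"
        tags: List[str] = field(default_factory=list)
        lineage: List[str] = field(default_factory=list)   # upstream sources / jobs

    catalogue = [
        CatalogueEntry(
            name="sales.customer_orders_v2",
            business_term="Customer Order",
            owner="Sales Data Team",
            sensitivity="personal data",
            tags=["data lake", "curated zone"],
            lineage=["crm.orders_raw", "etl.clean_orders_job"],
        ),
    ]

    # Faceted search: find datasets classified as personal data, e.g. for GDPR scoping.
    print([entry.name for entry in catalogue if entry.sensitivity == "personal data"])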

Making Enterprise Data Quality a Reality

Nigel Turner, Principal Information Management Consultant EMEA, Global Data Strategy 

Many organisations are recognising that tackling data quality (DQ) problems requires more than a series of tactical, one-off improvement projects. By their nature many DQ problems extend across and often beyond an organisation.  So the only way to address them is through an enterprise-wide programme of data governance and DQ improvement activities embracing people, process and technology. This requires very different skills and approaches from those needed on many traditional DQ projects.

If you attend this workshop you will leave more ready and able to make the case for, and deliver, enterprise-wide data governance and DQ across your organisation. This highly interactive workshop will also give you the opportunity to tackle the problems of a fictional (but highly realistic) company that is experiencing end-to-end data quality and data governance challenges. This will enable you to practise some of the key techniques in a safe, fun environment before trying them out for real in your own organisation.

Run by Nigel Turner of Global Data Strategy, the workshop will draw on his extensive personal knowledge of initiating & implementing successful enterprise DQ and data governance in major organisations, including British Telecommunications and several other major companies.  The approaches outlined in this session really do work.

The workshop will cover:

  • What differentiates enterprise DQ from traditional project based DQ approaches
  • How to take the first steps in enterprise DQ
  • Applying a practical Data Governance Framework
  • Making the case for investment in DQ and data governance
  • How to deliver the benefits – people, process & technology
  • Real life case studies – key do’s and don’ts
  • Practice case study – getting enterprise DQ off the ground in a hotel chain
  • Key lessons learned and maxims for success

Closing the Communication Chasm: Using Stories to Convey Actionable Insights

Lori Silverman, Partners for Progress

What’s the best way to communicate data to accelerate decision-making and action? Here’s a proven path: Reveal insights hidden in the data. Storify the insights that need to be given voice. And determine the best way to communicate those stories.  What does this imply? You need to know how to shift conversations from questioning data, data integrity, data analyses, and data visualization to dialogue grounded in foresight, strategic thinking, and action. And you need to know how to craft compelling meaning-filled stories that don’t contain data. What???? No data??? This workshop demonstrates how to unlock value for you and your organization through becoming a data translator and teaching others how to do the same.

  • Determine which questions to ask to move people from data to insight to action.
  • Identify three types of insights and stories that may reside within a set of data.
  • Outline compelling stories that move people to action.

Lakes, Marts, and All Things Data; How to Really Support the Business Strategy with Information Management

Jan Henderyckx, Partner, BearingPoint

Every day organisations make business decisions assuming the information in their system is accurate, but for many it can be costly if the data is flawed, out-dated, unchecked or simply not accessible. In a market where everyone is striving for more insights through data, the accuracy and trust of your data can make the difference between competitive advantage and bad decisions.

Aligning your information requirements with strategic business objectives is critical. Organisational, procedural and technical capabilities and policies need to be put in place to provide information management capabilities.

The industry has recognised the potential of information and we’ve witnessed an exponential growth in related tools, database solutions, Big Data platforms, appliances, data refineries, data lakes, analytics, algorithms, … However, many companies are struggling to deploy these concepts in a sustainable and effective way. The number of data breaches and data-related incidents is rising at the same, if not a higher, rate. For that reason, the approach this seminar takes is to embrace the innovation and disruptive ability of insight, but to embed it in the organisation in a sustainable way.

Do you recognise that information is a valuable asset, but do you struggle to deliver on that value?

This seminar teaches you how you can turn your organisation around and make it information centric delivering on the promise of accurate and trusted business information.

  • Engaging your business and having them take the lead and recognise the value of information;
  • Setting up the relevant building blocks to become information-centric;
  • Aligning your data strategy with your business strategy;
  • Redefining your Business Intelligence architecture;
  • Selecting the proper Enterprise Information platform to support your information strategy;
  • Assuring data literacy through the use of an information catalog;
  • A capability model and related services that support innovation and operational trust;
  • A metadata reference architecture, federated or centralised;
  • Policy-driven information management to assure proper lifecycle management.

14:00 – 17:15
Designing A Logical Data Warehouse
The Art of AI

Jan W Veldsink, Lead AI at Rabobank Compliance and Core Teacher, Nyenrode / Rabobank

Jan is a creative and energetic new thinker, with a passion for people and human processes.  Jan is a Senior Advisor, Trainer and Coach specialising in Artificial Intelligence and Intuitive interventions in organizations. His mission is to contribute to a secure and endurable environment within teams and organizations. His expertise areas are Artificial Intelligence and Machine Learning, cyber security, systems thinking, organizational/group dynamics, serious gaming and innovations.

Governing the Data Lake - The Critical Importance of An Information Catalogue
Making Enterprise Data Quality a Reality

Nigel Turner, Principal Information Management Consultant EMEA, Global Data Strategy

Nigel Turner is Principal Information Management Consultant for EMEA at Global Data Strategy Ltd. and Vice-Chair of the Data Management Association of the UK.  Nigel has worked in Information Management for over 25 years, both as an in-house implementer of Information Management solutions at British Telecommunications plc and subsequently as an external consultant to more than 150 clients, including the Environment Agency, British Gas, HSBC, Intel US and others.

Closing the Communication Chasm: Using Stories to Convey Actionable Insights
Lakes, Marts, and All Things Data; How to Really Support the Business Strategy with Information Management
15:30 - 15:45
Afternoon Break
Tuesday, 20 November 2018, Conference Day 1 & Exhibits
09:00 – 09:10
Joint Conference Chair Introductions: Rick van der Lans, R20/Consultancy and Jan Henderyckx, BearingPoint
Plenary Keynote: It's Not about You, It's About Them: Helping Others Take Action Based on Data

Lori Silverman, Partners for Progress

Maybe you routinely collect, cleanse, mine, and monitor data for insights. Or select technologies for data storage, processing, and transportation. Perhaps you get requests for data and analyses that fuel all sorts of visualizations. Or you’re implementing big data governance structures. These all have one goal in common: Enabling quicker and better organizational decision-making that sparks collaborative and aligned action. This begs two questions: Why does a communication chasm exist between those who do this work and those who need to solve pressing business problems and implement the solutions? And how do emotion and intuition reside harmoniously with data? This keynote presents a model and how-to framework for closing these gaps.

Objectives:

  • Illustrate the relationship between data, insight, intuition, and action.
  • Identify three different kinds of insight that can emerge from data.
  • Demonstrate why story is the narrative vehicle for decision making and action.

09:10 – 10:00
Plenary Keynote: It's Not about You, It's About Them: Helping Others Take Action Based on Data
Business Intelligence & Analytics Keynote: Working with Ambiguity

Donald Farmer, Principal, TreeHive Strategy

We have all seen sales forecasts where next year’s numbers are projected down to the last penny. Many of our efforts in Business Intelligence suffer in a similar way from false precision. In today’s workplace, these old certainties are wearing away. With the growth of Artificial Intelligence, Machine Learning and Predictive Analytics, we need to become much more comfortable with the language of probability and inference.

In this Keynote, Donald Farmer will explore how business users can become more comfortable with ambiguous information. We’ll see how new styles of visualization and reporting can help to make predictive analytics more actionable. And we’ll look at decision making styles and organizational patterns that are more appropriate for this more complex world.

Enterprise Data Keynote: Using Enterprise Information Management at the International Criminal Court to End Impunity

Dr Jones Lukose, Information Management Officer, The International Criminal Court

Humanity has continuously sought to find information, and to use and preserve it for posterity. Information preservation is, therefore, one of the most important tasks of communities and organizations. Information is now more correlated than ever and is found in large quantities known as ‘big data’; it is pervasive and difficult to capture, store or analyze.

In this Keynote, the International Criminal Court (ICC), a judicial organization for which the preservation of its information is a critical aspect of its judicial obligation, is discussed. Born in the digital age, the Court has adopted an eCourt strategy covering all aspects of its operations to end impunity in the world. To solve the accountability challenge, its functions are being redefined towards digital information orchestration and away from traditional paper preservation. The story is told in the form of three buttons, addressing the ICC’s information management practices and tools.

Key Topics:

  • Information value and accountability
  • Records managers and archivists as part of the information system
  • Probabilistic and unpredictable
  • Intuition and Trust
  • Records managers and archivists as orchestrators
  • Big Data

Takeaways include:

  • Understand the Court and its global mandate to end impunity
  • Learn what it means to run an eCourt
  • Understand the new role of records and archives professionals in a modern institution
  • Understand the motivational dimension of managing enterprise information in a “digital” organisation

The act of preserving elements of human existence does not only serve aesthetic functions but also fulfils the accountability needs of society. In this digital culture of ‘data everywhere,’ the demand for accountability is high, necessitating a redefinition of information management practices. This is, therefore, a story of how information management practitioners at the ICC are redefining their relevance and value while helping the organisation end impunity.

10:05 – 10:50
Business Intelligence & Analytics Keynote: Working with Ambiguity
Enterprise Data Keynote: Using Enterprise Information Management at the International Criminal Court to End Impunity
10:50 – 11:20
Break & Exhibits
Plenary Keynote Panel Discussion: The Changing Role of Data in Organisations

Moderator: Rick van der Lans, R20/Consultancy

We all know that the role of data within organizations has changed. There was a time when enterprise data was deeply hidden in our data centers and it was very hard for business users to get access to it. Nowadays, data plays a key role in day-to-day operations. Without the right data at the right time and in the right form, business operations may grind to a halt. New software and hardware technologies have made it possible to exploit data in many more ways than before. This all sounds easy, but what are the consequences of this changing role? What does it mean for us, the BI specialists? Will extensive use of data clash with the GDPR? Are we still using the right technology? Do we have the right data architecture? We are still struggling with many of these and similar questions. All these questions and more will be debated during this panel.

11:20 – 12:05
Plenary Keynote Panel Discussion: The Changing Role of Data in Organisations

Rick van der Lans, Independent Analyst, Consultant, Author and Lecturer, R20/Consultancy

Ajay Khanna, Vice President, Marketing, Reltio

Ajay Khanna is the vice president, Marketing at Reltio, the creator of data-driven applications. His product marketing and product management expertise stems from various leadership roles at large public enterprise software companies including Veeva Systems, Oracle, KANA, Progress, and Amdocs. He holds an MBA in marketing and finance from Santa Clara University.

Dr. Alexander Borek, Global Head of Data & Analytics, Volkswagen Financial Services

Katherine O'Keefe, Lead Data Governance & Privacy Consultant, Castlebridge

Hossein Kakavand, CEO and co-founder, Luther Systems

Luther Systems CEO Hossein Kakavand has extensive experience in the technology space and financial enterprise services. He received his Ph.D. from Stanford University in Information Systems and Statistics, and subsequently attended Stanford Business School. He was a fund manager at Soliton Capital, a quantitative hedge fund with a focus on ETF trading. He has also been with Funding Circle, a leading financial technology peer-to-peer financing platform, where he was part of the team launching the capital markets group; there he was instrumental in developing and bringing to market the first ever securitization of a portfolio of fixed income peer-to-peer (P2P) assets across Europe, in setting up an exchange-traded P2P income fund currently traded on the London Stock Exchange, and in the first European hedge fund leveraged P2P loan investment program. He also developed an end-to-end portfolio analytics and monitoring pipeline. In addition, he has advised a number of financial and technology startups in Silicon Valley on both technical and strategic issues, including a VC-backed early-stage distributed computation platform focused on automation of analytical portfolio risk and optimization modelling, a hardware-software combination computation platform, and a VC-backed big data analytics recommendation company. He is also on the Harvard Business School Angel Investors of London selection committee.

Business Intelligence & Analytics
Business Intelligence & Analytics
Enterprise Data
Enterprise Data
The Automated Factory Worker

Santanu Dawn, Master Data Leader EMEA, Goodyear 

The terms Big Data, Machine Learning and IoT are the buzzwords of the moment. There is much discussion about what they are and why they are different from other projects. Now welcome to the world of manufacturing. For ages, our machines have been generating huge amounts of data; it’s only now that we have the capability and technology to capture and process it cost-effectively. In pure statistical terms, the manufacturing world generates as much machine data as social media. The question is what to do with it.

In this session, we will try to build an optimized manufacturing organization by:

  • Beating the bottleneck – The new era of analytics
  • Augmenting workforce through technology
  • Intelligent software systems and how they are transforming knowledge work
  • New advances in artificial intelligence, machine learning and natural user interfaces

Data Product Development: from Zero to Hero

Andrey Sharapov, Data Scientist, Lidl

Modern organizations are overwhelmingly becoming convinced that data can be successfully turned into better decision making that will directly result in higher profits and lower costs. At Lidl, we believe that data products are the key ingredient that can either automate or supplement business processes. The ability to give internal decision makers the opportunity not only to explore the landscape of their business with descriptive analytics, but also to uncover hidden patterns and dig deeper using prescriptive analytics, is becoming increasingly popular among organizations.

Needless to say, starting from scratch is a very difficult endeavour, and companies often begin with simple ad-hoc analysis but fail to progress toward data products due to a lack of talent and ever-increasing technological complexity. Indeed, a data product is a piece of software that consistently produces meaningful results, while ad-hoc analysis is usually a single, irreproducible dive into the data with a doubtful outcome.

Lidl is a German global discount supermarket chain that operates over 10,000 stores across Europe and the United States. In this session, Andrey will show you how to build a data product starting literally from pieces of ad-hoc research. He will describe their experience of working with internal customers and the challenges that data science teams face. As a young team of data engineers and data scientists, Andrey will share how they jump-started data product development at Lidl and present the key achievements as well as the challenges they came up against. He will also illustrate how they tamed their Hadoop cluster, took advantage of the Hortonworks platform, and wrote and deployed production code in R.

Data, Meet Regulations. How do you do?

Peter Campbell, Founding Member & Director, DAMA BeLux

In recent years, regulations have increased significantly, both here in Europe as well as globally. Some of these were triggered by the global financial crisis of 2008, whilst others are driven by objectives of market transparency, safety, or considerations for privacy of personal data.

In this session, several regulatory business cases will be presented, and, wherever possible, related to the need for improvements in data management, not only to support compliance, but to improve overall data management capabilities and effectiveness, in specific industries and across organisations.

Representative compliance business cases include:

Energy/High Power Electricity Transmission:

European Commission Regulation 543/2013 (of 14 June 2013) on the submission and publication of data in electricity markets;

Pharma / Life Sciences:   IDMP (IDentification of Medicinal Products):

In Europe: Commission Implementing Regulation (EU) No 520/2012 (Articles 25 and 26), which obliges European Union (EU) Member States, marketing authorisation holders and the EMA (European Medicines Agency) to implement the standards developed by ISO (11238, …..)

In the U.S.: compliance/regulation is applicable and enforceable by the FDA (Food and Drug Administration)

GDPR (General Data Protection Regulation, EU 2016/679 [14 April 2016, effective 25 May 2018]):

Regulation on the protection of natural persons with regard to the processing of personal data and on the free movement of such data. It also addresses the export of personal data outside the EU and EEA (European Economic Area). The GDPR aims primarily to give citizens and residents control over their personal data and to simplify the regulatory environment for international business by unifying regulation within the EU.

There will also be discussion of some of the international standards organisations and specific standards which often play important roles in these compliance projects; for example, standards from the IEC (International Electrotechnical Commission), ISO (the International Organization for Standardization), and the W3C (World Wide Web Consortium).

Wherever applicable, the DMBOK2 (Data Management Body of Knowledge, released in 2017) will be discussed, and the new data management services and improvements in this version of the DMBOK will be highlighted, for the benefits this framework can bring in facilitating regulatory/compliance projects as well as in improving overall data management environments.

Why Training the Organisation and not just the Data Team is Vital

Phil Yeoman, Head of Data Governance, The Pensions Regulator

In this ever-changing data environment it’s easy to focus on the latest bit of tech kit you have persuaded procurement to purchase and/or the skills of the data scientist you have just recruited. Do you go Data Lake or Data Warehouse? Let’s fire up a Hadoop environment and sandbox away.

But have you stepped back and thought about the gap between what we data folk talk about and what the business folk understand?

Phil’s talk will focus on:

  • Bridging the gap between data people and business people.
  • Why training the organisation and not just the data team is vital.
  • Why people and culture matter

Delegates will learn:

  • How language and concepts can be intimidating
  • Why people need to have faith and confidence in what data teams do
  • If organisations want to be data-driven, all their people need to be data-savvy.

12:10 – 12:55
The Automated Factory Worker
Data Product Development: from Zero to Hero
Data, Meet Regulations. How do you do?

Peter Campbell, Founding Member & Director, DAMA BeLux

Peter Campbell has been working in data management for most of his career, spanning over 3 decades.  He has worked in many different industries, primarily internationally, after relocating to Europe  from Boston in the early 1990s, working at that time for Bachman Information Systems.  Since 2012,  he has been actively working on regulatory / compliance projects which will be presented in this  session.  He is also a Founding Member and Director of the BeLux (Belgium & Luxembourg) Chapter of  DAMA (Data Management Association), and has been involved in the Brussels Data Science Community and the European Data Innovation Hub, and is a guest lecturer at local universities on data management topics.

Why Training the Organisation and not just the Data Team is Vital
12:55 – 14:25
Lunch, Exhibits and Perspective Sessions
How CDOs & CIOs are Driving Digital Transformation

Ajay Khanna, Vice President, Marketing, Reltio

Today’s business landscape is more dynamic than ever. New revenue models, stringent regulations, and high customer expectations are forcing organisations to evolve or face being overrun by more nimble competitors.

CDOs and CIOs of established businesses are looking to digital transformation as a key initiative. But what exactly does digital transformation entail? At its core, any digital transformation requires clean and consistent data, reconciled across systems and channels. An enterprise-wide data management foundation that ensures real-time access to reliable data of all types at scale is non-negotiable. Data access must be democratised across all groups and divisions so that teams can get a 360-degree view of customers, products, organisations and more. However, it’s not just about disconnected, siloed analytics. It’s about the next generation of operational data-driven applications that allow frontline business users to gain relevant insight and intelligent recommended actions so they can achieve their goals. This session explores how some of the largest companies in the world are transforming themselves using the same modern data management technology used by Internet giants such as Amazon, Facebook, LinkedIn, and Google.

13:25-13:50
How CDOs & CIOs are Driving Digital Transformation
Blockchain as Data Processing Railroads

Hossein Kakavand, CEO and co-founder, Luther Systems

Enterprise processes, within and between organizations, are fragmented due to a focus on functions over processes, leading to duplication of effort, propagation of error, and a need for reconciliation of data across processes, which is costly, limits the streamlining of these processes, and constrains potential new applications.

We will talk about the sources of fragmentation and about smart contracts, in particular:

(i) how Smart Contracts address enterprise process fragmentation by providing railroads for enterprise processes;

(ii) how these railroads sit alongside enterprise systems rather than replacing them;

(iii) the benefits of smart contracts;

(iv) the interplay between smart contracts and artificial intelligence.

13:55 - 14:20
Blockchain as Data Processing Railroads
Plenary Keynote: Data Science for Grown Ups: How to Get Machine Learning out of the Lab to Scale it Across the Enterprise

Dr Alex Borek, Global Head of Data & Analytics, Volkswagen Financial Services AG

Digital players like Amazon, Uber and Netflix are using data science and machine learning at a large scale to drive business value in all of their core business processes. Many large organisations in more traditional industries have also invested heavily in data science, big data, data governance and business intelligence over the past few years and often struggle to scale their successful machine learning projects beyond a small pilot scope. In other words, algorithms stay in the lab and are not put into the heart of the enterprise. This keynote presentation highlights experiences and strategies in maturing a data lab into a global data factory organisation that can ensure that the promised value of data science and machine learning is truly realised. It also discusses the organisational implications for business intelligence, data management and data architecture and the role of cloud driven technologies in the required transformation to get your company ready for the age of artificial intelligence.

Key takeaways:

  • Learn about the organisational implications for scaling data science and machine learning to the heart of the enterprise
  • Understand the role of business intelligence, data management and data architecture in the business transformation
  • Hear about good practices and strategies for managing the organisational and technological change process towards an AI driven enterprise

14:25 – 15:15
Plenary Keynote: Data Science for Grown Ups: How to Get Machine Learning out of the Lab to Scale it Across the Enterprise
Business Intelligence & Analytics Keynote: Making Money with Customer Data

Sakari Jorma, Head of Business Technologies, Chief Digital Officer & CTO, Talenom

When a company’s EA and data management reach the level of maturity where they become an asset, how many companies monetize this? Customer activity data and consumer profiling are here to stay and yes, many do this, but how about truly creating value for existing customers and making more money for your company? How does one ramp up a new digital business using BI? What is the roadmap? What has to be in place before one can jump into the world of SaaS business with customer data?

  • EA in good shape
  • Next to perfect BI tools
  • SAAS business model
  • USE CASES – value creation to customer – why should I buy this?
  • Transition from conventional sales to Up & Cross sales models

Talenom is one of the largest accounting and financial services companies in the Nordics; it reported an aggressive turnover increase of over 18% in Q1 2018 and continues to expand. The company is a leader in digital accounting and automation in the region, which has led to an operating margin of 18%. Their story begins with getting the “basics in place in IT” and understanding data as an asset. From there, they moved to the customer value process and a quick implementation of a SaaS strategy. With customer value cases combined with data and tools, the creation of a BI business was obvious.

Talenom set up a Business Technologies business unit, led by the CDO, which combines the IT, automation, integration, customer support, and up- and cross-sales functions. Digital business is not an IT or Head of BI job – it’s a job for the CDO! This is a true opportunity for future BI and Chief Data Officers.

Enterprise Data Keynote Panel - GDPR: Beyond Compliance

Moderator: Mike Simons, Associate Editor CIO.co.uk, ComputerworldUK and Techworld 

We are living in the new world of GDPR, with most organisations proudly announcing that from 25 May they have been compliant with the new privacy regime.

Behind these announcements, however, is a deep concern about whether the effort that went into achieving compliance is sustainable, and about how to demonstrate and deliver long-term business benefits from your compliance efforts.

GDPR is not a one-off test that an organisation passes or fails; it has to be embedded into the enterprise and its day-to-day practices.

Join this panel discussion and:

  • Explore ways of getting real business value from compliance
  • Examine how to use compliance programmes to get actionable business intelligence from the data you hold
  • Use GDPR and transparency over data as a business differentiator
  • Consider the scope for automation in GDPR-related data management programmes
  • Learn how organisations are maximising the value of data lineage

15:20 – 16:05
Business Intelligence & Analytics Keynote: Making Money with Customer Data

Sakari Jorma, CDO & CTO, Talenom

Sakari Jorma has almost 20 years of experience in business development and data & information management. He is currently leading Talenom’s (a Nordic financial and accounting services company) Digital Services business unit. Before Talenom, Sakari ramped up Software AG’s Information & MDM Management practice globally and developed the go-to-market strategies for Software AG. He also spent more than 10 years leading Nokia’s Master Data Management globally, where he worked with Product, Customer, Reference and Supplier Master Data Management, including Data Quality and Data Governance. He also has extensive expertise in CRM and Product Life Cycle Management (PLM), having led the implementation and operation of services for global CRM and PDM principles. Sakari is a board member of DAMA and received the 2016 DAMA Excellence Award for his work in leading and building a practical approach to data management.

Enterprise Data Keynote Panel - GDPR: Beyond Compliance

Mike Simons, Associate Editor CIO.co.uk, ComputerworldUK and Techworld

Mike Simons is a highly experienced technology journalist, working as Associate Editor for CIO.co.uk, ComputerworldUK.com and Techworld. He is regularly called on to judge industry awards, including, recently, the SAP Quality Awards, and also works as a film producer. He was News Editor at ComputerWeekly.com, and of the combined Computer Weekly and ComputerWeekly.com, before joining IDG as Launch Editor of ComputerworldUK and subsequently taking over responsibility for Techworld as well.

Gary Chitan, Head of UK Data Intelligence Sales, ASG

Gary is Head of UK Data Intelligence Sales at ASG. Gary has over 20 years’ experience in both manufacturing and service industries. Prior to his move into sales, Gary was a senior business consultant  working with clients in the Pharmaceutical and Finance Sectors, on business transformation and regulatory compliance.

James Archer, Privacy Champion, ITV

James traces his structured approach to understanding processes and data to solve problems back to being trained as a Cobol programmer 30 years ago. He has 20 years experience practicing and teaching business analysis.  Building on the success of IIBA events in London that he organised, James is a co-founder of this conference. He was awarded a Masters in Innovation, Creativity and Leadership and with Penny Pullan co-edited the book, Business Analysis and Leadership (Kogan Page).   Recent work includes collaborating with NHS and Social Care organisations to set-up self-managed teams of nurses and care workers, adapting an innovative model from the Netherlands; and as a Privacy Champion for ITV International Studios to implement GDPR regulations for ITV.  James has taught Business Analysis in 17 different countries and believes that the key to great business analysis is an inclusive leadership style, thinking innovatively, working collaboratively, acting strategically and helping people discover their real business needs and requirements.

Cathy Pendleton, Senior Manager - Data Governance, comparethemarket.com

Cathy is Senior Manager, Data Governance at Comparethemarket.com. Heading up a rapidly growing team, central to business operation, Cathy is responsible for maturing the Data Governance framework within the business, embedding it into everyday business activities and focusing on the core areas of Data Security, Data Privacy, Data Quality and Data operations.

Compare the Market is a leading price comparison site that makes getting it done, simples. Working with an ever-expanding panel of partners they help customers save money across a diverse range of products, from car and home insurance; to energy; to personal finances, such as credit cards and loans. And that’s not all, they also reward customers for buying through them with a whole year of Meerkat Meals and Meerkat Movies, giving customers 2 for 1 on food and film for a year!

Having worked in data for almost 20 years, Cathy has been involved in some pretty amazing data projects spanning the globe: everything from the review and migration of new business acquisitions into existing systems and processes in Switzerland, to the complete overhaul of the entire MARTECH stack in Washington and Australia, to, more recently, a change management programme across a collection of 30 brands at a London-based publisher to deliver compliance with the new GDPR regulations – systems, processes, people and governance!

16:05 – 16:35
Break & Exhibits
Enterprise and Self-Service BI on Top of a Data Lake

Krystyna Kurinna, Teamlead Data Access Services & Solutions, Scout24 AG

The Data Lake concept has been developing constantly over the past years. How do you make sure all business users can access data easily and carry out performant, sustainable analysis? I would like to share our experience of building enterprise and self-service BI on top of a data lake and cover different aspects of building a data-driven culture:

  • Structuring your teams as a data platform team (to make sure the data lake works and data is accessible) plus two analytics teams (central and decentralised) is a successful set-up for working BI on a data lake
  • To make sure scalable and agile business intelligence is in place, data producers take full responsibility for data publishing and quality, and data consumers for metrics definition and implementation of the business logic layer
  • Additional technical skills, like the ability to build a data pipeline, are required for powerful analysis and are indispensable for the new generation of data analysts

Data Science Workbenches and Machine Learning Automation – New Technologies for Agile Data Science

Mike Ferguson, Managing Director, Intelligent Business Strategies

The demand for analytics is now almost everywhere in the business. Analytics are needed in sales, marketing and self-service, finance, risk, operations, supply chain and even HR. However, the current shortage of data scientists and the reliance on detailed skills such as programming have led many corporate executives to question current approaches to the development of high-value analytical models and to ask whether they can be accelerated in any way to improve agility and reduce time to value. This session looks at this problem in detail and at how emerging data science workbenches and machine learning automation tools can help reduce the reliance on highly skilled data scientists, allow business analysts to become data scientists, and so meet the demands of the business.

  • The explosion in demand for analytics
  • Data science and the modern analytical ecosystem
  • Challenges with current approaches to analytics
  • Requirements to reduce time to value and accelerate development of analytical models
  • Improving productivity by integrating Information catalogs and data science workbenches e.g. Cloudera Data Science Workbench, IBM Watson Studio
  • Accelerating model development, monitoring and model refresh using ML automation tools, e.g. DataRobot, Tellmeplus Predictive Objects, SAS Factory Miner, Dataiku Data Science Studio
  • Facilitating rapid analytics deployment via analytics as a service to maximise effectiveness and competitive edge
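
As a rough illustration of the kind of automation such tools provide – model selection and hyperparameter search handled by the tool rather than hand-coded by a specialist – the sketch below uses scikit-learn's GridSearchCV over a small pipeline. The dataset, parameter grid and thresholds are illustrative assumptions, not material from the session or any of the products named above.

# Minimal sketch of automated model selection, in the spirit of ML automation
# tools (the dataset and search space are illustrative assumptions).
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.ensemble import RandomForestClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A pipeline keeps preprocessing and modelling together, so the whole unit
# can be searched, refreshed and deployed as one artefact.
pipeline = Pipeline([
    ("scale", StandardScaler()),
    ("model", RandomForestClassifier(random_state=0)),
])

# The "automation": the search explores candidate configurations and keeps
# the best one, instead of a data scientist tuning parameters by hand.
search = GridSearchCV(
    pipeline,
    param_grid={"model__n_estimators": [50, 200], "model__max_depth": [5, None]},
    cv=5,
)
search.fit(X_train, y_train)
print("best params:", search.best_params_)
print("hold-out accuracy:", search.best_estimator_.score(X_test, y_test))

Commercial workbenches wrap this loop with cataloguing, collaboration and deployment features, but the underlying idea is the same: the search, not the person, does the repetitive tuning.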

You Cannot Inspect Ethics into a Product: Ethics and Quality Management

Katherine O’Keefe, Lead Consultant / Chief Ethics Officer, Castlebridge

“It’s not enough to do your best; you must know what to do and then do your best”.

Ethics have been big news in the headlines recently. Whether we are looking at remarkable failures in ethics, the ethical implications of Big Data processing, or self-driving cars, it is becoming more and more clear that we need to get this right. The challenge arises in moving from abstract discussion to practical application.

This session uses Deming’s view of quality as a starting point to understand and implement ethics in Information Management. You’re already doing “ethics” whether you explicitly consider it or not.  The question is whether it is a “best efforts” production without the clear guidance of principles. Have you taken the time to consider what it is you’re doing? Is your organization’s leadership setting the tone from the top?  Do you have appropriate systems of management in place to support ethical decisions and actions?

Key takeaways for this session include:

  • An overview of Ethics and their relevance to Information Management practices
  • Three types of Normative Ethics in organizations
  • How W. Edwards Deming’s 14 points can be a starting point for introducing an ethical framework
  • An overview of practical methods to align ethics with Information Governance
  • Risk management and information management practices

Mind your Language: The Criticality of Common Data Definitions in Managing Complex Data

Becky Russell, National Lead for Data Standards, UK Environment Agency
Nigel Turner, Principal Consultant EMEA, Global Data Strategy

The Environment Agency generates, collects and processes large quantities of complex data to support its mission to improve the environment of England.  To enhance its management of this data it has embarked on a major programme to develop and enforce clear data definitions and data standards, driven in part by the UK Government’s Open Data agenda which makes much of the Environment Agency’s data available for public use and scrutiny.

This case study will highlight the importance of shared, common data definitions as a vehicle to improve collaboration in data management, both across the Agency and with its extensive network of partners.   It will cover:

  • The role of the Environment Agency and the importance of data management
  • The issues resulting from a lack of data definitions and standards
  • How the Environment Agency is tackling the problems
  • Why a common language is critical in collaborative data management
  • The role of data science & data modelling in developing common definitions and standards
  • The necessity of multiple definitions in complex data sets
  • The relationship between data definitions and IT development
  • Lessons learnt and advice for other organisations trying to implement data definitions and data standards

16:35 – 17:20
Enterprise and Self-Service BI on Top of a Data Lake

Krystyna Kurinna, Teamlead Data Access Services & Solutions, Scout24 AG

Krystyna Kurinna

Teamlead Data Access Services & Solutions, Scout24 AG

Krystyna Kurinna has a PhD in mathematics from Donetsk University (Ukraine) and many years of experience in business intelligence environments. She is responsible for the Data Access Services & Solutions team at Scout24 AG. Her team’s vision is that every business user is provided with the best services and solutions to create sustainable and scalable analysis. Scout24 is the biggest German platform for those who search for real estate, cars and financial planning information and those who offer them.

Data Science Workbenches and Machine Learning Automation – New Technologies for Agile Data Science
You Cannot Inspect Ethics into a Product: Ethics and Quality Management
Mind your Language: The Criticality of Common Data Definitions in Managing Complex Data
The Power of Data as a Catalyst for Collaboration

Jonathan Sunderland, Data Evangelist, Harbr

Today the challenge is not how much data you have, it’s how much data you can use effectively – making better decisions quicker.

A solitary data source is rarely sufficient for analytics or decision-making. Yet within organisations the combination of legacy and non-integrated systems, cloud offerings and spreadsheets becomes bewildering. Add the complexities of external data and the challenges become truly daunting.

Where analysts can collaborate in an ecosystem in which data is acquired and profiled once, they can debate and iterate analysis as quickly as possible and establish a single version of the truth for others to trust in decision making.

Serendipitous discoveries are far more likely, conversations spark curiosity and it is easier for analysis to truly drive innovation – data can become a true catalyst for collaboration.

What delegates will learn:

  • Why customers expect organisational agility
  • The importance of collaboration at the raw data level
  • How data can be the catalyst for change
  • What a data-cultured organisation feels like

Brewing a Data Driven Organisation Leveraging on Self-Service Analytics

Alfredo Pirrone, VP Strategic Planning, Cerveceria Regional

Cerveceria Regional is the second largest Venezuelan brewery, holding a 14% share of the beer market. In 2016 the company set the strategic goal of becoming a data-centred organisation.

What started in 2014 as a traditional BI project based on Data Warehouse technology rapidly evolved to become a massive initiative leveraging self-service analytics. Becoming data-driven impacted the required tools, the organisation and the company’s culture.

Cerveceria Regional’s data architecture is based on SAP-ERP and its Data Warehouse runs on SAP-BW 7.4. In 2015 the company adopted Tableau as its graphical analysis and data visualization tool. No Big Data technologies have yet been adopted.

The presentation will distil the experience of becoming data-centred, providing insights and lessons learned:

  • Assessing change readiness
  • Shaping the organisation
  • Impact on governance and culture
  • Unexpected impediments and benefits
  • Roadmap

AI and IoT for Good

Naser Ali, Head of Solution Marketing, Hitachi Vantara

In this session, Naser Ali shares his experience working in the IoT and AI space; covering complexities, pitfalls, and opportunities to explain why innovation isn’t just good for business — it’s a societal imperative.

Key takeaways include:

  • Deeper understanding of what Big Data, IoT, and AI mean at a functional level, not just which brands the buzzwords refer to.
  • Detailed understanding of some use-cases, and why solving these is more complex than it seems.
  • Not just what it’s for, but who it is for, and how to think about the “business case” or social imperative around it.

Data Management in Manufacturing

Felix Streichert, Data Governance Manufacturing, Robert Bosch GmbH

Digitalization in manufacturing promises a large potential for cost savings, but is faced with a number of challenges, e.g. cost pressure, complex value chain networks, and legacy systems. Introducing data management to this environment requires a step-by-step approach that yields benefits all along the way. We propose a data process life-cycle that allows us to incrementally improve, exploratively analyze, and finally standardize data, allowing for both limited and full data management in the same framework while giving a path from one to the other.

  • I4.0 and Big Data
  • Perpetual Brown Field
  • Data Management Approach
  • Data Process Life-Cycle

17:25 – 17:45
The Power of Data as a Catalyst for Collaboration

Jonathan Sunderland, Data Evangelist, Harbr

Jonathan Sunderland

Data Evangelist, Harbr

Jonathan is a highly experienced transformational data leader with broad experience of building new organisational capability and establishing high-performing agile data teams across a variety of industries. As evangelist at HARBR his role is to champion agile data and to help organisations with the cultural changes needed to drive long-term effective value from data as an asset.

Brewing a Data Driven Organisation Leveraging on Self-Service Analytics

Alfredo Pirrone, VP Strategic Planning, Cerveceria Regional

Alfredo Pirrone

VP Strategic Planning, Cerveceria Regional

Alfredo is VP Strategic Planning of Cerveceria Regional. For the last four years, he has led the ongoing transformation of Cerveceria Regional into a data-centred organization. During this period, Alfredo has gradually become more involved in managerial and technical aspects of business intelligence and data science. Alfredo is an executive with more than 30 years of experience in the areas of corporate finance, capital markets, financial planning, strategic planning and business intelligence acquired in companies in the investment banking, beverage, mining and management consulting sectors. He has vast experience in financial modelling, business analysis, and business planning, developed mainly in Venezuela, but with assignments in the United States and several countries in Europe and Latin America. Alfredo holds a Computer Engineer degree (Caracas 1980) and an MBA (London 1999).

AI and IoT for Good

Naser Ali, Head of Solution Marketing, Hitachi Vantara

Naser Ali

Head of Solution Marketing

Hitachi Vantara

Naser Ali is the Head of Solution Marketing for what was Pentaho and now the Analytics and IoT Division of Hitachi Vantara. Naser joined Pentaho after twenty years working for big data, cloud, middleware, data centre management, storage and power management companies where he handled marketing and strategic planning. Prior to that, Naser was a research engineer for GE Lighting. Naser has an honours degree in Electronic Engineering from the University of York, a master’s degree in Optoelectronics from the University of Strathclyde, a professional postgraduate diploma from the Chartered Institute of Marketing, and is a member of the Institute of Electrical Engineers. He is based in London, where he lives with his wife and two children.

 

Data Management in Manufacturing

Felix Streichert, Chief Data Manager, Bosch

Felix Streichert

Chief Data Manager

Bosch

Felix Streichert is the Chief Data Manager in manufacturing for the Bosch Group, a leading global supplier of technology and services. Originally a machine learning researcher, he worked for several years on intelligent solutions and customer projects for mobility systems at Bosch. Building on his work on the artificial intelligence initiative at Bosch, he specialized some years ago in data governance and management for I4.0 and IoT activities in manufacturing, establishing and coordinating several data governance teams in different business sectors at Bosch.

17:45 – 18:30
Drinks Reception & Exhibits
Wednesday, 21 November 2018, Conference Day 2 & Exhibits
Plenary Keynote: Challenges of Developing an Enterprise Data Marketplace

Rick van der Lans, R20/Consultancy

There is a new kid on the data block: the data marketplace. In a data marketplace business users shop for the right data products. Examples of data products are predefined KPIs, reports, files, and data services. The data marketplace is a supply-driven architecture in which data products are developed before the business requests them. This is very similar to how most shops operate: products are researched and developed before there is any guarantee that they will be bought. One of the key goals of the data marketplace is to let organizations benefit more from the investment they have made in data over the years. In this keynote, Rick explains how an enterprise data marketplace differs from a data warehouse and a data lake, and discusses the challenges of developing and maintaining one. Because it is another data delivery system developed to supply business users with the right information at the right time, some incorrectly think it is the old data warehouse with a twist. Rick will address this common misunderstanding: in general, the data marketplace extends the capabilities of existing data delivery systems, such as data warehouses, data marts and data lakes.

09:00 – 09:55
Plenary Keynote: Challenges of Developing an Enterprise Data Marketplace
09:55 – 10:25
Break & Exhibits
Business Intelligence & Analytics Keynote: Driving Change by Applying Analytics Enterprise-Wide

Ian Wallis, Head of Data, Analytics & Insight, Defence Infrastructure Organisation

The application of BI and analytics can often be in a specific function, such as sales, marketing, HR or Finance. However, there is a growing awareness that these techniques can be applied to any part of an organisation to drive change and release value. This presentation will explain how the use of BI and analytics can be applied to a range of activities and a centralised team can deploy their skills and experience from one functional activity to another, thereby broadening the capabilities of that team and achieving significant benefits for the organisation.

Enterprise Data Keynote: The Producer, the Consumer, the Owner and the Rest of the World: Governing Big Data

Jan Henderyckx, Partner, BearingPoint

Big data governance is not just about making sure that you efficiently use your Hadoop cluster or assuring that you work on the relevant use cases. With the democratization of big data capabilities and the wider access to data, questions arise on the regulatory and ethical compliance of the data usage. Locking all data down is not the answer as we would lose too much value. This presentation focuses on the steps you need to take to get sustainable and compliant value out of your big data.

What delegates will learn from attending the session:

  • What is the distinction between information and big data governance
  • Catering to the dynamics of data onboarding and usage flows towards policy-based classification and access
  • Use case governance vs critical data elements
  • The impact of the big data governance requirements on the architecture

10:25 – 11:10
Business Intelligence & Analytics Keynote: Driving Change by Applying Analytics Enterprise-Wide
Enterprise Data Keynote: The Producer, the Consumer, the Owner and the Rest of the World: Governing Big Data
Plenary Keynote: Digital Business: Tomorrow is Already Here

Andreas Bitterer, Chief Analytics Evangelist EMEA, SAP

Digital business is about intelligently connecting people, things and businesses. It’s an infinite world of new possibilities for companies to reimagine their business models, the way they work, and how they compete. New technologies such as machine learning, the Internet of Everything, blockchain and cloud will transform value chains to enable completely new ways of doing business and change our way of life. Hear how leading organizations deliver an innovative customer experience by leveraging the latest technologies and the creative use of a wide variety of information assets.

11:15 – 12:00
Plenary Keynote: Digital Business: Tomorrow is Already Here

Andreas Bitterer, Chief Analytics Evangelist EMEA, SAP

Andreas Bitterer

Chief Analytics Evangelist EMEA, SAP

Andy Bitterer is SAP chief analytics evangelist EMEA, based in Hamburg, Germany. He brings both corporate and industry analyst experience, joining from SAS where he was senior director for the BI product line. Prior to that, he was an IT research analyst and consulted with clients as VP and research fellow at CXP/BARC and, as a research VP at Gartner, authored Magic Quadrants, Hype Cycles, and Predicts and spoke at many conferences worldwide. He also spent 15 years with IBM in various management, consulting and technical roles. Andy has authored numerous books, and he’s an accomplished pianist and award-winning photographer.

12:00 – 13:00
Lunch & Exhibits
Business Intelligence & Analytics
Business Intelligence & Analytics
Enterprise Data
Enterprise Data
Is your Company Ready for Self-Service BI?

Ivan Schotsmans

Most companies are still in a traditional data warehouse mode and not ready for self-service BI. In some departments business users are handling their own data, but it is a silo-based approach, manipulating and extending trusted data sources. For self-service, a company-wide approach is needed where results are based on trusted data sources. It is not about tools: there is no switch to turn traditional BI into self-service BI. It all starts with an architecture for a multipurpose BI environment. To create a self-service BI environment in which business users make decisions based on facts, we need to take a number of key factors into account:

  • Do we all need to be data scientists?
  • Which architecture do we put in place?
  • How do we handle data governance?
  • How do we guarantee security?
  • Did we foresee the necessary change management?

Key Takeaways:

  • Architecture for Self-Service BI
  • How to translate data into information
  • Need for Change management
  • Data Correctness

Edge Analytics and Client-Side Machine Learning

Timo Kunz, Data Scientist, Catawiki

The proliferation of Internet of Things (IoT) devices and the corresponding surge in the volume of data streams has created a need for processing and analysing data closer to where it is generated: at the edge of the network. In a similar fashion, machine learning models are coming to the browser rather than data being sent to a server. A number of powerful tools have recently become available that make this approach very accessible: the development of sophisticated applications that make efficient use of the user’s hardware capacity, and even allow model training on the client side, is now possible for everyone. This session introduces some recent trends and highlights some free, state-of-the-art tools that can get you started. A demo illustrates a simple implementation.
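
To make the edge idea concrete, here is a minimal, self-contained sketch (the class name, window size and threshold are illustrative assumptions, not tools from the session): readings are analysed where they are produced, and only compact aggregates and anomalies are forwarded rather than the raw stream.

# Minimal edge-analytics sketch: analyse sensor readings locally and forward
# only aggregates and anomalies instead of the raw stream.
# All names and thresholds are illustrative assumptions.
from collections import deque
from statistics import mean, stdev

class EdgeAnalyzer:
    def __init__(self, window_size=60, z_threshold=3.0):
        self.window = deque(maxlen=window_size)   # recent readings kept on-device
        self.z_threshold = z_threshold

    def process(self, reading):
        """Return an event worth sending upstream, or None to stay silent."""
        if len(self.window) >= 2:
            mu, sigma = mean(self.window), stdev(self.window)
            if sigma > 0 and abs(reading - mu) / sigma > self.z_threshold:
                self.window.append(reading)
                return {"type": "anomaly", "value": reading, "baseline": mu}
        self.window.append(reading)
        if len(self.window) == self.window.maxlen:
            return {"type": "summary", "mean": mean(self.window)}
        return None

# Usage: only a handful of events leave the device for a long stream of readings.
analyzer = EdgeAnalyzer(window_size=10)
for value in [20.1, 20.3, 19.9, 20.2, 20.0, 20.1, 35.7, 20.2, 20.0, 20.1, 20.3]:
    event = analyzer.process(value)
    if event:
        print(event)

The same pattern underpins browser-side inference: compute stays with the data, and only results cross the network.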

How to Lower Costs Using IoT Supported by AI

Majken Sander, Independent Consultant   

This session takes delegates through real-life cases: the story of a canteen saving money and helping the environment; another about automatic meeting relocation based on sensors and algorithms. Also included is a retail store turning down an idea to save money because customers expect otherwise, proving that sometimes the smartness of IoT is outsmarted by end users and company optimisation parameters.

Forget About BI and EA. Digital Twin of an Organization (DTO) is Already Transforming Both

Petteri Vainikka, CMO, Ardoq

What is the common ‘thing’ across BI, EA, Enterprise data (MDM), and eGRC professionals, each operating within their own department? They all rely on data. They all rely on largely the same data in fact. Data on how people, business processes and outcomes, applications and their hosting environments, and data schema are interconnected in an ever more complex ecosystem.

Disruptive change to break down these silos and to become truly data-driven around one graph data truth about the enterprise — not many different and overlapping truths across disparate silos — is already taking place. Enter the Digital Twin of an Organization (DTO).

Learn how a DTO supports the discovery of cost-optimization opportunities that deliver the most value without negatively impacting other entities in the organization, how it visualizes the interdependence between functions, processes and key performance indicators to drive value, and more. The DTO is to enterprise data governance what search was to the internet.
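
As a toy illustration of the “one graph” idea (the entities, relationships and library choice below are invented for the example, not Ardoq’s model or API), a small directed graph can already answer impact questions such as “what would be affected by a change to this data set?”:

# Toy sketch of a single graph of the enterprise: people, processes,
# applications and data connected in one structure so that impact questions
# can be answered by traversal. Entities and edges are invented.
import networkx as nx

G = nx.DiGraph()
# Edge direction: X depends_on Y
G.add_edge("Process: Order-to-Cash", "App: Billing System")
G.add_edge("Process: Order-to-Cash", "App: CRM")
G.add_edge("App: Billing System", "Data: Customer Master")
G.add_edge("App: CRM", "Data: Customer Master")
G.add_edge("KPI: Days Sales Outstanding", "Process: Order-to-Cash")

# Impact analysis: everything that transitively depends on the customer
# master data, i.e. would be affected by a change to it.
affected = nx.ancestors(G, "Data: Customer Master")
print(sorted(affected))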

13:05 – 13:25
Is your Company Ready for Self-Service BI?

Ivan Schotsmans, Data Architect, Agile Information Factory

Ivan Schotsmans

Data Architect

Agile Information Factory

With over 30 years of experience in various data warehouse and data governance programs across multiple industries, Ivan Schotsmans serves as Data Coordinator at an agricultural company, one of the largest in Western Europe. After two decades working on international data warehousing programs as a developer, analyst and project manager, Ivan evolved into a data governance role, in which he is responsible for the strategy and architecture of several data migration programs. Ivan is a pioneer in turning business intelligence environments into agile projects; recently he has brought these concepts into big data initiatives. Ivan is a (co-)founder and member of several global organizations (TDWI Benelux Chapter, DAMA, IAIDQ) and international conferences.

Edge Analytics and Client-Side Machine Learning

Timo Kunz, Data Scientist, Catawiki

Timo Kunz

Data Scientist

Catawiki

Timo has spent over a decade in the retail industry, mainly working with and researching pricing and promotion related topics. He is currently a Data Scientist for Catawiki where he focuses on customer behaviour modelling, customer analytics, personalisation, and customer value. His previous experience includes working or consulting for companies such as Yoox Net-A-Porter, Morrisons Supermarkets, Dansk Supermarked, Boots, Swiss Coop, LVMH, SAP, and Simon Kucher & Partners. He holds a PhD in Management Science from Lancaster University and has published in journals such as Decision Support Systems and the Journal of Revenue & Pricing Management.

How to Lower Costs Using IoT Supported by AI

Majken Sander, Independent Consultant

Majken Sander

Independent Consultant

Majken Sander is a data nerd, business analyst and solution architect. She is well-known in the industry circles as an influential industry executive, international speaker, and accomplished data expert. Majken has worked in IT, management information, analytics, business intelligence, and data warehousing for more than 20 years.  As a tech evangelist Majken often writes on topics like Data Warehouse Architecture & Automation, BI and Analytics, Business value in data and decision support.

Forget About BI and EA. Digital Twin of an Organization (DTO) is Already Transforming Both

Petteri Vainikka, CMO, Ardoq

Petteri Vainikka

CMO

Ardoq

Petteri Vainikka is CMO at Ardoq, and is a frequent speaker at industry events globally. Petteri has an MSc in technology from Aalto University in Helsinki, a degree in higher education pedagogics, and a PhD currently on hold. Throughout his 15+ year professional career spanning mobile, internet, data, and enterprise SaaS technologies, he has always found himself at the intersection of emerging technology and its hands-on commercial application.

He has spent his most recent seven years at the forefront of the Data Management Platform (DMP) evolution from a buzzword to a globally recognized billion-dollar enterprise SaaS market. He is driven by a genuine passion to help customers understand and succeed within rapidly changing ecosystems catalyzed by new technology.

Petteri’s past work experience ranges across entrepreneur, business development, product management, academic, and general management roles at Sumea, Digital Chocolate, Rovio (Founder and Head of Sales and Marketing), Aalto University, Leiki, Enreach (CMO), and most recently Cxense ASA (CXENSE:Oslo), where he served as SVP Product Management. Petteri has a proven track record of launching and scaling new technology companies internationally, whilst at all times remaining customer experience obsessed in doing so.

Securing Business Data - Business Driven Security

W.T. Bush, Business Consultant, Grayson Industries

A key principle in the migration, maintenance and archival of enterprise data is security. The approach to security, and the integration of this approach with the other streams of data activities (data migrations, data cleansing and ongoing data management), completes the necessary processes and procedures for effective Enterprise Data Management.

There are many technical descriptors and techniques for protecting against the Big 5 attributes of a security envelope:

  • Confidentiality
  • Integrity
  • Authentication
  • Authorisation
  • Non-Repudiation

This discussion is less about the technical elements of implementing security, and more about the enterprise view of how to treat different types of data within the enterprise so that the technical solutions can be applied.

The building blocks of BI rest on the raw data of an organisation. The more aligned this data is (column alignment), the easier it is to analyse and report on.

A business view of data is often in more general terms: Master Data (Customers, Vendors, Assets, etc.), Transactional Data (Sales, Purchases, Use of Service, etc.), Reference Data (Organisational, Drop Downs, etc.) and unstructured data (email, social media, etc.). The more characteristics (attributes) these data objects contain, the more analysis can be performed.

Cataloguing the data in an organisation ultimately translates into a matrix of categories and owners that we apply technical security to. Business engagement in the process of reviewing and accepting responsibility for data at the attribute level will support a solid platform for applying security that can be demonstrated to internal and regulatory auditors.
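
A hedged sketch of that matrix idea follows; the categories, owners and control mappings are invented placeholders, not the speaker’s framework. Once each data category has an accountable owner and a classification, the technical controls can be derived and evidenced mechanically.

# Minimal sketch of a data catalogue as a matrix of categories, owners and
# classifications to which technical security is then applied.
# Categories, owners and control mappings are invented placeholders.
CATALOGUE = {
    "Customer Master":    {"type": "master data",        "owner": "Head of Sales",    "classification": "confidential"},
    "Sales Transactions": {"type": "transactional data", "owner": "Finance Director", "classification": "internal"},
    "Country Codes":      {"type": "reference data",     "owner": "Data Office",      "classification": "public"},
}

# The classification drives the technical controls that get applied.
CONTROLS = {
    "confidential": {"encryption": True,  "access": "role-based, named approvers"},
    "internal":     {"encryption": True,  "access": "role-based"},
    "public":       {"encryption": False, "access": "open"},
}

def controls_for(dataset: str) -> dict:
    """Look up the controls a dataset should have, based on its classification."""
    entry = CATALOGUE[dataset]
    return {"owner": entry["owner"], **CONTROLS[entry["classification"]]}

for name in CATALOGUE:
    print(name, "->", controls_for(name))

The point of the sketch is the shape of the artefact, a category-to-owner-to-control matrix, which is what internal and regulatory auditors can be walked through.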

Tell your Story with Data

Hylke Peek, Consultant BI and Data Analytics, VX Company

Let the data speak for itself? The data needs some help with this. The visualizing process is getting more complex. There are many tools, different visuals for the same goal and a diverse audience. If you want to communicate effectively using visualizations, you need to know your audience, understand the context and think like a designer.

We know all the cheat sheets for the best visualizations and how to use colours. In this session, we’ll go further. A report is more than a collection of individual – well formatted – visuals. It’s a story you want to tell. Actually, it’s a story you want the data to tell.

  • Understand the context and audience
  • Design a (set of) report(s) as one story
  • Process of designing visualizations

Laying the Foundations Towards a Data-Driven Future

Jon Evans, Equillian & Sarah Whittle, Data Manager, LiveWest

Formed from the recent merger of Knightstone and Devon & Cornwall Housing, LiveWest is embarking on a journey of transformation to become a housing association of the future – an organisation that puts data at the heart of everything it does, from managing its stock as efficiently as possible to delivering the very best service to its tenants.

Developing a strategic approach to data within a very traditional sector, coupled with the added complexity of bringing together two organisations with differing cultures, processes and data landscapes, would present a number of challenges.

In this case study, LiveWest’s Data Manager, Sarah Whittle, will be joined by Jon Evans, the founder of Equillian, to describe the journey so far. Delegates will learn how LiveWest:

  • Created a “data vision” to set out its long-term aspirations for data in a way that would excite and enthuse senior stakeholders
  • Conducted a maturity assessment to understand the current approach to managing data and how this differs across the merged organisation
  • Developed a roadmap for improving its core data management capabilities and laying the foundations towards a data-driven future

Making Data Mainstream: Establishing a Data Function and Selling the Opportunities it Brings to a Commercial Organisation

Amy Balmain, Head of Data Exploitation, Southern Water

For a long time data has been seen as just a by-product of business processes. Now, established and emerging tools and techniques allow organisations to unlock the real value of data to make significant improvements in operational performance. Attendees will learn:

  • Context on why a dedicated data team is a must for business
  • How to set one up without importing expensive specialist skills or external consultants
  • The significance of the commercial opportunities

13:30 – 14:15
Securing Business Data - Business Driven Security

W.T. Bush, Business Consultant, Grayson Industries

W.T. Bush

Business Consultant

Grayson Industries

W. T. Bush is a business consultant with 27 years of experience in delivering positive business outcomes. For the last 17 years, he has focused on engaging business leaders in taking ownership of their data and actively engaging in the outcomes of transformation programs. Mr. Bush has delivered data solutions for some of the largest enterprises on the planet including PepsiCo Europe, British American Tobacco, Pfizer, Israel Chemicals and many more pharmaceutical, FMCG, CPG and retail clients.

As a stepping stone for BI and Analytics, it is essential that data is set up and managed efficiently. The key to getting value out of data, during either a transformation project or just BAU, is to get the business to engage and drive outcomes. W.T. Bush is an author of business-focused books geared to do just that: putting technical activities into business language. Armed with these tools, any business can engage confidently with a Strategic Integrator to ensure that this most precious asset, data, is given the energy and structure required to make it fully utilised.

Tell your Story with Data

Hylke Peek, Consultant BI and Data Analytics, VX Company

Hylke Peek

Consultant BI and Data Analytics

VX Company

Hylke Peek is a consultant in data solutions and a specialist in Business Intelligence and Analytics. He has consulted for a wide variety of companies on enterprise business intelligence, Big Data processing and analytics, and machine learning, both on-premise and in the cloud. Hylke has spoken at multiple events and on IT channels to share his experience in the field. He loves working with data and sharing his experience with other data-minded individuals and organisations. Follow Hylke on Twitter: @hylkepeek.

Laying the Foundations Towards a Data-Driven Future

Sarah Whittle, Data Manager, LiveWest

Sarah Whittle

Data Manager

LiveWest

Sarah Whittle is Data Manager for LiveWest, a large housing association based in South West England, where she is responsible for implementing data governance and data quality frameworks and changing the culture around data. She started her early career in accountancy and in 1992 began working for a housing association in the finance department. She has always had an interest in data and led the association’s data migration and implementation of new financial and housing management systems. Prior to moving to LiveWest she worked for 12 years at HouseMark, the sector’s leading provider of benchmarking and performance improvement services, where she further developed her knowledge of data quality and validation techniques.

Jon Evans, Information Strategist & Founder, Equillian

Making Data Mainstream: Establishing a Data Function and Selling the Opportunities it Brings to a Commercial Organisation
The Analytics Factory

Jos van Dongen, Tholis Consulting  

Data science, machine learning and AI are hot topics in today’s analytics landscape. Many breakthroughs have been made due to advances in algorithms and computing power. Organizations adopt these new technologies and use them to their own advantage to improve sales, customer interactions or internal processes. But they are also facing new challenges when developing and deploying analytics solutions. Not all solutions are easily scalable or can be deployed and run in an automated way, and more often than not point solutions are created using different tools on platforms which are hard to maintain. Moreover, the performance of analytical models degrades over time, requiring a different type of monitoring and maintenance (a small monitoring sketch follows the list below). This session shows how to overcome these challenges and will address the following topics:

  • How to industrialize analytical development processes for sustainable results
  • Pros and cons of lambda, kappa and other architectures
  • How to design, build and scale your analytics factory
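
One small illustration of the monitoring point above (the model, data and accuracy floor are invented for the example, not the speaker’s tooling): score the model on recent labelled batches and flag when accuracy drifts below an agreed floor, which becomes the trigger for retraining in an industrialised pipeline.

# Minimal sketch of model monitoring in an "analytics factory": score recent
# labelled batches and flag degradation as the trigger for retraining.
# The model, data and threshold are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Train once on historical data where the label depends on x0 + x1.
X_train = rng.normal(size=(500, 3))
y_train = (X_train[:, 0] + X_train[:, 1] > 0).astype(int)
model = LogisticRegression().fit(X_train, y_train)

ACCURACY_FLOOR = 0.8

def monitor(batch_X, batch_y):
    """Return True if the model still meets the agreed accuracy floor."""
    acc = accuracy_score(batch_y, model.predict(batch_X))
    print(f"batch accuracy: {acc:.2f}")
    return acc >= ACCURACY_FLOOR

# A fresh batch from the same distribution passes...
X_new = rng.normal(size=(200, 3))
monitor(X_new, (X_new[:, 0] + X_new[:, 1] > 0).astype(int))

# ...but when the underlying relationship drifts, the monitor flags a retrain.
X_drift = rng.normal(size=(200, 3))
if not monitor(X_drift, (X_drift[:, 0] - X_drift[:, 1] > 0).astype(int)):
    print("accuracy below floor -> schedule model refresh / retraining")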

Avoiding Data Warehousing Failure - Experiences Building a Logical Data Warehouse

Norbert Eschle, Enterprise Data Architect, Direct Line Group

In 2005, Gartner reported that about 50% of data warehousing projects fail. In 2011, they stated that between 70% and 80% of Business Intelligence projects fail. Online publications estimate a big data project failure rate at up to 85%. Despite all of this, business functions still need an enterprise view of business data.

The high cost and effort of moving and integrating data from a wide variety of data sources seems to be a key contributor to these failures both due to cost and lack of agility.

In this talk, Norbert will describe how his organisation is looking to avoid such pitfalls and challenges by applying alternative approaches and move towards the concept of a logical data warehouse (LDW). In this talk, attendees will learn of:

  • The architectural approach taken to building an LDW
  • The outline operational and governance capabilities required
  • The business benefits from some of the key architecture decisions taken
  • The journey to date and pitfalls avoided

Make Insights a Team Sport with Data and AI

Lena Woolf, Senior Technical Staff Member, IBM

Why are enterprises struggling to capture the value of AI? AI is not magic; it requires teamwork. AI is algorithms + data + team.
But data resides in silos and is difficult to access. If the data isn’t secure, self-service isn’t a reality. Enterprises need an environment that enables a “fail fast” approach and provides governed access to data. They also need to provide data scientists, application developers and subject matter experts with a set of tools to collaboratively and easily work with data and use that data to build, train and deploy models at scale. In this session we will outline best practices for setting up an end-to-end AI workflow and enabling team productivity and collaboration.

The audience will find answers to the following questions:

  • What can AI do for my business?
  • How do I infuse AI into the business to drive innovation?
  • I hired a team of data scientists – what’s next?
  • How can our data scientists gain access to the right data?

Fact Oriented Modelling

Marco Wobben, Consultant, BCP Software

Data modellers interview business experts, study piles of requirements, talk extensively, and then, hocus pocus, present a diagram with boxes, crow’s feet, arrows, and so on. Such data models can be quite abstract, misunderstood, and perceived as unnecessary.

Fact oriented modelling is the very opposite of abstract, using natural language to express facts that are intelligible to both business and IT. It does not require an understanding of the magical language of boxes and arrows. Although fact oriented models can be presented in several diagramming notations, the information can always be expressed in natural language. This gives data modellers, technically skilled people, and business people the benefit of having a fully documented, and easily validated model.

14:20 – 15:05
The Analytics Factory
Avoiding Data Warehousing Failure - Experiences Building a Logical Data Warehouse
Make Insights a Team Sport with Data and AI

Lena Woolf, Senior Technical Staff Member, IBM

Lena Woolf

Senior Technical Staff Member

IBM

Lena Woolf is an IBM Senior Technical Staff Member on the IBM Watson Knowledge Catalog team. She helps businesses transform and improve their data science and analytics life-cycle processes by leveraging Watson Data and AI software. Prior to joining the Watson organization, Lena provided architectural direction to the Master Data Management team. Lena regularly speaks at professional conferences. As an inventor, she has contributed to many patents and constantly pushes the boundaries of what’s possible with IBM technology. She has a demonstrated history of effectively transforming customer challenges and business requirements into enterprise-grade software.

Fact Oriented Modelling

Marco Wobben, Consultant, BCP Software

Marco Wobben

Consultant

BCP Software

An expert in the fact-oriented modelling community, Marco Wobben has been consulting on and contributing to IT projects for nearly 30 years. He is active in a wide range of industries, including medical, banking, logistics, tourism, manufacturing, and government.

15:05 – 15:30
Break & Exhibits
Plenary Keynote: Ethics Schmethics: Hype or Hope?

Daragh O Brien, Castlebridge

Information Management is at a tipping point. The tools and technologies we have developed have great potential, but bring with them great risks. This is increasingly recognised by industry leaders, front-line workers, legislators, and Regulators. In this keynote, you will get a whistle-stop tour through how we got here, why it matters, what it means, and what we can do to ensure that how we manage and use information in the 21st century is trusted and trustworthy, and what lessons we need to learn from the past.

15:30 – 16:15
Plenary Keynote: Ethics Schmethics: Hype or Hope?

Daragh O Brien, Leading Consultant, Educator and Author, Castlebridge

Daragh O Brien

Leading Consultant, Educator and Author

Castlebridge

Daragh O Brien is a leading Consultant, Educator and Author and the Managing Director of Castlebridge Associates.  He is a well known and respected expert in the field of data ethics, data privacy and data governance.  Daragh has co-authored a book about Information Ethics which will be published in 2018 by Kogan Page.

Conference Close

Rick van der Lans & Jan Henderyckx

16:15 – 16:30
Conference Close
Thursday, 22 November 2018, Post Conference Full Day Workshops
Jumpstart your Enterprise Data Initiatives and Keeping Them on the Right Track

Jan Henderyckx, Partner, BearingPoint

You are convinced that data can bring value to your organisation, and your company might have already started the data journey. In reality many data lake and big data initiatives fail because there is more to it than just putting large quantities of data in a data lake. This workshop will cover the key challenges that organisations face when embarking on the data journey. You will understand what is required for data-based value creation and know how to put the relevant building blocks in place to assure that your organisation interacts with the data in the most effective and efficient way. Combining risk, privacy, security and value creation, rather than addressing them separately, is the way that leads to proper management of the data. You will also learn how to maximise the use of the data by capturing the usage constraints of the data as early as possible.

Attendees will learn:

  • What the key building blocks of a data strategy consist of
  • Learn how to engage your business and have them take the lead and recognise the value of information.
  • Learn how to adapt the organisation to make it information-centric
  • Learn how to establish an information governance organisation
  • What kind of information governance styles can be applied
  • Which organisational structure is most suited for creating data value
  • How to best integrate with an analytics roadmap
  • How tooling can support the value delivery
  • Learn how to manage speech communities and business vocabularies
  • Learn how to align your IT with your information strategy
  • Get more value out of your MDM projects
  • Learn how to redefine your Business Intelligence architecture
  • Learn how to get the benefits of Big Data
  • Select the proper Enterprise Information platform to support your information strategy.

New Big Data Storage Technologies: From Hadoop to Graph Databases, and from NoSQL to NewSQL

Rick van der Lans, R20/Consultancy

Big data, analytical database servers, Hadoop, NoSQL, Spark, MapReduce, SQL-on-Hadoop, translytical databases, and appliances are all immensely popular terms in the IT industry today. Due to this avalanche of new developments, it’s becoming harder and harder for organizations to select the right tools. Which technologies are relevant? Are they mature? What are their use cases? Are they worthy replacements for the more traditional SQL products? How should they be incorporated in the existing data warehouse architecture? These are all valid but difficult-to-answer questions.

This full-day workshop explains these new data storage technologies clearly and discusses why and how they can be relevant for any organization. Market overviews are presented, strengths and weaknesses are discussed, and guidelines and best practices are given. It is intended for anyone who has to stay up to date with and implement the new developments, including data warehouse designers, business intelligence experts, database specialists, database experts, consultants, and technology planners.

  • Why is traditional database technology not “big” enough?
  • How different are Hadoop and NoSQL from traditional technology?
  • How can new and existing technologies such as Hadoop, NoSQL, and NewSQL help develop BI and big data systems?
  • Embedding Hadoop technologies in existing BI systems
  • Using Spark to boost performance for analytics
  • Three NoSQL subcategories: key-value, document, and column-family stores (see the sketch after this list)
  • Why graph databases are very different from all other systems
  • When to use NewSQL or NoSQL for developing transactional systems
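
For readers new to the NoSQL subcategories listed above, the sketch below shows how the same customer record might be shaped in a key-value store, a document store and a column-family store. Plain Python structures stand in for the real products; no vendor APIs are used and the record itself is invented.

# The same customer record shaped for three NoSQL subcategories.
# Plain Python structures stand in for the real products.

# Key-value store: an opaque value behind a single key, lookup by key only.
key_value = {
    "customer:42": '{"name": "Ada Lovelace", "city": "London", "orders": [1001, 1002]}'
}

# Document store: the value is a nested, queryable document.
document = {
    "_id": 42,
    "name": "Ada Lovelace",
    "address": {"city": "London", "country": "UK"},
    "orders": [{"id": 1001, "total": 120.0}, {"id": 1002, "total": 80.5}],
}

# Column-family store: rows grouped into column families; columns can vary per row.
column_family = {
    "customers": {                      # column family
        42: {"profile:name": "Ada Lovelace",
             "profile:city": "London",
             "orders:1001": 120.0,
             "orders:1002": 80.5},
    }
}

print(key_value["customer:42"])
print(document["address"]["city"])
print(column_family["customers"][42]["profile:city"])

The differences in shape drive the trade-offs the workshop covers: what you can query, how you scale, and how much structure the store understands.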

Integrating Fast Data, Edge Analytics and Operational Decision Management into Your BI Environment

Jos van Dongen, Principal Consultant, Tholis Consulting

For many years companies have been building data warehouses and data marts supplied with data extracted from traditional online transaction processing systems using batch-oriented ETL processing. However, over recent years, as the speed of business quickens, we have seen a huge growth in demand for so-called ‘fast data’ (also known as streaming data) to be brought into the enterprise for analysis. There are many popular sources of fast data, including telemetry in business operations where organisations have often been blind to problems in the past, and Internet of Things (IoT) data from smart consumer products and smart industrial equipment. Live clickstream recording of online behavior as people surf your website is yet another example, as is data generated from pixels and tags embedded in digital marketing. Others include market data and social media data. All of this is relatively new in the world of analytics.

The benefits of analyzing fast data are obvious. It allows organisations to go beyond traditional BI and start analyzing ‘in-flight’ data in real time before it is stored anywhere. This allows organisations to optimize their business operations, avoid unplanned operational cost, mitigate risk and see opportunities early so that they can become early movers in real-time markets. However, there are issues with this kind of data. It can be huge in volume. It never stops! Schema can change and new schema can appear without notice. Data can be missing, arrive out of sequence or arrive in huge bursts.

Also, how do you process it? How do you analyse it? What kinds of analyses are relevant with this kind of data? What technologies do you need? Where do you analyse it? Should it be done at scale in the data centre or in the cloud? Why not at the edge, closer to where the data is being generated? With so much data being generated and much more to come, would pushing analytics into the network not scale better than streaming analytics at the centre? If so, what problems does that bring? Also, what about decision management? How do you automate decisions? Can machine learning help? And how do you integrate all this with traditional BI environments? (A minimal sketch of in-flight analysis follows the outline below.)

This session looks at this problem and discusses the following:

  • What is fast data?
  • Streaming versus real-time versus right-time analytics
  • Prevention and opportunity – use cases for streaming analytics
  • The characteristics of fast data and the challenges that these bring
  • How do you process fast data and integrate it with enterprise data?
  • Why is master data important in fast data analytics?
  • What is time series analysis?
  • Approaches to streaming analytics – in-flight analysis versus fast write processing and near real-time post write analysis
  • Technologies options for analyzing fast data:
    • Scalable messaging like Kafka
    • Streaming analytics platforms
    • Fast write NoSQL data stores and file systems
    • Hadoop and cloud options
    • In-memory processing e.g. MemSQL, Apache Spark, Apache Flink
    • Coding vs configuring
    • Build vs buy vs assemble
  • Developing prescriptive machine learning models for real-time analysis
  • Building streaming analytic application pipelines using ETL tools
  • Model development & training – in stream vs in memory
  • Model deployment – central execution, edge analytics or both?
  • Model management and monitoring in a fast data world
  • Combining models & business rules in real time decision management systems
  • Architecture – how should it change to integrate fast data with existing analytical systems such as data warehouses and data marts?
  • How to integrate fast data with traditional BI tools?
  • Getting started – what do you need to do? Do’s and don’ts
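
The sketch referenced above is a minimal, hedged example of in-flight analysis: events are consumed from a Kafka topic and a simple rule raises an alert before anything is written to storage. The topic name, broker address, event shape and threshold are illustrative assumptions; it requires the kafka-python package and a reachable Kafka broker, and stands in for the scalable messaging and streaming platforms the workshop compares.

# Minimal sketch of in-flight analysis on fast data: consume events from a
# Kafka topic and raise an alert before anything is persisted.
# Topic, broker, event shape and threshold are illustrative assumptions.
import json
from collections import deque
from kafka import KafkaConsumer   # kafka-python package

consumer = KafkaConsumer(
    "sensor-readings",                      # assumed topic
    bootstrap_servers="localhost:9092",     # assumed broker
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

window = deque(maxlen=100)                  # sliding window of recent values

for message in consumer:                    # never stops: fast data keeps coming
    value = message.value["temperature"]    # assumed event shape
    window.append(value)
    rolling_avg = sum(window) / len(window)
    if value > rolling_avg * 1.5:           # simple in-flight rule
        print(f"ALERT: {value} is 50% above the rolling average {rolling_avg:.1f}")

In a production pipeline the rule would typically be a trained model or a managed rule set, and the alert would feed a decision management system rather than a print statement.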

09:00 – 16:30
Jumpstart your Enterprise Data Initiatives and Keeping Them on the Right Track
New Big Data Storage Technologies: From Hadoop to Graph Databases, and from NoSQL to NewSQL
Integrating Fast Data, Edge Analytics and Operational Decision Management into Your BI Environment
10:30 - 10:45
Morning Break
12:15 - 13:15
Lunch
14:45 - 15:00
Afternoon Break

A PDF of the full agenda, including comprehensive session details and speaker information, is available to download.

Fees

  • 4 Days: £1,945 + VAT (£389) = £2,334
  • 3 Days: £1,595 + VAT (£319) = £1,914
  • 2 Days: £1,245 + VAT (£249) = £1,494
  • 1 Day: £795 + VAT (£159) = £954
Group Booking Discounts
  • 2-3 delegates: 10% discount
  • 4-5 delegates: 20% discount
  • 6+ delegates: 25% discount

UK Delegates: Expenses of travel, accommodation and subsistence incurred whilst attending this IRM UK conference will be fully tax deductible by the employer company if attendance is undertaken to maintain professional skills of the employee attending.

Non-UK Delegates: Please check with your local tax authorities

Cancellation Policy: Cancellations must be received in writing at least two weeks before the commencement of the conference and will be subject to a 10% administration fee. It is regretted that cancellations received within two weeks of the conference date will be liable for the full conference fee. Substitutions can be made at any time.

Cancellation Liability: In the unlikely event of cancellation of the conference for any reason, IRM UK’s liability is limited to the return of the registration fee only. IRM UK will not reimburse delegates for any travel or hotel cancellation fees or penalties. It may be necessary, for reasons beyond the control of IRM UK, to change the content, timings, speakers, date and venue of the conference.

Venue

  • Radisson Blu Portman Hotel
  • 22 Portman Square
  • London W1H 7BG
  • UK

Platinum Sponsors

Standard Sponsors

Supported By

Association of Enterprise Architects   

The Association of Enterprise Architects (AEA) is the definitive professional organization for Enterprise Architects. Its goals are to increase job opportunities for all of its members and increase their market value by advancing professional excellence, and to raise the status of the profession as a whole.

BCS Data Management Specialist Group (DMSG)

The BCS Data Management Specialist Group (DMSG) helps Data Management professionals support organisations to achieve their objectives through improved awareness, management, and responsible exploitation of data.
We run several events each year whose focus areas include:
  • The benefits of managing data as an organisational asset
  • Skills for exploitation of data
  • Data governance as a ‘Business As Usual’ activity
  • Compliance with legislation, particularly that relating to data protection, data security and ethical usage of data
Our audience is anyone with an interest in the benefits to be gained from data. This includes: Chief Data Officers (CDO); Senior Information Risk Officer (SIRO); data managers/stewards; data governance officers; data protection/security advisors; data scientists; and business/data/database analysts.

DGPO

The Data Governance Professionals Organization (DGPO) is an international non-profit, vendor-neutral association of business, IT and data professionals dedicated to advancing the discipline of data governance. The DGPO provides a forum that fosters discussion and networking for members and seeks to encourage, develop and advance the skills of members working in the data governance discipline.

EDM COUNCIL

About the EDM Council
The EDM Council is a neutral business forum founded by the financial industry to elevate the practice of data management as a business and operational priority. The prime directive is to ensure that all consumers (business and regulatory) have trust and confidence that data is precisely what is expected without the need for manual recalculation or multiple data transformations. There are four programs of the Council:
  • Data Content Standards (FIBO): the standards-based infrastructure needed for operational management (identification, semantic language of the contract, classification). We own the industry ontology for financial instruments and entity relationships and make it available as an open source standard
  • Data Management Best Practices (DCAM): the science and discipline of data management from a practical perspective (data management maturity, data quality, benchmarking)
  • Data Implications of Regulation: translating the legislative objectives of transparency, financial stability, compressed clearing and cross-asset market surveillance into regulatory objectives and practical reporting requirements
  • Business Network: global meeting ground, CDO Forum and mechanism for sustainable business relationships
There are 135 corporate members of the Council (http://www.edmcouncil.org/councilmembers). We are governed by a board of 24 (http://www.edmcouncil.org/board). For more information visit www.edmcouncil.org.

BearingPoint

BearingPoint is an independent consulting firm with European roots and global reach. We transform businesses. We are committed consultants with adaptive intelligence. We know that the world changes constantly, so we offer real, tailored solutions for complex business environments.

Our value proposition: We help our clients realise their goals by applying our deep industry and functional expertise to understand and address their specific needs. Our consultants are passionate and highly engaged with a pragmatic yet innovative mindset.

Our culture: Passion drives everything we do. We live by our values:
  • Commitment to our clients and our people
  • Excellence in all our work
  • Teaming to achieve greater results
Stewardship is at our core. We strive to develop our people and build a firm that will remain strong for future generations.

Our distinction: As business consultants, we help clients make transformation real by delivering measurable and sustainable results:
  • We listen, understand and adapt to our clients’ needs
  • We help our clients implement strategies through operational focus
  • We combine management and technology capabilities to achieve the full value potential
  • We are independent advisors
  • We work closely with our clients and help them to embrace change
  • With passion for excellence we serve clients globally
To get there. Together.

Comma Group

Comma was formed with a very clear vision: to deliver a truly standout approach to data and information management. We remain focused on business outcomes and delivering value to our clients. Our mission is to be recognised globally as the leading consultancy exclusively focused on data and information excellence. Remaining concentrated on PIM, MDM and supporting services allows us to excel in this area and provide a service that puts our clients’ needs first and delivers truly outstanding results.

Media Partners

ECCMA

Formed in 1999, the Electronic Commerce Code Management Association (ECCMA) has brought together thousands of experts from around the world and provides a means of working together in a fair, open and extremely fast internet environment to build and maintain global, open-standard dictionaries used to unambiguously label information without losing meaning. ECCMA works to increase the quality and lower the cost of descriptions through developing International Standards. ECCMA is the original developer of the UNSPSC, the project leader for ISO 22745 (open technical dictionaries and their application to the exchange of characteristic data) and ISO 8000 (information and data quality), as well as the administrator of the US TAG to ISO TC 184 (Automation systems and integration), TC 184 SC4 (Industrial data) and TC 184 SC5 (Interoperability, integration, and architectures for enterprise systems and automation applications), and the international secretariat for ISO TC 184/SC5. For more information, please visit www.eccma.org.

IQ International

IQ International (abbreviated as IQint), the International Association for Information and Data Quality, is the professional association for those interested in improving business effectiveness through quality data and information. All, including full-time practitioners, those impacted by poor data and information quality, and those who just want to learn more, are welcome!

IT-LATINO.NET

IT-latino.net is the most important online Hispanic IT media network. With more than 120,000 registered users, it has become an important online IT business forum, organizing daily webinars and conferences on different technology issues. We regularly inform a strong IT community on both sides of the Atlantic: Spain and Latin America.

Modern Analyst

ModernAnalyst.com is the premier community and resource portal for business analysts, systems analysts, and other IT professionals involved in business systems analysis. Find what you need, when you need it. The ModernAnalyst.com community provides Articles, Forums, Templates, Interview Questions, Career Advice, Profiles, a Resource Directory, and much more, to allow you to excel at your work. From junior analysts to analysis managers, whether you integrate off-the-shelf products, perform systems analysis for custom software development, or re-engineer business processes, ModernAnalyst.com has what you need to take your career to the next level.

Silicon UK

Silicon UK is the authoritative UK source for IT news, analysis, features and interviews on the key industry topics, with a particular emphasis on IoT, AI, cloud and other transformative technologies. The site is your guide to the business IT revolution, offering other resources such as jobs, whitepapers and downloads alongside its coverage. Stay informed: register for the newsletters.

Via Nova Architectura

A number of thought leaders in the area of business and IT architectures have set up a digital magazine on architecture: Via Nova Architectura. Although it started as an initiative within the Netherlands, the magazine should reach all those interested in the area of architecture, wherever they live. Via Nova Architectura aims to provide an accessible platform for the architecture community. It is meant to be the primary source of information for architects in the field. The scope of Via Nova Architectura is “digital” architecture in the broadest sense of the word: business architecture, solution architecture, software architecture, infrastructure architecture or any other architecture an enterprise may develop to realize its business strategy.

The Data Governance Institute

The Data Governance Institute (DGI) is the industry’s oldest and best known source of in-depth, vendor-neutral Data Governance best practices and guidance. Since its introduction in 2004, hundreds of organizations around the globe have based their programs on the DGI Data Governance Framework and supporting materials. www.DataGovernance.com

Follow us on Twitter:

@DGIFramework https://twitter.com/DGIFramework

Connect with us on LinkedIn:

The Data Governance Institute (DGI) https://www.linkedin.com/company/480835

Nederlands Architectuur Forum (NAF)

The mission of the Netherlands Architecture Forum (NAF) is to promote and professionalise the use of architecture in the world of business and IT. Since its foundation in 2002, the NAF has created a network of over 70 Dutch user organisations, vendors and knowledge institutes. It offers its members opportunities for networking with fellow architects, exposure through publications, and knowledge exchange in its working groups.
NAF working groups bring together professionals on such diverse topics as architecture principles, Web 2.0 architecture, Cloud & SaaS, TOGAF, IT governance, and many more. Furthermore, NAF is organiser of the NAF Insight seminars and co-responsible for the yearly Dutch national architecture congress (LAC).