Agenda Highlights

Just a few of the Case Studies include:

Key Factors in Successful Data Governance, Michael Bendixen, Data Governance Manager, Grundfos

Establishing Master Data in a Federated & Outsourced Environment, Alex Bähr, Director of Global Supply Chain & Sustainability – Process & Data COE, McDonald’s

The Swan Lake of Data – Data Governance with New Enabling Technologies, Adam Preston, Chief Data Officer, Santander & Nic Gordon, Associate Director, BCG

Implementing Master Data Governance in Large Complex Organisations, Charlotte Gerlach Sylvest, Senior Master Data Governance Manager, Coloplast

Building a Sustainable Data Governance Ecosystem, Dhivya Venkatachalam, Head of Data Governance, Schroder Investment Management

Fees

  • 4 Days: £1,945 + VAT (£389) = £2,334
  • 3 Days: £1,595 + VAT (£319) = £1,914
  • 2 Days: £1,245 + VAT (£249) = £1,494
  • 1 Day: £795 + VAT (£159) = £954
Group Booking Discounts
  • 2-3 Delegates: 10% discount
  • 4-5 Delegates: 20% discount
  • 6+ Delegates: 25% discount

Venue

  • Radisson Blu Portman Hotel
  • 22 Portman Square
  • London
  • W1H 7BG
  • UK

Agenda

Monday 15 May 2017 : Pre-Conference Workshops
08:30 - 09:30
Registration
09:30 - 17:15
MDM & RDM "Quick Start" Workshop

Here’s an excellent opportunity to improve your success as an enterprise/data/solutions architect or other IT professional embarking upon your first MDM or Data Governance initiative.  During this fast-paced workshop, you’ll learn first-hand the best practice insights every IT professional must know to fast-track success and minimize risk. This is your pre-conference opportunity to meet with the “Godfather of MDM” to ask the questions and set your own personalized agenda to maximize your conference experience.
The speaker’s reputation for cutting through the hype to deliver a no-nonsense view of what you need to know will provide insights into proven approaches to delivering business value along with the insiders’ view of strategic implications of these fast-evolving technologies.
Combining presentations and case studies, this power session’s proven agenda is practical, personal and uniquely tailored on-site to the needs of the participants.  The speakers will share real world insights from surveys and discussions with over 1,500 MDM programs to provide guidance concerning:
  • Initiating a successful MDM, RDM and/or MDG program
  • Convincing the business to take a leadership role with the goal of delivering measurable ROI
  • Choosing the right MDM, RDM and/or MDG solutions despite a rapidly churning market — multi-domain MDM, reference data management, hierarchy management, identity resolution, big data, social MDM, semantic databases and more

MDM - a Best Practice Guide to Design and Implementation

This workshop focusses on the end-to-end implementation of master data management and tries to address the hardest problems that arise in an MDM project. It looks at the broader picture of information governance, data quality and metadata management before applying these to an MDM project. It also addresses design issues such as inbound integration to consolidate master data when it is scattered across many different data sources, and the outbound synchronisation of it to supply both operational and analytical systems. It also looks at master data virtualisation when you have a hybrid state in which some master data is consolidated and some is not.  In particular it looks at what needs to be considered when dealing with data integration and data synchronisation to achieve best practice in design and implementation. The session covers the following:
  • An introduction to data governance
  • Introducing a shared business vocabulary
  • Metadata management
  • Enterprise data quality and data integration
  • The main approaches to implementing MDM
  • What kind of MDM system are you building? – a System of Record, a Centralised Master Data Entry System, or both?
  • Understanding master data maintenance in your enterprise
  • Best practices in designing master data consolidation
    • Data capture techniques
    • The benefits of standardising inbound data to an MDM system
    • Should history be kept in an MDM system?
    • Approaches to cleansing, and matching
    • Consolidation vs. virtualising master data to create an MDM system
    • Enriching master data using Big Data Analytics
    • Matching at scale – leveraging Hadoop and HBase for scalable master data matching
  • Best practices in designing outbound master data synchronisation
    • Integrating an MDM system with an enterprise service bus for outbound synchronisation of operational systems
    • Schema and integrity synchronisation problems that can occur and what to do about them
    • Conflict resolution on outbound synchronisation
    • Design considerations when integrating MDM with ETL tools for synchronising data warehouses and data marts
  • The emergence of Blockchain for master data maintenance
  • Accelerating master data queries using graph query processing and graph analytics
  • Maximising the use of data virtualisation in MDM
  • The implications of switching to centralised master data entry
  • The change management program imposed by centralised master data entry

Mike Ferguson, Managing Partner, Intelligent Business Strategies

Mike Ferguson is Managing Director of Intelligent Business Strategies Limited.  An analyst and consultant, he specialises in business intelligence, data management and enterprise business integration.  With over 34 years of experience, Mike has consulted for dozens of companies on business intelligence/corporate performance management strategy and technology selection, big data, enterprise architecture, business integration, MDM and data integration.  He has spoken at events all over the world and written numerous articles.  Formerly he was a principal and co-founder of Codd and Date Europe Limited (the inventors of the Relational Model), a Chief Architect at Teradata working on the Teradata DBMS, and European Managing Director of Database Associates.  He teaches master classes in Big Data Analytics, New Technologies for Data Warehousing and BI, Mobile and Collaborative BI, Operational BI, Enterprise Data Governance, Master Data Management, Data Integration and Enterprise Architecture. Follow Mike on Twitter @mikeferguson1.

09:30 - 12:45
Practical Data Governance

This workshop provides a practical guide to implementing Data Governance in an organisation, regardless of size. It is based on experiences across multiple organisations and will cover all aspects of Data Governance, from policies and guidelines, through ownership and stewardship, to implementing data quality standards and aligning to both company strategies and regulation. The session will consider how to implement Data Governance, what the driving forces behind the requirement for it are, and how to bring others along on the journey, including recognising the people in your organisation who can support you. The workshop will cover:
  • Why is Data Governance important for your organisation
  • Where to start
  • How to get Senior sponsorship at the beginning
  • What policies are needed
  • Defining Data Quality rules
  • Communicating Success

Making Enterprise Data Quality a Reality

Many organisations are recognising that tackling data quality (DQ) problems requires more than a series of tactical, one off improvement projects. By their nature many DQ problems extend across and often beyond an organisation.  So the only way to address them is through an enterprise wide programme of data governance and DQ improvement activities embracing people, process and technology. This requires very different skills and approaches from those needed on many traditional DQ projects.
If you attend this workshop you will leave more ready and able to make the case for and deliver enterprise wide data governance & DQ across your organisation. This highly interactive workshop will also give you the opportunity to tackle the problems of a fictional (but highly realistic) company who are experiencing end to end data quality & data governance challenges. This will enable you to practise some of the key techniques in a safe, fun environment before trying them out for real in your own organisations.
Run by Nigel Turner of Global Data Strategy, the workshop will draw on his extensive personal knowledge of initiating & implementing successful enterprise DQ and data governance in major organisations, including British Telecommunications and several other major companies.  The approaches outlined in this session really do work.
The workshop will cover:
  • What differentiates enterprise DQ from traditional project based DQ approaches
  • How to take the first steps in enterprise DQ
  • Applying a practical Data Governance Framework
  • Making the case for investment in DQ and data governance
  • How to deliver the benefits – people, process & technology
  • Real life case studies – key do’s and don’ts
  • Practice case study – getting enterprise DQ off the ground in a hotel chain
  • Key lessons learned and maxims for success

Nigel Turner, Principal Information Management Consultant EMEA, Global Data Strategy

Nigel Turner is Principal Information Management Consultant EMEA at Global Data Strategy.  He specialises in information strategy, data governance, data quality & master data management. During his consultancy career he has worked with over 150 clients, including British Gas, AIMIA/Nectar, HSBC, EDF Energy, Telefonica O2, the Chartered Institute for Personnel and Development (CIPD) and Intel US.  With more than 20 years’ experience in the Information Management industry, Nigel started his career working to improve data quality, data governance & CRM within British Telecommunications (BT), and has since used this experience to help many other organisations do the same.  Whilst at BT he also ran a successful Information Management and CRM practice of 200+ people providing consultancy and solutions to many of BT’s corporate customers.   He is also an elected member of the UK’s Data Management Association (DAMA) management committee.  In 2015 he was given DAMA International’s Community Award for setting up a mentoring scheme for data management professionals in the UK.  In 2007 fellow data professionals voted him runner up in Data Strategy magazine’s UK Data Professional of the Year awards.  Nigel is a well-known thought leader in data management and has published several white papers & articles and is a regular invited speaker at Information Management & CRM events.

Data Governance Interactive Surgery – Exploring the Challenges in Developing & Deploying DG

The workshop will take an interactive approach to explore the challenges in developing a data governance strategy and deploying it in a range of organisations. Using the approach taken within DIO as an introduction to the workshop, the speakers will facilitate an interactive session as to what delegates have experienced and the challenges they have found. It will seek to utilise a number of tools and approaches to help delegates explore how to overcome some of these challenges and to consider alternative approaches or techniques to invigorate their data governance programmes.
Attendees will learn:
  • A wide range of shared experiences and approaches to identify good practice, including techniques that can be adopted
  • A collective effort to address challenges faced by delegates, utilising wider experience and knowledge to help overcome these challenges
  • An understanding of the approach taken by the speakers to drive data governance within DIO

Ian Wallis, Head of Data, Analytics & Insight (DA&I), Defence Infrastructure Organisation (DIO)

Ian Wallis is Head of Data, Analytics & Insight (DA&I) at the Defence Infrastructure Organisation (DIO). DA&I’s role is to transform DIO into a data-driven, evidence-based organisation to support efficient use of £3.5bn of annual spending to enable the armed forces to live, work and train. Ian’s remit encompasses data management, business intelligence, MI and reporting, analytics and insight.  Ian has nearly thirty years’ experience in this field and has managed programmes in blue-chip organisations, including data warehouse projects at Centrica and the BBC. He has led analytics teams at HSBC, Royal Mail and The Pensions Regulator and delivered data management programmes at Thomson Reuters, Barclays and Arqiva.  Ian works for Aecom and is deployed through the Strategic Business Partner arrangement which has insourced private sector expertise to transform the DIO. Prior to this, Ian worked as an interim manager through his own company, Data Strategists.

Godfrey Morgan, Head of Data Management, Defence Infrastructure Organisation (DIO)

Godfrey Morgan is the Head of Data Governance working within the Data, Analytics and Insight team at the Defence Infrastructure Organisation. The DIO has responsibility for the management of property, infrastructure and related services to ensure strategic management of the defence estate as a whole, optimising investment and critically supporting military capability to the best effect. He is responsible for implementing a Data Governance framework, working with Data Owners and Data Stewards to embed the value of data as a corporate asset through data ownership, data quality, and the usage and understanding of uniform data standards. His responsibilities also include working with external industry suppliers to ensure data matching and verification of key defence asset data. Godfrey has experience of working in Data Governance for a number of large financial services and asset management organisations over the last sixteen years.

Learn How Graph Database Can Empower MDM Solutions via Exploration of Specific Case Studies

“Your Master Data Is a Graph”.  Whether it’s the organization master or a product master involving complex hierarchies and relationships, Master Data invariably takes the form of a graph or network, and is best modeled, stored and queried using a native graph technology.  Whether you are using a packaged MDM solution or building a custom MDM solution, a Graph Database can help you get a higher ROI by reducing complexity, increasing agility and improving the speed and efficiency of your Master Data initiative.
Join this session to learn how a Graph Database fits into your MDM solution and how market-leading organizations like Pitney Bowes, Cisco and UBS are gaining significant competitive advantage by adopting different MDM implementation styles to incorporate graph technology into their solution portfolio.  Topics to be discussed include:
  • Understanding how a graph database complements MDM – from personalized product & service recommendations to websites adding social capabilities
  • Identifying the benefits of different MDM implementation styles – ranging from using Graph Database as the primary repository for your Master Data to using a Graph Database to build a metadata registry
  • Learning from industry-proven best practices in adopting Graph Databases
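
As a rough illustration of the “master data is a graph” idea described above, here is a minimal sketch in Python using the Neo4j Bolt driver. The node labels, properties, hierarchy relationship and connection details are hypothetical examples, not taken from the session or from any of the named case studies; it simply shows master records and their relationships being stored and traversed natively rather than reconstructed with relational joins.

```python
# Minimal sketch: master data modelled and queried as a graph (hypothetical schema).
from neo4j import GraphDatabase  # official Neo4j Python driver

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

with driver.session() as session:
    # Create two golden customer records and a hierarchy relationship between them.
    session.run(
        """
        MERGE (p:Customer {mdm_id: $parent_id, name: $parent_name})
        MERGE (s:Customer {mdm_id: $sub_id, name: $sub_name})
        MERGE (s)-[:SUBSIDIARY_OF]->(p)
        """,
        parent_id="C-1001", parent_name="Acme Holdings",
        sub_id="C-1002", sub_name="Acme Ltd (UK)",
    )

    # Traverse the hierarchy: every entity that rolls up to the parent, at any depth.
    result = session.run(
        "MATCH (c:Customer)-[:SUBSIDIARY_OF*1..]->(p:Customer {mdm_id: $id}) "
        "RETURN c.name AS name",
        id="C-1001",
    )
    print([record["name"] for record in result])  # -> ['Acme Ltd (UK)']

driver.close()
```

The variable-length `SUBSIDIARY_OF*1..` pattern is the kind of relationship query the session argues graphs handle more naturally than tabular MDM stores.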

Lars Nordwall, COO, Neo Technology

Lars Nordwall was born and raised in Stockholm, and has lived in Silicon Valley since 1998. Mr. Nordwall joined Neo Technology early 2011 as the COO, and has transformed the company from an early stage European based start-up to one of the rising stars in Silicon Valley.
His track record includes: (1) A turn-around and transformation of Pentaho from a flat-lined, struggling Business Intelligence software provider to a leader in the Big Data Analytics space; the company was acquired by Hitachi for $600M. (2) SugarCRM, where he joined as the founding VP of WW Sales and helped establish the firm as one of the global SaaS CRM leaders. (3) Cambridge Technology Partners (CTP), where he built his career foundation and was fortunate to experience rapid growth from 400 to 6,500 employees and the transformation into one of the hottest consultancy firms in the world with a market cap of over $5B, followed by an acquisition by Novell.
Mr. Nordwall has an M.Sc. Degree in Mechanical Engineering from the Royal Institute of Technology in Stockholm, a B.Sc. in Business Administration from the Stockholm University School of Business, and he has completed an Executive Education Program at Harvard Business School in Boston.

It is All in the Petri Dish

The trend in the data world has seen the emergence of ‘information as an asset’, ‘data centricity’ or ‘data driven’ corporate statements.  In the majority of companies these are loosely worded statements, where the policy makers have little or no idea how to turn them into a reality within their organisation.  As a result we are now seeing the emergence of, and the need to address, ‘corporate culture’. This is great for us and is actually a quantum leap forward in our world of data professionals.
So how do we translate that desire to be more data driven into a reality where people within an organisation are more data savvy?  Is there a right answer or a single approach? Probably not. Just like the Petri dish, there is no guarantee how the culture will develop.
The aim of this workshop is to be highly interactive where together we will explore:
  • What is corporate culture
  • How do you shape a ‘data driven’ culture
  • What is the environment for success
  • Why establishing a ‘citizen steward’ approach is essential
  • What you need in your toolbox

Neil Storkey, Director, InfoMana Ltd

Neil Storkey is an independent consultant specialising in enterprise data and information management strategies with a focus on trust, integrity and sustainability of information assets. Neil started his data journey as an accountant back in 1991, where consolidation and reporting of financial management information depended on consistent quality data.  25 years on, he has led and delivered change management programs in large global enterprises, shifting the onus of accountability for data and information away from IT. These enterprise programs included BW/BI, MDM strategies, business change management, and Data and Information strategies, all based on establishing business-led working practices around standards, stewardship, governance, organisation and quality.  His strategies have been recognised as industry-leading and innovative, challenging established data management practices.  Neil’s career following the data has allowed him to experience industries spanning motor manufacturing, financial services, recruitment, telecommunications, mobile telephony, tobacco and hydrocarbons.  His challenge to everyone is to lose the tag of ‘ownership’; we are but custodians of company data.

14:00 - 17:15
The Quality in Master Data Quality

The end goal of Gerard and Dana’s MDM endeavours is to ensure the quality of their master data is suitable to support the business processes and reporting needs, at the lowest possible cost. The aim of the workshop is to present a practical approach towards embedding data quality in the day to day life of a master data organisation.
Two years into building their data governance community and supporting quality processes, they are at a point in which they can share the practicalities regarding the steps they have taken to do that, the challenges faced along the way, what they did well and what they would have done differently.
Delegates will be able to take away advice on:
  • How to bring data quality as a sellable message in their organisations
  • How to make best use of all available resources (people and technical) to improve the quality of master data
  • Real life examples of data quality activities, to deliver results
  • How to ensure that their efforts are sustainable

Gerard Bartley, Director Global Master Data, FrieslandCampina

Dana Julinschi, Master Data Governance & Projects Manager, FrieslandCampina

Dana Julinschi currently leads an ambitious programme to implement company-wide master data governance, together with data quality processes and KPIs.  Throughout the past eight years she has been part of several MDM implementation projects in different industries with a strong focus on change management within governance organisations, MDM maturity assessments, data quality KPIs and data maintenance processes.  The overall goal of each of her projects is to turn master data management into a way of living for the organisations.

Building Professional Competencies for Information Management Practitioners

Considering a career in Information Management?
Already well established in the field?
Want to build an information management practice in your organisation?
It’s not only the “Information Management” skills that are essential.  This workshop will address the key issues of:
  • What key capabilities are necessary (and desirable) for IM professionals
  • What behaviours and attitudes should be exhibited
  • What are the roles necessary in a successful Information Management practice
  • What are the skills and skill levels required to fulfil those roles
  • What are the core services necessary to permeate an Information Management practice & how should these mature, and
  • Does certification help?
Taught by a DAMA Award winner, DAMA Fellow, and President of DAMA UK, this workshop is based upon real practical experience gained over 35 years assisting global organisations big and small, and will help individuals and organisations plan their Information Management development.

Chris Bradley, Information Strategist, Data Management Advisors Ltd

Christopher Bradley has spent 35 years in the forefront of the Information Management field, working for leading organisations in Information Management Strategy, Data Governance, Data Quality, Information Assurance, Master Data Management, Metadata Management, Data Warehouse and Business Intelligence.   Chris is an independent Information Strategist & recognised thought leader.  Recently he has delivered a comprehensive appraisal of Information Management practices at an Oil & Gas super major, Data Governance strategy for a Global Pharma, and Information Management training for Finance & Utilities companies.  Chris guides Global organizations on Information Strategy, Data Governance, Information Management best practice and how organisations can genuinely manage Information as a critical corporate asset.  Frequently he is engaged to evangelise the Information Management and Data Governance message to Executive management, introduce data governance and new business processes for Information Management and to deliver training and mentoring.  Chris is Director of the E&P standards committee “DMBoard”, an officer of DAMA International, an author of the DMBoK 2.0, a member of the Meta Data Professionals Organisation (MPO) and a holder at “master” level and examiner for the DAMA CDMP professional certification. Chris is an acknowledged thought leader in Data Governance, author of several papers and books, and an expert judge on the annual Data Governance best practice awards. Follow Christopher on Twitter @inforacer.

Creating a Data Governance Framework that Promotes Innovation and the Creative Use of Data

Data governance cannot afford to become a straitjacket for any organisation.  Sensible risk mitigation has to be accompanied by managed flexibility that still allows the organisation to act in a nimble and agile fashion and to innovate as today’s operating requirements become more and more demanding, especially with the introduction of GDPR.
  • Creating the right structures and processes for governance and innovation
  • Managing data governance and IT
  • The role of meta data
  • Relationship with SIRO / DPO

Successful Reference Data Governance and Management

Reference data – often simply known as codes, lookups, or domains – is an area of enterprise data management that is becoming increasingly important.  However, many enterprises have difficulty formulating governance programmes and management practices for reference data.  This workshop explains the overall structure needed for both reference data governance and reference data management.  The very different roles needed to manage external reference data (sourced from outside the enterprise) and internal reference data (produced wholly within the enterprise) are described.  The options for environments for producing and distributing reference data are compared and contrasted.  The significant role of semantics in reference data is also examined in detail, together with practical ways in which knowledge of reference data can be successfully managed.  Additionally, the special aspects of quality in reference data are described. Attendees will learn:
  • What reference data is, how it differs from other classes of data in its governance and management needs
  • The structures needed for successful reference data governance and management
  • How the semantic needs of reference data can be addressed
  • How to deal with data quality in reference data content
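
To make the external/internal distinction above more concrete, here is a minimal Python sketch. The code sets, labels and country-to-region crosswalk are invented for illustration (they are not drawn from the workshop material); the point is simply that externally sourced codes, internally produced codes and the governed mapping between them are separate, managed artefacts.

```python
# Illustrative sketch of reference data management concepts (hypothetical code sets).
from dataclasses import dataclass

@dataclass(frozen=True)
class RefCode:
    code: str    # the stored code value
    label: str   # human-readable meaning
    source: str  # "external" (standards body) or "internal" (enterprise-produced)

# External reference data sourced from outside the enterprise (ISO 3166-1 alpha-2 shown).
ISO_COUNTRY = {
    "GB": RefCode("GB", "United Kingdom", "external"),
    "DK": RefCode("DK", "Denmark", "external"),
}

# Internal reference data produced wholly within the enterprise.
SALES_REGION = {
    "NEUR": RefCode("NEUR", "Northern Europe", "internal"),
    "SEUR": RefCode("SEUR", "Southern Europe", "internal"),
}

# A governed crosswalk between the two code sets - a typical RDM artefact.
COUNTRY_TO_REGION = {"GB": "NEUR", "DK": "NEUR"}

def region_for_country(country_code: str) -> RefCode:
    """Resolve an external country code to the internal sales region it maps to."""
    if country_code not in ISO_COUNTRY:
        raise ValueError(f"Unknown country code: {country_code}")
    return SALES_REGION[COUNTRY_TO_REGION[country_code]]

print(region_for_country("DK").label)  # -> Northern Europe
```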

Malcolm Chisholm, Chief Innovation Officer, First San Francisco Partners

Malcolm Chisholm has over 25 years’ experience in data management, and has worked in a variety of sectors, with a concentration on finance.   He is an independent consultant specializing in data governance, master/reference data management, metadata engineering, and the organization of Enterprise Information Management.  Malcolm has authored the books: “Managing Reference Data in Enterprise Databases”; “How to Build a Business Rules Engine”; and “Definitions in Information Management”.  He was awarded the DAMA International Professional Achievement Award for contributions to Master Data Management.  He holds an M.A. from the University of Oxford and a Ph.D. from the University of Bristol.

Graph-Based Cloud MDM – Business Drivers & Technology Best Practice

A key aspect of cloud-based software as a service (SaaS) solutions is the demand for self-service for business users. In this workshop we explore the drivers we have seen for Cloud MDM, with a focus on the self-service use cases for business users. For example, a very basic use case is to understand how many leads in a purchased lead list are not yet known customers in the MDM system, using matching algorithms without any IT involvement. Another use case is the ability to explore entity relationships within master data entities, and across master data and transactional data entities, at a much broader and deeper scale. To address this need, graph databases and graph visualization techniques are added to the cloud MDM solution stack. As MDM moves to the cloud, we explain why we believe graph databases offer advantages for the more demanding entity relationship use cases over the relational/columnar persistency options used in on-prem solutions today. We conclude the session with a very short summary of IBM’s MDM cloud offering to illustrate a concrete solution for the previously discussed business and architecture drivers for cloud MDM.  You will learn about:
  • Use cases for cloud MDM with a focus on the business users with self-service needs
  • Quick intro to graph databases
  • Benefits of graph databases addressing the next gen entity relationship management use cases for MDM
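
As a toy version of the self-service lead-matching use case mentioned above (this is not IBM’s matching engine; the customer names, abbreviation table and similarity threshold are illustrative assumptions), the following Python sketch counts how many purchased leads fail to match any known customer record:

```python
# Illustrative only: a naive name-matching pass over a purchased lead list.
from difflib import SequenceMatcher

ABBREVIATIONS = {"ltd": "limited", "corp": "corporation", "inc": "incorporated"}

def normalise(name: str) -> str:
    """Lower-case, strip punctuation and expand common legal-form abbreviations."""
    tokens = name.lower().replace(",", " ").replace(".", " ").split()
    return " ".join(ABBREVIATIONS.get(t, t) for t in tokens)

def is_match(lead: str, customer: str, threshold: float = 0.85) -> bool:
    return SequenceMatcher(None, normalise(lead), normalise(customer)).ratio() >= threshold

known_customers = ["Acme Ltd.", "Globex Corporation", "Initech"]
purchased_leads = ["ACME Limited", "Hooli", "Globex Corp", "Vehement Capital"]

new_leads = [lead for lead in purchased_leads
             if not any(is_match(lead, c) for c in known_customers)]

# -> "2 of 4 leads are not yet known customers: ['Hooli', 'Vehement Capital']"
print(f"{len(new_leads)} of {len(purchased_leads)} leads are not yet known customers: {new_leads}")
```

A production matching service would add blocking, richer normalisation and survivorship rules, but the shape of the question a business user asks is the same.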

Martin Oberhofer, Executive Architect, IBM

Martin Oberhofer is a certified Executive Architect at IBM and a certified Distinguished Architect with The Open Group, with 15+ years of experience in information management and 10+ years in the areas of master data management, information integration, data lakes and information governance. Between 2006 and 2015 he advised many of IBM’s largest customers on these topics. Since 2016 he has been the responsible development architect for IBM’s on-prem MDM product portfolio. He has co-authored several books on MDM and enterprise information architecture, contributed as an inventor to over 70 patent applications, and has spoken regularly at conferences every year since 2006.

Lena Woolf, Senior Technical Staff Member (STSM), IBM

Lena Woolf is a Senior Technical Staff Member (STSM) at IBM with 15+ years of experience in the master data management and entity analytics area. Lena provides thought leadership advice to IBM customers worldwide on MDM, entity analytics and information governance solutions. She is the development architect for IBM’s cloud MDM solutions. Lena has spoken regularly at conferences for many years and has authored articles on master data management. As an inventor, Lena has contributed to many patents and constantly pushes the boundaries of what’s possible with MDM technology – a recent example includes adding graph technologies to IBM’s MDM capabilities.

Tuesday 16 May 2017 : Conference Day 1 & Exhibits
08:00 - 09:00
Registration
09:00 - 09:10
MDM Summit and DG Conference Opening
09:10 - 10:00
PLENARY KEYNOTE: Data Science and the Panama Papers

The trove of files that make up the Panama Papers is likely the largest dataset of leaked insider information in the history of journalism.  Mar will discuss the unique challenges that ICIJ’s Data and Research Unit encountered in analyzing this data. The overall size of the data (2.6 terabytes, 11.5 million files), the variety of file types (from spreadsheets, emails and PDFs to obscure and old formats no longer in use), and the logistics of making it all securely searchable for more than 370 journalists around the world are just a few of the hurdles they faced over the course of the 12 month investigation.

Mar Cabra, Editor, Data & Research Unit, International Consortium of Investigative Journalists (ICIJ)

Mar Cabra, Spain, is the head of the Data & Research Unit, which produces the organization’s key data work and also develops tools for better collaborative investigative journalism. She has been an ICIJ staff member since 2011, and is also a member of the network.  Mar fell in love with data while being a Fulbright scholar and fellow at the Stabile Center for Investigative Journalism at Columbia University in 2009/2010. Since then, she’s promoted data journalism in her native Spain, co-creating the first ever master’s degree in investigative reporting, data journalism and visualization, as well as the national data journalism conference, which gathers more than 500 people every year. She previously worked in television (BBC, CNN+ and laSexta Noticias) and her work has been featured in the International Herald Tribune, The Huffington Post, PBS, El País, El Mundo and El Confidencial, among others. In 2012 she received the Spanish Larra Award as the country’s most promising journalist under 30.

10:05 - 10:50
MDM Keynote: MDM-Driven Digital Transformation via Systems of Engagement & Graph

Clearly, the “solid but boring” aspect of master data management (MDM) remains a key challenge for most enterprises.  Concurrently, market-leading enterprises are turbo-charging their MDM efforts by focusing on “master relationship management” via Graph Database technology coupled with Big Data analytics.  While traditional MDM purports to span the entire master data lifecycle, new dimensions such as Big Data, mobile, social, cloud and real-time are exerting tidal forces on the classic notion of MDM.  Moreover, IT leadership struggles when selecting MDM software because the solutions are diverse, with no single vendor able to meet all requirements and use cases. Given the resulting prevalence of multiple MDM brands and architectures, two relative newcomers (Data Governance and Graph Database) are proposing to unify these siloed worlds to overcome both organisational and technical issues as well as market dogma.
The mega vendor-centric MDM offerings thwart the notion of heterogeneous data and process integration, and often lack pro-active Data Governance capabilities for end-to-end data lifecycle management.  Concurrently, best-of-breed and niche vendors look to exploit this vacuum (cross-mega vendor governance and relationship management) yet are stymied by lack of resources and market traction.  All vendors need to better focus on next-generation MDM requirements as we move from “system of record” to add “system of reference” and (ultimately) move into “system of engagement” wherein relationship-driven analytics form the foundation of MDM-innate, data-driven and context-driven applications to fully enable the digital enterprise.
Concurrently, mismatches in reference data (also called “enterprise dimensions”) affect the integrity of business intelligence reports and are also a common source of application integration failure. Due to the strategic nature of custom reference data management (RDM) capabilities and the difficulty of building and maintaining them, savvy IT organisations and Finance departments are increasingly opting to buy rather than build RDM solutions.
This MDM research analyst keynote will review strategic planning assumptions such as:
  • Determining what your organisation should focus on in 2017-18 to initiate “master relationship management” via Data Governance & Graph Database
  • Planning to leverage Big Data & RDM as part of an enterprise MDM program
  • Understanding where MDM, RDM & Data Governance are headed in the next 3-5 years

Data Governance Keynote: Merging Perspectives on Information

As organisations become increasingly aware of the importance of information, several initiatives are often taken in different contexts. The information security officer might be concerned about the probability of data leaks or the threat of cybercrime, with the IT security officer scrambling to put the proper firewalls in place to prevent access to file sharing services. Your data protection officer will be concerned with proper management of personal data. Throw in the chief data officer, who is concerned with getting value out of the data. You might even have a chief analytics officer who wants to maximise the benefits of the data lake. And to top it off, the chief information officer needs to decide how best to support the infrastructure for effective data management.
How many perspectives does one need to properly manage information? Surely just a common one.
What delegates will learn from attending the session:
  • Common model for slicing the information “elephant”
  • Embedding the ISO 2700x model into your organisation
  • Information Classification model

10:50 - 11:20
Networking Break & Exhibits
MDM Track 1
MDM Track 2
Data Governance Track 1
Data Governance Track 2
11:20 - 12:05
Think Big or Start Small? Design Options for Data Governance of Finance MDM

There is more than one way to implement Data Governance. Taking a deeper look into approaches companies have chosen, there are multiple options to enable high data quality via decent governance structures. Options vary from purely local optimisation of data lifecycle processes to global shared service structures, both being applied with great success. This presentation will give insights into Data Governance patterns, which have been implemented with support of the authors at different companies.
Deeper insights will be given into the transformation of the Finance organisation at AstraZeneca, one of the top 10 global pharmaceutical companies.  The presentation will explain the transformation approach chosen since AstraZeneca started its MDM journey, as well as the lessons learned.  Topics which delegates will learn from attending this session:
  • Profiting from the business value of diverse Data Governance design options – from local optimisation to outsourcing
  • Leveraging best practices from a “top 10” multi-national pharma company
  • Understanding the Do’s & Don’ts for successful digital transformation projects

Marco Pinheiro, Director Finance Master Data, AstraZeneca

Dr. Andreas Reichert, Partner, CDQ AG

Dr. Andreas Reichert is Partner at CDQ AG heading the consultancy services of the company. Dr. Reichert has been working on governance topics for MDM for more than 12 years. The focus of his work is supporting enterprises in designing master data strategies, setting up governance structures, and transferring project results into business operations. Prior to his role at CDQ, he worked as a business consultant at SAP for several years supporting companies in setting up business process structures for large ERP transformations.  Dr. Reichert holds a Ph.D. in Information Management as well as two Master’s degrees in Information Management and Technology Management.

Master Data Stakeholders & Ownership

Ricoh Europe is a multi-national imaging and electronics company (part of €16B Ricoh Group).  It is universally agreed that Data Governance is critical to achieving sustainable and effective MDM.  Failure to execute Data Governance concurrently with an MDM program greatly decreases the probability of success and economic sustainability of MDM processes.  Clearly, it is critical to establish clarity in the different roles of Data Ownership and Stewardship, Data Governance vs. Data Management, etc.  Why is it so difficult to accept the ownership and responsibility of data?  Topics to be discussed in this session include:

  • Orchestrating the broad application of sound Data Quality principles, including organisational process revision, technology & software adoption
  • Establishing accountability through the appointment of Data Quality roles
  • Defining Data Management Policies & Guidelines

Alberto Villari, Data Governance Manager, Ricoh EMEA

Within the Business Governance Division, Alberto Villari manages the Master Data Management processes and Data standards, definitions and use across applications within Ricoh EMEA.  These Data Governance responsibilities include delivering effective management of Master Data and the achievement of Data Quality targets. Additional responsibilities include maintaining the documentation on data guidelines, processes and standards stored on the Data Governance Portal, analysing business requirements for the implications of data changes, approving any solution enhancements where data are involved, and coordinating Data Issues remediation, whilst promoting a robust Data Governance Framework to move the organization towards an adequate Data Management Maturity level.  Prior to Ricoh EMEA, Mr. Villari was the Data Quality Lead at GE Capital (London) and Data Quality Manager at Bulgari (Rome).  He also chaired the 2014 Chief Data Officer Forum in London.  Mr. Villari received his degrees from Università di Roma – La Sapienza, with a thesis on a Multi-Dimensional Dynamic Data Management (GeoMap) system.

Building a Sustainable Data Governance Ecosystem

In many organisations Data Governance initiatives are created as projects or programs. While they begin with the right purpose, they usually fizzle out because the plan to embed Data Governance in everyday business activities is not executed well.
Dhivya will discuss creating a Sustainable Data Governance Ecosystem that stands the test of time – to mature and adapt to the enterprise, the people and the changing data and information needs.
  • Embed Data Governance in BAU
  • Sustainable Data Governance
  • Data Governance as a Capability

Data Security and Privacy in a Big Data Environment

Over recent years new sources of data have given companies the opportunity to gain very intimate knowledge about customers. This includes where they are at any point in time, what they are browsing at any location, when they enter a store, what route they travel on a regular basis, what car they drive, what their driving behaviour is, what relationships they have with others, what they like and dislike and what their opinions are. Mobile applications can even access people’s contacts, their photographs and more.  With so much data available, ethics is now a topic on the minds of many in determining what exactly is deemed acceptable for companies to use when analysing data.  In addition, data is now highly distributed with many technologies in place that offer audit and security. Complexity has resulted in many organisations ending up with a piecemeal approach to information audit and protection. Policies are everywhere with no single view of the policies associated with securing data across the enterprise. Also, the number of administrators involved is often difficult to determine and now to cap it all, we have EU legislation demanding that data is protected and that organisations can prove this to their auditors by May 2018. So how are organisations dealing with this problem? Are data privacy policies enforced everywhere? How is data access security co-ordinated across portals, processes, applications and data? Is anyone auditing privileged user activity? This session defines this problem, looks at the requirements needed for Enterprise Data Security, Audit and Protection and then looks at what technologies are available to help you integrate this into your data strategy.

  • What is data protection and what is involved in managing it?
  • What are the requirements for enterprise data protection?
  • The challenge of distributed data and distributed data lakes
  • What about privileged users?
  • Securing and protecting data in a Big data environment
  • What technologies are available to tackle this problem?
  • How do they integrate to enable end-to-end data governance and compliance?
  • How to get started in securing, auditing and protecting your data

12:10 - 12:55
Applying MDM to Improve Customer Experience in a Hybrid B2B & B2C Environment

With millions of customers and over 6,000 employees, Oxford University Press is the largest university press in the world – and the second oldest.  The media industry continues to be an innovator of digital technologies.  As part of that innovation trend, Oxford University Press uses MDM to enhance marketing engagement with B2C and B2B customers, thereby maximizing effectiveness, efficiency and customer experience.
It is tough bringing data from legacy systems together to create a Single Customer View, but when your business has been operating for five hundred years the challenges are enormous.  Add in the complexities of identifying customers who may have multiple relationships with your business over different commercial routes and several communications channels, and the problem might be viewed as impossible. Over the past two years, Oxford University Press has deployed a combination of traditional MDM techniques and empirical approaches to semantic interpretation of customer records to create a comprehensive marketing data mart representing information from around twenty underlying sources.
Topics to be discussed include:
  • Driving accurate marketing campaigns by combining internal customer data from myriad sources with external marketing information
  • Dealing with complex customer data structures and very poor incoming data quality to create the “golden view of the customer”
  • Ensuring compliance with data protection regulations around the world including the forthcoming European GDPR

David Walder, Head of Insight & Marketing Technology - Global Academic, Oxford University Press

David Walder leads Oxford University Press’ Insight and Marketing Technology program supporting 200 marketers worldwide. He has been instrumental in creating a transatlantic team to introduce new marketing systems; target direct marketing; undertake market research and provide deep, actionable customer analytics.  Prior to his current role, he was Managing Director of ScoreStore Music Ltd, an online retailer.  After an early career in management consultancy, David held numerous positions with Microsoft spanning technical consultancy, pre-sales and management. His final position at Microsoft was as head of the Customer Systems Group, where he led the design and development of innovative marketing databases and associated sales management tools. David holds a Masters in Engineering and Computer Science from Cambridge University.

Journey Towards a New MDM System

This case study addresses the journey towards a new MDM system and its successful implementation in public administration.  The public authority Udbetaling Danmark, a part of the ATP Group, handles several municipal services such as disbursement of state pensions, rent subsidies, family benefits and maternity/paternity benefits. The new internal support MDM system contains shared data and functionality for all services placed in different silo specialist systems.
The MDM program has two objectives: to be cost-effective in ATP’s administration processes, fulfilling the requirements of simple and efficient processes, and to ensure that data given by the citizens is distributed throughout all the specialist systems.  The new internal MDM system has a local registration of all citizens, companies and authorities to provide authoritative basic data enriched with additional data. This case study will discuss how the MDM system provides the following advantages:
  • Establishing data consistency & automated decisions within ATP silo specialist systems
  • Providing efficiencies via one joint data registration instead of many silo updates in specialist systems
  • Provisioning high data quality via the MDM data model rules

Birgitte Yde, Enterprise Architect, ATP

Birgitte Yde is an enterprise architect with more than 25 years of experience in IT and business development. She holds a degree in electronic engineering. After some years working with electronic apparatus, requirements and reliability Birgitte started her IT career at the Technical University of Denmark as a teacher in software development and database design. The last 15 years she has worked at ATP as IT specialist, IT architect, business developer, business architect and enterprise architect, with main interest in information and data architecture. Birgitte Yde is ATP’s representative in the Danish governmental basic data program as leading end user. Follow Birgitte on Twitter @birgitteyde.

Louise Pagh Covenas, Business Specialist, ATP

Louise Pagh Covenas has 16 years of experience in the telecommunications industry, where she worked with business improvement and customer satisfaction, starting from correct data registrations to customer care. In 2014 Louise got the opportunity to work more intensely with Masterdata Management and data quality, and is now Business manager for two MDM systems within ATP. Louise Pagh Covenas is ATP’s representative in the Danish governmental basic data program as leading end user, as well as ATP internal program manager.

Data Governance Across Government

Setting up data governance that works across the whole of government is not an easy thing to do. You need to set up a good structure that will help the individual agency, the sector, and the whole of government. This is done by setting up steering groups and having lead agencies responsible for specific parts of the governance.  You will learn:

  • How to set up a governance structure across government agencies
  • How to assess how mature the organisation is around information governance
  • How to make sure agencies help each other in maturing their organisation
  • What artefacts are available for agencies to help them with their information governance

Regine Deleu, All of Government Enterprise Architect, New Zealand Government

Regine Deleu is a successful strategic transformation leader who aligns strategies, lays out business and technology baseline and target architectures, performs gap analysis, and reorients technology to fit business goals and processes. She has 30 years of working experience. Regine specialises in business innovation, data science, and business process optimisation.

The Swan Lake of Data - Data Governance with New Enabling Technologies

Adam and Nic will discuss why data governance is ever more important with the advent of new enabling technologies – Big Data, Cloud, Digital, etc. Attendees will be introduced to basic concepts of data quality and metadata, and to an approach to implementing successful strategies when embarking on data transformation programmes, with real-life examples of what works well and pitfalls to watch out for.

Adam Preston, Chief Data Officer, Santander

Nic Gordon, Associate Director, BCG

Nic Gordon is an Associate Director based in the London office. He has nearly 25 years’ experience in Financial Services with a focus on Data and Analytics.

12:55 - 14:25
Lunch, Exhibits & Perspective Sessions
13:25 - 13:50
Perspective Session: Data Challenges and Solutions for (Re)insurance and Financial Services

Based upon his experience in the (re)insurance industry, Nick’s session will focus on common data challenges in (re)insurance and financial services. These include reliance on costly manual processes, key person dependency, regulatory demands,  risk management and the need for sustainable, scalable platforms for increased reporting cycles. He will also address insufficient early warning data quality indicators, the prevalence of end-user computing (EUC) and the business impact of inaccurate and incomplete data.
If you have experienced any of these data-related challenges in your business, join the session to learn more about:
  • (Re)insurance and financial services data challenges
  • Tackling these challenges in your business
  • Selected customer case studies from the (re)insurance industry

Nick Stammers, Director of Professional Services UK & Northern Europe, Ataccama

Nick Stammers is responsible for the growth and development of Ataccama’s UK & Northern European services business. Leading the delivery of solutions and services, Nick helps customers leverage the full potential of Ataccama’s solutions and has a particular focus on the (re)insurance market. Previously, Nick ran CMC Insurance, a specialist consulting firm delivering regulation, risk and finance programmes for the global (re)insurance market. Prior to joining CMC, Nick held a senior role at Pro Insurance Solutions, a specialist (re)insurance services provider. Nick also held senior roles at Concentra, a Business Intelligence and Analytics consulting firm and Serverside Group, a software firm delivering solutions in the banking sector.

Perspective Session: Demanding Material Design (by Google) for Enterprise B2B Software

Ever wonder why some online apps are just so intuitive? Have you stopped to think why Gmail, Drive, and Google Translate don’t come with instruction manuals? Now think about the last interaction you had with enterprise software. Big difference. This session will explore the basis of Material Design, and the psychology that makes it so easily digestible for human interaction. We will use examples, and show an intelligent new way to look at Data Management for mastering any kind of data.

Richard Branch, Vice President of Operations, UK and Northern Europe, Semarchy

Prior to joining Semarchy, Richard Branch held senior positions at Oracle and Informatica and established Sunopsis in the UK. He has worked in data management all his career, specializing in business intelligence, data warehousing and master data management.

13:55 - 14:20
Perspective Session: Beneficial Ownership: The Devil is in the Detail

In a recent Chief Compliance Officer survey, 90% of respondents highlighted ‘establishing Beneficial Ownership’ as the most significant challenge for their organisation when on-boarding or conducting reviews of their clients. Today, a critical business need is to calculate detailed levels of individual people ownership, to comply with increasingly burdensome regulations such as the 4th EU Money Laundering directive or the US FinCen rules.

Moreover, CCOs need to reduce this burden whilst accelerating their due diligence processes, driving efficiencies and delivering cost savings for their organisation, because the real motivating factor for any business is to be able to book client revenue faster. They needed a solution that could bring millions of data points together and quickly identify beneficial owners, based on policy rules, to instantly deliver analytics CCOs could trust in terms of accuracy, timeliness and security. Dun & Bradstreet turned to Neo4j to help find truth and meaning through data.

What delegates will learn from attending the session:

  • Why Beneficial Ownership is such a data challenge
  • Why we chose graphDB and Neo4j to support us
  • What challenges did we uncover through this implementation

Stuart Swindell, Product Leader for Compliance & Supply, Dun & Bradstreet

Stuart Swindell is a Product Leader for Compliance & Supply at Dun & Bradstreet. He is responsible for the API and Data products that support customer challenges in the Compliance and Supplier space. With over 20 years’ industry experience in both Product and Technology roles, he has seen how ‘Big Data’ challenges in the B2B space need to solve real problems.

Perspective Session: Solving the Challenges of Data Synchronization Amid Hybrid IT Complexity

The rapid adoption of new applications can reduce business agility due to interoperability issues across cloud services and on-premises systems, along with information silos and data quality issues that impede business insights.
Dell Boomi offers a unified cloud platform to move, manage and govern data wherever it resides.
Learn how Boomi MDM enforces data quality and synchronizes master data across the enterprise, improving mission-critical business functions and accelerating business agility.

Nilesh Parmar, Senior Enterprise Architect, Dell Boomi

Nilesh Parmar is a Senior Enterprise Architect for Dell Boomi in EMEA. He has extensive experience in Data Management and more specifically in Data Integration/iPaaS. Nilesh has over 15 years of experience in helping organisations harness their information and transform it into a strategic business asset. Prior to working at Dell Boomi, Nilesh held senior positions at Informatica, IBM and Lloyds TSB Insurance. Nilesh has a BSc in Computer Science from Cardiff University.

14:25 - 15:10
How Multi-Domain MDM is Changing a Major Global Industry
How Multi-Domain MDM is Changing a Major Global Industry

SBM Offshore is a Dutch-based global group of companies selling systems and services to the rapidly evolving offshore oil and gas industry.  To keep pace with that evolution, this market leader recently launched an enterprise-wide program to redefine its way of working.  In this presentation, René Meijers, the Head of Data and Information Management at SBM Offshore, will provide an overview of their entire multi-domain MDM program.  Topics to be discussed include:

  • Establishing the business case for your MDM program
  • Understanding why MDM is a critical component of any business transformation strategy
  • Strategising the roll out of enterprise-level MDM for a geographically distributed organisation

René Meijers, Group IT, Head of Data & Information Management, SBM Offshore

René Meijers

Group IT, Head of Data & Information Management, SBM Offshore

With more than 20 years of experience, René Meijers is a true expert in the area of master data and information management. For several international organizations, Mr. Meijers managed the design, implementation and daily operation of their global master data management programs. These solutions spanned multiple domains and industries. In September 2015, he joined SBM Offshore to establish and lead the Data & Information Management capability.  Previously, Mr. Meijers directed and managed global MDM programs at Zimmer Biomet and TNT-Cendris, and at other companies including Heineken, APG, and Philips. He earned his Master’s degree in Business Economics from Maastricht University (NL) and his Bachelor’s degree in Commercial Economics from Zuyd University of Applied Sciences (NL).

Governing a Global Data Supply Chain
Governing a Global Data Supply Chain

Large global enterprises continue to struggle with the ability to source local data from multiple regions and harmonize it at a “global” level.  As data becomes more of a tangible asset for companies, many of the approaches for manufacturing supply chains can become relevant and valuable when applied to managing data, information and insight.
Dun & Bradstreet is a global business services company that provides commercial data to businesses on credit history, B2B sales and marketing, counterparty risk exposure, supply chain management, lead scoring and social identity matching.  Often referred to as D&B, the company maintains a database containing information on more than 235 million companies across 200 countries worldwide.
In this session, learn how D&B has established its own global data supply chain by unifying and governing a world-wide network of trusted data partners.  Topics include:
  • Determining the business value & technology architecture for an “information supply chain”
  • Rationalising global information needs with localized requirements
  • Balancing global vs. local attributes – finding the “sweet spot”

Sharon Lankester, Enterprise Data Governance Leader, Dun & Bradstreet

Sharon Lankester

Enterprise Data Governance Leader, Dun & Bradstreet

Sharon Lankester is a Data Governance Leader at Dun & Bradstreet, a leading global provider of business insight and analytics. Her responsibilities include managing the Data Governance Office and providing oversight to a robust data governance program that helps to streamline business processes and data architecture, sustain data quality, and maintain a metadata program across the organization.

Advanced Data Governance
Advanced Data Governance

Drawing on two years of experience building their data governance organisation, this presentation will delve into where FrieslandCampina have got to, how they got there, the practical steps they’ve taken, what they’ve ended up with and what the team does now.  It will look at the important decisions they’ve made, how they now structure themselves, the significant problems they’ve encountered so far and where they’ve changed direction along the way.  Delegates will learn:

  • How to build their own master data governance organisation
  • Detailed advice on how to structure their organisation
  • Practical examples to help them implement their own governance programmes
  • Some of the pitfalls to watch out for

It's all Just Data Governance Isn't It?
It's all Just Data Governance Isn't It?

We think of Data Governance as being formulaic, linked to static frameworks and based on well thought out theory. But does that work in the real world? Having worked in a number of organisations, Garry has found that whilst the theory works, in practical terms you have to be more fluid. This presentation will look at the differing views of Data Governance he has encountered and the ways he has sought to align with them while still delivering a successful program. It will also look at some of the mistakes that he and others have made and the impact they have had on the projects. In this session Garry will show:

  • Data Governance is not a “one size fits all” solution; it has to meet the aims of the organisation
  • The ways Data Governance can be delivered are adaptable and fluid
  • Real world examples of both good and bad implementation

15:15 - 16:00
Data Modelling, Governance & MDM - Bringing It All Together
Data Modelling, Governance & MDM - Bringing It All Together

Willis Towers Watson is a global multi-national risk management, insurance brokerage and advisory company.  The company operates in more than 120 countries, has a workforce of more than 39,000 employees and revenues of €8.2 billion (FY2015).  In this session, the leader of their data architecture group will describe their journey to develop the target state vision of Willis Towers Watson’s Data Architecture.
Starting with a blank canvas, learn how the application of Data Modelling principles and standards has smoothed the integration of new MDM, Data Quality and Data Governance systems. Topics covered include:
  • Evangelising why Data Modelling is important
  • Linking Data Governance & Data Quality
  • Assembling the best tooling for the tasks

Kevin Smith, Head of Data Architecture, Willis Towers Watson

Kevin Smith

Head of Data Architecture, Willis Towers Watson

Kevin Smith is Head of Data Architecture for Willis Towers Watson, a leading global advisory, broking and solutions company that helps clients around the world turn risk into a path for growth. Data Architecture maintains the target state vision for the company’s Data Assets.  Kevin partners with the business, CDO and technology leadership to deliver this vision, maintain standards and is a thought leader in Data Modelling, Data Quality, MDM and Data Governance.  Prior to joining Willis Towers Watson, he ran his own successful software consultancy.  Kevin received his Bachelor of Science in Computer Science from the University of Hertfordshire (UK).

Implementing MDM in an Organisation that Operates by Consensus
Implementing MDM in an Organisation that Operates by Consensus

Goldsmiths, University of London has recently implemented MDM and Data Governance. The college is largely run by consensus, and this presentation looks at the techniques and approaches that were used to obtain consensus and support for this strategically important project. In addition, we consider how the master data strategy was arrived at, how the master data items were selected and how the work was scheduled, without exposing the business to the detail that might have derailed the project through incessant questioning of the approach. The Dell Boomi implementation at Goldsmiths has significantly reduced the technical debt in the data domain, but MDM can’t resolve all of the issues – what else are we doing? Attendees will learn:

  • Change management techniques employed
  • How Goldsmiths selected the MDM strategy to follow
  • From zero governance to coverage of the key entities in 8 months

David Swayne, Chief Information Officer, Goldsmiths, University of London

David Swayne

Chief Information Officer, Goldsmiths, University of London

An experienced and results-driven Chief Information Officer / IT Director with a successful background in improving IT service delivery in education, financial services and corporate sectors.
David Swayne takes a strategic approach to successful IT leadership, engaging with key business stakeholders and 3rd parties to deliver ‘cost appropriate’ IT solutions and business process change via emerging technology solutions.
David’s experience includes business change management, business transformation, IT service transition, IT strategies, technology roadmaps, stabilisation of systems, e-commerce, infrastructure renewal, shared service facilities, IT centralisation & systems integration, delivering added business value & ROI through the use of technology.

Data Governance in Bite-Sized Chunks
Data Governance in Bite-Sized Chunks

Have you ever heard it said “Data Governance is too theoretical! What am I really going to see from a DG initiative?” or “I don’t know how we can get started with Data Governance”?
Are you facing the dilemma of how (or even whether) Data Governance can be introduced when many of the key stakeholders don’t get it?
Do you need to “sell” Data Governance to stakeholders?  Maybe they don’t get it, maybe they are too busy.  For whatever reason, the “business case” and developing metrics prove to be a difficult, and for some organisations insurmountable, hurdle to cross in the DG journey.  If this is the case, a “DG in bite-sized chunks” approach will help. So, are there some quick, realistic aspects of a DG program that you should focus on first?
This session will show:
  • A workable framework for Data Governance
  • The different Data Governance approaches from process centric, to Data centric & more
  • How to produce a pragmatic business case & principles for DG and link these to metrics
  • The essential roles & responsibilities for Data Governance success
  • Building DG in bite sized pieces & sometimes covertly; it is possible
  • The Data Governance office & its critical role in sustaining success
  • A brief look at categories of tools supporting Data Governance

Staying Number 1 in the Industry by using Well-Governed Master Data
Staying Number 1 in the Industry by using Well-Governed Master Data

This is the story of how business representatives took over the data design, the data governance and the data management from the IT professionals, resulting in a rare level of business engagement in the business data.
With more than 100 years in the industry, Volvo Penta has managed to remain on the frontline of technical innovation and offers a wide range of marine and industrial power solutions.
Now, as the business goes digital, they can refine the offering and extend the customer relationships by leveraging their master data framework – modelling, governance and data quality.  You will learn:
  • How data management is the job of the business, rather than the IT department
  • How business development is related to data governance, or rather, depends on it
  • Why the change of mindset towards data centricity and not-so-glamorous data stewardship is much more important than fancy computing tools

Joachim Bondeson, Process Manager - Commercial Offer & Configuration, Volvo Penta

Joachim Bondeson

Process Manager - Commercial Offer & Configuration, Volvo Penta

Joachim Bondeson is the Process Manager for Commercial Offer and Configuration at Volvo Penta.  He has a background as a technical editor in the area of sales information where a secret addiction to databases left him with a passion for information structures and data quality that turned into a career path.

16:00 - 16:30
Networking Break & Exhibits
16:30 - 17:15
Data Integration for Non-Profit Organisations
Data Integration for Non-Profit Organisations

Only a small number of organisations are using a single system of record, where all data are stored in one single central database.  In practice, even data of individual business units of an organisation may be stored in multiple sets.  Moreover, most organisations will be using multiple systems over multiple departments, causing data to be distributed.  Although each data set is primarily designed for its corresponding application system, the data may be relevant to other applications.  Conversely, business processes very often require data from multiple domains.
Interchanging data between operational systems requires IT solutions.  There are several architectural types to implement these Data Integration (DI) solutions, many of which are to some extent supported by commercial Data Integration toolsets such as MDM.  Unlike some other organisation types, non-profit organisations typically will not be able to use commercial toolsets.
This presentation will provide an overview of operational DI in general, followed by an explanation of how NL-based HAN University of Applied Sciences solved their DI challenges for operational MDM.  Topics concerning how to convert a homegrown Data Integration “steam train” into a “Hyperloop maglev train” include:
  • Understanding the models & architectures for “operational” Data Integration
  • Managing DI challenges using DIY (generic home-made toolset)
  • Tackling data quality issues while simultaneously dealing with MDM & Data Governance maturity challenges

Jan Lenders, Data Integration Specialist, HAN University of Applied Sciences

Jan Lenders

Data Integration Specialist, HAN University of Applied Sciences

Jan Lenders is an experienced Data Integration specialist, currently working at NL-based HAN University.  Mr. Lenders started his professional IT career in 1986, moving from data-centric programming and database design in the financial industry to data integration & distribution (DI&D).  He switched to HAN University in 2007, specialising in Data Integration and provisioning all major data streams to support operational business processes for 30,000 students and all staff involved.  Mr. Lenders recently obtained an MSc (Hons) in IT from the University of Liverpool, graduating on the subject of Data Integration and Distribution.

Enhancing Logistics Data with Customer MDM
Enhancing Logistics Data with Customer MDM

DPD Germany is part of the international DPD group, Europe’s second largest parcel service network and runs 77 depots and 6,000 Pickup parcel shops within Germany. Every day more than 8,000 employees and 9,000 drivers serve a wide range of customer needs. DPD is the second largest service provider on the German parcel market and transports about 350 million parcels a year (~1.5 million parcels per day) – with up to 10% annual growth in parcel volume.
Having started as a B2B logistics service delivering to business customers, in 2013 DPD added the e-commerce dimension to its core business strategy. Today DPD enjoys innovation leadership on the market with a range of awards for its app and mobile solutions, together with its Predict product offering a 1h delivery window.  One of the biggest challenges within the strategic change was the collection and use of the data relating to all of the country’s potentially 40 million private households, especially since these do not have a contract with DPD and German privacy law is very strict on how to use/save data that is transferred in connection with parcel delivery operations.  The solution was to break up the existing data silos by deploying a commercial MDM solution with all the matching and deduplication software needed. DPD now has a database with 9.7 million known recipients, created as “golden records” from several sources. All 1.5 million parcels per day have to be matched against this database in order to optimize the transport process and apply the MDM solution to directly boost the relevant business process.
This session will discuss DPD’s customer MDM journey via such topics as:
  • Building a business case for a large-scale MDM solution to convince management
  • Using agile development as a success factor for MDM projects
  • Architecting for ongoing scalability
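
To make the matching step described above a little more concrete, here is a deliberately simplified Python sketch of matching an incoming parcel recipient against golden records. The record fields, the normalisation and the 0.9 similarity cut-off are assumptions for illustration; DPD's production solution uses commercial matching and deduplication software rather than this toy logic.

  # Illustrative only: fuzzy-matching a recipient against "golden records".
  from difflib import SequenceMatcher

  golden_records = [
      {"id": "GR-000042", "name": "Anna Schmidt", "street": "Hauptstrasse 12", "postcode": "63741"},
      {"id": "GR-000043", "name": "Jan Becker",   "street": "Ringstrasse 5",   "postcode": "60311"},
  ]

  def normalise(record):
      """Crude canonical form: lower-cased name + street + postcode."""
      return " ".join(str(record[field]).lower().strip() for field in ("name", "street", "postcode"))

  def best_match(recipient, candidates, threshold=0.9):
      """Return the most similar golden record above the threshold, else None."""
      target = normalise(recipient)
      score, record = max(
          ((SequenceMatcher(None, target, normalise(c)).ratio(), c) for c in candidates),
          key=lambda pair: pair[0],
      )
      return record if score >= threshold else None

  incoming = {"name": "Anna Schmid", "street": "Hauptstrasse 12", "postcode": "63741"}
  match = best_match(incoming, golden_records)
  print(match["id"] if match else "no match: create a new recipient record")

At DPD's scale (around 1.5 million parcels a day against 9.7 million golden records), a real implementation would also need a blocking step, for example on postcode, so that only a small candidate set is scored per parcel.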

Markus Müller, Team Lead - Data Governance & Data Quality Management, DPD

Markus Müller

Team Lead - Data Governance & Data Quality Management, DPD

Markus Müller leads a small team within DPD responsible for all activities around Data Governance and Data Quality Management. One of the flagship projects was the introduction of a single point of truth for master data via unique “Golden Records”, using that knowledge to optimize business processes and introduce innovative new products.  Prior to DPD, Mr. Müller was a Senior Software Engineer with Capgemini Technology Services and started as a software developer for Albat + Wirsam. He received his master’s degree in computer science from Hochschule Darmstadt.

How Data Governance Delivers a Better (Data) Quality Environment
How Data Governance Delivers a Better (Data) Quality Environment

The Environment Agency is a public sector organisation, and all of us are likely to benefit from its data. In this session Simon and Nick will take you through the EA approach to data governance and data quality, and show how they applied it to, and helped resolve, a real-life problem that could affect many of us at home or at work.
In this session, Nick and Simon will:
  • Explain how they approach data governance in the EA
  • Show how their approach to data governance supports data quality
  • Share the lessons learnt from their DG journey and how to engage with the business

Nick Keen, National Lead - Data Governance, The Environment Agency

Nick Keen

National Lead - Data Governance, The Environment Agency

Nick Keen has worked for the Environment Agency for 18 years in various roles across the organisation from inspecting waste sites to tackling environmental crime. He is now the National Data Governance lead – where he uses his knowledge of the organisation and culture of the Environment Agency to implement data governance that will last. Follow Nick on Twitter @kernowkeeno.

Simon Dimbylow, National Lead - Data Quality, The Environment Agency

Simon Dimbylow

National Lead - Data Quality, The Environment Agency

Simon Dimbylow has been in the Environment Agency for 10 years working in various data related roles. His work has naturally flowed into understanding data quality culminating in him working to change the way the organisation looks at and understands data quality. Follow Simon on Twitter @cumulodimbus.

Becoming a Data Driven Organisation: The Critical Role of Data Governance & MDM
Becoming a Data Driven Organisation: The Critical Role of Data Governance & MDM

Many organisations aspire to become digital, data driven enterprises.  In these organizations data is viewed as a critical asset, both to generate new digitally based products and services, and to guide and improve business operations and decision making.
But many companies are failing to live up to this aspiration.  They struggle to develop and implement data strategies that align with, and help to deliver, new business strategies.
Delivered by Nigel Turner, a highly experienced data management consultant, this session will explore what becoming ‘data driven’ really means, examine some of the reasons why many organisations are failing to realise their ambitions, and propose ways of overcoming the challenges.  Key to these is a strong emphasis on the increasingly critical importance of established data management disciplines, especially Data Governance and MDM, both of which have a key role to play in the digital business of the future. This session will explore:
  • What is a data driven organisation and how does it differ from a traditional company?
  • The main challenges of creating a data driven organisation
  • Building a data driven capability – the role of business and IT
  • The central importance of a business aligned Data Strategy and how to achieve it
  • Why a successful data strategy needs both Data Governance and MDM

17:15 - 18:30
Networking, Drinks Reception & Exhibits
18:30 - 20:00
Presentation by the Data Management Specialist Group, The BCS, The Chartered Institute for IT: Securing Executive Support for Data Governance
Presentation by the Data Management Specialist Group, The BCS, The Chartered Institute for IT: Securing Executive Support for Data Governance

For many of us the principle of “Data as a valued asset” is the norm.
When you value something your instincts take over. You have a primordial need to protect, care for and cherish what you love. When you don’t have such values you make rules and impose controls so that all may benefit. Governance at its core is cold and unfeeling legislation, often with no tangible bite. To many board members the rhetoric all sounds the same: it lacks impact and relevance.
So how do you motivate people to do the right thing? Create board-level data advocates, engage staff and empower employees. John draws on his experience of data and technology transformation in Government, Energy, Telecoms and Financial Services to show how to embed data as something valued, and as an asset with equity.
Things you will find useful from this session:
  • New thinking, metrics and models for a data-driven organisation
  • A risk-based rather than compliance-based approach, providing clarity on risk exposure and increased flexibility in embedding the data governance framework
  • Embedding assurance and quality into business practices
  • Loving raw data and understanding why
  • Methods for making data transformations and realising the need for governance, stakeholder engagement and the focus on partners that capture data

As a former regional CTO for SAS, John highlights the analytic value of data and the additional challenge of getting data right for insight.

All registered conference attendees will be entitled to attend this session.  If you are not planning to enrol on the conference but would like to attend this BCS DMSG session please click here.

John Morton, Business and Technology Advisor

John Morton

Business and Technology Advisor

John Morton advises CxOs, Directors and Business Managers on the capabilities and value of new methods and technologies in providing data-driven, business-enhancing services. John has broad and deep experience in using data to create digital business in both established and challenger businesses. He has been a key member of transformation boards in Financial Services, Central Government, Utilities and Telecommunications in both large and medium-sized companies. John’s experience includes: former Senior Director and regional CTO for SAS; CTO and Technical Assurance Advisor transforming legacy systems to multi-channel, multi-product services leveraging existing investment; membership of the Overall Design Authority for the delivery of the transformation of IT systems in the UK National Health Service; Innovation Director and Healthcare CTO for Intel; rationalisation of global MIS and data warehousing systems from over 40 separate data systems to three key information platforms; and worldwide CRM architecture for a mobile telecommunications company, to name a few.

Wednesday 17 May 2017 : Conference Day 2 & Exhibits
09:00 - 10:00
PLENARY KEYNOTE: How to Create Massive IMPACT and be an Effective Zoo Keeper
PLENARY KEYNOTE: How to Create Massive IMPACT and be an Effective Zoo Keeper

In an ever-changing world, and with pressures that come from a global source, how do we make sure our teams are “in the room” and making an IMPACT?  Nigel will share his 6-stage approach for keeping people energised, focused and, most importantly, achieving results.
He will also include a fun interactive communication session that will have delegates talking about it for days, weeks and months to follow.
In his unique style, he will identify everyone in the room and share with them how to manage the animals in their workplace by being an effective zoo keeper.
  • The power of focus
  • The cost of internal terrorists
  • The importance of communication

Nigel Risner, Motivational & Inspirational Speaker

Nigel Risner

Motivational & Inspirational Speaker

Nigel Risner is a respected author, television presenter and prolific speaker, and the only motivational speaker in Europe to have been awarded Speaker of the Year from The Academy For Chief Executives, Vistage, Footdown and The Executive Committee. Nigel’s workshops and keynote speeches are results-oriented, challenging his listeners to expand their horizons, embrace the opportunities that await them and dare to dream of achievements which seemed impossible before.  Nigel speaks at over 150 conferences a year, in over 18 different countries. His recent clients include: BT, BSKYB, PFIZER, Pepsi-Cola, GSK, Siemens, HSBC Bank, The Academy for Chief Executives and many more. He is one of only six speakers in the UK to have been awarded the highly prestigious PSAE (Professional Speaking Award of Excellence) from the Professional Speakers Association.

10:00 - 10:30
Networking Break & Exhibits
10:30 - 11:15
MDM Keynote: Field Reports for 'Top 20' MDM Solutions
MDM Keynote: Field Reports for 'Top 20' MDM Solutions

Evaluating MDM solutions is comparable to purchasing your first home— too many new variables, a lack of transparency in pricing, and high-pressure sales tactics. On top of this pressure, IT executives have to contend with the marketing dogma of ongoing “stack wars” among the mega vendors and the dogmatic “we are the world” viewpoints of MDM and (even) Business Process Management (BPM) vendors.
To further enliven the MDM evaluation process, enterprises are pressed to decide between the safety of mega vendor solutions (slow to innovate, high price tag, we-are-the-world mentality) and the appeal of best-of-breed solution providers (productivity of graph/semantic UIs, innate Cloud and Big Data support, relatively small software firms, etc.).  During 2017-18, many more enterprises will face these trade-offs as they embark on their MDM journey.  This session will focus on the why and how of MDM platform technical evaluations by providing insight into:
  • Understanding the pros & cons of the dominant architectural models & evaluation criteria— e.g., pro-active Data Governance, identity resolution, hierarchy management, scalability, Big Data & Cloud integration capabilities, etc.
  • Assessing the vendor landscape— e.g., registry, data hub, ultra-hub, SOA-based web services, data service provider, BPM-centric, etc.
  • Applying a rigorous methodology to product evaluations for both mega vendor solutions (IBM MDM, Informatica MDM, Microsoft MDS, Oracle MDM, SAP MDG) and more pure play (Ataccama, Enterworks, IBI MD Center, Magnitude, Orchestra Networks, Riversand, Semarchy, Stibo, Talend, Teradata, TIBCO, VisionWare, et al)

Data Governance Keynote: Key Factors in Successful Data Governance
Data Governance Keynote: Key Factors in Successful Data Governance

Since 2012 Grundfos has been on a journey to implement Data Governance across the entire organisation. Along the way, lessons have been learnt and they have identified some of the key factors in demonstrating the business value of Data Governance. These are the critical elements that have made a real difference in obtaining and maintaining executive management attention and sponsorship for Data Governance in Grundfos. In this session, delegates will learn:

  • How Grundfos started and the journey they have been on
  • How Data Governance is organised and who is driving it
  • The Data Governance Framework in which they operate
  • What the key factors are that have made Data Governance a success
  • What they have achieved and where they still have challenges

MDM Track 1
MDM Track 2
Data Governance Track 1
Data Governance Track 2
11:20 - 12:05
Agility to Support Changing Business Needs
Agility to Support Changing Business Needs

The world has become smaller.  Businesses, especially in Manufacturing, need to adapt and change more quickly to meet ever-increasing marketplace demands, which in turn may propagate a spiral of change that could become costly as additional resources are thrown at projects to reduce time-to-value.  But what can we do to ensure our functions and support models have the agility to support ever-changing business needs? This session will provide insight into why strategic partnerships are so important by discussing the following topics:
  • Justifying the importance of MDM outside of IS/IT functions
  • Focusing on the importance of key expertise within deployment teams
  • Providing the necessary Governance model to accommodate frequency of change

Bradley Smith, Group Master Data Services Manager, Meggitt

Bradley Smith

Group Master Data Services Manager, Meggitt

As Group Master Data Services Manager, a role at the interface of Business and IT, Bradley Smith is the key sponsor for the SAP MDG implementation in Meggitt, a UK-based global manufacturer of civil and military aerospace sub-systems with revenues of £1.6B and nearly 11,000 employees. His responsibilities include the development and deployment of Meggitt’s master data strategy, the successful rollout of MDG, evangelising the business value of quality data, and driving Meggitt’s maturity in Data Governance and MDM.  Moreover, Mr. Smith has global responsibility for the SAP ERP modules Manufacturing, Inventory & Plant Maintenance.  Prior to Meggitt, he held management roles at Ultra Electronics Controls and Global Orthopaedics.  Mr. Smith received his MSc in Technology / Business Management from Kingston University.

Implementing Master Data Governance in Large Complex Organisations
Implementing Master Data Governance in Large Complex Organisations

Coloplast is a leading medical device company with revenues of approximately €1.8 billion and 10,000 employees worldwide.  The company develops products and services to make life easier for people with deeply personal and private medical conditions.  2013 was the year that master data governance was placed on the Coloplast agenda, with the purpose of establishing a master data governance framework and a product hub to enable accurate and reliable master data through ownership, processes and approved definitions.
This session will cover the journey of master data governance within Coloplast: our lessons learned, our challenges and our successes.  In this session, delegates will learn best practices concerning:
  • Initiating the Master Data Governance journey via coordination of Senior Business & IT Stakeholder management
  • Developing & instantiating a Master Data Governance model for a large, complex organisation
  • Focusing on both the operational & strategic levels to ensure Master Data Governance success

Charlotte Gerlach Sylvest, Senior Master Data Governance Manager, Coloplast

Charlotte Gerlach Sylvest

Senior Master Data Governance Manager, Coloplast

Charlotte Gerlach Sylvest has been Senior Master Data Governance Manager, Compliance and Systems at Coloplast since 2014.
She has been working with Master Data Management and Master Data Governance at both a strategic and an operational level since 2008 and has more than 15 years of experience with IT system implementations and process improvements at the intersection of IT and Business.

Governing Asset Data to Reduce TCO
Governing Asset Data to Reduce TCO

As with many airports, asset maintenance forms a significant portion of operating costs. To reduce costs (e.g. landing fees), Schiphol has started a program to obtain more value from its assets while lowering TCO: Plan, Design, Build and Maintain the asset smarter than before. This also initiated a Data Management program – after all, one cannot predict maintenance without reliable data. The purpose of the program is to allow the asset manager to perform analysis, at asset level, of the cost, function (benefit) and risk associated with each asset. To improve the available data, a large data quality program has been running for over a year.
Key take-aways:
  • Implementing data governance in an asset management environment
  • Dealing with ‘asset’ as a master data object
  • MDM as a starting point for analytics and using analytics as key component in upgrading data quality
  • Transitioning from a data orientation to a data & process orientation

Rolf Emmens, Data Quality Officer, Amsterdam Airport Schiphol

Rolf Emmens

Data Quality Officer, Amsterdam Airport Schiphol

Rolf Emmens has a degree in Computer Science and IT Auditing. He has been in Data Management since early 2010. Rolf has substantive knowledge on the subject of data quality and has been implementing data governance at several organizations including Getronics and Schiphol. Within Schiphol Rolf is responsible for Asset Data Quality.

Stefan Van der Weide, Manager Asset Data Management & Business Analytics, Amsterdam Airport Schiphol

Stefan Van der Weide

Manager Asset Data Management & Business Analytics, Amsterdam Airport Schiphol

Within the Asset Management department, Stefan van der Weide is responsible for the full scope of Data Governance, including even the entry of the most valued data. He also heads the team that turns the data into real business value, consisting of Business Process Engineers & Analysts and Data Scientists and Analysts. Although improving Data Quality is his primary target, he is inspired on a daily basis by the team working with the data: combining business and data knowledge delivers value, even when Data Quality is not yet at the right level. Stefan started his career with Capgemini and has worked in various (project) management roles within the Royal Schiphol Group. He received his degree from Rotterdam Business School, majoring in International Management with a minor in Cultural Anthropology.

Establishing Data Management within a Large Regulated Utility
Establishing Data Management within a Large Regulated Utility

National Grid is a large regulated utility operating with large volumes of different types of data in two distinct territorial operations. Their drive to create the case for an improved approach to Data Management helped articulate the role of data in the organisation.  Increasingly, their conversations have improved senior leader recognition that data is one of their most important corporate assets.  Robust data provides the basis for informed decision making and enables them to measure and improve performance, underpinning the delivery of strategic objectives.  The effective management of data is also essential to the delivery of safe, seamless and efficient services to their customers.
In this session we will:
  • Explain the core ingredients which made their most recent approach successful
  • Describe how they created a ‘pull’ for data management (rather than relying on the traditional ‘push’ approach)
  • Provide analysis of the rationale and benefits of each of the core ingredients in their approach
  • Suggest how these could help you shape data management strategies in your own organisations

Jacqueline Harrison, Data & Information Manager, National Grid

Jacqueline Harrison

Data & Information Manager, National Grid

Jacqueline Harrison has worked to deliver data-focused business change in several sectors including telecoms, finance, pensions, and fuel charge cards before joining National Grid in 2008.  In all of these, the same challenge exists: to establish sufficiently controlled and managed data to achieve a target business outcome.  As Data and Information Manager at National Grid, Jacqueline has been core to leading the case for change to establish improved data management in National Grid.  More recently, she has developed corporate-wide Data Management principles, developed a suite of minimum standards and authored guidelines to support business implementations.

Chris Bradley, Information Strategist, Data Management Advisors Ltd

12:05 - 13:30
Lunch, Exhibits & Perspective Sessions
12:30 - 12:55
Perspective Session - How Do We Adapt MDM Approaches to Data Governance in a Big Data World?
Perspective Session - How Do We Adapt MDM Approaches to Data Governance in a Big Data World?

The typical MDM initiative starts with the assumption that data can be mastered – the clue is in the title!  Driven by regulatory (know your customer) or marketing (cross-selling) needs, resources are focused on mastering reference lists of customers and products to deliver the required data join-up. With known internal data sources, and control over the data designs, this approach has often been successful.

In a big data world, the lack of control over data and data design makes this approach more difficult.  The three Vs (volume, velocity and variety) have become well established as a taxonomy for understanding the nature of the data challenge, and this now seems useful for analysing the data integration challenge as well.  The approach to implementing MDM can be adapted based on increasing complexity in each of these dimensions.

As well as the increasing data complexity, the required type of analysis has increased in complexity in recent years.  Where a reporting and forecasting capability was previously considered state of the art in managing a business, the use of advanced analytics has now become the expectation with machine learning and neural computing becoming commoditised offerings.

This work considers how traditional MDM technology creates an initial data platform for delivering in a big data domain.  The investigation deployed various versions of IBM MDM technologies to explore approaches to meeting the challenges of increasing complexity in both data and analytics requirements.  It demonstrated that consistent results can be generated when migrating from relational to big data matching technology, and that this increases the value of integrating external data of lower quality.

The use of big data matching also generates a number of new opportunities where multiple matching thresholds can be stored within the MDM, enabling multiple use cases to be deployed against a single store.  This contributes to a new wave of change in delivering new insights through data and analytics.
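
As a minimal sketch of the multiple-threshold idea, assuming purely illustrative use cases and values (these are not IBM MDM configuration settings), the same persisted match score can be interpreted against a different threshold for each consuming use case:

  # Hypothetical sketch: one stored match score, several use-case thresholds.
  USE_CASE_THRESHOLDS = {
      "compliance_screening": 0.95,  # strict: only near-certain matches auto-link
      "marketing_dedup":      0.80,  # tolerant: looser matches are acceptable
  }

  def link_decision(match_score: float, use_case: str) -> str:
      """Interpret a persisted match score against a use-case-specific threshold."""
      threshold = USE_CASE_THRESHOLDS[use_case]
      if match_score >= threshold:
          return "auto-link"
      if match_score >= threshold - 0.10:
          return "clerical-review"  # borderline pairs go to a data steward
      return "no-link"

  score = 0.87  # a score computed once and stored in the hub for a candidate pair
  for use_case in USE_CASE_THRESHOLDS:
      print(use_case, "->", link_decision(score, use_case))
  # compliance_screening -> clerical-review
  # marketing_dedup -> auto-link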

John Holland, Head of Data & Analytics Architecture, The Home Office (on behalf of Entity)

John Holland

Head of Data & Analytics Architecture, The Home Office (on behalf of Entity)

John Holland has worked in IT for over 25 years with a focus on driving business transformation through better use of data.  His first experience of MDM processes was with a CRM deployment in a multinational business, requiring the integration of data from 20+ disparate structured and unstructured databases.  In recent work, he has been driving the integration of data to deliver an analytics capability in a public sector organisation.  His experience with building this data platform spans the challenges of diverse data sources, variable data quality and NoSQL databases.  John’s insight into these challenges envisages a future where emerging MDM technologies will provide a platform for delivering organisational transformation from improved understanding of internal and external data.

Perspective Session - TBA
13:00 - 13:25
Perspective Session - The Right Approach to Achieving the Highest Returns from MDM
Perspective Session - The Right Approach to Achieving the Highest Returns from MDM

Organizations have historically struggled with controlling implementation time, cost and risk, and with deriving value from MDM initiatives. This is why many organizations have a less than favorable view of investment in MDM. However, the right approach can quickly deliver high returns while minimizing risk.
This session will focus on proven methods for helping organizations quickly implement and derive value from MDM and will cover best practices for these three critical areas:
  • Planning and Preparation
  • Implementation and Integration
  • Extending MDM to Enable Specific Business Use Cases

Michael Ott, Senior VP, Innovative Systems

Michael Ott

Senior VP, Innovative Systems

Michael M. Ott is a Senior Vice President of Innovative Systems, Inc., a global leader in the development and delivery of MDM, Data Governance, Data Quality, and AML compliance solutions.  Mr. Ott has extensive experience in executive management, marketing, business development, product management, and software development. He currently oversees the Data Management, MDM, and business solutions areas for Innovative Systems. For 30+ years, Mr. Ott has been involved in managing the delivery of MDM, risk management, data governance, and data quality consulting and solutions to some of the largest companies in the world.  He frequently speaks on topics related to achieving better business results through more effective data management and data governance approaches. Mr. Ott received a Bachelor of Economics degree from Allegheny College and a Master of Business Administration degree from the University of Pittsburgh with a dual concentration in Marketing and Information Systems.

Perspective Session - TBA
13:30 - 14:15
Field Reports for 'Top 10' MDG Solutions
Field Reports for 'Top 10' MDG Solutions

Master data, reference data, metadata.  It is universally agreed that Data Governance (DG) is critical to achieving sustainable and effective MDM (also RDM and MM).  Failure to execute DG concurrently with an MDM program greatly decreases the probability of success and the economic sustainability of MDM programs. Clearly, DG is both synergistic and co-dependent with MDM.  When deploying MDM, a proper DG discipline should consider the business drivers, project scope, roles and people filling each role, policies and procedures, data quality, inheritability, social norms, and the business operating model.  Moreover, DG is more than a single product or process; rather, it is an ecosystem of products, processes, people, and information.  At present, DG for MDM is moving beyond simple stewardship to convergence of task management, workflow, policy management and enforcement.
Then why are both the mega vendors and boutique/best-of-breed vendors still lagging in this critical area?  Understanding the scope, diversity and limitations of current DG solution offerings for master data is tremendously challenging – even more so, given the fast pace of M&A & complexities of integrating such diverse software portfolios.  Nonetheless, business and IT leadership chartered with defining and executing MDM programs need help to understand and navigate through the number and variety of DG options.   Moreover, why the marketing dogma and confusion over such mundane concepts as “integrated” and “pro-active”?
Through 2017-18, most enterprises will struggle with enterprise DG while they initially focus on Customer, Vendor, or Product; integrated enterprise-strength DG that includes E2E data lifecycle will remain elusive as most organisations turn to lightweight glossaries with modest Data Steward workflows to support devolved autonomy and multi-disciplinary, bi-modal teams.  During 2018-19, the majority of MDM software and service providers will focus on productising such lightweight DG frameworks while mega MDM software providers will struggle to link governance process with process and data hub technologies.  By 2019-20, mega vendor DG solutions will finally move from “passive-aggressive” mode to “proactive” Data Governance mode.
This session will review the current solutions in the market as well as provide a “top 10” list of evaluation criteria for such solutions. Topics include:
  • Understanding the “top 10” evaluation criteria for DG of master data solutions — e.g., decision rights management, E2E lifecycle management, Big Data & ECM support, DQ/ETL integration capabilities, etc.
  • Assessing the vendor landscape— e.g., passive, active, integrated, pro-active, & passive aggressive, etc.
  • Determining an enterprise-specific road map to evolve from a siloed, motley collection of DQ tools, processes & point products to a non-obtrusive enterprise DG capability

PIM as the Core of Digital Eco-Systems
PIM as the Core of Digital Eco-Systems

Dorel Juvenile is the world’s leading juvenile products company. The company’s products are available in more than 100 countries and developed and supported by 7,000 highly-driven professionals in 25 different countries.
Having a single source of truth for multi-channeling in a complex organization was the key driver in selecting and implementing a PIM system for Dorel.  The main goal was to deliver data centralization and access management, to increase the efficiency of content management across brands, channels and countries, and to improve the quality of content.
How does Dorel respond to the various channels they want to serve? How can content management be used efficiently and effectively despite the complexity of multiple brands and the variety of stakeholders? What is the effect of having qualitative content on the business goals, e.g. conversion rate?
Digital content management is a journey where opportunities pop up along the way.  During this presentation the tool will not only be described from a technology point of view; the importance of connecting people, process, content and tools to iteratively roll out a successful PIM system will also be covered. Best practices will show how data consistency and a single source of truth lead to a higher conversion rate and also unlock opportunities for better internal communications.  Topics to be discussed include:
  • Enabling an ensured customer experience by providing consistent, meaningful & context-optimized master data to every channel & every customer need – current & future
  • Providing high-quality golden records beyond the article/SKU level by also providing different aggregation levels such as: bundles/kits, sets/marketing products & other perspectives that customers & marketing have
  • Ensuring a consistent business glossary across all applications for attributes, class models & taxonomies – including a central set of tags to ensure comparability & consistence across applications for search, filtering & reporting

Naima El Omari, Information System Project Manager, Dorel Juvenile

Naima El Omari

Information System Project Manager, Dorel Juvenile

Naima El Omari is currently Project Manager for New Product Development, Marketing and Quality at Dorel Juvenile Group.  Prior to this, Ms. El Omari was ICT Manager at Intrasurance Technology Services.  She has also held roles at ProPlanet, Societe Generale, and BNP Paribas Personal Finance.  In addition to her TOGAF certification for Enterprise Architecture, Ms. El Omari received her Master’s degree in Information Technology from Université Paris-Est Créteil (UPEC).

Creating and Sustaining a Data Governance Function at The Co-operative Bank
Creating and Sustaining a Data Governance Function at The Co-operative Bank

Every organisation has its own vision and has developed a unique strategy to fulfil it. Each business operates with different values, ethics, principles and people, making its data governance needs distinct from everyone else’s.
By better understanding your organisation’s drivers you can use aligned key data metrics to motivate and sustain the right-fit data governance function for the company, and when all else fails you can always play the unfailing data governance trump cards.
If you attend this session you will gain an understanding of how to use your company’s data and strategy to motivate and sustain the right-fit data governance function for your company.
  • How the data governance function was created and sustained at the Co-operative Bank
  • Data governance is not one size fits all
  • Data governance needs will vary per organisation
  • Data metrics that can be used to support data governance
  • When all else fails use your data governance trump cards

Suzanne Coumbaros, Head Data Governance, The Co-operative Bank

Suzanne Coumbaros

Head Data Governance, The Co-operative Bank

Suzanne Coumbaros is a data management professional with many years of experience in data governance, data architecture, data warehousing, business intelligence, data quality, data development and data strategy in a variety of organisations. Originally a computer programmer, statistician and mathematician from Cumbria, UK, she has worked for many government-led organisations as well as publicly and privately owned companies across the UK and Africa. Her background comes from having created data governance teams in different organisations and countries.

A Data Governance Roadmap, Building From a Single Customer View
A Data Governance Roadmap, Building From a Single Customer View

Following the merger of Dixons and Carphone Warehouse, the Customer Data Management function is leading a program to maximise the value of data to generate strategic insights and better customer experiences across multiple channels. In this session we will describe the journey undertaken by Dixons Carphone to date, the roadmap and the longer-term vision, covering:

  • Creation of a Single Customer view from numerous legacy systems, spanning 6 organisational brands
  • Establishment of a wider Data Governance organisation
  • Developing the supporting people and process capabilities

Fiona Healy, Senior Business Analyst, Dixons Carphone

Fiona Healy

Senior Business Analyst, Dixons Carphone

Fiona Healy has over 11 years of information technology experience with emphasis on a few key areas such as IT Operations, Delivery, System Integration and, most recently, Data Governance, Data Quality, Data Security and Information Management.  Fiona was involved from inception with the development of Dixons Carphone’s Single Customer View. Her initial role was with the Agile delivery team, prioritising the work and liaising with the stakeholders to capture business and data requirements. Fiona went on to collate the security requirements for the final solution.  Fiona is now leading an effort to establish a centre of excellence to provide data management services to the organisation, including data quality, data governance, data security and protection policy and process, involving the full extent of people, process and technology.

Mary Drabble, MDM Consultant & Customer Success Manager, Agile Solutions GB

Mary Drabble

MDM Consultant & Customer Success Manager, Agile Solutions GB

Mary Drabble has more than 15 years’ experience in Information Management and has helped clients across all industries in a wide variety of engagements ranging from Analytical to Operational data and information management solutions. Mary has a proven track record in Master Data Management, Data Governance and Data Quality tools, methodologies, architectures and processes and joined Agile Solutions as a Customer Success manager, focussed on collaborating with customers, peers and partners to achieve successful project outcomes.

14:20 - 15:05
Making Master Data Fashionable - Transforming Information Chaos into a Governance-Driven Culture
Making Master Data Fashionable - Transforming Information Chaos into a Governance-Driven Culture

BESTSELLER is a family-owned clothing and accessories company founded in Denmark in 1975, which provides fast and affordable fashion for women, men, teenagers and children.  The company’s products are available online, in branded chain stores and in multi-brand and department stores across the world.  For a global company with diverse sales channels, “customer data standardization” is crucial to support business reporting.  BESTSELLER’s Global Master Data department was tasked with standardizing all customer data for our different subsidiaries. This effort required centralizing most of the data lifecycle processes – not only for customer data but also vendor and location data – in order to align across all companies.  Now BESTSELLER is undertaking another data management transformation, one that will move the business towards a decentralized MDM model.  In this session, you will learn about the challenges faced by the BESTSELLER MDM organization, lessons learned and best practices you can apply inside your own organization.  Some of the topics that will be discussed include:

  • Evolving a central MDM team into a Global Governance function
  • Developing the business & organizational justification for decentralized MDM
  • Designing a practical approach to creating your own best practice

Erika Bendixen, MDM Team Leader, Bestseller, A/S

Erika Bendixen

MDM Team Leader, Bestseller, A/S

Erika Bendixen is MDM Team Lead at BESTSELLER, a family-owned clothing and accessories company based in Denmark.  As part of BESTSELLER’s Global Master Data department, she is responsible for Data Governance and leads the team that oversees all data lifecycle processes, including documentation, availability and maintenance.  In her role, Ms. Bendixen works closely with the IT organization to ensure standardized data lifecycle processes across companies.  Prior to BESTSELLER, she held roles at DANFOSS in Logistics, Planning and Buying.  Ms. Bendixen holds graduate and postgraduate degrees from ITESM and Århus School of Business (DK).

Retail Grocery in a Digital World
Retail Grocery in a Digital World

With over 350 branches, Waitrose is the sixth largest grocery retailer in the UK. Recently, Waitrose introduced a digital transformation strategy, creating a data lake foundation to guide the product life cycle.  By bringing together both the Product and Customer domains, the Waitrose customer experience will evolve from a collection of individual channels into an ecosystem of integrated data and services.
Waitrose has started its digital transformation with the implementation of MDM and Data Governance as the foundation of the change, growing into an ecosystem of integrated services.  This session will cover the Waitrose experience through its MDM journey, with pitfalls and successes, and will include:
  • Applying MDM as the foundation to future transformation
  • Overcoming cultural change challenges formed by transformational data programmes
  • Leveraging lessons learned from the implementation of MDM & Data Governance – the journey so far

Caroline Schofield, Product Data Manager, Waitrose

Caroline Schofield

Product Data Manager, Waitrose

Caroline Schofield has led the implementation of product data governance within Waitrose and for the last 2 years has been working on the implementation of a product MDM from a data perspective. Prior to this, Caroline was a senior business user with 10+ years of experience within Commercial at Waitrose, which has given her insight and expertise into how product data is used across the business and the opportunities and challenges that the implementation of MDM and data governance bring. Caroline has championed data quality for many years at Waitrose and brings the perspective of the business to changes to the underlying data structure and the product life cycle.

Ivo-Paul Tummers, Managing Director, Jibes

Ivo-Paul Tummers

Managing Director, Jibes

Ivo-Paul Tummers, BSc/MBA, held several international positions in mechanical engineering in the aerospace/defence industry. In the mid ‘90s he became involved in resource planning, in the era when middleware was introduced. From this perspective he was an early adopter of SOA principles, recognising the increasing demand for accurate, complete, in-context and actionable decision streams.  This resulted in the foundation of JIBES. Today Jibes is a leading European implementation partner with a focus on data integration, governance & data analytics.

Data Governance for Big Data Analytics
Data Governance for Big Data Analytics

The Amsterdam ArenA is a large football stadium in the Netherlands. In its daily operations, the ArenA collects a vast amount of data on a range of activities, such as solar panels, lights, Wi-Fi browsing, purchasing behaviour and much more. As part of its innovation platform, Amsterdam ArenA has set up a big data analytics platform called KAVE on Azure in collaboration with KPMG. The large processing capacity of the platform allows it to be the home for large-scale big data analyses, as well as data from external providers.
Although it is an extremely intriguing idea to collect and analyse data from companies, government and the ArenA itself, several challenges arise, such as: How do they mitigate potential compliance issues that may arise from combining different datasets? How can they assure external data providers that their data is not used for purposes other than those intended, even in big data analytics? Who is the owner of the data on the platform, and how do they convince data providers that the platform operations are properly controlled? And at the same time, how do they make sure they do not take away the innovative, playground character of the platform by being too strict?
To facilitate this, ArenA designed and implemented data governance for big data analytics, including a governance organization and a data lifecycle process. In this session they will share their solution and experiences.  Attendees will learn:
  • Key governance challenges with innovative big data analytics, such as data privacy, user access and data ownership
  • How to gain control over the application of big data analytics within an enterprise using data governance
  • Balancing the controlled environment created by data governance and the innovative “playground” of big data analytics

Bas-Jeroen Busscher, CEO of Arena do Brasil and DG Lead and Corporate Lawyer, Amsterdam Innovation ArenA

Bas-Jeroen Busscher

CEO of Arena do Brasil and DG Lead and Corporate Lawyer, Amsterdam Innovation ArenA

Bas-Jeroen Busscher is an experienced lawyer and contract manager in the field of large-scale infrastructure projects. In the past 25 years, he has been involved in almost all large-scale infrastructure projects in the Netherlands. Bas-Jeroen was involved in the opening of Amsterdam ArenA in 1996 as part of the Management Team. As Director of Operations, he was responsible for the new World Cup stadiums in Salvador da Bahia [Arena Fonte Nova] from 2012 until after the World Cup in 2014, and in Natal [Arena das Dunas] from 2014 to 2015. Bas-Jeroen is now CEO of Arena do Brasil, the Brazilian subsidiary of Amsterdam ArenA, as well as Manager of the Projects Bureau of Amsterdam ArenA. He is also the Data Governance Lead and Corporate Lawyer for the Amsterdam Innovation ArenA.

Nick Martijn, Senior Consultant, Enterprise Data Management, KPMG IT Advisory

Nick Martijn

Senior Consultant, Enterprise Data Management, KPMG IT Advisory

Nick Martijn is a senior consultant in the Enterprise Data Management department of KPMG IT Advisory in the Netherlands. He advises companies in various sectors on the development and implementation of data governance, as well as on other data management topics such as Master Data Management, data management strategy and data quality improvement. Together with Amsterdam ArenA, Nick has developed and implemented data governance for its big data analytics environment, a case that follows the latest trends in data management.

Implementing a Data Governance Framework – When One Size Does Not Fit All!
Implementing a Data Governance Framework – When One Size Does Not Fit All!

The aim of the session is to explain the challenges faced when we started to introduce a new Data Governance framework into the Type 26 programme at BAE Systems Naval Ships, and how we needed to adapt our approach to meet, head on, the challenges of an ingrained culture struggling to adapt to change and to think holistically about the data used on the programme. It is envisaged that by the end of the session delegates will have learnt:

  • Recognising the signs that the standard approach is failing to make an impact
  • The steps needed to adapt your approach whilst ensuring the ultimate goal is achieved
  • Getting your organisation/business to understand and utilise the value that Data Governance brings

Nathan Young, Data Quality & Governance Manager, Maritime - Naval Ships BAE Systems

Nathan Young

Data Quality & Governance Manager, Maritime - Naval Ships BAE Systems

Nathan Young has worked in the defence industry for 27 years, 25 of those in the data management field. Nathan spent 25 years at Rolls-Royce, predominantly in data management roles, rising to the position of Data Delivery Manager before leaving the company in 2015. During his time at Rolls-Royce, he saw significant changes to the defence market and helped to adapt the data management service to meet the challenges of today. In 2015 he joined BAE Systems – Naval Ships in the role of Data Governance Manager within the Data Quality & Governance team, tasked with implementing a Data Governance Framework on the Type 26 programme. After an organisational change, Nathan is now the Data Quality & Governance Manager heading up the team.

15:05 - 15:30
Networking Break & Exhibits
15:30 - 16:15
Using a Graph to Manage Multi-Dimensional Customer Data to Discover the Single Point of Truth
Using a Graph to Manage Multi-Dimensional Customer Data to Discover the Single Point of Truth

With over 650 team members, GBG is a global leader with 24 locations across 15 countries in Europe, the Americas, Asia-Pacific and Africa.  GBG provides solutions to many of the world’s biggest organisations, from established brands like HSBC and Zurich Insurance to disruptive newcomers such as Stripe and Xpress Money.
GBG is focused on Identity Data Intelligence, enabling informed decisions between people and organisations globally.  Using graph technologies, GBG’s solutions capture the richness of the modern world to improve data clarity and help build better relationships with customers.  Topics to be discussed include:
  • Understanding graph advantages over more traditional technologies
  • Applying Graph & NoSQL technologies to simplify MDM
  • Enhancing an MDM solution to drive out more business value
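
For readers new to the graph approach, the following minimal sketch (generic Python using the networkx library, with entirely hypothetical records and matching rules, and not a description of GBG’s actual solution) illustrates the underlying idea: customer records from different source systems become nodes, shared identifying attributes become edges, and each connected component is a candidate cluster for a single point of truth.

```python
# Minimal sketch: resolving a single point of truth from multi-source
# customer records by modelling matches as a graph (hypothetical data).
import networkx as nx

# Customer records from different source systems (illustrative only)
records = {
    "crm:1001":  {"email": "a.smith@example.com", "phone": "+44 20 7946 0001"},
    "web:77":    {"email": "a.smith@example.com", "phone": None},
    "billing:9": {"email": None,                  "phone": "+44 20 7946 0001"},
    "crm:1002":  {"email": "j.doe@example.com",   "phone": "+44 20 7946 0002"},
}

g = nx.Graph()
g.add_nodes_from(records)

# Add an edge whenever two records share a non-empty matching attribute
ids = list(records)
for i, a in enumerate(ids):
    for b in ids[i + 1:]:
        for attr in ("email", "phone"):
            if records[a][attr] and records[a][attr] == records[b][attr]:
                g.add_edge(a, b, matched_on=attr)

# Each connected component is a candidate "golden record" cluster
for cluster in nx.connected_components(g):
    print(sorted(cluster))
# -> ['billing:9', 'crm:1001', 'web:77'] and ['crm:1002']
```

In a real MDM context the matching rules, survivorship logic and graph store would of course be far richer, but this component-based view is what makes multi-dimensional customer data tractable.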

Scott Benson, Head of Architecture, GBG

Scott Benson

Head of Architecture, GBG

Scott Benson heads up the architecture team within GBG and is responsible for defining its strategic technical direction.  Mr. Benson has over 12 years of experience with large-scale systems in the finance industry, government and other large organisations.  Prior to GBG, he worked at the consultancies Misys and Capgemini, where he had major project roles at HMRC, Heathrow Airport and Royal Mail.  As Head of Architecture at GBG, Mr. Benson helps set the strategic technology direction and governance, and assesses new technologies for use in GBG’s industry.  His current focus is graph technologies, along with other Big Data-capable technologies, to help GBG process its global datasets.

Field Reports for 'Top 10' RDM Solutions
Field Reports for 'Top 10' RDM Solutions

The impact of poor or non-existent reference data management (RDM) is profound.  Errors in reference data ripple outwards affecting quality of master data in each domain, which in turn affects quality in all dependent transactional and analytical systems.  Because reference data is used to drive key business processes and application logic, errors in reference data can have a major negative and multiplicative business impact.  More than 55% of large enterprises surveyed by the MDM Institute are planning on implementing RDM in the next 18 months.  This session will focus on the “why” and “how” of RDM by providing insight into: Why is RDM mission critical today?  How does RDM differ from (how is it similar to) MDM?  What are the top business drivers for RDM?  Where are most organizations focusing their RDM efforts?  Topics to be discussed include:

  • Understanding the pros & cons of commercial RDM solutions vs. custom-built (“Buy vs. Build”)
  • Applying a “top 10” evaluation criteria methodology to product evaluations for both mega-vendor solutions (IBM RDM Hub, Informatica, Oracle DRM) and pure-play solutions (Ataccama, Collibra, Magnitude, Orchestra, Software AG, Teradata, TopQuadrant, et al)
  • Planning for the future of RDM (dimension management for Big Data marts) & its relationship to overall MDM programs
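
To illustrate the “multiplicative” impact described above, here is a minimal, hypothetical sketch (plain Python, not tied to any of the vendors listed) in which a single wrong entry in a country-code reference table silently corrupts every customer master record and downstream revenue report that joins on it.

```python
# Minimal sketch: one bad reference-data entry propagates everywhere it is joined.
# Hypothetical data; illustrative only.

# Reference data: country codes -> names ("DE" is mistakenly mapped)
country_ref = {"GB": "United Kingdom", "FR": "France", "DE": "Denmark"}  # should be "Germany"

# Master data: customers keyed by reference codes
customers = [
    {"id": 1, "name": "Acme Ltd", "country_code": "GB"},
    {"id": 2, "name": "Beispiel GmbH", "country_code": "DE"},
]

# Transactional data: orders referencing customers
orders = [
    {"order_id": 101, "customer_id": 1, "amount": 1200.0},
    {"order_id": 102, "customer_id": 2, "amount": 800.0},
    {"order_id": 103, "customer_id": 2, "amount": 450.0},
]

# Analytical view: revenue by country -- the single bad code now skews reporting
cust_by_id = {c["id"]: c for c in customers}
revenue_by_country = {}
for o in orders:
    country = country_ref[cust_by_id[o["customer_id"]]["country_code"]]
    revenue_by_country[country] = revenue_by_country.get(country, 0.0) + o["amount"]

print(revenue_by_country)
# -> {'United Kingdom': 1200.0, 'Denmark': 1250.0}  # German revenue misreported as Danish
```

Centralising and governing such codes in an RDM hub means a fix like this is made once, at the source, rather than hunted down in every consuming system.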

Grooming Data Stewards
Grooming Data Stewards

Data Stewardship – the job that nobody wants but everyone has – so how do we make it real and get people to buy in to becoming a Data Steward?
Data Stewardship is very tough to sell. You are expecting individuals inside your organisation to take on an accountability that they may not have full control over, and to do so on top of their day job. This presentation provides a how-to case study on what you need to do, first to sell the concept, and second to groom colleagues to become great data stewards and sustain data governance in your organisation.
  • Selling the Concept
  • Defining the Good vs Not Good for a Data Steward
  • Training the Data Steward to become a Change Agent
  • Branding the Data Steward for instant recognition

Sue Geuens, Head: Data Standards & Best Practice Adoption, Barclays & President DAMA International

Sue Geuens

Head: Data Standards & Best Practice Adoption, Barclays & President DAMA International

Sue Geuens started in Data Management in 1996 when she was handed a disk with a list of builders on it and told they were hers to manage. Sue mentions this as fate taking over and providing her with what she was “meant to do”. Various data roles later, her clients numbered 3 of the top 4 banking institutions in SA, a number of telcos and various pension funds, insurance companies and health organisations. Sue was the initial designer of data quality matching algorithms for an SA-built data quality and matching tool (Plasma Mind). This experience has stood her in good stead as she has slowly but surely climbed the ladder in Southern Africa to become the first CDMP in the country. Sue worked tirelessly on starting up DAMA SA, with a successful inaugural meeting in February 2009; she was unanimously voted in as President just prior to this event. From that time on Sue has been the leader and driving force of the DAMA SA Board. Sue has been the DAMA International VP Operations since 2011 and is the DAMA International President for 2014. She is a sought-after presenter and has been a prominent speaker at EDW in 2009, 2010, 2011 and 2013, DAMA Brazil, DAMA Australia in 2013 and IRM UK in 2006 and 2011. Follow Sue on Twitter @suegeuens.

Getting Data Governance on Track at Network Rail
Getting Data Governance on Track at Network Rail

As one of the largest asset management organisations in Britain, Network Rail has an abundance of asset-related data. This data is itself an asset which, if not properly managed, can pose a serious risk to Network Rail’s ability to deliver a safe, reliable and efficient railway. Lack of confidence in the quality of the data by Network Rail personnel, regulators, and the public compromises the organisation’s ability to make effective data-driven decisions and erodes stakeholders’ trust in the railway.
Leveraging ISO 8000 part 150, the international standard for data quality management, Network Rail, in partnership with Deloitte and LSC Group, has developed a quality management system for asset-related data incorporating data governance and assurance. The management system takes a risk-based rather than compliance-driven approach to delivering required outcomes, supporting business units in embedding governance and data management principles in line with their business priorities, culture, and the devolved environment within which Network Rail operates.
Lessons learned from the session:
  • Developing a framework based upon leading industry practices from ISO 8000 and ISO 55000
  • A risk-based rather than compliance-based approach, providing clarity on risk exposure and increased flexibility in embedding the data governance framework
  • Embedding assurance into business practices and driving continuous improvement
  • Key lessons learned on building the business case for governance, stakeholder engagement and the focus on communications

Davin Crowley-Sweet, Professional Head of Asset Data, Network Rail

Davin Crowley-Sweet

Professional Head of Asset Data, Network Rail

Davin Crowley-Sweet is the Professional Head of Asset Data and Information for Network Rail. In this role Davin is accountable for setting the corporate governance and assurance framework for asset-related data. Over the past decade Davin has held a number of asset roles ranging across front-line maintenance management and asset reliability on the West Coast Mainline, delivery of multimillion-pound asset renewal programmes, leading award-winning data quality improvement initiatives, and development of strategic business plans for regulated investment periods.

Dr. Timothy King, Principal Consultant, LSC Group

Dr. Timothy King

Principal Consultant, LSC Group

Dr. Timothy King is Principal Consultant at LSC Group, working within a practice of 35 individuals who provide expertise in information and knowledge management across the Defence, Energy, Transport and Utilities markets. Tim is the chair and convener of the ISO working group that is developing ISO 8000 for data quality, a standard that enables organizations to implement a systematic and systemic approach to data quality management, resulting in data that conform to requirements. He is a Chartered Engineer, a Chartered IT Professional and a Fellow of both the IMechE and the British Computer Society.

Sara Monsef, Deloitte MCS Limited

Sara Monsef

Deloitte MCS Limited

Sara Monsef is a Manager at Deloitte specialising in the design and implementation of Data Governance Frameworks and Data Management Target Operating Models across diverse industries including Financial Services, Public Sector, Media and Human Resources. Sara has implemented data and reporting governance frameworks focusing upon roles and responsibilities, data definition analysis, governance processes, change controls and Master Data Management. Sara leads data governance training and development in the Deloitte UK Consulting practice. In her role as Data Governance & Assurance Design and Deployment Lead, she is supporting Network Rail to design, trial and deploy the quality management system to formalise the management and assurance of asset-related data, enabling optimal whole-lifecycle management of assets and providing fit-for-purpose asset-related information.

16:20 - 17:05
Building an Enterprise Reference Data Hub
Building an Enterprise Reference Data Hub

First Data is the world’s leading payment processing company and provides electronic commerce and payment processing services to financial institutions, governments and merchants in more than 100 markets around the world. The company’s growth strategy is based on acquisition, so its IT and data landscape has faced many challenges resulting from the incompatible or differing technical stacks of acquired organizations.  The Enterprise Reference Data Hub enables the whole company (including new departments working with Hadoop) to access and use data from a single managed source of truth across departmental lines.
Join this session to learn:

  • Building an Enterprise Reference Data Hub
  • Improving data quality to help save millions of euros in detecting fraud & billing errors
  • Leveraging patterns, partners, successes, failures & lessons learned for the journey ahead

Thomas Place, Director of Data Management & Governance, First Data Corporation

Thomas Place

Director of Data Management & Governance, First Data Corporation

Thomas Place is the Director of Data Management & Governance at First Data Corporation, where he leads enterprise data initiatives. Mr. Place currently focuses on pivoting the organization’s transactional data into a value-add asset through the development of an enterprise data lake, governance and quality controls, enabling new product development and the optimization of existing sales channels. Prior to First Data, he gained over a decade of experience leading enterprise data technology initiatives in capital markets on both the buy and sell side. Mr. Place holds a BSc in computer science from the University of Nottingham in England.

Establishing Master Data in a Federated & Outsourced Environment
Establishing Master Data in a Federated & Outsourced Environment

McDonald’s is the world’s leading global food service retailer with over 36,000 locations in over 100 countries. More than 80% of McDonald’s restaurants worldwide are owned and operated by independent local business men and women. The strength of the alignment among the company, its franchisees and suppliers (collectively referred to as the “System”) has been key to McDonald’s long-term success. By leveraging this System, McDonald’s has been able to identify, implement and scale ideas that meet customers’ changing needs and preferences. In addition, this business model enables McDonald’s to consistently deliver locally-relevant restaurant experiences to customers and be an integral part of the communities it serves.
In recent years, demand for data and accurate analysis has increased significantly within the Supply Chain function. With master data being at the core of any analysis, McDonald’s has been challenged to establish an MDM system with connected processes that ensure consistent and reliable Supplier and Product information across the system. This effort includes: establishing standards by utilizing GS1 industry norms, implementing a PIM system, and establishing global governance to include all three parties (Company, Franchisee, Suppliers).
This session will discuss McDonald’s learnings during this data journey, including the topics of:
  • Leveraging best practices derived from using industry norms
  • Establishing flexible & scalable governance structures to enable McDonald’s “System” business model
  • Deploying an agile PIM system that also supports other key domains such as Company & Franchisee

Establishing Data Governance for a Data Driven Future
Establishing Data Governance for a Data Driven Future

Thames Valley Housing is a growing organization with innovation in abundance. Data is everywhere and used in many different guises. We have a wide project portfolio and operate in an ever more regulated environment. We have ambitions to be a truly data-driven organization, but to get there we first have to understand our data, deliver it securely and meet compliance objectives such as PCI and GDPR. So, with so many priorities, how do you deliver value for the organization without getting bogged down by the necessities? This session will look at the steps we took to establish a data governance framework within a medium-sized housing association. We’ll walk through our data governance journey, how we got buy-in, and how we ensured users understood the value of their data.
Attendees will learn:
  • Establishing a data governance program
  • Growth vs Control, Security vs Flexibility, and Admin vs Insight
  • Where’s the data owner when you need them?

Douglas Silverstone, Thames Valley Housing

Douglas Silverstone

Thames Valley Housing

Douglas Silverstone has worked in the not-for-profit sector for over a decade, helping organizations of all sizes to identify how to get the most from technology and the data at their disposal. He is currently the Data Governance Manager for Thames Valley Housing, an organization with 15,000+ homes. His role is to establish a framework as a foundation for a data-driven future. He is responsible for data compliance (GDPR, PCI, FOI), information security, and data quality and integrity, as well as for proving the value of data and its criticality to any business strategy.

Developing Data Governance in a Large Middle East Bank
Developing Data Governance in a Large Middle East Bank

Nobody can be unaware of the critical importance of data in the Financial Services sector.  This is particularly true for Riyad Bank, where the effective and professional management of data is crucial to the success of the Bank.  At Riyad Bank they need to ensure that their data is genuinely managed as a business asset and, much like their other core assets, that it is subject to professional rigours and disciplines; and crucially that it is recognised across the whole Bank as an enterprise asset to be managed for the benefit of all stakeholders and, of course, linked to their core strategic initiatives.  An Enterprise Data Management program sponsored by the CEO has been established, and one of the early steps was to establish Data Governance, especially the organisation structures, roles and responsibilities, and crucially a stakeholder communication & engagement plan.

  • Hear how the case was made for the Data Governance strategy & the challenges they were seeking to address
  • Outline the steps to introduce Data Governance into the Bank in bite sized chunks
  • Describe why they selected Risk Data Aggregation as their pilot area and what they accomplished
  • How Data Governance fitted within the overall Enterprise Data Management program

17:10 - 17:20
Conference Close
Thursday 18 May 2017 : Post-Conference Workshops
08:30 - 09:00
Registration
09:00 - 16:30
Jump Start Your Information Strategy, Get a Grip in a Day
Jump Start Your Information Strategy, Get a Grip in a Day

Getting value out of your data while remaining compliant with regulations seems to be the next holy grail. In this full-day workshop Jan has packed the key messages of his 3-day seminar to allow you to get started in transforming your organisation. The workshop is packed with useful approaches that have been tested in the field and have proven to deliver value in a short period. You will be equipped with convincing arguments that allow you to engage both business and IT in your own company. Both value creation through analytics and data exploration and risk avoidance, including compliance, will be covered. The platforms for managing master and transactional data will be looked at, bringing the entire puzzle together in the operating model.  You will learn from attending the session:
  • Learn how to engage your business and have them take the lead and recognise the value of information
  • Learn how to adapt the organisation to make it information centric
  • Get more value out of your MDM projects
  • Learn how to redefine your Business Intelligence architecture
  • Learn how to get the benefits of Big Data
  • Learn how to define a metadata strategy
  • Select the proper Enterprise Information platform to support your information strategy

Successful Implementation of a Master Data Management Programme
Successful Implementation of a Master Data Management Programme

This workshop focuses on the key elements of an MDM programme that are needed for overall success.  It gives practical recommendations while at the same time providing a conceptual understanding of what is involved in these recommendations.  Both governance and management are covered, and emphasis is placed on how MDM fits into a larger business strategy and architectural setting.   The business needs of master data are described, including strategies for meeting manual and procedural needs.  The more technical details are fitted into this framework.  MDM programmes are rapidly evolving as new data possibilities emerge and enterprises demand more from MDM than they have previously.  These emerging challenges of MDM are addressed in detail, including how MDM supports data scientists and the relationship between Big Data and MDM.   Attendees will learn:
  • What Master Data is, how it differs from other classes of data, and what its special needs and challenges are
  • The structure of an MDM programme, including how the business needs to be aligned to data governance and data management within an MDM programme
  • How to deal with integration, semantic, history, quality, and other requirements in an MDM programme
  • MDM architectures, including what tooling can offer, and how these vary with different master data entities
  • The relationship of master data with reference data, and emerging areas of MDM

Sustaining your Data Governance Program
Sustaining your Data Governance Program

There is a lot of talk about starting a program for data governance, but not a lot has been discussed about keeping it going. This workshop is designed to provide you with practical insight on how to make Data Governance live in your organisation.
Data Governance programs take a lot of time, money and effort to get going, yet very often companies fail again and again to keep them going. They go back to basics and start all over again – different people, different sponsors, maybe even a different methodology – but starting again nonetheless. And of course, the rest of the business is frustrated waiting for Data Governance to work. This workshop is designed to help you take the next steps. We will focus on the following areas:
  • Identify where we are in the maturity curve
  • Prioritise the work to be done
  • Get people involved
  • Refine our communication strategy
  • Get HR involved in building the right KPI’s
  • Report our Quick Wins

Join Sue and find out how Data Governance can be sustained and become self-supporting.

GDPR One Day DPO Intensive: Key Skills for the Data Protection Officer
GDPR One Day DPO Intensive: Key Skills for the Data Protection Officer

The role of the Data Protection Officer (or Chief Privacy Officer for our North American cousins) will increasingly be a critical one in organisations processing personal data. The General Data Protection Regulation (GDPR), coming into force on 25th May 2018, makes it a mandatory role in certain circumstances, but it is generally recognised as good practice for organisations to have someone with responsibility for the oversight and governance of Data Privacy issues and obligations.
This workshop will take you through a detailed overview of the DPO as a Data Governance role. It will look at the key skills and knowledge a DPO must have. After a whistle-stop tour of Data Protection law principles, the workshop will then:
  • Examine Article 29 Working Party guidance on the role of the DPO and how that maps to good practice in Data Governance
  • Look at the role of the disciplines of the DMBOK wheel in effective Data Privacy Compliance
  • Demonstrate how Data Quality principles, practices, and methods can be applied by a DPO to support Privacy Impact Assessments and demonstrate effectiveness of compliance
  • Provide an overview of how effective data governance and stewardship practices are key to ensuring alignment of day to day information management with the requirements of data privacy compliance
  • Examine how Agile approaches to Governance and Master Data Management can help ensure a responsive and proactive data privacy governance environment for the DPO in your organisation.

Daragh O Brien, Castlebridge  

Daragh O Brien

Castlebridge  

Recently rated the 24th most influential person in Information Security worldwide on Twitter (http://www.onalytica.com/blog/posts/data-security-top-100-influencers-and-brands/), Daragh O Brien, FICS, is a leading consultant, educator, and author in the fields of Information Privacy, Governance, Ethics, and Quality. After over a decade in a leading telco, Daragh now works with clients in a range of sectors on a variety of Information Management challenges.  Daragh is a Fellow of the Irish Computer Society and a Privacy Officer for DAMA-I. He teaches Data Privacy Law and Practice at the Law Society of Ireland. Castlebridge is a commercial partner of the Adapt Centre in Trinity College Dublin and collaborates with the Insight Centre for Data Analytics, Europe’s largest analytics research group. Follow Daragh on Twitter @cbridgeinfo.


Fees

  • 4 Days: £1,945 + VAT (£389) = £2,334
  • 3 Days: £1,595 + VAT (£319) = £1,914
  • 2 Days: £1,245 + VAT (£249) = £1,494
  • 1 Day: £795 + VAT (£159) = £954

Group Booking Discounts
  • 2-3 Delegates: 10% discount
  • 4-5 Delegates: 20% discount
  • 6+ Delegates: 25% discount

UK Delegates: Expenses of travel, accommodation and subsistence incurred whilst attending this IRM UK conference will be fully tax deductible by the employer company if attendance is undertaken to maintain professional skills of the employee attending.

Non-UK Delegates: Please check with your local tax authorities

Cancellation Policy: Cancellations must be received in writing at least two weeks before the commencement of the conference and will be subject to a 10% administration fee. It is regretted that cancellations received within two weeks of the conference date will be liable for the full conference fee. Substitutions can be made at any time.

Cancellation Liability: In the unlikely event of cancellation of the conference for any reason, IRM UK’s liability is limited to the return of the registration fee only.  IRM UK will not reimburse delegates for any travel or hotel cancellation fees or penalties. It may be necessary, for reasons beyond the control of IRM UK, to change the content, timings, speakers, date and venue of the conference.

Venue

  • Radisson Blu Portman Hotel
  • 22 Portman Square
  • London
  • W1H 7BG
  • UK

Platinum Sponsors

Silver Sponsors

Standard Sponsors

Supported By

Association of Enterprise Architects   

The Association of Enterprise Architects (AEA) is the definitive professional organization for Enterprise Architects. Its goals are to increase job opportunities for all of its members and increase their market value by advancing professional excellence, and to raise the status of the profession as a whole.

BCS Data Management Specialist Group (DMSG)

The BCS Data Management Specialist Group (DMSG) helps Data Management professionals support organisations to achieve their objectives through improved awareness, management, and responsible exploitation of data.
We run several events each year whose focus areas include:
•    The benefits of managing data as an organisational asset
•    Skills for exploitation of data
•    Data governance as a ‘Business As Usual’ activity
•    Compliance with legislation, particularly that relating to data protection, data security and ethical usage of data
Our audience is anyone with an interest in the benefits to be gained from data. This includes: Chief Data Officers (CDO); Senior Information Risk Officers (SIRO); data managers/stewards; data governance officers; data protection/security advisors; data scientists; and business/data/database analysts.

DAMA International

DAMA International is a not-for-profit, vendor-independent association of technical and business professionals dedicated to advancing the concepts and practices for data resource management and enterprise information. The primary purpose of DAMA International is to promote the understanding, development, and practice of managing data and information to support business strategies.   As Data Management becomes more relevant to the business, DAMA is keeping pace with new products and services such as the 2nd edition of the DAMA Data Dictionary, the DAMA BOK (Body of Knowledge) and several new certification exams.  We are participating on the Boards of many academic and standards bodies and sharing our knowledge with other organizations.
DAMA International is pleased to announce that a new chapter is forming in Turkey which will join the 8 other European chapters as part of DAMA International.   DAMA International and its affiliated chapters have grown year after year with chapters operating in Australia, China, India, North America, South America, Japan, and South Africa and DAMA is facilitating the formation of new chapters in many other countries.
As a DAMA member you receive the benefits of your local or global chapter’s activities and all the benefits of DAMA International’s products and services. You can network with other professionals to share ideas, trends, problems, and solutions. You receive a discount at DAMA International conferences and seminars, and on associated vendor’s products and services. To learn more about DAMA International, local chapters, membership, achievement awards, conferences and training events, subscriptions to DM Review and other publications, discounts, job listings, education and certification, please visit the DAMA International web page at www.dama.org.  Both the DAMA UK chapter and DAMA International will have a meeting during the conference.  We invite interested parties to join this vital and growing organization.  More information can be found at www.dama.org or you can email me at president@dama.org.

DAMA UK

The drive for the future is to focus successfully on providing quality support to core members whilst guaranteeing sufficient financial income to ensure sustained activity.   The four areas which DAMA UK recommends addressing over the next two years are:
•    Academic – to survey UK organisations to understand their Data Management skill set needs and then induce academic institutions to supply them.
•    Data Quality (DQ) – to benchmark data quality standards in the UK, encourage development of business awareness of the importance of DQ and help develop DG metrics.
•    Government regulations versus data – to increase awareness of the legal implications of data management, assist organisations in reducing their legal liabilities and support ETA (and others) in lobbying for “data clever” legislation.
•    Data Standards – to survey requirements, then work with other organisations (eg BCS) to develop effective data standards.

EDM COUNCIL

About the EDM Council
The EDM Council is a neutral business forum founded by the financial industry to elevate the practice of data management as a business and operational priority. The prime directive is to ensure that all consumers (business and regulatory)  have trust and confidence that data is precisely what is expected without the need for manual recalculation or multiple data transformations. There are four programs of the Council:
•    Data Content Standards (FIBO): the standards-based infrastructure needed for operational management (identification, semantic language of the contract, classification).  We own the industry ontology for financial instruments and entity relationships and make it available as an open source standard
•     Data Management Best Practices (DCAM): the science and discipline of data management from a practical perspective (data management maturity, data quality, benchmarking).
•    Data Implications of Regulation: translating the legislative objectives of transparency, financial stability, compressed clearing and cross-asset market surveillance into regulatory objectives and practical reporting requirements.
•    Business Network: global meeting ground, CDO Forum and mechanism for sustainable business relationships
There are 135 corporate members of the Council (http://www.edmcouncil.org/councilmembers). We are governed by a board of 24 (http://www.edmcouncil.org/board). For more information visit www.edmcouncil.org.

Media Partners

DGPO

The Data Governance Professionals Organization (DGPO) is an international non-profit, vendor-neutral association of business, IT and data professionals dedicated to advancing the discipline of data governance.   The DGPO provides a forum that fosters discussion and networking for members and seeks to encourage, develop and advance the skills of members working in the data governance discipline.

The Data Governance Institute

The Data Governance Institute (DGI) is the industry’s oldest and best known source of in-depth, vendor-neutral Data Governance best practices and guidance. Since its introduction in 2004, hundreds of organizations around the globe have based their programs on the DGI Data Governance Framework and supporting materials. www.DataGovernance.com
Follow us on Twitter: @DGIFramework (https://twitter.com/DGIFramework)
Connect with us on LinkedIn: The Data Governance Institute (DGI) (https://www.linkedin.com/company/480835)

Belgian Association of Data Quality   

DQA is the Belgian Association of Data Quality professionals. Our goal is to bring together people interested in the DQ subject to share experience and knowledge.

ECCMA

Formed in 1999, the Electronic Commerce Code Management Association (ECCMA) has brought together thousands of experts from around the world and provides a means of working together in a fair, open and extremely fast internet environment to build and maintain global, open-standard dictionaries used to unambiguously label information without losing meaning. ECCMA works to increase the quality and lower the cost of descriptions through developing international standards.   ECCMA is the original developer of the UNSPSC, the project leader for ISO 22745 (open technical dictionaries and their application to the exchange of characteristic data) and ISO 8000 (information and data quality), as well as the administrator of the US TAG to ISO TC 184 (Automation systems and integration), TC 184 SC4 (Industrial data) and TC 184 SC5 (Interoperability, integration, and architectures for enterprise systems and automation applications), and the international secretariat for ISO TC 184/SC5. For more information, please visit www.eccma.org.

IT-LATINO.NET

IT-latino.net is the most important online Hispanic IT media network. With more than 120,000 registered users, we have become an important online IT business forum, organizing daily webinars and conferences on different technology issues. We regularly inform a strong IT community on both sides of the Atlantic: Spain and Latin America.

Modern Analyst

ModernAnalyst.com is the premier community and resource portal for business analysts, systems analysts, and other IT professionals involved in business systems analysis. Find what you need, when you need it. The ModernAnalyst.com community provides Articles, Forums, Templates, Interview Questions, Career Advice, Profiles, a Resource Directory, and much more, to allow you to excel at your work. From junior analysts to analysis managers, whether you integrate off-the-shelf products, perform systems analysis for custom software development, or re-engineer business processes, ModernAnalyst.com has what you need to take your career to the next level.

Technology Evaluation Centers

Technology Evaluation Centers (TEC) helps organizations choose the best enterprise software solutions for their unique needs—quickly and cost effectively. With detailed information on over 1,000 solutions in the world’s largest vendor database, TEC delivers a broad range of evaluation and selection resources to help ensure successful software selection projects. As impartial software evaluators since 1993, TEC’s expert team of analysts and selection professionals are involved in thousands of software selection projects every year. The TEC newsletter goes out to 920,000 subscribers and is available in 4 languages.  Visit TEC: www.technologyevaluation.com.  
Subscribe to the TEC Newsletter: http://www.technologyevaluation.com/newsletter-subscribe/

Silicon UK

Silicon UK is the authoritative UK source for IT news, analysis, features and interviews on the key industry topics, with a particular emphasis on IoT, AI, cloud and other transformative technologies.   The site is your guide to the business IT revolution, offering other resources such as jobs, whitepapers and downloads alongside its coverage.   Stay informed: register for the newsletters.

Via Nova Architectura

A number of thought leaders in the area of business and IT architecture have set up a digital magazine on architecture: Via Nova Architectura. Although started as an initiative within the Netherlands, the magazine is intended to reach all those interested in the area of architecture, wherever they live. Via Nova Architectura aims to provide an accessible platform for the architecture community and is meant to be the primary source of information for architects in the field. The scope of Via Nova Architectura is “digital” architecture in the broadest sense of the word: business architecture, solution architecture, software architecture, infrastructure architecture or any other architecture an enterprise may develop to realize its business strategy.

IQ International

IQ International (abbreviated as IQint), the International Association for Information and Data Quality, is the professional association for those interested in improving business effectiveness through quality data and information. All, including full-time practitioners, those impacted by poor data and information quality, and those who just want to learn more, are welcome!