Fees

  • 4 Days: £1,845 + VAT (£369) = £2,214
  • 3 Days: £1,495 + VAT (£299) = £1,794
  • 2 Days: £1,145 + VAT (£229) = £1,374
  • 1 Day: £695 + VAT (£139) = £834
Group Booking Discounts

  • 2-3 Delegates: 10% discount
  • 4-5 Delegates: 20% discount
  • 6+ Delegates: 25% discount
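For illustration, the pricing rules above can be sketched in a few lines of code. This is a hypothetical calculator, not an official booking tool; it assumes the group discount applies per delegate to the ex-VAT fee and that VAT is charged at 20%:

```python
# Sketch of the group-booking pricing rules above.
# Assumptions (not stated in the brochure): the discount applies to the
# ex-VAT fee for every delegate in the group, and VAT is charged at 20%.

VAT_RATE = 0.20
FEES_EX_VAT = {4: 1845, 3: 1495, 2: 1145, 1: 695}  # £ per delegate, by days attended

def group_discount(delegates: int) -> float:
    """Return the discount fraction for a group of the given size."""
    if delegates >= 6:
        return 0.25
    if delegates >= 4:
        return 0.20
    if delegates >= 2:
        return 0.10
    return 0.0

def group_total(delegates: int, days: int) -> float:
    """Total cost including VAT for a group booking."""
    per_delegate = FEES_EX_VAT[days] * (1 - group_discount(delegates))
    return round(delegates * per_delegate * (1 + VAT_RATE), 2)

# e.g. five delegates on the 4-day pass:
# 5 x £1,845 x 0.80 x 1.20 = £8,856.00
print(group_total(5, 4))
```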

Venue

  • 22 Portman Square
  • London W1H 7BG
  • UK


Agenda

Monday 16 May 2016: Pre-Conference Workshops
Full Day Workshops
09:30-17:15
MDM & RDM "Quick Start" Tutorial

Here’s an excellent opportunity to improve your success as an enterprise/data/solutions architect or other IT professional embarking upon your first MDM or Data Governance initiative.  During this fast-paced workshop, you’ll learn firsthand the best practice insights every IT professional must know to fast-track success and minimize risk. This is your pre-conference opportunity to meet with the “Godfather of MDM” to ask the questions and set your own personalized agenda to maximize your conference experience.
The speaker’s reputation for cutting through the hype to deliver a no-nonsense view of what you need to know will provide insights into proven approaches to delivering business value along with the insiders’ view of strategic implications of these fast-evolving technologies.
Combining presentations and case studies, this power session’s proven agenda is practical, personal and uniquely tailored on-site to the needs of the participants.  The speakers will share real world insights from surveys and discussions with over 1,500 MDM programs to provide guidance concerning:

  • Initiating a successful MDM, RDM and/or MDG program
  • Convincing the business to take a leadership role with the goal of delivering measurable ROI
  • Choosing the right MDM, RDM and/or MDG solutions despite a rapidly churning market — multi-domain MDM, reference data management, hierarchy management, identity resolution, big data, social MDM, semantic databases and more

Aaron Zornes, Chief Research Officer, The MDM Institute

Aaron Zornes is the founder and chief research officer of the MDM Institute (formerly the CDI-MDM Institute). Mr. Zornes is a noted speaker and author on Global 2000 enterprise IT issues and is the most quoted industry analyst on the topics of master data management (MDM), customer data integration (CDI), and data governance. Mr. Zornes is also the editor and lead contributor to DM Review’s CDI and MDM Newsletters as well as the monthly columnist for both CDI and MDM. Prior to founding the MDM Institute, he was founder and executive VP of META Group’s largest research advisory practice for 15 years. Mr. Zornes also held line and strategic management roles in leading vendor and user organizations, including executive and managerial positions at Ingres Corp., Wang, Software AG of North America, and Cincom Systems. Mr. Zornes received his M.S. in Management Information Systems from the University of Arizona.

MDM - a Best Practice Guide to Design and Implementation

This workshop focuses on the end-to-end implementation of master data management and tackles the hardest problems that arise in an MDM project. It looks at the broader picture of information governance, data quality and metadata management before applying these to an MDM project. It also addresses design issues such as the inbound integration of master data to consolidate it when it is scattered across many different data sources, and the outbound synchronization of master data to supply both operational and analytical systems. It also looks at master data virtualization for the hybrid state where some master data is consolidated and some is not. In particular, it looks at what needs to be considered when dealing with data integration and data synchronization to achieve best practice in design and implementation. The session covers the following:

  • An introduction to data governance
  • Introducing a shared business vocabulary
  • Metadata management
  • Enterprise data quality and data integration
  • The main approaches to implementing MDM
  • What kind of MDM system are you building? – a System of Record, a Centralised Master Data Entry System, or both
  • Understanding master data maintenance in your enterprise
  • Best practices in designing master data consolidation
    • Data capture techniques
    • The benefits of standardizing inbound data to an MDM system
    • Should history be kept in an MDM system?
    • Approaches to cleansing and matching
    • Consolidating vs. virtualizing master data to create an MDM system
    • Enriching master data using Big Data Analytics
    • Matching at scale – leveraging Hadoop and HBase for scalable master data matching
  • Best practices in designing outbound master data synchronization
    • Integrating an MDM system with an enterprise service bus for outbound synchronization of operational systems
    • Schema and integrity synchronisation problems that can occur and what to do about them
    • Conflict resolution on outbound synchronization
    • Design considerations when integrating MDM with ETL tools for synchronizing data warehouses and data marts
  • Maximising the use of data virtualization in MDM
  • The implications of switching to centralized master data entry
  • The change management program imposed by centralized master data entry
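To give a flavour of the cleansing-and-matching topics listed above, here is a minimal, illustrative sketch of fuzzy name matching in standard-library Python. The normalisation rules and the 0.85 threshold are invented for the example, not taken from the workshop:

```python
# Minimal master-data matching sketch: normalise, then fuzzy-compare.
# The normalisation rules and the 0.85 threshold are illustrative only.

import difflib
import re

def normalise(name: str) -> str:
    """Lower-case, strip punctuation and common company suffixes."""
    name = re.sub(r"[^\w\s]", "", name.lower())
    name = re.sub(r"\b(ltd|limited|inc|plc|gmbh)\b", "", name)
    return " ".join(name.split())

def is_match(a: str, b: str, threshold: float = 0.85) -> bool:
    """True if two source records likely describe the same master entity."""
    return difflib.SequenceMatcher(None, normalise(a), normalise(b)).ratio() >= threshold

print(is_match("ACME Ltd.", "Acme Limited"))   # likely the same company
print(is_match("ACME Ltd.", "Zenith PLC"))     # likely a different company
```

Production matching adds blocking, survivorship rules and scale-out (the Hadoop/HBase topic above), but the normalise-then-score shape is the same.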

Mike Ferguson, Managing Partner, Intelligent Business Strategies

Mike Ferguson is Managing Director of Intelligent Business Strategies Limited. An analyst and consultant, he specialises in business intelligence, data management and enterprise business integration. With over 34 years of experience, Mike has consulted for dozens of companies on business intelligence/corporate performance management strategy and technology selection, big data, enterprise architecture, business integration, MDM and data integration. He has spoken at events all over the world and written numerous articles. Formerly he was a principal and co-founder of Codd and Date Europe Limited (the company of the inventors of the Relational Model), a Chief Architect at Teradata working on the Teradata DBMS, and European Managing Director of Database Associates. He teaches master classes in Big Data Analytics, New Technologies for Data Warehousing and BI, Mobile and Collaborative BI, Operational BI, Enterprise Data Governance, Master Data Management, Data Integration and Enterprise Architecture. Follow Mike on Twitter @mikeferguson1.

Preparation for the Certified Data Management Professional (CDMP) Examinations (plus three 90-minute exam sessions)

This workshop covers an overview of the process, tips and techniques of successful CDMP exam taking. In this interactive and informative session, you will learn:

  • What the CDMP certification process is
  • How the DAMA-DMBOK and the CDMP exams align
  • What topics comprise each exam’s body of knowledge
  • Concepts and terms used in the CDMP exams
  • Self-assessment of your knowledge and skills through taking the sample exams

VERY IMPORTANT: You will need to bring your own computer which can connect to the internet. The exam is taken online, and you will need to register a minimum of 2 hours before the exam at www.dama.org. Your test (and live) exam results and performance profile can be viewed immediately.

Attendees of the half-day workshop will also receive some refresher tuition covering several of the most common topics seen in recent examinations. Note, however, that this is not a substitute for past experience and education. In the afternoon there will be three 90-minute exam sessions. The schedule for the day will be as follows:

09:30-12:45: Workshop Preparation for the Certified Data Management Professional (CDMP) Exams
13:30-15:00: Exam 1
15:30-17:00: Exam 2
17:30-19:00: Exam 3

Workshop attendees will take the certification exams on a “pay if you pass” basis (passing is 60% for Associate and 70% for Practitioner). If you take and pass all three certification exams, you will leave MDM/DG 2016 with a CDMP credential.

EXAM

  • Three 90-minute examination sessions in the afternoon; each exam has 110 multiple-choice questions
  • Your score is known immediately after the exam is taken
  • Exam fees for MDM/DG attendees: a fee is payable for each CDMP exam, with a ‘pay only if you pass’ agreement for attendees of this workshop
  • Passing at Associate level requires 60% or higher in one exam; Practitioner level is attained by passing three exams at 70% or greater

Chris Bradley, Information Strategist, Data Management Advisors Ltd

Christopher Bradley has spent 35 years in the forefront of the Information Management field, working for leading organisations in Information Management Strategy, Data Governance, Data Quality, Information Assurance, Master Data Management, Metadata Management, Data Warehouse and Business Intelligence.   Chris is an independent Information Strategist & recognised thought leader.  Recently he has delivered a comprehensive appraisal of Information Management practices at an Oil & Gas super major, Data Governance strategy for a Global Pharma, and Information Management training for Finance & Utilities companies.  Chris guides Global organizations on Information Strategy, Data Governance, Information Management best practice and how organisations can genuinely manage Information as a critical corporate asset.  Frequently he is engaged to evangelise the Information Management and Data Governance message to Executive management, introduce data governance and new business processes for Information Management and to deliver training and mentoring.  Chris is Director of the E&P standards committee “DMBoard”, an officer of DAMA International, an author of the DMBoK 2.0, a member of the Meta Data Professionals Organisation (MPO) and a holder at “master” level and examiner for the DAMA CDMP professional certification. Chris is an acknowledged thought leader in Data Governance, author of several papers and books, and an expert judge on the annual Data Governance best practice awards. Follow Christopher on Twitter @inforacer.

Morning Workshops
09:30-12:45
Implementing MDM with a Graph Database

Graph databases directly tackle one of the biggest technical challenges in building MDM systems: the inherent complexity of master data.

Master data – whether it’s the organization master or a product master involving complex hierarchies and relationships – invariably takes the form of a graph or network, and is best modelled, stored and queried using a native graph technology. Graph databases reduce complexity, increase agility and greatly improve the speed and efficiency of a master data application, while reducing the hardware footprint.

Join this session to learn why adopting a graph database is essential for handling complex MDM systems, and how market-leading organizations like Cisco and Pitney Bowes are turning to graph technology to build their MDM solutions.  Topics to be discussed include:

  • Understanding how a graph database complements MDM – from personalized product and service recommendations to websites adding social capabilities
  • Reimagining master data, identity and access models via graph technology
  • Adopting graph databases as the best way to model, store and query data and relationships
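The point that master data naturally forms a graph can be illustrated with a toy sketch. This uses plain Python rather than a graph database such as Neo4j, and the entities and relationship types are invented for illustration:

```python
# Toy master-data graph: nodes are master entities, edges are typed
# relationships. The entities and hierarchy are invented; a graph
# database such as Neo4j would store and query this kind of structure natively.

from collections import defaultdict

edges = defaultdict(list)  # node -> [(relationship, node), ...]

def relate(src, rel, dst):
    edges[src].append((rel, dst))

# A small organisation/product master with hierarchy:
relate("ACME Ltd", "SUBSIDIARY_OF", "ACME Group")
relate("ACME Ltd", "SELLS", "Widget Pro")
relate("Widget Pro", "VARIANT_OF", "Widget")
relate("Widget", "IN_CATEGORY", "Hardware")

def reachable(start, rel_filter=None):
    """All nodes reachable from `start`, optionally via one relationship type."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        for rel, dst in edges[node]:
            if dst not in seen and (rel_filter is None or rel == rel_filter):
                seen.add(dst)
                stack.append(dst)
    return seen

# Everything connected to ACME Ltd, however indirectly:
print(reachable("ACME Ltd"))
```

Traversals like `reachable` are exactly the multi-hop relationship queries that relational schemas make expensive and graph databases make cheap.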

Ian Robinson, Engineer, Neo Technology

Ian Robinson is an engineer for Neo Technology, the company behind the Neo4j graph database. He has worked extensively with customers to design and develop mission-critical graph database solutions. He is a co-author of “Graph Databases: New Opportunities for Connected Data” and “REST in Practice” (both from O’Reilly) and a contributor to “REST: From Research to Practice” (Springer) and “Service Design Patterns” (Addison-Wesley). Ian presents at conferences worldwide on using the Web as an application platform, and the graph capabilities of Neo4j.

Successful Reference Data Governance and Management

Reference data – often simply known as codes, lookups, or domains – is an area of enterprise data management that is becoming increasingly important. However, many enterprises have difficulty formulating governance programmes and management practices for reference data. This workshop explains the overall structure needed for both reference data governance and reference data management. The very different roles needed to manage external reference data (sourced from outside the enterprise) and internal reference data (produced wholly within the enterprise) are described. The options for environments for producing and distributing reference data are compared and contrasted. The significant role of semantics in reference data is also examined in detail, together with practical ways in which knowledge of reference data can be successfully managed. Additionally, the special aspects of quality in reference data are described. Attendees will learn:

  • What reference data is, how it differs from other classes of data in its governance and management needs
  • The structures needed for successful reference data governance and management
  • How the semantic needs of reference data can be addressed
  • How to deal with data quality in reference data content
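To make the external/internal distinction above concrete, here is a small, invented illustration of reference data as code tables with a validation rule. The codes and the rule are hypothetical examples, not workshop material:

```python
# Illustrative reference data as code tables. Codes are invented examples.

COUNTRY_CODES = {  # external reference data, e.g. sourced from a standards body
    "GB": "United Kingdom",
    "DE": "Germany",
    "FR": "France",
}

ORDER_STATUS = {  # internal reference data, produced wholly within the enterprise
    "N": "New",
    "A": "Approved",
    "S": "Shipped",
}

def validate_code(table: dict, code: str) -> str:
    """Data-quality check: reject codes missing from the reference table."""
    if code not in table:
        raise ValueError(f"Unknown reference code: {code!r}")
    return table[code]

print(validate_code(COUNTRY_CODES, "GB"))
```

Governance then comes down to who may change each table, how changes are distributed, and what each code means (the semantics topic above).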

Malcolm Chisholm, Chief Innovation Officer, First San Francisco Partners

Malcolm Chisholm has over 25 years’ experience in data management, and has worked in a variety of sectors, with a concentration on finance.   He is an independent consultant specializing in data governance, master/reference data management, metadata engineering, and the organization of Enterprise Information Management.  Malcolm has authored the books: “Managing Reference Data in Enterprise Databases”; “How to Build a Business Rules Engine”; and “Definitions in Information Management”.  He was awarded the DAMA International Professional Achievement Award for contributions to Master Data Management.  He holds an M.A. from the University of Oxford and a Ph.D. from the University of Bristol.

Practical Data Governance: Getting Started

Data Governance Programs often seem to start with failure as a foregone conclusion. Horror stories from other organisations and data professionals permeate the industry.

Having been a big part of the implementation and maintenance of a successful DG program at a large Southern African telco, Sue will focus on the practicalities: how to turn the theory into practice and how to make it stick. Drawing on a number of years’ experience in practical Data Governance, Sue will take various themes and issues raised during her work and get attendees working on the practical pieces that are needed to start your Data Governance journey. If you are a beginner in DG and are looking for a framework or some practical hand-holding, this is the ideal workshop. If you are already working on DG but feel like you are wallowing, then this workshop will help you focus on what you need to do next. You will walk away with a much better understanding of what you are going to be doing, together with various documents that you have helped craft in this workshop. Each part of the agenda has one or more exercise sessions.

  • Starting Up
  • Identifying Stuff To Do
  • The Fun Part
  • What went wrong and how did we fix it?
  • The practical know-how of implementing DG

Sue Geuens, Head: Data Standards & Best Practice Adoption, Barclays & President DAMA International

Sue Geuens started in Data Management during 1996 when she was handed a disk with a list of builders on it and told they were hers to manage. Sue mentions this as fate taking over and providing her with what she was “meant to do”. Various data roles later, her clients numbered 3 of the top 4 banking institutions in SA, a number of telcos and various pension funds, insurance companies and health organisations. Sue was the initial designer of data quality matching algorithms for an SA-built Data Quality and Matching tool (Plasma Mind). This experience has stood her in good stead as she has slowly but surely climbed the ladder in Southern Africa to become the first CDMP in the country. Sue worked tirelessly on starting up DAMA SA with the successful Inaugural meeting in February of 2009. She was unanimously voted as President just prior to this event. From that time on Sue has been the leader and driving force of the DAMA SA Board. Sue has been the DAMA International VP Operations since 2011 and is the DAMA International President for 2014. She is a sought-after presenter and has been a prominent speaker at EDW in 2009, 2010, 2011 and 2013, DAMA Brazil, DAMA Australia in 2013 and IRMUK in 2006 and 2011. Follow Sue on Twitter @suegeuens.

From Vision to Capabilities: Enabling Information Governance

Keeping control over the design and architecture activities performed within your organisation without creating a top-heavy governance organisation is quite a challenge. Justifying why certain choices have been made, and coming up with the proper reasons to invest in a particular solution, can be quite tricky. The latter is often done in an ad-hoc and unstructured way.
One of the current hypes is the concept of capabilities, which can be loosely defined as “things that you should be able to do”. The idea behind the capability approach makes sense, as it starts from the assumption that when you describe what you want to do and why, you end up with the right mix to reach your objectives. This brings us to the more fundamental question of the underlying drivers.
The workshop will take you through the different steps of translating a vision into the policies, principles and standards that guide your reference architectures and blueprints. We will cover the framework that connects all the different elements into a practical toolkit that underpins your reference architecture and transforms standalone statements such as “information is an asset” into usable steering of your designs and architecture.

  • Understanding the policy framework concept
  • Linking vision to actions
  • Outcome driven capabilities
  • Reference architectures for common use cases: MDM, Analytics, BI, Data Quality

Jan Henderyckx, Managing Partner, Inpuls     

Jan Henderyckx is a highly rated consultant, speaker and author who has been active in the field of Information Management and Relational Database Management since 1986. He has presented, moderated and taught workshops at many international conferences and user group meetings worldwide. Jan’s experience, combined with information architecture and management expertise, has enabled him to help many organisations optimise the business value of their information assets. He has a vision for innovation, the ability to translate visions into strategy, and a verifiable track record in diverse industries including non-profit, retail, financial, sales, energy and public entities. He has contributed to more streamlined and higher-yielding operations for some of the leading businesses through a combination of creativity, technical skills, initiative and strong leadership. He is a Director of the Belgium and Luxembourg chapter of DAMA (Data Management Association) and runs the Belgian Information Governance Council. He has published articles in many leading industry journals, and has been elected to the IDUG Speakers Hall of Fame based upon numerous Best Speaker awards. Jan is Chair of the Presidents Council of DAMA International.

Building the Business Case for Integrated Data Governance Programs

Data is now recognised as a crucial business asset and many companies have taken the first steps toward initiating programmes that manage data assets. Of course, this requires a well-structured and coherent Data Governance programme that successfully controls ‘People, Process, Technology & Data’ as key enablers for benefit delivery. The question often asked by C-level management is ‘Is there a tangible business case for Data Governance?’
This Masterclass takes you through building a Data Governance framework and ways in which to build a business case with tangible benefits. The session will be interactive in places and is presented in a lively style that will enable attendees to propose a format that will resonate with all levels of an organisation. Anwar will also present a unique view on how to measure the real impact data has on company revenues and costs.
For those embarking upon a Data Governance programme (including MDM), this session will help accelerate progress with an agenda covering the most difficult obstacles. With many practical tips and tricks, this session will give valuable insights into real life business and technical challenges.
This Masterclass is an interactive session which helps attendees:

  • Identify which data needs managing
  • Compile the key components of a Data Governance program
  • Complete the pre-requisites for a business case
  • Quantify the benefits of improved data quality
  • Integrate a Data Governance program with core business activity

Anwar Mirza, International Data Governance Specialist

Anwar Mirza has been a keynote presenter at IRM MDM DG conferences on two occasions.  His senior roles in Financial Management have allowed him to specialise in the subject of Information Management and Governance both within TNT and at other companies. Anwar has developed his own Data Governance framework and implementation methodology which is supported by a unique business case approach that tangibly measures the impact of data quality from a top line revenue and a bottom line perspective. He has presented these views at numerous seminars and congresses globally. Additionally, he has published articles for, amongst others, the legal, financial, IT and professional services industries. He is an advisor to many multi-national companies and consulting firms in USA, Australia, Asia and Europe. With over 25 years of service at TNT, Anwar has a proven track record in managing large teams, controlling core business processes, implementing applications and technology with tangible business benefits. Anwar is now in the process of engaging universities to embed his unique techniques into mainstream education.

Afternoon Workshops
14:00 - 17:15
Implementing Data Governance - Learning by Best Practices

There is not only one way to implement data governance. Taking a deeper look at the approaches companies have chosen, there are multiple options for enabling high data quality through decent governance structures. Options vary from purely local optimisation of data lifecycle processes to global shared-service structures, both being applied with great success. Complexity is added by the need for different patterns when handling different master data objects.
The workshop will give insights into data governance patterns which have been implemented, with the support of the author, at different companies. Through detailed best practices, participants can reflect on their own situation and develop their own solution scenarios. Furthermore, the author will give insights into lessons learned after having realised structural changes in several companies. The focus of the workshop is:

  • Design options for data governance: from local optimisation to outsourcing
  • Best practices, approaches by leading companies for implementing data governance
  • Lessons learned: from data governance concept to business transformation.

Andreas Reichert, Partner, CDQ AG

Andreas Reichert is a Partner at CDQ AG. He has been working on governance topics for Master Data Management for more than 11 years. His work focuses on supporting enterprises in designing master data strategies, setting up governance structures and transferring project results into business operations.

How to Go Beyond MDM with Modern Big Data Management

We are in the era of Big Data wherein technologies can now support a wide variety of data at seemingly infinite data volumes at real-time velocity.  Yet MDM tools and technologies remain relatively unchanged in the 10 years since companies began deploying such solutions.  Some might say that MDM itself has turned into the very silo it was designed to circumvent.  Granted, certain solution providers now offer MDM in the cloud to enable smaller companies to benefit from MDM at a much lower ongoing cost, but for most enterprises that isn’t enough to meet increasing business demands.
Today’s end-users want access to a complete view – not just of customers or products – but rather a blended view of all master data entities plus transaction, interaction and social data. And they want their information delivered in the form of LinkedIn/Facebook style data-driven applications.  They also want faster time-to-value and expect a new breed of enterprise data-driven applications that include reliable data, relevant insights and recommended actions.
In this workshop, one of the pioneers of modern data management which combines MDM, Big Data, Analytics and Machine Learning,  will share best practices, case studies and technology considerations by discussing these topics and more:

  • Leveraging enterprise multi-channel data to enable ‘inside-out’ client view via MDM
  • Understanding the business value of Big Data, NoSQL vs. RDBMS vs. Data Warehouse, Hadoop (HDFS & MapReduce)
  • Establishing the business case for MDM, Big Data & real-time data-driven applications (a case study)

Ramon Chen, Chief Marketing Officer, Reltio

Ramon Chen is the Chief Marketing Officer at Reltio, a provider of a modern data management Platform as a Service (PaaS) and enterprise data-driven applications.  Mr. Chen is a widely read blogger and guest author on the topics of MDM, Big Data and Cloud Computing, and is also a frequent industry speaker and panelist.  A former software engineer, he has spent over 25 years developing and marketing Enterprise, Cloud and Big Data software.  Most recently, prior to Reltio, Mr. Chen was VP Product Marketing at Veeva Systems, and before that he was part of the core team at Siperian, a market-leading MDM platform acquired by Informatica.  He received his Bachelors in Computer Science from the University of Essex.

Making Enterprise Data Quality a Reality

Many organisations are recognising that tackling chronic data quality (DQ) problems requires more than a series of tactical, one-off improvement projects. By their nature, many DQ issues extend across, and often beyond, an organisation. So the only way to address them is through an enterprise-wide programme of data governance and DQ improvement activities embracing people, process and technology. This requires very different skills and approaches from those needed on many traditional DQ projects.
If you attend this workshop you will leave more ready and able to make the case for, and deliver, enterprise-wide data governance and DQ across your organisation. This highly interactive workshop will also give you the opportunity to tackle the problems of a fictional (but highly realistic) company experiencing end-to-end data quality and data governance challenges. Attending this workshop will enable you to practise some of the techniques taught in a safe, fun environment before trying them out for real in your own organisation.
The workshop will draw on the extensive personal knowledge & experience of Global Data Strategy’s Nigel Turner who has helped to initiate & implement enterprise DQ and data governance in major companies including BT Group, British Gas, Intel and many other organisations.   The approaches outlined in this session really do work.
The workshop will cover:

  • What differentiates enterprise DQ from traditional project based DQ approaches
  • How to take the first steps in enterprise DQ
  • Applying a practical DQ & data governance framework
  • Making the case for investment in DQ and data governance
  • How to deliver the benefits – people, process & technology
  • Real life case studies – key do’s and don’ts
  • Practice case study – getting enterprise DQ off the ground in a hotel chain
  • Key lessons learned and pointers for success
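As a small, invented illustration of the kind of measurement an enterprise DQ programme typically starts from, a completeness check (one common data-quality dimension) might look like this; the records and the rule are hypothetical:

```python
# Tiny data-profiling sketch: measure field completeness across records.
# The records and the completeness rule are invented for illustration.

records = [
    {"id": 1, "email": "a@example.com", "postcode": "W1H 7BG"},
    {"id": 2, "email": "", "postcode": "W1H 7BG"},
    {"id": 3, "email": "a@example.com", "postcode": None},
]

def completeness(rows, field):
    """Share of rows where `field` is present and non-empty."""
    filled = sum(1 for r in rows if r.get(field))
    return filled / len(rows)

print(completeness(records, "email"))     # 2 of 3 rows populated
print(completeness(records, "postcode"))  # 2 of 3 rows populated
```

Metrics like this give the baseline numbers that the business-case and benefits-tracking steps above depend on.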

Nigel Turner, Principal Information Management Consultant EMEA, Global Data Strategy

Nigel Turner is Principal Information Management Consultant EMEA at Global Data Strategy.  He specialises in information strategy, data governance, data quality & master data management. During his consultancy career he has worked with over 150 clients, including British Gas, AIMIA/Nectar, HSBC, EDF Energy, Telefonica O2, the Chartered Institute for Personnel and Development (CIPD) and Intel US.  With more than 20 years’ experience in the Information Management industry, Nigel started his career working to improve data quality, data governance & CRM within British Telecommunications (BT), and has since used this experience to help many other organisations do the same.  Whilst at BT he also ran a successful Information Management and CRM practice of 200+ people providing consultancy and solutions to many of BT’s corporate customers.   He is also an elected member of the UK’s Data Management Association (DAMA) management committee.  In 2015 he was given DAMA International’s Community Award for setting up a mentoring scheme for data management professionals in the UK.  In 2007 fellow data professionals voted him runner up in Data Strategy magazine’s UK Data Professional of the Year awards.  Nigel is a well-known thought leader in data management and has published several white papers & articles and is a regular invited speaker at Information Management & CRM events.

Communicating your Data Governance Message

Getting your data governance messages right and delivering them well is critical to the success of your data governance initiative.  Join Nicola Askham, The Data Governance Coach, at this highly interactive session as she shares advice, tips and best practice gained from over 13 years’ experience to:

  • Learn how to structure your data governance communications for success
  • Understand what to communicate to whom
  • Start to construct a data governance communications plan and look at templates/formats for some different types of messages that you will need to successfully deliver.

Nicola Askham, The Data Governance Coach

Nicola Askham is the leading data governance coach and training provider in the UK. She supports companies with implementing their data governance initiative, so they can sustain it on an on-going basis. She holds a unique level of experience in the Data Governance field, and has experience in training and coaching major organisations to help them implement full data governance frameworks.
Nicola has developed a powerful methodology for implementing data governance based on over 13 years of experience. Her methodology breaks down the data governance initiative into logical steps, which ensures that businesses design and implement a data governance framework that is right for the organisation and which delivers significant and sustainable benefits. Nicola is a Committee Member of DAMA UK, sits on the Expert Panel of Dataqualitypro.com, and regularly writes and presents internationally on data governance best practice. Follow Nicola @Nicola_Askham

Aligning Your Organisation's Big Data Strategy With Your Business Strategy

Big Data is the buzzword of the moment.  There is much discussion about what it is and why it differs from other IT-enabled projects.  Big Data is typically framed in terms of the 7Vs.  The missing elements from this framework are strategy and benefits.
According to Google’s Chairman, as much data is now produced every few days as was produced over previous millennia.  Organisations are being encouraged to invest significantly in ways of gathering, analysing and presenting the data that is now available.  There are several examples of companies that have benefitted from big data; yet many have drowned in the sea of data.
This workshop will provide participants with tools and techniques that ensure investments in big data programs support the strategic direction of the business.  It will highlight some of the pitfalls that organisations face and ways of overcoming them.
Delegates will learn:

  • Linking big data to strategy
  • Pitfalls to avoid in big data programs
  • Dealing with the reality of big data

Ashley Braganza, Professor of Organizational Transformation, Brunel Business School

Ashley Braganza

Professor of Organizational Transformation, Brunel Business School

Ashley Braganza is Professor of Organizational Transformation at Brunel University, London.  He is Head of Department of Economics and Finance.  Prior to becoming Head, Ashley held a number of senior positions in Brunel Business School, UK, which recently won the coveted Times Higher Education’s Business School of the Year Award.  His research and consultancy expertise covers the development and implementation of change management.  He has direct experience in creating and implementing strategic processes and enterprise-wide architectures.  Some examples include his work with organizations such as DFiD (UK), Astra Zeneca, The National Audit Office, McDonalds, British Telecom, ABN Amro Bank, Brunel Business School and IFAD (Rome).

Tuesday 17 May 2016 : Conference Day 1 & Exhibits
08:00 - 09:00
Registration
09:00 - 09:10
MDM Summit and DG Conference Opening: Aaron Zornes, The MDM Institute & Jan Henderyckx, Inpuls
09:10 - 10:00
PLENARY KEYNOTE: Disruption is a Level Playing Field: The Key is Turning Innovation into Advantage

Disruption is often described as a factor external to an organisation that favours startup companies and puts tremendous pressure on existing markets. In reality, the essence of disruption is the ability of innovations to create a non-linear relationship between investment and outcome. The Hilton hotel chain took almost a century to expand its portfolio to 600K rooms; Airbnb reached the same level in less than 4 years. Is this a unique characteristic of Airbnb? No, it simply reflects the capabilities of information in networked, shared, digital platforms and analytics with almost ubiquitous connectivity. These capabilities are available to anyone able to embrace the technological possibilities and to adapt their business strategy and operating model so that innovations provide a competitive advantage.
Established companies in many cases have more data points available than smaller startups. Yet the newcomers are often more successful in growing their business and capturing the market. There are many organisational and company-culture elements in this equation, but the key information management element is that established companies fail to turn the data they hold into an advantage. This is, in essence, the data trap: you have the data but fail to turn it into information that leads to actionable insights.

  • The dynamics of disruption: what the data person needs to know
  • Setting up a governance model that embraces innovation and creates value
  • Defining the capabilities that will underpin your information strategy

10:05 - 10:50
MDM KEYNOTE: Mind the Gap - Bridging from "System of Record" to "System of Engagement" for Master Relationship Management

Clearly, the “solid but boring” aspect of master data management (MDM) remains a key challenge for most enterprises.  While MDM purports to span the entire master data lifecycle, new dimensions such as Big Data, mobile, social, cloud and real-time are exerting tidal forces on the classic notion of MDM.  Moreover, IT leadership struggles when selecting MDM software because the solutions are diverse, with no single vendor able to meet all requirements and use cases. Given the resulting prevalence of multiple MDM brands and architectures, two relative newcomers (data governance and graph databases) propose to unify these silo’ed worlds to overcome both organizational and technical issues as well as market dogma.
The mega vendor-centric offerings thwart the notion of heterogeneous data and process integration, and often lack pro-active governance capabilities for end-to-end data lifecycle management. Concurrently, best-of-breed and niche vendors look to exploit this vacuum (cross-mega vendor governance and relationship management) yet are stymied by lack of resources and market traction.  All vendors need to better focus on next-generation MDM requirements as we move from “system of record” to add “system of reference” and (ultimately) move into “system of engagement” wherein relationship-driven analytics form the foundation of MDM-innate, data-driven and context-driven applications to fully enable the digital enterprise.
Mismatches in reference data (also called “enterprise dimensions”) affect the integrity of business intelligence reports and are also a common source of application integration failure. Due to the strategic nature of reference data and the difficulty of building and maintaining custom reference data management (RDM) capabilities, savvy IT organizations and Finance departments are increasingly opting to buy rather than build RDM solutions.
This MDM research analyst keynote will review strategic planning assumptions such as:

  • Determining what your organisation should focus on in 2016-17 to initiate “master relationship management” via data governance & graph database
  • Planning to leverage Big Data & RDM as part of an enterprise MDM program
  • Understanding where MDM, RDM and Master Data Governance are headed in the next 3-5 years

DG KEYNOTE: Moving Towards Data Governance 2.0

Clearly, current Data Governance processes and tools are inefficient.  A 2015 Forrester survey shows that existing tools lack efficiency in several areas of Data Governance including business alignment, measurement, data definitions, policies, and stewardship. The evolution of Data Governance processes to introduce more dynamic policy changes will only exacerbate this efficiency problem.  Moreover, market-leading enterprises are deploying “systems of insight” (vs. “systems of record”) that test what happens when they put insights into action and learn from it. Building data processes and rules to predict “what will work” and “what won’t” is a risky bet — so Data Governance programs will need to rely on oversight in light of business results to ensure that data aligns with business objectives and policies. This analyst keynote will present research findings from the Forrester Research report “Vendor Landscape: Data Governance Stewardship Applications” (4Q2015).  It will focus on understanding what the new principles and models shaping Data Governance 2.0 are by discussing these vital topics:

  • Understanding the shift from “predefined” to “feedback-based” DG rules & policies
  • Leveraging the expansion of DG into new domains – e.g., complex 3D data representations & advanced analytics algorithms
  • Managing the convergence of rules across DG domains – e.g., MDM rules, DQ rules & privacy policies

Henry Peyret, Principal Analyst, Forrester Research

Henry Peyret

Principal Analyst, Forrester Research

Henry Peyret serves Enterprise Architecture (EA) Professionals. His research focuses on the concepts, techniques, and tools required to bring business agility within enterprises. He publishes research on key agility indicators and agile governance and applies these principles to change EA practices to help build a sustainable business agility. Business agility depends highly on the level of enterprise risk that decision-makers can take. He is also researching metadata management as a means to bring agility at the right cost for IT supporting business agility. His research addresses numerous changes to EA practices to support the digital business, the changes toward data governance 2.0, and the integration tooling landscape, particularly in the cloud.  Henry also researches values-driven customers, another potential next step for the age of the customer and how it will interact with companies’ privacy concerns.

10:50 - 11:20
Break & Exhibits
MDM Track 1
MDM Track 2
DG Track 1
DG Track 2
DG Track 3
11:20 - 12:05
Governing the Data Lifecycle

Dun & Bradstreet is a global company delivering indispensable content through modern channels to serve customer needs. Dun & Bradstreet’s Data Governance Office provides oversight and governance over the entire enterprise data asset, managing an inventory of data on over 220 million global businesses. This session will cover how to govern data across its lifecycle: Discovery & Profiling > Data Acquisition > Data Maintenance > Data Usage > Archival and Purge. It will address data policies, establishing accountabilities for data assets, business rules governance, data compliance concerns and best practices. The session will focus on how to achieve a global perspective and act from the outside in using modern tools and methodologies.  It will also address how enterprise data governance responsibilities exist at all levels of the organization and how to ensure effective operationalization of the data governance objectives across the enterprise’s data lifecycle.

  • Achieving effective governance across the complete data lifecycle
  • Using data confidently in a compliant manner
  • Understanding how a small Data Governance Office can govern a large enterprise

Sharon Lankester, Enterprise Data Governance Leader, Dun & Bradstreet

Sharon Lankester

Enterprise Data Governance Leader, Dun & Bradstreet

Sharon Lankester is a Data Governance Leader at Dun & Bradstreet, a leading global provider of business insight and analytics. Her responsibilities include managing the Data Governance Office, providing oversight to a robust data governance program that helps to streamline business processes and data architecture, sustain data quality, and maintain a metadata program across the organization.

Managing 2 Versions of the Truth to Enable Public Sector MDM

The City of Genk performed a study in 2015 to investigate the reasons behind its scattered landscape for maintaining third-party data (citizens, companies & associations).
One of the main reasons for the scattered landscape and data maintenance was the need for the City of Genk to deal with 2 versions of the truth.
The use of authentic third-party data (controlled mainly by public bodies, but sometimes lacking data quality, specific fields or metadata/structures) alongside third-party data obtained directly from the third parties themselves (and therefore more accurate and trustworthy) created a duality that led to bypass solutions.
Managing these two versions of the truth to enable the mandatory use of authentic data in the public sector forces a rethink of how third-party data is managed.
Capturing and propagating these two versions of the truth, and continuously dealing with their delta, requires rethinking the MDM solution.
This case will show how the City of Genk created a model to deal with this – a model that could serve as a reference for other cities and organisations working with authentic data.
The presentation will cover the following topics:

  • The reasons for executing the “third-party management study” and how the City of Genk built up a scattered landscape over time
  • How the need for managing 2 versions of the truth arose, creating data accuracy, third-party solution and data maintenance/process issues
  • How MDM can be an enabler for managing these 2 versions of the truth for any company or organisation working with authentic sources

Jonny Geussens, CIO, City of Genk

Jonny Geussens

CIO, City of Genk

Jonny Geussens has more than 18 years’ experience in IT business development, IT consultancy and IT project management. He started as an academic business researcher into virtual and collaborative working environments at the University of Limburg, Belgium, from which several spin-off activities in the field of e-business were launched at the end of the nineties. At the beginning of 2002, he took up product management and business development for a software suite to manage computer terminals in public environments. From 2004 until 2011 he implemented dedicated ERP software for the building industry as a senior consultant. Since 2011, he has been the head of IT for the City of Genk, Belgium, managing the IT budget and a group of 13 IT administrators covering network/switching, (virtual) server infrastructure and local government applications. One of the projects started at the beginning of 2015 is the implementation of a unified MDM solution for the local administration of Genk.

Christoph Balduck, MDM Practice Lead & Senior Information Manager, Inpuls

Christoph Balduck

MDM Practice Lead & Senior Information Manager, Inpuls

Christoph Balduck has been working with MDM since 2007 and is a senior practitioner experienced in MDM, Information Governance, Data Quality, Information Strategy and Information Architecture.  Over time Mr. Balduck has worked in several roles, ranging from solution architect, program and project manager and business analyst to line manager and management consultant.  He has been active in IT for over 14 years.  Mr. Balduck has evolved from a general SAP background into the specific areas of CRM and MDM over the last several years.  He worked as Head Information Architect and Enterprise Architect at Volvo Group and moved to Inpuls in 2014, where he uses his knowledge and experience to help companies around the world implement and improve their Information Management (incl. data privacy & data protection) capabilities. Follow Christoph @balduck_c

Successful Enterprise Data Hub Design Patterns

“Information is an Asset” is a phrase on the lips of every CIO, but getting the CFO to invest in it is hard work, even with clear ROI.
This session looks at how to use an “accumulation of marginal gains” approach to achieve success with an Enterprise Data Hub through a portfolio of marginal gains in:

  • Establishing Hadoop infrastructure in the data centre as a multi-tenant service platform
  • Providing a new home for ageing Batch applications (including MDM)
  • Data Discovery, Quality and Integrity (harnessing schema on read to find lost schemas)
  • Data Science (from MIS to Machine Learning)
  • Operational Data Stores and Warehouses (including Data Vault Methodology)
  • Securing your Data (exploiting Active Directory integration, Sentry, Kerberos & Encryption)

The content of this session is based on a set of design patterns built up over 5 years establishing BT’s Enterprise Data Hub.  It will also look at the foundations that underpin these, including establishing a clear architectural vision and communicating it to key stakeholder groups.

Phill Radley, Chief Data Architect, BT

Phill Radley

Chief Data Architect, BT

Phill Radley is a Physics graduate with an MBA who has worked in the IT and communications industry for 30 years, mostly with British Telecommunications plc. He is Chief Data Architect for BT at their Adastral Park campus in the UK.  Phill works in BT’s core Enterprise Architecture team with responsibility for Data Architecture across BT Group plc.  He currently leads BT’s MDM and Big Data initiatives, driving the associated strategic architecture and investment roadmaps for the business.
His previous roles in BT include 9 years as Chief Architect for Infrastructure Performance Management solutions, from UK consumer broadband through to outsourced Fortune 500 networks and high-performance trading networks. He has broad global experience, including BT’s Concert global venture in the USA and 5 years as Asia Pacific BSS/OSS Architect based in Sydney. Follow Phill @radleyp

Our Open Data Revolution

The Environment Agency is a public sector organisation, part of the Department for Environment, Food and Rural Affairs, whose new approach is to open up all its data for use without charge or restriction. Its data is automatically open unless there is a good reason not to share it. This session shows how the Agency’s approach to data governance has enabled it to respond to the challenge of releasing all of its data as open.  In this ‎session Lisa and Martin will:

  • Explain what being an open data organisation means for them
  • Show how their approach to data governance is supporting the transition
  • Share the lessons learnt from releasing all their data as open

Lisa Allen, National Data Integrity Manager, Environment Agency

Lisa Allen

National Data Integrity Manager, Environment Agency

Lisa Allen leads a team in implementing data governance and data quality, with responsibility for developing, implementing and measuring the EA’s data maturity model. She is responsible for delivering ODX, the Environment Agency’s project to release vast amounts of data as open.

Martin Whitworth, Deputy Director for Data, Mapping Modelling & Info, The Environment Agency

Martin Whitworth

Deputy Director for Data, Mapping Modelling & Info, The Environment Agency

Martin Whitworth is accountable for the Environment Agency’s approach to managing data, maps, models and information that underpin all of the EA’s activities.  He has held a number of technical and regulatory roles in the Environment Agency and on secondment to the European Commission.

Data Governance in Bite-Sized Chunks

Data Governance is too theoretical! What am I really going to see from a DG initiative?
Are there some quick realistic aspects a DG program should focus on first? This session will show:

  • A workable framework for Data Governance
  • The different Data Governance approaches, from process-centric to data-centric & more
  • How to produce a pragmatic business case for DG linked to business objectives
  • Building DG in bite-sized pieces
  • Introducing DG covertly – it is possible
  • The Data Governance office & its critical role in sustaining success

12:10 - 12:55
Using Agile RDM & MDM – Multi-Domain & Beyond

Mundipharma have implemented an MDM & RDM platform to meet the upcoming European Medicines Agency (EMA) Identification of Medicinal Products (IDMP) regulations, set to hit the Life Sciences industry in 2016 and following a three-year, phased implementation plan.  To meet the “somewhat vague requirements”, Mundipharma required an MDM solution that was “agile”, could deliver business benefit fast and could scale to meet “true multi-domain” ambitions, all without requiring a large number of people to administer and develop the system.
Beginning with Product data, and only three people, Mundipharma IT Services managed to join up its Regulatory Tracking and Submission System to four Manufacturing systems across Europe, each managed by a separate company within the Mundipharma group of companies, all using different systems.  In the first eight weeks, they were able to go live with the first view of Product, having used the Semarchy toolset to Profile & Analyse the data, Integrate with source systems, Validate, Cleanse, Match and finally Manage the Product Data.  Since the first release, Mundipharma have continued to deliver significant new functionality each quarter by bringing in additional sources, mastering new data entities or beginning to push cleansed data back to source systems.  In parallel, they’ve also achieved their goal of moving sets of Reference Data to the Semarchy platform, and introduced an Information Governance capability to support the new RDM & MDM platform.
Topics will include:

  • Using Agile RDM & MDM to deliver business benefit early via regular releases
  • Implementing MDM & Data Governance in a federated organisation of over sixty companies
  • Meeting the evolving needs of the European Medicines Agency IDMP Regulations

Cliff Gibson, Enterprise Architect, Mundipharma IT Services

Cliff Gibson

Enterprise Architect, Mundipharma IT Services

As the Enterprise Architect for Mundipharma IT Services, Cliff Gibson is responsible for aligning strategic platforms and information architecture and for identifying technology opportunities for its customers: the Independent Associated Companies (IACs) that form the Mundipharma group of companies.  With 60+ IACs and 7,000+ users across a global footprint, technologies like MDM & RDM are having a huge impact on Data Quality, Analytics and Operational Efficiency.  Mr. Gibson has championed the MDM Capability, introducing the concept to the organisation at the end of 2014, and led the procurement and implementation to their first release only nine months later.  Prior to Mundipharma, he held key technical roles at Napp Pharmaceutical, Miracle Information Services and OK International.

Data Quality Assurance and Quality Assessment at De Nederlandsche Bank

The Statistics division of De Nederlandsche Bank contributes to the mission of De Nederlandsche Bank by making available financial and economic data, information and analyses of the right quality, in the right form, at the right moment and in the right context.
This session will focus on:

  • What is data quality in the context of De Nederlandsche Bank?
  • Data quality assurance and assessment framework at De Nederlandsche Bank
  • Benefits of data quality assurance and assessment framework
  • Implementation challenges of quality assurance and assessment framework
  • Quality assurance at De Nederlandsche Bank
  • Quality assessment at De Nederlandsche Bank

Diederick Nevenzeel, Information Manager, De Nederlandsche Bank

Diederick Nevenzeel

Information Manager, De Nederlandsche Bank

Diederick Nevenzeel has been at De Nederlandsche Bank for 28 years. He worked first as a policy advisor in various business departments. Later he became responsible for the Information & Automation unit of the Supervision Division and was in charge of the Archive Department and the Expert Centre of Statistics systems. Recently he has worked as the Information Manager and Data Quality Officer in the Statistics domain.

How Data Governance Works with BI

Since Peter Weill and Jeanne Ross put IT governance on the agenda with their 2004 book “IT Governance: How Top Performers Manage IT Decision Rights for Superior Results”, the analytics community hasn’t paid much attention to their framework, let alone to the subtopic of data governance. The advent of Big Data and data lakes has only exacerbated this neglect.  The book “Business Analysis for Business Intelligence” (Brijs 2012) devotes only three pages to the topic. But experience of the last three years has clearly shown that data governance in analytics is the hidden part of the iceberg of analytics failures. With this presentation, Bert Brijs will catch up with reality, present some compelling arguments for duopolistic data governance and put BI governance into practice.  Topics covered:

  • Overview of the governance options in analytics, illustrated by real-life cases; the successful cases will be named, the unsuccessful presented as “anonymous”…
  • How the tandem BI and data governance works in practice
  • The roles involved
  • Linking data governance to analytics architecture
  • A practical case from customer analytics

Bert Brijs, Lingua Franca Consulting

Bert Brijs

Lingua Franca Consulting

Bert Brijs has been active in business analysis for business and customer relationship management since the eighties, when he worked for a mail-order company. This is where the idea grew that IT is a major strategic asset: “It dawned on me that IT was more important than brick and mortar when I was driving the tape with our customers’ addresses to the laser print shop: this was our capital, our source of income, and there was a direct relationship between our IT strategy and our success in the mail-order business.”  He has worked on over fifty business intelligence projects and coached dozens of teams between 1994 and today. In 2012 he released his book “Business Analysis for Business Intelligence”, in which he bundled his experience into practical, hands-on reference material. Follow Bert @bertbrijs

Journey to Data Governance using Tools and Change Management

This presentation will be a case study from Caterpillar outlining business case development, the use of metrics to drive improvement, and the tools used. Business rules relevant to Caterpillar’s environment will be illustrated, showing how they are used to provide visibility and drive action.

  • Learn simple processes and tools used in a SAP environment
  • Change management drivers for change and engagement
  • Tools that can be applied to any environment

Martin McCloy, SAP Business Manager, Caterpillar

Martin McCloy

SAP Business Manager, Caterpillar

Martin McCloy has 15 years’ SAP experience in supply chain and black belt roles and is currently SAP Business Manager. He has been closely involved in driving improvements in corporate Data Governance initiatives. His background includes management consulting with PwC across multiple sectors and industrial engineering experience with Michelin and General Motors.

12:55 - 14:25
Lunch & Exhibits
Perspective Session
13:25 - 13:50
Implementation of Data Quality Services within GSK R&D

This session provides an overview of how GSK’s Pharmaceutical R&D organisation has implemented a Data Quality service supporting all R&D IT projects. The presentation will focus on the technology selection, the complexities of delivering data quality services in this environment, and the successes, challenges and future plans.

Colin Wood, Enterprise Information Architect, GSK

Colin Wood

Enterprise Information Architect, GSK

Colin Wood is the Enterprise Information Architect for GlaxoSmithKline Research and Development. In this role he has been responsible for establishing services for master data management, data quality and data modelling. Colin has more than 25 years’ experience within IT and during that time has specialised in Enterprise Architecture and information management roles. He is currently involved in the development of a Target Data Landscape, which will set the strategic direction for the establishment and consumption of master data across the full application portfolio for GSK R&D, and in the development of medicinal product master data solutions.

From Data Governance Strategy to Organisational Steady-State: Key Ingredients for Data Governance Success

When a national government establishes a vision to be ranked among the top five administrations in the world, it needs a clear and comprehensive strategy for optimising its operation in order to deliver the vision.
The Abu Dhabi Systems and Information Centre (ADSIC) is charged with responsibility for the IT and transformation agenda within the Government of Abu Dhabi, the capital of the United Arab Emirates. Last year, it launched its e-Government strategy under the banner of ‘Towards a Digital Abu Dhabi’ and showcased a set of initiatives developed to drive IT modernisation across Abu Dhabi over the coming years.
A key aspect of the strategy is the multi-year Abu Dhabi Government Data Management Programme. This major pan-government initiative, the first of its kind globally, aims to achieve the following goals across more than fifty Abu Dhabi Government Entities (ADGEs):

  • Standardise Data Management across ADGEs
  • Improve data quality and accuracy
  • Govern and control the data life cycle
  • Support decision-making systems
  • Facilitate and improve data exchange between ADGEs
  • Improve data transparency within ADGEs
  • Provide a single version of the truth to reduce data fragmentation.

During this case study session, the authors of this strategy provide an insight into the considerations for meeting these goals at a national level through data management.  The session includes the following topics:

  • Data Governance – Where are you now? How do you get started?
  • How to move from Data Governance Strategy to organisational steady-state  – key pitfalls to avoid and how this applies to your organisation
  • How to unlock measurable value from data – by aligning people, processes and technology to metrics crucial for digital transformation success.

Jason Edge, General Manager - Entity Middle East and Vice President Global Advisory, Entity Group

Jason Edge

General Manager - Entity Middle East and Vice President Global Advisory, Entity Group

Jason Edge is a value-driven IT Strategist, Enterprise Information Management (EIM) Evangelist, Transformation Lead and Enterprise Architect, with international delivery experience across a range of sectors, including Financial Services and Central Government.  He specialises in Data Governance and Management and is skilled at working with executive stakeholders across an organisation to shape strategies and achieve their vision. His mission is to drive improvement through pragmatic and appropriate use of best-practice frameworks and approaches – establishing and implementing Enterprise Information Management and Data Governance strategies that align policy and standards with corporate objectives.

13:55 - 14:20
Data Quality: Central Key Success for Big Data

Data-driven companies increasingly face the challenge of actively maintaining and developing the production factor “data”; above all, they need reliable master data. Master data is an important cross-cutting component, involved in operational processes and business decisions as well as in data evaluations and analyses and in products and services. Institutionalised master data and data quality management are therefore essential for efficient data management and a value-oriented use of data.
The session “Data Quality: Central Key Success for Big Data” deals with subjects including:

  • Importance of reliable master data for digitalization of economy
  • Operative processes and technology

Andreas Stock, Head of Presales, zetVisions AG

Andreas Stock, Head of Presales, joined zetVisions in 2006. He is responsible for all presales activities of zetVisions AG and acts as the interface between product management and customers. Before joining zetVisions he worked for ten years at the software companies Oracle (Hyperion Solutions) and SAS Institute. He has sales experience in Enterprise Performance Management, Business Intelligence, Legal Entity Management and Master Data Management.

Leveraging Cloud MDM for Cloud First Strategies and the Hybrid IT Organization

Organizations of all sizes are fast adopting and launching SaaS/PaaS initiatives to modernize their IT infrastructure, embrace big data analytics, and realize value from the ever maturing cloud economy.
Given this transformation, is your MDM strategy aligned with your “cloud first” strategy for your systems, applications, data sources and data repositories?
Dell Boomi Cloud MDM provides an affordable, configurable solution to ensure the agility and integrity of data that companies rely on to drive business decisions and insight.
In this session, you will hear how our customers have used Dell Boomi Cloud MDM to succeed with their new SaaS/PaaS investments (e.g. Salesforce.com, NetSuite, Workday, SuccessFactors, Taleo) while continuing to leverage their existing investments in on-premises applications, data and middleware, all in a hybrid IT deployment environment.

Mike Liu, Director Product Marketing, Dell Boomi

As Director of MDM Product Marketing at Dell Boomi, Mike Liu is responsible for MDM Product Marketing Management both internally and externally with a specialty in healthcare and life sciences solutions. Prior to Dell Boomi, Mike held Senior Product Marketing Management roles at RMS, Tibco and Informatica.
He received a Masters in Engineering from University of Michigan and a Masters in Computer Science from the University of Florida.

14:25 - 15:10
Using Data Quality to Enable MDM for Real-Time 360° Customer View

While traditional approaches to data management have made great strides, maximizing the quality of that data to gain a 360-degree view of a customer remains a challenge for many organizations. Learn in detail how a leading service provider is helping prominent UK customers, including one of Europe’s top home improvement retailers, the largest insurance price comparison shopping service in the UK, and a leading business process outsourcing (BPO) firm, derive value from their data assets. You will also learn how applying data quality capabilities is helping companies gain a 360° view of their customers by overcoming their MDM challenges, ultimately improving their ability to monetize their data. This session will show how a unified platform can serve as a super-charged marshaling area for accessing, cleansing and delivering high-quality data that helps provide a golden record for each of an enterprise’s customers. Topics include:

  • Building a dynamic, integrated customer view via extracting, cleansing, matching & linking data from virtually any data source (structured or unstructured)
  • Provisioning real-time, event-driven customer data integration (CDI) within an outside service level agreement (SLA) of minutes, and in some cases, seconds
  • Applying real-time CDI to resolve and master marketable entities for accounts, key group accounts & territory customers (individual within a territory)

Steve Cox, Senior Technical Consultant, dbg

Steve Cox is a direct marketing veteran and currently a senior technical consultant at independent software consultancy, dbg, in Bristol, UK. At dbg, Steve has used his long experience in data management and data driven marketing to build its Data Governance Practice. As a senior technical consultant, Steve looks for wherever there is an opportunity to “reach back into the client data” to improve process, management and governance of data within line of business systems and data warehouses prior to it being consumed by internal functions or by external service providers and partners. With over two decades of direct marketing and data experience, Steve was instrumental in taking the UK’s first email service provider to market and served on the Direct Marketing Association’s Email Marketing Council as their deliverability expert for several years, regularly contributing to marketing seminars as part of the DMA’s education program.

Todd Hinton, Vice President of Product Strategy, RedPoint Global

Todd Hinton is the vice president of product strategy at RedPoint Global, a leading provider of data management and customer engagement software. Todd brings over two decades of product management and executive leadership to the company, providing strategic direction for RedPoint’s data management products, including RedPoint Data Management for Hadoop. Prior to joining RedPoint, Todd served as executive vice president for Bernard Data Solutions, where he was responsible for the overall technology direction of the company’s CRM SaaS application serving the nonprofit industry. Todd specializes in data quality and in building high-performance database applications capable of querying vast amounts of data in high-volume environments. Follow Todd @toddondata

Semantic PIM - MDM using a Graph Data Model

To efficiently manage complex and heterogeneous data across its entire international supply chain, Schleich, a famous German toy manufacturer, was looking for a more agile, easier and simpler Product Information Management (PIM) solution. Enter Neo4j: Schleich used the leading graph database as the core database for its new semantic MDM solution, benefiting from its high scalability, agility and performance. The solution enables the development of a semantic data model from the design stage onwards. Departments throughout the company can easily access all relevant data in real time via specifically tailored apps. With this graph-based system, Schleich is now even better able to meet the high quality requirements and country-specific safety regulations of the toy industry. Key topics include:

  • Identifying the business value of graph databases such as higher flexibility and performance
  • Creating a semantic data context for MDM services to enhance BI decision making
  • Establishing greater workflow efficiency across the supply chain to ensure high quality management

Dr. Andreas Weber, Vice President Operations, Schleich

Dr. Andreas Weber has been Head of Operations at Schleich GmbH since 2009. He is focused on consolidating and streamlining the operations processes and data handling, while taking into account internal as well as customers’ and suppliers’ requirements.  Previously Dr. Weber was responsible for logistics, IT and technical product development at Schwan STABILO Cosmetics GmbH & Co KG. He also headed all4cosmetics during its project phase as a “virtual enterprise”.

Master Data Governance - Taking Complexity and Making it Simple

Yara, the world’s leading fertilizer organization, started its Master Data Governance journey in 2014. The approach taken has been to identify and use best practices and simplify these for implementation in Yara.
The presentation will explain:

  • Establishing the strategy
  • Implementing “governance” (who has the right to make decisions)
  • Implementing Data Quality dashboards
  • Implementing the workflow solution

Yara’s aim has been to make Master Data Governance understandable for the business stakeholders.

Paul Lucas, Head of Master Data Governance, Yara International

Paul Lucas is Head of Master Data Governance for Yara International ASA. He has been in this role since 2014. Paul has extensive experience working in multi-cultural, international organizations for the last 33 years. He has worked in most roles in large internal IT service delivery organizations and brings his experience from those roles to Master Data Governance.

Establishing Data Governance in a Greenfield Data Driven Organization

Data Governance is an (almost) invisible force that can make or break your data-driven initiatives. As Big Data is increasingly recognized as a source of competitive advantage, companies embracing these strategies quickly learn that their decisions cannot be any better than their data. The need for embracing data governance becomes clear, but in Big Data greenfields it is anything but obvious how you make a case for addressing ownership and decision rights for data-bearing systems. Learn from our journey to make data an important asset that is as valuable and important as more commonly recognized corporate assets.

Tom Breur, VP Data Analytics, Cengage Learning

Tom Breur has a background in database management and market research. He has specialized in how companies can make better use of their data. He is an accomplished teacher at universities, MBA programs, and for the IQCP (Information Quality Certified Professional) and CBIP (Certified Business Intelligence Professional) programs, and a regular keynoter at international conferences. He is currently a member of the editorial boards of the Journal of Targeting, the Journal of Financial Services Management, and Banking Review. He is Chief Editor of the new Palgrave Journal of Marketing Analytics and has been cited, among other places, in Harvard Management Update on state-of-the-art data analytics. Follow Tom @tombreur

Building a Business Case for Data Quality – it’s No Laughing Matter

In order to build a robust business case for change, Comic Relief first had to quantify the issues within their fundraiser database and to be able to demonstrate the improvements achievable through investment in Data Quality tools.
To this end, they embarked on a 15-day project in collaboration with a potential solutions provider, designed to provide a high-value, light-touch method of in-depth analysis and quantification of a representative dataset. The objectives of this analysis were to:

  • Identify issues that would impact the program
  • Set a benchmark of desired data quality against which improvements could be measured
  • Improve, update, clean and match data to demonstrate potential benefits

The engagement required the input of key business stakeholders and helped to stimulate interest in the importance of data quality by highlighting the business and financial benefits available.
In this session you will hear perspectives on:

  • Understanding the importance of having a financial benefit statement to build a business case for change – and how you must constantly validate what it is you are trying to measure and improve
  • The importance of choosing the correct business process to form the basis of the investigation/project: is the problem big enough, and does it impact our key strategic initiatives?
  • How effective collaboration internally and externally is key to success

Yemi Okunade, Head of Data, Comic Relief

Yemi Okunade is Head of Data for the renowned charity Comic Relief. A business and technology hybrid, he is motivated by unlocking the value data adds to a company’s proposition, whether through implementing data solutions such as a Single Customer View (SCV), marketing automation tools and BI platforms, or through producing insightful analytics, modelling and reporting to increase the effectiveness of marketing any product or service.

15:15 - 16:00
Applying MDM Patterns for Enterprise DM

The Dutch Tax and Customs Administration (DTCA) needed to establish a consistent set of master data to enable the business to deliver a high-quality and efficient service. The solution developed is a Master Information Hub implemented using IBM’s MDM solution on PureApplication systems. The DTCA implemented this solution as a software factory utilising MDM, InfoServer and CDC on the PureApplication platform using PureApp Patterns. Data is currently retrieved from 15 sources and will grow to approximately 45 this year and 80 next year. Efficiency is a key driver, and the deployment of patterns for the various environments (development, test and production) is completely automated. Environments are built and tested in a day, where previously it took up to three months. CDC subscriptions and ETL jobs feeding data into the staging of MDM and the data warehouse are ready for production in less than two weeks. The plan going forward is to bring in new internal and external sources as well as implement a Data Governance program using IBM’s Information Governance Catalogue. Topics include:

  • Leveraging the reality that “Information is the most important asset” for the DTCA
  • Applying MDM patterns for data provisioning & data management
  • Getting the basics right by using PureApp patterns for software delivery & development

Anneke Karels, IT Architect, Dutch Tax & Customs Administration

Anneke Karels is an IT architect with over 25 years of experience in the IT industry. She holds a degree in Social Sciences as well as an MBA, and is TOGAF 9 and SAFe Agilist certified. Anneke started her IT career at a Dutch retail company and has worked for the last 15 years as an IT architect at the Dutch Tax and Customs Administration. Her career now focuses on (master) data management. Her main interests are application integration and a model-driven approach to software architecture. Within the DTCA, Anneke is the lead architect of the Center of Excellence for Data Provisioning.

Data Integration & Governance Platform for Digital Self Service

Sligro Food Group is a European leader in food production, food service and food retail.  It divides its business into two segments: Food Retail as a wholesaler and retailer and Food Service as a wholesaler. Learn how Sligro created a unified set of predictive analytical capabilities that bring together data to facilitate information based collaboration across the “delivery” and the “cash & carry” businesses. The essence of this program is enabling self-service for analytics in the transformation from a traditional business to an increasingly digital-driven business.  This transformation and its critical foundation have been enabled by a robust data integration and governance platform powered by the IBM MDM Server platform.  This session explains how Sligro creates a complete information landscape with MDM, Metadata and Analytics. It also covers how they became more efficient and performance-oriented, better connected to the network economy, and driven by self-service. The future of manufacturers, wholesalers and retailers will not be dictated by them, but by consumers, bringing a whole new meaning to the ‘At Your Service’ model.
This session will discuss such topics as:

  • Getting the master data foundation right – technology, business processes & culture
  • Making the “customer profile” the center of the business across lines of business
  • Acknowledging internally & overcoming the fact that this is as much a cultural challenge as a technical challenge

Ivo-Paul Tummers, Managing Director, Jibes

Ivo-Paul Tummers, BSc/MBA, held several international positions in mechanical engineering in the aerospace/defence industry. In the mid ’90s he became involved in resource planning, in the era when middleware was introduced. From this perspective he was an early adopter of SOA principles, and consequently recognised the increasing demand for accurate, complete, in-context and actionable decision streams. This resulted in the foundation of Jibes. Today Jibes is a leading European implementation partner with a focus on data integration, governance and data analytics.

How to Successfully Implement an Enterprise-Wide Data Governance - the Case of YapiKredi

YapiKredi is Turkey’s fourth largest private bank with over 4 million domestic customers and 1,000 branches across the country. The bank maintains leading positions in value-generating services and products. However, management realized that the bank was not using data and analytics to their full potential. Many data assets were managed in silos and with varying levels of rigor in data quality and accessibility. In mid-2015, the bank decided to transform its data management practices through an enterprise-wide data governance initiative. This initiative was embedded in an overall data transformation program including a technical transformation of its data warehouse as well as scaling up advanced analytics activities.
A federated governance design was chosen, and about 30 data domains have been defined, each assigned to a business owner. The effort has been orchestrated by an enterprise data office (EDO) acting as the design and control authority for all data within the bank. The EDO also oversees the transformation, communicates changes to the larger organization and acts as a partner to data owners.
Gökhan and Matthias will present both the approach as well as the key design elements of the target data governance solution. They will also provide insights about the current status of the implementation as well as the key challenges experienced in changing data management practices within business and IT towards a data driven mindset.

Gökhan Gökçay, Group Director BI and DWH, YapiKredi

Gökhan Gökçay is Senior Vice President responsible for Information, Customer and Workflow Management solutions at Yapi Kredi Bank. The Information Management domain consists of the Data Governance, Enterprise Data Warehouse, Big Data, BI and Analytics teams. He aims to produce and further position ‘information’ as a shared corporate asset and to deliver solutions that yield value and actionable insight. Prior to Yapi Kredi, Gökhan was head of the Financial Services industry at NTT Data Turkey and delivered transformation programs for leading corporates such as Yapı Kredi, Garanti Bank, Turkcell and Avea. Gökhan also worked for other technology consulting and services companies such as CSC and Atos (formerly Siemens Business Services), serving large corporates such as Lloyds Bank, Barclays Capital and Legal & General.

Matthias Roggendorf, Senior Expert, McKinsey & Company

Matthias Roggendorf is a Senior Expert with McKinsey’s Business Technology Office in Berlin and leads McKinsey’s global data analytics service line. He supports clients globally on data strategy and data transformation topics. In addition, he works with clients on developing their future data architecture by leveraging Big Data technologies and innovations.

Automating Data Governance Policy: How Organizations are Automating DG Policy Implementation in Data Management Software

As more organizations develop their data governance competencies, one question lingers: how should organizations track, audit and automate the implementation of these policies in data management?
In this talk, Nicola and Conrad posit one approach. They describe how organizations are using cross-cutting workflows, feedback loops and integration between traditional data governance tools (e.g. business glossaries) and master data management software to bridge the gap between data governance and data management.
This session will include case studies and lessons learned from multiple industries.

Conrad Chuang, Director Product Marketing, Orchestra Networks

Conrad Chuang is Director of Product Marketing at Orchestra Networks, where he is responsible for positioning the EBX MDM, Data Governance and RDM products in the market. Mr. Chuang is a frequent speaker at industry conferences and events, where he brings significant insight from his working relationships with organizations across multiple industries and his understanding of the key challenges organizations face in the management of their reference data. He received his BS in Economics and Master of Arts in International Commerce & Policy from George Mason University, followed by his MBA in High Tech Marketing & Finance from Babson College’s Franklin W. Olin Graduate School of Business.

Nicola Askham, The Data Governance Coach

Expert Panel: Building your Career in Data and Data Governance

Data specialists have never been in more demand, but are those with real data skills adequately rewarded, and are they under threat from those jumping on the Big Data bandwagon?
This session will look at general trends in business technology, Big Data and regulation, their impact on your career, and strategies to build your career in data. Panellists include business technology, pay and HR experts and a leading headhunter.
This session will highlight:

  • Pay and HR trends for business technology and HR experts
  • Opportunities and threats to those building a career in data
  • Strategies for building your career in data and business technology

Mike Simons, Associate Editor, CIO.co.uk, ComputerworldUK and Techworld, IDG

Mike Simons is Associate Editor for CIO.co.uk, ComputerworldUK.com and Techworld, joining IDG from Reed in 2006, where he worked on Computer Weekly and ComputerWeekly.com. He was News Editor at the launch of ComputerWeekly.com in 2001 and News Editor of the combined Computer Weekly and ComputerWeekly.com operation from 2003. Mike helped Computer Weekly secure the Periodical Publishers Association awards as either “magazine of the year” or “campaigning magazine of the year” in four years out of five. Mike joined IDG as Launch Editor of ComputerworldUK and took over responsibility for Techworld in 2011.

Ken Mulkearn, Principal, Incomes Data Research

Ken Mulkearn is one of the principals of Incomes Data Research, established in 2015. Prior to this he was Head of Pay and Research at IDS, where he led the Pay & Reward, Executive Compensation, and Research Services teams. He was Editor of the ‘Pay & Reward’ component of the ids.thomsonreuters.com online service, the monthly IDS Pay Report and a range of specialist sector reports, including ‘Pay and benefits in the public services’, ‘Pay and conditions in call centres’, ‘Pay and conditions in engineering’ and ‘Pay in retail’. As well as reporting on reward developments across the economy, his teams were responsible for compiling the data that appears in IDSPay.co.uk, the online pay benchmarking tool from IDS. During his time at IDS Ken covered pay developments across the private and public sectors and he was closely involved in a large number of research projects for a variety of external clients. He speaks to a wide range of audiences on pay issues, and regularly broadcasts on radio and TV. He holds an MSc in social research methods from the London School of Economics, where he also took modules in industrial relations. His primary degree is from Trinity College, Dublin.

Peter Segal, Managing Partner, Ogilvie & Associates

Peter Segal is a Managing Partner with Ogilvie & Associates, a boutique executive search firm partnering with technology product and services firms. Peter has more than twenty-five years’ experience of partnering with technology clients, building high-impact executive teams in eighteen countries across Europe, North America and Asia. His expertise covers both venture-backed companies looking to grow to the next stage of development and publicly quoted companies, building leadership teams for European-owned companies and for US-owned companies looking to expand their international operations. His experience includes partnering with software companies across a wide range of enterprise-class applications and vertical markets. Peter has acted in an advisory capacity to firms on a wide range of issues such as organisational design, remuneration, personal development and succession planning. He has also led management assessment initiatives benchmarking client management teams against others in their sector, as well as outsourcing services.

Michelle Teufel, Strategic Change Leader - Business and Technology Transformation for the Digital Age

Michelle Teufel is a global executive based in London with extensive experience in IT, Information Management and Business Intelligence. For the past year Michelle was interim CIO for Premier Farnell plc, and she has spent her career delivering multi-year transformation programmes in both business and IT, including establishing and leading the Global Information Management function for Premier Farnell, covering Data Operations, Data Governance, Information Security and BI & Reporting Services.

16:00 - 16:30
Break & Exhibits
16:30 - 17:15
Expert Panel: Best Practices in MDM of Customer Data

The ability to deliver a single, trusted, shareable view of the customer is universally seen as a key business strategy for commercial and public-sector enterprises. Yet most enterprises struggle to provide even a 90°, let alone a 360°, view of those entities. While many organizations and vendors originally focused on derivatives of CRM, ERP or other vertical industry solutions as their system of record, industry-leading enterprises have since moved on to multi-domain MDM, Big Data/Social MDM and other key augmentations of that original concept. This panel will focus on answers to such questions as:

  • Establishing the seeds to grow customer data integration (CDI) — e.g., compliance & risk management; cost optimization & efficiency; cross-sell, up-sell & retention
  • Determining the pace to embrace multi-domain MDM, Big Data, Cloud MDM & Social MDM
  • Rationalizing an “MDM of CUSTOMER hub” architecture with the reality of multiple ERP, CRM & other MDM hubs

Aaron Zornes, Chief Research Officer, The MDM Institute

Sharon Lankester, Enterprise Data Governance Leader, Dun & Bradstreet

Ho-Chun Ho, Global Head of Data Governance and Management, JLL

Ho-Chun Ho guides and oversees global data governance and management for the Corporate Solutions business line at JLL. Ho-Chun has over twenty-five years of data management experience across banking, insurance, financial services, pharmaceuticals, research, telecommunications, media and e-commerce.

William O’Shea, Data Architect and DG Consultant, Saudi Telecom Company

Will O’Shea is an experienced Data Governance consultant specializing in the data architecture aspect of DG. Will is multi-disciplinary, having worked in a wide range of roles from Project Manager to Technical Architect at blue-chip organisations including STC, the NHS, Pfizer, and Johnson & Johnson. He is an Oracle Certified Professional (OCP) and has won several awards for his implementation methods. Follow William @WillOShea

Short Path to Sustainable Master Data Quality

Headquartered in Switzerland, Ferring Pharmaceuticals is a research-driven, specialty biopharmaceutical group active in global markets.  In 2011, Ferring Pharmaceuticals undertook a global ERP re-implementation, whereby a Global Master Data Management organization was set up to define Master Data rules and standards that meet the business requirements and support the daily operations and system transactions.  This significant joint effort by the Business and Corporate Information Systems further highlighted the need to sustain the master data standards and quality by automating data quality checks. This session will focus on the implementation of the Informatica Data Quality solution at Ferring Pharmaceuticals, by discussing topics such as:

  • Masterminding an overall design of the global data quality solution
  • Designing custom functionalities (white-listing, reporting, rulebook…)
  • Leveraging & sustaining the lessons learned

DV Singh, Director Enterprise Services, AN Info Systems GmbH

DV Singh has more than 18 years of experience in Information Systems management across several sectors around the world, including financial services, telecommunications, the public sector and manufacturing. Mr. Singh has broad expertise across multiple DBMS, ETL and related IT platforms. Apart from being the author of a generic DQ platform, he is also the owner of an SOA design platform which has laid the foundation for successful high-stakes global projects at organisations such as ABN, AXA and Maersk. Mr. Singh holds degrees in Computer Science & Engineering and Mechanical Engineering from India, alongside other national-level (India) qualifications. He earned his Bachelor of Computer Science from North-Eastern Hill University (Shillong, India).

Ethics in an Information Management Context

This presentation follows the view that “Action is indeed the sole medium of expression for ethics”, providing an overview of a practical approach to integrating ethics into information management practices.
From the European Data Protection Supervisor publishing an Opinion on Big Data Ethics, to countless articles about the call from data scientists for clarity on ethics, there is a growing consensus that “something must be done”.  However, much of the discussion of Ethics takes place in the abstract, and the real challenge in commercial and not-for-profit organisations is ultimately what happens in reality when the organisation is faced with the power of modern information management capabilities. Ethics risks being seen as another “tick box” item to be taken care of by the “Ethics people”, just as Information Quality is often seen as the role of the “Quality department”.
Key takeaways for this session include:

  • An overview of Ethics and their relevance to Information Management practice
  • The ethics of privacy and Human Rights
  • An overview of practical methods to align ethics with Information Governance
  • Risk management and information management practices

Katherine O’Keefe, Data Governance & Privacy Consultant, Castlebridge

Dr Katherine O’Keefe is a Data Governance and Data Privacy consultant with Castlebridge. Katherine has worked with clients in a variety of sectors on consulting and training engagements since starting with Castlebridge. In addition to her professional experience in Data Governance and Privacy, Katherine holds a Doctorate in Anglo-Irish Literature from University College Dublin and is a world-leading expert on the Fairy Tales of Oscar Wilde. Follow Katherine @okeefekat.

Leveraging Regulatory Initiatives to Deliver Value through more Effective Information Management
Leveraging Regulatory Initiatives to Deliver Value through more Effective Information Management

We are in an era where compliance is a must-have, but can we be “just compliant”? As our companies have to ensure business continuity, finding the right balance between ‘established’ and new information management systems, or between localised and integrated information architectures, has become a day-to-day challenge for most of us. So when your regulatory authorities come up with a new information standard, it rapidly calls for a major strategic choice: “contain and minimise the impacts”, or “seize the opportunity as a driver for change”.
Quentin will examine this case through a concrete example: “The roll-out of the new Identification of Medicinal Product (IDMP) legislation in GSK Vaccines”.  By attending this session you will gain:

  • An overview of the methodology we have applied
  • An overview of the challenges we have faced
  • Thoughts on how any company may improve efficiency through external compliance

Quentin Grignet, DMP - Project Lead, GSK Vaccines

Quentin Grignet

DMP - Project Lead, GSK Vaccines

Quentin Grignet has been responsible for the GSK Vaccines R&D Master Data Management group since 2010, and is the current project leader for the GSK Vaccines IDMP initiative. Mr. Grignet plays an active role in the development of the R&D information management strategy, where his work is focused on integration and simplification, as well as on bringing innovation to GSK R&D information management. Following his work on the design & implementation of the two R&D Project Management systems, he assists and advises project managers and related stakeholders. He has an MSc in International Business & Management with a Specialization in Information Strategy for a Global Economy from Sheffield Hallam, and a BSc in Information Technology.

Jump-start Your Data Governance and Information Quality CoE Program in Zero to 30 Days
Jump-start Your Data Governance and Information Quality CoE Program in Zero to 30 Days

This presentation is a practical primer for building a Data Governance and Information Quality Center of Excellence program. This session will include lessons learned from Michael’s experiences of building a Healthcare oriented Data Governance and Information Quality program.
Though Michael’s experiences revolve around the Healthcare space, the principles and best practices can be applied across industries. Michael will touch on topics that illustrate how to:

  • Establish a Data Governance Council and use it as a driver for improving information quality within your organization
  • Develop a federated Data Stewardship model
  • Produce quantitative and qualitative results to achieve senior leadership buy-in
  • Manage business and technical metadata assets

Michael Davis, Data Quality Center of Excellence Engineer Lead, Cigna Healthcare

Michael Davis

Data Quality Center of Excellence Engineer Lead, Cigna Healthcare

Before joining Cigna and becoming the Data Quality Center of Excellence Engineering Lead, Michael Davis served as CEO and President of OmegaSoft Consulting, where he provided customers with staff augmentation and database management services. Mr. Davis has over 18 years of information technology experience and has presented and published articles on database management. Mr. Davis hails from St. Croix, U.S. Virgin Islands, and is an avid golfer and runner. He is also very passionate about health and wellness, and started the first Connecticut chapter of Health 2.0, an organization with a mission to improve healthcare outcomes using social media and other web-based technologies.

17:15 - 18:30
Drinks, Reception & Exhibits
Wednesday 18 May 2016: Conference Day 2 & Exhibits
09:00 - 10:00
PLENARY KEYNOTE - Red Monkey Innovation Management; Organisations in Search of a New Balance
PLENARY KEYNOTE - Red Monkey Innovation Management; Organisations in Search of a New Balance

The world is changing faster and faster.  Organizations, companies, schools and regions have to adapt to a world that is flooded with information, and need to dramatically increase their capacity to learn and innovate. Today’s organisations and companies, however, are not able to create the right learning and working environment that enables and energizes disruptive innovation by harnessing passion and talent. We have unintentionally transformed talented adults and children into passionless sheep. We have to rethink the organization of working and learning. We have to boldly go for disruptive business innovation powered by disruptive culture innovation. This session is a plea for a dramatic change in the organization of work and education. After this session, 2D, 3D, Sheep and Red Monkeys will be branded into your brain. You will become disrupted.

  • Attendees will learn that transforming organizations into real learning and innovating organizations will not be possible with consensus but will be driven by conflicts
  • Attendees will learn a new model for disruptive innovation: Red Monkey Innovation Management
  • Attendees will understand the impact of today’s information luxury on the organization of learning and work. We have to get rid of our diploma-addiction and go for Competence Playlist Based learning and working

Jef Staes, jefstaes.com

Jef Staes

jefstaes.com

Jef Staes is an authority on learning processes, innovation and culture change. With 25 years of professional experience (GTE Atea / Siemens), he currently assists CEOs and managers in finding a comprehensive answer to the changing dynamics of today’s market. Jef Staes answers a crucial question: “Why don’t organisations learn and innovate fast enough?”.  As an author, speaker and expert, he not only awakens people, but also presents them with a unique concept to guide them through the necessary changes. He is a passionate and inspirational keynote speaker. His story is a guaranteed eye-opener and his thoughts on the future of business and education inspire many. With striking metaphors, he tackles the most fundamental issues organisations struggle with.  Follow Jef on Twitter: @jefstaes.

10:00 - 10:30
Break & Exhibits
10:30 - 11:15
MDM Keynote: Field Reports for 'Top 15' MDM Solutions
MDM Keynote: Field Reports for 'Top 15' MDM Solutions

Evaluating MDM solutions is comparable to purchasing your first home: too many new variables, a lack of transparency in pricing, and high-pressure sales tactics. On top of this flux, IT executives have to contend with the marketing dogma of ongoing “stack wars” among the mega vendors and the dogmatic “we are the world” viewpoints of MDM and (even) Business Process Management (BPM) vendors. To cope during 2016-17, many large enterprises will increasingly mandate a unified approach to both data and process architecture/design/management tools. This session will focus on the why and how of MDM platform technical evaluations by providing insight into:

  • Understanding the pros & cons of the dominant architectural models & evaluation criteria, e.g., pro-active data governance, identity resolution, hierarchy management, scalability, Big Data & Cloud integration capabilities, etc.
  • Assessing the vendor landscape, e.g., registry, data hub, ultra-hub, EAI/EII, portals, SOA-based web services, data service provider, system-centric BPM, human-centric BPM, etc.
  • Applying a rigorous methodology to product evaluations for both mega vendor solutions (IBM MDM, Informatica MDM, Microsoft MDS, Oracle MDM, SAP MDG) and pure plays (Ataccama, IBI MD Center, Kalido, Orchestra Networks, RedPoint Global, Riversand, Semarchy, Stibo, Talend, Teradata, TIBCO, Visionware, et al.)

Data Governance Keynote: An Agile Data Strategy for the Modern Enterprise - Regaining Order In a Sea of Data Chaos
Data Governance Keynote: An Agile Data Strategy for the Modern Enterprise - Regaining Order In a Sea of Data Chaos

For most organisations today, the data landscape is becoming increasingly complex. Transaction systems are now spread both on premises and in the cloud, multiple data warehouses and data marts often exist, and big data platforms have also entered the enterprise.  Data quality issues in this kind of landscape can cause significant problems and be hard to eradicate. In addition, new data sources continue to grow, and newly collected data is often too big to move and process centrally. So how do you deal with all this to ensure data remains trusted and that data governance keeps data under control? This session looks at this problem and shows how to implement an agile data strategy to manage data in a distributed and hybrid computing environment.

  • The increasing complexity of a distributed data landscape
  • What do you need to consider in a modern data strategy?
  • Managing data in a distributed and hybrid computing environment
  • Multiple tools – self-service DI vs EIM – how do they fit together?
  • Dealing with data when it is too big to move
  • The move towards data as a service inside the enterprise
  • The role of data virtualisation in a modern data strategy

MDM Track 1
MDM Track 2
MDM Track 3
DG Track 1
DG Track 2
11:20 - 12:05
BEST PRACTICES: Establishing a Data Governance Authority in the Very Large Global Enterprise
BEST PRACTICES: Establishing a Data Governance Authority in the Very Large Global Enterprise

Thomson Reuters is the world’s leading source of intelligent information for business and professionals, combining industry expertise with innovative technology to deliver critical information to leading decision makers in the financial, legal, tax and accounting, healthcare, science and media markets.
Thomson Reuters’ vast ecosystem of data sources posed numerous challenges across its business, including: multiple systems and data sources mostly established through acquisition; issues surrounding the validity, accuracy and uniqueness of core data across the Enterprise; and, inconsistent workflows and manual processes. Governance issues included: few restrictions on who maintains core data across the variety of data models, structures and standards; varying governance models with varying levels of sophistication; and, varying levels of process documentation and supporting toolsets across the multiple ‘trusted’ data sources consumed across each strategic business unit.
A “single customer view” was mandated as a key Enterprise strategic business goal, with the project encompassing all strategic business units and thirty different sources. Moreover, the initiative focused on core customer and prospect data, interfaces and data flows, legal and operational hierarchies, and data quality (including centralised governance of customer data).
Topics covered in this session include:

  • Identifying the key principles to adhere to & pitfalls to avoid, in establishing a Data Governance Authority & in implementing Data Governance
  • Understanding the importance of data description as a foundation for data management
  • Applying lessons learnt in delivering a complex MDM programme across multiple global business units

Kennedy Warwick, Head of Customer Data Strategy & Management - Finance & Risk, Thomson Reuters

Kennedy Warwick

Head of Customer Data Strategy & Management - Finance & Risk, Thomson Reuters

Kennedy Warwick is the Head of Customer Data Strategy & Management for the Finance & Risk business of Thomson Reuters. Mr. Warwick has over 20 years’ experience within data management in a career spanning FMCG, Legal Services, Government and Financial Services sectors. His current focus is to drive the strategic data/information delivery, data governance, operational leadership, and program/project management in support of F&R’s strategic Sales & Marketing initiatives. Mr. Warwick is the senior stakeholder from the F&R business on Thomson Reuters MDM implementation, in support of the enterprise-wide view of the customer. He earned his BA (Honors) in Politics/Economics from the University of Hertfordshire and his MSc, Information Systems & Systems Analysis from the University of Brighton.

Agile MDM for Global Multi-Channel PIM
Agile MDM for Global Multi-Channel PIM

Sartorius is a leading international pharmaceutical and laboratory equipment supplier (~€900M and ~6,000 employees CY2015).  The company manufactures, markets and sells a highly complex product assortment, making Product Information Management (PIM) an important challenge.  The prior PIM solution at Sartorius included a very manual process with a limited UI with few capabilities for managing multiple taxonomies, product lifecycle, workflow management, multi-channel distribution, reporting, and channel-specific translation and localization.  Riversand’s PIM solution, MDMCenter, has become the centralized, single-point master data system for aggregation, management and distribution of enterprise product data and digital assets at Sartorius.  This session will focus on the journey to MDM and PIM by discussing these topics and more:

  • Tailoring a multi-domain MDM solution to support guaranteed governance, configuring of multiple structures, interaction with digital media assets & information syndication
  • Establishing business process management & data flows for precise, powerful & automated data syndication across various channels such as e-procurement, e-commerce & others
  • Leveraging PIM to govern medical device industry regulatory compliance (globally & locally) – e.g., International Medical Device Regulators Forum (IMDRF) & Unique Device Identification (UDI)

Nicholas Rioux, CTO, Senior Partner, enscight

Nicholas Rioux

CTO, Senior Partner, enscight

Nicholas Rioux is the technical leader for enscight, a specialty boutique consulting firm focusing on eBusiness strategy and technical implementation for the life sciences marketplace. A long-term veteran of the life sciences industry, Mr. Rioux has spent years helping major life sciences firms hone their technical strategies and sell better online. Having started his career at age fourteen, working as a senior developer for many major B2C web businesses during the early days of online selling, he mixes a wide variety of technical expertise with an intense drive to improve the customer experience. Prior to his time leading enscight, he held a variety of technical and strategic roles at firms such as ThermoFisher Scientific. Mr. Rioux holds an MBA and an MIS degree from Boston University and attended both Northeastern University and Simon’s Rock College of Bard University.

Colin Price, PIM Data Manager, Sartorius

Colin Price

PIM Data Manager, Sartorius

Colin Price has worked on e-commerce and MDM systems for over a decade and a half. In the past, Colin managed consumer-focused publishing systems at Harvard Medical School. Most recently he managed PIM/MDM/DAM solutions at Sartorius Stedim North America Inc. During his career he has been a key contributor to multiple projects including e-commerce websites, downstream information systems, e-publishing systems, interactive tools, and branding initiatives.

Expert Testimony: Field Reports for 'Top 10' MDG Solutions
Expert Testimony: Field Reports for 'Top 10' MDG Solutions

Data Governance is critical to achieving sustainable and effective MDM.  Failure to execute Data Governance concurrently with an MDM program greatly decreases the probability of success and economic sustainability of MDM programs. Clearly, Data Governance is both synergistic & co-dependent with MDM.  When deploying MDM, a proper Master Data Governance (MDG) discipline should consider the business drivers, project scope, roles and people filling each role, policies and procedures, data quality, inheritability, social norms, and the business operating model.  Moreover, Data Governance is more than a single product or process, rather, it is an ecosystem of products, processes, people, and information.  At present, Data Governance for MDM is moving beyond simple stewardship to convergence of task management, workflow, policy management and enforcement.
Understanding the scope, diversity and limitations of current Data Governance solution offerings is tremendously challenging – even more so, given the fast pace of M&A and the complexities of integrating such diverse software portfolios.  Nonetheless, business and IT leadership chartered with defining and executing MDM programs need help to understand and navigate the number and variety of MDG options. During 2016, major systems integrators, MDM boutique consultancies and Tier 2/3 MDM solution providers will focus on productising Data Governance frameworks while mega MDM software providers struggle to link governance process with process hub technologies.  During 2016-17, vendor MDM solutions will finally move from “passive-aggressive” mode to “proactive” Data Governance mode. This session will review the current solutions in the market as well as provide a “top 10” list of evaluation criteria for such solutions. Topics include:

  • Understanding the “top 10” evaluation criteria for MDG solutions — e.g., E2E lifecycle management, Big Data & ECM support, DQ/ETL integration capabilities, etc.
  • Assessing the vendor landscape— e.g., passive, active, integrated, pro-active, & passive aggressive, etc.
  • Determining an enterprise-specific road map to evolve from a siloed, motley collection of DQ tools, processes & point products to a non-obtrusive enterprise MDG program (supporting multiple domains & federated data management groups).

Implementing a Data Governance Framework at Schroders
Implementing a Data Governance Framework at Schroders

The importance of implementing a robust data governance framework within any organisation cannot be overstated.  This presentation will focus on the approach we adopted at Schroders to implement a governance framework that will provide the foundation for supporting a new way of working throughout the firm.  Consequently, this should lead to improved data quality that is sustainable, and a framework ensuring that only change initiatives consistent with the firm’s data estate strategies are approved.

Barry Robinson, Head of Data and Governance Delivery, Schroder Investment Management

Barry Robinson

Head of Data and Governance Delivery, Schroder Investment Management

Barry Robinson has worked within the financial services sector for 33 years, including 20 years split between JP Morgan Investment Management and Schroder Investment Management, in a variety of roles which have given him a broad understanding of the asset management business and the importance of data.  He also spent 6 years working for SmartStream Technologies as a Project Manager, so he has a good appreciation of what is important to buy- and sell-side firms.

Charlotte Koolstra, Data Governance Analyst, Schroder Investment Management

Charlotte Koolstra

Data Governance Analyst, Schroder Investment Management

Charlotte Koolstra studied Law and Criminal Law at the University of Amsterdam in the Netherlands, and in 2014 she completed an MSc in Criminology and Criminal Justice at the University of Oxford. During an internship at the corporate crime law department of law firm Stibbe, Charlotte became interested in the financial services industry. Subsequent to her studies Charlotte joined Schroders and started working as a paralegal on a legal data verification project, after which she progressed into a data analyst role. Within Data Services Charlotte’s main focus has been the design and implementation of a data governance framework.

You Can Start Small and Quickly Deliver Benefits
You Can Start Small and Quickly Deliver Benefits

Getting buy-in for a huge and expensive data governance project is difficult; so why not start small, and cheap, and simple?  This session presents a practical case study of how Multifonds have achieved this with a very small team on the back of a large project implementation for a client.  Benefits have quickly been demonstrated in reduced analysis time and greater transparency of data use; and once patterns for success are established you can expand beyond the initial scope.  Governance must start as soon as a new piece of data is identified, and has to cover all uses of that data.  The process therefore needs to follow right through: from initial analysis to all eventual flows involving that data.  It started with a spreadsheet…

  • Case study of a successful data governance project with small beginnings
  • Best practice example around data dictionary and data usage; leading to creation of a data mart as part of a large financial project implementation
  • Proving the model – delivering ROI and early benefits including reduced analysis time, cost savings and greater transparency of data usage
  • Expanding and replicating the model – delivering benefits in one area and then expanding beyond the original scope

Nick Jones, Product Manager, Multifonds

Nick Jones

Product Manager, Multifonds

Nick Jones is a data management and data governance expert with a long background in financial services.  He is currently working for Multifonds producing a metadata hub to cover the data used in a series of major client projects – adding greater visibility and traceability to the analysis and development process, as well as providing added value to their clients through a distributed data dictionary.   His background is in financial data integration and reporting, and the design of systems to handle a very wide range of financial products, from vanilla to exotic.  He has been involved in designing and implementing Data Warehousing for large TPAs, Treasury and Private Banking developments, and unitised UK Life and Pensions systems.

12:05 - 13:35
Lunch & Exhibits
Perspective Sessions
12:35 - 13:00
Who's afraid of the Big (Bad) Data?
Who's afraid of the Big (Bad) Data?

The extent of an organization’s revenue growth, cost reduction, and risk mitigation ultimately hinges on its ability to exploit new technology and new data to uncover new opportunities and threats early and often.
None of this is feasible, however, if the organization cannot overcome data silos and put reliable data in the hands of those who need it most.
Big Data demands a new approach to information management – one that embeds data quality across business processes and provides fit-for-purpose data anywhere, to anyone, at any time. As a senior technology or data strategist, you must make sure you are putting in place flexible, scalable, and usable data quality solutions that can grow in sync with the evolving complexities of your technology and data infrastructure.
Please join us to hear Trillium’s perspectives on the ever more important role that a focus on Data Quality must play in developing today’s complex data ecosystems.

Chris Furlong, Senior Consultant, Trillium Software

Chris Furlong

Senior Consultant, Trillium Software

Chris Furlong is an experienced, versatile and commercially astute IT leader with extensive expertise in product management, product development and pre-sales consultancy. He has been focused on Business Intelligence, Analytics and Data Management technologies for over 20 years.

He has been involved with developing and delivering data management and business intelligence solutions across all verticals in all size companies, from SME to large multi-national enterprises, in multiple leadership roles from pre-sales and technical account management to project management.

Chris is most comfortable working in fast paced, agile environments, and when he is responsible for driving new product propositions to market as a key member of customer facing teams. He enjoys being at the leading edge of technology and is motivated by the challenge of delivering solutions that make a difference to people and businesses.

Filling the Gap: Master Relationship Management in the Face of Change
Filling the Gap: Master Relationship Management in the Face of Change

While MDM strives to deliver clean data, the fact is data never stops changing, nor do the opportunities for enrichment. Adding sentiment to customer data, understanding relationships between parties, and understanding the actions customers are taking all create a never-ending cycle of opportunities to enrich your understanding. Business users need to leverage this data, but the fragmentation of systems, an infinite number of segmentations, analytics and overall digital disruption create further demand for better data harmonization. Come to this session to hear some new ideas about innovative ways to enrich your master data and manage relationships during rapid business change.

John Evans, Director of Marketing, Magnitude Software

John Evans

Director of Marketing, Magnitude Software

John Evans is director of marketing at Magnitude Software, a company formed in 2014 from the merger of Noetix and Kalido. He has over 25 years of marketing and product management experience in the information management industry, including data warehousing and business intelligence, and has been involved in evangelizing master data management solutions for nearly a decade.

Chris Allan, Chief Technical Architect, Magnitude Software

Chris Allan

Chief Technical Architect, Magnitude Software

Chris Allan is Chief Technical Architect at Magnitude Software, and is responsible for the design and development of Kalido software products, including Kalido MDM.  He has ten years’ experience developing enterprise data management products, and works closely with customers to identify areas of innovation to build into the next generation of MDM solutions.

13:05 - 13:30
A Business First Approach to Mastering Data
A Business First Approach to Mastering Data

Many companies struggle to maintain clean, consistent data to run their operations, let alone successfully implement a master data programme to meet more strategic goals.  Increasing amounts of data proliferating across the business, and goals that evolve ever more quickly, mean failure rates for complex data consolidation programmes are high, and such programmes often don’t achieve what they set out to do.  In this session attendees will learn how new applications of data management technology, derived from best practices acquired through years of project delivery in this area, can help organisations derive value more quickly from a Master Data Management programme.

Ben Morrish, Technical Manager, Information Builders

Ben Morrish

Technical Manager, Information Builders

Ben Morrish is the Technical Manager for Information Builders (UK), with over 15 years’ experience in the Data Management, Business Intelligence and Analytics sector. In this role Ben heads up a team focused on helping companies determine the best solution for their Information Management landscape in line with their strategic goals. Ben gained a solid, practical grounding through roles as a consultant, through business and strategic experience heading up the BI team for a large global company, and through invaluable insight into the everyday operations and challenges that face companies across most major verticals.

The 5 Must Do's for Guaranteed Data Governance Success
The 5 Must Do's for Guaranteed Data Governance Success

What’s stopping your data initiatives from succeeding? Stuck in endless data committee meetings or just paralysed by fear of the colossal task ahead? In this session Diaku gives you 5 data initiative hacks to start a collaborative data revolution towards a more data driven organisation today.

Patrick Dewald, Director, Diaku Limited

Patrick Dewald

Director, Diaku Limited

Patrick Dewald is a Data Governance architect and founding partner in Diaku.  He has a wealth of experience designing Master Data Management and Data Governance solutions for financial institutions. Patrick has been heading up Data Governance initiatives, designing and implementing group-wide data services from the ground up, for the best part of 15 years. Patrick is recognised by his peers as a thought leader in the field of Data Governance.

Darius Clayton, Director, Diaku Limited

Darius Clayton

Director, Diaku Limited

Darius Clayton is an experienced change specialist and founding partner in Diaku. With a management consultancy background in business transformation and outsourcing he brings a practical, value-driven approach to data disciplines. Since 2007 his focus has been on data governance, collaboration, and the business view of the data asset. Darius has spent over 16 years working with institutions to control and improve their data while delivering tangible business benefits.

13:35 - 14:20
Avoiding the MDM, RDM & Master Data Governance ‘Money Pit’
Avoiding the MDM, RDM & Master Data Governance ‘Money Pit’

Given the substantial investment that enterprises undertake with implementation partners, the selection of the appropriate partner(s) must be given considerable scrutiny – not only to contain costs, but to ensure the success of these vital MDM, RDM and Master Data Governance initiatives.  Implementation partners such as systems integrators and specialist consultancies are more important than ever – not just because of the services:software ratio for such projects (universally described as approaching 4:1) but precisely because of an ongoing shortage of experienced MDM, RDM and Data Governance professionals.  In a recent MDM Institute survey of more than 1,200 such programs, systems integrators (SIs) were seen as essential to the success of the majority of such projects, yet previously incumbent SIs are becoming less dominant.  Expert assistance from SIs will remain especially critical and problematic to the success of these programs during 2016-17 as organisations deal with a shortage of MDM/RDM/MDG experience and tool expertise.
This session includes findings from a year-long readiness assessment of more than 75 leading consultancies to provide a balanced view of:

  • Understanding why SIs are essential to the success of your MDM, RDM & Master Data Governance projects
  • Structuring how an enterprise should evaluate the capabilities of “new” potential SI partners
  • Orienteering the SI landscape for both the traditional leaders as well as the new “young Turks”

Resetting Data Governance – Leveraging Successes & Strategic Planning
Resetting Data Governance – Leveraging Successes & Strategic Planning

Data Governance has been a hot topic for most data-driven organizations for the past five years, yet few organizations take the first step of standing up Data Governance for the fear of failure on a colossal scale.  With the appropriate stakeholders and sponsorship, small-to-medium size organizations should not fear taking the plunge of launching a Data Governance program.  Nationale Nederlanden Investment Partners (NNIP) successfully stood up Data Governance in 2011 and after four years of steady progress, NNIP is ready to progress to the next phase of Data Governance.
While a charter, roles/responsibilities and a target operating model are all integral components of a successful Data Governance program, without a sound Data Governance (and Data) strategy, organizations have difficulty measuring success and appropriately steering resources and capabilities towards objectives that contribute to the organization’s vision.  This presentation tells the story of how NNIP developed a Data Governance strategy based on the proven strategic intent model, yet produced goals and objectives that are relevant, non-academic, and easily actionable by resources both inside and outside of the Data Governance structure.   Key topics include:

  • Aligning strategic intent with the importance of a mission & vision
  • Leveraging successful programs/initiatives & cutting out deadwood
  • Structuring objectives to enable achievement of long term goals & satisfy short term metrics

Lance Cameron, Data Governance Specialist, NN Investment Partners

Lance Cameron is an experienced Data Governance/Data Management specialist with over 18 years of experience in Enterprise Data Management and 6 years of experience in Enterprise Architecture and Data Governance.  Mr. Cameron has a mastery of logical and physical data modeling in Business Intelligence and Operational environments as well as expertise in properly utilizing industry data model accelerators.  Other accomplishments include performing assessments and developing roadmaps for enterprise metadata management, reference data management and data model management.  His focus includes semantics, business information models, industry data models, metadata management, MDM, data quality, enterprise data strategies and the roadmaps to support them, data requirements gathering, logical and physical database design, enterprise data models, business analysis, data analytics, and service-oriented architecture.  More recently, Mr. Cameron has focused on demonstrating the value of and implementing Business Information Models and accompanying Data Governance Programs.

Frank Gresnigt, Data Governance Specialist, NN Investment Partners

Frank Gresnigt is an experienced Data Governance/Data Management specialist with over 10 years of experience in the financial industry.  Since his start at NN Investment Partners he has focused on improving data quality in the organization and has held various roles, including managing the static data department and implementing a market data management tool for security master publications.  Other accomplishments include the development and implementation of NNIP’s Data Governance and data quality framework.  More recently Mr. Gresnigt has focused on the establishment of the Data Governance office, the implementation of Data Governance software and the roll-out of the data quality program.

Organic Aligning Between MDG & Business Needs

Meggitt PLC is a leading international company specializing in high performance components and sub-systems for the aerospace, defence and energy markets.  Meggitt’s MDG journey started in October 2013, by deploying an SAP MDG hub for Customer, Vendor, Material and Finance master data.  However, as the Business organism evolves and the organizational landscape adapts to meet the ever-increasing demands of the marketplace, change is inevitable.  Thus, modification of the deployed Master Data Governance (MDG) solution from its original Business case may be required; what, then, should be done to ensure the primary MDG goals are kept in sync with organizational change?  This session will provide insight by discussing these topics and more:

  • Leveraging tactics to ensure strategic MDG aims are kept in alignment with Business change
  • Understanding partnerships & why internal resource continuity is so important
  • Focusing on the future by anticipating Business & market change

Bradley Smith, Group Master Data Services Manager, Meggitt

As Group Master Data Services Manager, a role at the interface of Business and IT, Bradley Smith is the key sponsor for the SAP MDG implementation in Meggitt, a UK-based global manufacturer of civil and military aerospace sub-systems with revenues of £1.6B and nearly 11,000 employees. His responsibilities include the development and deployment of Meggitt’s master data strategy, the successful rollout of MDG, evangelising the business value of quality data, and driving Meggitt’s maturity in Data Governance and MDM.  Moreover, Mr. Smith has global responsibility for the SAP ERP modules Manufacturing, Inventory & Plant Maintenance.  Prior to Meggitt, he held management roles at Ultra Electronics Controls and Global Orthopaedics.  Mr. Smith received his MSc in Technology / Business Management from Kingston University.

Building a World-Class Data Governance Organization

JLL has established a global data governance organization in less than two years. Its data governance program consists of global oversight and shared services, while the implementations in EMEA, APAC and the Americas are designed specifically for regional needs.
JLL’s approach is methodical, practical and effective. It is the foundation of JLL’s global data and insights platform, RED. RED is a revolutionary approach to uncovering insights; it brings together master data governance, knowledge management, business intelligence and advanced analytics, underpinned by cutting-edge technologies and tools to dramatically improve real estate decision making.

Ho-Chun Ho, Global Head of Data Governance and Management, JLL

Katarzyna Puchalska, EMEA Data Governance Manager, JLL

Katarzyna Puchalska has over eight years of industry experience. She is responsible for the data governance function in JLL’s EMEA region.

The New Data Quality Manager in a Large Organisation

On taking up a lead data quality role in a large Public Sector organisation, Stephen was advised that a recent study of his new team’s work had shown it to be largely ineffective in its central role of improving the quality of the organisation’s workforce data.  This session is an exploration of the situation Stephen faced, the reasons it had occurred and the actions that were taken to address the issues.  Looking back over a busy, challenging and, ultimately, rewarding year, he will talk about the progress that has been made, what has worked and what has not.  Looking forward, he will consider how improvement can be sustained and developed to ensure that data quality is of a sufficient standard to enable accurate and effective planning and reporting within a large and complex organisation.

  • Thoughts on how to reinvigorate an organisation’s data quality procedures
  • Some practical ideas for those new to a data quality role
  • Why everyone believes they do data quality
  • An exploration of prevention versus cure

Stephen Read, Head of the British Army's Personnel Data Assurance Team, British Army

Stephen Read has been the head of the British Army’s Personnel Data Assurance Team for 12 months.  Prior to taking up this post he was responsible for the development and support of recruiting and training IT systems where the availability and quality of data were central to the successful delivery of high-profile, multi-million-pound operations.  During this time, to ensure that lessons were identified and learnt from a series of major government data losses, Stephen took on responsibility for Data Protection within the Army’s recruitment and training division, where he developed and implemented flexible but effective data management processes and procedures.  Stephen has enjoyed a varied career in the Army and Civil Service, and gained a BSc (Hons) from the Open University.

14:25 - 15:10
Upstream MDM for Enterprise Customers - How Microsoft Does It 

Enterprise customers are complex systems of organizations with related accounts and individuals. In this session you will learn how Microsoft approaches the mastering of its own customers in a single store that not only facilitates the ‘single view of the customer’ but also provides a “transaction service” for customer master data, so that the myriad CRM, ERP and Customer Support systems no longer need to maintain their own customer masters. This session will be of interest to data quality experts and anybody with an interest in integrating disparate systems at the point of data entry rather than consolidating data downstream. Highlights of this session include an update of Microsoft’s 2014 presentation on “MDM for B2B Customer” along with current topics such as:

  • Developing a future-proofed CUSTOMER master data strategy across multiple lines-of-business
  • Determining what capabilities (including Governance) are required to accomplish this
  • Identifying which employees & partners have the best competencies to implement & operate

Ulrich Landbeck, Senior Business Program Manager, Microsoft

Ulrich Landbeck is a seasoned data management practitioner who started his career in the 1980s at Commerzbank in Frankfurt, pioneering analytics of B2B customer data in the hands of business users. Mr. Landbeck retained the B2B focus throughout his career and moved to consulting on data warehousing projects in transport and banking in Australia during the 1990s. Since 1998 he has worked for Microsoft covering all data management roles including data strategy, governance, meta-data, data warehousing and B2B data integration centred around the customer dimension. At Microsoft he has experienced the evolution of all four Gartner MDM styles in practice and as a result can now draw from a large repertoire of ‘lessons learned’.  Mr. Landbeck is presently involved in providing MDM for Customer capabilities for new LOB platforms to apply data management at the source of creation. He received his postgraduate degree in economics and management studies from Hochschule Ludwigshafen am Rhein, Germany.

Priti Padhy, Global Director of BI and Information Management, Microsoft

Priti Padhy is the Global Director of BI and Information Management at Microsoft Ltd. Priti was responsible for the successful design and implementation of the global data management and BI shared services at Microsoft. His 21-year career in information technology spans multiple geographies, industries and functions, including IT Strategy, Business Intelligence, Data Management, Data Governance and Program Management. Trained as a Bachelor of Engineering in Computer Science from Utkal University, India, Priti started his career with Atomic Energy of India, and moved on to work with Fujitsu Japan, Capgemini, Credit Suisse, RBS Bank UK and Microsoft Ltd.

MDM as Product Data Source for eCommerce

Clearly, getting your business case for MDM to succeed is hard.  Proving the ROI and convincing the C-suite why your business needs MDM can seem an impossible task.  Elsevier decided to resolve this by funding MDM when a new eCommerce data source was needed. This funded the MDM project but tied it very strongly to one purpose.  The Elsevier team also learned that mitigating the narrow focus and delivering an enterprise-ready MDM system requires a lot of effort.  Additionally, creating an MDM system and not a data warehouse is also a recurring challenge.  Lastly, while standing up Data Governance is often a challenge, Elsevier confirmed that strong coupling to an eCommerce system created opportunities within the project to introduce and justify the need for such governance.  This session will review best practices in how to manage the compromises between Product master data and pass-through data to enable success by discussing these topics and more:

  • Guaranteeing the MDM program gets funded by coupling it to business needs
  • Ensuring IT is driving as hard as the Business stakeholders to make it happen as partners
  • Managing “scope tension” by delivering value that serves MDM & eCommerce at the same time

James Carne, Head of Global Product Data, Elsevier

With over 25 years’ experience in scientific publishing, for the last 4 years James Carne has been head of Product Data Governance at Elsevier, the largest scientific publisher in the world. He’s passionate about getting data right and is experienced in the challenges inherent in a mature global organisation. His roles in the business have ranged from copy editing and indexing to designing online submission tools and introducing new software systems. This gives him unique insight into the company’s data legacy as well as its future data needs. His current projects include embedding the concept of the “information asset” into his business in order to ensure the value of Elsevier’s data. He received his BSc in Cell Biology and Genetics from the University of Portsmouth, UK.

Karsten Hupperetz, Application Manager, Elsevier

Karsten Hupperetz has been a Business Intelligence professional throughout his 12-year career. Having worked for international companies in retail, IT hardware and services, he has always recognised that bad data leads to misinformation, which leads to bad decision making. For more than 3 years, as part of the MDM team, his current role at Elsevier has put him at the centre of Product Information and system landscapes. Together with previous roles in sales operations, finance and strategy, this gives him the opportunity to provide the business with enterprise-quality product information. He has an MSc in International Business from Maastricht University, the Netherlands.

Setting Up & Managing a Master Data Maintenance Organisation

FrieslandCampina is the world’s largest dairy co-operative and one of the top 5 dairy companies in the world with offices in 28 countries and more than 20,000 employees.  This session will provide an update on the 2015 session regarding the FrieslandCampina master data centralisation programme journey.  The presentation will outline the steps in building and operating an MDM function within a business, but with a practical focus on operations and people management.  In particular, it will look at building a master data maintenance organisation to align Data Governance with global business process harmonisation requirements.  Delegates should be able to take away advice to help them implement their own management functions.  Topics include discussions of practical examples such as:

  • Managing the scope of operations & the ownership of it (together with what it owns)
  • Defining the Master Data Maintenance organisation structure (including key teams) & recruitment processes
  • Rationalising key operational procedures, data management, service levels, tooling & stakeholder management

Gerard Bartley, Director Global Master Data, FrieslandCampina

Gerard Bartley is a seasoned Netherlands-based Chartered Accountant and data expert with nearly two decades of extensive experience. Born and raised in Newcastle Upon Tyne, United Kingdom, his passion for all things technology began early on and it has stayed with him ever since. However, his avid interest in the data world started back in 1999 while he worked at Tesco.com before moving to Asda.com.  Currently, Gerard serves as a Director of Global Master Data at FrieslandCampina, where he successfully manages global master data, implementing data governance and raising the quality of the organisation’s data generally.  He also oversees two different data service centres on an international level – one in The Netherlands and another in Malaysia.  Additionally, Gerard is a frequent conference speaker and focuses primarily on the topics of data governance and management at his speaking engagements.

Healthcare MDM & DG - from the Cradle to Grave

In Northern Ireland the Health and Care Number programme was formed in 2003 to develop and master the population index for Northern Ireland.  This was at a time when the MDM market was in its infancy.  Over the past 13 years the programme has evolved and much has been learnt about Master Data Management from both the operational and strategic perspectives.  This session will cover the team’s experiences, learning and challenges – both the good and the bad – from the technical solution through to the customer’s perspective, how it has enabled their business, and its pivotal role in eHealth in Northern Ireland.
Delegates will learn:

  • The importance of Data Governance, or in its absence, a strong data custodian
  • The importance of being an informed customer in vendor selection
  • Critical factors for long term programme success

Gary Loughran, Programme Delivery Manager, Department of Health Northern Ireland

Gary Loughran has over thirteen years of experience in the delivery of regional, strategic eHealth programmes, from initially working on the delivery of the Health and Care Number Index through to today, where Gary is Northern Ireland’s eHealth Programme Manager for the Business Services Organisation, responsible for the strategic delivery of regional eHealth initiatives for Northern Ireland, the largest of which is the award-winning Northern Ireland Electronic Care Record.

Dermot Boyle, Delivery Manager, Sopra Steria

Dermot Boyle is the delivery manager at Sopra Steria responsible for the overall solution and the operational management of the Health and Care Index.  In his thirteen years on the project, Dermot has performed roles from writing the code through to managing the overall solution today and road-mapping its future.

Data Governance in a Non Regulated Environment 

Bringing DG into an environment with no regulatory requirements means that DG must be sold on purely financial benefit.  This session shows how William and Hassan introduced and sold the concept of DG into STC, from the application level through to the warehouse and BI.
This was done by the introduction of a Business Information Model (BIM), which is used from application integration through to being the basis of the model of the DWH, using a Business Support Systems transformation program to do so.  The session will show:

  • Benefits of creating a BIM
  • How the BIM transforms to a Repository
  • Mapping applications to the Repository
  • Capturing Business Validation Rules against BIM items
  • Using the BIM in a Data Lake
  • One BIM to rule all data, one BIM to find all data, one BIM to bring all data items together and in the DWH bind them
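As a concrete illustration of the "capturing Business Validation Rules against BIM items" bullet, a rule can be stored as data alongside the model item it governs and then executed wherever that item surfaces – in an application, the Repository or the DWH. A minimal sketch, with invented item names and rules (not STC's actual model):

```python
# Business validation rules attached to Business Information Model (BIM) items.
# Item names and rules below are invented for illustration only.

RULES = {
    "Customer.email": lambda v: isinstance(v, str) and "@" in v,
    "Customer.credit_limit": lambda v: isinstance(v, (int, float)) and v >= 0,
}

def validate(record: dict):
    """Return (item, value) pairs that fail their BIM rule."""
    failures = []
    for item, rule in RULES.items():
        field = item.split(".", 1)[1]          # "Customer.email" -> "email"
        if field in record and not rule(record[field]):
            failures.append((item, record[field]))
    return failures

assert validate({"email": "a@b.com", "credit_limit": 100}) == []
assert validate({"email": "no-at-sign"}) == [("Customer.email", "no-at-sign")]
```

Because the rules hang off BIM items rather than off any one application's schema, the same rule set can police data entry, the Repository mappings and the Data Lake feed alike.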

15:35-16:20
Next-Generation MDM using Big Data Technologies

As organisations increasingly leverage Big Data technologies for scalable data management platforms, MDM solutions are shifting from traditional technologies to hybrid solutions involving some of these emerging platforms.  How can enterprises cope with Big Data sources and their increasing volume, variety and velocity, while managing veracity?  Clearly, the convergence of MDM, Big Data and Data Governance is driving a shift towards next-generation business intelligence.  Just as MDM and Governance have evolved over the years, so have the value propositions and business drivers behind their applications.  Topics include:

  • Learning from case studies to leverage the successes of market-leading enterprises embracing Big Data technologies
  • Understanding how next-generation MDM is evolving to enable timely access to quality data to deliver the right consistent message across channels
  • Enabling new levels of just-in-time customer engagement via MDM & Big Data to drive new levels of customer acquisition, engagement & loyalty

Ashok Nayak, Managing Director, Accenture

A digital leader with more than 16 years of experience in Data Management and Analytics, Ashok Nayak has designed and implemented Information Management strategies, roadmaps and operating organizations for some of the world’s largest companies across multiple industries including retail, consumer packaged goods, auto, industrial equipment, and life sciences.  He has also directed design, architecture, delivery, and optimization of Enterprise Data Warehousing and advanced analytics solutions using the latest technologies in the data management space. A thought leader, Ashok’s articles and case studies in ‘Information Management (formerly DM Review)’ demonstrate practical approaches to current Information Management challenges.  Ashok is a certified Solutions Architect, Delivery Lead, and certified QA Director within Accenture’s Digital organization. He also leads the MDM Community of Practice.  Ashok has an MBA from the University of Texas at Arlington, a Master of Science in Industrial Engineering from the National Institute of Engineering, Bombay, and a Bachelor’s in Engineering from the National Institute of Engineering, Rourkela, India.

Marketing the Commercial Value of Data Quality

Cargill is the US’s largest private corporation (US$134B annual revenues) whose diverse global operations include grain, cotton, sugar and petroleum trading; food processing; futures brokering; and agricultural services, including animal and aqua feed as well as fertilizer production.  The company is in the midst of a multi-year global process, data and technology business transformation.  In order to realize the business value of the transformation, quality of data is of utmost importance.  Cargill therefore created a model to proactively find problems with its data and fix such issues.  Clearly, the organisation needed a process and technology that would enable the enterprise to be agile and respond at the speed of business.  Successful user adoption across the organization to fix data issues has enabled reporting and analytics to be more accurate, as well as preventing downtime at plants caused by data issues.  Topics to be presented during this session include:

  • Establishing a cost effective, practical operating model for the enterprise DQ practice
  • Marketing Global IT’s solutions, building of rules, monitoring/fixing data & the tools involved
  • Leveraging commercial examples of why or how having a DQ solution can positively impact the business

Eric Parkin, Data & BI Delivery Lead – EMEA, Global IT, Cargill

Eric Parkin is presently working as the Data & Business Intelligence (D&BI) Delivery Lead in EMEA for Cargill.  He is responsible for all D&BI projects in the region, ensuring successful execution and partnering with Cargill’s businesses.  Prior to his current role, Mr. Parkin was the Data Quality Lead in D&BI and built the approach, tools and resourcing model for Cargill’s Data Quality capabilities.  Prior to joining Cargill, he held roles in Project Management and Sales for a midsize ERP software company.  Mr. Parkin graduated from the University of Minnesota in Minneapolis.  He is originally from Minneapolis, Minnesota, and currently lives in Amsterdam, NL.

Shifting MDM into the Next Gear via Operational DQ

What happened since Michelin won the Gartner MDM Excellence Award (EMEA) in 2013? Beyond extending the MDM approach building on their fundamentals, Michelin took a further step forward focused on making data quality an operational reality. Transforming what is sometimes a threat to MDM initiatives into an opportunity, Michelin leverages its ERP program not only to make it a success, but beyond that to instill data quality management into its operations in a sustainable manner.
The ambition of this case study is to show which road was taken by Michelin to bring their MDM framework, applied to data quality, to an operational reality and achieve business value. This presentation will start with a brief reminder of the fundamentals of Michelin’s MDM initiative which led to receiving the aforementioned award, and what the latter brought. The subsequent extension and development of their MDM approach will follow, with an update on the current context around governance and leadership. The presentation will then focus on the DQ management approach which was developed around the Company’s ERP program: how MDM embarked on this fast-moving vehicle; the methodology built to accelerate DQ management; the governance set up to shift gears and instill DQ management in daily operations; and the follow-up to benefit from the positive inertia and make it sustainable. Lessons learned and key challenges will conclude this down-to-earth case study, which then opens on a Q&A session.  Key learnings from this session:

  • Embarking with MDM on the ERP journey & not being left on the roadside
  • Constructing DQ management as a sustainable operational reality
  • Identifying the potholes to avoid when driving alongside a fast ERP vehicle

Alain Dubost, Head of Master Data Management, Michelin

Alain Dubost is leading the Enterprise initiative around MDM, for all business domains including Data Governance, Data Quality and Data Referential architecture.  Positioning Michelin as the winner of the (EMEA) Gartner MDM Excellence Award 2013, Alain has one driver: value, and one key message: data means business. Alain has over twenty years of international experience in the IT field. His prior experience includes project and portfolio management, quality assurance, risk management and audit. He received his Master’s degree from Télécom ParisTech. Follow Alain @alaindor

Data Governance - Raising Structures on No Man's Land

Changes to legislation and recent, well-publicised data breaches indicate that we have a perfect storm where ensuring compliance for an organisation and the best protection against attack both require an information governance framework and an information architecture shared across business and technology units.
Do we now have the type of compelling story that data specialists often struggle to identify when trying to engage stakeholders?  Can we explain the framework required in a way that engages the business and technology units and which they both understand?  Should it now be easier to resource, fund and build the right governance structures?
The presentation will focus on the protection of personal data, setting out:

  • The difficulties in bringing together the different areas of a large business which must collaborate to achieve end-to-end data governance and compliance
  • The enterprise architecture as a framework for governance
  • Key deliverables in the current approach being developed in response to the EU GDPR and examples of what has worked in other compliance activities
  • The impact on data governance of popular organisational and technological approaches being adopted, e.g. Cloud Services, Big Data and consolidation of systems into a ‘Global’ instance

Helen Hepburn, Information Architect, BT

Helen Hepburn has been an Enterprise Data Specialist for over 15 years and her focus is now on BT’s ‘Group’ Line of Business.  In her years with BT’s Technology division, she has worked primarily on the central Enterprise Architecture team but her current role is focused on the main Enterprise Resource Planning (ERP) functions:  HR, Payroll, Finance, Supply Chain and other pan-BT functions.  She takes a leading role in underpinning Data Privacy Compliance in the IT domain and is currently working on the impact of the EU General Data Protection Regulation on her area.

Smart or Surveillance Cities

The ongoing digitalization of urban space entails new challenges in data governance. Smart Cities involve enormous amounts of data, the evaluation of which holds the key to creating sustainable and efficient structures. Data must be stored securely and legal issues must be clarified. Who owns which data and who is entitled to access it? Which data can be made accessible for the general public in the context of an open data initiative and which data must be kept under lock and key because it enables people to be identified or constitutes a security risk? To answer questions like these, new processes, tools, and mechanisms are needed to manage data and ensure its integrity. Christoph Kögler will provide compelling insights into the topic and best practices from actual Smart City projects.

Christoph Kögler, Head of Innovation, T-Systems Multimedia Solutions GmbH

Christoph Kögler heads the Innovation Department at T-Systems Multimedia Solutions GmbH, one of the leading European Internet solution providers. He has more than two decades of leadership and project management experience in the multimedia, telecommunications and media industries, primarily in the areas of business development and innovation management. Christoph is certified as a Project Management Professional® and as a trainer of SRI International’s Five Disciplines of Innovation® program. As a recognized expert in the topic of innovation management, he coaches start-ups and intrapreneurs and is a regular speaker at national and international conferences.

16:25 - 17:10
Enabling Business Areas for DQM & MDM

Allianz SE is one of the world’s largest insurance companies.  Its subsidiary Allianz Global Corporate & Specialty SE recently undertook an initiative to create Data Quality Management (DQM) and Master Data Management (MDM) processes.  This session will focus on enabling business areas for DQM and MDM by discussing topics such as:

  • Focusing on the Master Data Update Process to provide Business alignment on master data updates to ensure reliable finance reporting
  • Creating rigorous Data Quality Management to ensure Data Quality alignment between Data Consumers & Data Producers
  • Establishing sustainable Data Change Management to enable data quality assurance for projects

Rudolf Pfaffenzeller, Senior Data Quality Analyst, Allianz Global Corporate & Specialty

After several years of project management, knowledge management and consulting at Swiss Life, Allianz Group Holding IT and Allianz Versicherungs AG, Rudolf Pfaffenzeller joined Allianz Global Corporate & Specialty in 2008 to form a Business Intelligence Framework. In 2011 he focused on establishing Data Governance and Data Quality Management. Since 2013 he has also been in charge of Master Data Management.

Expert Testimony: Field Reports for 'Top 10' RDM Solutions

The impact of poor or non-existent reference data management (RDM) is profound.  Errors in reference data ripple outwards affecting quality of master data in each domain, which in turn affects quality in all dependent transactional and analytical systems.  Because reference data is used to drive key business processes and application logic, errors in reference data can have a major negative and multiplicative business impact.  More than 55% of large enterprises surveyed by the MDM Institute are planning on implementing RDM in the next 18 months.  This session will focus on the “why” and “how” of RDM by providing insight into: Why is RDM mission critical today?  How does RDM differ from (how is it similar to) MDM?  What are the top business drivers for RDM?  Where are most organizations focusing their RDM efforts?  Topics to be discussed include:

  • Understanding the pros & cons of commercial RDM solutions vs. custom-built (“Buy vs. Build”)
  • Applying a “top 10” evaluation-criteria methodology to product evaluations for both mega-vendor solutions (IBM RDM Hub, Informatica, Oracle DRM) and pure-play solutions (Ataccama, Collibra, Kalido, Orchestra, Software AG, Teradata, TopQuadrant, et al)
  • Planning for the future of RDM (dimension management for Big Data marts) & its relationship to overall MDM programs

Global Agility via MDM

Like most other disciplines, MDM is going through an evolutionary phase.  Although many companies have already implemented “MDM frameworks”, those yet to implement face many new and evolving challenges.  National Bank of Abu Dhabi (NBAD) is the largest lender in the Emirate of Abu Dhabi and has the largest market capitalization among UAE banks. Its operations span 17 countries across five continents, from the Far East to the Americas.  Consequently, NBAD’s data is quite diverse – some very localized, some globalized, culturally shaped and regulated by many external authorities.  This session will focus on the main challenges NBAD has faced since starting its MDM journey in 2015, and how they were addressed to enable agility in the bank.  Topics include:

  • Understanding the potential problems of data diversification
  • Rationalising the cultural elements of data management & the options of centralization vs. decentralization as a choice of data management
  • Managing the importance of data classification (localized, regionalized or globalized) & the impact of regulations on data management design

Mustafa Dulgerler, Senior Enterprise Architect, National Bank of Abu Dhabi

Mustafa Dulgerler

Senior Enterprise Architect, National Bank of Abu Dhabi

Mustafa Dulgerler has managed IT projects across industries ranging from intelligent transportation and healthcare to manufacturing and banking.  Mr. Dulgerler specializes in leading mid-sized teams of highly skilled engineers to solve some of the most pressing challenges in enterprise IT.  As an Enterprise Architect at National Bank of Abu Dhabi, he is responsible for defining the technology and business platform to execute strategy. He is also a widely recognized Project Management Trainer in the Middle East.  Mustafa has been invited to speak at various international events, including PMI Global Congress 2015 and the IRM UK Enterprise Architecture Congress 2015 in London, and Middle East Forum 2015 and the Big 5 Dubai Congress in Dubai.

Marketing your Information Governance initiatives through Info-man and Data-Kid: Defenders of the Customer Journey

The session will present DG professionals with a new and innovative approach to marketing DG and information control in order to win business buy-in across the organisation. Traditional methods of “selling” DG frameworks to the business through committees, data stewards, policies, procedures (glossaries) and new processes can be steeped in industry jargon and are time and resource intensive, all too often resulting in a lack of business commitment and ultimately failure.
This session will introduce DG professionals to a marketing concept involving two fictional “information superheroes”: Info-Man and Data-Kid.
INFO-MAN represents what a healthy Data Governance framework (or healthy data) means to an organisation. When Info-Man is healthy, the business is lean, dynamic, flexible, informed, acclimatised for change, and prepared for growth. Equally, the organisation can “fight threats” or risks such as bad reporting and forecasting, and data security, privacy and protection breaches.
Conversely, an unhealthy Info-Man (overweight, withdrawn and uninspired) becomes an antihero for the organisation. Unhealthy data has exactly the opposite consequences for an organisation.
DATA-KID is Info-Man’s assistant or “sidekick”. She symbolises the Data Stewardship roles that live in each business area. This character is smaller and more agile, adept at dealing with threats by using her special powers in the form of data quality IT tools – the micro-level or technical elements of data quality, definitional issues, and tooling issues.

Furkan Sharif, Information and Records Officer, Provident Financial

Furkan Sharif

Information and Records Officer, Provident Financial

Furkan Sharif is implementing a records management framework and information classification, handling, retention and destruction controls, as well as creating an End User Computing Applications (EUCs) risk identification process at Provident, Satsuma Loans and Glo.  He is introducing a culture of information and records management through policies, procedures, processes and IT systems and, most crucially, by rolling out awareness and training for all colleagues. Prior to this role, he worked for seven years as in-house Counsel (Legal Officer) at Acorn Mobility Services Ltd, working on a project to implement a data and records management framework across an international organisation. He has a Master’s in Law from SOAS (University of London), specialising in banking law. Furkan is passionate about data and information governance, taking a creative approach to translating industry jargon into everyday language.

Addressing Legal, Privacy, and Compliance Issues in Data Governance

As data emerges from its more technical background and becomes recognized as a valuable business resource, it is natural that it will be subject to additional pressures of a more commercial kind.  Amongst these pressures, legal, privacy, and compliance concerns are rapidly gaining ground.  This presentation examines the nature of these pressures and how they may be successfully addressed by Data Governance, particularly in the context of key Master Data entities such as individual and corporate customers.   Control of data sourcing, both from data vendors and from open sources, is discussed.  Understanding the contractual implications of acquired data, and finding concrete ways to implement compliance for these implications, is described, along with the dangers of atypical contractual clauses.   How to deal with laws and regulations, particularly in a global context, is also examined.
Attendees will learn:

  • What the scope of legal, privacy, and compliance concerns are today for Data Governance
  • The necessity of partnering successfully with other areas of the business such as legal, vendor management, PMO, and internal audit
  • Tools for implementing successful legal, privacy, and compliance requirements
  • The roles of technology in support of legal, privacy, and compliance requirements

Thursday 19 May: Post-Conference Workshops
09:00 - 16:30
Successful Implementation of a Master Data Management Programme

This workshop focuses on the key elements of an MDM programme that are needed for overall success.  It gives practical recommendations while at the same time providing a conceptual understanding of what is involved in them.  Both governance and management are covered, and emphasis is placed on how MDM fits into a larger business strategy and architectural setting.   MDM programmes are rapidly evolving as new data possibilities emerge and enterprises demand more from MDM than they have previously.  These emerging challenges are addressed in detail.

  • What is Master Data, how does it differ from other classes of data
    • What the practical implications are for governing and managing Master Data.
    • How do you control scope on your MDM program
    • Dealing with business justification, and managing expectations
    • How strategies like customer-centricity require MDM
    • How new MDM-driven business models are emerging, and their implications
  • Architectures for MDM
    • The traditional MDM architectures (hubs) and their pros and cons
    • The separation of Master Data creation and distribution
    • Deciding the scope of Master Data, particularly static versus profile data
    • What MDM vendors can do for you and what they cannot
    • The need to put MDM into a bigger architectural picture to achieve business results
  • How to work with the business to be successful in MDM
    • Examples of how different Master Data entities require different overall approaches, and the implications for an MDM program.
    • How to deal with the Operations business community and meet their needs
    • How to deal with the Analytical business community and meet their needs
    • Getting governance needed for MDM into the business
  • Dealing with data integration, change data capture, and data quality successfully
    • What is the right approach to data integration?
    • How to implement continuous production data quality monitoring
    • Inferring history, versus capturing historical events, and how to store this
    • The role of business rules in driving MDM
  • Mastering Master Data Semantics
    • Dealing with the different “types” of Customer, Financial Instrument, etc.
    • Capturing knowledge of the Master Data
    • The vital role of reference data in MDM
    • The roles of generic data models and specific data abstraction layers
  • Emerging Areas in MDM
    • Dealing with legal, privacy, and compliance issues of Master Data
    • Sourcing Master Data from outside the enterprise
    • Social Media fraud
    • The role of Big Data

Getting to the Next Maturity Level with Information Governance: Delivering Accuracy and Trust

We have evolved from the age of automation to the information age. Proper information management and insight have become a linchpin and a catalyst for the execution of your business strategies. Information can support or define your business model. Having the data in your organisation is not enough: the true value comes from your ability to turn data into operational information and insights that allow you to create business value and make strategic and tactical decisions. Aligning your information requirements with strategic business objectives is critical.

  • Linking your business strategy to information flows
    • Architecting the business semantics
  • Information Enablement: establishing the information capabilities
    • Capabilities required to support your information strategy:
    • Persistency: Column-Based Storage, Appliances, In-Memory Computing, NoSQL, Hadoop
    • Positioning the information management patterns: virtualisation, Extract-Transform-Load, Enterprise Application Integration, Web Services, Enterprise Service Bus, Change Data Capture
    • Managing the information life cycle: ILM platforms
  • Managing Accuracy and Trust
    • Delivering quality and security
  • Getting the business buy-in

Organising The Data Lake – Information Governance In A Big Data World

For many companies, data preparation and integration now happens almost everywhere, using traditional ETL tools, data wrangling tools on Hadoop, self-service BI tools and custom code. In addition, new data sources are proliferating rapidly. The result is that the cost of data integration is rising rapidly, silos are emerging and the complexity of managing and governing data is getting out of control. Many therefore propose creating a ‘data lake’, but with thousands of files on premises and in the cloud the data lake is turning into a swamp. This one-day workshop examines this problem and proposes a new approach to organise, govern, process and provision data in a distributed data reservoir. It shows how data can be governed across Hadoop and non-Hadoop storage.

  • The increasing complexity of distributed data
  • Requirements for managing and governing data in a data lake
  • Introducing the data reservoir and data refinery
  • Controlling governance using classification and metadata in an information catalog
  • Governance aware runtimes
  • Roles, classifications, zones and services to manage, govern and prepare data
  • Using Apache Atlas to integrate metadata
  • Using publish and subscribe model to provision trusted data-as-a-service
  • Getting started

Applying Information Quality Principles to Regulatory Change – Getting Ready for the GDPR and Beyond

Significant regulatory change has arrived that affects every organisation processing personal data. This is, however, just one of a range of regulatory drivers for Information Quality. By applying sound quality management principles, practices, and approaches to metrics across the life cycle of information, organisations can leverage the stick of the GDPR to dig up the carrots of business value and a reduced cost of non-quality in their data. This session explores how the requirements of the GDPR will affect quality standards for information design and data process implementation, and how the practices and tools your organisation currently uses to measure data quality will transfer to the new need. Examples will be drawn from other data-quality-related regulatory and ethical breaches in 2015 to illustrate how fundamental principles are applied.

  • Understand why Information Product Specification will be critical in a GDPR-compliance environment
  • Understand why your current approaches to modelling customer data are no longer fit for purpose
  • Learn how simple quality metrics can be used to help you drive your Data Protection risk mitigation strategy (and evidence its effectiveness)
  • Find out what Critical-to-Quality metrics you should be considering for your data protection program, and how your Information Quality team can help drive this.
  • Learn about a holistic Ethical framework for positioning Regulatory, Quality, and Governance drivers in your organisation

Daragh O Brien, Castlebridge  

Daragh O Brien

Castlebridge  

Recently rated the 24th most influential person in Information Security worldwide on Twitter (http://www.onalytica.com/blog/posts/data-security-top-100-influencers-and-brands/ ), Daragh O Brien, FICS, is a leading consultant, educator, and author in the fields of Information Privacy, Governance, Ethics, and Quality. After over a decade in a leading telco, Daragh now works with clients in a range of sectors on a range of Information Management challenges.  Daragh is a Fellow of the Irish Computer Society and a Privacy Officer for DAMA-l. He teaches Data Privacy Law and Practice at the Law Society of Ireland. Castlebridge is a commercial partner of the Adapt Centre in Trinity College Dublin and collaborates with the Insight Centre for Digital Analytics, Europe’s largest Analytics research group. Follow Daragh on Twitter @cbridgeinfo.


Fees

  • 4 Days: £1,845 + VAT (£369) = £2,214
  • 3 Days: £1,495 + VAT (£299) = £1,794
  • 2 Days: £1,145 + VAT (£229) = £1,374
  • 1 Day: £695 + VAT (£139) = £834
Group Booking Discounts
  • 2-3 Delegates: 10% discount
  • 4-5 Delegates: 20% discount
  • 6+ Delegates: 25% discount
UK Delegates: Expenses of travel, accommodation and subsistence incurred whilst attending this IRM UK conference will be fully tax deductible by the employer company if attendance is undertaken to maintain professional skills of the employee attending.
Non-UK Delegates: Please check with your local tax authorities
Cancellation Policy: Cancellations must be received in writing at least two weeks before the commencement of the conference and will be subject to a 10% administration fee. It is regretted that cancellations received within two weeks of the conference date will be liable for the full conference fee. Substitutions can be made at any time.
Cancellation Liability: In the unlikely event of cancellation of the conference for any reason, IRM UK’s liability is limited to the return of the registration fee only. IRM UK will not reimburse delegates for any travel or hotel cancellation fees or penalties. It may be necessary, for reasons beyond the control of IRM UK, to change the content, timings, speakers, date and venue of the conference.

Venue

  • 22 Portman Square
  • London W1H 7BG
  • UK

Platinum Sponsors

Silver Sponsors

Standard Sponsors

Supported By

DAMA International

DAMA International is a not-for-profit, vendor-independent association of technical and business professionals dedicated to advancing the concepts and practices of data resource management and enterprise information. The primary purpose of DAMA International is to promote the understanding, development, and practice of managing data and information to support business strategies.   As Data Management becomes more relevant to the business, DAMA is keeping pace with new products and services such as the 2nd edition of the DAMA Data Dictionary, the DAMA BOK (Body of Knowledge) and several new certification exams.  We are participating on the boards of many academic and standards bodies and sharing our knowledge with other organizations.
DAMA International is pleased to announce that a new chapter is forming in Turkey which will join the 8 other European chapters as part of DAMA International.   DAMA International and its affiliated chapters have grown year after year with chapters operating in Australia, China, India, North America, South America, Japan, and South Africa and DAMA is facilitating the formation of new chapters in many other countries.
As a DAMA member you receive the benefits of your local or global chapter’s activities and all the benefits of DAMA International’s products and services. You can network with other professionals to share ideas, trends, problems, and solutions. You receive a discount at DAMA International conferences and seminars, and on associated vendors’ products and services. To learn more about DAMA International, local chapters, membership, achievement awards, conferences and training events, subscriptions to DM Review and other publications, discounts, job listings, education and certification, please visit the DAMA International web page at www.dama.org.  Both the DAMA UK chapter and DAMA International will have a meeting during the conference.  We invite interested parties to join this vital and growing organization.  More information can be found at www.dama.org or you can email me at president@dama.org.

DAMA UK

The drive for the future is to successfully focus on providing quality support to core members whilst guaranteeing sufficient financial income to ensure sustained activity.   The four areas which DAMA UK recommends addressing over the next two years are:
Academic – to survey UK organisations to understand their Data Management skill set needs and then induce academic institutions to supply them.
Data Quality (DQ) – to benchmark data quality standards in the UK and encourage development of business awareness of the importance of DQ and help develop DG metrics.
Government regulations versus data – to increase awareness of the legal implications of data management, assist organisations in reducing their legal liabilities and support ETA (and others) lobby for “data clever” legislation.
Data Standards – to survey requirements then work with other organisations (eg BCS) to develop effective data standards.

DGPO

The Data Governance Professionals Organization (DGPO) is an international non-profit, vendor-neutral association of business, IT and data professionals dedicated to advancing the discipline of data governance.   The DGPO provides a forum that fosters discussion and networking for members and seeks to encourage, develop and advance the skills of members working in the data governance discipline.

ECCMA

Formed in 1999, the Electronic Commerce Code Management Association (ECCMA) has brought together thousands of experts from around the world, providing a means of working together in a fair, open and extremely fast internet environment to build and maintain global, open-standard dictionaries used to unambiguously label information without losing meaning. ECCMA works to increase the quality and lower the cost of descriptions by developing International Standards.   ECCMA is the original developer of the UNSPSC, the project leader for ISO 22745 (open technical dictionaries and their application to the exchange of characteristic data) and ISO 8000 (information and data quality), as well as the administrator of the US TAG to ISO TC 184 (Automation systems and integration), TC 184 SC4 (Industrial data) and TC 184 SC5 (Interoperability, integration, and architectures for enterprise systems and automation applications), and the international secretariat for ISO TC 184/SC5. For more information, please visit www.eccma.org.

BCS Data Management Specialist Group (DMSG)

The BCS Data Management Specialist Group (DMSG) helps Data Management professionals support organisations to achieve their objectives through improved awareness, management, and responsible exploitation of data.
We run several events each year whose focus areas include:
•    The benefits of managing data as an organisational asset
•    Skills for exploitation of data
•    Data governance as a ‘Business As Usual’ activity
•    Compliance with legislation, particularly that relating to data protection, data security and ethical usage of data
Our audience is anyone with an interest in the benefits to be gained from data. This includes: Chief Data Officers (CDO); Senior Information Risk Officer (SIRO); data managers/stewards; data governance officers; data protection/security advisors; data scientists; and business/data/database analysts.

Association of Enterprise Architects   

The Association of Enterprise Architects (AEA) is the definitive professional organization for Enterprise Architects. Its goals are to increase job opportunities for all of its members and increase their market value by advancing professional excellence, and to raise the status of the profession as a whole.

EDM COUNCIL

About the EDM Council
The EDM Council is a neutral business forum founded by the financial industry to elevate the practice of data management as a business and operational priority. The prime directive is to ensure that all consumers (business and regulatory)  have trust and confidence that data is precisely what is expected without the need for manual recalculation or multiple data transformations. There are four programs of the Council:
•    Data Content Standards (FIBO): the standards-based infrastructure needed for operational management (identification, semantic language of the contract, classification).  We own the industry ontology for financial instruments and entity relationships and make it available as an open source standard
•     Data Management Best Practices (DCAM): the science and discipline of data management from a practical perspective (data management maturity, data quality, benchmarking).
•    Data Implications of Regulation: translating the legislative objectives of transparency, financial stability, compressed clearing and cross-asset market surveillance into regulatory objectives and practical reporting requirements.
•    Business Network: global meeting ground, CDO Forum and mechanism for sustainable business relationships
There are 135 corporate members of the Council (http://www.edmcouncil.org/councilmembers). We are governed by a board of 24 (http://www.edmcouncil.org/board). For more information visit www.edmcouncil.org.

Belgian Association of Data Quality   

DQA is the Belgian Association of Data Quality professionals. Our goal is to bring together people interested in the DQ subject to share experience and knowledge.

Media Partners

TDAN.com

The Data Administration Newsletter, LLC — http://www.tdan.com — is an award-winning electronic publication that focuses on the various disciplines of data management. TDAN.com will celebrate its 19th anniversary in July 2016 and presently attracts tens of thousands of visitors a month. The newsletter is published by Robert S. Seiner of KIK Consulting & Educational Services (http://www.KIKconsulting.com), a well-known data management specialist who focuses on Non-Invasive Data Governance™, Data Stewardship and Metadata Management program development.

IMNews360.com

IMNews360.com is an online news aggregator dedicated to the Information Management Industry. We collect and organize news and insights from hundreds of sources and thousands of articles, press releases, blogs and educational content published each week. In doing so, we make it easy for business and technology professionals to navigate the clutter and stay abreast of breaking news, understand trends, research topics, follow companies and authors that influence businesses and careers.

Via Nova Architectura

A number of thought leaders in the area of business and IT architecture have set up a digital magazine on architecture: Via Nova Architectura. Although it started as an initiative within the Netherlands, the magazine aims to reach all those interested in the area of architecture, wherever they live. Via Nova Architectura aims to provide an accessible platform for the architecture community and is meant to be the primary source of information for architects in the field. The scope of Via Nova Architectura is “digital” architecture in the broadest sense of the word: business architecture, solution architecture, software architecture, infrastructure architecture or any other architecture an enterprise may develop to realize its business strategy.

Technology Evaluation Centers

Technology Evaluation Centers (TEC) helps organizations choose the best enterprise software solutions for their unique needs—quickly and cost effectively. With detailed information on over 1,000 solutions in the world’s largest vendor database, TEC delivers a broad range of evaluation and selection resources to help ensure successful software selection projects. As impartial software evaluators since 1993, TEC’s expert team of analysts and selection professionals are involved in thousands of software selection projects every year. The TEC newsletter goes out to 920,000 subscribers and is available in 4 languages.  Visit TEC: www.technologyevaluation.com.  
Subscribe to the TEC Newsletter: http://www.technologyevaluation.com/newsletter-subscribe/

IT-LATINO.NET

IT-latino.net is the leading online Hispanic IT media network. With more than 120,000 registered users, we have become an important online IT business forum, organizing daily webinars and conferences on different technology issues. We regularly inform a strong IT community on both sides of the Atlantic: Spain and Latin America.

TechWeek Europe

TechWeekEurope UK is the authoritative UK source for news, features and reviews of business technology. Its aim is to help IT decision makers enhance their business with technology.   The site provides insight on topics including mobility, security, cloud computing, public sector and sustainable IT.   TechWeekEurope UK is published by NetMediaEurope, a leading B2B IT publisher owning a portfolio of more than 40 sites across Europe.

IAIDQ

IAIDQ trading as IQ International
IQ International (IAIDQ’s new trading name) is the professional association for those interested in improving business effectiveness through quality data and information.    IQ International is a leading source of credible and unbiased best practices on how to improve Information Quality to deliver great business results.   IQ International delivers:
•    IQ International Webinars
•    IQ International Journal
•    IQ Excellence Award
•    IQCP Certification
Join IQ International today … visit iaidq.org

Big Data Group

Big Data Group Brief
This is a vibrant community dedicated to promoting Big Data & Visualization software, best practices and the innovations enterprises need to get maximum value from massive amounts of data. With more than 130,000 members, it is the largest professional group of Big Data experts. Go beyond the big data hype.
About Big Data LinkedIn Group
A premier community for both existing expert professionals and companies researching the convergence of big data analytics and discovery, Hadoop, data warehousing, cloud, unified data architectures, digital marketing, visualization and business intelligence.
Who should be a member of this group?
•    Hadoop developers, data scientists, business analysts, statisticians and hackers
•    Business leaders and marketers who leverage data to compete and win
•    Enterprise architects, IT, data warehousing, and business intelligence professionals
•    CIOs, CMOs & CDOs of large enterprises
The group currently has more than 130,000 members and is growing exponentially.
We hope to bring together stakeholder communities across the industry, enterprise, academic, and government sectors, representing all of those with an interest in Big Data & Visualization techniques, technologies, and applications. The group needs your input to meet its goals, so please join us for discussion, expert comments and learnings, and contribute your ideas and insights.