Agenda Highlights

Plenary Keynote – The New Literacy: The Skills and Insights You Need in the Information Economy
Donald Farmer, Principal, TreeHive Strategy

Enterprise Data Keynote: Big Data for Audience Personalisation at the BBC: Experiences in Wrangling Lots of Data, Evolving Technology and Changing Regulation
Janani Dumbleton, Head of Data & Leyla Khalili, Executive Product Manager, BBC

Enterprise Data Keynote: Are We the Baddies? The Ethical Wakeup Call for Information Professionals and Data Provocateurs in the IoT Age
Daragh O Brien, Leading Consultant, Educator and Author, Castlebridge

Enterprise Wide Data Quality Programme
Lars Slagboom, Head of Data Management, ABN AMRO

Don’t Become a Data Quality Slave
Cristina de Salas, Data Quality Expert, Zurich Insurance

An Insight to the Introduction of Gamification within an Enterprise DQ Solution
Dan Griffiths, Lead Data Analyst, BAE Systems

Open for Business: a Collaborative Approach to Developing Data Standards in the UK Environment Agency
Becky Russell, National Lead for Data Standards, Environment Agency
Nigel Turner, Principal Information Management Consultant EMEA, Global Data Strategy

Propagating Data Principles
Andrew Newman, Data Policy Manager & Lisa Allen, Head of Data Governance, Department for Environment Food & Rural Affairs

Merging Enterprise Data and Measuring Its Value
Suzanne Coumbaros, Head of Data Governance, The Co-operative Bank

How to Create Massive IMPACT and be an Effective Zoo Keeper
Nigel Risner, Motivational & Inspirational Speaker


Fees

  • 4 Days: £1,945 + VAT (£389) = £2,334
  • 3 Days: £1,595 + VAT (£319) = £1,914
  • 2 Days: £1,245 + VAT (£249) = £1,494
  • 1 Day: £795 + VAT (£159) = £954
Group Booking Discounts

  • 2-3 delegates: 10% discount
  • 4-5 delegates: 20% discount
  • 6+ delegates: 25% discount

Venue

  • Radisson Blu Portman Hotel
  • 22 Portman Square
  • London W1H 7BG
  • UK

Agenda

Monday, 20 November 2017: Pre-Conference Workshops
10 Analytics & Data Science Fundamentals Anyone Should Know

Jasper de Vries

Play time is over: Data Scientists need to leave their data labs and start delivering real added value. The techniques and models have been around for decades, and recent technologies have lowered the barriers to using them. But then what? How do you make sure they really add value to your company and deliver the transformational power everyone expects?

Whether you fit the description above or are just thinking about getting started with Data Science, this highly interactive workshop offers insight into the critical elements of value-adding Data Science. Derived from client experiences in finance, healthcare, automotive, human resources, leisure and retail, these lessons learned are applicable across all sectors and will help you get where you need to be.

Key takeaways:

  • What Data Science can do for you
  • Insight into crucial preconditions for effective use of Data Science
  • Best practices of Data Science

Technology for Big Data and Fast Data Explained

Rick van der Lans, R20/Consultancy

With the introduction of big data and, later on, fast data, a tsunami of new technologies for data storage, processing, and transportation has arrived. Hadoop, Spark, Kafka, NoSQL, MapReduce, Hive and SQL-on-Hadoop are just a few of the countless technologies that have become available for developing big data systems. And with streaming data and the Internet of Things, fast data has attracted the attention of many organizations as well. It's great to have so many technologies available, but which ones do you pick? Amid this flood of new technologies, it is becoming harder and harder for organizations to select the right tools. Which technologies are relevant? Are they mature? What are their use cases? These are all valid but difficult questions to answer. This tutorial gives a clear, extensive, and critical overview of the key new technologies for developing big data and fast data systems. Technologies are explained, market overviews are presented, strengths and weaknesses are discussed, and guidelines and best practices are given.

  • New analytical needs, including data science, investigative analytics, and streaming analytics
  • Differences between semi-structured, poly-structured, multi-structured, and unstructured data
  • The Hadoop stack: HDFS, MapReduce, Hive, Kafka, Spark, HBase, YARN, ZooKeeper, Pig, HCatalog, and so on
  • Big SQL Solutions: SQL-on-Hadoop, NewSQL, analytical SQL Database Servers, and Streaming SQL Databases
  • Technologies for streaming data: Apache Kafka, Apache ActiveMQ, Amazon Kinesis, Kestrel, RabbitMQ, and ZeroMQ

Blockchain Fundamentals

Anders Brownworth, Chief Evangelist, Circle

Blockchain technology has garnered much hype recently, while proven use cases remain few. Much of that is because the technology is not well understood. Anders Brownworth of Circle, a person-to-person payments company that leverages blockchain technology, will present a half-day session designed to get you familiar with the technology and help you sort the useful bits from the hype. Designed for a non-technical audience, the session starts from a clear conceptual model and builds up to some of the more recent advancements, such as complex smart contracts and state channels. After this session, you will have a firm grasp of the core concepts and be able to identify viable uses for blockchain technology.

Information Management Fundamentals

Chris Bradley, Data Management Advisors

This workshop provides an introduction to the disciplines across the complete Information Management spectrum. It also provides a solid foundation for students considering an industry professional certification such as IRMS, ICCP CDP or DAMA CDMP.

Learning Objectives:

This workshop is intended to provide you with the knowledge, methods and techniques required to analyse, mature and implement information management solutions within your organisation. Areas covered include:

  • Data Governance
  • Data Quality Management
  • Master and Reference Data Management
  • Business Intelligence & Data Warehousing
  • The essential role of Data modelling
  • Data Lifecycle Management
  • Metadata Management
  • Risk, Security & Regulatory compliance
  • Data Operations
  • Content & Records management
  • Data Integration & Interoperability

At the end of the course, delegates will have gained the following:

  • Learn about the need for and application of Information Management disciplines for different categories of challenges
  • Explore concepts such as lifecycle management, data modelling and data virtualisation and why they are important
  • Understand the critical roles of Master Data Management and Data Governance and how to effectively apply them
  • Learn about the different Data Management disciplines and their interrelations.

A comprehensive 3-day version of this course is offered by IRM UK.

Big Data Governance

Jan Henderyckx, Inpuls

Big data governance is not just about making sure that you use your Hadoop cluster efficiently or that you work on the relevant use cases. With the democratisation of big data capabilities and wider access to data, questions arise about the regulatory and ethical compliance of data usage. Locking all data down is not the answer, as we would lose too much value.

This presentation focuses on the steps you need to take to get sustainable and compliant value out of your big data.

What delegates will learn from attending the session:

  • What is the distinction between Information and Big Data Governance
  • Catering for the dynamics of data on-boarding and usage flows
  • Towards policy-based classification and access
  • Use case governance vs Critical Data Elements
  • Impact of the Big Data Governance requirements on the architecture

Getting Started With Data Quality – A Primer

Jon Evans & Nic Jefferis, Equillian

Today, more than ever, the quality of data, underpinned by a robust approach to Data Quality Management, is critical to the success of every organisation. Unfortunately, it is a topic that remains impenetrable to many, obscured by unfamiliar jargon and too much emphasis on technology.

In this half-day workshop, Equillian’s Jon Evans and Nic Jefferis seek to redress the balance, by taking the audience on a journey from first principles right through to advice on establishing a Data Quality Programme. Along the way, both beginners and those already familiar with the topic will benefit from a business-focused approach, based on industry best practice coupled with many years of experience helping organisations tackle their Data Quality challenges.

The session will be structured around 4 key topics:

  • Why should I care about data quality?
  • Monitoring data quality
  • Improving data quality
  • Developing a DQ Programme
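
The "monitoring" topic above can be made concrete with a minimal completeness metric, one of the most common data quality dimensions. This is an illustrative sketch only; the records, field names and logic below are invented and are not taken from the workshop materials.

```python
# Illustrative "completeness" check: the share of records in which each
# field is present and non-empty. All sample data here is invented.
records = [
    {"name": "Ada", "email": "ada@example.com", "phone": None},
    {"name": "Ben", "email": "", "phone": "555-0101"},
    {"name": "Cara", "email": "cara@example.com", "phone": "555-0102"},
]

def completeness(rows, field):
    """Fraction of rows where `field` is present and non-empty."""
    populated = sum(1 for r in rows if r.get(field) not in (None, ""))
    return populated / len(rows)

for field in ("name", "email", "phone"):
    print(f"{field}: {completeness(records, field):.0%}")
```

Tracking a metric like this over time, per field and per source system, is the essence of data quality monitoring; the improvement work then targets the fields with the lowest scores.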

What Does MDM Have to Do With Innovation

Lars Nordwall, Neo Technology

“Your Master Data Is a Graph”. Whether it's an organisation master or a product master involving complex hierarchies and relationships, Master Data invariably takes the form of a graph or network, and is best modelled, stored and queried using a native graph technology. Whether you are using a packaged MDM solution or building a custom MDM solution, a Graph Database can help you get a higher ROI by reducing complexity, increasing agility and improving the speed and efficiency of your Master Data initiative.

Join this session to learn how a Graph Database fits into your MDM solution and how market leading organisations like Pitney Bowes, Cisco and UBS are gaining significant competitive advantage by adopting different MDM implementation styles to incorporate graph technology into their solution portfolio.

Topics to be discussed include:

  • Understanding how a graph database complements MDM – from personalised product & service recommendations to websites adding social capabilities
  • Identifying the benefits of different MDM implementation styles – ranging from using Graph Database as the primary repository for your Master Data to using a Graph Database to build a metadata registry
  • Learning from industry-proven best practices in adopting Graph Databases

09:30 - 12:45
10 Analytics & Data Science Fundamentals Anyone Should Know

Jasper de Vries

Jasper de Vries has been working with data analytics for over 15 years. He is co-founder of a startup that uses AI to improve the way people learn, owner of an independent consultancy practice, and a regular writer on technology and data-driven topics. During his career he has been actively involved in every phase of data analytics and has worked with an extensive variety of data types and technologies. He has participated in numerous data-related projects and set up data analytics teams across industries in The Netherlands. His clients range from C-level executives shaping data strategy to operational teams streamlining existing data initiatives. In all of his endeavours, he applies a holistic approach to innovating with data and technology, connecting the dots between business goals, technological possibilities, cultural prerequisites and efficient ways of organising. He presents his work regularly at seminars in The Netherlands and abroad.

Technology for Big Data and Fast Data Explained

Rick van der Lans, Independent Analyst, Consultant, Author and Lecturer, R20/Consultancy

Rick van der Lans is an independent analyst, consultant, author, and lecturer specializing in data warehousing, Business Intelligence, and database technology. He is Managing Director of R20/Consultancy. Mr. van der Lans is an internationally acclaimed lecturer. For many years now, he has been the chairman of the annual European Business Intelligence and Data Warehouse conference. Rick writes for various websites including the well-known B-eye-Network.com, and he is the author of many whitepapers. His popular books, including “Introduction to SQL” and “The SQL Guide to Oracle”, have been translated into many languages and have sold over 100,000 copies. Recently, Rick published a new book entitled “Data Virtualization for Business Intelligence Systems”.

Blockchain Fundamentals

Anders Brownworth, Chief Evangelist, Circle Internet Financial

Anders Brownworth is Chief Evangelist at Circle, where he champions the Spark smart contract platform. He taught the first blockchain class at MIT and previously helped create and launch Republic Wireless, a WiFi/cellular hybrid smartphone service. Circle aims to make moving money as easy as sharing text, photos and videos on the Internet; its team draws on experience in internet technology and global finance, including helping to found and build the Allaire Corporation and Brightcove, and roles at Square, JP Morgan Chase, Goldman Sachs, Adobe and Amazon, to make online payments easier to use, safer and more convenient than ever.

Information Management Fundamentals

Chris Bradley, Information Strategist, Data Management Advisors Ltd

Christopher Bradley has spent 35 years in the forefront of the Information Management field, working for leading organisations in Information Management Strategy, Data Governance, Data Quality, Information Assurance, Master Data Management, Metadata Management, Data Warehouse and Business Intelligence.   Chris is an independent Information Strategist & recognised thought leader.  Recently he has delivered a comprehensive appraisal of Information Management practices at an Oil & Gas super major, Data Governance strategy for a Global Pharma, and Information Management training for Finance & Utilities companies.  Chris guides Global organizations on Information Strategy, Data Governance, Information Management best practice and how organisations can genuinely manage Information as a critical corporate asset.  Frequently he is engaged to evangelise the Information Management and Data Governance message to Executive management, introduce data governance and new business processes for Information Management and to deliver training and mentoring.  Chris is Director of the E&P standards committee “DMBoard”, an officer of DAMA International, an author of the DMBoK 2.0, a member of the Meta Data Professionals Organisation (MPO) and a holder at “master” level and examiner for the DAMA CDMP professional certification. Chris is an acknowledged thought leader in Data Governance, author of several papers and books, and an expert judge on the annual Data Governance best practice awards. Follow Christopher on Twitter @inforacer.

Big Data Governance

Jan Henderyckx, Managing Partner, Inpuls     

Jan Henderyckx is a highly rated consultant, speaker and author who has been active in the field of Information Management and Relational Database Management since 1986. He has presented, moderated and taught workshops at many international conferences and User Group meetings worldwide. Jan’s experience, combined with information architecture and management expertise, has enabled him to help many organisations optimise the business value of their information assets. He has a vision for innovation, the ability to translate vision into strategy, and a verifiable track record in diverse industries including non-profit, retail, financial, sales, energy and public entities. He has contributed to more streamlined and higher-yielding operations for some of the leading businesses through a combination of creativity, technical skill, initiative and strong leadership. He is a Director of the Belgium and Luxembourg chapter of DAMA (Data Management Association) and runs the Belgian Information Governance Council. He has published articles in many leading industry journals and has been elected to the IDUG Speakers Hall of Fame, based upon numerous Best Speaker awards. Jan is Chair of the Presidents Council of DAMA International.

Getting Started With Data Quality – A Primer

Jon Evans, Information Strategist & Founder, Equillian

Jon Evans is an Information Strategist, self-confessed data quality geek and the founder of Equillian, an independent UK consultancy practice specialising in Enterprise Information Management. For the past two decades, he has been helping organisations harness their information and transform it into a strategic business asset. His wealth of experience covers all the key disciplines that help define, manage and exploit enterprise information, from putting in place effective Data Governance to delivering insight through Business Intelligence. In the field of Data Quality, he contributes expert knowledge and thought-leadership, drawing upon a track record of successfully delivering DQ initiatives to a wide range of organisations, including a key role in advancing the statistical analysis of health data. As a regular speaker and panellist at industry events, Jon enjoys bridging the gap between the business and IT domains, bringing fresh understanding and clarity – the same approach he adopts as a respected Information Management coach and mentor. Follow Jon on Twitter: @MadAboutData.

Nic Jefferis, Information Consultant, Equillian

Nic Jefferis is one of Equillian’s Information Consultants and a highly respected Data Governance expert. With a career spanning three decades, Nic has the knowledge and experience that can only be gained by tackling IM challenges from a variety of different perspectives – both as a consultant and as a client. His initial experience in Data Warehousing and BI implementations inspired him to cross over to the sharp end of Information Management, with high profile roles in Data Governance and Data Quality Management. As a Certified Data Management Professional, Nic now specialises in helping organisations understand their strategic information requirements, develop future roadmaps and adopt IM best practice. His satisfaction comes from seeing information finally recognised as a valuable business asset. Follow Nic on Twitter: @ndjefferis.

What Does MDM Have to Do With Innovation

Lars Nordwall, COO, Neo Technology

Lars Nordwall was born and raised in Stockholm and has lived in Silicon Valley since 1998. Mr. Nordwall joined Neo Technology in early 2011 as COO and has transformed the company from an early-stage, European-based start-up into one of the rising stars in Silicon Valley. His track record includes: (1) a turnaround and transformation of Pentaho from a flat-lined, struggling Business Intelligence software provider into a leader in the Big Data Analytics space; the company was acquired by Hitachi for $600M; (2) SugarCRM, where he joined as the founding VP of WW Sales and helped establish the firm as one of the global SaaS CRM leaders; and (3) Cambridge Technology Partners (CTP), where he built his career foundation and experienced rapid growth from 400 to 6,500 employees, the firm's transformation into one of the hottest consultancy firms in the world with a market cap of over $5B, and its acquisition by Novell. Mr. Nordwall has an M.Sc. in Mechanical Engineering from the Royal Institute of Technology in Stockholm and a B.Sc. in Business Administration from the Stockholm University School of Business, and he has completed an Executive Education Program at Harvard Business School in Boston.

Agile Methods and Data Warehousing: How to Deliver Faster

Kent Graziano, Snowflake Computing

Most people will agree that data warehousing and business intelligence projects take too long to deliver tangible results. Often by the time a solution is in place, the business needs have changed. With all the talk about Agile development methods like SCRUM and Extreme Programming, the question arises as to how these approaches can be used to deliver data warehouse and business intelligence projects faster. This presentation will look at the 12 principles behind the Agile Manifesto and see how they might be applied in the context of a data warehouse project. The goal is to determine a method or methods to get a more rapid (2-4 weeks) delivery of portions of an enterprise data warehouse architecture.

Real world examples with metrics will be discussed.

  • What are the original 12 principles of Agile
  • How can they be applied to DW/BI projects
  • Real world examples of successful application of the principles

Digital Data Strategy

Pieter den Hamer, Strategist, Alliander

Digital business is fueled by data. Managing data is therefore a critical enabler of many business optimisation and innovation initiatives. However, most organisations are actually not very good at data management. Typically, poor data quality and interoperability seriously hamper data-driven business opportunities, with high costs and long time-to-market for new solutions. In this workshop, we will address the urgent need to improve your data management capabilities, including data governance and architecture, aligned with digital business strategy and focused on the data that matters most. Given the current maturity of your capabilities, how do you set priorities and design a roadmap for future development? How can we make data management less like a corporate dinosaur and more like an agile enabler of business change? And what standards, frameworks, roles and technologies are there to support your digital data strategy?

  • Align your digital data strategy with digital market & technology trends, digital business strategy and innovation
  • Determine your organisation’s current and expected “data footprint” to focus your data management strategy and identify future needs
  • Assess the maturity of your current data management capabilities and generate a roadmap to bridge the gaps, working towards agile data management to enable your digital business

Data Modelling in a Big Data Environment

Chris Bradley, Data Management Advisors

This half day workshop will explore Data Modelling for Big Data together with the techniques and uses for data models beyond Relational DBMS development.

In the modern era, the volume of data we deal with has grown significantly. As the volume, variety, velocity and veracity of data keep growing, the types of data generated by applications become richer than before. As a result, traditional relational databases are challenged to capture, store, search, share, analyse, and visualize data. Many companies attempt to manage big data challenges using a NoSQL (“Not only SQL”) database and may employ a distributed computing system such as Hadoop. NoSQL databases are typically key-value stores that are non-relational, distributed, horizontally scalable, and schema-free.

Many organisations ask, “do we still need data modelling today?” Traditional data modelling focuses on resolving the complexity of relationships among schema-enabled data. However, these considerations do not apply to non-relational, schema-less databases. As a result, old ways of data modelling no longer apply.
This workshop presents data modelling approaches that apply not only to Relational databases, but also to Big Data, NoSQL, XML, and other formats. In addition, it explores uses of data models beyond simply the development of databases.
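
The "schema-free key-value" property described above can be illustrated in a few lines of Python. This is a sketch with invented keys and fields, using a plain dict to stand in for a NoSQL store; it is not taken from the workshop materials.

```python
# A dict standing in for a schema-free key-value store (illustrative only;
# record keys and field names are invented). Records under the same key
# space need not share a schema -- the NoSQL property described above.
store = {}
store["cust:1"] = {"name": "Ada", "email": "ada@example.com"}
store["cust:2"] = {"name": "Ben", "loyalty_tier": "gold"}  # different fields

# Reads are by key; any notion of "schema" lives in application code,
# which is exactly why the workshop argues data models still matter.
missing = store["cust:2"].get("email")
print(missing)  # this record has no such field
```

Because the store enforces no structure, every consumer of the data must know which fields to expect; a data model is what captures that shared expectation when the database itself no longer does.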

What you will learn:

At the end of the workshop, delegates will have gained the following:

  • Learn about the need for and application of Data Models in Big Data and NoSQL environments
  • See the areas where Data modelling adds value to Data Management activities beyond just Relational Database design
  • Understand the critical role of Data models in other Data Management disciplines, particularly Master Data Management and Data Governance
  • Learn best practices for developing Data models for Big Data and NoSQL environments
  • Understand how to create data models that can be easily read by humans

Preparation for the Certified Data Management Professional (CDMP) Exams

Mark Humphries, DAMA & Katherine O’Keefe, Castlebridge

This half-day workshop gives an overview of the process, tips and techniques of successful CDMP exam taking. In this interactive and informative session, you will learn:

  • What is the CDMP certification process
  • The DAMA-DMBOK & CDMP data exams alignment
  • What topics comprise each exam’s body of knowledge
  • Concepts and terms used in the CDMP exams
  • A Self-assessment of your knowledge and skill through taking the sample exams

Attendees of the half-day workshop will receive refresher tuition covering several of the most common topics seen in recent examinations. Note, however, that this is not a substitute for experience and education.

Attendees will be able to take the exams, according to their conference attendance schedule, on 21 November. Workshop attendees will take the certification exams on a “pay if you pass” basis (the pass mark is 60% for Associate and 70% for Practitioner). If you take and pass all three certification exams, you will leave ED&BIA 2017 with a CDMP credential.

Note exam fees are payable directly to DAMA.

VERY IMPORTANT: You will need to bring your own computer that can connect to the internet. The exam is taken online, and you will need to register a minimum of 2 hours before the exam at www.dama.org. Your test (and live) exam results and performance profile can be viewed immediately.

EXAM

  • 3 × 90-minute examination sessions (in the afternoon)
  • Each exam is 90 minutes in length and has 110 multiple-choice questions
  • Your score is known immediately after the exam is taken
  • Exam fees for ED&BI attendees: a fee is payable for each CDMP exam, with a ‘pay only if you pass’ agreement for attendees of this workshop

Passing at Associate level requires 60% or higher in one exam. Practitioner level is attained by passing 3 exams at 70% or greater.
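
The pass rules above can be sketched as a small function. This is illustrative only, with an invented function name; DAMA's published rules are authoritative.

```python
# Sketch of the certification rules stated above (illustrative only):
# Associate = at least one exam at 60%+, Practitioner = three exams at 70%+.
def cdmp_level(scores):
    """Return the credential implied by a list of exam percentages."""
    passes_at_70 = [s for s in scores if s >= 70]
    if len(passes_at_70) >= 3:
        return "Practitioner"
    if any(s >= 60 for s in scores):
        return "Associate"
    return None

print(cdmp_level([72, 85, 70]))  # three exams at 70%+ -> Practitioner
print(cdmp_level([65]))          # one exam at 60%+ -> Associate
```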

Data Storytelling

Dirk Morgenroth, Senior Consultant Business Intelligence, Atos

Analyzing data is only one side of today's data professional's coin; communicating that data in a compelling way is the other. Typically, there is a particular need to share data verbally with management. Storytelling techniques are an effective means of delivering your information, and there are many compelling reasons to use them. This workshop sets data in the context of storytelling and introduces its fundamentals.

  • Get sensitized to the power and potential business contexts of narratives
  • Understand the importance of context and audience
  • Learn important Do’s and Don’ts for visualizing your data

Grooming Data Stewards

Sue Geuens, DAMA International

Data Stewardship is very tough to sell. You are expecting individuals inside your organization to take on an accountability that they may not have full control over, and to do so on top of their day job.

This presentation provides a how-to case study on what you need to do, first to sell the concept and second to groom colleagues to become great data stewards and sustain data governance in your organization:

  • Selling the Concept
  • Defining the Good vs. Not Good for a Data Steward
  • Training the Data Steward to become a Change Agent
  • Branding the Data Steward for instant recognition

Mastering your Master- and Reference Data

Jan Henderyckx, Inpuls

In the age of data science, AI and big data volumes, one might wonder why we need to care about the little data. Wouldn't it be enough to define the critical data points and set up data-sharing agreements? Practice has shown, however, that properly managing your master and reference data is the cornerstone of being a data-enabled company. The word “data” can be a bit deceptive, as it will also be necessary to address organisational and technological aspects, and obviously we need to consider data migration and cleansing. This presentation focuses on getting control over your master and reference data. The practical approaches and examples draw on Jan's years of hands-on experience in MDM/RDM projects.

What delegates will learn from attending the session:

  • How to build a business case
  • Understand the various architectural MDM-styles
  • Blueprints for common use cases
  • Integrating an MDM-system in your existing application and data landscape
  • Setting up MDM/RDM organisational models
  • How to select a tool that fits your needs
  • Properly planning for the launch of the solution

14:00 - 17:15
Agile Methods and Data Warehousing: How to Deliver Faster

Kent Graziano, Senior Technical Evangelist, Snowflake Computing

Kent Graziano is a Senior Technical Evangelist with Snowflake Computing and the author of The Data Warrior blog (http://kentgraziano.com).  He is a Data Vault Master and certified Data Vault 2.0 Practitioner (CDVP2), Oracle ACE Director, member of the OakTable Network, former member of the Boulder BI Brain Trust (#BBBT), expert data modeler and architect with over 30 years of experience, including over two decades doing data warehousing and business intelligence, in multiple industries, with multiple architectures. Kent is an internationally recognized expert in Data Modeling and Agile Data Warehousing. He has developed and led many successful software and data warehouse implementation teams, including multiple agile DW/BI teams. He has written numerous articles, authored three Kindle books (including A Check List for Doing Data Model Design Reviews and An Introduction to Agile Data Engineering), co-authored four other books on Data Modeling and Data Vault, and has given hundreds of presentations, nationally and internationally. In 2014, he was voted one of the best presenters at OUGF14 in Helsinki, Finland.  Follow Kent @kentgraziano

Digital Data Strategy

Pieter den Hamer, Strategist, Alliander

Pieter den Hamer has 20+ years of experience in the area of big data, analytics and AI, working as consultant, manager and researcher. Currently, he is a strategist at Alliander, focusing his expertise mostly on themes like the smart grid and the energy transition. He is also associated with the Copernicus Institute at Utrecht University. Having his academic background in artificial intelligence and data science, he has always worked on the interface between science and business: as managing partner at the knowledge center CIBIT, and as research director at DNV-GL Research & Innovation, amongst others. He has gained his international experience with a myriad of organizations, ranging from the public and industrial sector to the energy sector. Pieter is a frequent presenter at conferences and has authored a diversity of publications, including a book on business intelligence and various papers and articles.

Data Modelling in a Big Data Environment
Preparation for the Certified Data Management Professional (CDMP) Exams

Mark Humphries, Principal Business Consultant, Civica

Mark is a highly respected Information Management professional with over twenty years’ experience of improving business performance. He is an advocate for keeping IM as simple and business focused as possible, and promotes the principle that IM professionals should first understand how a business delivers value and then match the appropriate IM techniques to support value delivery.  Mark is a DAMA UK board member and he holds the CDMP certification at Mastery Level. In 2010 he was a finalist in the Dutch/Belgian Data Quality Award based on the quantified benefits of the Data Quality program that he led as Data Manager at an energy supplier.

Katherine O'Keefe, Lead Data Governance & Privacy Consultant, Castlebridge

Katherine O'Keefe

Lead Data Governance & Privacy Consultant

Castlebridge

Dr Katherine O’Keefe is a lead Information Governance and Privacy consultant, trainer, and Chief Ethicist with Castlebridge. She has worked with clients in a variety of sectors, from telco to healthcare to charities, on consulting and training engagements. She has represented Castlebridge at Data Governance conferences internationally and regularly writes about Data Governance topics on the Castlebridge website and in other industry publications. Katherine lectures on Data Ethics and Privacy at the Law Society of Ireland and is a member of an international Data Ethics roundtable. She is also a leading expert on the fairy tales of Oscar Wilde and leads Castlebridge’s Gamification team, exploring ways to use games, storytelling, and non-traditional training activities to help change how people in organisations think about information. She serves as VP for Professional Development with DAMA International and is the lead project manager for the CDMP exams. Follow Katherine @okeefekat.

Data Storytelling

Dirk Morgenroth, Senior Consultant Business Intelligence, Atos

Dirk Morgenroth

Senior Consultant Business Intelligence

Atos

Dirk Morgenroth is a Senior Consultant for business intelligence at Atos, Vienna. As a professional with eleven years of hands-on experience in business intelligence, he covers the complete business intelligence life cycle, and supports clients in defining their business intelligence strategy, business intelligence related organizational settings and in designing, implementing and maintaining their individual data warehouse solutions. In addition to these core consulting activities, he trains externally in Germany, Austria and Switzerland.

Grooming Data Stewards

Sue Geuens, President, DAMA International

Sue Geuens

President

DAMA International

Sue is a proverbial “dataholic” and it shows. Being able to assist organisations both large and small, not just to understand their data and why it is so imperative to manage it, but to work out their best route to achieving this, is the reason Sue gets up and “goes to work” each day. Whether she needs to dive into the weeds with the data analysts and quality auditors to look at their data, or talk to the C-level team about their data challenges and successes, she enjoys each and every part of this growing professional industry. Every time she “talks data” she believes that she opens the eyes of at least one person to exactly why she is so passionate about what she does – that person takes this knowledge back to their organisation and does exactly the same, sparking the passion in the next person. DAMA International President since 2014 (on the board from 2011); DAMA SA President 2009 to 2015; CDMP in 2010 – the first in Africa; MDQM in 2011. Follow Sue on Twitter @suegeuens.

Mastering your Master- and Reference Data
Tuesday, 21 November 2017: Conference Day 1 & Exhibits
09:00 - 09:10
Joint Conference Chair Introductions
Plenary Keynote - The New Literacy: The Skills and Insights You Need in the Information Economy

Donald Farmer, Principal, TreeHive Strategy

For all our complaints, we are fortunate to live in highly literate, educated societies. When it comes to data literacy, though, it’s more like the Middle Ages—a time when only the elite few had access to the necessary skills and materials to learn to read.

However, as business users in the enterprise increasingly help themselves to powerful devices, apps, and data services, new forms of literacy are developing. We’re learning to navigate complex visualizations, to understand the tentative language of probability and prediction, and to browse ever greater volumes of data.

How can we help these newly data literate users in their work? What tools do they need? How can we provision data for their use? Moreover, how do we cope with the explosion of insight and the ensuing debates that literacy always brings to societies and organizations?

In this session, Donald Farmer will explore the uses of data literacy in the modern organization, as well as the accompanying potential, pitfalls, and unexpected implications.

• The impact of improved data literacy on individual and organizational achievements.
• The importance of understanding and communicating ambiguity.
• The new role of the analyst in an organization with self-learning technologies and artificial intelligence.

09:10 - 10:00
Plenary Keynote - The New Literacy: The Skills and Insights You Need in the Information Economy
Business Intelligence & Analytics
Business Intelligence & Analytics
Enterprise Data
Enterprise Data
To Cloud BI in Three Months Flat

Ian Turfrey, CIO, British Medical Association

Can you swap an ageing Oracle system for integrated Microsoft BI in the cloud in a matter of weeks?

Find out how the British Medical Association (BMA) did just that, and how data now informs business strategy at every level. The BMA, the trade union and professional body for doctors in the UK, used to rely on incomplete and disparate data. It needed a better way to see trends, spot opportunities, and most importantly, serve doctors.

That’s when Ian Turfrey, Chief Information Officer, joined the BMA with a plan to scrap the old system and move to the cloud. In this session, he will talk about strategy and the challenges of an extensive BI overhaul in a short time frame.

You’ll get practical tips on:

  • Taking a cloud and SaaS-first approach to BI
  • Gaining C-suite support
  • Finding quick wins
  • Implementing two-speed IT (maintaining while innovating)
  • Getting the right skills capability

Future Blockchain

Anders Brownworth, Chief Evangelist, Circle

With all the hype surrounding blockchain technology these days, you might have missed what has actually happened in the ecosystem, why it matters, and what to look for in the future. Anders Brownworth of Circle, a person-to-person payments company which leverages blockchain technology, will present on what blockchain technology is, the state of the industry, and where the technology is headed.

Using Sensor Data From Trucks to Improve Profitability

Stijn Roelens, Enterprise BI Architect, A Leading Automotive Company

Hear why a leading multinational company that manufactures trucks, buses and construction equipment has consolidated their disparate data warehouses into a single Enterprise Data Warehouse, and learn how truck sensor data is helping them to stand out from the competition and improve profitability.

  • The Challenge: This company needed to combine sensor data with operational data to deliver compelling offers, provide greater customer service and improve profitability.
  • The Solution: An integrated Enterprise Data Warehouse with a single global modelling standard. Automated and managed by WhereScape.
  • The Benefit: The company gets value from their sensor data, shortening times to market and delivering BI solutions faster than ever before.

Integrity Behind Intelligence – How JLL Drives Business Results Through a DQ Program for Our Global Clients

HoChun Ho, Global Head, Data Governance and Management, JLL

This presentation is a case study of JLL’s data quality program for global clients. At JLL, they manage the data used by corporate clients across many different industries and countries. The data must be accurate, thorough, consistent, reliable and widely understood. They have implemented a common data quality engine and a global rule catalogue that deliver configurable data quality scorecards and remediation panels. The engine leverages and monitors board-approved critical data elements, standard reference data and related mappings, and the data standards supported by global data stewardship programs. The program is completed by a client on-boarding model, standard operating models and a data help desk, in an award-winning implementation of a commercial data governance tool. This Data Quality Program enables JLL to offer data governance as a commodity service to its clients. It is a business user’s entry point to the full scope of enterprise data governance, with deep industry and domain knowledge to jump-start clients’ data governance programs. The program goes beyond the technical implementation of a shared platform; it includes end-to-end data governance best practices. The speakers will also demo the tool to explain its features and challenges, and its connections with data standards, master data and data stewardship.
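As a rough illustration of the rule-catalogue-plus-scorecard idea the case study describes (all rule names and data here are invented, not JLL's actual rules), a data quality engine can be sketched as a catalogue of named checks applied to every record, with pass rates rolled up into a scorecard:

```python
# Hypothetical miniature of a DQ rule catalogue driving a scorecard.
# Rule names, fields and data are illustrative assumptions, not JLL's.

rule_catalogue = {
    "country_not_null": lambda rec: bool(rec.get("country")),
    "area_positive":    lambda rec: rec.get("area_sqm", 0) > 0,
}

def scorecard(records):
    """Percentage of records passing each rule - the basis of a DQ scorecard."""
    return {name: round(100 * sum(rule(r) for r in records) / len(records))
            for name, rule in rule_catalogue.items()}

sites = [
    {"country": "UK", "area_sqm": 1200},
    {"country": "",   "area_sqm": 800},   # fails the country rule
]
print(scorecard(sites))  # {'country_not_null': 50, 'area_positive': 100}
```

Keeping the rules in a shared catalogue, rather than hard-coding them per report, is what makes the scorecards configurable per client.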

10:05 - 10:50
To Cloud BI in Three Months Flat
Future Blockchain
Using Sensor Data From Trucks to Improve Profitability

Stijn Roelens, Enterprise Business Intelligence Architect, A Leading Automotive Company

Stijn Roelens

Enterprise Business Intelligence Architect

A Leading Automotive Company

Stijn has over 13 years’ experience in BI. His roles have included analyst, developer, PM and enterprise architect. His main focus has been on how automation and agility can be effectively introduced in the data domain. Currently he is focusing on how an organisation can be changed into an information-driven company that understands and sees value in information.

Integrity Behind Intelligence – How JLL Drives Business Results Through a DQ Program for Our Global Clients

HoChun Ho, Global Head, Data Governance and Management, JLL

HoChun Ho

Global Head, Data Governance and Management

JLL

In his current responsibilities, HoChun guides and oversees global data governance and management for the Corporate Solutions business line at JLL. The scope includes the business oversight on master data management, data quality, meta-data management and the overall data road map. Mr. Ho and his team defined and established the Data Governance function, its organizational structures, global and regional resources, processes and operating procedures, tools, and communication protocols. He also mentors the organization on overall enterprise data strategy to leverage industry best practices and leading edge technology in enterprise information management, business intelligence, master data management and big data.

Anja Drescher, Change Management, Global Data Governance and Management, JLL

Anja Drescher

Change Management, Global Data Governance and Management

JLL

Anja Drescher is responsible for change management and communications for global data governance and management for the Corporate Solutions business line at JLL. She has 23 years of customer service experience, including 15 years in commercial real estate, specializing in Facility Management and data & analytics functions. Her broad background in over a dozen industries provides the foundation of her current focus on data standards industry research, knowledge management and training platform development, and managing visualizations for data, documentation and industry publications.

10:50 - 11:20
Networking Break & Exhibits
IoD: Internet of Data

Pieter den Hamer, Strategist, Alliander

Will the ‘Internet of Data’ (IoD) allow us to reap the benefits of the ‘Internet of Things’ (IoT)? Or: how can the IoT help our business to survive in an ever more complex and networked reality, while the IoD keeps us from drowning in more and more data, more types of data, and ever more real-time data? If integrating the data from your known internal data sources on a daily basis is difficult as it is, how will you manage when the IoT floods your organization with much more data from numerous and varying external sources, which you will continuously want to integrate, enrich, analyze and translate into meaningful decisions and actions? The IoD – a pragmatic and AI-driven reincarnation of the semantic web – promises to behave better. Concepts and techniques such as linked (open) data, ontologies, OWL, RDF and SPARQL can help to link data in an ‘agile’ way, focused on domain-specific applications rather than a global scale. Nevertheless, the Tower of Babel continues to interfere – fortunately, AI and (deep) machine learning seem increasingly capable of overcoming differences in semantics and language. We can observe the Internet of Data in real-world initiatives like smart cities and smart societies. And who knows, might the ‘Intranet of Data’ spell the end of the trusted enterprise data warehouse?

  • How the ‘Internet of Things’ leads to the ‘Internet of Data’: the growing need for agile data sharing and integration
  • The problem of semantics: why IoD may fail (as well) and how AI may come to the rescue
  • From enterprise data warehouses to enterprise linked data networks: the Intranet of Data
  • State-of-the-art technology & tools
  • IoD smart grid, smart city & smart society examples
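To make the linked-data idea behind the session concrete: RDF represents data as subject-predicate-object triples, and SPARQL queries them by pattern matching. The sketch below mimics that in plain Python (the identifiers and smart-meter data are invented for illustration; a real system would use an RDF store and SPARQL):

```python
# Toy model of linked data: facts as subject-predicate-object triples,
# queried SPARQL-style by leaving positions unbound. All names are invented.

triples = {
    ("meter:42",    "rdf:type",      "iot:SmartMeter"),
    ("meter:42",    "iot:locatedIn", "city:Arnhem"),
    ("meter:42",    "iot:reports",   "obs:1001"),
    ("obs:1001",    "iot:powerKW",   "3.2"),
    ("city:Arnhem", "rdf:type",      "geo:City"),
}

def match(s=None, p=None, o=None):
    """Return triples matching a pattern; None plays the role of a SPARQL variable."""
    return [(ts, tp, to) for ts, tp, to in triples
            if s in (None, ts) and p in (None, tp) and o in (None, to)]

# Analogous to: SELECT ?s WHERE { ?s rdf:type iot:SmartMeter }
meters = [s for s, _, _ in match(p="rdf:type", o="iot:SmartMeter")]
print(meters)  # ['meter:42']
```

Because every fact is a self-describing triple, a new external source can be merged by simply adding its triples, which is the “agile data sharing and integration” the abstract points to.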

Faster, Smarter Data Delivery at Generali Insurance

Zoltán Csonka, Head of Data Warehouse, Generali Insurance

We all know that today business users want better time to market, and to achieve this goal they need faster and smarter decisions based on “faster” data. It is an even bigger challenge to do this in a cost-effective way. In 2016 Generali Hungary introduced its new data warehouse using the Data Vault modelling methodology and a custom ETL tool which supports agile development and delivers better ROI than traditional 3NF models and hand coding. Data Vault modelling, code generation and the agile approach allow faster delivery, but the “shiny” technology is not enough. To give better services and improve customer satisfaction it is necessary to change the way IT and business work together. To fulfil the requirements of “Self-Service BI” and avoid chaos, new rules and new kinds of services must be established.

This session concentrates on showing real life solutions on how we get more value out of our Data Warehouse.

  • The way our IT works vs. business requirements
  • How to resolve conflicts and get better results on budget.
  • How to get more working product in a short time: sandboxes, prototype-based and multi-speed development
  • Explaining the technical challenges, resource issues, methodologies and processes which support the agile approach
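For readers unfamiliar with Data Vault, the core of the modelling approach the abstract mentions is to split data into hubs (business keys), links (relationships) and satellites (descriptive attributes with load timestamps), which is what makes automation and code generation feasible. A minimal sketch, with invented insurance-flavoured names that are not Generali's actual model:

```python
# Illustrative Data Vault loading pattern: hubs are insert-only business keys,
# satellites append full history rather than overwriting. Names are hypothetical.

import hashlib
from datetime import datetime, timezone

def hash_key(*business_key):
    # Deterministic surrogate key derived from the business key, a common DV convention.
    return hashlib.md5("|".join(business_key).encode()).hexdigest()

hub_policy = {}   # hash_key -> business key (insert-only)
sat_policy = []   # (hash_key, load_ts, attributes) - append-only history

def load_policy(policy_no, attributes):
    hk = hash_key(policy_no)
    hub_policy.setdefault(hk, policy_no)  # key is recorded once, never changed
    sat_policy.append((hk, datetime.now(timezone.utc), dict(attributes)))

load_policy("P-001", {"product": "home", "premium": 120})
load_policy("P-001", {"product": "home", "premium": 135})  # change is tracked, not overwritten

hk = hash_key("P-001")
history = [attrs for k, _, attrs in sat_policy if k == hk]
print(len(hub_policy), len(history))  # 1 2
```

Because every hub, link and satellite follows the same mechanical pattern, the ETL for each can be generated from metadata instead of hand-coded, which is where the faster delivery comes from.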

Implementing Master Data Governance: A Common Sense Approach 

Paul Lucas, Head of Master Data Governance, Yara

Yara started its Master Data Governance Strategy in 2014. It uses the Corporate Data Quality Management Framework. Yara has used many components that are readily available within the organization. Over the last three years, it has implemented MDG solutions for Finance and Customer; Supplier is in progress, and Material has started.

Delegates will understand:

  • How to execute the MDG strategy based on components that they have in their own organizations
  • The importance of data metrics and continuous quality improvement
  • Why organizational change management and training are vital to success

Session & Speaker TBC

TBC

11:20 - 12:05
IoD: Internet of Data
Faster, Smarter Data Delivery at Generali Insurance

Zoltán Csonka, Head of Data Warehouse, Generali Insurance

Zoltán Csonka

Head of Data Warehouse

Generali Insurance

Zoltán Csonka is Head of Data Warehouse at Generali Zrt. He has solid experience as a data warehouse architect and business intelligence professional, and is currently responsible for DW development and operational processes, architecture and strategy. He is also a passionate data professional focusing on fast data delivery, Data Vault modelling, team management and development automation. As a data professional he frequently speaks at DW/BI conferences and is happy to share experiences and ideas about real-life issues and solutions.

Implementing Master Data Governance: A Common Sense Approach 

Paul Lucas, Head of Master Data Governance, Yara International

Paul Lucas

Head of Master Data Governance

Yara International

Paul Lucas has managed the MDG strategy in Yara for the last three years. He has 35 years’ experience in IT in global chemical organizations, including 20 years in Yara. Paul has been responsible for many different domains within IT and, in his latest role, is applying his broad knowledge of all aspects of IT to Master Data Governance in a pragmatic manner.

Session & Speaker TBC
Harnessing Big Data Analytics with Data Democracy

Jason Perkins, Head of Business Insight & Analytical Architecture & Nick Reid, Decision Support Chief Architect, British Telecom

Join this session to hear how BT is empowering the business through data democracy: enabling self-service data analytics to exploit information assets via a multi-tenanted big data repository, and providing insight to drive informed decision making in an increasingly connected world. In this session Jason and Nick will take you on a journey through real-world examples of using analytics to better understand business challenges and predict outcomes:

  • User Stories – Customer Experience, Operational Excellence, Self Service analytics & Innovation Analytics
  • Data Strategy & the Logical Data Warehouse (Hadoop vs. Analytics Databases)
  • Data Management – the foundation for insight, from 2,500 structured datasets to billions of non-relational records.
  • Breadth of Information usage – over 15,000 users across discovery, data science, reporting & visualisation.

Explainable AI – the Most Crucial Part of Our Future Technology

Jasper de Vries, Lead Consultant Analytics, Kadenza

When it comes to the impact data has on our lives, we haven’t seen anything yet. Advancements in Artificial Intelligence mean we won’t be the only ones taking actions in the near future. The development of self-driving cars is one of the most prominent cases where insight into the reasons behind a left or right turn matters to us all. But the use of Deep Neural Networks (DNNs) will stretch far beyond this isolated example and will be put to use within almost every service and product. Understanding DNNs is increasingly complex while, at the same time, European legislation demands we are able to explain their inner workings – quite apart from possible ethical dilemmas. That’s a challenging situation to deal with. Key takeaways:

  • Why Explainable AI matters to you
  • The state of Explainable AI and Deep Neural Networks
  • How you should prepare yourself and your organisation

Lessons Learned from the IRM UK CDO Executive Forums

Jan Henderyckx, Inpuls

Four times over the last couple of years, a selected group of CDOs have met up for the IRM UK CDO Executive Forum. The objective of the forum is to define best practices, exchange ideas on the role of the CDO in an organisation, and increase the CDO’s impact on business outcomes. As proper information usage is not limited to the C-level, the group decided to share its findings and recommendations with a broader audience. Hence this session, which is a compilation of “the best of” from previous meetings. What delegates will learn from attending the session:

  • How information can be used as a business enabler
  • How to position your CDO in your organisation
  • What capabilities are required to be successful

The next IRM UK CDO Executive Forum will take place on 22 November during this conference – if you would like to apply to attend please e-mail Jeremy.hall@irmuk.co.uk

Big Data & GDPR: An Incompatible Marriage or an Opportunity?

Christoph Balduck, Managing Partner, DataTrustAssociates

By now, most companies are implementing GDPR – a new and broad EU data privacy and data protection regulation with a large number of topics to be taken into account.

Two years ago, big data and GDPR were quite incompatible – especially when using big data storage and processing capabilities like Hadoop or NoSQL. Not only did we often ignore the principle of proportionality, there was also little insight into the captured data sets – their category, risk, etc. – and in many cases the purpose of processing and storing big data was only determined after insights were gained, instead of a specific purpose being determined upfront.

Furthermore, complying with the rights of data subjects, such as the right of access, right to data portability, and right to rectification and restriction of processing, was difficult to apply to data in a so-called data lake.

Over the last two years, though, we have seen a tremendous acceleration of data and information management tooling to support the implementation of GDPR. Big data tooling was no different, and can now be seen as an opportunity to become GDPR compliant. Besides big data tooling, other traditional capabilities such as MDM, DQ and metadata management are essential parts of a GDPR reference architecture.

Delegates will learn:

  • The risks associated with big data in a GDPR context
  • How big data can be used as an opportunity and accelerator for implementing GDPR
  • All components of a GDPR reference architecture, including both traditional and new data & information management components
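One commonly discussed pattern for reconciling data-subject rights with an append-only data lake (a general technique, not necessarily the speaker's recommendation) is pseudonymisation: store personal attributes under a pseudonym, keep the pseudonym-to-identity mapping in a small governed store, and implement access or erasure by acting on the mapping alone. A sketch with invented names:

```python
# Hypothetical sketch: data-subject rights via a separate identity vault.
# Erasing the mapping orphans the lake records, leaving them anonymous.

import uuid

identity_vault = {}   # subject_id -> pseudonym (small, mutable, governed)
data_lake = []        # append-only records keyed only by pseudonym

def ingest(subject_id, record):
    pseud = identity_vault.setdefault(subject_id, uuid.uuid4().hex)
    data_lake.append({"pseud": pseud, **record})

def right_to_access(subject_id):
    pseud = identity_vault.get(subject_id)
    return [r for r in data_lake if r["pseud"] == pseud]

def right_to_erasure(subject_id):
    # No rewrite of the lake is needed; only the mapping is removed.
    identity_vault.pop(subject_id, None)

ingest("alice@example.com", {"page": "/home"})
ingest("alice@example.com", {"page": "/offers"})
print(len(right_to_access("alice@example.com")))  # 2
right_to_erasure("alice@example.com")
print(len(right_to_access("alice@example.com")))  # 0
```

Whether orphaned records count as sufficiently anonymised is a legal judgement, not a purely technical one, which is exactly the kind of trade-off the session addresses.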

12:10 - 12:55
Harnessing Big Data Analytics with Data Democracy

Jason Perkins, Head of Business Insight & Analytical Architecture, British Telecom

Nick Reid, Decision Support Chief Architect, British Telecom

Nick Reid

Decision Support Chief Architect

British Telecom

Nick is the Chief Architect for the Decision Support Platform, with 15 years’ experience in the BI and data warehousing field. He has led major transformation programmes that are rated in the top 1% in the world.

Explainable AI – the Most Crucial Part of Our Future Technology
Lessons Learned from the IRM UK CDO Executive Forums
Big Data & GDPR: An Incompatible Marriage or an Opportunity?

Christoph Balduck, Managing Partner, DataTrustAssociates

Christoph Balduck

Managing Partner

DataTrustAssociates

Christoph is an EU-certified Data Protection Officer (EIPA) with a broad data and information management background. Christoph advises, coaches and executes both in data privacy and data protection roles (DPO, …) and in data and information management roles (e.g. CDO, data/information architect, senior data management advisor, …). By bringing these two worlds together he can design and accelerate solutions for operational and analytics challenges, overcoming compliance constraints by turning them into a competitive advantage for your business. Furthermore, Christoph can advise, coach and execute at strategic, tactical and operational levels, which allows for a faster translation of business objectives into actual projects, operating models, architectures, policies and value. By capturing, understanding and resolving operational issues, Christoph can translate them into tactical and strategic actions to realize change and value at all levels of the company. Besides this, Christoph has regularly taught the practical implementation of GDPR at a number of academic institutes since 2015, and is continuously looking into solutions for automating GDPR capabilities while maximizing the use of technology. Follow Christoph @balduck_c

12:55 - 14:25
Lunch, Exhibits and Perspective Sessions
Case Study: At-Scale Real-Time Analysis Using Big Data Fabric

Are you adopting big data for performing high-velocity real-time analytics? Companies are investigating rapid analysis for business users using self-service BI on very large volumes of data. However, such initiatives are not yielding much value because these big data systems have become siloed from the rest of the enterprise systems, which hold critical business operational data. Big Data Fabric is a modern data architecture that combines data virtualization, data prep, and lineage capabilities to seamlessly integrate at scale these huge, siloed volumes of structured and unstructured data with other enterprise data assets.

This presentation will demonstrate:

  • Using proven customer case studies, the value of using big data fabric as a logical data lake for big data analytics in big data and IoT initiatives.
  • The architectural stack of the big data fabric, the functions of each component, and the value delivered by each of them.
  • Performance benchmarks across the big data fabric technologies and at-scale optimization techniques for the lowest possible latency.

13:25 - 13:50
Case Study: At-Scale Real-Time Analysis Using Big Data Fabric

Ravi Shankar, Chief Marketing Officer, Denodo

Ravi Shankar

Chief Marketing Officer

Denodo

Ravi Shankar is the Chief Marketing Officer at Denodo. He is responsible for Denodo’s global marketing efforts, including product marketing, demand generation, field marketing, communications, social marketing, customer advocacy, partner marketing, branding, and solutions marketing. Ravi brings to his role more than 25 years of proven marketing leadership and product management, business development, and software development expertise within both large and emerging enterprise software leaders such as Oracle, Informatica, and Siperian. Ravi holds an MBA from the Haas School of Business at the University of California, Berkeley, and an MS and Honors BS degree in Computer Science. Ravi is a published author and a frequent speaker on data management and governance.

Perspective Session
13:55 - 14:20
Perspective Session
Perspective Session
Business Intelligence & Analytics Keynote: Six Stubborn Myths on Data

Rick van der Lans, R20/Consultancy

There was a time when every concept we used in computing was clearly, and sometimes even formally, defined. Not anymore. Many concepts are introduced during marketing campaigns and are barely defined at all; we just have to understand intuitively what they mean. Irrespective of their poor definitions, some of these concepts become popular: they become trending topics and hypes, they are oversold, and they lead to myths. And with that come misunderstandings. The effect is that organisations invest in products and technologies that don’t deliver, resulting in project failures. Some of these myths, related to data, big data, and data lakes, are critically discussed during this keynote. Are data lakes really what data scientists ask for? Is big data really unstructured? And does open source software for data processing really make an organisation vendor independent?

  • The 3 V’s of big data are flawed
  • Data can’t be a game changer
  • Data processing is not our core business
  • Open source software makes you vendor independent
  • Big data + analytics = disruptive
  • Data lakes are good for data scientists

Enterprise Data Keynote: Big Data for Audience Personalisation at the BBC: Experiences in Wrangling Lots of Data, Evolving Technology and Changing Regulation

Janani Dumbleton, Head of Data & Leyla Khalili, Executive Product Manager, BBC

The BBC Audience Platform has been transforming how the BBC engages with its audience by delivering more of the content they love. The platform is responsible for delivering more relevant content to our audience, by understanding what they do across the entire BBC online and broadcast estate. We do this by delivering a cloud-based API suite of personalisation and participation services that fuel our big data solutions. Our data collection and analysis is transforming the BBC into a data-driven organisation, and with the frameworks for this delivered, our mission moving forward is to continue driving this transformation throughout the BBC.

Innovating practical and tangible experiences using the variety of data we collect has been a very interesting and satisfying journey for the people involved, with each challenge and success proving a great learning experience on big data, underlying technologies, and making exciting and innovative products. The Audience Platform Data team has the responsibility of balancing the privacy promise, regulatory compliance and relevant data governance and information security obligations while continuing to innovate our data use and the technology stack that underpins it. This session covers how the BBC audience platform teams have embedded principles of governance when developing data products and services, while supporting data-driven insights and interactions for our audiences and helping our products build more relevant features and content.

The session will provide an overview of big data platform, the types of data, and methods used to collect, transform and deliver data services. It will explain how data governance, security and privacy controls have been embedded into product development and data life cycles. It will cover actual challenges the teams have faced, across technology, data and processes, with some practical ways they have been overcome.

Delegates will learn from attending the session:

  • Learn practical applications of using big data for audience facing innovation.
  • Learn how data governance, information security and privacy-by-design principles have been embedded into product development and data life cycles.
  • Understand the challenges faced in the process of developing the big data platform, and approaches used to mitigate them.

14:25 - 15:15
Business Intelligence & Analytics Keynote: Six Stubborn Myths on Data
Enterprise Data Keynote: Big Data for Audience Personalisation at the BBC: Experiences in Wrangling Lots of Data, Evolving Technology and Changing Regulation

Janani Dumbleton, Head of Data, BBC

Leyla Khalili, Executive Product Manager, BBC

Leyla Khalili

Executive Product Manager

BBC

Leyla manages data products for the BBC Audience Platform, a suite of APIs and data warehousing capabilities supporting personalisation, insights and innovation for the organisation. Leyla is responsible for the vision and direction for the audience data platform, and works with stakeholders across the BBC to ensure the maximum value is derived from the data assets. Leyla has spent 15 years building software for a wide range of businesses of different industries and sizes, from startups to large organisations.

Empower Your Data and Think Differently with Data Virtualisation

Emanuel Chiavegato, Technical Product Manager, Royal Bank of Scotland & Erica Langhi, Solution Architect, Red Hat

Organisations are sitting on treasure troves of data, and BI teams with the right tools can mine this data for insights that could lead to the creation of new services and improve customer service and retention rates. In financial services, it is key to obtain sophisticated information, captured from different systems (transactional, risk, ledger, static), and deliver it on multiple channels. With a data virtualisation layer, data is offered in real time, ensuring its accuracy and reliability. In this session, we’ll discuss strategies to deliver real-time data, looking at how Royal Bank of Scotland used data virtualisation to implement a data access layer and support real-time data decisions within their organisation.

Key takeaways:

  • Use Data Virtualisation to avoid the need to copy data to construct data marts or data warehouses
  • Operate in real time
  • Reduce development time, work with agile prototyping
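At its simplest, the data virtualisation idea in the first takeaway is that a logical view joins live sources at query time instead of copying data into a mart. The toy below illustrates only the concept (the two "source systems" and field names are invented, and this is not RBS's architecture):

```python
# Conceptual sketch of a virtual view: two live sources, joined on demand,
# nothing materialised or replicated. Sources and fields are hypothetical.

transactions = [{"acct": "A1", "amount": -50},   # "transactional" system
                {"acct": "A2", "amount": 200}]
static_data  = {"A1": {"holder": "Alice"},        # "static" reference system
                "A2": {"holder": "Bob"}}

def customer_view():
    """A logical, read-time join: each query sees the sources' current state."""
    for t in transactions:
        yield {**t, **static_data[t["acct"]]}

print(list(customer_view())[0]["holder"])  # Alice
```

Because the join runs at read time, a change in either source is visible to the next query immediately, which is where the "operate in real time" benefit comes from.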

Everything Looks Like a Graph: Data Modelling Using Property Graphs

Thomas Frisendal, TF Informatik

People working with graph databases find that lifting a data model off the whiteboard and into a graph database is easy, because “everything looks like a graph”. But does that mean graph data modelling can be used in many contexts? Yes, it does – even if the data store is not a graph database! This holds true for most NoSQL stores, and even for SQL databases. In this presentation Thomas will look in some detail at data models with high degrees of connectedness (an email data model, for example). You will see that property graph data modelling is a very versatile approach across the board. It handles classic data modelling issues such as normalization in a visual manner, and graph data models communicate intuitively with business folks. In addition, some real, and common, modelling challenges are solved most elegantly using graph models. The presentation is targeted at a modelling audience with little or no exposure to graph data models, and it will illustrate how property graphs fit into NoSQL as well as SQL. It will also illustrate what cannot (easily) be done in data stores which are not graph technologies, and what the data modelling differences are.
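To ground the term: in a property graph, both nodes and relationships carry key-value properties. The sketch below models a fragment of the email example in plain Python (the data is invented; a graph database such as Neo4j expresses the same model natively and would answer such questions with a declarative query):

```python
# Minimal property graph: nodes and typed relationships, both with properties.
# The email-domain data here is purely illustrative.

nodes = {
    "p1": {"label": "Person", "name": "Alice"},
    "p2": {"label": "Person", "name": "Bob"},
    "e1": {"label": "Email",  "subject": "Q3 figures"},
}
edges = [
    ("p1", "SENT", "e1", {"at": "2017-11-21T09:00"}),  # relationship property
    ("e1", "TO",   "p2", {}),
]

def neighbours(node_id, rel_type):
    """Follow outgoing relationships of a given type."""
    return [dst for src, rel, dst, _ in edges if src == node_id and rel == rel_type]

# Who received the email Alice sent? (Person)-[SENT]->(Email)-[TO]->(Person)
sent = neighbours("p1", "SENT")
recipients = [nodes[n]["name"] for e in sent for n in neighbours(e, "TO")]
print(recipients)  # ['Bob']
```

Note how the whiteboard drawing (boxes and labelled arrows) maps one-to-one onto the data structure; that direct correspondence is what the session means by lifting the model straight off the whiteboard.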

Starting Enterprise Information Management Using DAMA DM-BOK

William (Bill) Carroll, Executive Manager – EIM, Kuwait Finance House

Starting up Enterprise Information Management requires a current situation assessment, developing a strategy and road map, building team competency, and guiding the implementation of Data Governance, Data Quality, Data Architecture, Metadata, and Master Data Management.

While the DM-BOK provides exceptional inspiration, you should infuse ideas from other thinkers in the different topics, and adapt best practices to your situation in order to achieve a working, successful program.

You will learn how to:

Use the DAMA DM-BOK for your maturity assessment

Adopt/Adapt ideas on Data Governance

Set up a Data Governance Program

Set up and populate a Metadata Repository (Business Glossary)

Set up a Data Quality Program

See how Data Governance, Metadata and Data Architecture facilitate starting an MDM Program

Propagating Data Principles

Andrew Newman, Data Policy Manager & Lisa Allen, DG Manager, Department for Environment Food & Rural Affairs

Defra group has a simple but ambitious aim: to create a great place to live in. They have a rich seam of data on the environment and things that affect the environment. In 2016, working together across ten organisations, they made over 13,000 data sets available as open data. This was their first step in their data transformation.

They now have a data transformation programme leading work across all of Defra’s organisations to realise the vision of Better Data, Better Used; helping their people to treat their data as an asset, whilst demonstrating how they can be more data driven and open by design.

To drive this approach they now have a single set of Data Principles. Their 9 Data Principles are easy to understand and apply to all data whatever its size, structure or format. As part of their Enterprise Architecture Principles, they have to be considered in all ICT projects.

In this session delegates will learn:

  • About Defra’s Data Principles
  • How they have rooted their Data Principles within their organisations
  • What difference this has made
  • How the principles are enabling transformation and enabling them to break down organisational data silos.

15:20 - 16:05
Empower Your Data and Think Differently with Data Virtualisation

Emanuel Chiavegato, Technical Product Manager, Royal Bank of Scotland

Emanuel Chiavegato

Technical Product Manager

Royal Bank of Scotland

Emanuel Chiavegato is Technical Product Manager at Royal Bank of Scotland, with more than 15 years' experience in IT, working in the financial sector and manufacturing industry. Having worked in different types of industries and in different roles, he has experienced different technologies and projects throughout his career. He is currently responsible for the DAL (Data Access Layer) platform, which provides data virtualization for Finance, Risk and Operations at Royal Bank of Scotland, creating the programme roadmap and driving its development and implementation.

Erica Langhi, Solution Architect, Red Hat

Erica Langhi

Solution Architect

Red Hat

Erica Langhi is a solution architect at Red Hat with over 15 years’ experience in the technology sector. Having held a variety of consultancy, technical, and solution architecture roles throughout her career, Erica now works with customers across a number of verticals for open source specialist Red Hat. Her specialisms include data integration, data virtualization, data fabric, application development, integration middleware, and enterprise architectures.

Everything Looks Like a Graph: Data Modelling Using Property Graphs

Thomas Frisendal, Data Architect, TF Informatik

Thomas Frisendal

Data Architect

TF Informatik

Thomas Frisendal is an experienced database consultant with more than 30 years on the IT vendor side and as an independent consultant. He has worked with databases and data modelling since the late 70s, and since 1995 primarily on data warehouse projects. He has a strong urge to visualize everything as graphs, even data models! He excels in the art of turning data into information and knowledge. His approach to information-driven analysis and design is "New Nordic" in the sense that it represents traditional Nordic values such as superior quality, functionality, reliability and innovation through new ways of communicating the structure and meaning of the business context. Thomas is an active writer and speaker. He lives in Copenhagen, Denmark. His firm, TF Informatik, was founded in 1995 and is registered in Denmark. He has published two books: Design Thinking Business Analysis – Business Concept Mapping Applied (Springer, 2012) and Graph Data Modeling for NoSQL and SQL – Visualize Structure and Meaning (Technics Publications, 2016). Follow Thomas @VizDataModeler

Starting Enterprise Information Management Using DAMA DM-BOK

William (Bill) Carroll, Executive Manager - EIM, Kuwait Finance House

William (Bill) Carroll

Executive Manager - EIM

Kuwait Finance House

William (Bill) Carroll began his career hanging tapes and sorting punch cards in a mainframe data centre in 1979. Since then, his hands-on technical experience includes programming with seven languages, working as a DBA with five DBMS products, working as a Data Architect many times, working on 13 DW/BI projects, designing three MDM programs and starting two EIM programs. His work experience spans 20+ clients in manufacturing, insurance, banking, environmental management, defence, customs, taxation, airline operations and air traffic management.  Mr. Carroll is currently the Executive Manager, EIM at Kuwait Finance House. His previous assignment was at National Bank of Abu Dhabi, where he started and ran the EIM program from 2010 until 2015.

Propagating Data Principles

Andrew Newman, Data Policy Manager, Department for Environment Food & Rural Affairs

Andrew Newman

Data Policy Manager

Department for Environment Food & Rural Affairs

Andrew Newman is a Data Policy Manager at Defra. He is leading the design of a data policy framework for Defra group. He is working to create a single set of data policy that can be used by all Defra organisations. He is using his 17 years of experience in geographic and environmental data management, data policy, strategy and engagement to drive significant transformation. Follow Andrew @andnewman

Lisa Allen, Head of Data Governance, Department for Environment Food & Rural Affairs

16:05 - 16:35
Networking Break & Exhibits
Building a Digital Publishing Analytics System

Adrian Wiles, Enterprise Data Architect, Financial Times

Adrian will share the story of how and why the Financial Times came to build its own real-time cloud based Web Analytics system. This presentation will cover:

  • Key features that were considered essential but missing from off-the-shelf solutions
  • Building a data enrichment and analytics pipeline able to cope with peak news events
  • Supporting developer flexibility and multivariate testing, as well as traditional web analytics
  • Maintaining a single version of the truth whilst providing real-time dashboards and audited business reports
  • Migrating trusted metrics from an incumbent system

One Small Step for Machine, One Giant Leap for the Market: Lessons Learnt from a Real World Machine Learning Project

Paul Nicholson, Head of Business Performance, Hastoe Housing Association

Hastoe Housing Association is taking part in a unique 3-year project with the University of Surrey, funded by Innovate UK, to apply state-of-the-art data analytics to the housing sector. By applying some clever "machine learning" techniques to our data we can now predict when a home will become vacant, whether a tenant will fall behind with their rent, and whether it's better to replace or repair a faulty boiler.

This is all very novel for social landlords and persuading sector experts to ‘make the leap’ to using predictive tools is proving to be the biggest challenge.

What you’ll learn from the session:

  • Lessons from a real world machine learning project and selling advanced data analytics in an emerging market
  • An introduction to techniques such as non-negative matrix factorisation, and how we've applied them to gain deep insights into our business
  • How Knowledge Transfer Projects (KTP) can benefit your business
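For readers unfamiliar with non-negative matrix factorisation, the technique named above, here is a hedged sketch of the classic multiplicative-update algorithm on synthetic data. The data, dimensions and feature interpretations are assumptions for illustration; the talk's actual models are not shown:

```python
import numpy as np

# Non-negative matrix factorisation: approximate a non-negative matrix
# V (e.g. tenants x behavioural features) as W @ H with W, H >= 0,
# using Lee & Seung's multiplicative updates. All data here is synthetic.
rng = np.random.default_rng(0)
V = rng.random((20, 8))   # 20 "tenants", 8 "features" (illustrative)
k = 3                     # number of latent patterns to extract
W = rng.random((20, k))   # pattern weights per tenant
H = rng.random((k, 8))    # feature patterns

eps = 1e-9                # guards against division by zero
errors = []
for _ in range(100):
    H *= (W.T @ V) / (W.T @ W @ H + eps)   # update H, holding W fixed
    W *= (V @ H.T) / (W @ H @ H.T + eps)   # update W, holding H fixed
    errors.append(np.linalg.norm(V - W @ H))
```

Because the updates are multiplicative, W and H stay non-negative throughout, which is what makes the factors interpretable as additive "parts" of the data.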

Moving Towards GDPR Compliance in a Complex Organisation

Norbert Eschle, Lead Information Architect, IFDS

Standardisation of data management and governance solutions across an enterprise provides significant value, for example in efficiency and auditability. However, business units within complex organisations often travel at different speeds, making such standardisation difficult to achieve. To provide appropriate data management and governance solutions to an organisation with different levels of readiness and need, an enterprise architecture function should adopt a pragmatic approach to solution and data architecture. Delegates will benefit by learning about:

  • Project challenges and pitfalls
  • Approach taken to identify enterprise and business area requirements
  • Approach to solution governance

Merging Enterprise Data and Measuring Its Value

Suzanne Coumbaros, Head of Data Governance, The Co-operative Bank

The activity of bringing data together can be costly, and can be overlooked when organisations make business decisions. This is particularly true when organisations grow through mergers and acquisitions, and when they try to reduce costs by simplifying their Enterprise Data landscape.

By understanding the inherited Enterprise Data effects of these decisions, organisations can be better informed of issues and costs that they should consider.

If you attend this session you will gain an understanding of what Enterprise Data you should pay close attention to, what data you should bring together, what data issues you need to look out for and how to measure the value of your data.

  • Enterprise data considerations
  • What data to merge
  • Data issues to consider
  • How to measure the value of data

16:35 - 17:20
Building a Digital Publishing Analytics System
One Small Step for Machine, One Giant Leap for the Market: Lessons Learnt from a Real World Machine Learning Project

Paul Nicholson, Head of Business Performance, Hastoe Housing Association

Paul Nicholson

Head of Business Performance

Hastoe Housing Association

Paul Nicholson has been with Hastoe for eight years helping to drive significant performance improvements across the business. Paul was instrumental in the formation of the Knowledge Transfer Partnership with Surrey University which was awarded funding from Innovate UK in 2015. Hastoe is the leading specialist provider of rural housing with affordable homes in over 250 villages across England. We have a reputation for sensitively designed, high quality homes to high environmental standards including Passivhaus.  Follow Paul @paulgnicholson

Moving Towards GDPR Compliance in a Complex Organisation

Norbert Eschle, Lead Information Architect, IFDS

Norbert Eschle

Lead Information Architect

IFDS

Norbert is Lead Information Architect for IFDS UK. In this role, he is working on regulatory projects, growing and establishing data architecture and its governance. He has a background in enterprise data management, business intelligence and analytics. Other work he has done included BI strategy and regulatory solution architectures.

Merging Enterprise Data and Measuring Its Value
17:20 - 18:30
Drinks, Reception & Exhibits
Wednesday, 22 November 2017 - Conference Day 2 & Exhibits
How to Create Massive IMPACT and be an Effective Zoo Keeper

Nigel Risner, Motivational & Inspirational Speaker

In an ever-changing world, with pressures coming from global sources, how do we make sure our teams are "in the room" and making an IMPACT? Nigel will share his 6-stage approach to keeping people energised, focused and, most importantly, achieving results.

He will also include a fun interactive communication session that will have delegates talking about it for days, weeks and months to follow.

In his unique style, he will identify everyone in the room and share with them how to manage the animals in their workplace by being an effective zoo keeper.

  • The power of focus
  • The cost of internal terrorists
  • The importance of communication

09:00 - 09:55
How to Create Massive IMPACT and be an Effective Zoo Keeper
09:55 - 10:25
Networking Break & Exhibits
Business Intelligence & Analytics
Enterprise Data
Logical Data Lake and Logical Data Warehouse: Two Sides of the Same Coin?

Rick van der Lans, R20/Consultancy

Many definitions exist for the popular data lake concept. Most resemble: "A data lake is a storage repository that holds a vast amount of raw data in its native format, including structured, semi-structured, and unstructured data." It makes sense to have one environment where all the data can be found in its rawest form; a data lake is incredibly useful for data science and investigative analytics in particular. But does it really have to be a physical repository of data? Isn't it sufficient that users can access a system that gives them access to all the data? In other words, why not a logical (or virtual) data lake? The technology to develop them exists: data virtualization servers are mature enough to build data lakes on. This would avoid copying massive amounts of big data from their sources to the data lake.

But what’s the difference between a logical data lake and a logical data warehouse? Don’t they do the same thing? Are they not the same thing? Both present a heterogeneous set of data sources as one logical database to the users. This tutorial explains how the two concepts of virtual data lake and logical data warehouse can be merged together and still run typical data lake and data warehouse workloads. They are really two sides of the same coin. One integrated architecture is presented that covers both concepts.

  • What are the limitations of a physical data lake, and what are the benefits of a logical data lake?
  • How do we set up one integrated architecture for a logical data lake and a logical data warehouse?
  • How easy is it to make new data sources available for reporting, analytics and data science?
  • How can big data stored in Hadoop and NoSQL systems be made available to analysts and data scientists easily and transparently?

Cutting Diamond - the Art of Big Data

Jane Chang, Enterprise Architect, British Gas

There is no shortage of technologies supporting data analytics and big data exploitation. But the success of big data mining does not rest on the superiority of the technologies alone; it rests on the art of mining. It takes knowledge of the techniques, as well as business intelligence about the objective, to create true impact.

In this talk, Jane Chang will discuss her experience in data analytics using different tools and platforms, and how true value is gained only when the right level of business intelligence is applied. She will share her view of the success factors, in particular some of the learning outcomes from an Innovate UK funded research project.

Don’t Become a Data Quality Slave

Cristina de Salas, Data Quality Expert, Zurich Insurance

All companies want to claim that they have good data quality. But what is good quality? Even data quality should have quality limits; otherwise you become a slave of your solution. Data quality for its own sake is counterproductive, and it should not be forgotten that it only adds value when it is designed to serve the business. Zurich is already hands-on in managing its exposure data quality. This session will focus on:

  • Myths of Data Quality
  • Limits of Data Quality
  • Insights into the Data Quality approach of Zurich

Open for Business: a Collaborative Approach to Developing Data Standards in the Environment Agency

Becky Russell, National Lead for Data Standards, Environment Agency & Nigel Turner, Global Data Strategy

The Environment Agency is at the forefront of the UK government’s Open Data agenda. There are many challenges in releasing a large number of data sets into the public domain, but ensuring that data is collected, processed and presented consistently has been recognised as critical. To meet this and other needs, the Environment Agency has put a strong focus on the development and implementation of data standards, and their enforcement through data governance and IT lifecycle management. This case study will highlight how to implement effective data standards in a large and complex organisation. A key message of the session is that a business led, collaborative approach is essential. It will do this by covering:

  • The primary drivers behind the Environment Agency’s focus on data standards, including Open Data and other requirements
  • The business and IT problems that data standards are addressing
  • A problem focused approach to the creation and development of data standards
  • The need for active business and IT participation through data governance
  • The relationship between data standards and data modelling
  • Lessons learnt and advice for other organisations trying to implement data standards and data governance

10:25 - 11:10
Logical Data Lake and Logical Data Warehouse: Two Sides of the Same Coin?
Cutting Diamond - the Art of Big Data

Jane Chang, Enterprise Architect, British Gas

Jane Chang

Enterprise Architect

British Gas

Jane is an experienced Enterprise Architect with over 20 years of experience in the Energy industry. In recent years, she has developed a passion for exploring the use of data in industrial problems. She was involved in industry projects such as one that is in relation to energy theft. She is the founding architect of the Smart Metering Programme within British Gas, and was instrumental in the early conception of the smart metering pilots and trials in British Gas. Her design thinking has laid some of the technical principles underpinning the UK Smart Metering Model today. Renowned for taking a systemic approach, Jane expanded her architectural thinking into Smart Homes, building the conceptual foundation of Hive. She is currently working on the concept design of a local energy market within Smart Grid.

Don’t Become a Data Quality Slave
Open for Business: a Collaborative Approach to Developing Data Standards in the Environment Agency

Becky Russell, National Lead for Data Standards, Environment Agency

Nigel Turner, Principal Information Management Consultant EMEA, Global Data Strategy

Nigel Turner

Principal Information Management Consultant EMEA, Global Data Strategy

Nigel Turner is Principal Information Management Consultant for EMEA at Global Data Strategy Ltd. and Vice-Chair of the Data Management Association of the UK.  Nigel has worked in Information Management for over 25 years, both as an in-house implementer of Information Management solutions at British Telecommunications plc and subsequently as an external consultant to more than 150 clients, including the Environment Agency, British Gas, HSBC, Intel US and others.  Follow Nigel @NigelTurner8

Data Warehousing: Today and Beyond

Kent Graziano, Snowflake Computing

The world of data warehousing has changed! With the advent of Big Data, Streaming Data, IoT, and the Cloud, what is a modern data management professional to do? It may seem to be a very different world with different concepts, terms, and techniques. Or is it? Lots of people still talk about having a data warehouse or several data marts across their organization. But what does that really mean today? How about the Corporate Information Factory (CIF), the Data Vault, an Operational Data Store (ODS), or just star schemas? Where do they fit now (or do they)? And now we have the Extended Data Warehouse (XDW) as well. How do all these things help us bring value and data-based decisions to our organizations? Where do Big Data and the Cloud fit? Is there a coherent architecture we can define? This talk will endeavor to cut through the hype and the buzzword bingo to help you figure out which parts of this are helpful. I will discuss what I have seen in the real world (working and not working!) and a bit of where I think we are going and need to go, today and beyond.

  • What are the traditional/historical approaches
  • What have organizations been doing recently
  • What are the new options and some of their benefits

Mining Your Data for Gold at Barrick Gold Corporation

Ed Humphries, Head of Digital Transformation, Barrick Gold Corporation & Melanie Mecca, Director, DM Products and Services, CMMI Institute

Barrick Gold Corporation (Toronto) is the world’s largest gold mining company, creating wealth through responsible mining for people, communities, and countries. Barrick is pioneering transformation of the mining industry through technology and innovative exploitation of data assets. “Digital Barrick” depends on data-driven discovery of business opportunities through advanced analytics and machine learning. Because effective management of data assets is essential, Barrick selected the Data Management Maturity (DMM) Model to precisely evaluate current practices and implement policies and processes critical for Digital Barrick’s success. They will discuss:

  • Barrick’s vision and the Common Data Platform
  • How the DMM creates a foundation for success

Panel Discussion: Benefits that Improved Data Exploitation Can Deliver an Organisation – Both Tangible and Intangible

Moderator: Julian Schwarzenbach, Data and Process Advantage Ltd & Chair, BCS DMSG

Data is an input and enabler to most business activities; however, the quality of data and the effectiveness of its exploitation do not always support organisational objectives.

This panel debate will explore:

  • Types of benefits
  • Data to support benefits delivery
  • Organisational exploitation of data

The Future of Data Governance: Data Governance in the Data Lake

Michael Davis, DG and DQ Leader, Voya Financial

This presentation highlights the key differences between Big Data Governance and traditional Data Governance. Michael will explore new approaches to data governance by surveying the current landscape and offering real-world, practical solutions that enable organizations to make the leap from traditional to Big Data Governance. Attendees will walk away with knowledge of how to quickly develop an Enterprise Data Management strategy that takes advantage of emerging Big Data technologies and governance solutions that enable quicker and better organizational decision-making.

The following will be covered:

  • Implementing a Big Data Governance and self-service analytic strategy in your organization to fuel sustainable data-driven insights and solutions
  • Leveraging Unstructured and Structured data to build data products and solutions
  • Best practices and principles for implementing a Data prep and MetaData management strategy

11:15 - 12:00
Data Warehousing: Today and Beyond
Mining Your Data for Gold at Barrick Gold Corporation

Ed Humphries, Head of Digital Transformation, Barrick Gold Corporation

Ed Humphries

Head of Digital Transformation

Barrick Gold Corporation

Ed Humphries is the Head of Digital Transformation for Barrick Gold Corporation, based in Toronto.  Ed is a dynamic leader with a proven track record in delivering high growth in the complex operational and regulatory setting of emerging markets. He is leading a 50M+ program consisting of a massive modernization of Barrick’s technology stack and associated data layer to unleash the power of advanced analytics, machine learning, and unstructured sensor data and communications, in the service of cutting edge mine planning, operations, and environmental controls.

Melanie Mecca, Director, DM Products and Services, CMMI Institute

Melanie Mecca

Director, DM Products and Services

CMMI Institute

Melanie is a regular presenter at many conferences, including DataVersity conferences, Data Driven Business, ISACA, Enterprise Data World, Predictive Networks, and CDO Summits.  She writes a quarterly column for The Data Administration Newsletter, “Data Management Introspective” featuring analysis of trends in data management and special topics, such as cost avoidance and organizational structures applied to the data layer. She is the managing author of the Data Management Maturity Model and led development of a sequence of certification courses for the Enterprise Data Management Expert certification.

Panel Discussion: Benefits that Improved Data Exploitation Can Deliver an Organisation – Both Tangible and Intangible

Julian Schwarzenbach, Director & Chair, Data and Process Advantage Ltd & BCS Data Management Specialist Group

Julian Schwarzenbach

Director & Chair

Data and Process Advantage Ltd & BCS Data Management Specialist Group

Julian is an experienced data management professional with a track record of over 25 years involvement in data exploitation in a variety of industrial sectors both as an end user and through provision of consultancy services. Julian is particularly interested in the organisational exploitation of data and is creator of the popular Data Zoo concept which considers the generic behaviours that people exhibit in data situations and the organisational drivers for these behaviours.

Lars Slagboom, Head of Data Management, ABN AMRO

Nikolai Petrou, Data Strategy Consultant, PA Consulting Group

Nikolai Petrou

Data Strategy Consultant

PA Consulting Group

Nikolai Petrou leads the Data Strategy capability within PA Consulting’s BI & Analytics practice. As lead consultant he designs data analytics teams to deliver the people, process and technology required for data driven decision making. Follow Nikolai @nikolaipetrou

Sarah Burnett, Chief Data Architect, Department for Environment, Food and Rural Affairs

Sarah Burnett

Chief Data Architect

Department for Environment, Food and Rural Affairs

Sarah Burnett is currently implementing a Data Architecture Framework for use across Defra, as part of the Data Transformation Programme, to contribute to the organisation’s commitment to deliver data driven services. She has worked in Data Management for over 20 years and implemented solutions for master data management, geographic information, data transformation and migration, across diverse areas such as Property, Planning, Education, Social Care, Defence and the Environment.

The Future of Data Governance: Data Governance in the Data Lake

Michael Davis, Data Governance Leader, Voya Financial

Michael Davis

Data Governance Leader

Voya Financial

Michael Davis is a Data Governance leader at Voya Financial. Previously, Michael was the Data Quality Center of Excellence Engineering Lead at Cigna Healthcare, where he was instrumental in establishing the Cigna data team. Mr. Davis is the founder of OmegaSoft Consulting, where he provided customers with staff augmentation and database management services. He has over 18 years of information technology experience and has presented and published articles on database management. Mr. Davis hails from St. Croix, U.S. Virgin Islands, and is an avid golfer and runner. He is also very passionate about health and wellness, and started the first local chapter of Health 2.0 in Connecticut. Health 2.0 is an organization with a mission to improve healthcare outcomes using social media and other web-based technologies.

12:00 - 13:30
Lunch, Exhibits & Perspective Sessions
12:30 - 12:55
Perspective Session
Perspective Session
13:00 - 13:25
Perspective Session
Perspective Session
Business Intelligence & Analytics Keynote - Edge Analytics: The Next Frontier in Smart Business

Mike Ferguson, Intelligent Business Strategies

For many years companies have been building data warehouses and data marts for reporting and analysis, with only a handful of professionals doing data mining and statistical analysis. The arrival of big data, however, shone a spotlight on predictive and advanced analytics, and we are now at the point where they are considered strategic in the boardroom. Today data science is mainstream, but still mostly focused on machine learning on data stored centrally. However, the demand to analyse low-latency streaming data and the emergence of the Internet of Things (IoT) are leading many to ask: why do we have to analyse all data at the centre? Why not at the edge, closer to where the data is being generated? With so much data being generated, and much more to come, would pushing analytics into the network not scale better? This keynote looks at why companies now need to develop models and rules centrally but deploy them anywhere, all the way out into the network. It looks at why edge analytics is fundamental to being able to scale to manage IoT, and how streaming data and distributed execution of an integrated suite of analytics can enable the "always on" intelligent business.

  • The explosion of data and things that are emitting it
  • Prevention and opportunity – use cases for streaming analytics
  • Why do we have to move all data to the centre before analysing it?
  • Fast data and fast action edge analytics – develop centrally and deploy anywhere
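The "develop centrally, deploy anywhere" idea above can be sketched as a rule derived centrally from historical data and then evaluated at the edge against a local stream. The rule format, metric name and readings here are hypothetical, purely to illustrate the shape of the pattern:

```python
# Central step: derive a simple anomaly rule from historical data.
# Only the compact rule is deployed to the edge, not the data itself.
history = [20.1, 19.8, 20.5, 21.0, 20.2]
mean = sum(history) / len(history)
rule = {"metric": "temperature", "threshold": mean + 3.0}  # illustrative rule format

# Edge step: evaluate the deployed rule on a local sensor stream,
# raising alerts without shipping every raw reading to the centre.
def edge_filter(stream, rule):
    return [r for r in stream if r[rule["metric"]] > rule["threshold"]]

stream = [{"temperature": 20.4}, {"temperature": 29.9}, {"temperature": 21.1}]
alerts = edge_filter(stream, rule)
```

In a production setting the "rule" would more likely be a trained model pushed to gateways or devices, but the division of labour is the same: heavy learning at the centre, cheap scoring at the edge.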

Enterprise Data Keynote: Are We the Baddies? The Ethical Wakeup Call for Information Professionals and Data Provocateurs in the IoT Age

Daragh O Brien, Castlebridge

The pace of change and evolution in information management appears to accelerate year after year, with each generation promising newer and better ways to improve our lives. Whether it is making trains run on time or some other panacea to a social ill, our technology nirvana is always just one more release away.

But technology is neutral. The dark side of that panacea is that one person's technology-enabled dream can be another's digitally enhanced nightmare: from ethical bias in sentencing systems, to business models built on the digital servitude of people, to fake news, to simple issues of workplace safety in the digital age.

We don’t have to reinvent the wheel or throw the baby out with the bath water to embrace the opportunities posed by the Ethical Information Management Future as many of the lessons we need to learn have already been taught (we just haven’t been paying attention).

  • Get valuable insights on the reality of consumer attitudes to privacy
  • Understand how proven principles and practices can support Ethical Information Governance
  • Find out if you are really one of the baddies or not.

13:30 - 14:15
Business Intelligence & Analytics Keynote - Edge Analytics: The Next Frontier in Smart Business

Mike Ferguson, Managing Director, Intelligent Business Strategies

Mike Ferguson

Managing Director

Intelligent Business Strategies

Mike Ferguson is Managing Director of Intelligent Business Strategies Limited.  As an analyst and consultant he specialises in business intelligence / analytics, data management, big data and enterprise architecture.  With over 35 years of IT experience, Mike has consulted for dozens of companies on business intelligence strategy, technology selection, enterprise architecture and data management.  He has spoken at events all over the world and written numerous articles.  Formerly he was a principal and co-founder of Codd and Date Europe Limited – the inventors of the Relational Model, a Chief Architect at Teradata on the Teradata DBMS and European Managing Director of Database Associates.  He teaches popular master classes in Big Data, Predictive and Advanced Analytics, Fast Data and Real-time Analytics, Enterprise Data Governance, Master Data Management, Data Virtualisation, Building an Enterprise Data Lake and Enterprise Architecture.  Follow Mike on Twitter @mikeferguson1.

Enterprise Data Keynote: Are We the Baddies? The Ethical Wakeup Call for Information Professionals and Data Provocateurs in the IoT Age
Addressing the Need for Data Agility in the Insurance Industry

Ranjeet Athwal, MI Architect and Mark Bendall, Nova Programme Architect, Enstar

The traditional RDBMS approach to data warehousing has often proved costly and protracted, and is not responsive to change. Using a modern BI architecture, we will demonstrate how Hadoop and complementary technologies can be used to deliver a platform that is secure, flexible and cost-effective, and that delivers early business benefit with reduced IT intervention.

We’ll also describe how we overcame some of the challenges, for example, new design patterns when using an immutable file system to manage data that is susceptible to change, and the desire to move away from a plethora of reports and extracts to guided self-service data exploration using visualisations. Finally, we’ll describe how we are using the same platform, with data science toolkits, to support the more advanced predictive analytics that have always been inherent in insurance but in the past needed specialist solutions.

Delegates will learn:

  • How a simple yet effective design pattern using big data technologies, combined with the appropriate controls, delivers a cost-effective BI platform
  • A sample of the considerations addressed and key design decisions taken on our journey to evolving this platform
  • A summary of the challenges faced in introducing this technology, including resources/skills, a new BI paradigm, and the impact on existing business processes

Predicting Social Sustainability of Global Supply Chain for a Better World

Anis Radianis, Business Intelligence Manager, Foreign Trade Association

There is growing interest in reaping the benefits of artificial intelligence (AI) technology, but in reality the challenge of incorporating AI into existing business intelligence (BI) technology is not an easy endeavour. The purpose of this presentation is therefore to give the audience some real-world examples of how the FTA leverages its existing BI technology to exploit new AI technology, in order to help the business with one of its most challenging issues: predicting social sustainability.

Attendees will learn:

  • How existing BI technology & processes can be leveraged to support AI
  • How to use AI technology & processes to achieve predictive capabilities
  • What are the technological & business benefits

An Insight to the Introduction of Gamification within an Enterprise DQ Solution

Dan Griffiths, Lead Data Analyst, BAE Systems

Having moved into a Lead Data Quality role in order to deliver solutions that would vastly improve the quality of data across the enterprise, Dan struggled with audience engagement in some of his early implementations. After seeing some of the more traditional delivery methods he had used in his Business Intelligence role fail, he took a gamble and introduced gamification into his Data Quality solution. This session takes you through the journey Dan took and shares some of the successes and failures experienced both during and after the implementation. Employee motivation and engagement are huge factors in any Data Quality implementation; attend this session to see if gamification could help to increase your chances of delivering successful Data Quality solutions.

  • Benefits of implementing gamification into your data quality solution
  • Why this can work within your workforce
  • When not to use gamification

Driving Business Growth from Data – Moving from Notional Value to Deliverable Initiatives

Mike Maddock, Director, Kader Technology Ltd

With the explosive growth in data over the last ten years, it is arguable that few businesses have yet exploited the true potential value of their data. With technology trends converging and the real cost of technology falling, organisations face the risk of being overtaken by smaller, more agile companies. How can organisations mitigate this risk and move the internal debate from conceptual-level discussions of value to well-defined, concrete initiatives that play to the strengths of their business? What are some of the opportunities around data that emerging technology now presents? This session will discuss these considerations and provide insight into how other organisations have leveraged their data for commercial advantage, highlighting some of the challenges to be resolved on the journey towards being a truly data-centric company.

In this session, the delegates will learn:

  • How to move from abstract notions of the value of data, to specific and concrete areas aligned with business strategy
  • How converging technology trends now provide transformational opportunities for business
  • What are the key challenges businesses are likely to face in their journey to exploit their data

14:20 - 15:05
Addressing the Need for Data Agility in the Insurance Industry

Ranjeet Athwal, MI Architect, Enstar

Ranjeet Athwal

MI Architect

Enstar

Ranjeet is a practicing Enterprise Architect in the Insurance sector specialising in data and solutions architecture.

Areas of expertise include data warehouse implementation, analytics, MI and application design and integration in both relational and big data platforms.

Mark Bendall, Nova Programme Architect, Enstar

Mark Bendall

Nova Programme Architect

Enstar

Mark has 20 years’ experience designing and implementing IT solutions, including several years at lead architect level. Specialising in data and effective in enterprise data and solutions architecture roles, Mark has also been working with Big Data and Advanced Analytics in a number of industries including insurance and finance, retail and distribution.

Predicting Social Sustainability of Global Supply Chain for a Better World

Anis Radianis, Business Intelligence Manager, Foreign Trade Association

Anis Radianis

Business Intelligence Manager

Foreign Trade Association

Anis Radianis has 20 years of experience in the IT industry, specifically in business intelligence & analytics. He has worked with companies such as Coca-Cola, Levi Strauss, 3M & many others. Anis is a doctoral candidate at Ecole des Ponts Business School in France with research interests in sustainability & artificial intelligence. He has a bachelor’s degree in Economics from the University of Indonesia and a master’s degree in IT Architecture from Antwerp Management School. He holds masterclass certificates from TIAS in security management, AMS in IT leadership, TU Delft in Enterprise Innovation & Engineering, and MIT in Big Data. Anis holds professional credentials including PMP from PMI, CBIP from TDWI, CDP from ICCP and CDMP from DAMA, and is TOGAF certified, among many others. He is co-author of the book “Information Security Management Based on ISO 27001:2013: Do-It-Yourself and Get-Certified”.  Follow Anis @anisradianis

An Insight to the Introduction of Gamification within an Enterprise DQ Solution
Driving Business Growth from Data – Moving from Notional Value to Deliverable Initiatives

Mike Maddock, Director, Kader Technology

Mike Maddock

Director

Kader Technology

Mike is Director of Kader Technology Ltd, a consultancy firm that specialises in helping organisations leverage technology for competitive advantage. Currently working with Thomas Cook Group PLC, Mike leads a team of Solution Architects responsible for the realisation of business, data and technology strategies and the creation of a Group-wide executable technology roadmap.  Previously Mike provided consultancy and research on emerging consumer and technology trends for BGL Group, highlighting the potential impacts and opportunities to the CIO and the senior executive community. His remit also included the development of the BGL Group technology strategy, and he was a key contributor to the Group’s data strategy through the design and facilitation of workshops for the Executive Board and other senior business stakeholders. Mike was also heavily involved in innovation initiatives within BGL Group, including the setting up and running of ctmLabs, a dedicated innovation team within comparethemarket.com. With over 25 years’ experience in IT, Mike began his career designing mission-critical systems for the medical sector before moving to financial services in 1997. Mike has held technical architecture positions at Cap Gemini and Thomas Cook, and was the Head Software Architect at Travelex.

15:05 - 15:30
Networking Break & Exhibits
Putting Your Most Valuable Data Asset to Work - Current Challenges in Storing, Handling, and Working with Customer Data

Timo P. Kunz, Data Scientist, Catawiki

Is your organisation in the fortunate position that it is not only capable of capturing data but has also managed to recruit capable people who know how to tackle the most daunting data challenges? Further, are ideas on how to turn insights and model outputs into action plentiful? While this clearly puts you ahead of the curve, the next steps towards the success of your data operation are no less mission-critical.

This talk will discuss challenges in storing, handling and working with what is most likely your company’s most valuable data asset: customer data. We address the organisational and UX challenges in capturing user data, the legal implications imposed by GDPR, as well as tradeoffs and compromises when designing a robust infrastructure that does not stifle the creativity or velocity of your analysts and data scientists.

In a Chaotic World, How Do You Solve a Problem Like Analytics?

Bernard Panes, Analytics Solution Architect & Bill Dawson, Innovation Program Ops Lead, Accenture

Nature is amazing: it solves complex information problems elegantly, and there isn’t a leader or a plan in sight…

In a digital world ‘chaos’ is the new normal, and turning data into actioned insight becomes exponentially harder to manage…

So, what can we learn from nature and its principles like replication, mutation, and competition?

We would like to share a pioneering approach to analytics transformation using Liquid Architecture and Lean Startup techniques; experimenting and rapidly evolving product designs through fast learning cycles.

You will learn:

  • Why you should make the case for a value-focused, lean approach to your analytics project, and why it will probably be more successful; delivering greater value, less waste, a faster time to outcome, and reduced risk
  • What analytics components you should focus your design, build and delivery effort on when proving value… and, equally importantly, what you should stop doing
  • How to structure and staff an analytics initiative from ideation through to fully scaled production, and what skills, behaviours, and platform enablers this approach requires
  • Project examples of how decentralised evolutionary systems have inspired us to change our approach, and what we are learning from the experience

Enterprise Wide Data Quality Programme

Lars Slagboom, Head of Data Management, ABN AMRO

In 2016 ABN AMRO started a DQ programme. They began with 13 different lists of DQ issues, minimal guidance and no bank-wide policy on DQ. Today they have a single registration of all their DQ issues, bank-wide governance on three different levels, bank-wide awareness around DQ, and bank-wide tooling for DQ as well as Reference & Master Data. They are now taking the next steps and accelerating their DQ programme.

  • How did they implement their bank-wide DQ programme?
  • How did they create awareness around DQ?
  • What is the impact of Reference & Master Data management on DQ?

Data Modelling is Not JUST for DBMS’s

Chris Bradley, Data Management Advisors

Have you ever heard any of these comments regarding Data Modelling?

  • What’s the point of data modelling?
  • We don’t need models as we use packages
  • We’re an agile shop, no need for models.
  • We don’t build custom DBMS’s so don’t need Data models.

Unfortunately, these and other similar comments are still heard across organisations worldwide.  In part the problem is the way in which data modelling has been taught, with its focus on the development of technical solutions. Although data modelling has been around for over 30 years, its original roots were firmly in the DBMS world, but the world has moved on. Today’s business systems landscape isn’t just about developing “new” DBMS-based systems from scratch, yet this is all too often how data modelling is taught and promoted. In most organisations today the IT portfolio contains a variety of additional components, such as COTS packages (ERP, CRM, ECM etc), BI & DW systems, NoSQL, SOA & XML message-based systems, communication with the business, data quality & governance, and more.

So, is DATA important for these systems? – you bet.

Has data modelling moved on to cater for these? Well – that’s what this talk is about!  We’re all probably familiar with how to create a database from a logical and physical data model. But how do the rules change when we’re dealing with an ERP package or XML in SOA applications?  How can we leverage our existing logical data models for this new audience?

This session will re-emphasise the “traditional” place modelling has in the DBMS design lifecycle.  It will then go on to show how data modelling can be used, and why it’s vital, in other areas of the application portfolio. Chris will describe why data modelling is NOT just for use in DBMS design – in fact it hasn’t been for a long time. He will also show how the techniques we learned in the ’70s and ’80s for the pre-relational era are useful again now, and why data models are essential for COTS package implementation.

15:30 - 16:15
Putting Your Most Valuable Data Asset to Work - Current Challenges in Storing, Handling, and Working with Customer Data

Timo Kunz, Data Scientist, Catawiki

Timo Kunz

Data Scientist

Catawiki

Timo has spent over a decade in the retail industry, mainly working with and researching pricing and promotion related topics. He is currently a Data Scientist for Catawiki where he focuses on customer behaviour modelling, customer analytics, personalisation, and customer value. His previous experience includes working or consulting for companies such as Yoox Net-A-Porter, Morrisons Supermarkets, Dansk Supermarked, Boots, Swiss Coop, LVMH, SAP, and Simon Kucher & Partners. He holds a PhD in Management Science from Lancaster University and has published in journals such as Decision Support Systems and the Journal of Revenue & Pricing Management.

In a Chaotic World, How Do You Solve a Problem Like Analytics?

Bernard Panes, Analytics Solution Architect, Accenture

Bernard Panes

Analytics Solution Architect

Accenture

Bernard Panes is an Innovator within Accenture’s Digital practice.  As a Solution Architect in Analytics he shapes large complex transformation programs for clients in many domains.  Despite his expertise, he is sometimes sceptical about how anyone can assure or plan delivery in an ever more chaotic digital world.  Follow Bernard @bernardpanes

Bill Dawson, Innovation Program Ops Lead, Accenture

Bill Dawson

Innovation Program Ops Lead

Accenture

Bill Dawson is the Operations Lead for Accenture’s UK Innovation Programme and has a history of working with new technologies and helping clients with their digital strategies.  He spearheads the European Liquid Lean solutioning approach, enabling speedy progress from concept to proven value.

Enterprise Wide Data Quality Programme
Data Modelling is Not JUST for DBMS’s
Why Analytics Fails and How to Fix It

Jim Halcomb, Practice Leader, CMMI Institute

Organizations typically treat data as a byproduct of technology-enabled processes. Data managed at this level is focused on improving process results. This approach breaks down where data is used by downstream applications.

Analytics is at the end of the line for data, effectively exposing weaknesses in the data infrastructure. Without a data management program, analysts must themselves maintain the quality of the data they consume. This is a source of frustration for analysts, and their efforts rarely benefit the organization at large.

Jim will discuss:

  • Typical struggles thwarting analysts
  • Data management practices that would help
  • How to navigate this journey using the DMM

Attracting the Best Customer Base in Higher Education Using Intelligent Data Mining

Jagdev Bhogal, Senior Lecturer in Data Technologies, Birmingham City University & Guy Garrett, CEO, Achieve Intelligence Ltd

Higher Education Institutions (HEIs) need a strong position in the competitive market of the education sector. The new funding model for UK universities relies on student fees, so recruitment and retention are important to maintaining reputation and are financially advantageous. Data Mining (DM) and Business Intelligence (BI) are established tools in industry that can help improve decision making in HEIs. This session discusses the challenges being faced in student recruitment and describes the development of a prototype application to investigate how data mining can be used to identify important factors related to student recruitment. Jagdev and Guy investigate how to analyse HE results, enrolment and graduation data to identify, profile and target the students who are most likely to succeed on a variety of their undergraduate courses. Attendees will learn:

  • How the Higher Education sector can benefit from the way data analytics has been used in business
  • How student recruitment and retention go hand in hand to improve student progression
  • How profiling can be used to recruit the “ideal” students for each course

AI Dream, Winter, BI, Machine Learning and What is Next

Thiago Assuncao de Faria, DataOps & AI Consultant, LINKIT

Artificial Intelligence was the future and the dream in the ’60s. Then it became almost a curse word, ignored by many and feared by investors. Then came Data Mining, BI, Machine Learning and Deep Learning, and AI is blooming again. What can we learn from this?

  • The history of AI
  • The current hype
  • Next steps and challenges

Using a Digital Assistant to Proactively Manage Enterprise Data as an Asset

Delphine Clement, Senior Business Program Manager & Erman Oral, Data Engineer, Microsoft

More and more, the Enterprise needs data to run its business at the pace of a digital world. This data must be connected, appropriate, reliable, readily available and accurate for assisting the business teams with intelligent decision making.

  • Learn how My Data Health, a Hackathon-born Enterprise application, is a digital assistant that provides the business teams with proactive, contextualized data quality recommendations available at their fingertips.
  • Learn how business feedback is crowd-sourced for continuous active learning and adjustment of the recommendations’ relevance.
  • Learn how recommendation consumption, business satisfaction and the My Data Health product backlog are driven through UX design and telemetry.
  • Learn how the My Data Health digital experience was designed with the business teams through usability studies.
  • Learn how Cloud Services and Agile Delivery empower real-time and continuous business transformation.
  • Learn and leverage these lessons to start or pursue your own data transformation journey!

16:20 - 17:05
Why Analytics Fails and How to Fix It

Jim Halcomb, Practice Leader, CMMI Institute

Jim Halcomb

Practice Leader

CMMI Institute

Jim Halcomb possesses deep and diverse global data management experience from financial institutions, exchanges, and regulators.  Jim led the development of data management best practices, and was a primary author of the CMMI’s Data Management Maturity Model.  He currently leverages his background in corporate strategy, business intelligence, and extensive experience with various business models and cultures to train people in and assess organizations against data management principles and best practices.


Attracting the Best Customer Base in Higher Education Using Intelligent Data Mining

Jagdev Bhogal, Senior Lecturer in Data Technologies, Birmingham City University

Jagdev Bhogal

Senior Lecturer in Data Technologies

Birmingham City University

Dr. Jagdev Bhogal started her research career at Wolverhampton University. She joined Birmingham City University as a Senior Lecturer in Data Technologies and has been in academia for nearly 30 years. She is Programme Lead for the MSc Big Data Analytics and Lead for the BCU Oracle Academy. Jagdev has presented conference papers in areas such as NoSQL, mobile business intelligence and ontologies.  Follow Jagdev @drbhogal

Guy Garrett, CEO, Achieve Intelligence Ltd

Guy Garrett

CEO

Achieve Intelligence Ltd

Guy Garrett provides consulting services in the field of data integration, business analytics and business intelligence strategy. He has contributed several papers in these areas, most recently at SAS Global Forum 12 (where his paper attracted an encore solicited by Chair Andy Kuligowski) and at the 2012 SAS Professionals Convention in the UK. Guy has amassed over 20 years of SAS experience and is currently leading projects on a Data Warehouse & Quality Data Management Integration Programme, revolutionary motor insurance using big data telematics, and profit protection through high-performance loss ratio analytics.  Follow Guy @guygarrett

AI Dream, Winter, BI, Machine Learning and What is Next

Thiago Assuncao de Faria, DataOps & AI Consultant, LINKIT

Thiago Assuncao de Faria

DataOps & AI Consultant

LINKIT

Thiago is a different kind of Brazilian – why? Because he likes football, but mainly loves ice hockey! A mathematician and statistician by training, he has always loved modelling data, making predictions and implementing AI products. Ever mindful of the Ops side of Artificial Intelligence, he is an active member of the DevOps community (organiser of devopsdays Amsterdam) and a public speaker. Working at LINKIT as a DataOps & AI Engineer, he gets to do what he loves: implementing AI systems, data pipelines and chatbots!

Using a Digital Assistant to Proactively Manage Enterprise Data as an Asset

Delphine Clement, Senior Business Program Manager, Microsoft

Delphine Clement

Senior Business Program Manager

Microsoft

Delphine Clément has been a Senior Business Program Manager in Microsoft’s Data Management Services Organization since 2012. In her role, she is in charge of innovation as well as the incubation of new services in the area of Enterprise Data Management. Delphine has 17 years of experience in the domain of data quality. Prior to Microsoft, Delphine was a consultant in Data Quality and Data Governance at A.I.D. (Add Intelligence to Data – France) and the lead of the Customer Information Quality Team at Hewlett Packard Corporate. She is also co-founder of ExQI (Excellence, Quality, Information), the French association for the promotion of a data quality culture. She is certified by the MIT Information Quality Program – IQMI (2003) and MIT IQMII (2004) – as well as the Greenbelt Program. Delphine has co-written numerous professional papers.

Erman Oral, Data Engineer, Microsoft

Erman Oral

Data Engineer

Microsoft

Erman Oral pursued his education at Istanbul Technical University’s Computer Engineering Faculty. Together with friends from the academy, he participated in Microsoft’s Imagine Cup 2007 with their graduation project, which earned him his BSc degree, won their team the trophy in the domestic finals, and brought visibility to their project, “Parental Education Portal: Understanding Baby Language”, in the worldwide finals held in Seoul, South Korea. He was hired by Microsoft MEA IT and supported the end-to-end delivery and landing of data management solutions and apps in the local region, and afterwards the landing of the same worldwide as it became a best practice within IT shared services. In the meantime he completed his MA degree in Business & Information Systems at Boğaziçi University. After 5 years of experience in MEA and worldwide teams, he moved to Microsoft Germany in 2013. He has been involved in many EMEA and worldwide projects as project manager and software architect/designer, including the delivery of volunteering portals for emerging markets, consumer-segment apps for China, and the migration of existing platforms to the cloud.

17:10 - 17:20
Conference Close
Thursday, 23 November 2017: Post-Conference Workshops
Data Management in a Cloud Computing Environment

Mike Ferguson, Managing Director, Intelligent Business Strategies

Adoption of cloud computing continues to grow, and we are now at the point where many companies have deployed applications both off-premise on public clouds and on-premise on private clouds. They may even be using off-premise infrastructure to extend their private cloud environments. As this investment continues to grow, there is now a demand to seamlessly manage and govern data in a consistent way irrespective of its location in a cloud computing environment. This session looks in detail at the challenge of consistently managing data in a cloud computing environment and at what is needed to keep data consistent across off-premise and on-premise systems. In particular, it looks at important data management disciplines such as maintaining data privacy, data access security, data quality, data consolidation, data virtualisation, replication, master data management and data synchronisation across on-premise and off-premise clouds, and what is possible today. It also looks at hybrid data lakes and explores concerns about the added complexity that off-premise multi-tenancy brings. In addition, it will highlight problems that still need to be solved to get to a point where companies can confidently and freely manage off-premise and on-premise data in a seamless manner. The session looks at the following:

  • Pros and cons of deploying on the cloud
  • Public versus Private clouds
  • Deploying systems on public clouds – what are the options
  • Deploying systems on private clouds
  • Cloud-based data storage
    o Multi-tenant databases
    o Cloud object storage, e.g. Amazon S3, Azure Storage, Openstack Swift
    o Managing multiple databases in the cloud vs managing multi-tenant databases
    o Data warehousing with cloud-based analytical databases and cloud BI
    o Big Data in the cloud
    o MDM in the cloud
  • Managing data governance across cloud and on-premise systems
  • Ingesting data in the cloud – streaming and batch ingestion
  • Data quality on the cloud
  • Cloud-based data integration
    o Integrating cloud and on-premise data – options available
    o Integrating off-premise data sources into on-premise data warehouses
    o Integrating MDM with on-premise and off-premise systems
    o Data virtualisation across on-premise and off-premise data sources
  • Building information services in a cloud
  • Managing data privacy in a hybrid cloud computing environment
  • Managing data access security in a hybrid cloud computing environment
  • What works and what doesn’t?
  • Do’s and Don’ts
  • Getting started with cloud-based data management

Modern Data Warehouse Architectures: From A - Z

Rick van der Lans, Independent Analyst, Consultant, Author and Lecturer, R20/Consultancy

There was a time when a data warehouse architecture consisted of a chain of databases, all running on one or two machines in our own data centre, with handwritten ETL programs used to copy and transform data from one database to another. But so much new technology offering innovative opportunities has become available, there are so many new BI requirements, and there are so many new ways to design our data warehouse architectures, that data warehouse architects are struggling with all these developments. They have to find answers to an almost endless list of questions. Should the data warehouse be developed with Hadoop? Do we still need data marts if the BI tools read data into memory? Can we use Spark as a query performance booster? What does it mean to design datavault-based data warehouses? How do data streaming and the IoT work together with the data warehouse? Should we move the entire architecture into the cloud? Can we replace the data warehouse with a data lake? What is the role of the logical data warehouse? Will an analytical SQL database server solve all our query performance problems? And so on, and so on.

This full-day workshop covers all these architectural and technical developments. How are they interrelated? How do you migrate to a modern architecture? What are the pros and cons of all these developments?

You will learn:

  • What the use cases are for Hadoop and Spark in a data warehouse architecture
  • How to distinguish between five levels of BI in the cloud and how they differ
  • What the advantages are of using datavault as a design technique
  • Whether data warehouse automation is hype or reality
  • How Spark can be used to boost query performance and may even replace data marts
  • How a logical data warehouse and a virtual data lake can work together
  • How analysis of streaming data can be embedded in a more classic architecture
  • Why operational BI demands a new architecture

Making Enterprise Data Quality a Reality

Nigel Turner, Principal Information Management Consultant EMEA, Global Data Strategy

Many organisations are recognising that tackling data quality (DQ) problems requires more than a series of tactical, one-off improvement projects. By their nature many DQ problems extend across, and often beyond, an organisation. So the only way to address them is through an enterprise-wide programme of data governance and DQ improvement activities embracing people, process and technology. This requires very different skills and approaches from those needed on many traditional DQ projects.

If you attend this workshop you will leave more ready and able to make the case for, and deliver, enterprise-wide data governance & DQ across your organisation. This highly interactive workshop will also give you the opportunity to tackle the problems of a fictional (but highly realistic) company that is experiencing end-to-end data quality & data governance challenges. This will enable you to practise some of the key techniques in a safe, fun environment before trying them out for real in your own organisation.

Run by Nigel Turner of Global Data Strategy, the workshop will draw on his extensive personal knowledge of initiating & implementing successful enterprise DQ and data governance in major organisations, including British Telecommunications and several other major companies. The approaches outlined in this session really do work.

The workshop will cover:

  • What differentiates enterprise DQ from traditional project based DQ approaches
  • How to take the first steps in enterprise DQ
  • Applying a practical Data Governance Framework
  • Making the case for investment in DQ and data governance
  • How to deliver the benefits – people, process & technology
  • Real life case studies – key do’s and don’ts
  • Practice case study – getting enterprise DQ off the ground in a hotel chain
  • Key lessons learned and maxims for success

GDPR One Day DPO Intensive: Key Skills for the Data Protection Officer

Daragh O Brien, Leading Consultant, Educator and Author, Castlebridge

The role of the Data Protection Officer (or Chief Privacy Officer for our North American cousins) will increasingly be a critical one in organisations processing personal data. The General Data Protection Regulation (GDPR, coming into force on 25th May 2018) makes it a mandatory role in certain circumstances, but it is generally recognised as good practice for organisations to have someone with responsibility for the oversight and governance of data privacy issues and obligations.

This workshop will take you through a detailed overview of the DPO as a Data Governance role, looking at the key skills and knowledge a DPO must have. Opening with a whistle-stop tour of Data Protection law principles, the workshop will then:

  • Examine Article 29 Working Party guidance on the role of the DPO and how that maps to good practice in Data Governance
  • Look at the role of the disciplines of the DMBOK wheel in effective Data Privacy Compliance
  • Demonstrate how Data Quality principles, practices, and methods can be applied by a DPO to support Privacy Impact Assessments and demonstrate effectiveness of compliance
  • Provide an overview of how effective data governance and stewardship practices are key to aligning day-to-day information management with the requirements of data privacy compliance
  • Examine how Agile approaches to Governance and Master Data Management can help ensure a responsive and proactive data privacy governance environment for the DPO in your organisation.

09:00 - 16:30

  • Data Management in a Cloud Computing Environment
  • Modern Data Warehouse Architectures: From A - Z
  • Making Enterprise Data Quality a Reality
  • GDPR One Day DPO Intensive: Key Skills for the Data Protection Officer

View a PDF of the agenda here for the full programme, including comprehensive session details and speaker information.

Fees

  • 4 Days: £1,945 + VAT (£389) = £2,334
  • 3 Days: £1,595 + VAT (£319) = £1,914
  • 2 Days: £1,245 + VAT (£249) = £1,494
  • 1 Day: £795 + VAT (£159) = £954
Group Booking Discounts

  • 2-3 delegates: 10% discount
  • 4-5 delegates: 20% discount
  • 6+ delegates: 25% discount
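The fee arithmetic above can be sketched in a few lines of Python. One point is an assumption, not stated in the brochure: that the group discount is applied to the ex-VAT fee, with 20% VAT then added to the discounted amount.

```python
# Sketch of the conference fee arithmetic. Assumption (not stated in the
# brochure): the group discount applies to the ex-VAT fee, and 20% VAT
# is then added to the discounted amount.

VAT_RATE = 0.20

# Ex-VAT fee per delegate, keyed by number of days attended (from the table).
FEES = {1: 795, 2: 1245, 3: 1595, 4: 1945}

def group_discount(delegates: int) -> float:
    """Discount fraction for a given group size, per the table above."""
    if delegates >= 6:
        return 0.25
    if delegates >= 4:
        return 0.20
    if delegates >= 2:
        return 0.10
    return 0.0

def booking_total(days: int, delegates: int) -> float:
    """Total cost including VAT for `delegates` people attending `days` days."""
    net_per_delegate = FEES[days] * (1 - group_discount(delegates))
    return round(net_per_delegate * (1 + VAT_RATE) * delegates, 2)
```

For a single delegate this reproduces the published totals (e.g. 4 days: £2,334); for three delegates on the 4-day pass it gives £6,301.80 under the stated assumption.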

UK Delegates: Expenses for travel, accommodation and subsistence incurred whilst attending this IRM UK conference will be fully tax deductible by the employer company if attendance is undertaken to maintain the professional skills of the attending employee.

Non-UK Delegates: Please check with your local tax authorities.

Cancellation Policy: Cancellations must be received in writing at least two weeks before the commencement of the conference and will be subject to a 10% administration fee. It is regretted that cancellations received within two weeks of the conference date will be liable for the full conference fee. Substitutions can be made at any time.

Cancellation Liability: In the unlikely event of cancellation of the conference for any reason, IRM UK’s liability is limited to the return of the registration fee only. IRM UK will not reimburse delegates for any travel or hotel cancellation fees or penalties. It may be necessary, for reasons beyond the control of IRM UK, to change the content, timings, speakers, date and venue of the conference.

Venue

  • Radisson Blu Portman Hotel
  • 22 Portman Square
  • London W1H 7BG
  • UK

Platinum Sponsors

Silver Sponsor

Standard Sponsors

Supported By

DAMA International

DAMA International is a not-for-profit, vendor-independent association of technical and business professionals dedicated to advancing the concepts and practices of data resource management and enterprise information. The primary purpose of DAMA International is to promote the understanding, development, and practice of managing data and information to support business strategies. As Data Management becomes more relevant to the business, DAMA is keeping pace with new products and services such as the 2nd edition of the DAMA Data Dictionary, the DAMA-DMBOK (Data Management Body of Knowledge) and several new certification exams. We are participating on the Boards of many academic and standards bodies and sharing our knowledge with other organizations.
DAMA International is pleased to announce that a new chapter is forming in Turkey, which will join the 8 other European chapters as part of DAMA International. DAMA International and its affiliated chapters have grown year after year, with chapters operating in Australia, China, India, North America, South America, Japan and South Africa, and DAMA is facilitating the formation of new chapters in many other countries.
As a DAMA member you receive the benefits of your local or global chapter’s activities and all the benefits of DAMA International’s products and services. You can network with other professionals to share ideas, trends, problems, and solutions. You receive a discount at DAMA International conferences and seminars, and on associated vendors’ products and services. To learn more about DAMA International, local chapters, membership, achievement awards, conferences and training events, subscriptions to DM Review and other publications, discounts, job listings, education and certification, please visit the DAMA International web page at www.dama.org. Both the DAMA UK chapter and DAMA International will have a meeting during the conference. We invite interested parties to join this vital and growing organization. More information can be found at www.dama.org or you can email president@dama.org.

DAMA UK

The drive for the future is to focus successfully on providing quality support to core members whilst guaranteeing sufficient financial income to ensure sustained activity. The four areas which DAMA UK recommends addressing over the next two years are:

  • Academic – to survey UK organisations to understand their Data Management skill set needs and then encourage academic institutions to supply them.
  • Data Quality (DQ) – to benchmark data quality standards in the UK, encourage development of business awareness of the importance of DQ and help develop DQ metrics.
  • Government regulations versus data – to increase awareness of the legal implications of data management, assist organisations in reducing their legal liabilities and support the ETA (and others) in lobbying for “data clever” legislation.
  • Data Standards – to survey requirements, then work with other organisations (e.g. BCS) to develop effective data standards.

Association of Enterprise Architects   

The Association of Enterprise Architects (AEA) is the definitive professional organization for Enterprise Architects. Its goals are to increase job opportunities for all of its members and increase their market value by advancing professional excellence, and to raise the status of the profession as a whole.

BCS Data Management Specialist Group (DMSG)

The BCS Data Management Specialist Group (DMSG) helps Data Management professionals support organisations to achieve their objectives through improved awareness, management, and responsible exploitation of data.
We run several events each year whose focus areas include:

  • The benefits of managing data as an organisational asset
  • Skills for exploitation of data
  • Data governance as a ‘Business As Usual’ activity
  • Compliance with legislation, particularly that relating to data protection, data security and ethical usage of data
Our audience is anyone with an interest in the benefits to be gained from data, including: Chief Data Officers (CDOs); Senior Information Risk Officers (SIROs); data managers/stewards; data governance officers; data protection/security advisors; data scientists; and business/data/database analysts.

DGPO

The Data Governance Professionals Organization (DGPO) is an international, non-profit, vendor-neutral association of business, IT and data professionals dedicated to advancing the discipline of data governance. The DGPO provides a forum that fosters discussion and networking for members and seeks to encourage, develop and advance the skills of members working in the data governance discipline.

EDM COUNCIL

About the EDM Council
The EDM Council is a neutral business forum founded by the financial industry to elevate the practice of data management as a business and operational priority. The prime directive is to ensure that all consumers (business and regulatory) have trust and confidence that data is precisely what is expected, without the need for manual recalculation or multiple data transformations. There are four programs of the Council:

  • Data Content Standards (FIBO): the standards-based infrastructure needed for operational management (identification, semantic language of the contract, classification). We own the industry ontology for financial instruments and entity relationships and make it available as an open source standard.
  • Data Management Best Practices (DCAM): the science and discipline of data management from a practical perspective (data management maturity, data quality, benchmarking).
  • Data Implications of Regulation: translating the legislative objectives of transparency, financial stability, compressed clearing and cross-asset market surveillance into regulatory objectives and practical reporting requirements.
  • Business Network: a global meeting ground, CDO Forum and mechanism for sustainable business relationships.
There are 135 corporate members of the Council (http://www.edmcouncil.org/councilmembers). We are governed by a board of 24 (http://www.edmcouncil.org/board). For more information visit www.edmcouncil.org.

Media Partners

The Data Governance Institute

The Data Governance Institute (DGI) is the industry’s oldest and best-known source of in-depth, vendor-neutral Data Governance best practices and guidance. Since its introduction in 2004, hundreds of organizations around the globe have based their programs on the DGI Data Governance Framework and supporting materials. www.DataGovernance.com
Follow us on Twitter: @DGIFramework (https://twitter.com/DGIFramework)
Connect with us on LinkedIn: https://www.linkedin.com/company/480835

ECCMA

Formed in 1999, the Electronic Commerce Code Management Association (ECCMA) has brought together thousands of experts from around the world, providing a means of working together in a fair, open and extremely fast internet environment to build and maintain global, open-standard dictionaries used to unambiguously label information without losing meaning. ECCMA works to increase the quality and lower the cost of descriptions through developing International Standards. ECCMA is the original developer of the UNSPSC, the project leader for ISO 22745 (open technical dictionaries and their application to the exchange of characteristic data) and ISO 8000 (information and data quality), as well as the administrator of the US TAG to ISO TC 184 (Automation systems and integration), TC 184 SC4 (Industrial data) and TC 184 SC5 (Interoperability, integration, and architectures for enterprise systems and automation applications), and the international secretariat for ISO TC 184/SC5. For more information, please visit www.eccma.org.

IQ International

IQ International (abbreviated as IQint), the International Association for Information and Data Quality, is the professional association for those interested in improving business effectiveness through quality data and information. All, including full-time practitioners, those impacted by poor data and information quality, and those who just want to learn more, are welcome!

IT-LATINO.NET

IT-latino.net is the most important online Hispanic IT media network. With more than 120,000 registered users, we have become an important online IT business forum, organizing daily webinars and conferences on different technology issues. We regularly inform a strong IT community on both sides of the Atlantic: Spain and Latin America.

Modern Analyst

ModernAnalyst.com is the premier community and resource portal for business analysts, systems analysts, and other IT professionals involved in business systems analysis. Find what you need, when you need it. The ModernAnalyst.com community provides Articles, Forums, Templates, Interview Questions, Career Advice, Profiles, a Resource Directory, and much more, to allow you to excel at your work. From junior analysts to analysis managers, whether you integrate off-the-shelf products, perform systems analysis for custom software development, or re-engineer business processes, ModernAnalyst.com has what you need to take your career to the next level.

Technology Evaluation Centers

Technology Evaluation Centers (TEC) helps organizations choose the best enterprise software solutions for their unique needs—quickly and cost effectively. With detailed information on over 1,000 solutions in the world’s largest vendor database, TEC delivers a broad range of evaluation and selection resources to help ensure successful software selection projects. As impartial software evaluators since 1993, TEC’s expert team of analysts and selection professionals are involved in thousands of software selection projects every year. The TEC newsletter goes out to 920,000 subscribers and is available in 4 languages.  Visit TEC: www.technologyevaluation.com.  
Subscribe to the TEC Newsletter: http://www.technologyevaluation.com/newsletter-subscribe/

Silicon UK

Silicon UK is the authoritative UK source for IT news, analysis, features and interviews on the key industry topics, with a particular emphasis on IoT, AI, cloud and other transformative technologies. The site is your guide to the business IT revolution, offering resources such as jobs, whitepapers and downloads alongside its coverage. Stay informed: register for the newsletters.

Via Nova Architectura

A number of thought leaders in the area of business and IT architecture have set up a digital magazine on architecture: Via Nova Architectura. Although it started as an initiative within the Netherlands, the magazine should reach all those interested in the area of architecture, wherever they live. Via Nova Architectura aims to provide an accessible platform for the architecture community and is meant to be the primary source of information for architects in the field. The scope of Via Nova Architectura is “digital” architecture in the broadest sense of the word: business architecture, solution architecture, software architecture, infrastructure architecture or any other architecture an enterprise may develop to realize its business strategy.