Fees

  • 1 Day: £795 + VAT (£159) = £954
  • 2 Days: £1,245 + VAT (£249) = £1,494
  • 3 Days: £1,595 + VAT (£319) = £1,914
  • 4 Days: £1,945 + VAT (£389) = £2,334
Group Booking Discounts

  • 2-3 delegates: 10% discount
  • 4-5 delegates: 20% discount
  • 6+ delegates: 25% discount
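
For planning purposes, the arithmetic above can be captured in a few lines of code. This is an illustrative sketch only: it assumes the group discount applies to the ex-VAT fee and that VAT is charged at 20%, neither of which is confirmed here – check actual terms with the organisers.

```python
# Illustrative fee calculator for the tables above.
# Assumptions (not confirmed by the organisers): the group discount
# applies to the ex-VAT fee, and VAT is charged at 20%.

FEES = {1: 795, 2: 1245, 3: 1595, 4: 1945}  # GBP per delegate, ex VAT

def group_discount(delegates: int) -> float:
    """Return the discount rate from the group booking table."""
    if delegates >= 6:
        return 0.25
    if delegates >= 4:
        return 0.20
    if delegates >= 2:
        return 0.10
    return 0.0

def total_cost(days: int, delegates: int, vat_rate: float = 0.20) -> float:
    """Total for the whole group, including VAT, in GBP."""
    fee = FEES[days] * delegates
    fee *= 1 - group_discount(delegates)
    return round(fee * (1 + vat_rate), 2)

# Example: 3 delegates for 2 days each:
# 3 x £1,245 = £3,735, minus 10% = £3,361.50, plus 20% VAT = £4,033.80
print(total_cost(days=2, delegates=3))  # 4033.8
```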

Venue

  • 22 Portman Square
  • London W1H 7BG
  • UK

Agenda

Monday Day 1 - November 7
Morning Workshops
09:30-12:45
Designing the Logical Data Warehouse

The classic data warehouse architecture has had a long and successful run, but we're starting to stretch its abilities to the limit. The logical data warehouse is a modern alternative that offers several practical benefits: it's more agile, it makes adoption of big data easier and more seamless, it simplifies the management of self-service BI, it can more easily exploit new data storage technologies such as Hadoop and NoSQL, and it is better suited to operational BI applications. Mature technology in the form of data virtualization servers exists to develop a logical data warehouse; products from Cisco, Denodo, and Red Hat have proven that large BI systems can be developed using data virtualization. This workshop explains the architecture of a logical data warehouse, discusses numerous tips, tricks, and guidelines for designing and developing one, and teaches a structured approach for migrating to a logical data warehouse.
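
To make the core idea concrete: a data virtualization layer leaves data in the source systems and combines it only at query time, instead of copying it into a physical warehouse first. The toy sketch below illustrates this in plain Python (it is not any vendor's API, and the source names and columns are invented): an in-memory SQLite database stands in for an operational system, and a DataFrame stands in for a big-data store.

```python
# Toy illustration of a "virtual view": data stays in the source systems
# and is combined only when a query arrives. Names are invented.
import sqlite3
import pandas as pd

# Source 1: an operational relational database (stand-in: in-memory SQLite).
crm = sqlite3.connect(":memory:")
crm.execute("CREATE TABLE customers (id INTEGER, name TEXT, country TEXT)")
crm.executemany("INSERT INTO customers VALUES (?, ?, ?)",
                [(1, "Acme Ltd", "UK"), (2, "Globex BV", "NL")])

# Source 2: a big-data store (stand-in: a DataFrame; in practice this
# might be reached through a Hadoop/Hive or NoSQL connector).
orders = pd.DataFrame({"customer_id": [1, 1, 2], "amount": [100.0, 250.0, 75.0]})

def virtual_revenue_view() -> pd.DataFrame:
    """Resolve the 'view' at query time by pulling from both sources."""
    customers = pd.read_sql("SELECT id, name, country FROM customers", crm)
    revenue = orders.groupby("customer_id", as_index=False)["amount"].sum()
    return customers.merge(revenue, left_on="id", right_on="customer_id")

print(virtual_revenue_view()[["name", "country", "amount"]])
```

A real data virtualization server adds what this toy version lacks: query push-down, caching, security, and a shared semantic layer.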

The Key to Big Data Modelling: Collaboration

Some claim that in the age of Big Data, data modelling is less important or even not needed. However, with the increased complexity of the data landscape, it is actually more important to incorporate data modelling to understand the nature of the data and how they are interrelated. To do this effectively, our approaches to data modelling need to adapt to this complex environment. One of the key data modelling issues is how to foster collaboration between new groups, such as data scientists, and traditional data management groups. There are often different paradigms, and yet it is critical to have a common understanding of data and semantics between different parts of an organization. In this presentation, Len Silverston will discuss:

  • How Big Data has changed the enterprise landscape and affected data modelling
  • How to conduct data modelling in a more ‘agile’ way for Big Data environments
  • How to collaborate effectively within an organization, even with very different perspectives

Len Silverston, President, Universal Data Models

Len Silverston

President, Universal Data Models

Len Silverston is a best-selling author, consultant, and speaker with over 30 years of experience helping organizations in various data governance and data management programs. Mr. Silverston is an internationally acclaimed expert and thought leader in the fields of data governance, data modeling, data management, and in the human dynamics of integrating information. He has helped many organizations successfully implement data governance programs. For example, from 2010 through 2012, he guided and provided data governance consulting for his client who won the 2012 Data Governance Best Practice Award.

Data Science in Action

Data Science has been called ‘The Sexiest Job of the 21st Century’ and is a hot topic in the world of analytics. It promises breakthrough insights and skyrocketing ROI percentages, delivered by teams of extremely smart people using state-of-the-art technology. At least, that's what analyst firms and trade magazines want us to believe. The reality is that many organisations are just starting out with their first ‘Big Data Labs’ or are struggling to move beyond technology-oriented proofs of concept. The challenge is how to put the business in the driver's seat and use technology as an enabler rather than for its own sake. This session will look at concepts as well as practical approaches for moving towards becoming a data- and analytics-driven organisation. The following topics will be covered:

  • Concepts: what is Data Science and how is it different from BI?
  • Use Cases: Data Science success stories from a variety of sources
  • Skills: can my BI team become a Data Science team? (and how?)
  • Process: how do you facilitate Data Science within your organisation?
  • Tools & Technology: what are the technical requirements for a successful Data Science team?

After this session you will be able to look beyond the hype and will have a good understanding of the benefits of adopting data science within your organisation.

Jos van Dongen, Principal Consultant, Tholis Consulting

Jos van Dongen

Principal Consultant, Tholis Consulting

Jos van Dongen is a consultant, author, speaker and analyst. Jos has been involved in software development, business intelligence (BI) and data warehousing since 1991 and is the (co)author of three highly acclaimed (open source) BI books and numerous magazine articles. Over the past years he has been the lead architect for a wide collection of analytical solutions in a variety of organisations, both profit and non-profit. Jos speaks regularly at national and international conferences about new developments in BI, Analytics and Data Science. After being an independent consultant for over 15 years he joined SAS in May 2013 as a principal consultant. In his current role he advises customers on various topics such as Data Visualization, Data Science, Machine Learning and Internet of Things. Follow Jos on Twitter: @josvandongen.

Master Data Management

This workshop provides a practical, experience-based guide to implementing successful MDM. It covers all aspects of MDM, from justification to architecture to data management and project management using agile principles. The class presents the common process, organisational and architectural focus for building strategies and implementing master data management programs of the kind that have consistently, over many years, improved productivity and performance for clients, including global giants.

  • How MDM provides benefits to an organisation and how to justify an MDM project
  • The various architectural styles of MDM
  • What MDM provides for hierarchy management and data governance
  • How to go to market for an MDM tool
  • All the roles and responsibilities on an MDM project
  • How to manage the organisational change that occurs with MDM projects

William McKnight, Consultant, McKnight Consulting Group

William McKnight

Consultant, McKnight Consulting Group

William McKnight is President of McKnight Consulting Group (www.mcknightcg.com).  He is an internationally recognized authority in information management.  His consulting work has included many of the Global 2000 and numerous midmarket companies.  His teams have won several best practice competitions for their implementations and many of his clients have gone public with their success stories.  His strategies form the information management plan for leading companies in various industries.  William is author of the book “Information Management: Strategies for Gaining a Competitive Advantage with Data”.  William is a very popular speaker worldwide and a prolific writer with hundreds of articles and white papers published. William is a distinguished entrepreneur, and a former Fortune 50 technology executive and software engineer.  He provides clients with strategies, architectures, platform and tool selection, and complete programs to manage information. Follow William on Twitter: @williammcknight.

Carrying out a Data/Information Maturity Assessment

Knowing how mature your organization is and how ready it may be for a data and information management program should be high on your list of priorities. After all, we have all heard "you don't know what you don't know" before, haven't we? But (and this is the big one) how do you assess this maturity? Is there a magic bullet, a special methodology or a trusted approach that everyone uses? One that you can go and buy somewhere? Sue says she doesn't know of one, and she has been working with clients for a number of years trying to perform that miracle. Sometimes it has worked and sometimes not. However, challenges are her bread and butter!

Join Sue on a journey to assess your maturity for Data and Information. But be prepared: there is work to be done and you will walk away at the end of the workshop with a maturity assessment of your organization (or if you are a consultant with that of a client of your choice!). You will also walk away knowing that you can do this assessment again and again and again, over as long a period as you want to. This will, of course, allow you to measure your success! Please bring along a PC, a tablet, or – if all else fails – a Smartphone. You will be taking the assessment.

Key Learnings:

  • How important it is to do a regular maturity assessment
  • Take home the actual tool to do this
  • Some new skills on how to get people to do the assessment

Getting Our Heads into the Clouds: Infinitely Scalable Data Management

The cloud is not the future of enterprise computing: it’s the NOW of enterprise computing. We are in the midst of a data awareness renaissance, and our organizations are demanding that we build analytics capabilities to drive future success. Cloud technologies are part of the answer, but to fully respond to these challenges we need to bring new mindsets and techniques to data management. It’s up to us data professionals to bring these capabilities to our businesses. If your business is not already there, it needs to be. Come to this session to learn how to get to the cloud, and how our data management practices must evolve to harness the power.

This session will cover a pragmatic foundational perspective on hot data terms you have heard about, like:

  • Big Data – a relative scale challenge that has always existed
  • Data Lakes – optimizing data structuring energy across place, time, and purpose
  • The Cloud – forget what you knew about it three years ago
  • Private Cloud – like buying an airplane versus flying commercial
  • Internet of Things – this will make current Big Data scale laughable in the next few years
  • Data Science – a capability far too important to be left entirely to data scientists
  • Why the cloud makes some new things possible, many things easier, and a few things much harder – and how we need to adapt our data management practices in response
  • Why business impact is the only thing that really matters
  • What all this means for business as usual, and how to build a high-performance organization driven by data in the cloud.

Anthony Algmin, Chief Data Officer, Uturn Data Solutions

Anthony Algmin

Chief Data Officer, Uturn Data Solutions

Anthony J. Algmin helps businesses use data to get better at what they do. He is currently the Chief Data Officer at Uturn Data Solutions, a company specializing in helping businesses move to the cloud. Learn more at www.uturndatasolutions.com. Previously, Anthony was the Chicago Transit Authority’s first Chief Data Officer, developing data analytics capabilities to help the CTA improve transit services for millions of Chicagoans. Anthony also spent several years as a data strategy and management consultant with experience across many industries. His early career was in the capital markets industry performing technical and management roles to improve business performance with data and analytics. Anthony frequently speaks at national and local events, and contributes to the data governance and data management communities. He has a BA in Business Administration from Illinois Wesleyan University and an MBA from the Kellogg School of Management at Northwestern University.

What is Quality Data, and Why Do We Need It?

Even in companies where some people are already convinced, selling the idea of investing in something that seems as trivial as data quality is not easy. So, where do you start? How can you measure quality with little or no tooling at hand? What arguments can you use with your stakeholders? And once your stakeholders are on board, where do you begin? Sophie will give examples of companies that started from scratch, and will walk through a typical DQ roadmap with solutions in the three aspects that impact the quality of your data: people, governance and technology. She'll also bring some of QuaData's surveys so you can assess the situation of your own company. You will leave this workshop with at least one feasible first or next step for your own company, and plenty more with which to build your roadmap to the right level of data quality. You will learn:

  • How to convince your stakeholders of the importance of DQ
  • How to gather basic measurements about the quality of your data (a minimal sketch follows this list)
  • Where to start and what to consider in your roadmap
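
As a taste of what basic measurements with very little tooling can look like, here is a minimal sketch in Python with pandas. It is not QuaData's methodology, and the column names, data and rules are invented for the example; real profiling would draw its rules from business definitions.

```python
# Minimal data-quality profiling: completeness, key uniqueness and a
# crude validity rule. Column names and data are invented.
import pandas as pd

df = pd.DataFrame({
    "customer_id": [1, 2, 2, 4],
    "email": ["a@example.com", None, "not-an-email", "d@example.com"],
})

completeness = df.notna().mean()                       # share of non-null values per column
duplicate_keys = df["customer_id"].duplicated().sum()  # repeated business keys
valid_email = df["email"].str.contains("@", na=False).mean()  # crude validity check

print(f"Completeness per column:\n{completeness}")
print(f"Duplicate customer_ids: {duplicate_keys}")
print(f"Share of plausible emails: {valid_email:.0%}")
```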

Sophie Angenot, Managing Partner, QuaData

Sophie Angenot

Managing Partner, QuaData

Sophie Angenot has more than a decade of experience in Data Quality, built at companies like Dun & Bradstreet and Bisnode Belgium. She is also the president of DQA, the Belgian Data Quality Association, which she founded in 2012. DQA's goal is to bring together business and IT people who care about and need good data, and to share experiences and knowledge through seminars and a yearly congress in October, where a Data Quality Award is also granted to support companies that invest in Data Quality. QuaData was her next challenge: she founded the business consultancy in 2014. As Managing Partner, Sophie works together with the QuaData consultants to help customers become champions in Data Quality, Master Data Management and Data Governance.

Big Data Platform Fundamentals – New Infrastructure in Your Analytical Ecosystem

This session provides an introduction to core platforms for big data analytics and walks through the main components.

  • The new multi-platform analytical ecosystem
  • New platforms in the logical data warehouse
  • An introduction to Hadoop and the Hadoop Stack
  • What are HDFS, MapReduce, Pig & Hive?
  • What is Apache Spark? In-memory, massively parallel analytics (see the sketch after this list)
  • Data Warehouse offload – ETL on Hadoop and Spark
  • Accessing Hadoop data using SQL on Hadoop
  • Security on Hadoop
  • The Big Data Marketplace
    • Hadoop distributions
    • Big Data Appliances
    • Streaming Analytics
    • NoSQL databases
  • The Cloud deployment options
  • Creating a multi-platform analytical ecosystem
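
To give one of the items above some texture, here is a minimal PySpark sketch of in-memory, massively parallel analytics with SQL-style access on the same engine. It assumes a local Spark installation (for example via `pip install pyspark`); the file path and column names are invented for illustration.

```python
# Minimal Spark example: load a CSV, register it as a SQL view and
# aggregate it. The path and columns are invented; in a Hadoop cluster
# the path could point at HDFS, e.g. "hdfs:///data/sales.csv".
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("intro-example").getOrCreate()

sales = spark.read.csv("sales.csv", header=True, inferSchema=True)

# "SQL on Hadoop"-style access: the same engine answers SQL queries.
sales.createOrReplaceTempView("sales")
top_products = spark.sql("""
    SELECT product, SUM(amount) AS revenue
    FROM sales
    GROUP BY product
    ORDER BY revenue DESC
""")
top_products.show()

spark.stop()
```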

Mike Ferguson, Managing Partner, Intelligent Business Strategies

Mike Ferguson

Managing Partner, Intelligent Business Strategies

Mike Ferguson is Managing Director of Intelligent Business Strategies Limited. An analyst and consultant, he specialises in business intelligence, data management and enterprise business integration. With over 34 years of experience, Mike has consulted for dozens of companies on business intelligence/corporate performance management strategy and technology selection, big data, enterprise architecture, business integration, MDM and data integration. He has spoken at events all over the world and written numerous articles. Formerly he was a principal and co-founder of Codd and Date Europe Limited – the inventors of the Relational Model – a Chief Architect at Teradata on the Teradata DBMS, and European Managing Director of Database Associates. He teaches master classes in Big Data Analytics, New Technologies for Data Warehousing and BI, Mobile and Collaborative BI, Operational BI, Enterprise Data Governance, Master Data Management, Data Integration and Enterprise Architecture. Follow Mike on Twitter @mikeferguson1.

Afternoon Workshops
14:00-17:15
Introducing Agile Business Intelligence Sustainably: Implement the Right Building Blocks in the Right Order

“We now do Agile BI too” is often heard in today's BI community. But can you really “create” agility in Business Intelligence projects? This workshop shows that Agile BI doesn't necessarily start with the introduction of an iterative project approach. An organisation is well advised to first establish the necessary foundations in organisation, business and technology in order to become capable of an iterative, incremental project approach.

In this workshop you will learn which building blocks you need to consider, and you will see a meaningful sequence in which to implement them. Selected aspects such as test automation, BI-specific design patterns and the Disciplined Agile Framework will be explained in practical detail. Top 3 Takeaways:

  • Identify what building blocks are necessary to introduce Agile BI in a sustainable way
  • Learn about the sequence of implementation of the building blocks for Agile BI
  • Become aware of typical trap doors when starting Agile BI

Raphael Branger, Senior BI Solution Architect, IT-Logix AG

Raphael Branger

Senior BI Solution Architect, IT-Logix AG

Raphael Branger holds a Master of Arts in Information Management and is a Certified Disciplined Agilist. Today he works as Chief Knowledge Officer and Senior Solution Architect at IT-Logix AG. He has more than fourteen years of experience in the area of Business Intelligence (BI) and data warehousing. His current focus is around BI specific requirements engineering as well as the adaption of agile methods in the context of BI. His technical background is mainly (SAP) BusinessObjects, the Microsoft BI stack as well as the data warehouse automation toolkit of WhereScape. Raphael is a regular trainer and speaker at national and international BI conferences. He is actively contributing to the BI community on his blog (http://rbranger.wordpress.com) and you can follow him on Twitter (@rbranger).

Modeling for SQL and NoSQL

This workshop shares experiences and lessons learned in the trenches of projects for Lufthansa Airlines, with references to previous projects at a telco and a shipping company. Data models are seen by many as an unnecessary legacy, but those based on conceptual and logical models offer flexibility and future-proofing: they are key to integrating and scaling data and queries in both relational and NoSQL schemas. For the agile organisation the models provide a framework that reduces technical debt and the concerns over duplication and lack of consistent progress in developing enterprise capabilities. Generic industry data models can be valuable, but there is much more to modelling than producing a diagram of entities and relationships. Implementation requires that generic structures are validated and customised, and involves alignment of all stakeholders and team members, a process that is fraught with pitfalls. The speaker will share experiences and recommend key best practices in this session, including:

  • Successful Conceptual Models can and should be memorised as drivers for detailed models – larger models should be abstracted.
  • Conceptual Models must be agreed at enterprise level by the business leaders and enterprise architects and then serve as the cornerstones of Logical Models.
  • Successful Conceptual and Logical Data Models establish a common language for the Enterprise, Data Governance and all Analytics

In the 3-hour workshop Kenneth will detail the recommended process for validating, customising and implementing a generic industry LDM.

Kenneth Hansen, Managing Consultant, Analyticdomain

Kenneth Hansen

Managing Consultant, Analyticdomain

Upon reaching a Group Management position on the business side of Barclays Bank some 20 years ago, Kenneth Hansen established a data analytics team, including predictive analytics, to assist with decisioning. That was the start of his migration to technical disciplines. After achieving certification as a Master with Teradata, he specialised for 10 years in modelling for the enablement of analytics, in a dozen countries across EMEA, covering Banking, Insurance, Retail, Shipping, Telco and, most recently, a major Airline. He has worked with both Teradata and IBM, and directly with end customers, architecting and implementing enterprise data structures for the enablement of digital processes. Follow Kenneth on Twitter: @Ken_Hansen.

Getting Ready for the General Data Protection Regulation – An EIM2.0™ Approach

The General Data Protection Regulation comes into effect on the 25th May 2018. While at the core the fundamental objectives remain the same, the GDPR introduces a number of significant changes for organisations with regard to how they need to plan for, manage, govern, and apply data that relates to identified or identifiable individuals. The implications of GDPR are far-reaching, as it has set a new benchmark for global data privacy standards, not least with its extra-territorial effect, which means non-EU-based organisations processing data of EU residents will have to comply. In this intensive session, internationally recognised expert Daragh O Brien will:

  • Provide delegates with a pragmatic overview of the key provisions of the GDPR
  • Explain the penalties and enforcement frameworks that will exist under GDPR (and which are being introduced already in certain EU Member States)
  • Outline the impact of recent EU Court of Justice rulings on Data Privacy laws in the EU and elsewhere, and how that impacts the governance, planning, and management of information in your organisations
  • Discuss the other changes in EU Data Privacy laws that are emerging, and the impact on international data privacy regulation
  • Explore how agile principles and methods can be applied to getting ready for GDPR
  • Map out how frameworks such as the DMBOK Wheel and the Zachman Framework, and methodologies and principles from Data Governance and Data Protection can help with implementation of GDPR, including the execution of Privacy Impact Assessments
  • Introduce the concept of EIM2.0™ – Ethical Information Management and how this can support and enable GDPR governance, particularly in the context of the Risk based model for Regulation

Daragh O Brien, Castlebridge  

Daragh O Brien

Castlebridge  

Recently rated the 24th most influential person in Information Security worldwide on Twitter (http://www.onalytica.com/blog/posts/data-security-top-100-influencers-and-brands/), Daragh O Brien, FICS, is a leading consultant, educator, and author in the fields of Information Privacy, Governance, Ethics, and Quality. After over a decade in a leading telco, Daragh now works with clients in a range of sectors on a range of Information Management challenges. Daragh is a Fellow of the Irish Computer Society and a Privacy Officer for DAMA International. He teaches Data Privacy Law and Practice at the Law Society of Ireland. Castlebridge is a commercial partner of the Adapt Centre in Trinity College Dublin and collaborates with the Insight Centre for Data Analytics, Europe's largest analytics research group. Follow Daragh on Twitter @cbridgeinfo.

From Vision to Capabilities: Effective Building Blocks for Information Centric Organisations

Keeping control over the design and architecture activities performed within your organisation, without creating a top-heavy governance organisation, is quite a challenge. Justifying why certain choices have been made and coming up with the proper reasons to invest in a particular solution can be quite tricky, and the latter is often done in an ad-hoc and unstructured way.

One of the current hypes is the concept of capabilities, which can be loosely defined as "things that you should be able to do". The idea behind the capability approach makes sense, as it starts from the assumption that when you describe what you want to do and why, you end up with the right mix to reach your objectives. This brings us to the more fundamental question of the underlying drivers. The workshop will take you through the different steps of translating a vision into policies, principles and standards that guide your reference architectures and blueprints. We will cover the framework that connects all the different elements into a practical toolkit that underpins your reference architecture and transforms standalone statements such as "information is an asset" into usable steering for your designs and architecture.

  • Understanding the policy framework concept
  • Linking vision to actions
  • Outcome driven capabilities
  • Reference architectures for common use cases: MDM, Analytics, BI, Data Quality

Jan Henderyckx, Managing Partner, Inpuls     

Jan Henderyckx

Managing Partner, Inpuls     

Jan Henderyckx is a highly rated consultant, speaker and author who has been active in the field of Information Management and Relational Database Management since 1986. He has presented, moderated and taught workshops at many international conferences and User Group meetings worldwide. Jan's experience, combined with information architecture and management expertise, has enabled him to help many organisations optimise the business value of their information assets. He has a vision for innovation and the ability to translate visions into strategy, with a verifiable track record in diverse industries including non-profit, retail, financial, sales, energy and public entities. He has contributed to better-streamlined and higher-yielding operations for some of the leading businesses through a combination of creativity, technical skills, initiative and strong leadership. He is a Director of the Belgium and Luxembourg chapter of DAMA (Data Management Association) and runs the Belgian Information Governance Council. He has published articles in many leading industry journals and has been elected to the IDUG Speakers Hall of Fame, based upon numerous Best Speaker awards. Jan is Chair of the Presidents Council of DAMA International.

Creating Better Business Diagrams

Data management professionals need to create diagrams to explain concepts. Unlike written communication, where everyone is taught grammar and how to write clearly, there is almost no training in creating effective diagrams for business communication.

This workshop explains a set of diagramming techniques based on Gestalt principles. The focus is on creating diagrams that are easily understood. By the end of the workshop you will be able to formally assess a diagram and understand how to improve it to make its message clear and compelling. The same techniques will also be applied to tabular information to improve its clarity. The workshop has a strong emphasis on doing exercises to reinforce the concepts. No drawing skills are required.

Key learnings:

  • Techniques for assessing and improving business diagrams
  • Understand how to apply these techniques to other communication areas such as tables of data
  • Ensuring the communication resonates with the target audience

Glen Bell, Director, Visual Explanations

Glen Bell

Director, Visual Explanations

Glen Bell has worked in data management for over 30 years. He is an independent consultant and has worked for a variety of organisations in Canada, Europe, Malaysia, Singapore, New Zealand and throughout Australia. He holds a Master of Business, Information Technology Management from the University of Technology Sydney and a Bachelor of Science, Computing and Mathematics from the University of Queensland. He also has CDMP and CBIP certifications, both at mastery level, as well as being a Certified Data Vault Data Modeller. Glen is president of DAMA Sydney.

All Roads Lead to Data Funding

Many Enterprise Data Management programs struggle for funding, especially for work that doesn't fit neatly into IT projects. And yet, in just 2½ years, IFC (the private sector arm of the World Bank Group) evolved from "invisible" data needs to a well-funded, interdependent set of projects and work plans. In this highly interactive workshop, the team behind the funding strategy coaches you through the process of finding your own path to increased funding. The workshop explores:

  • 5 funding models, and when each might be deployed
  • Stakeholder analysis, and getting “a seat at the table” for the right people in the right venues
  • A Three Pillar Model that brings visibility to work that business and IT stakeholders may have been blind to
  • How and when to involve Enterprise Architecture, Information Security, and BI
  • The value of inserting data reviewers into IT Governance, Lifecycle Management, and IT Portfolio Management

Gwen Thomas, Corporate Data Advocate, IFC/World Bank Group

Gwen Thomas

Corporate Data Advocate, IFC/World Bank Group

Gwen Thomas is a Data Governance pioneer, helping to define and evangelize the field.  Founder of the Data Governance Institute (DGI) and primary author of the DGI Data Governance Framework and guidance materials found at www.datagovernance.com, she has influenced hundreds of programs around the globe. In the spring of 2013, Gwen joined the International Finance Corporation, part of the World Bank Group, as their Corporate Data Advocate. A founding member of the International Society of Chief Data Officers, Gwen was named by Information-Management.com as one of “17 Women in Technology You Should be Following on Twitter.” Follow Gwen on Twitter: @gwenthomasdgi.

Jennifer Trotsko, Head, Data Governance Office, IFC/World Bank Group

Jennifer Trotsko

Head, Data Governance Office, IFC/World Bank Group

Jennifer Trotsko is Head of the Data Governance Office for the International Finance Corporation (IFC, the private sector arm of the World Bank Group). Ms. Trotsko has been a data governance professional for over 12 years and was responsible for the establishment of the Information Quality function at IFC in 2001. Currently she is leading IFC on an enterprise-wide master data management implementation, customer quality improvement program, and data access governance initiative, all of which sit at the very center of the World Bank Group’s strategic priorities.  Ms. Trotsko started her career with IFC in Moscow in 1997 as a Project Manager for IFC’s Russian Electricity Sector Reform Project, where she advised clients on strategies for de-monopolizing the electricity sector.

Sana Al-Hajj, Manager, IFC/World Bank Group

Sana Al-Hajj

Manager, IFC/World Bank Group

Sana Al-Hajj has over 20 years of experience in the information management and technology field. Currently she manages the Service Quality team at the International Finance Corporation (IFC – the private sector arm of The World Bank Group), overseeing the quality of the IT program, the Program Management Office, IT Portfolio Management and Service Level Agreements, Quality Assurance, Data Management, and coordination with other IT units across the World Bank Group. Prior to this role, she managed the delivery of programs for HR and Controllers and served as the Chief Information Security Officer at IFC. Sana is the co-chair of the Engineer Alumni Association and the President of the International College Alumni Association DC Chapter. She has a D.Sc. & M.Sc. in Information Management from George Washington University, and a BA in Business Administration from the American University of Beirut, Lebanon.

Building Professional Competencies for Information Management Practitioners

Considering a career in Information Management?
Already well established in the field?
Want to build an information management practice in your organisation?

It’s not only the “Information Management” skills that are essential.  This workshop will address the key issues of:

  • What key capabilities are necessary (and desirable) for IM professionals
  • What behaviours and attitudes should be exhibited
  • What are the roles necessary in a successful Information Management practice
  • What are the skills and skill levels required to fulfil those roles,
  • What are the core services necessary to permeate an Information Management practice & how should these mature, and
  • Does certification help?

Taught by a DAMA Award winner, DAMA Fellow, & President of DAMA UK, this workshop is based upon real practical experience gained over 35 years assisting global organisations big & small, and will help individuals & organisations plan their Information Management development.

Chris Bradley, Information Strategist, Data Management Advisors Ltd

Chris Bradley

Information Strategist, Data Management Advisors Ltd

Christopher Bradley has spent 35 years in the forefront of the Information Management field, working for leading organisations in Information Management Strategy, Data Governance, Data Quality, Information Assurance, Master Data Management, Metadata Management, Data Warehouse and Business Intelligence.   Chris is an independent Information Strategist & recognised thought leader.  Recently he has delivered a comprehensive appraisal of Information Management practices at an Oil & Gas super major, Data Governance strategy for a Global Pharma, and Information Management training for Finance & Utilities companies.  Chris guides Global organizations on Information Strategy, Data Governance, Information Management best practice and how organisations can genuinely manage Information as a critical corporate asset.  Frequently he is engaged to evangelise the Information Management and Data Governance message to Executive management, introduce data governance and new business processes for Information Management and to deliver training and mentoring.  Chris is Director of the E&P standards committee “DMBoard”, an officer of DAMA International, an author of the DMBoK 2.0, a member of the Meta Data Professionals Organisation (MPO) and a holder at “master” level and examiner for the DAMA CDMP professional certification. Chris is an acknowledged thought leader in Data Governance, author of several papers and books, and an expert judge on the annual Data Governance best practice awards. Follow Christopher on Twitter @inforacer.

Big Data Technology and Use Cases

This workshop provides an approach for storing, managing and accessing data of very high volumes, variety or complexity. Storing large volumes of data from a large variety of data sources in traditional relational data stores is cost-prohibitive. And regular data modelling approaches and statistical tools cannot handle data structures with such high complexity. This seminar discusses use cases and new types of data management systems based on Hadoop and NoSQL database management systems and alternative programming models and access methods.

  • Big Data Overview and Common Themes
  • The main characteristics of Hadoop and NoSQL databases
  • Differences between a distributed database and relational databases
  • NoSQL data models: key-value, columnar, document (see the sketch after this list)
  • Use cases for big data, with real-world examples from organizations in production today
  • The placement of big data in information architecture
  • Scale up versus scale out
  • MapReduce and Spark
  • Graph Stores: All about Relationships
  • Enablers for big data in the enterprise
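
As a rough companion to the NoSQL data models item above, the snippet below shows the same customer record shaped for each model. The content is invented for illustration; real stores (e.g. Redis, Cassandra, MongoDB) each add their own APIs, partitioning and storage details.

```python
# One customer record under three NoSQL data models (illustrative only).

# Key-value: an opaque value looked up by key; the store does not
# understand the value's structure.
key_value = {"customer:42": '{"name": "Acme Ltd", "country": "UK"}'}

# Document: a structured, queryable document that can nest.
document = {
    "_id": 42,
    "name": "Acme Ltd",
    "country": "UK",
    "orders": [{"order_id": 7, "amount": 100.0}],
}

# Columnar (wide-column): values grouped into column families per row key.
columnar = {
    "row_key": 42,
    "profile": {"name": "Acme Ltd", "country": "UK"},  # column family 1
    "activity": {"last_order": "2016-10-01"},          # column family 2
}
```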

Tuesday Day 2 - November 8
08:00-09:00 Registration
09:00-09:10 Conference Opening
KEYNOTE: Leading Digital Strategy – Enterprise Data as an Asset

Digital commerce has shown tremendous growth in recent years, while global penetration is still below 5% of total sales. Mobile commerce, business intelligence and smart infrastructure are key drivers for companies, consumers and new ways of doing business. However, simply implementing or using digital technologies is not enough. In general, effective digital strategies are less about acquiring and implementing the right technology than about reconfiguring your business to take advantage of the information these technologies enable. Companies must bring together a variety of digital technologies, integrated across people, process and functions, in order to achieve competitive advantage.

How do you harness data as an asset to prove that Digital strategy works? What are the elements that get the attention of senior leadership? What are the enablers?

  • Culture – Understanding Digital strategy & the importance of data and the freedom to experiment. “Failure is always an option” attitude
  • People – Knowledge, expertise, willingness to risk
  • Leadership – Empowerment, understanding, coaching, mentoring and risk taking
  • Purpose – Why do we do this?
  • Vision & strategy

Sakari will explain how EA drives results at Talenom, one of the largest authorized accounting firms in Finland, using the CDO MIS. The Talenom CDO Management Information System (MIS) focuses on the management of information systems to support efficient and effective strategic decision-making, including:

  • What decisions C-level leaders need to make
  • What different data elements they need to see
  • How the operation is steered with the data

The Talenom Digital Services business unit, headed by Sakari Jorma, is operated via direct connections between MIS capabilities & business operations. They operate in a conventional – one could argue "old-fashioned" – business area: accounting. An average accounting company in Finland has around a 5-7% level of digitalization when it comes to the handling of receipts, taxation, payroll data etc. So far one of Sakari's main focuses has been automating and digitalizing Talenom's main processes, leading to the digitalization of materials as well. Today they are one of the leading digital accounting companies. You will hear:

  • How EA data helps them to do this
  • What standardizations & governance elements are needed
  • The role of the Data Warehouse, BI and MDM in this

10:00-10:25 Break and Exhibits
10:25-11:15
Strategies for Consolidating Enterprise Data Warehouses and Data Marts into a Single Platform

As companies grow in their realization of the value of information as a strategic asset, a re-evaluation of information architecture maturity and capabilities becomes essential. This has led many to corral their unwieldy, expensive environment into a more manageable and cost-effective infrastructure that produces the elusive bankable company numbers and metrics. During this session, industry expert William McKnight will highlight key strategies leading-edge companies have adopted to reduce the complexity of their data warehouse environment for maximum efficiency.

  • Inefficient Information Architecture
  • Methods of Data Mart Consolidation
  • Many data marts, 1 data warehouse
  • Many data warehouses, 1 data warehouse
  • Choosing survival systems
  • Keys to Data Mart Consolidation Success

Data Management, Analytics and People – An Eternal Golden Braid

The role of Data Management is often shown within a pyramid, with analytical techniques – be these Statistical Modelling, Data Visualisation, Big Data or Business Intelligence – at the top, supported by much larger foundational areas of Data Strategy, Data Processes and Information Architecture. This view has many merits, but also two flaws. First, it omits an important factor in adding value: people and cultural change. Second, these three areas can be thought of as mutually reinforcing rather than as foundational and apex activities. I believe that the leverage of data to yield information, provide insights and drive action is best achieved via interplay between these three equally important areas. Attendees will learn:

  • The value of making Data Management more people-centric
  • Three mutually reinforcing areas are stronger than a pyramid structure
  • What bringing these three areas together means in practice

Peter Thomas, Head of Data Management Insurance, peterjamesthomas.com Ltd

Peter Thomas

Head of Data Management Insurance, peterjamesthomas.com Ltd

Peter James Thomas has held senior international roles at Chubb Insurance, De Beers, Greene King, Validus Holdings, XL Catlin, Bupa and Lloyds Banking Group. While he has an IT background, having spent the first 8 years of his career in a start-up software house, Peter now most frequently operates in the nexus between business, technology and change. He has filled the “Top Data Job” in a number of organisations from 2000 onwards. Peter has a strong background in strategy development, data management, analytics, business intelligence and data warehousing. He is generally focussed on helping companies to successfully navigate the data to information to insight to action journey. Peter is a regular speaker on these topics and blogs at www.peterjamesthomas.com.

Progressing an Enterprise Wide Data Accountability Structure From the Design Stage to Operational Effectiveness – Case Study from Yorkshire Building Society Group

Typically there are significant challenges in identifying and assigning accountability for data. These include a lack of industry standards and difficulty getting buy-in for the concepts, and the more complex the organisation, the bigger the challenges are. However, an accountability structure must be defined for data governance to become embedded and operationally successful. During this case study session the authors will provide insight into the journey at the Yorkshire Building Society Group, sharing lessons learned and practical advice. The session will cover:

  • How to approach the design of an accountability structure for data
  • Leveraging the development of a Data Dictionary to embed the accountability structure at YBS
  • Navigating out of the inevitable troughs that occur during the implementation journey

Ellie Fitzpatrick, Data Governance Manager, Yorkshire Building Society Group

Ellie Fitzpatrick

Data Governance Manager, Yorkshire Building Society Group

Ellie Fitzpatrick is the Data Governance Manager for the Yorkshire Building Society Group where she is responsible for leading the development and implementation of a Data Governance Framework. With over ten years’ experience in related roles within financial services and the public sector, Ellie focuses on the delivery of a practical, risk and value-driven approach to data governance. Follow Ellie on Twitter: @elliefitz.

Bethany Lancaster, Senior Data Governance Analyst, Yorkshire Building Society Group

Bethany Lancaster

Senior Data Governance Analyst, Yorkshire Building Society Group

Bethany Lancaster is the Data Quality and Information Lifecycle Management Lead within the Data Governance Team at Yorkshire Building Society Group. She is responsible for the design and implementation of a Data Quality Management Framework, working with Data Owners and Data Stewards to promote and embed proactive data quality management. Her responsibilities also include implementing an improved approach to data retention.

Turning the Telescope - Humans as Data Systems

This presentation gives a new insight into how we can gain a different and deeper understanding of the dynamics of many human groups – such as teams, organizations and even whole societies – by changing our perspective and viewing them as systems both containing and using information, and affected by the availability, flow, and quality of data within the system.

How is data used as a resource in the real world of human interaction? How do data silos, flows, errors and redundancies cause changes in behaviour and interactions? What could psychology learn from data management, and vice versa? We’ll be looking at these aspects, and more.

The presentation is designed to be engaging, and enjoyable – mixing a little light humour with real thought provoking ideas about both humans and data management in a unique context.

How is Big Data Changing the Paradigm of Sports?

During the last World Cup, three companies made predictions on the results of the fifteen matches in the final phase, demonstrating the ability of their advanced technology to predict the outcome of football matches. Microsoft and Baidu correctly predicted all fifteen results, while Google made only one mistake… How were they able to make such accurate predictions? They crunched and analysed large numbers of historic results – what we call "big data" – and used that analysis to make their successful predictions. It seems reasonable, therefore, to ask ourselves whether big data is changing the paradigm of the sports industry.

European football teams have started to use statistics much more over the last decade and the methods are becoming more and more sophisticated. The first club in the Premier League to do so was Bolton Wanderers, and just a few years later almost every club in the top flight is using statistics to monitor player performance. It therefore seems that big data is genuinely changing sport, but it is also changing the games that are based on sports, for instance fantasy sports. Key learning points:

  • How Big Data is changing the way we watch sports and interact with sports clubs
  • The importance of Big Data for skill games
  • The role of Big Data in traditional sports betting vs. fantasy sports betting

Valéry Bollier, CEO, OulalaGames Ltd

Valéry Bollier

CEO, OulalaGames Ltd

Valéry Bollier has over eleven years of experience in the iGaming industry. He is a regular speaker at industry conferences and seminars, as well as a contributor to various BtoB publications. Equipped with a passion for Daily Fantasy Sports (DFS), Bollier is the co-founder and CEO of Oulala.com, a revolutionary fantasy football game which was launched three years ago. Follow Valery on Twitter: @OulalaGames.

11:20-12:10
Gaining Business Value from The Internet of Things (IoT) and Critical Human Factors

Realizing business value from the coming wave of Internet of Things (IoT) capabilities can be confusing when it affects your core business, adds new opportunities for expansion, and taxes your abilities to manage complex architectures. Understand how to formulate a winning approach that covers the bases, giving you the ability to execute and generate value and competitiveness in this new arena.  This session focuses on the most important factors for success in IoT, namely: how to link IoT to meaningful business objectives to create business value; how to maneuver through the politics; how to develop a strategy that makes sense; and how to handle the inevitable scenarios, challenges, and conflicts that occur in this environment.

Come to this session and understand the critical factors, take back tools and frameworks regarding organizational management in IoT, and learn how to succeed in a field with the potential to disrupt all the current business models as we know them.  You Will Learn:

  • The five key areas to focus on immediately in preparing for IoT
  • A framework to understand business needs, along with how it can help prioritize your needs
  • How to start formulating a strategy for IoT
  • How to address key issues in IoT such as privacy, security, and control using a trust framework
  • What inevitably happens in these types of efforts and how to address situations using proven and effective conflict management models

The Benefits of a Data Virtualisation Solution: from Data Vault to SuperNova and Beyond

How can we improve agility in preparing data for end-users and for information products like reports and dashboards? For this purpose, a proof of concept with a data virtualisation solution was performed: Jos investigated the benefits of a data virtualisation solution on top of a Data Vault data warehouse. The proof of concept also investigated the solution's capability to combine historic data, stored in the Data Vault, with live data stored in back-office systems. The presentation also gives a brief introduction to the data modelling methods Data Vault and SuperNova (a minimal sketch of the Data Vault pattern follows the takeaways below). Key takeaways:

  • The benefits of a data virtualisation solution
  • The fit of data virtualisation in a BI/DWH architecture
  • Unexpected benefits…
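
For readers new to the method, here is a minimal sketch of the Data Vault pattern referred to above: a hub stores each business key once, while a satellite stores descriptive attributes as insert-only history keyed by the hub's hash key and a load timestamp. The structures are simplified and invented for illustration; they are not the presenter's models.

```python
# Minimal Data Vault pattern: insert-only hub and satellite.
import hashlib
from datetime import datetime, timezone

def hash_key(business_key: str) -> str:
    """Deterministic surrogate key derived from the business key."""
    return hashlib.md5(business_key.upper().encode()).hexdigest()

hub_customer = {}   # hash_key -> one row per business key, ever
sat_customer = []   # full attribute history, never updated in place

def load_customer(business_key: str, attributes: dict, source: str) -> None:
    hk = hash_key(business_key)
    # Hub rows are insert-only: at most one row per business key.
    hub_customer.setdefault(hk, {"customer_id": business_key,
                                 "record_source": source})
    # Satellite rows are insert-only too: a new row per change, no updates.
    sat_customer.append({"hub_hash_key": hk,
                         "load_dts": datetime.now(timezone.utc),
                         "record_source": source,
                         **attributes})

load_customer("C-42", {"name": "Acme Ltd", "city": "London"}, "CRM")
load_customer("C-42", {"name": "Acme Ltd", "city": "Leeds"}, "CRM")  # change kept as history
print(len(hub_customer), len(sat_customer))  # 1 hub row, 2 satellite rows
```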

Jos Kuiper, IT Enterprise Architect, Volkswagen Pon Financial Services

Jos Kuiper

IT Enterprise Architect, Volkswagen Pon Financial Services

Jos Kuiper is an experienced Enterprise Architect. He has been working in the financial industry for more than 30 years. Earlier this year he joined Volkswagen Pon Financial Services in Amersfoort, the Netherlands. He has vast experience in setting up agile Business Intelligence and data warehouse architecture. Jos’ drive is to innovate in order to improve agility. He is a certified Data Vault Data Modeller.

It's All Just Data Governance, Isn't It?

We think of Data Governance as being formulaic, linked to static frameworks and based on well-thought-out theory. But does that work in the real world? Having worked in a number of organisations, Garry has found that whilst the theory works, in practical terms you have to be more fluid. This presentation will look at the differing views of Data Governance he has encountered and the ways he has looked to align with them while still delivering a successful program. It will also look at some of the mistakes that he and others have made and the impact they have had on the projects. In this session Garry will look to show:

  • Data Governance is not a “one size fits all solution”, it has to meet the aims of the organisation
  • The ways Data Governance can be delivered are adaptable and fluid
  • Real world examples of both good and bad implementations

Garry Manser, Head of Data Governance, Visa

Garry Manser

Head of Data Governance, Visa

Garry Manser has worked in financial services for too many years to mention, in a number of different roles. He has been involved with data since 1998 and was introduced to the worlds of governance and quality in 2005. During his career he has worked both in industry and consultancy, across both banking and insurance, with a brief spell in a mine in Seville! Achievements include introducing and supporting a number of successful governance frameworks across organisations, both at a local level and on a global scale, as well as a number of quality initiatives, from initial monitoring through root cause analysis and on to remediation, with one role involving introducing circa 200 front-end controls to improve data quality at capture. He is currently leading a data governance strategy across a major financial organisation, in support of various regulatory requirements and a business-wide desire to succeed. Follow Garry on Twitter @garry2406.

Information Architecture as a Driver for Enterprise Data Integration at Boehringer Ingelheim Pharma

Boehringer Ingelheim is a research-driven pharmaceutical company with a complex value chain reaching from discovery research to market supply. The different parts of this value chain have been supported by business units acting quite independently from each other, including in the way they were handling their data. The company underwent a transformation towards a more integrated approach; for example, the Research and Development units were combined in an Innovation Unit. This session describes how Information Architecture creates a foundation to succeed on an Enterprise Data level, outlining the strategy and the lessons learned as these activities are put in place.

  • How does Information Architecture support and drive the transformation of the company on an Enterprise Data level?
  • What are the achievements?
  • Where are the pitfalls?

Martin Fleming, Enterprise Architect, Boehringer Ingelheim Pharma

Martin Fleming

Enterprise Architect, Boehringer Ingelheim Pharma

Martin Fleming is currently an Enterprise Architect for Medicine at Boehringer Ingelheim. He focuses on Business and Information Architecture and is a member of the Boehringer Architecture Board. He has previously worked for SCHUFA and IBM Global Services.

Rainer Remenyi, Enterprise Architect, Boehringer Ingelheim Pharma

Rainer Remenyi

Enterprise Architect, Boehringer Ingelheim Pharma

Rainer Remenyi is currently an Enterprise Architect for Research and Development at Boehringer Ingelheim. He focuses on Business and Information Architecture and is a member of the Boehringer Architecture Board. He has also worked as a Lead Business Consultant in IT and a System Analyst at Boehringer Ingelheim. He holds a PhD in Chemistry, with work spanning bioinorganic chemistry, molecular biology and molecular modelling.

Right Sizing Big Data

Data of all volumes, BIG and SMALL, and the information it represents, is the very lifeblood of an organisation in today's digital world. In addition to being generated as a result of our operational activities, our increasing use, restructuring, transformation and analysis of that data is creating yet more data. This session will discuss:

  • Implications of this reality in helping businesses come to terms with what actions to take, and in what order, to create a data ecosystem that is sustainable and valuable
  • Practicalities to be considered and thought through
  • Opportunities for quick wins to help gain support for investment where needed

Michelle Teufel, Strategic Change Leader - Business and Technology Transformation for the Digital Age 

Michelle Teufel

Strategic Change Leader - Business and Technology Transformation for the Digital Age 

Michelle Teufel is a global executive based in London with extensive experience in IT, Information Management and Business Intelligence. For the past year Michelle was interim CIO for Premier Farnell plc, and she has spent her career delivering multi-year transformation programmes in both business and IT, including establishing and leading the Global Information Management function for Premier Farnell covering Data Operations, Data Governance, Information Security and BI & Reporting Services.

12:10-13:40 Lunch and Exhibits
12:40 - 13:05 Perspective Sessions
Taming the Data Lake

Big Data technologies offer a way of consolidating the continual flow of raw data generated from interactions between the enterprise and the external world, and internally within the enterprise. While consolidation of data into a Data Lake seems a good way of harnessing this stream of data, it poses several challenges that can reduce the effectiveness of turning raw data into an enterprise-usable asset. This session highlights how Data Virtualization is being used to tame the turbulent nature of data lakes, improving enterprise readiness and providing access to enterprise-relevant data quickly and cost-effectively. This will be illustrated with two case studies of data virtualization deployed in this context.

Mark Pritchard, Sales Engineering, Denodo

Mark Pritchard

Sales Engineering, Denodo

Mark Pritchard is a data virtualization specialist. He is a leading member of the sales engineering team at Denodo, with over 20 years of experience in the IT industry. He comes from a consulting background, having worked for major clients in the Financial Services, Energy & Utilities, Media and Government sectors. He has extensive expertise in data warehousing and data integration technologies across multiple market sectors.

Data Lakes and Master Data Management

Ever expanding data volumes and a growing number of data formats and input streams are just a way of life for most midsize and large organizations. Data Lakes are one of the new concepts that seem to be gaining traction, providing benefits from pure processing power and scalability, excellent price to performance ratio, and unprecedented capacity. This translates to business benefits such as dramatically shortened go-to-market times for new products, an individual approach to customers or micro segments, ability to build customer and risk profiles from relevant data sources, etc. This session will discuss new challenges brought by the Data Lakes and how MDM can help.

Michal Klaus, CEO, Ataccama

Michal Klaus

CEO, Ataccama

Michal Klaus is the Chief Executive Officer of Ataccama Corp., an established Data Quality, MDM, and Big Data Processing technology vendor. The company is headquartered in Toronto, Canada, and has customers in Europe, Canada and the U.S. Michal oversees all of Ataccama’s operations and is responsible for global sales and marketing.

13:10 - 13:35 Perspective Sessions
Data Governance - slim and simple
Data Governance - slim and simple

I’ve come to the conclusion that the biggest trap facing DG initiatives is how complex they quickly become. Did your DG initiative get stuck because there were too many processes, policies, procedures or committees and too little progress? Why not take a different direction and start from solid ground with solid results?

I am going to present you with our “think less, do more” concept that can make your Data Governance actionable and work across your whole company or business.

Tomas Barta, Co-founder, SEMANTA

Tomas Barta

Co-founder, SEMANTA

Tomas Barta is a co-founder of SEMANTA – the software platform that makes Data Governance simple. He has spent 15 years in the BI field, and for the last six years he and his team have been working hard to make their Data Governance vision real. He thinks that Data Governance can successfully grow only when started as a simple, small, frank and humble initiative. He believes that LESS IS MORE!

To Gain Maximum Business Value Out of Your (BIG) Data
To Gain Maximum Business Value Out of Your (BIG) Data

Your business depends on business analysts with smart analytical skills to make accurate decisions and gain new insights from enterprise and external data. However, too much of their valuable time is lost struggling to access, merge or fix complex (big) data before it can be combined with the existing data environment of the organisation. According to Forrester Research, 38% of business analysts spend more than 30% of their time validating and fixing data. Learn how to eliminate the time-consuming burdens of data preparation and data validation by automatically refining business information for your specific needs, thus ensuring consistent, complete and integrated quality data.

Ed Wrazen, VP Product Management EMEA, Trillium Software

Ed Wrazen

VP Product Management EMEA, Trillium Software

Ed Wrazen is VP Product Management, Big Data with Trillium Software, a leading provider of enterprise data governance and data quality solutions. Ed is responsible for the product strategy, roadmap and launch for Trillium’s Big Data products and solutions. With 30 years’ experience in database and data management technologies, Ed has held roles in software development, consulting and marketing in global businesses and technology companies. He has specialized in data architecture, performance design, data integration and data quality. He is a regular speaker at industry events worldwide and author on data management, data governance and data quality topics.

Dominic Tomey, Commercial Director, Trillium Software

Dominic Tomey

Commercial Director, Trillium Software

Dominic Tomey is a Commercial Director with Trillium Software, a leading provider of enterprise data governance and data quality solutions. Dominic is primarily responsible for positioning and aligning Trillium’s core propositions in the commercial sector. While delivering effective frameworks that provide answers to data-led problems, Dominic has worked with some of the UK’s leading household names, creating value-based discussions that evolve into strategic relationships. With over 15 years’ experience of working with Enterprise Software Vendors, Dominic has a deep understanding of the challenges data within organisations can bring.

13:40-14:30
KEYNOTE: Fast Data: The Next Frontier of Big Data
KEYNOTE: Fast Data: The Next Frontier of Big Data

In the first stage of big data adoption, the focus was primarily on storing and analyzing massive amounts of data. The focus was completely on volume. Currently, organizations have started to enter the second stage of big data: fast data. Fast data is about streaming massive amounts of data and analyzing that same data instantaneously. It’s the next frontier of big data. It’s especially the Internet of Things (IoT) that’s pushing fast data. The IoT is about connecting devices to devices across the Internet. The stream of data these intercommunicating devices can generate is massive, and in these massive data streams valuable business insights can be hidden, deeply hidden. The business value of the IoT is in analyzing this data. Unfortunately, analyzing IoT data is not like analyzing enterprise data, for which data warehouses can be developed and easy-to-use data visualization tools can be deployed. For example, IoT data is very cryptic, and to make sense of it, it has to be integrated with enterprise data residing in the enterprise data warehouse. Also, data has to be analyzed in real time, sometimes even before it’s stored, and a reaction may be required instantly. It’s a new world. This keynote discusses the architectural aspects of the IoT, guidelines on how to adopt IoT, and how to integrate IoT with an existing business intelligence environment.

  • How does fast data relate to the classic world of business intelligence and data warehousing?
  • A new architecture is required for the IoT
  • Technologies involved in analyzing the IoT data stream
  • How to integrate IoT data with data from the enterprise data warehouse
  • The challenge of reacting in real time to incoming IoT data
  • What is the relationship between Big Data and the IoT?
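
By way of illustration, the sketch below shows the kind of pre-storage, real-time analysis the keynote describes: each reading in a (simulated) sensor stream is analyzed the moment it arrives, with an instant reaction when a sliding-window average crosses a threshold. The feed, window size and threshold are all invented for illustration.

```python
# A minimal sketch of the "fast data" pattern: analyse a stream of IoT
# readings as it arrives, before anything is stored. Values are illustrative.
from collections import deque

WINDOW = 5          # sliding window over the last 5 readings
THRESHOLD = 80.0    # hypothetical alert level

def monitor(stream):
    """React in real time: keep a sliding window and alert on a rising average."""
    window = deque(maxlen=WINDOW)
    for reading in stream:
        window.append(reading)
        avg = sum(window) / len(window)
        if avg > THRESHOLD:
            # In a real deployment this would trigger an instant reaction,
            # e.g. raising an operational alert, rather than yielding a string.
            yield f"ALERT: rolling average {avg:.1f} exceeds {THRESHOLD}"

simulated_feed = [70, 75, 78, 82, 85, 90, 88]
for alert in monitor(simulated_feed):
    print(alert)
```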

14:35-15:25
“Little Data” - Gaining Sustainable Insight Through Self-Service BI
“Little Data” - Gaining Sustainable Insight Through Self-Service BI

Having a data lab up and running does not equal sustainable integration into the operational and strategic fabric of an organisation. Big Data and Analytics are omnipresent in many organisations, and the mantra of statistical relevance is often used as an excuse for neglecting “little” data. Even if all your big data is accessible in your data lake, you still need to understand what the data relates to. Without context you just have a huge pile of data points.

The little data, data that has a life cycle and that describes the related business concepts, will provide the context that allows you to obtain actionable insights. Proper management of the life cycle of the business entities and their relationships will therefore give a significant boost to your business outcomes. Unfortunately, many master and reference data management (MDM/RDM) projects are not providing the benefits that were anticipated.

This session will give you practical advice to setup your data management and information governance in such a way that you can get maximum value out of your data without increasing your liabilities.

  • Setting the scene: defining a sustainable information-centric organisation
  • Drowning in the data lake or having a breach? Information and Data Governance as a safeguard
  • MDM and RDM design principles
  • Bringing proper life-cycle management into your organisation
  • Architecting the conformed dimensions

The Sexiest Job of the 21st Century: How to Become a Data Scientist
The Sexiest Job of the 21st Century: How to Become a Data Scientist

Most organizations are at least getting their feet wet when it comes to ‘Big Data’ and ‘Predictive Analytics’, hoping to reap all the benefits the trade press is full of. But many still struggle to understand what data science is all about, how and when it makes sense to hire data scientists, how to embrace and incorporate analytical thinking and how to translate that into useable information products and insights. This session will explain what data science is and how organizations like e-tailers, railroads, banks, insurance companies and governments use data science to improve their services, increase their revenue or engage customers.

How to Survive the CEO’s Trough of Disillusionment with Data
How to Survive the CEO’s Trough of Disillusionment with Data

Roberto will discuss how Data Executives can construct a compelling story and influence their peers to advance new, innovative business strategies. He will explain what the role requires and the critical skills needed to bridge the gap between data, technology and strategy to execute change. He will cover how to use communication skills internally to market Data Management at all levels of the firm and grab every opportunity to foster adoption. You need to build an enterprise-wide vision to establish common goals and cohesion for sustainable and beneficial business transformation.

Data Quality – A Different Perspective
Data Quality – A Different Perspective

You want to have Data Quality reporting available across the company but cannot justify the upfront cost and ongoing outlay to your management. Neil and Palash will present the journey from conception to delivery of a ‘pay as you go’ Data Quality Service: a pioneering approach to the delivery of data quality reporting, based on Cloud infrastructure and outsourced resourcing, through a workflow-enabled governance process. This session will focus on:

  • The initial concept
  • Design
  • The final product
  • Challenges faced

Neil Storkey, Director, InfoMana Ltd

Neil Storkey

Director, InfoMana Ltd

Neil Storkey is an independent consultant specialising in enterprise data and information management strategies with a focus on trust, integrity and sustainability of information assets. Neil started his data journey as an accountant back in 1991, where consolidation and reporting of financial management information depended on consistent, quality data. 25 years on, he has led and delivered change management programmes in large global enterprises, shifting the onus of accountability for data and information away from IT. These enterprise programmes included BW/BI, MDM strategies, business change management, and Data and Information strategies, all based on establishing business-led working practices around standards, stewardship, governance, organisation and quality. His strategies have been recognised as industry-leading and innovative, challenging established data management practices. Neil’s career following the data has taken him through industries including motor manufacturing, financial services, recruitment, telecommunications, mobile telephony, tobacco and hydrocarbons. His challenge to everyone is to lose the tag of ‘ownership’; we are but custodians of company data.

Palash Banerjee, Associate Partner, IBM

Palash Banerjee

Associate Partner, IBM

Palash Banerjee is an Associate Partner in IBM‘s Cognitive Solutions Team. With over 16 years of experience, he has significant knowledge in data/information management, analytics, business intelligence, program management, operational management and global sourcing.
He designed IBM’s first-of-a-kind “Data Quality/Governance as a Service”, a combination of software, hardware and technical services delivered via an industrialized process. This provides the ability to monitor, validate and remediate data through a structured Data Management process.
He has worked in the UK, Germany, France, India and the USA and has a track record of successful practice management and program and project delivery, bringing measurable benefits to clients. He has successfully delivered Information Management strategies, designs and working solutions across a broad range of market sectors.

Architectures for Big Data Analytics
Architectures for Big Data Analytics

Data is the raw ore whose transformation yields the currency of this capricious information economy. Unlike metals hewn from the earth, new data rushes into the modern world in a welter of forms from personal devices, cars, buildings, cities and a swiftly expanding array of other sources that become smarter with each passing iteration. As raw data resources proliferate around us, information architectures to accommodate their transportation, aggregation and transformation into information products have become acutely important to organizations of all stripes around the world. In this presentation, we will explore the essential characteristics of information architectures that generate decision options from complex, heterogeneous and unstructured data sources, and discuss some examples of high-performance analytic systems. What attendees will learn:

  • Challenges inherent in big data volume, velocity and variety
  • Architectural approaches to decision support for big data
  • Information Quality Challenges related to Big Data
  • Examples of big data analytics system implementations

Markus Helfert, Senior Lecturer, Dublin City University

Markus Helfert

Senior Lecturer, Dublin City University

Dr. Markus Helfert is the Director of the Business Informatics Research Group and a Senior Lecturer in Information Systems at Dublin City University. He is a funded investigator at Lero – The Irish Software Research Centre (www.lero.ie) and the Insight Centre for Data Analytics (https://www.insight-centre.org/). Dr. Helfert is a research affiliate at The Open Government Institute (TOGI), Zeppelin University, Germany. His interests relate to Data Analytics, Enterprise Architecture, Process Mining, Open Data, Innovation Management and Smart Cities. His work has been published in international journals, books and conferences. Markus is currently involved in national and international projects related to these areas, and has worked with and advised many companies on Information Management challenges. Markus Helfert holds a PhD in business administration from the University of St. Gallen, a Master’s diploma in business informatics from the University of Mannheim and a Bachelor of Science from Napier University, Edinburgh.

15:25-15:50 Break & Exhibits
15:50-16:40
Migrating our Enterprise DW from "Traditional" to Data Vault Based
Migrating our Enterprise DW from "Traditional" to Data Vault Based

Generali Hungary built its first Data Warehouse more than 15 years ago. Since then they have changed or reorganised its major parts three times. Because of changes in the internal and external environment they had to change their ETL tool (again). Last year they decided on a complete reorganisation, including their ETL and modelling tools and development processes, and to move to Data Vault based modelling. Of course, they had to do this in a short time period with limited internal and external staff. The old and new DW had to work together, so that the major processes could be migrated one by one. In the session we will share our experience of EDW migration, specifically:

  • Why we had to move from our existing architecture & why we’ve chosen to use Data Vault
  • What were the pain points, pitfalls and solutions
  • What are the results & what we’ve learnt during this project
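
For readers unfamiliar with the approach: in a Data Vault model each business key becomes a hub, and descriptive attributes hang off it in history-keeping satellites, which is one reason a process-by-process migration alongside the old DW is feasible. The sketch below generates that table shape in Python; the entity and column names are hypothetical, not Generali’s actual model.

```python
# A minimal sketch of the Data Vault table shape: one hub per business key,
# with descriptive, history-keeping satellites hanging off it.
# Entity and column names are invented for illustration.

def hub_ddl(entity, business_key):
    return (f"CREATE TABLE hub_{entity} (\n"
            f"  {entity}_hkey CHAR(32) PRIMARY KEY,  -- hash of business key\n"
            f"  {business_key} VARCHAR(50) NOT NULL,\n"
            f"  load_dts TIMESTAMP NOT NULL,\n"
            f"  record_source VARCHAR(20) NOT NULL\n);")

def satellite_ddl(entity, attributes):
    cols = ",\n".join(f"  {a} VARCHAR(100)" for a in attributes)
    return (f"CREATE TABLE sat_{entity} (\n"
            f"  {entity}_hkey CHAR(32) NOT NULL,\n"
            f"  load_dts TIMESTAMP NOT NULL,  -- history kept per load\n"
            f"{cols},\n"
            f"  PRIMARY KEY ({entity}_hkey, load_dts)\n);")

print(hub_ddl("policy", "policy_number"))
print(satellite_ddl("policy", ["product_code", "status", "premium"]))
```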

Zoltan Csonka, Head of DW, Generali

Zoltan Csonka

Head of DW, Generali

Zoltan Csonka is Head of Data Warehouse at Generali Zrt. Zoltan has solid experience as a data warehouse architect and business intelligence solution expert. He is currently responsible for the DW processes, architecture and strategy.

Gabor Gollnhofer, Principal Consultant, DMS Consulting

Gabor Gollnhofer

Principal Consultant, DMS Consulting

Gabor Gollnhofer is a principal consultant at DMS Consulting. Gabor has 20 years’ experience in system architecture & design, data modeling and metadata management. He was involved in projects in the finance, insurance, telecommunications, higher education, and retail industries. He’s a member of ACM & TDWI and a Certified Data Vault Data Modeler.
Gabor is a frequent presenter at DW/BI related conferences and has also lectured on related topics. Follow Gabor on Twitter: @GGollnhofer.

Modernising Data Architectures Using Data Virtualization for Agile Data Delivery
Modernising Data Architectures Using Data Virtualization for Agile Data Delivery

In this case study Dave will provide a profile of the global insurer Zurich and its status in the market. He will then describe the many challenges and threats it is facing in the insurance market, which mean that it must respond at pace, a challenge for any large, traditional insurance provider. Dave will explain how Zurich needs to rapidly deploy existing data assets and leverage new sources. He will show how we have achieved this with a modern data architecture which strongly leverages data virtualization. Key takeaways are:

  • Explaining how the critical business challenges being faced by a large insurance company are mostly all about data
  • Providing a detailed explanation of at least one case study on how we are solving that
  • Explaining how these initially tactical use cases can provide a strategic way forward to modernising a data architecture

Dave Kay, Senior Data Consultant, Zurich UK General Insurance

Dave Kay

Senior Data Consultant, Zurich UK General Insurance

Dave Kay is a self-confessed SQL geek with almost a decade of experience in data and analytics. Today he’s a Senior Data Consultant at the Zurich Insurance group – developing and maintaining a number of medium and large scale BI/analytical data solutions to a diverse stakeholder audience. Previously he has worked for Jagex in the online gaming space, leveraging data mining and analytics to generate new insight on games player behaviour.

Data Governance as a Force Multiplier
Data Governance as a Force Multiplier

Mike has personally travelled full circle: from ‘Data Governance is a good thing… because it is’, through ‘it is not really required and is just an additional cost and delay in the process of using data’, to a position where Data Governance is an enabler in gaining value from the data an organisation collects. This case study will look at the effect of governance for its own sake, and then at how organisations, and the British Army in particular, are using Data Governance to derive value. In the case of the Army, the presentation will look at the differences in approach between an organisation seeking to monetise or gain additional profit from its data and an organisation seeking to gain additional effect. The Army seeks to gain additional effect from its tax pounds and, when operating in an operational context, to use the information it gains to multiply its effect on the ground. There will also be a nod to the requirement for all armed forces operating in a joint or multinational environment to combine their data to increase their effectiveness, and the Data Governance required for that.

  • Look at Data Governance Journey
  • Understand the differing approaches between monetisation and force multiplication
  • The Army experience

Lt Col Mike Servaes, Data Strategy, British Army

Lt Col Mike Servaes

Data Strategy, British Army

Michael Servaes is a career Royal Artillery officer who has commanded at all levels up to Regimental Command, both in barracks and deployed on operations. He has served in Germany and the UK, on operations in the Balkans and Iraq, and with UN peacekeeping forces in Cyprus. He has also visited a number of other countries as part of his service. Originally a field artillery officer, he has also served with Air Defence units; on the staff he has been an Arms Controller, led the redesign of the Artillery soldier career structure and led the study into the future of UK armoured training, a project worth £10bn. His background in information comes from leading the design and creation of the Business Intelligence Competency Centre for the Army. He has been the CDO for the Army, working to establish a data governance structure that supports the Army and enables it to begin to fully exploit its information. He was a founder member of the MIT CDO Forum and is a founder member of the International Society of CDOs. He is a frequent speaker on data-associated matters and the cultural change that such initiatives must bring, especially to organisations which have a strong history behind their culture and behaviours.

From Data Blind to Data Funded: The IFC/World Bank Group Story
From Data Blind to Data Funded: The IFC/World Bank Group Story

Every organization has a process for funding technology projects. But what about data work? What about foundational efforts, governance, and tasks that project managers may be blind to? What about data maintenance and improvement activities that don’t map easily to projects? This is the story of IFC (the private sector arm of the World Bank Group), which in 2½ years went from “invisible” data needs to a well-funded, interdependent set of projects and work plans. Attendees will learn:

  • How a small team of individuals was able to influence the organization’s approach to executing and funding data work
  • The carefully aligned strategy they employed, and how it involved Data Management, Data Governance, Business Intelligence, Security, Auditing, IT Governance, and IT Portfolio Management
  • 5 funding models, and when each was deployed

Panel: Riding the Elephant: the Governance of Hadoop Environments
Panel: Riding the Elephant: the Governance of Hadoop Environments

The Hadoop family of open source technologies has been maturing in recent years. It has spawned many and varied instances in companies and organisations, mostly departmental, even siloed in nature, sometimes more central and strategic. Security has been something of a non-issue, and so, too, governance.

  • How can Hadoop data pools or lakes be incorporated into well-governed enterprise data systems disciplined by master data management programmes? Should they be?
  • Do big data technologies pose problems of a qualitatively different kind to those of more traditional data warehouses, data marts, and relational databases more generally?
  • Does the governance of unstructured data repositories, and of the analytics performed using them need, perforce, to be lighter in touch?
  • And what of the organisational design problems posed by Hadoop? Does it call into being a chief data officer? Does it entail building a data science team, and, if so, of what kind so as to deliver business value?

This session will address these and other questions, from technical, legal, commercial and organisational politics points of view.

Brian McKenna, Business Applications Editor, Computer Weekly

Brian McKenna

Business Applications Editor, Computer Weekly

Brian McKenna is the Business Applications Editor at Computer Weekly, covering information management and enterprise software from a corporate user perspective. He has 18 years’ experience in information and technology business publishing. He is a former editor of Computer Weekly, Infosecurity, and Knowledge Management magazines, among others. He has a degree in History and English from the University of Glasgow, a doctorate from the University of Oxford, and was a British Academy post-doctoral fellow before his publishing and communications career.

Mike Ferguson, Managing Partner, Intelligent Business Strategies

William McKnight, Consultant, McKnight Consulting Group

Luca Olivari, Chief Data Officer, Contactlab

Luca Olivari

Chief Data Officer, Contactlab

Luca Olivari is the Chief Data Officer at Contactlab. Headquartered in Italy but operating globally, Contactlab has been collecting digital marketing data for the past 15 years. With changes in the Big Data landscape, the company re-engineered its business models to focus on new data-driven products and revenue streams. As part of the initiative, Contactlab has built a cloud-based marketing data science service to help its clients grow revenue by identifying actionable insights. Before joining Contactlab, Luca held leadership roles at some of the world’s leading data platform companies, including MySQL, Oracle and MongoDB. He also serves as an advisor to tech startups.

16:45-17:05 Lightning sessions
17:05-17:35
KEYNOTE: Data Science and the Panama Papers
KEYNOTE: Data Science and the Panama Papers

The trove of files that makes up the Panama Papers is likely the largest dataset of leaked insider information in the history of journalism. Mar will discuss the unique challenges that ICIJ’s Data and Research Unit encountered in analyzing this data. The overall size of the data (2.6 terabytes, 11.5 million files), the variety of file types (from spreadsheets, emails and PDFs to obscure and old formats no longer in use), and the logistics of making it all securely searchable for more than 370 journalists around the world are just a few of the hurdles they faced over the course of the 12-month investigation.

17:35-18:30 Drinks reception and Exhibits
Wednesday day 3 - November 9
09:00-10:00
KEYNOTE: Scaling out Data Operations in a Global Bank
KEYNOTE: Scaling out Data Operations in a Global Bank

The demand for data has never been higher, and the supply provided by traditional EDW platforms is as constrained as it ever was. Many enterprises have realised that without leveraging the power of Big Data technology they will fail to meet the demands of customers, executives and external regulators.

In this presentation Alasdair Anderson, EVP Data Engineering at Nordea Bank, will discuss the evolution of data architecture from the EDW, through Hadoop 1.0, to the Enterprise Data Hub. The Enterprise Data Hub now forms the data backbone of the bank, providing data that supports core regulatory, compliance and finance reporting.

  • What is an Enterprise Data Hub?
  • How are business data operations supported on the Enterprise Data Hub?
  • How can change be executed on a scale-out platform to deliver improved data quality?

Alasdair Anderson, EVP Data Engineering, Nordea Bank

Bert Oosterhof, EMEA Field CTO, Trifacta

Bert Oosterhof

EMEA Field CTO, Trifacta

Bert Oosterhof is a visionary data architect, technologist and entrepreneur. He has spent his entire professional career around data-related challenges and innovative solutions, from databases (DBMSs), Data Integration tools, Data Warehousing, Master Data Management and Big Data to Data Governance. He introduced “Object Relational Databases” (1995), “ETL” (1999) and now “Data Wrangling” to the Benelux markets. Bert is now EMEA Field CTO at Trifacta, a San Francisco based software company. Before this he was a member of the EMEA CTO Office at Informatica and, among other roles, a member of the Advisory Board of Parstream, an IoT Edge Analytics database platform recently acquired by Cisco.

Dr. Richard Harmon, Director, EMEA Financial Services, Cloudera  

Dr. Richard Harmon

Director, EMEA Financial Services, Cloudera  

Dr. Richard Harmon supports Cloudera’s Financial Services business across the EMEA region. He joined Cloudera in May 2016 and has over 25 years of experience in Capital Markets, with specializations in Risk Management, Advanced Analytics and Fixed Income Research. His business focus has included helping customers leverage unique data sources and advanced analytics to develop innovative solutions. Richard started his post-academic career at the US Fed, followed by leading fixed income research teams at Citibank, Bankers Trust, JP Morgan and Bank of America. He is the co-founder of a GMAC-funded start-up called Risk Monitors, which was acquired by BlackRock, where he was an MD & Partner in the Risk Management Group. Richard left BlackRock to start and manage the North American business for Norkom Technologies, which was later sold to BAE Systems. For the past six years Richard was the Director, EMEA Capital Markets at SAP, where he helped grow the business across the EMEA region. Dr. Harmon holds a PhD in Economics with a specialization in Econometrics from Georgetown University.

10:00-10:30 Break and Exhibits
10:30-11:20
How to Accelerate Towards a Data-driven Business
How to Accelerate Towards a Data-driven Business

At Reed Exhibitions, the world’s largest events organiser, establishing a data-driven agenda while serving customers across 5 continents and 44 industries is no easy task. In this talk, I present what being data-driven means to us and how we design frameworks to test hypotheses and spot opportunities to leverage data. The talk highlights the importance of understanding existing business processes and practices, and how to avoid over-sophistication and confusion when employing advanced analytics.

Key takeaways from this presentation:

  • Understanding the different paths through which data delivers value.
  • The role of leaders and senior stakeholders in a data-driven business.
  • How to maximise the chances of success and drive data-driven adoption in the business.

Dr. Salman Taherian, Global Head of Data Innovation, Reed Exhibitions Ltd.   

Dr. Salman Taherian

Global Head of Data Innovation, Reed Exhibitions Ltd.   

Salman Taherian heads the data-driven innovation function of Reed Exhibitions. Leading teams of data scientists and data practitioners, he identifies and manages initiatives that drive customer and business value. Previously, he headed a tech start-up company, Kasra, founded in Cambridge (UK), which delivered data-driven solutions for diverse industries. He has a strong technical background in systems design, and pioneered an event-driven middleware solution for 21st-century applications as part of his PhD from Cambridge University.

Is Data Warehouse Automation a Necessity?
Is Data Warehouse Automation a Necessity?

It’s 2016, and still ETL programs are being developed manually, data models are still being created by hand, and star schema data marts are still derived from normalized data warehouses manually. Designing and developing BI environments is still laborious and error-prone work, and very often the wheel is invented over and over again. But it shouldn’t be. Manual development is slow; we all know that. Meanwhile, business requirements have changed drastically over the last ten years. One of the key changes is time-to-market: new reports have to be developed more quickly, and maintaining existing ones should be simpler. Too often, as a reaction, analysts working in business departments develop their own reports and their own ETL processes; they don’t even call the IT department anymore. Sometimes that’s a good idea, but not always. Organizations have to find ways to improve productivity. One of these alternatives is data warehouse automation, which allows developers and designers to focus more on the business aspects of a BI environment and spend less time on repetitive and time-consuming tasks. This session explains how data warehouse automation can help speed up development.

  • Business requirements have changed over time.
  • Generating data models for data warehouses and data marts.
  • What will data warehouse automation mean for your current BI system?
  • How efficient is generated ETL code?
  • Pros and cons of data warehouse automation.
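
To make the automation idea concrete, here is a minimal sketch of metadata-driven generation: table definitions and load steps are produced from a single metadata description instead of being hand-written. The metadata format, naming conventions and SQL dialect are illustrative assumptions, not any particular product’s approach.

```python
# A minimal sketch of data warehouse automation: generate dimension DDL and
# a load step from one metadata description, rather than hand-coding each.
# Metadata, names and SQL dialect are invented for illustration.

DIMENSIONS = {
    "customer": ["name", "city", "segment"],
    "product": ["description", "category"],
}

def generate_dimension_ddl(name, attributes):
    # Emit a dimension table with a surrogate key plus the listed attributes.
    cols = ",\n".join(f"  {a} VARCHAR(100)" for a in attributes)
    return (f"CREATE TABLE dim_{name} (\n"
            f"  {name}_key INTEGER PRIMARY KEY,\n{cols}\n);")

def generate_load_sql(name, attributes):
    # Emit the matching (simplified) ETL step from a staging table.
    col_list = ", ".join(attributes)
    return (f"INSERT INTO dim_{name} ({col_list})\n"
            f"SELECT DISTINCT {col_list} FROM staging_{name};")

for name, attrs in DIMENSIONS.items():
    print(generate_dimension_ddl(name, attrs))
    print(generate_load_sql(name, attrs))
```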

Fostering Data Babies - Engaging a Younger Generation of Information Professionals
Fostering Data Babies - Engaging a Younger Generation of Information Professionals

Growing up in the digital age may confer an increased aptitude for understanding data and information management, but without the principles, history and guidance of those with decades of experience, things can go awry. Due to increased awareness, demand and a global change in climate towards “geeks” (we’re cool now!), Data Babies are coming into the profession through various non-traditional paths, with new ideas, misguided ideas and bad habits in tow. In this session, you’ll hear from a know-it-all Gen Y’er who realised quickly that he hadn’t even scratched the surface of the history, complexity and savvy required to succeed in enterprise data. Attendees will come away with ideas for engaging snot-nosed data geeks, and gain insight into where they came from, how their paths are different and how to speak their language.

  • Understand new paths to the data profession
  • Explore generational knowledge gaps
  • Engaging and fostering young data talent

Tony Mazzarella, Analytics Solutions Architect & Data Governance Lead, Financial Services Industry

Tony Mazzarella

Analytics Solutions Architect & Data Governance Lead, Financial Services Industry

Tony Mazzarella is a Digital Analytics Solutions Architect and Data Governance lead for a top US financial company and currently serves as VP of Online Services for DAMA International and the DAMA New England chapter. After nearly 15 years as a developer and executive in the agency world, Tony is a “Data Baby” in his first formal enterprise data role. Now responsible for architecture, integration, optimization and data governance in a digital analytics team that supports an enterprise, he is often challenged and frustrated, yet at the same time, extremely happy. Tony relies on the guidance and experience of his friends and good beer to meet the demands of his career. Follow Tony on Twitter: @tonymazz.

Data Quality Distilled: An Essential Guide for Data Management Professionals
Data Quality Distilled: An Essential Guide for Data Management Professionals

Today, more than ever, the quality of data, underpinned by a robust approach to Data Quality Management, is critical to the success of every organisation. Unfortunately, it is a topic that is still impenetrable to many, through the use of unfamiliar jargon and too much emphasis on technology.

In this session, based on Equillian’s established data quality primer, Jon Evans seeks to redress the balance by taking the audience on a whirlwind journey from first principles right through to advice on establishing a Data Quality Programme. Along the way, both beginners and those already familiar with the topic will benefit from a business-focused approach, based on industry best practice coupled with many years of experience helping organisations tackle their Data Quality challenges. The session will be structured around 4 key topics:

  • Why should I care about data quality?
  • Monitoring data quality
  • Improving data quality
  • Developing a DQ Programme

Jon Evans, Founder, Equillian

Jon Evans

Founder, Equillian

Jon Evans is an Information Strategist, self-confessed data quality geek and the founder of Equillian, an independent UK consultancy practice specialising in Enterprise Information Management. For the past two decades, he has been helping organisations harness their information and transform it into a strategic business asset. His wealth of experience covers all the key disciplines that help define, manage and exploit enterprise information, from putting in place effective Data Governance to delivering insight through Business Intelligence. In the field of Data Quality, he contributes expert knowledge and thought-leadership, drawing upon a track record of successfully delivering DQ initiatives to a wide range of organisations, including a key role in advancing the statistical analysis of health data. As a regular speaker and panellist at industry events, Jon enjoys bridging the gap between the business and IT domains, bringing fresh understanding and clarity – the same approach he adopts as a respected Information Management coach and mentor. Follow Jon on Twitter: @MadAboutData.

Building a Data Infrastructure for a Smart City
Building a Data Infrastructure for a Smart City

This presentation is about the struggles faced within the city of Amsterdam in transforming the local government into a data-driven organization in a smart city context. Can they apply change management strategies, and what can they learn from epidemiology to scale up capabilities? The creation of the data infrastructure, as well as the influencing involved to kick-start this development, will also be discussed. Following a number of examples that increased citizens’ wellbeing as well as the local government’s effectiveness, expectations are high. But how do you keep up with these, and how do you mobilize the workers in the city to act on advice presented by computational models?

Rutger Rienks, Program Manager, City of Amsterdam

Rutger Rienks

Program Manager, City of Amsterdam

Rutger Rienks is the author of “Predictive Policing: Taking a Chance for a Safer Future”. He holds a PhD in computer science from the University of Twente in The Netherlands and is a well-known, enthusiastic speaker. He has broad experience in Business Intelligence and Predictive Analytics. To broaden his view, after eight years with the Dutch National Police he moved to the City of Amsterdam to contribute to its transformation into a smart city.

11:25-12:15
The Trials, Tribulations and Successes of Building an Enterprise Customer Data Warehouse
The Trials, Tribulations and Successes of Building an Enterprise Customer Data Warehouse

This session focuses on all the aspects to consider when building an Enterprise Data Warehouse. Sophie Holland will provide guidance on writing the business case, choosing the right infrastructure, and bringing the data warehouse to life after it has gone live to get the best use out of it in your company. Sophie will also talk about her experience of doing this at News UK, the challenges faced while the company gained a new CEO and changed strategy, and the benefits that were realised. News UK has made a significant commitment to put data and customer information at the heart of its business. In 2015 News UK implemented cloud technology tools for the Enterprise Data Warehouse and Campaign Management capability in house for the first time, removing its reliance on a third party and taking ownership of the IP.

  • Getting approval for a data warehouse project in your organisation
  • Business benefits of a data warehouse
  • Lessons learned

Sophie Holland, Royal Mail

Sophie Holland

Royal Mail

Sophie Holland excels at being the glue between technology, data and the business, and at bringing order to chaos. Sophie is currently working at Royal Mail to centralise their Group BI function. She has worked on a range of technical delivery projects at News UK including CRM, publishing workflows, advertising and data. Prior to this Sophie managed a BI strategy team in the music industry, using data and insights to influence company forecasting, budgeting and new investment deals. Sophie has a Maths & Stats degree from Imperial College London.

Integrating Big Data Analytics into a Self-Service BI Environment
Integrating Big Data Analytics into a Self-Service BI Environment

This session looks at how advanced analytics such as machine learning, graph analysis and text analysis can be integrated into a self-service BI environment, so that business analysts can exploit the power of Big Data analytics platforms such as Hadoop, Spark and graph databases to add new insights to reports and dashboards. It looks at how self-service BI tools can connect to a logical data warehouse consisting of traditional data warehouses, Big Data and streaming data, and how they can invoke advanced analytical models to provide deeper contextual insights.

  • The logical data warehouse – new platforms and rich pickings for self-service BI
  • Data Science meets self-service BI – new types of analytics available to business analysts
    • Machine learning, text analysis and graph analysis
    • Streaming analytics
    • What are these kinds of analytics and what can they do?
  • Approaches to integrating advanced analytics into self-service BI
  • Integrating predictive models with Self-service BI
  • Integrating text analytics with self-service BI
  • Integrating graph analytics and self-service BI
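
As a minimal illustration of the integration point discussed above, the sketch below scores each row of a BI result set with a predictive model, so the score appears as just another column on a report. The churn model, training data and column names are invented for illustration.

```python
# A minimal sketch of invoking an advanced analytical model from a
# self-service BI flow: a (toy) churn model enriches each fetched row
# with a score the analyst can drop onto a dashboard.
from sklearn.linear_model import LogisticRegression

# Hypothetical training data: [monthly_spend, support_calls] -> churned?
X_train = [[20, 5], [90, 0], [30, 4], [100, 1], [25, 6], [80, 0]]
y_train = [1, 0, 1, 0, 1, 0]
model = LogisticRegression().fit(X_train, y_train)

# Rows as a BI tool might fetch them from the logical data warehouse.
report_rows = [
    {"customer": "Alice", "monthly_spend": 85, "support_calls": 1},
    {"customer": "Bob", "monthly_spend": 22, "support_calls": 5},
]

for row in report_rows:
    features = [[row["monthly_spend"], row["support_calls"]]]
    # Probability of the "churned" class becomes one more report column.
    row["churn_risk"] = round(model.predict_proba(features)[0][1], 2)
    print(row)
```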

Become a Data Superhero
Become a Data Superhero

DATA IS KICK ASS, and we all need to become superheroes because data is changing the rules of work. Companies that are rich in data will be the outperformers. But for companies to be rich in data, employees have to grasp its importance. This BECOME A DATA SUPERHERO session will empower you and your organisation to bring the power of data into your work and workplace. You will learn:

  • A very brief history of data
  • Why is data so important?
  • Can data change the world?
  • Working with data – the fundamentals everyone should know

This talk is for you if:

You’re technical – If this is you then you’ll come out of the talk and workshop knowing how to market what you do to the business so they’ll jump up like screaming lunatics saying, “yes man, give me all of that and more, and I want it yesterday”

You’re in the business – If this is you then you’ll come out of the talk and workshop knowing that data-driven businesses are the best businesses in the world, and you’ll want to put being data-driven at the core of your business strategy.

For more information visit www.datasuperhero.co.za.

Rob Zagey, Senior BI Analyst, STANLIB

Rob Zagey

Senior BI Analyst, STANLIB

Rob Zagey has over 10 years’ experience in business, spanning management consulting, NGOs and financial services. Rob is currently a senior business intelligence analyst at a financial organisation and a private tech investor. He loves data, but not more than his wife. Rob lives in Johannesburg with his wife and 4 boys.

Expert Panel: Building your Career in Data
Expert Panel: Building your Career in Data

Data specialists have never been in more demand, but are those with real data skills adequately rewarded, and are they under threat from those jumping on the Big Data bandwagon?

This session will look at general trends in business technology, Big Data and regulation and their impact on your career, and at strategies to build your career in data. Panellists include business technology, pay and HR experts and a leading headhunter.

This session will highlight:

  • Pay and HR trends for business technology and HR experts
  • Opportunities and threats to those building a career in data
  • Strategies for building your career in data and business technology

Mike Simons, Associate Editor, CIO.co.uk, ComputerworldUK and Techworld, IDG

Mike Simons

Associate Editor, CIO.co.uk, ComputerworldUK and Techworld, IDG

Mike Simons is Associate Editor for CIO.co.uk, ComputerworldUK.com and Techworld, joining IDG from Reed in 2006, where he had worked on Computer Weekly and ComputerWeekly.com. He was News Editor at the launch of ComputerWeekly.com in 2001 and News Editor of the combined Computer Weekly and ComputerWeekly.com operation from 2003. Mike helped Computer Weekly secure the Periodical Publishers Association awards for either “magazine of the year” or “campaigning magazine of the year” in four years out of five. Mike joined IDG as Launch Editor of ComputerworldUK and took over responsibility for Techworld in 2011.

Ken Mulkearn, Principal, Incomes Data Research

Ken Mulkearn

Principal, Incomes Data Research

Ken Mulkearn is one of the principals of Incomes Data Research, established in 2015. Prior to this he was Head of Pay and Research at IDS, where he led the Pay & Reward, Executive Compensation, and Research Services teams. He was Editor of the ‘Pay & Reward’ component of the ids.thomsonreuters.com online service, the monthly IDS Pay Report and a range of specialist sector reports, including ‘Pay and benefits in the public services’, ‘Pay and conditions in call centres’, ‘Pay and conditions in engineering’ and ‘Pay in retail’. As well as reporting on reward developments across the economy, his teams were responsible for compiling the data that appears in IDSPay.co.uk, the online pay benchmarking tool from IDS. During his time at IDS Ken covered pay developments across the private and public sectors and he was closely involved in a large number of research projects for a variety of external clients. He speaks to a wide range of audiences on pay issues, and regularly broadcasts on radio and TV. He holds an MSc in social research methods from the London School of Economics, where he also took modules in industrial relations. His primary degree is from Trinity College, Dublin.

Peter Segal, Managing Partner, Ogilvie & Associates

Peter Segal

Managing Partner, Ogilvie & Associates

Peter Segal is a Managing Partner with Ogilvie & Associates, a boutique executive search firm partnering with technology product and services firms. Peter has more than twenty-five years’ experience of partnering with technology clients, building high-impact executive teams in eighteen countries across Europe, North America and Asia. His expertise covers both venture-backed companies looking to grow to the next stage of development and publicly quoted companies, building leadership teams for European-owned companies and for US-owned companies looking to expand their international operations. His experience includes partnering with software companies across a wide range of enterprise-class applications and vertical markets. Peter has acted in an advisory capacity to firms on a wide range of issues such as organisational design, remuneration, personal development and succession planning. He has also led management assessment initiatives benchmarking client management teams against others in their sector, as well as outsourcing services.

Alasdair Anderson, EVP Data Engineering, Nordea Bank

MDM in Financial Institutions - and What's Next?
MDM in Financial Institutions - and What's Next?

With data being the key asset for financial institutions, its quality is of utmost importance. An agile, two-speed (bi-modal) IT approach is becoming essential to keep up with an ever-increasing velocity of business, challenging compliance requirements (e.g. GDPR), fast time to market, and operational excellence. Join this session to learn how a major European bank tackles data quality and MDM, and how it sees their further operational and analytical use.

Lukas Mazanek, Head of Data & Information Competence Center, Raiffeisenbank

Lukas Mazanek

Head of Data & Information Competence Center, Raiffeisenbank

Lukas Mazanek is the Head of the Data & Information Competence Center at Raiffeisenbank, where he is responsible for Data Warehouse, Reporting and Data Quality Management. His team has 60+ people. Prior to joining Raiffeisenbank, Lukas was responsible for CDI hub delivery as an IT Project Manager and Solution Architect at KB, a member of the Societe Generale Group.

12:15-13:45 Lunch and Exhibit
12:45 - 13:10 Perspective Session
Create BI Success with Effective Dashboards
Create BI Success with Effective Dashboards

Dashboards are the No.1 technology for implementing business-driven BI (TDWI). Yet up to 80% of BI projects fail (Gartner). The reason? Most dashboards don’t communicate information effectively. Attend this session to discover the top 10 best practices for producing effective dashboards and creating BI success.

Carl Edwards, Senior BI Consultant, Yellowfin

Carl Edwards

Senior BI Consultant, Yellowfin

Carl Edwards is a Senior Business Intelligence Consultant at Yellowfin. He works closely with clients to help them deliver valuable insights from their data and make better business decisions. With over 25 years’ experience in the technology industry, Carl is well placed to advise customers on best practices for using data to get rapid answers and monitor their business.

13:15 - 13:40 Perspective Session
The 5 Must Do's for Guaranteed Data Governance Success
The 5 Must Do's for Guaranteed Data Governance Success

What’s stopping your data initiatives from succeeding? Stuck in endless data committee meetings or just paralysed by fear of the colossal task ahead? In this session Diaku gives you 5 data initiative hacks to start a collaborative data revolution towards a more data driven organisation today.

Patrick Dewald, Director, Diaku Limited

Patrick Dewald

Director, Diaku Limited

Patrick Dewald is a Data Governance architect and founding partner of Diaku. He has a wealth of experience designing Master Data Management and Data Governance solutions for financial institutions. Patrick has been heading up Data Governance initiatives, designing and implementing group-wide data services from the ground up, for the best part of 15 years. Patrick is recognised by his peers as a thought leader in the field of Data Governance.

Darius Clayton, Director, Diaku Limited

Darius Clayton

Director, Diaku Limited

Darius Clayton is an experienced change specialist and founding partner in Diaku. With a management consultancy background in business transformation and outsourcing he brings a practical, value-driven approach to data disciplines. Since 2007 his focus has been on data governance, collaboration, and the business view of the data asset. Darius has spent over 16 years working with institutions to control and improve their data while delivering tangible business benefits.

13:45-14:35
How to Make Business Tools People Want to Use and Actually Use Well: a Case Study in BI Tool Design from Zipcar
How to Make Business Tools People Want to Use and Actually Use Well: a Case Study in BI Tool Design from Zipcar

Zipcar has a lot of data and people want to use it – what you wouldn’t expect is that this can be a problem.

When everyone individually asks their questions and expects a report or tool to be developed from them, you don’t always end up with more understanding of the real world, more agreement on what is really happening, or more engagement with drawing insight from the data. If you aren’t careful you can end up with a proliferation of data without clear structure, and confused end users wondering how it all fits together.

This session will demonstrate a methodology used at Zipcar, based on questions and flow, to determine how to build tools for understanding the business through data, and will invite discussion on other methods that can be used to manage a growing BI and insight tool set.

Will Sprunt, Head of Analytics, Zipcar International

Will Sprunt

Head of Analytics, Zipcar International

At Zipcar, Will Sprunt makes sure that the decisions they make as a company, both large and small, are grounded in data, whether that be shaping their long-term plan and market position or looking at the results of a membership test in a scientific way. This has meant building up and leading an internal community of analysts and data scientists, whilst using the insights created to drive thinking on the company’s planning and strategy.

Knowing What to do by Using Data from the Future
Knowing What to do by Using Data from the Future

Tomorrow’s outcome is the result of today’s decisions. So why make decisions based on data which usually reflects the past? Statistical business management has been standard for over fifty years, with top managers basing their decisions on numbers and focusing on past outcomes, performance figures and forecasts. Of course, we need to learn from history, but we also need to understand what may come. We need data from the future.
Now, in the zettabyte era, we have all the statistics we could ever wish for. But they do not help us understand what decisions we need to make. Furthermore, once we understand what decisions to make, the data won’t tell us what will be the best thing to do. This talk is about how decision processes need to be modernized, and how we can create “data from the future” that helps us understand the impacts of today’s decisions.

  • Get the most from data analytics, but recognize the shortcomings of the analytics and the fantastic presentation layers. Wherever you are missing data, and you always will be, you need to combine qualified assumptions with data analytics in a controlled way.
  • Start with the outcomes, not the data. Clean data, intelligent data analysis tools and simulation capabilities, combined with human expertise, can actually bring data from the future. Trying out various scenarios before trying something in real life saves time and money. Do this by building a decision model based on the outcomes you strive for. It will guide you through possible “what ifs” and make the most of your data assets. We have been visualizing and modeling processes, data and software for decades. The turn has now come to model decisions.
  • Recognize decision making as a team activity. Generalists usually make better decisions than specialists, and teams are more powerful than lone wolves, even though we tend to admire individuals who are capable of making complex decisions by themselves. The combination of data analysis tools with the collective intelligence of specialists is even more powerful.

This decision-making process has been tested in several Scandinavian organizations, and decision model examples from the mobile technology industry, the services sector and municipal administration will be shown.
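
As a minimal illustration of “data from the future”, the sketch below simulates many possible outcomes of a single pricing decision under explicitly qualified assumptions, so alternative scenarios can be compared before anything is tried in real life. The demand model, unit cost and candidate prices are invented for illustration.

```python
# A minimal sketch of scenario simulation: generate "data from the future"
# for a pricing decision by running many trials over uncertain demand.
# Demand distribution, unit cost and prices are illustrative assumptions.
import random

random.seed(42)

def simulate_launch(price, trials=10_000):
    """Monte Carlo over uncertain demand for one pricing decision."""
    outcomes = []
    for _ in range(trials):
        # Qualified assumption: demand falls as price rises, with noise.
        demand = max(0, random.gauss(1000 - 8 * price, 150))
        outcomes.append(demand * (price - 20))   # unit cost assumed at 20
    outcomes.sort()
    # Expected profit, plus a pessimistic (5th percentile) scenario.
    return sum(outcomes) / trials, outcomes[int(trials * 0.05)]

for price in (40, 60, 80):
    expected, worst_5pct = simulate_launch(price)
    print(f"price {price}: expected profit {expected:,.0f}, "
          f"5th percentile {worst_5pct:,.0f}")
```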

Håkan Edvinsson, CTO, Informed Decisions

Håkan Edvinsson

CTO, Informed Decisions

Håkan Edvinsson is an author, trainer, speaker and consultant in data-driven Enterprise Architecture and Data Governance. Today, he is a senior partner and the CTO of Informed Decisions. He has practiced his information-centric EA in large global industries, in medium-sized enterprises and in the public sector since the mid-1990s. Håkan is the author of the best-selling book “Enterprise Architecture Made Simple” (Technics Publications 2013) and a contributing author to the second edition of the DAMA Data Management Body of Knowledge.

The Journey to a Data-Driven Culture at Defence Infrastructure Organisation
The Journey to a Data-Driven Culture at Defence Infrastructure Organisation

In September 2014 the Defence Infrastructure Organisation (DIO) embarked on a new strategy, introducing an insourced management team via a strategic business partner consisting of a consortium of three private sector companies. This also heralded the introduction of a new directorate within DIO – Data, Analytics & Insight – to bring data to the forefront of the organisation and change the culture to be data-driven and evidence-based.

Data, Analytics & Insight has driven a significant change in less than two years, delivering a data management programme that encompasses data governance (15 Data Owners across DIO), data architecture and modelling, and most recently data quality metrics to drive accountability. MI has been transformed to be a dashboard-based series of key indicators driving decision-making at senior levels of the organisation, and a BI programme has been initiated bringing data together from across the breadth of the DIO for the first time whilst rolling out BI tools and a data warehouse to several hundred staff.

Building on these foundations, analytics and insight are exploiting the data, developing models and geospatial analytics to add value whilst also extending knowledge through research and benchmarking.

Data, Analytics & Insight has reached a maturity at which the combination of establishing the strategic pillars and delivering quick wins has begun to reap benefits. Clear changes in behaviour can be seen in DIO as a result and the future influence of the directorate is likely to grow as the outputs add more value.

Ian Wallis, Head of Data, Analytics & Insight (DA&I), Defence Infrastructure Organisation (DIO)

Ian Wallis

Head of Data, Analytics & Insight (DA&I), Defence Infrastructure Organisation (DIO)

Ian Wallis is Head of Data, Analytics & Insight (DA&I) at the Defence Infrastructure Organisation (DIO). DA&I’s role is to transform DIO into a data-driven, evidence-based organisation to support efficient use of £3.5bn of annual spending to enable the armed forces to live, work and train. Ian’s remit encompasses data management, business intelligence, MI and reporting, analytics and insight. Ian has nearly thirty years’ experience in this field and has managed programmes in blue-chip organisations, including data warehouse projects at Centrica and the BBC. He has led analytics teams at HSBC, Royal Mail and The Pensions Regulator, and delivered data management programmes at Thomson Reuters, Barclays and Arqiva. Ian works for Aecom and is deployed through the Strategic Business Partner arrangement which has insourced private sector expertise to transform the DIO. Prior to this, Ian worked as an interim manager through his own company, Data Strategists.

From Small Experiments to Viable Data Governance
From Small Experiments to Viable Data Governance

Interesting things often start small and out of sight. This story started out as a small metadata management exercise, but it has now evolved into a company-wide data governance and knowledge management initiative. Petr will share the story of real people implementing practical, enterprise data governance in a small European bank, where every (FTE and) penny counts, along with the important lessons learned, and will show you what the result looks like.

  • How did we make DG visible to everyone?
  • How did we get people on board?
  • Data Governance shouldn’t be a big deal

Petr Podany, Head of Data & Analytics, Sberbank CZ

Petr Podany is head of the Data & Analytics department at Sberbank CZ. He is currently leading an initiative to redesign the whole department. This includes the building of a new Data Warehouse and the introduction of in-situ Data Governance. His prior experience includes being the BICC lead at Trask (the major Czech system integrator) and head of BI infrastructure at Komerční Banka (one of the leading Czech banks).

The Data Renaissance: Leading Your Business to the Modern Age

The Renaissance, spanning the 14th to 17th centuries, is considered the link between the Middle Ages and modern history. It was a time of incredible artistic progress, a time when a fundamentally new kind of thinking took hold. We are living in a similar time with respect to data. We have seen tremendous progress in the last decade: the advent and ubiquitous adoption of handheld computing; a complete upheaval of transportation with car- and ride-sharing services and, soon, autonomous vehicles; and even a return to space exploration funded by private enterprise.

Though nobody can predict what the future holds, we do know that our companies must do something different to remain relevant. This session will focus less on the specific technologies of today, and more on how to create a culture that embraces, and ideally drives, the constant disruptive change of today’s world.

  • Why adapting to what is new today is a fool’s errand
  • How to take an approach that works far better
  • Why although everything is different, nothing has changed — and although history repeats itself, the cycle times are always getting shorter

14:40-15:30
Analyze Yourself!

Most people think that applying (advanced) analytics is the exclusive domain of large corporations or governments, but more and more individuals and sports teams analyze their own or each other’s performance. Usually this boils down to what is now called ‘the quantified self’, but applications in healthcare, HR and community support stretch beyond the individual domain. This session will show what’s already available to the general public, healthcare professionals and researchers to help them analyze personal metrics, but will also offer a glimpse of a future where wearables, injectables and implantables will be used to improve our lives. Jos will also share his own experiences in building a personal analytics solution that integrates and analyzes data from Runkeeper, Sleepcycle, Weather.com, and devices like a blood pressure monitor and weighing scale.

Taking the Internet of Things Beyond Data to Actionable Insights

Enterprises across the globe are looking to IoT solutions to gain competitive advantage for their products and services. Watching an end-to-end demo is exciting but doesn’t tell you the whole story; more interesting is the journey of implementing a real-world smart IoT solution from concept to product. Within an IoT data flow there are multiple places where analytics can play an important role in adding business value to your solution, and you can have different kinds of analytics in your architecture.

In this session we look at the implementation of a smart presenter solution, where the presenter gets live feedback from the presentation, the audience and other ‘devices’. This fully cloud-based, prescriptive solution tells you what to do to improve your presentation. We’ll discuss the landscape of IoT and analytics, and our personal lessons learned during this implementation.

  • Understand the potential of data and datafication
  • Where/How to position Analytics within your (smart) IoT solution
  • What are the main challenges of incorporating Analytics into IoT

Hylke Peek, Consultant Performance Monitoring, The Backbone

Hylke Peek is a consultant in Business and IT Performance and a specialist in Data Warehousing and Analytics. He has consulted for a wide variety of companies on enterprise business intelligence, Big Data processing and Analytics, and Machine Learning, both on-premises and in the cloud. Hylke has spoken at multiple events and on IT channels to share his experience in the field. He loves working with data and sharing his experience with other data-minded individuals and organisations. Follow Hylke on Twitter: @hylkepeek.

Niels Naglé, Data Solution Consultant/Trainer, Info Support

Niels Naglé is a leading Data Solution consultant and trainer at Info Support. As a consultant he specialises in business intelligence, data management and Analytics. Niels has consulted for a wide variety of companies in diverse sectors on enterprise business intelligence, Analytics and self-service BI. He is passionate about data and loves to share his experiences and expertise with like-minded individuals and organisations unlocking their data dividend. Niels tweets, blogs, writes articles and presents. Follow Niels on Twitter: @nielsnagle.

The Quality in Master Data Quality

The end goal of our MDM endeavours is to ensure the quality of our master data is suitable to support the business processes and reporting needs, at the lowest possible cost. But how can we turn data quality into a way of living for the organisation, without making it just another formality? Within FrieslandCampina this topic is part of their journey, with its challenges, successes and failures. They have gone a long way with their governance organisation and they intend to go even further. Dana will share some of the things they have learned so far, by answering the following questions:

  • Who needs data quality and why?
  • To what extent do we measure master data KPIs, and to what purpose?
  • How do data quality and master data governance intertwine and support one another?

Dana Julinschi, Master Data Governance & Projects Manager, FrieslandCampina

Dana Julinschi currently leads an ambitious programme to implement company-wide master data governance, together with data quality processes and KPIs.  Throughout the past eight years she has been part of several MDM implementation projects in different industries with a strong focus on change management within governance organisations, MDM maturity assessments, data quality KPIs and data maintenance processes.  The overall goal of each of her projects is to turn master data management into a way of living for the organisations.

Ethics in Information Management Practices

Ethics is the new black in Information Management these days, or so it seems. From the European Data Protection Supervisor publishing an Opinion on Big Data Ethics, to countless articles about the call from data scientists for clarity on ethics, there is a growing consensus that “something must be done”. The challenge arises in moving from abstract discussion to practical application.

We need to move beyond talking about Ethics to implementing ethical frameworks in our information management and information governance initiatives. This session describes a soup-to-nuts approach for defining ethical frameworks for information management, setting out a model for understanding how ethics affects and is affected by the Business, Information, and Technology management structures we put in place to meet customer expectations of Information and Process outcomes. A case-based focus will ensure the knowledge is practical and applicable, not just theoretical. Key takeaways for this session include:

  • An overview of Ethics and its relevance to Information Management practices
  • The Ethics of Privacy and Human Rights
  • An overview of practical methods to align ethics with Information Governance
  • Risk management and Information management practices

Katherine O’Keefe, Data Governance & Privacy Consultant, Castlebridge

Dr Katherine O’Keefe is a Data Governance and Data Privacy consultant with Castlebridge. Katherine has worked with clients in a variety of sectors on consulting and training engagements since starting with Castlebridge. In addition to her professional experience in Data Governance and Privacy, she holds a Doctorate in Anglo-Irish Literature from University College Dublin and is a world-leading expert on the fairy tales of Oscar Wilde. Follow Katherine @okeefekat.

Open Data I Love It! But Not My Data, It’s Special

The Environment Agency is a public sector organisation, part of the Department of the Environment, whose approach is to open up all their data for use without charge or restriction: their data is automatically open unless there’s a good reason not to share it. The session will take you on their journey from charging for their data to being an open data organisation. In this session Lisa will:

  • Explain what being an open data organisation means for them
  • Describe the cultural shift needed to be open by default
  • Showcase the amazing uses of their open data

Lisa Allen, National Data Integrity Manager, Environment Agency

Lisa Allen leads a team implementing data governance and data quality, with responsibility for developing, implementing and measuring the Environment Agency’s data maturity model. She is also responsible for delivering ODX, the Environment Agency’s project to release vast amounts of its data as open data.

15:50-16:50
KEYNOTE: You Can Take It With You

Graeme Simsion’s career has taken him from computer operator to programmer, DBA, data modeller, data manager, IT strategist, consultancy manager, business facilitator and, more recently, Hollywood screenwriter and New York Times best-selling novelist. Graeme will talk about the generic skills in design, management and consulting that have helped him make these transitions and that continue to be at the core of his work. You will learn:

  • The practices most critical to implementing BI / ED initiatives – and writing a bestselling novel
  • The key strategies for winning support
  • The most valuable and portable skills to invest in

16:50-17:00 Conference Close

Platinum Sponsors

Silver Sponsor

Standard Sponsors