Speakers and Abstracts

Keynote talks

KEYNOTE: Towards a Smarter Data-Driven Future

Speaker

Piyush Malik
IBM

Abstract

The journey from Big Data to Smart Data

We live in a world where data has assumed a position of immense value in the economy. Cloud, Mobility, Cognitive Computing, Artificial Intelligence, Robotics, Deep Learning and the Internet of Things (IoT) may be “hot” topics in the industry lately, but it is actually data that fuels the innovation that these “hot and emerging” technologies unleash.

From being a by-product of IT or ERP systems in the past to being termed the “new natural resource”, data has come a long way. However, moving beyond the hype of Big Data and its virtues, can all this data in its myriad forms make humans more powerful and smarter? In this not-to-be-missed, thought-provoking keynote address, Piyush Malik will touch upon a spectrum of issues to set the tone of the IDQSummit 2015.

KEYNOTE: Trends in Data Analytics and Implications for Trust in Data

Speaker

Jeff Butler
IRS

Abstract

Changes in technology continue to bring new opportunities to improve data analytics and decision making. But are these changes also bringing new challenges to traditional data governance and data quality strategies? For example, how are data quality programs adapting to the growth in distributed/grid computing, NoSQL databases, and unstructured data? Is technology moving faster than data governance can keep up? This session explores these issues and the impact of increasingly non-traditional analytic data architectures on trust and confidence in data.

Jeff Butler is Associate Director of Data Management in the IRS Research, Analysis, and Statistics organization and manages the largest data warehouse in the IRS, which he created in 1995. Prior to returning to the IRS in 2004, he was Associate Director, Office of Statistical Computing in the Bureau of Transportation Statistics, where he designed the largest data warehouse in the Department of Transportation. He has awards from TDWI, ACT-IAC, GCN, and Computerworld and over 25 years of experience in data mining, statistical computing, and data warehousing. Jeff holds a B.A. and M.A. in Economics and has completed work toward an M.S. in Statistics.

Tutorials

Developing a Dashboard for Governance

Speaker

Kelle O’Neal
First San Francisco Partners

Abstract

One of the biggest challenges in implementing and sustaining a data governance program is determining the real impact the program has made on the organization – the ROI of the investment in governing data. It is relatively straightforward to measure the progress of a data governance program in terms of identification of data accountability, creation of standards and policies, and improvement in data quality. But how do we determine how all of this progress has improved the bottom line?

This tutorial will review how to measure the impact of a data governance program on risk management, operational efficiency and cost reduction. Using case studies from a variety of industries, we will identify metrics that measure the impact a governance program provides to business processes within the organization. We will review:

  • The difference between a metric and a KPI (Key Performance Indicator), and between a progress metric and an impact metric (illustrated in the sketch after this list)
  • How to create impact metrics
  • How to link progress metrics and impact metrics to provide a complete dashboard
  • Sample metrics to consider for your organization
  • Tips and tricks on developing a dashboard and communicating metrics to increase value to stakeholders
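
By way of illustration of the progress-versus-impact distinction above, here is a minimal sketch (in Python, not part of the tutorial materials) of how the two kinds of measures differ: a progress metric counts governance activity, while an impact metric ties that activity to a business outcome. The data-element names, defect counts and cost-per-defect figure are hypothetical.

    # Minimal sketch: progress metrics vs. impact metrics for a governance dashboard.
    # All element names, defect counts and the cost-per-defect figure are illustrative assumptions.

    data_elements = [
        {"name": "customer_id", "has_steward": True,  "defects_before": 1200, "defects_after": 300},
        {"name": "address",     "has_steward": True,  "defects_before": 900,  "defects_after": 450},
        {"name": "tax_code",    "has_steward": False, "defects_before": 400,  "defects_after": 400},
    ]
    COST_PER_DEFECT = 25.0  # assumed rework cost per defective record

    # Progress metric: how much of the governance work has been done?
    stewardship_coverage = sum(e["has_steward"] for e in data_elements) / len(data_elements)

    # Impact metric: what did that work change on the bottom line?
    defects_removed = sum(e["defects_before"] - e["defects_after"] for e in data_elements)
    estimated_savings = defects_removed * COST_PER_DEFECT

    print(f"Progress: {stewardship_coverage:.0%} of critical data elements have a steward")
    print(f"Impact:   {defects_removed} defects eliminated, ~${estimated_savings:,.0f} rework avoided")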

IQScoring: A Practical, Cost-effective Method to Assess and Advance IQ in Your Organization

Speaker

Rodney Schackmann
Intel

Abstract

IQScoring is the technique we’re using at Intel to assess and advance information quality across the enterprise. It is a concise, cost-effective Information Quality (IQ) assessment methodology that establishes baselines and is effective for understanding and communicating the current state of your full enterprise IQ landscape. That insight can help you determine where IQ is impacting your business and where IQ investments need to be made. We’ll describe the IQScoring method, how it can be associated with your business processes, how it can work in conjunction with other industry-recognized DQ and process optimization techniques, and how it can be used to initiate deeper and more effective IQ pursuits within your company.

We’ll explore the technique, walk you through how to perform the assessments, and discuss how to optimize it for your company.

ISO 8000 Data Quality Certification Workshop

Speaker

Gerardo Leal
ECCMA

Abstract

An information-packed tutorial on the fundamental principles of information and data quality, in preparation for ISO 8000 MDQM certification and/or to gain a better understanding of emerging global corporate trends in cataloging, master data quality and validation.

ISO 8000 is the recognized international standard for measuring data quality. It defines quality data as “portable data that meets requirements”. What exactly does this mean, and how can you use ISO 8000 to measure the quality of your data? Better still, how can ISO 8000 help you improve the quality of your data, and how can you use ISO 8000 to get better quality data from your trading partners?

This is a tutorial on the fundamental principles of information and data quality in general, and specifically as they apply to master data. This tutorial covers the curriculum required for certification by ECCMA as an ISO 8000 Master Data Quality Manager (MDQM). It will also teach you how to build a corporate dictionary and how to create the all-important data requirements that you can use to measure and improve the quality of your master data.
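
As a rough illustration of the phrase “portable data that meets requirements”, the sketch below (Python, not part of the tutorial) checks master data records against an explicitly stated data requirement and reports how many conform. The requirement definition, field names and sample records are hypothetical.

    # Minimal sketch: measuring whether master data records meet a stated data requirement.
    # The requirement, field names and sample records are illustrative assumptions.

    requirement = {
        "part_number": {"required": True, "max_len": 18},
        "unit_of_measure": {"required": True, "allowed": {"EA", "KG", "M"}},
        "manufacturer": {"required": True, "max_len": 80},
    }

    records = [
        {"part_number": "BRG-6204-2RS", "unit_of_measure": "EA", "manufacturer": "Acme Bearings"},
        {"part_number": "", "unit_of_measure": "BOX", "manufacturer": "Acme Bearings"},
    ]

    def meets_requirement(record, requirement):
        """Return True only if every field satisfies its stated rule."""
        for field, rule in requirement.items():
            value = record.get(field, "")
            if rule.get("required") and not value:
                return False
            if "max_len" in rule and len(value) > rule["max_len"]:
                return False
            if "allowed" in rule and value not in rule["allowed"]:
                return False
        return True

    conforming = sum(meets_requirement(r, requirement) for r in records)
    print(f"{conforming} of {len(records)} records meet the stated data requirement")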

 

Next Generation Data Strategy: Aligning data strategy with corporate strategy and ensuring higher levels of data quality

Speaker

Rajesh Jugulum
Cigna

Abstract

Having realized the importance of data, organizations have begun to view data as a critical asset, as important as other assets such as people, capital, raw materials and facilities. The concept of data as an asset has driven the need for dedicated data management programs. Beyond ensuring data is fit for its intended business purposes, organizations should focus on the creation of shareholder value through data activities. To achieve this, organizations should develop a data strategy that includes components such as data monetization, data innovation and data risk management. Key characteristics of a data strategy should include speed, accuracy, and precision of data management processes to help differentiate the organization from competitors. Such a data strategy should also be aligned to corporate strategy so that data requirements can be prioritized.

This session focuses on designing and developing a data strategy that creates value for shareholders while maintaining alignment with corporate objectives, and describes how the strategy can be decomposed into lower-level components with suitable design parameters to address complexity and the sequence of execution for resource planning. We will also discuss an important data strategy requirement, “built-in data quality”, in detail, with a structured methodology for achieving higher levels of data quality, supported by case studies from different industries.

How to Avoid the Most Common Mistakes Implementing Data Governance

Speaker

John Ladley
IMCUE Solutions

Abstract

Too many organizations miss the mark with getting data governance sold and started. Often the potential sponsors and stakeholders leave the presentation confused and de-motivated. John will explore several scenarios in this tutorial and will address the common errors and how to remedy the results, or avoid the issues altogether. He will address:

  • Strategies to regain management's attention, or never lose it in the first place
  • The huge challenge of old-style, embedded data management practices
  • The difference between typical and successful business cases
  • Common statements about data governance that are misunderstood or presented wrong
  • Dumb things that are still being done over and over

 

Protecting Data: Understanding and Building Resilient Security Programs

Speaker

Michael Montecillo
IBM

Abstract

We have all read about data breaches; in fact, statistically it is likely we have all been affected by them, either professionally or in our personal lives. Increasingly, it is becoming evident that businesses that lose data are not losing it through careless disregard for security but rather as a result of an increasingly sophisticated threat. This environment presents distinct challenges to businesses that recognize the importance of the confidentiality, integrity, and availability of data.

IBM leverages more than 6,000 security professionals to deliver security to more than 3,000 clients in 133 different countries. As a result, IBM has built a highly regarded knowledge base on the threats we face and how to build strategies for addressing them.

This session will draw on that expertise to answer the question, “How can we protect our data when adherence to standards is not enough?” Additionally, the session will provide attendees with the building blocks to ensure data security. The session will be broken down into three major sections:

  • The threats that businesses face and how businesses can monitor their evolution.
  • The foundation for security that businesses must have.
  • The way forward: how to deal with advanced threats and build security into continually changing business environments.

Conference Sessions

Change Your Organisation’s Culture to Make Data and Information Quality a Part of Its DNA

Speaker

Jay Zaidi
AlyData

Abstract

Welcome to the “Dawn of Data”. The volume of data is projected to grow 50-fold between now and 2020. Mobile, social, and cloud applications are creating a highly decentralised data ecosystem, with complex data sets becoming the norm. We are in the midst of a historical event – organisations are slowly transitioning from an IT-centric environment to a data-centric one.

Sensational stories of cybercrime and hackers make it easy for executives to make a strong case for investing in cybersecurity. The cost and impact of poor data quality on an organisation’s bottom line is equivalent to, or could even be higher than, that of cybercrime. However, it seldom gains much publicity, and to my knowledge no C-level executive has been fired as a result of producing or consuming bad data. This is changing as organisations mature their data management capabilities. Volume, velocity and complexity aren’t what define data; it is quality that matters, since data is used to derive insights, manage risks, innovate and drive decisions. In my advisory role, I’m seeing examples of the impact of bad quality data on organisations. Bad data is having a very tangible impact across business verticals and the government. In some instances, organisations are being shut down due to decisions that were based on poor quality data.

What is needed is a major cultural transformation within organisations – to become data-driven and focused on data and information quality. This will require support from the highest levels, long-term investment and a change management strategy. Changing the culture of an organisation is challenging, but it has been done before. In this TED-style talk, I will make a strong case for why the slogan “Data and Information Quality is Job 1” must be adopted by all organisations. I have successfully implemented an Enterprise Data Quality program at a Fortune 10 firm and advise many other organisations on how to do this successfully. This experience has given me unique insights, and I have learnt many lessons along the way. I will share these with you and discuss best practices, implementation challenges, core capabilities required in an Enterprise Data Quality Program and critical success factors, so that you can use this knowledge to influence change within your organisation. After all, quality isn’t just about slogans, but about transforming organisations and their staff so that they have a “quality mindset” and weave quality into everything they do!

Organizing for Data Quality: The Center of Excellence Approach

Speaker

Laura Sebastian-Coleman IQCP
Cigna

Abstract

Improving the quality of data within an organization requires a wide range of skills, from the ability to conduct detailed data analysis to the ability to change organizational behavior. Identifying individuals with this range of skills is nearly impossible. It takes a team to improve data quality.

This session will explore the advantages of establishing a Center of Excellence as an approach to implementing a data quality program, using the DQ CoE at Cigna as a case study.

Attendees will learn how to adapt the Center of Excellence approach by:

  • Aligning with organizational strategy
  • Defining key functions and goals for the CoE
  • Assembling a team to implement those goals
  • Long-term planning for CoE activities

Establishing a Data Quality Foundation for Master Data Management

Speakers

Deepa Krishnan
World Bank

Jennifer Trotsko
IFC

Abstract

The World Bank Group (WBG) comprises a number of public and private sector entities, including IBRD, IFC and MIGA, that share critical business and financial information. The Enterprise Master Data Management implementation at the WBG is seen as a cornerstone of our data management strategy and seeks to provide a consistent view of key business entities across the Bank Group.

In these two case studies, we will explore the steps in establishing the business case for MDM, creating an enterprise MDM strategy and roadmap, and laying an enterprise MDM foundation. We will provide a high-level overview of our requirements analysis process (including a special Data Specification Sheet developed to capture data requirements for MDM), solution design artifacts and agile implementation methodology. We will also outline the people, process and technology components that are critical to a successful MDM implementation. Finally, we will address the challenges and best practices for an initial, enterprise MDM implementation.

These sessions are expected to be useful to anyone who is just embarking on an enterprise MDM project with a focus on data quality or is looking to take their implementation to the next level by leveraging industry best practices.

The barriers to and the benefits of managing data, information and knowledge as a business asset

Speaker

James Price
Experience Matters

Abstract

Information Assets comprise the arterial system of organisations, supplying the nutrients for all business activities, processes and decisions; without information, no organisation can function. With the advent of mobility and the Cloud, IT is becoming a utility like electricity or water, and we are witnessing the death of the traditional IT shop. Yet if information is so critical to an organisation’s survival and prosperity, and if IT is being marginalised, who will be accountable for managing that vital business asset? Whilst data, information and knowledge are managed by every person in the organisation, in most organisations it is done badly and at significant cost. Global research explains why.

This presentation will help explain:

  • What Information Assets are and why they are so important
  • How Information Assets are managed
  • Why it matters to organisations and their operation and therefore why the senior leadership team should pay attention
  • Current trends and the implications for organisations and the management of their information
  • The reasons why information is managed so badly
  • What to do about it

James will draw on the findings of a research project that Experience Matters, in conjunction with the University of South Australia, is conducting into “The Barriers to and the Benefits of Managing Information as a Business Asset”. The project is being run in Australia, South Africa and the United States. Gartner has described Experience Matters’ work as “tremendous” and the project and its findings as “ground-breaking”. Mike Orzen, winner of the Shingo Prize, which has been described by Business Week as the “Nobel Prize for Operational Excellence”, has declared that it is “truly great work.” James will present findings identified by C-level executives from organisations that include, amongst many others, National Australia Bank, Sanlam, the City of Cape Town, Bell Helicopter, Boeing, Multnomah County Health and Wells Fargo.

The Role of Crowd Power in Defining Information Value Across Business & IT Functions

Speaker

Will Crump
Datum LLC

Abstract

Metadata management is a valuable IT function, but the business has different needs and expectations for defining information relevance. The growing trend in data governance is solving operational business challenges in parallel with analytical integrity. This makes for a business-centric, lean approach that metadata management has a hard time keeping pace with. In this session you’ll learn:

  • How to crowd-power the definition of data quality rules and standards from the business source of the knowledge
  • How today’s most progressive companies are using crowd-power to focus efforts in process improvement and metrics that matter
  • How this approach helps create sustainable success by more broadly and deeply engaging the data dialogue

What has data quality to do with the Internet of Things and Smart Manufacturing?

Speaker

Dan Carnahan
Rockwell Automation

Abstract

With all the hype about the Internet of Things, who is minding the store, so to speak, when it comes to data quality? We hear about autonomous vehicles, smart cities, smart factories, and numerous interconnected devices. Security, of course, is an important concern, but what about the context, content, and conveyance of the data in the many different facets of an enterprise? In this presentation, we will describe the different facets of an enterprise related to Smart Manufacturing and the Internet of Things and how standards are addressing the need for good quality data for Smart Manufacturing.

Quality Identifiers for Real Property

Speaker

Liz Green
Rel-e-vant Solutions

Abstract

Quality identifiers are central to the rapid advancement of quality data in today’s environment of “big data” and rapid information exchange. In the business of information about real property, the concepts of identification, authority and ownership of land are increasingly poised to benefit from modern geospatial technologies more than ever before.

This session will discuss the progress of the ECCMA ePROP workgroup in marshalling the ECCMA Standard 1 to illustrate immediate, real-world uses that can benefit governmental authorities, property owners and businesses engaged in real estate services.

From Root Cause Analysis to Resolution

Speaker

Danette McGilvray
Granite Falls

Abstract

Have you ever felt caught in a loop of assessing data quality, seeing the problems, but not quite getting to resolution the way you would like? You see the problems but haven’t gotten to root causes or implemented real improvements. If that’s the case, this presentation is for you. Join us to learn:

  • Techniques for identifying root causes
  • Moving to real improvements – both preventive and corrective
  • How data governance supports this phase of your data quality work
  • Typical barriers that arise and how to overcome them

Whether you are dealing with big data, little data, data in the cloud or data right here on earth, this tutorial will help. Come with your issues and be prepared to participate and share.

Achieve Data Accuracy with Master Data Governance

Speaker

Tom Taylor
Utopia Global

Abstract

Learn how to support your organization’s data governance initiatives with a proven packaged solution, find out how to get started and discover best practices utilizing SAP.

Learn how to:

  • Improve effectiveness and efficiency across the enterprise
  • Increase IT efficiency and reduce maintenance costs
  • Make strategic, accurate and timely decisions with reliable information
  • Publish and distribute data to partners, subsidiaries or divisions

With SAP Master Data Governance, you can deliver trusted information to maximize business efficiency, make smarter decisions and reduce risk to meet compliance requirements.

Enterprise Approach to Data

Speaker

Nonna Milmeister IQCP
Telstra

Abstract

Telstra is Australia’s largest telecommunications and information services company, with 16.7 million domestic retail mobile customers. Telstra continually brings new, innovative products and technologies to market across its entire portfolio. Naturally, this also brings complexity into the systems and processes that support these products – and in turn the data that drives them. The importance of data quality is amplified by emerging technologies and industry changes such as the National Broadband Network, Cloud and Big Data, and Telstra is responding to meet this challenge. Achieving improved quality in this environment requires an enterprise approach to data. This presentation will share and exchange ideas on best industry practice and describe the key components that allowed Telstra to win the 2014 Data Quality Asia Pacific Award.

Cognitive Computing from Concepts of Entities

Speaker

Dwayne Collins
Acxiom

Abstract

Characterizations of entities and their relationships that are important for marketing purposes are vastly different from characterizations of classical physical-science entities. Whereas legacy statistical and confirmatory methods have proven effective in modeling and interpreting instances of the latter case, the inherent ambiguity and inconsistencies of the former often make such modeling and computing environments much less effective and overburdened with highly granular, static business rules. This talk explores an alternative approach to both the data and the computational perspectives on marketing/social entities and relationships. This approach focuses on interpreting and processing these entities and their associated data from a conceptual context.

The talk will identify several alternative concept-based models and techniques for interpreting and computing with data in ways that are sensitive to the context of these models, and will examine the different perspectives on data quality in these contexts, such as validity, consistency, accuracy, and integrity, as well as the reevaluation of the expectations and purpose of the data.

Defining and Establishing DataOps

Speaker

Jim Barker
Winshuttle

Abstract

In an environment where individuals use applications to drive activities ranging from what book to purchase and what film to view to what temperature to heat a home, data is the critical element. To make things work, data must be correct, complete, and accurate. Many firms view data governance as a panacea for the ills of systems and organizational challenges, while other firms struggle to realize the value of these programs. This session outlines a conceptual framework that could be described as a “House of Data Governance”. This framework could be used to organize and execute a data governance program that is either just starting or needs to be reset. The goal of this conceptual framework is to establish a way to achieve DataOps.

Complying with the DATA Act of 2014

Speaker

Herschel Chandler
Information Unlimited Inc.

Abstract

The Digital Accountability and Transparency Act of 2014 (DATA Act) promises to transform the U.S. government’s finances by replacing disconnected documents with standardized and searchable data. The DATA Act requires the U.S. Treasury Department and the White House to work together to adopt government-wide data standards for the government’s spending information — everything from agencies’ financial statements to budget reports to grantee accountability to contracting databases.

Common data standards, including a government-wide recipient identifier and consistent data formats for account balances, will allow cross-agency searches and government-wide analytics for the first time.

The impact of the DATA Act, if fully implemented, will go beyond democratic accountability. The law promises to create new management tools for data-driven decision-making as well. Learn how the DATA Act is creating a new era in spending transparency for the United States — if some significant challenges are overcome.

Leveraging Real-Time Operational Metadata: Data Management’s Great Pivot

Speakers

Darrin Cunningham
Peacock

Matt Slatner
Peacock

Abstract

The Background, What we know:
Little has changed in data management over the past 20 years. Traditional solutions require multiple tools that do not integrate well, industry-specific knowledge from human resources, and a significant budget for the initial, intensive custom development. Compound this with the ongoing cost of maintaining the enterprise data management framework and we quickly realize why the data management space has been largely stagnant.

The grand thought, the big what if:
What if a data management solution could be a single framework, architected to be self-learning and self-healing, requiring very little human interaction to build and maintain? How would an enterprise go about creating a single data management framework that holistically combines a large array of data management functionality?

The Solution:
Rethinking siloed, traditional data management concepts and solutions by leveraging standards like ISO 8000 to create a base framework that combines a series of data management principles and functionality. Adding in-data-stream monitoring capabilities to evaluate inbound and outbound data feeds then creates invaluable real-time metadata and tracking information, much as the most advanced superstore diligently tracks inventory.
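
A small sketch of the in-stream monitoring idea, assuming nothing about the speakers' actual framework: a data feed is wrapped in a monitor that emits operational metadata (record counts, missing-key counts, timestamps) as records flow through, rather than profiling them after the fact. The feed contents and field names are hypothetical.

    # Minimal sketch: in-stream monitoring that emits real-time operational metadata.
    # The feed, the key field and the captured statistics are illustrative assumptions.
    from datetime import datetime, timezone

    def monitor_stream(records, key_field="material_id"):
        """Pass records through unchanged while accumulating operational metadata."""
        stats = {"records": 0, "missing_key": 0,
                 "started_at": datetime.now(timezone.utc).isoformat()}
        for record in records:
            stats["records"] += 1
            if not record.get(key_field):
                stats["missing_key"] += 1
            yield record  # the data continues downstream untouched
        stats["finished_at"] = datetime.now(timezone.utc).isoformat()
        print("operational metadata:", stats)

    inbound_feed = [{"material_id": "M-100", "qty": 5}, {"material_id": "", "qty": 2}]
    processed = list(monitor_stream(inbound_feed))  # metadata is printed once the feed is drained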

Conclusion: What does it mean?
Data management as we know it can change for the better. The data management space doesn’t pause to give us a chance to catch up to the problems and challenges that it presents. This fact is one of the drivers of an evolution in data quality. Embracing standards such as ISO 8000 significantly helps reduce the data management noise by bringing incredible data standardization across industries.

An ISO framework for Data Governance

Speaker

Timothy King
Babcock International

Abstract

  • Introduction to a proposed framework to sit at the heart of ISO 8000, which is the international standard for data quality and has to date been a diverse collection of different parts
  • How the framework puts all the parts of ISO 8000 into a coherent overall approach and enables users of the standard to select which parts are relevant to a particular issue
  • How the framework builds on the core capabilities of ISO 9000 to underpin a holistic, systematic approach to data quality, appropriate for all types of data and all types of information and communications technology
  • How the framework focusses on computer-processable data specifications as an essential step in ensuring data meet requirements
  • How any organization can exploit the framework in creating solutions to ensure the right individuals across the organization are able to make the right decisions at the right time in all processes because data meet requirements

Building User-Focused Data Models That Work

Speaker

Seth Maislin
Earley Information Science

Abstract

Foundational to any information process, and necessary before personalization or automation can ever happen, content needs to be modeled with the right balance of precision and practicality. In this session, Seth Maislin will demonstrate:

  • deciding when a model is needed, and how many to create
  • proven techniques for developing thorough, user-focused models through use cases
  • how subjectivity in a model can be helpful
  • using a model to develop taxonomy, IA, and search

Standards and behaviours – some key factors for managing data effectively

Speaker

Julian Schwarzenbach
Data and Process Advantage

Abstract

Managing data effectively requires the establishment and running of a comprehensive and well-structured approach coupled with ongoing monitoring and refining. There are a number of standards that provide key components in an effective overall approach to data management:

  • ISO 8000-150 provides an effective framework for data quality management that can be expanded to encompass data governance (expands on the main presentation)
  • Security – PAS 1192-5 is a recently released specification that defines an overall approach to the assessment and management of information security, particularly when information has to be shared across organisations
  • Data transfer/interoperability – ISO 15926 and COBie are two standard approaches to share data between organisations. These will be explained in more detail and will include some learning points from a data interoperability Proof of Concept

This workshop will additionally consider a number of the behavioural/human factors that impact data quality:

  • Comparison of attitudes to data with attitudes to health and safety
  • Use of the unique Data Zoo concept to explore typical behaviours towards data, the factors that drive these behaviours and how to change behaviours

Data Validation in the IRS Compliance Data Warehouse (CDW): Ideal vs. Actual

Speaker

Robin Rappaport
IRS

Abstract

The data quality journey at the U.S. Internal Revenue Service (IRS) Research, Analysis, and Statistics (RAS) organization began in 2005 for the Compliance Data Warehouse (CDW), which was started in 1997. As the largest IRS database, CDW provides data, metadata, tools, training and computing services to hundreds of research analysts working to improve tax administration. Data validation is a necessary first step in any data analysis or analytic project. RAS is presently evaluating existing data validation processes to identify and resolve data processing errors introduced during data load (missing, misaligned, and duplicate records). This presentation will share findings, discuss the ideal vs. the actual, and consider what this means for researchers using CDW data.

  • What it means to be a data repository
  • Why it is important to understand the data
  • What would ideal data validation look like
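
As an illustration of the kinds of data-load checks described above, here is a minimal sketch (Python, not IRS code) that flags missing, misaligned and duplicate rows after a load. The table layout, expected row count and key column are hypothetical.

    # Minimal sketch: post-load validation for missing, misaligned and duplicate rows.
    # Expected counts, column layout and the key column are illustrative assumptions.

    EXPECTED_ROWS = 5
    EXPECTED_COLUMNS = 3  # e.g. (return_id, tax_year, amount)

    loaded_rows = [
        ("R001", 2014, 1200.00),
        ("R002", 2014, 530.50),
        ("R002", 2014, 530.50),   # duplicate key
        ("R003", 2014),           # misaligned: a column was dropped during load
    ]

    missing = EXPECTED_ROWS - len(loaded_rows)                      # rows lost during load
    misaligned = [row for row in loaded_rows if len(row) != EXPECTED_COLUMNS]

    seen, duplicates = set(), []
    for row in loaded_rows:
        key = row[0]
        if key in seen:
            duplicates.append(row)
        else:
            seen.add(key)

    print(f"missing rows: {missing}, misaligned rows: {len(misaligned)}, duplicate keys: {len(duplicates)}")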

The Case for the CDO

Speaker

Peter Aiken
Data Blueprint

Abstract

Reflections on Chief Information Officers (CIOs), combined with decisive performance measurements, indicate that IT management has been asked to do a job that it cannot do well. Data are assets that deserve to be managed as professionally and aggressively as comparable organizational assets. Studies show that approximately 10% of organizations achieve a positive return on their investments in data. In the face of the accelerating “data explosion,” this leaves most organizations unprepared to leverage a non-degrading, strategic asset. The redress assigns this vital, lacking function to its rightful owner and driver: the business. Transformation may require some organizational discomfort. We are confident that organizations that successfully create CDOs will achieve improved organizational performance, and further that these results will come directly and obviously from better organizational data management stemming from CDO leadership.

Fundamentals of Entity Resolution and Master Data Life Cycle Management

Speaker

John Talburt IQCP
Black Oak Analytics

Abstract

Proper management of master data is a critical component of any enterprise information system. However, effective master data management (MDM) requires that both IT and Business understand the life cycle of master data and the fundamental principles of entity resolution (ER).

This presentation provides a high-level overview of current practices in data matching, record linking, and entity information life cycle management that are foundational to building an effective strategy to improve data integration and MDM. Particular topics of coverage are:

  1. The CSRUD Life Cycle of master data
  2. The need for ongoing ER analytics
  3. How to systematically measure ER performance (see the sketch after this list)
  4. The importance of investing in clerical review and asserted resolution for continuous improvement
  5. Addressing the challenges of large-scale ER and MDM through distributed processing
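
The sketch below (referenced in item 3) uses a toy matching rule and hand-labelled truth pairs, none of it from the presenter's material, to show the flavour of both steps: linking records that appear to refer to the same entity, and measuring matcher performance against known true matches.

    # Minimal sketch: a toy matching rule plus precision/recall against labelled truth.
    # The records, the matching rule and the "true" pairs are illustrative assumptions.
    from itertools import combinations

    records = {
        1: {"name": "JOHN Q SMITH", "zip": "72201"},
        2: {"name": "JOHN SMITH",   "zip": "72201"},
        3: {"name": "JANE DOE",     "zip": "10001"},
    }

    def is_match(a, b):
        """Toy rule: same ZIP code and same surname token."""
        return a["zip"] == b["zip"] and a["name"].split()[-1] == b["name"].split()[-1]

    predicted = {(i, j) for i, j in combinations(sorted(records), 2) if is_match(records[i], records[j])}
    true_matches = {(1, 2)}  # hand-labelled truth, e.g. from clerical review

    precision = len(predicted & true_matches) / len(predicted) if predicted else 1.0
    recall = len(predicted & true_matches) / len(true_matches)
    print(f"linked pairs: {sorted(predicted)}, precision={precision:.2f}, recall={recall:.2f}")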

An Objective Approach to Prioritize Data Migration Activities

Speaker

Christopher Heien
Cigna

Abstract

A functional need often arising from the creation of an Enterprise Data Management team, or Chief Data Office, is the consolidation of data into a centralized, enterprise warehouse of certified data. This presentation demonstrates the applied use of key metadata components (e.g. data inventory, lineage) along with documented business needs to objectively quantify and prioritize the order of migration of data from existing systems to a centralized, enterprise warehouse.
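
A minimal sketch of one way such an objective prioritization could be computed (the presentation's actual scoring model is not shown here): each source system receives a weighted score from its documented business need and metadata attributes, and migration proceeds in descending score order. The attributes, weights and system names are hypothetical.

    # Minimal sketch: rank source systems for migration by a weighted metadata score.
    # The attributes, weights and systems are illustrative assumptions.

    WEIGHTS = {"business_priority": 0.5, "consumer_count": 0.3, "lineage_documented": 0.2}

    systems = [
        {"name": "claims_legacy", "business_priority": 9, "consumer_count": 14, "lineage_documented": 1},
        {"name": "provider_db",   "business_priority": 6, "consumer_count": 3,  "lineage_documented": 0},
        {"name": "billing_ods",   "business_priority": 8, "consumer_count": 9,  "lineage_documented": 1},
    ]

    def migration_score(system):
        """Weighted sum of the metadata attributes used to prioritize migration order."""
        return sum(WEIGHTS[attr] * system[attr] for attr in WEIGHTS)

    for system in sorted(systems, key=migration_score, reverse=True):
        print(f"{system['name']}: score {migration_score(system):.1f}")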

The Noetic/ECCMA Financial Dictionary

Speakers

Justin Magruder
Noetic Partners

Diane Schmidt
Noetic Partners

Justin Magruder is President of Noetic Partners and Chairman of the Financial Content Standardization Council. His specialties include customer insights and predictive analytics, product design and analytics, market data, low latency data, making big data seem small, data warehouse design, reference & master data management & architecture for collateral management, client & counterparty, product, instrument, pricing, data quality management, flow, profiling & processing, market data analysis, selection and sourcing, metadata management, ETL architecture, transaction data management, financial market data, and financial transaction process management.

 

Diane Schmidt, Managing Director at Noetic Partners, is an accomplished information management and financial services executive with extensive experience in operations, technology, analytics, and strategic information management roles. She has broad knowledge of financial data products, services and information management techniques, and has been managing “big data” for the last 15 years.

As a Managing Director with Noetic Partners, Diane provides executive level guidance to form and deliver data management strategies, solution management and execution plans for global financial institutions with a focus on financial, operations, control and information technology processes. She is one of the co-creators of the Noetic Active Data Governance ™ Program and the Noetic Master Model (NMM).

The Role and Benefits of Data Validation in Supply Chain Optimization

Speaker

Daryl Crockett
ECCMA and ValidDatum

Abstract

While there is much focus these days on data quality rules and technological solutions for master data governance, we will explore a rarely discussed (but highly beneficial) practice of external validation of buy-side supply chain master data.

In this session, we will do a deep dive into the specifics of:

  • Distinguishing between internal and external validation and the best indicated use of each
  • Understanding the different methods, costs and benefits of external and third-party supply chain master data validation
  • Review of a phased data governance implementation plan approach which includes data validation

Industrial Quality Data is no longer optional, it is a mandatory requirement

Speaker

Peter Benson
ECCMA

Abstract

In 2013, ECCMA conducted a pilot project for the Government of the Kingdom of Saudi Arabia to see if applying data quality principles to industrial data could improve the Government’s ability to influence and monitor the progress of localization. What they learned in 2013 and what they are planning now will have a profound effect not only on all industry in the Middle East but also on manufacturers worldwide.

This is a case study that will review the findings of the original 2013 pilot project as well as the scope of the application of ISO 8000 to the creation and maintenance of the four major databases in the national industrial data program.

Expert panels

Tuesday Expert Panel: Data Quality — Are we there yet?

Moderator

Peter Benson
ECCMA

Abstract

With over 225 million hits for “data quality” on Google, it clearly appears to be a subject on everyone’s mind. We have an international standard that tells us what data quality means and how we measure it, but the question remains: are we there yet? If not, how much farther do we have to go, and does anyone have a map? This panel will look at how far we have come, what the destination could actually look like when we get there and, of course, whether there are any clear signs of the path ahead.

 

Panel Members

Dan Carnahan
Rockwell Automation

Jay Zaidi
AlyData

Justin Magruder
Noetic Partners

Rodney Schackmann
Intel

Liz Green
Rel-e-vant Solutions

Wednesday Expert Panel: The Future of Information Quality

Moderator

Lwanga Yonke
IAIDQ

Abstract

This year marks the 20th anniversary of the academic International Conference on Information Quality (ICIQ), founded at MIT. IAIDQ celebrated its 10th anniversary in 2014. The UALR Information Quality Graduate Program is nine years old. The ISO 8000 data quality standard is now six years old. These and other milestones give us an opportunity to pause and reflect on the current state and future of Information Quality (IQ).

The distinguished panel we have assembled will tackle the following questions and more:

  • Data science and advanced analytics are re-energizing the focus on the strategic value of data; what opportunities does that trend provide to information quality practitioners?
  • What new information quality and data governance challenges are brought about by Big Data? What new questions arise that must be answered?
  • Are Chief Data Officers (CDOs) meant to also serve as Chief Information Quality Officers (CIQOs)?
  • What is the current status and future of information quality education at the university level?
  • How do we attract and retain the next generation of IQ practitioners? What will draw Millennials?
  • What structures need to be put in place so IQ becomes an even stronger force that shapes Business in the future?
  • What goals do we set for the next 5 years? The next 10 years?
  • What can we individually commit to do?

This will be a fast-paced, thought-provoking and insightful conversation, with lots of audience participation. Do not miss it!

Panel Members

Danette McGilvray
Granite Falls

Dinah Mande
Aera Energy

Nonna Milmeister IQCP
Telstra

Julian Schwarzenbach
Data & Process Advantage

John Talburt IQCP
Black Oak Analytics
