
Build a Data Integration Strategy

Integrate your data or disintegrate your business.

  • As organizations process more information at faster rates, there is increased pressure for faster and more efficient data integration.
  • Data integration is becoming more and more critical for downstream functions of data management and for business operations to be successful. Poor integration holds back these critical functions.

Our Advice

Critical Insight

  • Every IT project requires data integration. Regardless of the current problem and the solution being implemented, any change in the application and database ecosystem requires you to solve a data integration problem.
  • Data integration problem solving needs to start with business activity. After understanding the business activity, move to application and system integration to drive the optimal data integration activities.
  • Data integration improvement needs to be backed by solid requirements that depend on the use case. Info-Tech’s use cases will help you identify your organization’s requirements and integration architecture for its ideal data integration solution.

Impact and Result

  • Create a data integration solution that supports the flow of data through the organization and meets the organization’s requirements for data latency, availability, and relevancy.
  • Build your data integration practice with a firm foundation in governance and reference architecture; use best-fit reference architecture patterns and the related technology and resources to ensure that your process is scalable and sustainable.
  • The business’ uses of data are constantly changing and evolving, and as a result, the integration processes that ensure data availability must be frequently reviewed and repositioned in order to continue to grow with the business.

Build a Data Integration Strategy Research & Tools

Start here – read the Executive Brief

Read our concise Executive Brief to find out why your organization should improve its data integration, review Info-Tech’s methodology, and understand how we can help you create a loosely coupled integration architecture.

1. Collect integration requirements

Identify data integration pains and needs and use them to collect effective business requirements for the integration solution.

2. Analyze integration requirements

Determine technical requirements for the integration solution based on the business requirement inputs.

3. Design the data-centric integration solution

Determine your need for a data integration proof of concept, and then design the data model for your integration solution.


Member Testimonials

After each Info-Tech experience, we ask our members to quantify the real-time savings, monetary impact, and project improvements our research helped them achieve. See our top member experiences for this blueprint and what our clients have to say.

9.0/10 Overall Impact

$22,159 Average $ Saved

10 Average Days Saved

Client | Experience | Impact | $ Saved | Days Saved
Clyde & Co LLP | Guided Implementation | 7/10 | N/A | 5
Recorded Books | Guided Implementation | 10/10 | N/A | N/A
Barnardos Australia | Guided Implementation | 10/10 | $43,999 | 18
Academic Partnerships | Guided Implementation | 9/10 | $10,079 | 9
CAF - Corporacion Andina de Fomento | Guided Implementation | 9/10 | $12,399 | 9
SThree Management Services Ltd. | Guided Implementation | 8/10 | N/A | 1
Construction Resources Management | Guided Implementation | 8/10 | $12,599 | 5
Remedi SeniorCare | Guided Implementation | 8/10 | $1,115 | 2
NASA | Guided Implementation | 10/10 | N/A | 20
ChoiceTel | Guided Implementation | 10/10 | $14,259 | 23
Bush Brothers & Company | Guided Implementation | 8/10 | N/A | N/A
Helmerich & Payne, Inc. | Workshop | 8/10 | N/A | N/A
Mott MacDonald LLC | Guided Implementation | 10/10 | N/A | N/A
Broome-Tioga Boces | Guided Implementation | 9/10 | N/A | N/A
Kamehameha Schools | Guided Implementation | 7/10 | N/A | N/A


Workshop: Build a Data Integration Strategy

Workshops offer an easy way to accelerate your project. If you are unable to do the project yourself, and a Guided Implementation isn't enough, we offer low-cost delivery of our project workshops. We take you through every phase of your project and ensure that you have a roadmap in place to complete your project successfully.

Module 1: Collect Integration Requirements

The Purpose

  • Explain approach and value proposition.
  • Review the common business drivers and how the organization is driving a need to optimize data integration.
  • Understand Info-Tech’s approach to data integration.

Key Benefits Achieved

  • Current integration architecture is understood.
  • Priorities for tactical initiatives in the data architecture practice related to integration are identified.
  • Target state for data integration is defined.

Activities and Outputs

1.1 Discuss the current data integration environment and the pains that are felt by the business and IT.

1.2 Determine what the problem statement and business case look like to kick-start a data integration improvement initiative.

1.3 Understand data integration requirements from the business.

  • Output: Data Integration Requirements Gathering Tool

Module 2: Analyze Integration Requirements

The Purpose

  • Understand what the business requires from the integration solution.
  • Identify the common technical requirements and how they relate to business requirements.
  • Review the trends in data integration to take advantage of new technologies.
  • Brainstorm how the data integration trends can fit within your environment.

Key Benefits Achieved

  • Business-aligned requirements gathered for the integration solution.

Activities and Outputs

2.1 Understand what the business requires from the integration solution.

  • Output: Data Integration Requirements Gathering Tool

2.2 Identify the common technical requirements and how they relate to business requirements.

  • Output: Data Integration Trends Presentation

Module 3: Design the Data-Centric Integration Solution

The Purpose

  • Learn about the various integration patterns that support organizations’ data integration architecture.
  • Determine the pattern that best fits within your environment.

Key Benefits Achieved

  • Improvement initiatives are defined.
  • Improvement initiatives are evaluated and prioritized to develop an improvement strategy.
  • A roadmap is defined to depict when and how to tackle the improvement initiatives.

Activities and Outputs

3.1 Learn about the various integration patterns that support organizations’ data integration architecture.

  • Outputs: Integration Reference Architecture Patterns; Data Integration POC Template

3.2 Determine the pattern that best fits within your environment.

  • Output: Data Integration Mapping Tool

Build a Data Integration Strategy

Integrate your data or disintegrate your business.

ANALYST PERSPECTIVE

Integrate your data or disintegrate your business.

"Point-to-point integration is an evil that builds up over time due to ongoing business changes and a lack of integration strategy. At the same time, most businesses are demanding consistent, timely, and high-quality data to fuel business processes and decision making.

A good recipe for successful data integration is to discover the common data elements to share across the business by establishing an integration platform and a canonical data model.

Place yourself in one of our use cases and see how you fit into a common framework to simplify your problem and build a data-centric integration environment to eliminate your data silos."

Rajesh Parab, Director, Research & Advisory Services

Info-Tech Research Group

Our understanding of the problem

This Research Is Designed For:

  • Data engineers feeling the pains of poor integration from inaccuracies and inefficiencies during the data integration lifecycle.
  • Business analysts communicating the need for improved integration of data.
  • Data architects looking to design and facilitate improvements in the holistic data environment.
  • Data architects putting high-level architectural design changes into action.

This Research Will Also Assist:

  • CIOs concerned with the costs, benefits, and the overall structure of their organization’s data flow.
  • Enterprise architects trying to understand how improved integration will affect overall organizational architecture.

This Research Will Help You:

  • Understand what integration is, and how it fits into your organization.
  • Identify opportunities for leveraging improved integration for data-driven insights.
  • Design a loosely coupled integration architecture that is flexible to changing needs.
  • Determine the needs of the business for integration and design solutions for the gaps that fit the requirements.

This Research Will Help Them:

  • Get a handle on the current data situation and how data interacts within the organization.
  • Understand how data architecture affects operations within the enterprise.

Executive summary

Situation

  • As organizations process more information at faster rates, there is increased pressure for faster and more efficient data integration.
  • Data integration is becoming more and more critical for downstream functions of data management and for business operations to be successful. Poor integration holds back these critical functions.

Complication

  • Investments in integration can be a tough sell for the business, and it is difficult to get support for integration as a standalone project.
  • Evolving business models and uses of data are growing rapidly at rates that often exceed the investment in data management and integration tools. As a result, there is often a gap between data availability and the business’ latency demands.

Resolution

  • Create a data-centric integration solution that supports the flow of data through the organization and meets the organization’s requirements for data accuracy, relevance, availability, and timeliness.
  • Build your data-centric integration practice with a firm foundation in governance and reference architecture; use best-fit reference architecture patterns and the related technology and resources to ensure that your process is scalable and sustainable.
  • The business’ uses of data are constantly changing and evolving, and as a result the integration processes that ensure data availability must be frequently reviewed and repositioned to continue to grow with the business.

Info-Tech Insight

  1. Every IT project requires data integration. Any change in the application and database ecosystem requires you to solve a data integration problem.
  2. Integration problem solving needs to start with business activity. After understanding the business activity, move to application and system integration to drive optimal data integration activities.
  3. Integration initiatives need to be backed by requirements that depend on use cases. Info-Tech’s use cases will help identify organizational requirements and the ideal data-centric integration solution.

Your data is the foundation of your organization’s knowledge and ability to make decisions

Integrate the Data, Not the Applications

Data is one of the most important assets in a modern organization. Contained within an organization’s data are the customers, the products, and the operational details that make an organization function. Every organization has data, and this data might serve the needs of the business today.

However, the only constant in the world is change. Changes in addresses, amounts, product details, partners, and more occur at a rapid rate. If your data is isolated, it will quickly become stale. Getting up-to-date data to the right place at the right time is where data-centric integration comes in.

"Data is the new oil." – Clive Humby, Chief Data Scientist Source: Medium, 2016

The image shows two graphics. The top shows two sets of circles with an arrow pointing to the right between them: on the left, there is a large centre circle with the word APP in it, and smaller circles surrounding it that read DATA. On the right, the large circle reads DATA, and the smaller circles, APP. The lower graphic also shows two sets of circles with an arrow pointing to the right between them; this time, the largest circle envelops the smaller circles. The circle on the left has a larger circle in the centre that reads Apple Watch Heart Monitoring App, and smaller circles around it labelled with types of data. The circle on the right has a larger circle in the centre that reads Heart Data, and the smaller circles are labelled with types of apps.

Organizations are having trouble keeping up with the rapid increases in data growth and complexity

To keep up with increasing business demands and profitability targets and decreasing cost targets, organizations are processing and exchanging more data than ever before.

To get more value from their information, organizations are relying on more and more complex data sources. These diverse data sources have to be properly integrated to unlock the full potential of your data:

The most difficult integration problems are caused by semantic heterogeneity (Database Research Technology Group, n.d.).

80% of business decisions are made using unstructured data (Concept Searching, 2015).

85% of businesses are struggling to implement the correct integration solution to accurately interpret their data (KPMG, 2014).

Break Down Your Silos

Integrating large volumes of data from the many varied sources in an organization has incredible potential to yield insights, but many organizations struggle with creating the right structure for that blending to take place, and data silos form.

Data-centric integration capabilities can break down organizational silos. Once data silos are removed and all the information that is relevant to a given problem is available, problems with operational and transactional efficiencies can be solved, and value from business intelligence (BI) and analytics can be fully realized.

Data-centric integration is the solution you need to bring data together to break down data silos

On one hand…

Data has massive potential to bring insight to an organization when combined and analyzed in creative ways.

On the other hand…

It is difficult to bring data together from different sources to generate insights and prevent stale data.

How can these two ideas be reconciled?

Answer: Info-Tech’s Data Integration Onion Framework summarizes an organization’s data environment at a conceptual level, and is used to design a common data-centric integration environment.

Info-Tech’s Data Integration Onion Framework

The image shows Info-Tech's Data Integration Onion Framework. It is a circular graphic with a series of concentric rings, each representing a category and containing specific examples of items within those categories.

Poor integration will lead to problems felt by the business and IT

The following are pains reported by the business due to poor integration:

59% of managers said they experience missing data every day due to poor distribution, in data sets that are valuable to their central work functions. (Experian, 2016)

42% reported accidentally using the wrong information at least once a week. (Computerworld, 2017)

37% Although 85% of companies are trying to be more data driven, only 37% have achieved that goal. (Information Age, 2019)

"I never guess. It is a capital mistake to theorize before one has data. Insensibly one begins to twist facts to suit theories, instead of theories to suit facts." – Sir Arthur Conan Doyle, Sherlock Holmes

Poor integration can make IT less efficient as well:

90% Of all company-generated data is “dark.” Getting value out of dark data is not difficult or costly. (Deloitte Insights, 2017)

5% As data sits in a database, up to 5% of customer data changes per month. (Data.com, 2016)

"Most traditional machine learning techniques are not inherently efficient or scalable enough to handle the data. Machine learning needs to reinvent itself for big data processing primarily in pre-processing of data." – J. Qiu et al., ‎2016

Understand the common challenges of integration to avoid the pains

There are three types of challenges that organizations face when integrating data:

1. Disconnect from the business

Poor understanding of the integration problem and requirements leads to integrations being built that are not effective at delivering quality data.

50% of project rework is attributable to problems with requirements. (Info-Tech Research Group)

45% of IT professionals admit to being “fuzzy” about the details of a project’s business objectives. (Blueprint Software Systems Inc., 2012)

2. Lack of strategy

90% Of organizations will lack an integration strategy through to 2018. (Virtual Logistics, 2017)

Integrating data without a long-term plan is a recipe for point-to-point integration spaghettification:

The image shows two columns of rectangles, each with the word Application Services. Between them are arrows, matching boxes in one column to the other. The lines of the arrows are curvy.
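The spaghettification problem above has a simple arithmetic root: point-to-point links grow with the square of the number of systems, while a shared integration platform grows linearly. A minimal illustrative calculation (not from the blueprint) makes the difference concrete:

```python
# Illustrative only: compare integration link counts as applications are added.
# Point-to-point needs a distinct link for every pair of systems, while a
# hub-and-spoke (integration platform) pattern needs one adapter per system.

def point_to_point_links(n: int) -> int:
    """Distinct bidirectional links when every system talks to every other."""
    return n * (n - 1) // 2

def hub_links(n: int) -> int:
    """One adapter per system when all systems integrate through a hub."""
    return n

for n in (5, 10, 20, 50):
    print(f"{n:>3} apps: point-to-point={point_to_point_links(n):>5}, hub={hub_links(n):>3}")
# 50 applications already means 1,225 point-to-point links, but only 50 adapters.
```

This is why integrating "one more feed" ad hoc feels cheap each time, yet the total maintenance burden compounds without a strategy.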

3. Data complexity

Data architects and other data professionals are increasingly expected to be able to connect data using whatever interface is provided, at any volume, and in any format – all without affecting the quality of the data.

36% Of developers report problems integrating data due to different standards interpretations. (DZone, 2015)

These challenges lead to organizations building a data architecture and integration environment that is tightly coupled.

A loose-coupling integration strategy helps mitigate the challenges and realize the benefits of well-connected data

Loose Coupling

Most organizations don’t have the foresight to design their architecture correctly the first time. In a perfect world, organizations would design their application and data architecture to be scalable, modular, and format-neutral – like building blocks.

Benefits of a loosely coupled architecture:

  • Increased ability to support business needs by adapting easily to changes.
  • Added ability to incorporate new vendors and new technology due to increased flexibility.
  • Potential for automated, real-time integration.
  • Elimination of re-keying/manual entry of data.
  • Federation of data.

Vs. Tight Coupling

However, this is rarely the case. Most architectures are more like a brick wall – permanent, hard to add to and subtract from, and susceptible to weathering.

Problems with a tightly coupled architecture:

  • Delays in combining data for analysis.
  • Manual or suboptimal data integration in the face of changing business needs.
  • Lack of federation.
  • Lack of flexibility.
  • Fragility of integrated platforms.
  • Limited ability to explore new functionalities.
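One common way to achieve the loose coupling described above is a canonical data model: each source system gets one adapter that maps its native record into a shared shape, so consumers depend only on the canonical model, never on a specific source format. The sketch below is a minimal, hypothetical illustration (the system and field names are invented, not from the blueprint):

```python
# Minimal sketch of loose coupling via a canonical data model.
# Field and system names (CRM, billing) are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class CanonicalCustomer:
    customer_id: str
    full_name: str
    email: str

def from_crm(record: dict) -> CanonicalCustomer:
    # The hypothetical CRM exposes "id" plus separate "first"/"last" fields.
    return CanonicalCustomer(
        customer_id=str(record["id"]),
        full_name=f"{record['first']} {record['last']}",
        email=record["email"],
    )

def from_billing(record: dict) -> CanonicalCustomer:
    # The hypothetical billing system exposes "acct_no" and a single "name".
    return CanonicalCustomer(
        customer_id=record["acct_no"],
        full_name=record["name"],
        email=record["contact_email"],
    )

# Consumers work only against CanonicalCustomer; adding a new source means
# writing one adapter, not rewiring every consumer (the "building blocks" idea).
crm_row = {"id": 42, "first": "Ada", "last": "Lovelace", "email": "ada@example.com"}
billing_row = {"acct_no": "42", "name": "Ada Lovelace", "contact_email": "ada@example.com"}
assert from_crm(crm_row).full_name == from_billing(billing_row).full_name
```

In a tightly coupled architecture, by contrast, every consumer would encode each source's native format directly, so a schema change in one system ripples through the whole wall of bricks.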

To make the most of your data, the four Vs of data must be enabled by technology

Big, fast data. The four Vs of data are typically used to describe big data. Big data architecture may require you to evaluate your existing technologies and capabilities.

1/3 Less than one-third of companies are able to deliver real-time data, despite over 50% needing it. (IOUG, 2016)

45% of companies say data quality is an inhibitor of data delivery. (IOUG, 2016)

80% of business decisions are made using unstructured data. (Concept Searching, 2015)

1TB 55% of companies move up to 1 TB of data to analytical systems per day. (IOUG, 2016)

Velocity - Speed of data.

Veracity - Trustworthiness of data.

Variety - Diversity of data.

Volume - Scale of data.

"Architecture coupling is now a mainstream term but data coupling is rarely used. As data is the new gold, we should ensure it does not get polluted by contaminants on the way in or way out of storage." – Andy Neill, Sr. Director of Research, Info-Tech

Velocity of accurate data transmission is critical to making sure the right data gets to the right people at the right time

Velocity...is a challenge

Data velocity is:

  1. Speed at which it is generated.
  2. Frequency at which it is updated.
  3. Rate at which it is delivered.

In a business world that increasingly demands real-time insights, decision making continues to be inhibited by incomplete and slow-moving information. Enterprises are weighed down by inadequate performance, siloed data, and slow response times. A new data architecture and new approaches to integration are needed. (IOUG, 2016)

1/3 One-third of data managers say that it takes more than an hour to run reports. (IOUG, 2016)

10% 1 in 10 data managers have to wait more than one business day to receive their reports. (IOUG, 2016)

Data integration is one of the components of the DAMA DMBOK2 Framework

This research is created with reference to the Data Management Association’s Data Management Body of Knowledge, Version 2 (DAMA DMBOK2).

The DAMA DMBOK2 Data Management Framework

Data Governance

  • Data Architecture
  • Data Modeling & Design
  • Data Storage & Operations
  • Data Security
  • Data Integration & Interoperability
  • Documents & Content
  • Reference & Master Data
  • Data Warehousing & Business Intelligence
  • Meta-data
  • Data Quality

Data management is the planning, execution, and oversight of policies, practices, and projects that acquire, control, protect, deliver, and enhance the value of data and information assets (DAMA, 2014).

In other words, getting the right information to the right people at the right time.

The research in this blueprint focuses on data-centric integration and interoperability, one of the essential components of a comprehensive data management practice.

Info-Tech’s Data Integration Onion Framework helps create a loosely coupled environment

Avoid pitfalls of integration.

This methodology will provide you with the strategy needed to address the common challenges of integration and create a loosely coupled integration architecture. By engaging the crucial roles and understanding what is needed first from the business and then from a technical perspective, you will generate integration solutions that serve the needs of the business in the present and that are scalable, agile, and responsive to change in the future.

"A data-centric architecture, shortened from database centric, is an approach to software design that considers the data within a system as the most important component." – S. Ratnasamy et al.

The image shows the same Info-Tech Data Integration Onion Framework graphic.

Use the Data Integration Reference Architecture to establish your customized data integration architecture

Every organization’s data integration requires a unique design and an integration approach that fits its business and technology environment. Therefore, it is difficult to paint a picture of a universally ideal model. However, when data integration is broken down in terms of layers, there exists a general structure that is applicable to all data organizations.

The image shows the Data Integration Reference Architecture graphic, which shows the layers of data integration.

Identify where you fit into the data-centric integration personas and their associated roles

Do any of these roles sound like you?

Info-Tech’s methodology for data-centric integration projects outlines key processes and steps designed to be relevant for the people of modern organizations. Look for the following identifiers as you work through this blueprint to identify where we recommend the associated personas be involved in the integration development process.

Data Engineer - Works with and analyzes data to generate reports to support business decisions.

Business Analyst - Communicates with business to identify requirements and satisfaction. Demonstrates competencies for stakeholder management, analytical techniques, and the ability to “speak the language” of both the business and IT.

Data Architect - Understands the data environment in a holistic manner and designs solutions. Has a greater knowledge of operational and analytical data use cases. Is a hands-on expert in data management and warehousing technologies.

Info-Tech Insight

The above titles are roles. If the exact titles do not exist in your organization, or if one person performs multiple activities across the descriptions, assign the people whose skills best align with the project’s objectives.

Info-Tech will help you identify which of the four common integration use cases fit your project

Every IT project lives or dies by integration.

Whether data is being used to support business processes or to make strategic decisions, effective integration is built on the same process. However, there are unique considerations for each of the four common scenarios or use cases that drive organizations to improve their data-centric integration.

"Regardless of the nature of the integration project, whether it is for BI or for ERP, the same methodology applies." – Wayne Regier, Director of Data Management, Husky Injection Molding Systems

The image shows a graphic that has a downward-pointing arrow on the left, labelled Use Case. To the right of the arrow are 3 columns, labelled Phase 1, Phase 2, and Phase 3. Under the Use Case text is another label: All Use Cases. To the right of that text is a green line that runs horizontally under the columns, split into sections.

Orchestration

Support business activities, processes, or workflows.

Analytics

Pull data together into a data warehousing environment for analysis.

Conglomerate

Merger and acquisition activities.

Legacy

Application rationalization or migration.

Info-Tech offers various levels of support to best suit your needs

DIY Toolkit

“Our team has already made this critical project a priority, and we have the time and capability, but some guidance along the way would be helpful.”

Guided Implementation

“Our team knows that we need to fix a process, but we need assistance to determine where to focus. Some check-ins along the way would help keep us on track.”

Workshop

“We need to hit the ground running and get this project kicked off immediately. Our team has the ability to take this over once we get a framework and strategy in place.”

Consulting

“Our team does not have the time or the knowledge to take this project on. We need assistance through the entirety of this project.”

Diagnostics and consistent frameworks used throughout all four options

Build a Data Integration Strategy – project overview

Best-Practice Toolkit

Phase 1: Collect Integration Requirements

1.1 Identify Integration Pains and Needs

1.2 Collect Business Requirements for the Integration Solution

Phase 2: Analyze Integration Requirements

2.1 Determine Technical Requirements for the Integration Solution

2.2 Leverage Integration Trends to Address Requirements

2.3 Architect the Data-Centric Integration Strategy

2.4 Calculate ROI to Attach Dollar Value

Phase 3: Design the Data-Centric Integration Solution

3.1 Validate Your Data-Centric Integration Pattern

3.2 Design the Consolidated Data Model

3.3 Map Source to Target Model

3.4 Capture Integration Metadata

Guided Implementations
  • Learn about the concepts of data integration and the common integration use cases.
  • Understand what drives the business to need improved data integration, and how to collect integration requirements.
  • Determine the technical requirements for the integration solution.
  • Learn about and understand the differences between trends in data integration, as well as how they can benefit your organization.
  • Determine your ideal integration pattern.
  • Start with a proof of concept (PoC) to validate your integration design.
  • Learn about the source to target mapping tool, and how to create your own.
  • Learn about integration metadata and what metadata to capture.
Onsite Workshop

Module 1: Collect Integration Requirements

Module 2: Analyze Integration Requirements

Module 3: Design the Data-Centric Integration Solution

Workshop overview

Contact your account representative or email Workshops@InfoTech.com for more information.

Planning and Preparation Workshop Day 1 Workshop Day 2 Workshop Day 3 Workshop Day 4
Activities

Pre-Work

1.1 Discuss the current data integration environment and the pains that are felt by both the business and IT in relation to data integration.

1.2 Determine what the problem statement and business case look like to kick-start a data integration improvement initiative.

1.3 Understand data integration requirements from the business.

Collect Integration Requirements

2.1 Understand what the business requires from the integration solution.

2.2 Identify the common technical requirements and how they relate to business requirements.

2.3 Review the trends in data integration to take advantage of new technologies.

2.4 Brainstorm how the data integration trends can fit with your environment.

Analyze Integration Requirements

3.1 Learn about the various integration patterns that support the organization’s data integration architecture.

3.2 Determine the pattern that best fits with your environment.

Create Your Solution’s Source to Target Data Model

4.1 Determine if a PoC approach is required for the environment.

4.2 Document the business activities, sources and targets of your new integration solution.

4.3 Identify the types of transformations required.

4.4 Identify the metadata that should be captured.

Map Source to Target for the Integration Solution

5.1 Obtain data from your sources.

5.2 Create target structures.

5.3 Apply transformation rules.

5.4 Create an orderly execution of data integration workflows.
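Steps 5.1 through 5.4 above can be sketched as a tiny source-to-target pipeline. This is a hypothetical minimal example (the table, field names, and transformation rules are invented for illustration), not the blueprint's mapping tool:

```python
# Hypothetical sketch of steps 5.1-5.4: obtain source data, create the target
# structure, apply transformation rules, then execute the steps in order.
import sqlite3

def run_pipeline(source_rows):
    conn = sqlite3.connect(":memory:")
    # 5.2 Create target structures.
    conn.execute("CREATE TABLE customer_target (customer_id TEXT, full_name TEXT)")
    # 5.3 Apply transformation rules (here: cast the id, trim and join names).
    transformed = [
        (str(r["id"]), f"{r['first'].strip()} {r['last'].strip()}")
        for r in source_rows
    ]
    # 5.4 Orderly execution of the workflow: load only after transformation succeeds.
    conn.executemany("INSERT INTO customer_target VALUES (?, ?)", transformed)
    conn.commit()
    return conn

# 5.1 Obtain data from your sources (stubbed here as literal rows).
source = [{"id": 1, "first": " Ada ", "last": "Lovelace"}]
conn = run_pipeline(source)
print(conn.execute("SELECT full_name FROM customer_target").fetchone()[0])
# prints: Ada Lovelace
```

Real workflows add error handling, incremental loads, and metadata capture (step 4.4), but the extract-transform-load ordering stays the same.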

Deliverables
  1. Data Integration Requirements Gathering Tool
  1. Data Integration Requirements Gathering Tool
  2. Data Integration Trends Presentation
  1. Data Integration Requirements Gathering Tool
  2. Data Integration Trends Presentation
  3. Integration reference architecture patterns
  1. Data Integration PoC Template
  2. Data Integration Mapping Tool
  1. Data Integration Mapping Tool

Phase 1

Collect Integration Requirements

Build a Data Integration Strategy

Phase 1 will help you to determine when you are experiencing integration pains and what the business needs

Tired of indeterminate pains stopping your data initiatives in their tracks?

Data integration is not often identified as the root cause of the problem, but it is often the true bottleneck. You may have a data integration problem if you are experiencing one or more of the following:

  • Expensive ongoing manual maintenance of applications.
  • An inability to deliver data in a timely and accurate manner.
  • Delays in analysis of data.
  • Lack of flexibility and fragility of integrated platforms.

Goals for Phase 1:

  • Understand business context. Determine how data integration is used to improve business through the four uses of data: operations, transactions, reporting, and analysis. This will help you to identify pains and opportunities related to data integration.
  • Gather high-quality requirements. Approach your business stakeholders and gather high-quality requirements to drive your planning, using our four-column model for understanding data requirements.

This phase will help you to achieve the following:

  1. Identify the common problems whose root cause is poor integration.
  2. Understand what the business needs to get from the integrated data solution.

Follow success criteria for each step of the project to reduce rework

This blueprint highlights key areas in each phase of a data integration initiative where you should check to see if what you have done is successful before moving on.

If you are having trouble with a certain area, we recommend that you address the issues before moving on to the next step in the project to prevent failure and time-consuming rework.

Phase Success Criteria:

  • Phase 1
    • Collect Integration Requirements
  • Phase 2
    • Analyze Integration Requirements
  • Phase 3
    • Design the Data-Centric Integration Solution

For phase 1, the following list of success criteria should be achieved before moving on to phase 2:

  • Identification of the data integration problem, along with a clear and concise problem statement.
  • Business case developed for the data-centric integration solution, including how the pains will be relieved and how the improved solution will bring improved opportunities for the business to realize its vision.
  • Sign off from the business for the business requirements document (BRD).

Phase 1 outline

Call 1-888-670-8889 or email GuidedImplementations@InfoTech.com for more information.

Complete these steps on your own, or call us to complete a guided implementation. A guided implementation is a series of 2-3 advisory calls that help you execute each phase of a project. They are included in most advisory memberships.

Guided Implementation 1: Collect Integration Requirements

Proposed Time to Completion: 2 weeks

Step 1.1: Identify Integration Pains and Needs

Start with an analyst kick-off call:

  • Discuss the current data integration environment, and the pains that are felt by both business and IT around data integration.

Then complete these activities…

  • Determine what the problem statement and business case look like to kick-start a data integration improvement initiative.

With these tools & templates:

  • Data Integration Requirements Gathering Tool

Step 1.2: Collect Business Requirements for the Integration Solution

Review findings with analyst:

  • Understand what data integration requirements look like and how to collect effective requirements.

Then complete these activities…

  • Walk through the four-column model of data to understand who to interview for requirements gathering, as well as what questions to ask.
  • Review the requirements gathered from the business.

With these tools & templates:

  • Data Integration Requirements Gathering Tool

Phase 1 Results & Insights:

Data integration improvements must start with business activity. Realizing that you have a data integration problem and understanding what that problem looks like can be difficult given the complexity of the organization’s data environment. Understanding this and the priorities of the solution for the business is a crucial first step in any data integration activity.

Step 1: Identify Integration Pains and Needs

PHASE 1

1.1 Identify Integration Pains and Needs

1.2 Collect Business Requirements for the Integration Solution

This step will walk you through the following activities:

  • Understand why the organization is integrating its data and how to pinpoint integration problems.
  • Learn how to determine if you are facing data integration pains, and identify what is needed to solve those pains.

This step involves the following participants:

  • Data Engineer
  • Business Analyst

Outcomes of this step

  • Improved data integration knowledge and a new way of thinking about data integration.
  • Clarity on current data integration-related pains.
  • Enhanced ability to recognize often misdiagnosed data integration pains.

Roles that are involved in data integration activities

Look for the following identifiers as you work through this blueprint to identify where we recommend the associated personas be involved in the integration development process.

Data Engineer

  • An individual or group of individuals who integrate, optimize, and support data, with deep technical knowledge of the underlying technologies.
  • Could also be called: database administrator (DBA), big data engineer, big data architect, etc. – depending on the technology and platforms.
  • Typically the first to experience data integration pains through slow or inaccurate data.

Data Architect

  • Reviews project solution architectures and identifies cross impacts across the data lifecycle.
  • Is a hands-on expert in data management and warehousing technologies.
  • Facilitates the creation of the data strategy.
  • Manages the enterprise data model.
  • Has a greater knowledge of operational and analytical data use cases.

Business Analyst

  • Role that functions as the crucial link between the business and the IT roles responsible for designing, developing, and implementing data changes.
  • The designated business analyst(s) for the project have responsibility for end-to-end requirements management.
  • Work collaboratively with their counterparts in the business and IT (e.g. developer teams or procurement professionals) to ensure that the approved requirements are met in a timely and cost-effective manner.

Organizations integrate data to support business processes and/or for insight generation

Business Analyst

The data engineer role often experiences data integration pains first.

Before you start to identify the pains and opportunities of integration, you must first understand why the organization is integrating data.

Organizations integrate data for one or more of the four uses of data, as illustrated below:

Support business processes

  1. Operations
  2. Transactions

Example of data integration supporting business processes:

The image shows two dark blue cylinders on the left, labelled Data Source 1 and 2. Next to them are two light blue rectangles, labelled Data Model 1 and 2. This section is labelled Sources. To the right is another section labelled Integration. There are two arrows pointing from the Sources section to the Integration section. In the Integration section are a dark blue cylinder labelled ERP and a light blue rectangle labelled ERP.

Data-driven insight generation

3. Reporting

4. Analytics

Example of data integration supporting insight generation:

The image shows a graphic with three dark blue cylinders on the left, labelled Sources, with arrows pointing to the right to a light blue rectangle with the text Consolidated Data Model 1 in it. Above that rectangle is the title Integration. There is another arrow pointing right to another dark blue cylinder, labelled Warehousing, and then another arrow pointing to the right to dark blue rectangles with icons of paper, labelled Reporting.

A note on data model notation:

Every application has an input and an output, defined by the data model designed around how the application is used. In other words, apps take data in, possibly perform activities on the data, and then send data out to other apps.

In this blueprint, data inputs will be represented by lines on the left of a data model and the outputs will be represented by lines on the right.
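The input/activity/output pattern described above can be sketched in a few lines of code. This is a minimal, hypothetical illustration; the field names and the normalization step are examples, not part of the blueprint:

```python
def integrate(record: dict) -> dict:
    """Take data in, perform an activity on it, send data out.

    Illustrates an application's data model contract: the input shape
    on the left, the output shape on the right. All field names here
    are hypothetical.
    """
    # Activity: normalize and reshape the incoming record to match
    # the downstream consumer's data model.
    return {
        "customer_id": record["id"],
        "name": record["name"].strip().title(),
        "source_system": record.get("source", "unknown"),
    }

# A raw record arriving from an upstream source:
print(integrate({"id": 42, "name": "  jane doe ", "source": "CRM"}))
```

When the downstream data model changes, only this mapping has to change, which is the kind of agility the data models themselves need to support.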

Info-Tech Insight

Most organizations don’t have the foresight to design their data architecture and data models right the first time. In a perfect world, the design that supports an organization’s needs would continue to satisfy those needs forever. However, in the modern world, needs change as businesses try to be more agile than ever before. Therefore, the data models supporting business needs have to be agile as well.

Understand the purposes of integration by knowing how it differs among available technologies

Business Analyst

Know Your Technology

When it comes to implementing your data integration and getting the most out of your solution, it is important to know the differences and similarities between commonly heard concepts and technologies.

Legend: ● = directly enables the use case; ◐ = partially enables the use case; ○ = not the ideal solution for the use case

Use Case | ESB | ETL | MDM | Data Hub | ODS | Micro Services | Streaming Data | EDI | API
1. Co-ordinate processes, optimize workflows, and enable rules engine | ◐ | ● | ○ | ● | ○ | ● | ◐ | ○ | ◐
2. Streamline data flows at the transactional or operational levels | ● | ◐ | ○ | ◐ | ● | ◐ | ◐ | ○ | ◐
3. Enable near real-time access to data from multiple systems | ● | ◐ | ○ | ● | ● | ○ | ◐ | ○ | ◐
4. Create a 360-degree view of enterprise data | ○ | ● | ● | ● | ◐ | ○ | ○ | ○ | ○
5. Establish open data platform | ◐ | ● | ◐ | ● | ○ | ○ | ◐ | ○ | ○
6. Migrate data from legacy system | ○ | ● | ◐ | ◐ | ○ | ○ | ○ | ○ | ○
7. Integrate with supplier and partner systems | ◐ | ● | ○ | ◐ | ◐ | ○ | ○ | ◐ | ◐
8. Real-time analytics | ◐ | ◐ | ○ | ◐ | ○ | ◐ | ● | ○ | ◐

Integration problems are felt in multiple ways during data use

Business Analyst

Info-Tech Insight

Although the organization may be feeling pains associated with data integration inadequacies, data integration is not often seen as the root cause of the issues. Data integration is a backend process that does not have a lot of visibility, yet it touches every IT project. Read through these pains and identify how improved data integration will address these pains for your organization.

Pain Points Related to Poor Data Integration

Business

Limited Data Availability

  • Data is siloed, with different sources of data unable to be migrated and integrated together.
  • Large number of disparate data sources.
  • Fewer sources of data and less information for data analytics and decisions.

Weak Practices Create Data Trust Issues

  • Data is not synchronized across applications, creating data conflicts between systems.
  • The quality of data is inconsistent across systems; the organization experiences data quality issues related to duplicate, stale, and inaccurate data.
  • Improper data normalization and de-normalization causes incorrect report generation and poor business decisions.

IT

Inefficient Integration Processes

  • Data is often manually assembled and reconciled from various sources for enterprise reporting.
  • Integrations are performed ad hoc through hand coding.
  • There is a large amount of manual coding and re-keying of data.
  • Point-to-point integration is a common practice and performed without consideration of scalability and longevity.
  • IT application projects overlook data integration while conducting planning.
  • Poor data architecture documentation increases integration complexity and lengthens project timelines.

Poor Standards

  • Data integration between different data sources has become difficult to perform due to inconsistent standards.
  • Poor match rules and data profiling practices prolong integration projects and undermine data quality.
  • Inconsistent data definitions are used across the organization.

Large utility providers need their platforms to integrate with new systems

Business Analyst

CASE STUDY

Industry: Utility

Source: ESB, Info-Tech Interview

Businesses with millions of interconnected devices need a strong integration platform to improve operational efficiency and customer service.

Context:

Rapid technology innovation in the Internet of Things (IoT) and customer expectations to understand usage data have forced utilities to provide a more holistic and transparent view of their operations. This is not an easy task in an ever-growing environment of interconnected IoT devices. The business must have a 360-degree view of its operations: it should be able to connect with its disparate systems seamlessly while providing complete visibility to its customers in terms of their usage. The Enterprise Service Bus (ESB) software architecture model can help to implement communication between software applications.

Role of Data Integration:

  • The real-time data hub integration pattern can establish middleware to communicate with geographic information systems, IoT devices, and enterprise resource planning (ERP) systems using predefined formats established by large software and utility vendors.
  • A long-term, enterprise-level data integration strategy can eliminate point-to-point integration and support loosely coupled architecture to incrementally modernize legacy systems.

Business Value:

Faster speed to market and reduced overall deployment costs over the long run.

Greater performance management and oversight of operations across all facets of the organization.

Improved customer service:

  • Improved analytics
  • Quick turnaround for emergencies

More dynamic resource distribution based on availability, demand, and compliance requirements.

When supporting the uses of data, integration pains surface during four common use cases

Business Analyst

Data is utilized to support business processes through operations and transactions, or to generate insights and make decisions through reporting and analytics. These uses of data fit into four common use cases. An organization may currently fit into one or more of these use cases, but choosing one will help you better understand the pains, needs, and requirements from the business for improving data integration.

"Every analytical project involves a data integration component." – Hamdan Ahmad, Consultant, Slalom Consulting

  1. Orchestration
    • This use case encompasses integration activities that support business activities, processes, rules engines, or workflows.
    • This use case is performed during a one-time activity or multiple times through a data hub/orchestration layer.
    • Depending on the need for speed of operational data, there can be a real-time aspect to this situation.
  2. Analytics
    • The primary integration activity for this use case is pulling data together into a data warehousing environment for analysis.
    • This use case can include “simple” integration, or involve more complex data quality and reduction activities.
    • This use case is performed during a one-time activity or multiple times.
  3. Legacy
    • This use case involves application rationalization or migration (whether on-premises or cloud).
    • In this use case, it is important to first determine how far back you want to go for historical data.
  4. Conglomerate
    • This use case is experienced when an organization goes through any merger and acquisition activities.
    • Can be thought of as the legacy use case performed multiple times with an optimization strategy for the process.

Use the Data Integration Requirements Gathering Tool to document integration pains

Business Analyst

1.1.1 Data Integration Requirements Gathering Tool

Prepare for Requirements Gathering

Before starting to elicit requirements from the business, prepare for the requirements gathering by documenting the pains that motivate this project. This will help provide context for why the requirements are being gathered in the first place.

Data Integration Requirements Gathering Tool

Info-Tech’s methodology for gathering integration project requirements

To make your data integration initiative business-driven, document good requirements. This will help you to:

  • Complete the project effectively, with little rework.
  • Deliver business-approved results.
  • Enable greater confidence in data, enhance the relevance and usability of data, and improve access to data that is needed by users.

To produce good requirements, this data integration blueprint walks you through a data-integration-specific process in collaboration with Info-Tech’s framework from the blueprint Build a Strong Approach to Business Requirements Gathering.

Info-Tech’s Requirements Gathering Framework

The image shows a graphic with four arrows encircling it, labelled: Monitor; Communicate; Manage; Plan. Inside the circle of arrows is the title Deliver Business Value. Below that are three arrows pointing right, labelled: Elicit; Analyze; Validate. Key words are listed under each.

Info-Tech’s Requirements Gathering Framework gives a holistic, end-to-end approach for requirements gathering. It covers foundational issues (elicitation, analysis, and validation) as well as managing the requirements gathering process.

Identify your organization’s integration use case based on your pains

Business Analyst

1.1.1 30 minutes

Input

  • Use case prompts

Output

  • The use case that is driving the need for improved data integration

Participants

  • Data Engineer
  • Business Analyst

Instructions:

To place yourself in one of the four use cases outlined previously, ask the following questions and determine which one is the best fit for you:

Orchestration

  • Are operational activities ever slowed down by inaccurate data?
  • Is inaccurate data from multiple applications or systems causing problems with essential business activities?

Bottom line: This use case is the most urgent from a business perspective and will be the easiest to persuade the business to undertake.

Analytics

  • Does the business have a question or set of questions that they need to answer?
  • Is there enough data for analytics and decisions?
  • Is the organization lacking a federated approach and looking to bring multiple sets of data together for analytics and reporting?

Bottom line: The analytics use case has the most hope attached to it. Creative analytics to tackle new markets, engage new partners, or understand your customers better represents the “art of the possible.”

Legacy

  • Does the organization have to replace a legacy system and move the respective data?

Bottom line: This use case comes up when an organization is experiencing pain from an application that needs to be changed as it’s no longer fulfilling current business needs. This use case also occurs when support is no longer offered for a legacy system.

Conglomerate

  • Has the organization undergone a merger or acquisition recently?
  • Does the organization have to replace multiple legacy systems and move the respective data?

Bottom line: This use case occurs less frequently for most organizations, although when it occurs it often requires integrating a large amount of data.

Use tab 2 of the Data Integration Requirements Gathering Tool to identify your use case.

Clarify the integration problem

Business Analyst

Step one of Info-Tech’s Requirements Gathering Framework helps you prepare to gather effective business requirements.

Before you determine what is needed to solve the problems experienced, it is important to articulate the pains and consequences of poor data integration. In this step, you should identify that you are experiencing data integration problems and communicate how these problems affect the organization.

Info-Tech’s Requirements Gathering Framework

ELICIT

  • Prepare
  • Conduct
  • Confirm

Phase 1

  • Step 1: Prepare
    • Pains
    • Use Case
  • Step 2: Conduct, Confirm
    • Requirements

ANALYZE

  • Organize
  • Prioritize
  • Verify

VALIDATE

  • Translate
  • Allocate
  • Approve

Success Criteria

  • Identification of the data integration problem, along with a clear and concise problem statement.
  • Business case developed for the data-centric integration solution, including how the pains will be relieved and how the solution will bring improved opportunities for the business to realize its vision.

Step 2: Collect Business Requirements for the Integration Solution

Phase 1

1.1 Identify Integration Pains and Needs

1.2 Collect Business Requirements for the Integration Solution

This step will walk you through the following activities:

  • Use the four-column model to identify business requirements.
  • Organize and consolidate the outputs of requirements gathering activities.

This step involves the following participants:

  • Data Engineer
  • Business Analyst

Outcomes of this step

  • Clear, concise, and actionable requirements that will inform the design and implementation of a data-centric integration solution.

"Things get done only if the data we gather can inform and inspire those in a position to make a difference." – M. Schmoker

Join forces with the business analyst role to translate pains into business requirements

Business Analyst

On the front lines working with the organization’s data, the data engineer role often experiences the impact of poor data integration first. Once the pains are described and a business case is developed, focus on turning those pains into opportunities for improvement, driven by business needs.

Data Engineer ↔ Business Analyst

The business analyst role should engage with the business to communicate the pains associated with poor data integration, understand what the needs of the business are, and assist the data engineer in translating those needs into requirements.

Don’t let your data integration initiative become a victim of poor requirements gathering

Business Analyst

The challenges in requirements management often have underlying causes; find and eliminate the root causes rather than focusing on the symptoms.

Root Causes of Poor Requirements Gathering:

  • There are requirements gathering procedures in place, but they aren’t followed.
  • There isn’t enough time allocated to the requirements gathering phase.
  • There isn’t enough involvement or investment secured from business partners.
  • There is no senior leadership involvement or mandate to fix requirements gathering.
  • There are inadequate efforts put toward obtaining and enforcing sign-off.

Outcomes of Poor Requirements Gathering:

  • Rework due to poor requirements leads to costly overruns.
  • Final deliverables are of poor quality and are implemented late.
  • Predicted gains from deployed applications are not realized.
  • Low feature utilization rates by end users.
  • Teams are frustrated within IT and the business.

Info-Tech Insight

Requirements gathering is the number one failure point for most development or procurement projects that don’t deliver value. This has been, and continues to be, the case as most organizations still don’t get requirements gathering right. Organizational cynicism can be a major obstacle to overcome when it is time to optimize the requirements gathering process.

Use the Data Integration Requirements Gathering Tool to document business requirements

Business Analyst

1.2.1 Data Integration Requirements Gathering Tool

Preserve Context

After the data engineer has determined and documented the pains felt by the business and other stakeholders, the Data Integration Requirements Gathering Tool can be used as a central document for collecting and analyzing the business requirements alongside the documented pains. This will help to preserve the context of why this project is being done as the requirements are collected for solving the pains.

"When gathering requirements, follow the 80-20 rule. 80% of the time organizations have the same requirements. The other 20% of the requirements are variable and dependent on the organization." – Gopi Bheemavarapu, Director of Research, Info-Tech

Data Integration Requirements Gathering Tool

Requirements for a data integration solution change based on your use case.

We recommend that you fit your organization into one of the four use cases because each use case points to the most common and essential requirements organizations have for integration solutions. Below are highlights of common requirements based on the use case.

Orchestration

  • When your data is used for operational purposes, speed of integration is of the essence.

Analytics

  • Most analytics can be performed with data that is updated in batches. However, if the organization is looking to get into real-time analytics or if that is already a priority, a real-time architecture is needed.

Legacy

  • This use case requires batch data integrations.
  • High quality of data is needed.

Conglomerate

  • This use case requires batch data integrations.
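The batch integration that the legacy and conglomerate use cases call for can be sketched as a simple extract, transform, and load run. This is a minimal in-memory illustration; the source names, field mappings, and deduplication rule are all hypothetical:

```python
def extract(source: list[dict]) -> list[dict]:
    """Pull every record from a source (an in-memory stand-in here)."""
    return list(source)

def transform(records: list[dict]) -> list[dict]:
    """Deduplicate on a key and normalize fields, a stand-in for the
    data quality work the legacy use case requires."""
    merged = {}
    for record in records:
        # Later records win; real match rules would be more involved.
        merged[record["id"]] = {**record, "name": record["name"].strip()}
    return list(merged.values())

def load(target: list[dict], records: list[dict]) -> None:
    """Append the consolidated batch to the target system."""
    target.extend(records)

# One scheduled batch run over two hypothetical legacy sources:
legacy_a = [{"id": 1, "name": " Acme "}, {"id": 2, "name": "Beta"}]
legacy_b = [{"id": 2, "name": "Beta"}, {"id": 3, "name": "Gamma"}]
target: list[dict] = []
load(target, transform(extract(legacy_a) + extract(legacy_b)))
print(len(target))  # 3 unique records after deduplication
```

In a real run, the extract step would read from the legacy databases on a schedule and the load step would write to the new system; the shape of the run stays the same.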

Determine your approach to eliciting business requirements

Business Analyst

Info-Tech’s Requirements Gathering Framework

ELICIT → ANALYZE → VALIDATE

  • Prepare
  • Conduct
  • Confirm

There are two approaches to requirements gathering; which one you choose depends on whether documented requirements already exist.

Data integration requirements are different than other requirements because data integration projects are typically part of a larger initiative. For example, data integration could be performed to support an analytics or BI project, or to support a new application being onboarded. Regardless of the larger initiative, the business requirements relevant to data integration activities would likely be gathered during that initiative’s project timeline. Therefore, the most effective way to gather data integration requirements would be to identify the BRD or its equivalent for the greater initiative.

  1. Existing Initiative Format
    • Use option one if you already have existing requirements from a larger initiative.
    • Requirements can be found using the project’s BRD.
    • Examples include:
      • Using data integration to support BI.
      • Integrating data for a new application.
  2. Interview Format
    • If there are no documented requirements for the greater data initiative, you will have to use option two: gather requirements from stakeholders using interviews. Decide on the following criteria:
      • Who will be interviewed?
      • What questions should be asked, and using what style?

Option 1: Gather requirements based on your use case

Business Analyst

Leverage existing business requirements, if possible.

If the data integration improvement initiative is based on a larger initiative, use the requirements gathered for that project to inform the business requirements of the data-centric integration solution. Data integration projects are typically done as part of a larger project with its own requirements spelled out for the data. Based on your use case, identify the type of requirements that you can reuse to inform the business requirements for the data-centric integration solution.

Use Case | Business Initiative Requirements | Example of Informing Initiative
Analytics | BI and Analytics Projects | Business looking to report on sales more comprehensively.
Orchestration | Business Process (application level) | Business wants to include shopping cart functionality in e-commerce solution.
Legacy | Business Process (application level) | Legacy ERP being replaced by new system.
Conglomerate | Business Process (application level) | Business acquired data from a related firm.

In each case, the business initiative’s requirements inform the data integration requirements.

Info-Tech Insight

Data integration projects are usually the result of problems experienced after a larger business initiative is completed. While it would be ideal for data integration requirements to be considered during the initiative planning, this is rarely the case. However, the larger initiative’s requirements can be used to inform the data integration requirements later on.

Gather and document integration requirements

Business Analyst

1.2.1 2 hours

Input

  • Use case and associated business initiative

Output

  • Documented data integration requirements based on business requirements

Participants

  • Data Engineer
  • Business Analyst

Instructions

  1. Based on your use case, identify the major initiative that has data requirements attached.
  2. Consult the BRD (or equivalent) for data requirements.
  3. Document the business requirements in tab 4 of the Data Integration Requirements Gathering Tool.

Business Requirements Document Template

If no BRD is present, use Info-Tech’s resources to put one in place.

Based on the type of business initiative, data requirements can be gathered with help from the following resources:

BI and Analytics Projects

Step 1.3 of Build a Next Generation BI With a Game-Changing BI Strategy provides guidance for gathering BI requirements.

Business Process (application level)

Step 1.1 of Enhance Your Application Architecture identifies requirements for updating app architecture.

The blueprint Build a Strong Approach to Business Requirements Gathering guides you through the process of identifying requirements for apps.

Option 2: Conduct interviews with key stakeholders if unable to gather existing requirements

Business Analyst

Requirements Gathering Modes

Info-Tech has identified four effective requirements gathering modes. During the requirements gathering interviews, you may need to switch between the four gathering modes to establish a thorough understanding of the information needs.

Dream Mode

  • Mentality: Let users’ imaginations go wild. The sky’s the limit.
  • How it works: Ask users to dream the ideal future state and how analytics can support their dreams.
  • Limitations: Not all dreams can be fulfilled. A variety of constraints (budget, personnel, technical skills) may prevent the dreams from becoming reality.

Pain Mode

  • Mentality: Users are currently experiencing pains related to information needs.
  • How it works: Vent the pains. Allow end users to share their information pains; ask them how their pains can be relieved. It is possible to convert their pains into requirements.
  • Limitations: Users are limited by the current situation and aren’t looking to innovate.

Decode Mode

  • Mentality: Read the hidden messages from users. Speculate as to what the users really want.
  • How it works: Decode the underlying messages. Be innovative to develop hypotheses and then validate with the users.
  • Limitations: Speculations and hypotheses could be invalid. They may steer users in pre-determined directions.

Profile Mode

  • Mentality: “I think you may want XYZ because you fall into that profile.”
  • How it works: The user may fall into an existing user group profile or their information needs may be similar to some existing users.
  • Limitations: This mode doesn’t address very specific needs.

Identify your data integration requirements by engaging both the business and IT

Business Analyst

The project team must engage both the business and IT to understand the current state of data integration, identify alignment and issues relative to the business’ data availability needs, and identify barriers and considerations regarding feasibility.

Gaining the perspectives of both sides will provide the most accurate assessment of current data integration practices and performance, highlight gaps, and uncover action items.

Who to Engage in This Step

Business

  • Data Stewards
  • Business Analysts
  • Power Users
  • Business Process Owners
  • BI/Analytics Analyst
  • Data Scientists
  • Data Engineers
  • Data Architects

IT Department

  • Project Manager
  • Application Manager/Developer
  • Database Administrators
  • Solutions Architects
  • Systems Analysts
  • Data Warehouse Developers

"Regardless of the nature of the data integration project, whether it is for BI or for ERP, the same methodology applies. First, determine the business context. Then figure out what data you need and how it is captured and modified. Next, consider the architecture of the data and the design of the new data model – does it comprehensively support the associated business processes?" – Wayne Regier, Director of Data Management, Husky Injection Molding Systems

Determine who you will interview to gather requirements

Business Analyst

1.2.2 1 hour

Input

  • Existing requirements and new requirements from exercise 1.2.1

Output

  • Prioritized and categorized requirements

Materials

  • Requirements Insights section of the BI Strategy and Roadmap Template

Participants

  • BA
  • Business Stakeholders
  • PMO

Before you can dive into most elicitation techniques, you need to know who you’re going to speak with – not all stakeholders hold the same value.

There are two broad categories of stakeholders:

  • Customers: Those who ask for a system/project/change, but do not necessarily use it. These are typically executive sponsors, project managers, or interested stakeholders. They are customers in the sense that they may provide the funding or budget for a project, and may have requests for features and functionality, but they won’t have to use it in their own workflows.
  • Users: Those who may not ask for a system, but must use it in their routine workflows. These are your end users, those who will actually interact with the system. Users don’t necessarily have to be people – they can also be other systems that will require inputs or outputs from the proposed solution. Understand their needs to best drive more granular functional requirements.

Instructions:

As a group, generate a complete list of the project stakeholders. Consider who is involved in the problem and who will be impacted by the solution, and record the names of these stakeholders or stakeholder groups on sticky notes. Categories include project sponsor, user groups, architects, and project team.

Now that you’ve compiled a complete list, review each user group and indicate their level of influence and the level of impact the project will have on them by placing their sticky note on a 2x2 grid.

Record this list in tab "3. Identify Stakeholders" of the Data Integration Requirements Gathering Tool.
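As an illustrative sketch only, the influence/impact 2x2 above can be expressed as a simple classifier. The score thresholds and quadrant labels below are assumptions drawn from common stakeholder-analysis practice, not from this blueprint:

```python
# Hypothetical sketch of the stakeholder influence/impact 2x2 grid.
# Scores and quadrant labels are illustrative assumptions.
def quadrant(influence, impact):
    """Place a stakeholder on the 2x2 grid; scores range 1 (low) to 5 (high)."""
    hi_influence = influence >= 3
    hi_impact = impact >= 3
    if hi_influence and hi_impact:
        return "Manage closely"
    if hi_influence:
        return "Keep satisfied"
    if hi_impact:
        return "Keep informed"
    return "Monitor"

# A project sponsor typically lands in the high-influence, high-impact quadrant.
print(quadrant(5, 4))
```

In practice the grid is filled in with sticky notes during the workshop; the point of the sketch is only that each stakeholder gets exactly one quadrant based on the two scores.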

Ask the right questions during your business interviews

Business Analyst

You don’t have to ask a lot of questions during the interviews, just the ones that will promote discussion and focus the scope around the business’ specific data needs as it pertains to data integration.

Interview Questions

  1. What types of data do you use?
  2. Where and how do you access this data?
  3. What general data issues do you have with this data?
  4. Do you have any specific availability or latency issues with this data?
  5. Do you perform any of your own data integrations?
  6. Do you enter any data? If yes, where does this data come from; is it supplied by an external source or from an entity of the business?

Use these questions as a starting point; feel free to adapt them and add your own.

Stay on Point

In these interviews, focus on data availability and latency needs; the discussion will likely also diverge into data quality and trust concerns, as those are often users’ top priorities and issues.

Control these interviews to ensure they stay on point, but also consider the implications of potential data quality issues and how quality issues may be impeding data integration operations.

Determine the methods for evaluating the performance of your improved data integration practice

Business Analyst

1.2.3 30 minutes

The value of data integration is derived not from its mere presence, but from its ability to make data available at the business level.

Consider the following practice level metrics as options for evaluating the performance and value of data integration.

Practice Performance

  • Number of projects citing issues with fulfilling data requirements or data integration acting as a barrier to completing their project (change over time).
  • Number of point-to-point integrations (reduction of this over time).

Message Model

  • Number of strategic business areas supported by the Common Message Model.
  • Percentage of data domains covered by the Message Model.
  • Number of redundant data instances in the data interfaces.
  • Time wasted rekeying data.

User Satisfaction

  • Satisfaction of key business units with the data available to their department and related systems.

Total Cost of Ownership

  • Costs associated with applications, databases, and data maintenance.
  • Should decrease with better data integration (reuse interface, rationalize tools).

Identify the success metrics for your integration project

Business Analyst

1.2.4 2 hours

Data integration projects are a lot of work, and not all of that work is visible to the end users.

This means that to secure backing for the data integration project, including buy-in for what may seem like a large cost, there must be compelling reasons for engaging in the integration improvement project.

Instructions:

Identify the metrics that will be used to benchmark the performance of the organization’s current data integration and to evaluate the impact of the project.

Benefit and example metric:

  • Enhanced Productivity: Person-hours and dollars saved from reducing manual re-keying of data.
  • Risk Avoidance: Dollars saved by avoiding potential fines due to non-compliance.
  • Cost Reduction: Dollars saved due to reduced customization or redundancy of integration solutions.
  • Improved Time to Insight: Reduction in time for data transfer between Tier 1 systems.

Use the Data Integration Requirements Gathering Tool to determine the benefits of improved integration for your use case.

Your use case is: Analytics

The biggest benefit of improved integration is: Improved Time to Insight

"Customers often don’t realize that during the implementation, 80% of the work is backend work. However, the 20% that represents the visible reports is what gives them value. A lot of business users don’t understand this." – Hamdan Ahmad, Consultant, Slalom Consulting

Improved Data Integration Processes

  • Time spent fixing issues related to hand-coded data integrations.
  • Reduction in duplicate data entries.
  • Reduction in conflicting data between different integrated systems.

Receive sign-off from the business

Business Analyst

After gathering requirements from the business, the business analyst should circle back to the interviewees to validate the requirements.

Ensure that the requirements are comprehensive, specific, and feasible before moving on to the analysis stage. Bring the BRD (see template below) to key interviewees to receive sign-off as proof that due diligence was performed.

Gathering requirements is meant to be an iterative process – if the requirements collected from the business are incomplete, go back and refine them to more accurately reflect the needs of the business.

Info-Tech’s Business Requirements Document Template

An aviation organization required improved data integration when implementing a new ERP system

Business Analyst

CASE STUDY

Industry: Manufacturing and Retail

Source: ETL, Info-Tech Interview

Businesses competing on margins must be able to extract all possible savings from their business operations.

Context:

Rising costs and increased competition have forced retailers to be leaner and more efficient in their manufacturing operations to remain profitable. To achieve leaner and optimized operations, the business must be able to fully view its manufacturing operations and align its production cycles and inventory with market demand.

  • The performance of the business’ warehouse and manufacturing operations may not be readily available to the business.
  • Latency within inventory management can distort analytics and create shortages of products during critical buying times or overstocking during off-seasons.

Role of Data Integration:

Data integration programs using extract, transform, load (ETL) patterns are able to load batches of data during scheduled updates to related operational systems.

Business Value:

Improved supply chain management.

Greater performance management and oversight of operations across all facets of the organization.

Improved inventory management:

  • Improved analytics.
  • Inventory allocation tracking.

More dynamic production:

  • Improved capacity to reduce the margin between supply (production) and demand (sales).

If you want additional support, have our analysts guide you through this phase as part of an Info-Tech workshop

Book a workshop with our Info-Tech analysts:

  • To accelerate this project, engage your IT team in an Info-Tech workshop with an Info-Tech analyst team.
  • Info-Tech analysts will join you and your team onsite at your location or welcome you to Info-Tech’s historic Toronto office to participate in an innovative onsite workshop.
  • Contact your account manager (www.infotech.com/account), or email Workshops@InfoTech.com for more information.

The following are sample activities that will be conducted by Info-Tech analysts with your team:

1.1.1 Lay the foundation for the integration project by identifying integration drivers

An Info-Tech facilitator will help your organization identify the objectives and purpose of a data integration project. Considering the constraints often around a data integration project, the facilitator will help to scope the project, determining the expected outcomes and engagement methods that can help ensure project success.

1.2.1 Gather and document data integration requirements from the business

Understanding that data integration can be a tough sell on its own, an Info-Tech facilitator will help your organization identify the current impact of data integration on business performance. They will also help project leaders create a narrative that successfully articulates the value of an investment in and focus on data integration.

Phase 2

Analyze Integration Requirements

Build a Data Integration Strategy

Phase 2 outline

Call 1-888-670-8889 or email GuidedImplementations@InfoTech.com for more information.

Complete these steps on your own, or call us to complete a guided implementation. A guided implementation is a series of 2-3 advisory calls that help you execute each phase of a project. They are included in most advisory memberships.

Guided Implementation 2: Analyze Integration Requirements

Proposed Time to Completion: 2 weeks

Step 2.1: Determine Technical Requirements for the Integration Solution

Start with an analyst kick-off call:

  • Understand what the business requires from the integration solution.

Then complete these activities…

  • Identify the common technical requirements and how they relate to business requirements.

With these tools & templates:

Data Integration Requirements Gathering Tool

Step 2.2: Leverage Integration Trends to Address Requirements

Review findings with analyst:

  • Review the trends in data integration to leverage new technologies.

Then complete these activities…

  • Brainstorm how the data integration trends can fit within your environment.

With these tools & templates:

Data Integration Trends Presentation

Step 2.3: Architect the Data-Centric Integration Strategy

Architect Solution:

  • Learn about the various integration patterns that support a data integration architecture.

Then complete these activities…

  • Determine the pattern that best fits within your environment.

With these tools & templates:

Data Integration Pattern Selection Tool

Step 2.4: Calculate ROI to Attach Dollar Value

Finalize ROI and phase deliverable:

  • List benefits and costs of your integration project.

Then complete these activities…

  • Calculate ROI to justify project investment.

With these tools & templates:

Data Integration ROI Calculator
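As a rough sketch of the step 2.4 calculation, the standard ROI formula can be applied to the project’s totals. The figures below are invented for illustration; the actual benefit and cost inputs come from the Data Integration ROI Calculator:

```python
# Simple ROI sketch (illustrative figures only; real inputs come from
# the Data Integration ROI Calculator).
def roi(total_benefits, total_costs):
    """ROI as a percentage: (benefits - costs) / costs * 100."""
    return (total_benefits - total_costs) / total_costs * 100

# e.g. $250k in productivity, risk, and cost benefits vs. a $100k project cost
print(f"{roi(250_000, 100_000):.0f}%")  # 150%
```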

Phase 2 Results & Insights:

  • Target a loosely coupled integration architecture so that you can more easily support downstream systems, federate data for enterprise views, and build trust that the organization’s data is accurate.

Phase 2 will help you translate the business requirements into technical requirements

Now that you know the requirements for the data-centric integration solution, proceed to phase 2 to understand the technical meaning of those requirements.

Remember: This blueprint highlights key areas in each phase of a data integration initiative where you should check to see if what you are doing is successful before moving on.

Phase Success Criteria:

  • Phase 1
    • Collect Integration Requirements
  • Phase 2
    • Analyze Integration Requirements
  • Phase 3
    • Design the Data-Centric Integration Solution

For phase 2, the following list of success criteria should be achieved before moving to phase 3:

  • Identification of the technical requirements for the data-centric integration solution based on the fulfillment of the business requirements.
  • Consideration of the trends in the data integration field, and how they may benefit your organization.
  • A data integration architecture strategy based on data integration patterns and communicated to the business.

Step 1: Determine Technical Requirements for the Integration Solution

Phase 2

2.1 Determine Technical Requirements for the Integration Solution

2.2 Leverage Integration Trends to Address Requirements

2.3 Architect the Data-Centric Integration Strategy

2.4 Calculate ROI to Attach Dollar Value

This step will walk you through the following activities:

  • Engage with the data architect or equivalent to translate the business requirements into technical requirements for designing the integration solution.
  • Interview DBAs and other data workers to understand technical requirements.
  • Prioritize your technical requirements.

This step involves the following participants:

  • Business Analyst
  • Data Architect

Outcomes of this step

  • A list of technical requirements that inform the type of architecture that will be implemented to solve your data integration pains.

" Data is a precious thing and will last longer than the systems themselves." – T. Berners-Lee

Take the business requirements and translate them into technical requirements

Data Architect

Involve the data architect role during translation of requirements.

To determine the technical requirements for the data-centric integration solution, the business analyst role should engage with the data architect. Data architects represent the bridge between strategic and technical requirements, and therefore are an essential resource for determining what is realistic for the integration solution and what would be the optimal solution for filling the needs of the business.

Business Analyst ↔ Data Architect

Some of the data architect’s primary duties and responsibilities include:

  1. Perform data modeling.
  2. Review existing data architecture.
  3. Benchmark and improve the performance of data initiatives.
  4. Review and recommend optimum data processing technologies.
  5. Lead on data integration activities.
  6. Validate data integrity across all platforms.
  7. Manage underlying framework for data presentation layer.
  8. Advise management on data solutions.

See the blueprint, Build a Business-Aligned Data Architecture Optimization Strategy, for a more detailed description of the data architect role and resources to help fulfill that role.

Data integration technical requirements fall into common categories

Data Architect

When working on the technical requirements for the data integration project, you must understand the needs of the business in the following categories of common data integration requirements:

Technical requirement, description, and example:

  • Data Sources: The applications or systems that the data is being pulled from. (Example: ERP)
  • Data Targets: The applications or systems that the data is moved to. (Example: data warehouse)
  • Type of Data: The type of data that needs to be integrated. (Example: structured CSV)
  • Location of Data: Where the data resides, for example cloud vs. on-premises. (Example: on-premises Oracle database)
  • Performance: The amount of data downtime or delay that is acceptable. (Example: 5s delay)
  • Speed of Delivery: Batch vs. real time. (Example: real time)
  • Quality: Specific quality requirements. (Example: no empty fields in address)
  • Legal Requirements: Compliance rules. (Example: PCI compliance)
  • Data Security: Actions that must be taken to ensure security. (Example: mask credit card data)
  • Data Transformation: Transformations applied to the data. (Example: mask credit card data)
  • Access to Data: The type of access to the data. (Example: indirect access via Oracle DB)
  • Approval for Access: Approval needed for access to the data. (Example: DBA approval required)
  • Operational Requirements: Operational considerations. (Example: cannot access ERP during the daytime)
  • Volume: The amount of data being integrated. (Example: 100 GB per day)
  • Variety: How many different sources or formats of data there are. (Example: all in CSV)
  • Technology: Technologies the integrations have to be compatible with.

Use the Data Integration Requirements Gathering Tool to categorize the business requirements gathered in phase 1 according to these categories.
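A minimal sketch of that categorization step, assuming a simple tagging approach; the requirement statements below are invented, and only the category names come from the list above:

```python
# Hypothetical sketch: tagging business requirements with technical
# requirement categories. Requirement text is invented for illustration.
CATEGORIES = {"Data Sources", "Data Targets", "Speed of Delivery",
              "Data Security", "Volume"}

requirements = [
    ("Sales dashboards must refresh in near real time", "Speed of Delivery"),
    ("Credit card numbers must be masked in every feed", "Data Security"),
    ("Pull customer records from the ERP", "Data Sources"),
]

by_category = {}
for text, category in requirements:
    assert category in CATEGORIES  # guard against typos in category names
    by_category.setdefault(category, []).append(text)

print(by_category["Data Security"])
```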

Review the business’ desired data availability and capabilities results with the IT staff

Data Architect

Next Steps

Share the high-level findings of your business assessment with the critical IT staff responsible for performing data integration with the business.

Note

Strong negative feedback from the business can be a sensitive topic for your staff, as it may be perceived as a challenge to their work. Generally, such feedback reflects not their work, but the ad hoc system in which they operate.

CONSIDERATIONS

Latency of Data

Time between data creation and delivery.

Increasing Volumes of Data

How much data do you expect in the future?

New Data Sets

Additional data from unknown sources.

New Data Sources

Internal and external additional data.

Flow of Data

Replicating and updating information across multiple systems.

Security and Compliance

Is your data secure and does it comply with laws?

Interview data experts and database administrators

Data Architect

2.1.1 60 minutes

Input

  • Integration requirement categories

Output

  • Prioritized and categorized requirements

Materials

  • Data Integration Requirements Gathering Tool

Participants

  • Data Architect
  • Business Analyst
  • DBAs

Use this opportunity to gain an accurate understanding of the true state of operations around data integration.

Focus the discussion on the following:

  • Daily issues being experienced.
  • What constraints in the current environment are preventing the application of long-term fixes to recurring integration issues?
  • Policies related to integration.
  • Degree of governance around data integration.
  • How changes are deployed for integrations.
  • Data models and documentation.

Ask IT if they know of any future requirements.

Example: Is IT receiving pressure for more big data analytics from BI operations?

Prioritize requirements to assist with solution modeling

Data Architect

Prioritization ensures that the development team focuses on the right requirements.

The MoSCoW Model of Prioritization

Must Have - Requirements that must be implemented for the solution to be considered successful.

Should Have - Requirements that are high priority and should be included in the solution if possible.

Could Have - Requirements that are desirable but not necessary and could be included if resources are available.

Won't Have - Requirements that won’t be in the next release but will be considered for future releases.

(Agile Business Consortium, 2014)

Prioritization is the process of ranking each requirement based on its importance to project success. Hold a separate meeting for the domain SMEs, implementation SMEs, project managers, and project sponsors to prioritize the requirements list. At the conclusion of the meeting, each requirement should be assigned a priority level. The implementation SMEs will use these priority levels to ensure that efforts are targeted toward the proper requirements and the plan features available on each release. Use the MoSCoW Model of Prioritization to effectively order requirements.
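The MoSCoW ranking above can be captured in a simple structure. This is an illustrative sketch only; the requirement names and their assigned levels below are invented:

```python
# Hypothetical MoSCoW prioritization sketch; requirement names and
# priority assignments are invented for illustration.
MOSCOW_ORDER = {"Must": 0, "Should": 1, "Could": 2, "Won't": 3}

requirements = [
    ("Near-real-time sync to the data warehouse", "Should"),
    ("Mask credit card data in transit", "Must"),
    ("Self-service integration portal", "Won't"),
    ("Automated schema-change alerts", "Could"),
]

def prioritize(reqs):
    """Return requirements ranked Must -> Should -> Could -> Won't."""
    return sorted(reqs, key=lambda r: MOSCOW_ORDER[r[1]])

for name, level in prioritize(requirements):
    print(f"{level:>6}: {name}")
```

At the end of the prioritization meeting, each requirement should carry exactly one of the four labels so the implementation SMEs can plan releases against the ordered list.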

Brainstorm unique technical requirements depending on your use case

Data Architect

2.1.2 1 hour

Review the findings of your business interviews and identify the implications related to data integration.

Input

  • Integration requirement categories

Output

  • Prioritized and categorized requirements

Materials

  • Data Integration Requirements Gathering Tool

Participants

  • Data Architect
  • Business Analyst

To uncover the true latency needs of the business, go straight to the data owner. Based on their responsibilities and accountability for the data, they should be able to identify how current latency aligns with the business’ availability needs and timelines. Review the interview findings with critical project members (recommended: project manager and business analyst or sponsor).

Along with the requirements that are common among the use cases, there are unique requirements specific to each use case. When assessing the technical requirements, consider the unique requirements for each use case.

As a secondary resource, reach out to data stewards and power users.

Remember that not all data availability issues are data integration issues. There are two common buckets related to data availability:

  • Permission and access issues.
  • Data integration gaps/latency issues.

Permission and access issues, although often acute and critical issues for the user, fall under the purview of data security, not data integration.

Step 2: Leverage Integration Trends to Address Requirements

2.1 Determine Technical Requirements for the Integration Solution

2.2 Leverage Integration Trends to Address Requirements

2.3 Architect the Data-Centric Integration Strategy

2.4 Calculate ROI to Attach Dollar Value

This step will walk you through the following activities:

  • Understand the various types of technologies and trends in data integration to determine the right solution for your environment.

This step involves the following participants:

  • Data Architect
  • Business Analyst

Outcomes of this step

  • An understanding of how the various trends in data integration can apply to your organization to create a loosely coupled environment that is flexible, scalable, and agile.

Plan to make your data integration architecture as loosely coupled as possible

Data Architect

As you go through the process of solving data integration problems, make sure you plan for the future.

How organizations integrate their data depends on the use of the data and how the integrations support the needs of the business. Current integrations may be causing pains as they have not scaled properly or adapted to the business needs of today. When updating your integration architecture, you must consider the future. Scalability, agility, and adaptability are important considerations for your future architecture.

Info-Tech Best Practice

Tight coupling and loose coupling are relative terms meant to represent the state of an application’s interactions with the data of other internal or external applications. The concepts fall on a spectrum, and the correct balance of coupling needs to be determined according to business need.

"The question for loose coupling is ‘where do we need it?’ It is not just black and white. Loose coupling is less important for internal coupling. You can rely on the systems having a fixed schema, and if I know what the endpoint will give me, I don’t need loose coupling." – Jason Bloomberg, President, Intellyx

Tightly Coupled Architecture

A tightly coupled architecture has the following attributes:

  • Lack of flexibility.
  • Fragile architecture that breaks easily when impactful changes are made (a non-impactful change is one that doesn’t break the integrations).
  • Decreased ability to explore new functionalities or take advantage of external sources of data.
  • Fixed schema.

Tight coupling leads to data integration pains.

Loosely Coupled Architecture

A loosely coupled architecture has the following attributes:

  • Flexible
  • Scalable
  • Schemaless
  • Allows the organization to take advantage of new functionalities and external integrations

Moving along the spectrum from tight to loose coupling increases flexibility, scalability, and agility, but also increases cost and difficulty.

Complete decoupling is incredibly difficult and expensive, if not impossible. So, what architects should be aiming for is achieving the right level of loose coupling to facilitate business agility without imposing huge costs.
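The contrast can be made concrete with a minimal sketch. This example is not from the blueprint; the field names and table layout are invented to show why a consumer bound to a producer's internal schema is fragile, while one bound to an agreed message contract is not:

```python
# Illustrative sketch of tight vs. loose coupling (invented data shapes).

# Tightly coupled: the consumer reads another system's table row directly,
# so any change to the source's column order or schema breaks it.
def tightly_coupled_consumer(orders_table_row):
    return orders_table_row[3]  # position 3 assumed to hold "total" forever

# Loosely coupled: the consumer depends only on an agreed message contract
# (a common message model), not on the producer's internal schema.
def loosely_coupled_consumer(order_message):
    return order_message["total"]  # survives source-side schema changes

order_message = {"order_id": "A-100", "total": 59.99}
print(loosely_coupled_consumer(order_message))
```

The loosely coupled consumer keeps working as long as the message contract is honored, which is the agility the architecture is buying; the cost is the extra work of defining and maintaining that contract.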

Create a loosely coupled integration environment by leveraging trends in the space

Data Architect

In the field of data integration technology, there are multiple trends that are shaping how organizations can better integrate their data today and in the future.

We have categorized these trends into the use case categories identified in Phase 1:

Orchestration trends:

  1. Microservices: the trend toward smaller, more focused, and loosely coupled applications to support agility and flexibility, as opposed to monolithic, tightly coupled applications.

Analytics trends

  1. Real-time integration: a greater need for speed of analytics has pushed real-time integration forward.
  2. NoETL: as organizations strive to leverage unstructured data, NoETL will grow in popularity.
  3. Data hub: data hub is growing in popularity and making data available to internal and external systems in real time.

Legacy trends

  1. Cloud

Conglomerate trends

  1. MDM: a trend that can significantly benefit organizations undergoing frequent mergers and acquisitions. A single consolidated view of master data is required for ongoing business operations and analytics.

"The end of ‘Fashion-IT’ – customers will only pay for value and not technology."– S. Ghosh

Read the Data Integration Trends Presentation to understand which trends are right for you

Data Architect

2.2.1 Data Integration Trends Presentation

The speed at which new technology changes makes it difficult for IT professionals to keep pace with best practices, let alone cutting-edge technologies.

Info-Tech’s Data Integration Trends Presentation provides a glance at some of the more significant innovations in technology that are driving today’s advanced data architectures.

This presentation also explains how these trends relate to either the data challenges you may be facing or the specific business drivers you are hoping to bring to your organization.

Data Integration Trends Presentation

Step 3: Architect the Data-Centric Integration Strategy

Phase 2

2.1 Determine Technical Requirements for the Integration Solution

2.2 Leverage Integration Trends to Address Requirements

2.3 Architect the Data-Centric Integration Strategy

2.4 Calculate ROI to Attach Dollar Value

This step will walk you through the following activities:

  • Use the technical requirements of the data-centric integration solution in conjunction with the data integration trends to create the high-level integration strategy.
  • Identify the pattern that best addresses the business and technical requirements.

This step involves the following participants:

  • Data Architect

Outcomes of this step

  • Knowledge of the various integration patterns that are used in today’s modern data landscape.
  • A pattern-based approach to architecting the data integration environment.

Adopt a patterns-based approach to your data integrations

Data Architect

Patterns are observed, not invented, and can be used in strategy through design and deployment to improve quality and reduce cycle times.

Benefits of Patterning

  • Developer Effectiveness
  • Development Estimates
  • Documentation Quality
  • Report Accuracy
  • Data Quality
  • DI Errors
  • Hand Coding

Process for Adopting a Patterns-Based Approach

At Design Time

  • Identify the components required for integration scenarios.
  • Document component interactions and data flow.
  • Document non-functional attributes of the interactions such as security and availability.
  • Influence architecture decisions such as where to place the functionality: on-premises or in the cloud.

At Deployment

  • Test patterns prior to implementation.

2: The average number of integration patterns in use at any one organization, depending on needs. (DZone, 2015)

Determine your data integration architecture with the support of Info-Tech’s research, tools, and patterns

Data Architect

2.3.1 2 hours

Data integration architecture is fundamental to effective data integration.

Determine your organization’s data integration architecture patterns by completing the two steps of this activity:

  • Part A: Build your understanding of the different reference patterns. (Resource: 2.3 Pattern Slides)
  • Part B: Select your reference patterns for the different integration scenarios of your business. (Resource: Data Integration Pattern Selection Tool)

During your evaluation

Review the high-level integration reference architecture and identify how different patterns utilize different components.

The image shows the Data Integration Reference Architecture graphic described earlier.

Consider the factors that will drive your pattern selections

Data Architect

Business Latency Needs

Assess the timelines between when the organization captures the data and when it needs to be accessible and consumable for the business.

This business need should have been uncovered during the business assessment in your requirements gathering phase.

Volume of Data

  • Amount of data being imported into the business from external sources.
  • Amount of data being generated from the organization’s internal processes (big data).
  • Amount of master data present within the organization and dispersed across the business.

Access Requirements for Historical Data

Determine the amount of data history that must be available to be reviewed by the business and incorporated into its reporting practices.

Consider:

Compliance requirements

Use of historical data for BI

"Data can be manipulated by any tool nowadays. The key is to choose the correct technology for the correct job." – Andy Neill, Sr. Director of Research, Info-Tech

The image shows a graph, titled Pattern Selection Driver. On the X-axis is Volume (of data), and on the Y-axis is Latency needs of the business. Within the graph, there are 5 circles, labelled: Pub/Sub; Micro Services; Cloud; ETL; Streaming Data.

Analyze integration reference patterns

Data Architect

2.3.2 1 hour

Use the following slides to support your organization’s understanding of the different reference patterns for data integration, and select a pattern that aligns to your own business needs and capabilities.

In-Scope Patterns

  • ETL
  • Streaming Data
  • Publication/Subscription
  • Microservices
  • Cloud

Out-of-Scope Patterns

  • Data Federation
  • Data Virtualization

Data federation and virtualization are not considered in the scope of this data integration blueprint due to the general lack of adoption and feasibility of these integration ideas.

Evaluations of each pattern include the following:

  • Reference architecture
  • Integration scenarios
  • Associated processes and policies
  • Staff requirements
  • Technology implication

The following chart represents the percent of organizations that use the following three integration patterns:

Point-to-point: 69%

Message buses: 52%

Hub-and-spoke: 24%

What is ETL and how does it integrate an organization’s data?

Data Architect

Extract – Transform – Load

This can be done in real time or batch

Defined

Process of integrating data from a source to its target through the following three steps:

Extract

Pulls a desired subset of data from a source(s).

Transform

Converts data to fit target destination’s format using rules and lookup tables.

Load

Imports the transformed data into the target database.

A derivative of this pattern is known as Extract, Load, Transform. It uses the processing power of the target database to perform the transformation.
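The three steps can be sketched end to end in a few lines. SQLite stands in for real source and target databases, and the table names, columns, and cents-to-dollars rule are all hypothetical:

```python
import sqlite3

def extract(conn):
    # Extract: pull a desired subset of rows from the source
    return conn.execute("SELECT id, amount FROM orders WHERE amount > 0").fetchall()

def transform(rows):
    # Transform: convert to the target's format (hypothetical cents -> dollars rule)
    return [(oid, round(amount / 100.0, 2)) for oid, amount in rows]

def load(conn, rows):
    # Load: import the transformed rows into the target table
    conn.executemany("INSERT INTO orders_clean VALUES (?, ?)", rows)

source = sqlite3.connect(":memory:")
source.execute("CREATE TABLE orders (id INTEGER, amount INTEGER)")
source.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 1999), (2, -5), (3, 450)])

target = sqlite3.connect(":memory:")
target.execute("CREATE TABLE orders_clean (id INTEGER, amount REAL)")

load(target, transform(extract(source)))
print(target.execute("SELECT * FROM orders_clean ORDER BY id").fetchall())
# [(1, 19.99), (3, 4.5)]
```

In the ELT variant, the `transform` step would instead run as SQL inside the target database after loading the raw rows.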

Flow of data in ETL

Source, Source → Access → ETL Services → Transformed Data → Target

Current Use

  • Often used for populating data warehouses, data marts, and operational data stores (ODS) on a periodic basis with data changes since the last update.
  • Can also be used for migration of data from one system to another.

Strengths and Capabilities

  • Efficient Processing of Large Data Sets: ETL solutions are built for high performance data processing.
  • Data Transformation: Some data transformation requirements can only be met by ETL, e.g. complex de-normalization, consolidation, and joins.

Limitations and Implications

  • Latency: Depending on the frequency of batches, latency is introduced between source and target.
  • Agility: Because ETL processes are bound to physical data stores, they are less agile.

ETL is the most commonly performed integration pattern in today’s business

ETL is typically seen in the following use cases: analytics, legacy, and conglomerate. It is not seen in orchestration use cases, as batch processing is too slow for the data velocity required.

Data Requirements

  • Source data schemas need to be understood.
  • Foreign keys need to be understood across the data being integrated.
  • Match and merge rules must be defined.
  • Staff must understand the processing and query implications on a source database.

People

  • Data architects
  • Data and business analysts
  • Database administrators
  • System administrators

Process

  • Data latency/timeliness guidelines
  • Data protection policies
  • Data access and security policies
  • Change management

Technology

  • ETL technology
  • Network bandwidth
  • File system storage for staging tables
  • Memory allocation in ETL server(s)

Assess the characteristics and value of streaming data

Data Architect

How It Works

Streaming data technology helps to provide near real-time data analytics. It pushes data out of source systems in real time and applies subsequent transformations and algorithms to each event. Streaming data offers continuous data collection, transformation using a variety of methods, and distribution as necessary to target systems that need to know about the final action.

Source → Scheduled (high freq.) Streaming Data → Integration Services → Event Output → Target

Current Use

  • Streaming data is typically used in continuous integration scenarios where often complex events processing is required in near real time.
  • Used to perform advanced analytics in data lakes.

Strengths and Capabilities

  • Real-Time: Streaming data technology is high performance and real time and supports many advanced analytics use cases.
  • Reduced latency: This pattern can reduce data latency and support continuous transformation.

Limitations and Implications

  • Invasive: Streaming data processing is disruptive and invasive to traditional data-centric integration designs.
  • Technology: Need to invest in new technology such as a big data platform to support streaming data processing.
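The continuous collect-transform-distribute flow described above can be sketched in a few lines. The sensor events, window size, and threshold below are all hypothetical, and the generator stands in for a real stream consumer (e.g. a Kafka topic):

```python
from collections import deque

def event_stream():
    # Simulated source; in practice this would be a message/queue consumer
    for i, temp in enumerate([21.0, 21.5, 35.2, 22.0, 36.1]):
        yield {"sensor": "s1", "seq": i, "temp_c": temp}

alerts = []  # stand-in for a target system that needs to know

def process(stream, window=3, threshold=30.0):
    # Continuous transformation: rolling average plus threshold alerting
    recent = deque(maxlen=window)
    for event in stream:
        recent.append(event["temp_c"])
        avg = sum(recent) / len(recent)
        if event["temp_c"] > threshold:
            # Distribute: push the derived event to the target
            alerts.append({"seq": event["seq"], "rolling_avg": round(avg, 2)})

process(event_stream())
print([a["seq"] for a in alerts])  # events 2 and 4 exceed the threshold
```

A production pipeline would do the same per-event work, but on an unbounded stream with a platform such as Kafka, Spark, or Flink handling transport, scale, and fault tolerance.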

Info-Tech Insight

Streaming data processing is a fundamental component of any modern data processing platform. Most organizations will move towards streaming data processing to enable decision making where it matters most and to embed intelligence in day-to-day business processes.

Review the architecture and requirements of streaming data integrations

Data Architect

Streaming data is typically seen in the following use cases: advanced analytics, analytics, orchestration.

Not seen in legacy or conglomerate use cases because the focus is on real-time event processing and related transformation.

Data Requirements

  • Message model needs to be clearly defined and understood.
  • Target event processing and integration must be identified and maintained.
  • Advanced analytics use cases must be understood across data being integrated.

People

  • Data architects
  • Data and business analysts
  • Data engineer

Process

  • Data latency guidelines
  • Data protection policies
  • Data access and security policies
  • Change management

Technology

  • Data Streaming technologies such as Kafka, Storm, Spark, and Flink.
  • Network bandwidth

Info-Tech Insight

Streaming data may require message-oriented middleware to transport the data from the source for transformation and/or distribution. Be sure to understand the full breadth of the technical components required for this type of pattern.

Assess the characteristics and value of publication/subscription

Data Architect

How it Works

Publication/subscription is middleware technology that acts as an integration point between the enterprise’s systems, eliminating the complexity and costs associated with point-to-point integration and providing data veracity and velocity.

An example of pub/sub technology: the ESB.

The image shows a graphic with a large rectangle in the centre that reads ESB. Connected to that rectangle are smaller rectangles labelled Application Services. The left side is labelled Service Requester, and the right side is labelled Service Providers.

  • An ESB is a middleware solution that acts as an intermediary between service requesters and service providers, facilitating multiple integration methods across heterogeneous IT environments.
  • At its core, an ESB facilitates system integration and reduces the number and complexity of application interfaces.
  • With an ESB, there is a single interface across communication systems. The ESB can transform and augment messages between service requesters/providers.
  • An ESB enables loose coupling and breaks up integration logic into manageable pieces.

This technology provides the organization with the following benefits:

  • Move data between applications with velocity, without compromising veracity.
  • Improve centralized integration management.
  • Improve productivity and agility when it comes to building integration.
  • Better logging and analytics of integration activities.
  • Greater abstraction of data.
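A minimal sketch of the publish/subscribe mechanics an ESB builds on: publishers address a topic, not specific consumers, which is what delivers the loose coupling listed above. This toy in-process bus omits the transformation, routing, persistence, and monitoring a real ESB provides:

```python
from collections import defaultdict

class MessageBus:
    """Toy in-process publish/subscribe broker (illustration only)."""

    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    def publish(self, topic, message):
        # The publisher addresses a topic, never a specific consumer
        for handler in self.subscribers[topic]:
            handler(message)

bus = MessageBus()
received = []

# Two downstream systems subscribe independently to the same event
bus.subscribe("customer.updated", lambda msg: received.append(("crm", msg["id"])))
bus.subscribe("customer.updated", lambda msg: received.append(("billing", msg["id"])))

bus.publish("customer.updated", {"id": 42})
print(received)  # [('crm', 42), ('billing', 42)]
```

Adding a new consumer is one `subscribe` call; the publisher and existing consumers are untouched, which is the agility benefit over point-to-point interfaces.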

The image shows a sample Logical Architecture Diagram.

Review the architecture and requirements of publication/subscription integrations

Data Architect

Pub/sub is typically seen in the following use cases: analytics and orchestration.

Not seen in legacy or conglomerate use cases because the focus is on speed and veracity of message transmission.

Data Requirements

For your data to be ready for pub/sub technology, the people implementing the data hub or ESB must fully understand the data formats currently in use across the organization, and a data model should be in place. If internal data models do not already exist, adopting industry-standard models will help you get ready.

People

  • Data architects
  • Data and business analysts
  • Data engineer
  • System administrators

Process

  • Data latency guidelines
  • Data protection policies
  • Data access and security policies

Technology

  • ESB, data hub, or other message-oriented middleware technology

Info-Tech Insight

There is no standard universal definition or usage of middleware technology since every integration environment is unique. This means that knowing your own situation and using that rather than external findings to drive your implementation and configuration is of the utmost importance.

Assess the characteristics and value of microservices

Data Architect

How It Works

Traditional data analysis methods are difficult, expensive, and time intensive. In addition to better performance and lower operating costs, node-based architecture provides information redundancy, protecting data against the failure of any single central node.

Current Use

  • While node-based architectures are used today, no common platform exists for microservices architectures.
  • Expect the ongoing convergence of hardware and software, a phenomenon that could eclipse mobile. Combining microservices architecture and the IoT, application software developers will need to understand the code in hardware.
  • As architectures evolve around computer-to-computer interactions, there will be more emphasis on simpler, more specific APIs. In contrast to conventional APIs that serve general-purpose requests, these will be able to home in on a specific microservice or requirement.

Strengths and Capabilities

  • Microservices and node-based architecture can offer performance advantages that come primarily from outsourcing memory and processing power to independent nodes.
  • A pure distributed system can rapidly scale horizontally by adding nodes with memory and processing power.
  • Compared to a controller-based or central architecture, node-based architectures eliminate single points of failure, enable simple automated maintenance, and support non-disruptive upgrades.
  • Focusing on a microservices architecture enables rapid delivery. Whereas traditional development models have one large team producing one major application, microservices entail many small teams responsible for the streamlined end-to-end development of individual services that work together toward the whole.

Limitations and Implications

  • The microservices architectural concept is relatively new in mid-market organizations, meaning businesses are likely to face certain challenges associated with building up these capabilities. Vendor technologies for the mid-market, however, are rapidly maturing.
  • Node-based architecture can take longer to respond to queries that involve transactions across multiple nodes and large data sets. Several hops between nodes may be required to retrieve one piece of information and so there is a reliance on the speed of the network.

Info-Tech Insight

Application architectures of the past are not flexible enough to meet the demands of today’s mobile users and their multitude of applications. Node-based microservices offer greater speed and control in delivering innovation, real-time analytics, and maintenance processes.
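The "many small services composed into one business function" idea can be illustrated in-process. The SKU data and service names below are hypothetical, and real services would sit behind HTTP or gRPC APIs rather than plain function calls:

```python
def inventory_service(sku):
    # Hypothetical stock lookup, owned end to end by one small team
    stock = {"SKU-1": 3, "SKU-2": 0}
    return {"sku": sku, "in_stock": stock.get(sku, 0) > 0}

def pricing_service(sku):
    # Hypothetical price lookup, independent of inventory
    prices = {"SKU-1": 19.99, "SKU-2": 5.00}
    return {"sku": sku, "price": prices.get(sku)}

def order_orchestrator(sku):
    # Orchestration: compose the independent services into one business function
    if not inventory_service(sku)["in_stock"]:
        return {"status": "rejected", "reason": "out of stock"}
    return {"status": "accepted", "sku": sku, "price": pricing_service(sku)["price"]}

print(order_orchestrator("SKU-1")["status"])  # accepted
print(order_orchestrator("SKU-2")["status"])  # rejected
```

Because each service exposes only a narrow contract, either one can be reimplemented, scaled, or redeployed without touching the orchestrator, which is the integration payoff the pattern promises.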

Review the architecture and requirements of microservice integrations

Data Architect

The microservices pattern is typically seen in the following use case: orchestration.

Not seen in legacy or conglomerate use cases because the focus is on speed of data update at small volumes.

Data Requirements

  • Source API services need to be understood.
  • Business requirements and workflows must be identified and organized.
  • Independent service components and their functionality must be defined.

People

  • Data architects
  • Data and business analysts
  • Programmer
  • Data engineer

Process

  • Data latency guidelines
  • Data protection policies
  • Data access and security policies

Technology

  • SOA, domain model oriented services
  • Microservices patterns and framework

Info-Tech Insight

Microservices is all about breaking down your monolithic application. The key here is to define independent reusable components that can be orchestrated to provide complete business functionality. This will in turn eliminate integration pains across systems.

Assess the characteristics and value of cloud-driven integration

Data Architect

Cloud application connectivity is becoming increasingly important. DI vendors are offering more SaaS connectivity and building their own data integration services (IaaS) in the cloud, making them available via a subscription model.

Drivers

Cloud applications (Salesforce, GoodData)

  • Data is hosted on the cloud

Private cloud

  • Transfer data to and from the cloud

Common Cloud Integration Scenarios and Architecture

1. Integration of Cloud Applications On-Premises

Layer 1 Applications → Layer 2 ETL - ETL; ESB → Layer 3 and 4 Warehousing - Data Hub; EDW; Data Marts → Layer 5 Analytics - BI; Predictive

Ideal for organizations with:

  • A limited number of cloud applications.
  • Primarily in-house data warehousing and analytics environment.
  • A lot of sensitive data that needs to be kept on-premises.

2. Integration, Warehousing, and Analytics in the Cloud

Layer 1 Applications → Layer 2 ETL - ETL; ESB → Layer 3 and 4 Warehousing - Amazon Redshift Warehouse and Data Marts → Layer 5 Analytics - Hosted BI; Predictive on the Cloud

Ideal for organizations with:

  • Many cloud applications, including applications in layers 3, 4, and 5.
  • Cloud-first strategy.
  • Small data management and IT groups.

Fit for purpose. Cloud data sets can be accessed or written using different options – API, proprietary connector, ODBC connection, data dump, etc. Evaluate each option and match the most suitable option to your data integration tool.
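As a sketch of the API access option mentioned above, the loop below walks a paginated endpoint until it is exhausted. `fetch_page` is a stub standing in for a real SaaS API call; the page size, record shape, and termination rule are assumptions that vary by vendor:

```python
def fetch_page(page, page_size=2):
    # Stub for a real paginated SaaS API call (record shape is hypothetical)
    records = [{"id": i} for i in range(5)]
    start = page * page_size
    return records[start:start + page_size]

def pull_all(page_size=2):
    # Walk pages until the endpoint returns an empty page
    page, out = 0, []
    while True:
        batch = fetch_page(page, page_size)
        if not batch:
            break
        out.extend(batch)
        page += 1
    return out

print(len(pull_all()))  # 5 records gathered across 3 pages
```

When evaluating each access option, weigh this kind of per-page API chatter against the bulk throughput of an ODBC connection or data dump.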

Assess the characteristics and value of cloud-driven integration

Data Architect

Integration with SaaS or PaaS cloud offerings combines application and data integration, as clients can only operate at one level.

Vendors manage the databases associated with an application – the client’s contract enables them to use and pull from the application.

  • Clients do not have access to the database level.

Organizations are placing more and more of their data in the cloud

24% of surveyed clients have migrated their CRM to the cloud

At the moment CRM is the most popular business app to outsource to the cloud. (Info-Tech Research Group; N=144)

For most organizations, CRM systems store large amounts of master data and serve as a critical system of record that feeds a number of additional systems. As a result, successful integration of a cloud-based CRM solution is critical for it to be successfully utilized by an organization.

Customer records and sales transaction information are two of the most common data types located in the cloud.

Consider the implications of adopting cloud solutions

Performance Capabilities

The throughput of cloud solutions is orders of magnitude lower than that of traditional on-premises solutions (tens versus hundreds or thousands of operations per second).

Security

Constraints over what types of data can be stored in the cloud.

Info-Tech Insight

Integrating with the cloud presents challenges such as security, performance, and bandwidth requirements. Fortunately, many DI tools have been introduced in the market to help organizations address and manage these issues and these tools exist both as on-premises and as-a-service solutions.

Review the architecture and requirements of cloud integrations

Data Requirements

  • Source data schemas need to be understood.
  • Change log locations must be identified and maintained.
  • Foreign keys must be understood across data being integrated.

People

  • Data architects
  • Data and business analysts
  • Database administrators
  • System administrators

Process

  • Data latency guidelines
  • Data protection policies
  • Data access and security policies
  • Change management

Technology

  • Cloud platform and connectors
  • Network bandwidth

Info-Tech Insight

Cloud integration adds an extra layer of complexity if data is distributed between on-premises and cloud platforms. Cloud connectors must be evaluated carefully to integrate data securely.

An organization found that strong vendor management was critical to protecting sensitive data in the cloud

Data Architect

CASE STUDY

Industry Healthcare

Source Client Interview

Challenge

  • The organization faced a large volume of dental claims scanned and keyed in locally – preventing optimization of time and resources.
  • Security and PHI are huge issues for those in the healthcare industry.
  • Security services surrounding processes and other functions that could not be supported locally (e.g. email archiving) were necessary.

Solution

Decision

  • The business made the decision to outsource keying of claims to cloud services.

Contract Negotiation

  • During negotiation, the organization explicitly stated its need for a vendor policy to the potential service providers.
  • The organization negotiated and managed the SLAs with service providers to ensure standards of data quality and data security were being met.

Results

  • Outsourced keying of claims. Redundancies were eliminated, improving overall data quality.
  • Risk management became an integral part of the IT department.
  • Service providers implemented appropriate security technologies (PGP encryptions, secure FTP, VPN, etc.) and processes (SAS 70) to meet their customer and regulatory requirements.

Key Finding

Negotiations for contractual terms are crucial to the security of DI in a cloud environment.

Select your integration patterns

Data Architect

2.3.3 2 hours

Input

  • Existing organizational architecture

Output

  • A selection of the various integration patterns based on feasibility

Materials

  • Integration patterns and their reference architectures
  • Existing architecture

Participants

  • Data Architect
  • Data Workers

Instructions

  1. Review the reference pattern recommendations for each evaluated scenario.
  2. Discuss the feasibility of implementing the recommended pattern. Note: Each pattern is given a best-fit pattern recommendation, as well as a secondary pattern. If constraints limit your feasibility in implementing the primary pattern, consider the opportunities associated with the secondary pattern.
  3. Document your selected pattern outcomes.

Identify limitations for the chosen pattern

  1. Review the integration patterns and deployment methods selected during this step. Identify the limitations regarding people, processes, and technology that exist based on the current resources and environment present at your organization.
  2. Document the barriers currently present and their solutions. Identify business implications and concerns that may arise through the use of cloud integration and data storage.
  • Are current staff levels able to support new integration patterns?
  • Does current staff have the knowledge and expertise to perform these patterns?
  • Is management or business approval or sign-off required to proceed?
  • How will integration practices change?
  • What governance changes must be considered?
  • Are additional technology investments required to support these patterns?

Step 4: Calculate ROI to Attach Dollar Value

Phase 2

2.1 Determine Technical Requirements for the Integration Solution

2.2 Leverage Integration Trends to Address Requirements

2.3 Architect the Data-Centric Integration Strategy

2.4 Calculate ROI to Attach Dollar Value

This step will walk you through the following activities:

  • Use the technical requirements of the data-centric integration solution in conjunction with data integration trends to create a high-level integration strategy.
  • Identify the pattern that best addresses the business needs and technical requirements.

This step involves the following participants:

  • Data Architect

Outcomes of this step

  • Clear understanding of benefits and costs associated with the data integration project.

"When you say ROI, do you mean return on investment or risk of inaction?" – P. Gillin

To maintain momentum for the integration project, attach a dollar value

Data Architect

It is essential to identify the benefit and cost components before quantifying and analyzing the ROI.

Benefits

  • Reduce time to integration: Use a pilot to compare point-to-point and data-driven integration to find out the time savings. You should have calculations for integration with low, medium, and high complexity.
  • Increase in integration quality: Track, on average, how many defects are found in point-to-point integration and data-driven integration. Calculate how much is saved due to fewer defects.
  • Become more loosely coupled: You can estimate how much can be saved by being more loosely coupled and modular. Also factor in becoming more vendor agnostic.

Costs

  • Initial technology costs: This is the initial technology cost when you install and implement an integration platform. It includes hardware, software, and other related costs.
  • Licensing costs: This is the licensing cost of your integration platform solution of choice. It may be incurred on an annual basis, term basis, or even subscription basis.
  • Supporting costs: This is the support cost. Support cost can be the cost of subscribing to the vendor support or hardware/software upgrade costs.

Calculate the data-centric integration ROI and use it to demonstrate the long-term benefits

Data Architect

Info-Tech’s

Enterprise Service Bus ROI Calculator

This was built based on several case studies. The cost savings and the benefits are verified and they apply to most medium and large enterprises. If you have specific assumptions applicable to your organization, please document them in tab 2, Assumptions.

The Very Nature of ROI

  • Data-centric integration will create more value if the integration complexity is very high.
  • ROI increases significantly after time passes.

Complete the Enterprise Service Bus ROI Calculator to calculate your ROI.

The image shows a graph with Integration Complexity on the x-axis, and Cost on the y-axis. There are two lines in the graph: one labelled Point-to-Point, which rises from left to right; the other is labelled Data Centric Integration, and falls from left to right.
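The benefit and cost components can be combined into a simple ROI formula, (cumulative benefit - cumulative cost) / cumulative cost. All figures below are made-up placeholders to be replaced with your own estimates from the calculator; the sketch just shows why ROI starts negative and improves as years pass:

```python
# All figures are illustrative placeholders, not Info-Tech benchmarks
initial_cost = 200_000      # install/implement the integration platform
annual_license = 50_000
annual_support = 20_000
annual_savings = 150_000    # faster integration + fewer defects + looser coupling

def roi_after(years):
    cost = initial_cost + years * (annual_license + annual_support)
    benefit = years * annual_savings
    return (benefit - cost) / cost

for y in (1, 3, 5):
    print(y, round(roi_after(y), 2))  # ROI turns positive as years pass
```

The fixed initial cost dominates early years; once annual savings outpace annual license and support costs, each additional year improves the ratio.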

Calculate the long-term benefits of the data-centric integration

Data Architect

2.4.1 Enterprise Service Bus ROI Calculator

Purpose

Determine the ROI specifically for your environment and the data integration solution you have selected.

Instructions

  • Review the benefit and cost components. Brainstorm on the assumptions used to determine these components. Document the assumptions on tab 2, Assumptions.
  • Imagine performing application integration in five different scenarios:
    1. Simple integration – one to one application.
    2. Complex integration – one to many, or many to many applications.
    3. Integration with cloud provider services – SaaS/hosted applications.
    4. Partner integration – conglomerate, supplier, or B2B.
    5. A change request to an existing integration.
  • Try to define the resources required and how long the resources will take to complete the scenarios using point-to-point integration or ESB on tab 3, Integration Savings per Year. At this point an estimation is good enough. Once you have completed the pilot project, you can come back and validate the estimates.
  • Document your costs on tab 4, ROI Calculation. The costs should be based primarily on the ESB vendor agreement.

Communicate the benefits of the solution to the business

Data Architect

2.4.1 1 hour

Input

  • Identified integration architecture and its benefits

Output

  • Requests for changes to the suggested approach or approval

Participants

  • Business Analyst
  • Data Architect
  • Business Stakeholders

Instructions:

Begin by presenting your plan and roadmap to the business units who participated in business interviews in activity 1.2.1 of phase 1.

If you receive feedback that suggests you should make revisions to the plan, consult Info-Tech Research Group for suggestions on how to improve the plan.

If you gain approval for the plan, communicate it to the core team and extended team members.

Iterative Optimization and Communication Plan:

Start here:

Communicate Plan and Roadmap to the Business

Revise Plan

Receive Approval or Suggested Modifications

Get Advice for Improvements to the Plan

Approval

Present to Data Workers

Phase 2 helped you identify an integration solution architecture

Data Architect

For phase 2, the following list of success criteria should be achieved before moving to phase 3:

  • Identification of the technical requirements for the data-centric integration solution based on the fulfillment of the business requirements.
  • Consideration of the trends in the data integration field, and how they may benefit your organization.
  • A data integration architecture strategy created based on data integration patterns and communicated to the business.

ESB integrations enabled Coca-Cola to set up a new business entity in a tight time window

Data Architect

CASE STUDY

Industry Retail

Source Software AG, 2017

As an organization, Coca-Cola is well known for its agile and innovative management.

To maintain its strong market positioning, Coca-Cola’s supply chain management and business operations must be able to operate at a high level of efficiency, with streamlined communications and interactions between different business units and the systems that they use.

As a result of the organization’s size and use of technology, it required an EAI environment that was able to manage a high volume of messages and interactions across a wide network of systems and applications. A change in the business landscape led to Coca-Cola needing to migrate and integrate 1,100 applications over a six-month project period to have operations running for two newly created companies.

"Enterprise integration is the enabler to give you agility in your company."– Kevin Flowers, Chief Technology Officer, Coca-Cola Enterprises

Role of an ESB

An integration platform that functioned as an ESB was used to create streamlined interfaces and create a high-performing messaging environment that was able to quickly deliver messages.

Results

Value $100,000,000 in cost savings due to the strategic integration project performed during the creation of the new business entities.

Commerce was able to flow from minute one.

If you want additional support, have our analysts guide you through this phase as part of an Info-Tech workshop

Book a workshop with our Info-Tech analysts:

  • To accelerate this project, engage your IT team in an Info-Tech workshop with an Info-Tech analyst team.
  • Info-Tech analysts will join you and your team onsite at your location or welcome you to Info-Tech’s historic Toronto office to participate in an innovative onsite workshop.
  • Contact your account manager (www.infotech.com/account), or email Workshops@InfoTech.com for more information.

The following are sample activities that will be conducted by Info-Tech analysts with your team:

2.1.1 Identify the technical requirements needed for the integration solution

An Info-Tech facilitator will help your organization identify the technical requirements that are typically associated with integration solutions for your use case. These requirements will be created in synchronization with the business requirements collected in phase 1.

2.2.1 Identify the latest trends in integration and determine your high-level solution

Understanding how data integration can be a tough sell on its own, an Info-Tech facilitator will help your organization identify the latest trends in data integration technology and how those trends can be adapted into your organization’s architecture.

Phase 3

Design the Data-Centric Integration Solution

Build a Data Integration Strategy

Phase 3 will help you to take your refined technical requirements and design the solution

Now that you know the requirements for the data-centric integration solution, engage in phase 3 to design the solution.

Phase Success Criteria:

  • Phase 1
    • Collect Integration Requirements
  • Phase 2
    • Analyze Integration Requirements
  • Phase 3
    • Design the Data-Centric Integration Solution

For phase 3, the following list of success criteria should be achieved.

  • Determined if a PoC approach is needed. If not, ensure that the know-how is in place to implement the solution.
  • Confirmed that the integration solution design is valid and feasible for the organization’s data environment.
  • Documented source-to-target mapping for the new integrations.
  • Captured metadata for the data being integrated.

Phase 3 outline

Call 1-888-670-8889 or email GuidedImplementations@InfoTech.com for more information.

Complete these steps on your own, or call us to complete a guided implementation. A guided implementation is a series of 2-3 advisory calls that help you execute each phase of a project. They are included in most advisory memberships.

Guided Implementation 3: Design the Data-Centric Integration Solution

Proposed Time to Completion: 2 weeks

Step 3.1: Validate Your Data-Centric Integration Pattern

Start with an analyst kick-off call:

  • Determine if a PoC approach is required for the environment.

Then complete these activities…

  • Design the PoC.

With these tools & templates:

Data Integration PoC Template

Step 3.2: Design the Consolidated Data Model

Review findings with analyst:

  • Understand what a consolidated data model is.

Then complete these activities…

  • Document the business activities, sources, and targets of your new integration solution.

With these tools & templates:

  • Data Integration Mapping Tool

Step 3.3: Map Source to Target Model

Finalize phase deliverable:

  • Identify the types of transformations.

Then complete these activities…

  • Use the Data Integration Mapping Tool to map data.

With these tools & templates:

  • Data Integration Mapping Tool

Step 3.4: Capture Integration Metadata

Review findings with analyst:

  • Use existing tools to capture metadata.

Then complete these activities…

  • Identify the metadata that should be captured

With these tools & templates:

  • Data Integration Mapping Tool

Phase 3 Results & Insights:

Validation of your data-centric integration solution, and the data model design for the solution.

Step 1: Validate Your Data-Centric Integration Pattern

Phase 3

3.1 Validate Your Data-Centric Integration Pattern

3.2 Design the Consolidated Data Model

3.3 Map Source to Target Model

3.4 Capture Integration Metadata

This step will walk you through the following activities:

  • Engage the DBA team to determine feasibility and complexity of the proposed integration architecture.
  • Determine if a PoC is needed for your solution.
  • If not, validate your chosen integration pattern by ensuring that the skills and resources are in place to implement it.

This step involves the following participants:

  • Data Architect
  • Solution Designer

Outcomes of this step

  • Validation that the correct data integration pattern has been chosen based on previous integration experience or a PoC demonstration.

Communicate the data integration architecture strategy to the data engineer to design integration workflows

Data Engineer

After you’ve decided on a high-level data integration strategy in phase 2, the data engineer is the role that takes the strategy and designs the solution model.

Data Architect ↔ Data Engineer

"When designing the solution, you must consistently be asking: does the data need to be updated in another system? You may forgo the data integration, but it would require the business process to update the data in multiple systems. It’s a cost risk – if the data is out of sync, what are the implications to the company?" – Wayne Regier, Director of Data Management, Husky Injection Molding Systems

What is a data engineer?

  • This role could also be known as a big data expert or technical architect.
  • Responsible for understanding the high-level architectural design and putting the changes into action.
  • Develop, test, and implement technology solutions and report on delivery commitments to ensure solutions are implemented as expected and in agreed timeframes.
  • Within the agreed enterprise architecture, define and design technology solutions to assist the business in meeting its business objectives.

Determine if a PoC makes sense for your integration project

Data Engineer

Before you begin designing the data model for the full solution, we recommend that you conduct a PoC if there is no easy or established way to validate your solution.

Building confidence is key to any new engagement.

Building a data integration PoC can help prove feasibility and foster the business’s trust in the people building the new integrations.

PoC vs. Pilot

PoC:
  • A demonstration of a method’s feasibility.
  • Smaller in scale.
  • Purpose is to prove that a concept or theory has potential for use.

Pilot:
  • Limited in scope relative to the intended final solution.
  • Larger in scale than a PoC.
  • Typically involves a subset of the solution’s users or a limited number of business processes.

"Building a PoC depends on the solution and how comfortable the customer is with the solution. It is more to do with the complexity of the solution – if unsure of a certain solution, or if they are working with a new tool, they would be more comfortable with doing an exploratory project to make decisions." – Hamdan Ahmad, Consultant, Slalom Consulting

Info-Tech Insight

Cloud, big data, and external sources have added extra complexity in modern data integration initiatives. Streaming data and secured connectivity are key aspects that must be considered for data integration. A PoC is highly recommended.

Evaluate if you need a PoC

Data Engineer

3.1.1 1 hour

Input

  • Chosen data integration architecture pattern

Output

  • Decision on PoC

Participants

  • Data Architect
  • Data Engineer

Not every integration solution needs a PoC.

Use the following guidelines to decide if you need to conduct a PoC.

When do you need a PoC?

  • If the integration solution is a complex implementation without a precedent in the organization.
  • If the organization can support the timeline of a PoC. In some circumstances, deadlines must be met and don’t allow time for one; although skipping the PoC may reduce the chance of project success, reality can make it unrealistic.

When do you not need a PoC?

  • If the proposed integration solution fits into pre-existing data architecture and integration patterns, the concept has already been proven.
  • There is no need for a PoC, and the data integration pattern is inherently validated.

Validation of the data integration pattern.

If it is decided that no PoC is needed, go to phase 3, step 2.

Use Info-Tech’s Data Integration PoC Template to outline your PoC

Data Engineer

3.1.1 Data Integration PoC Template

A PoC is built by ensuring that all of the pieces for a functional, demonstrable product are present and that due diligence can be demonstrated. Key pieces of the PoC include:

  1. The purpose of the PoC and business context.
  2. How it was designed based on business requirements (rationale).
  3. High-level technical details.
  4. Who is doing what for the project.
  5. How to know if the PoC is successful for go-ahead.
  6. Assumptions about the limited size of the PoC.
  7. An assessment of the PoC results.
  8. Demonstration of the PoC. Sign-off.

Info-Tech’s Data Integration PoC Template

Table of Contents

  1. Executive Summary
  2. Business Requirements
  3. Business Objective
  4. Technical Requirements
  5. Environment
  6. PoC Location
  7. Cloud/On-Premises Environment
  8. Roles and Responsibilities
  9. Proposed Tests and Success Criteria
  10. Assumptions
  11. Human Resource Assumptions
  12. Technical and Facilities Assumptions
  13. Results
  14. Approval

Demonstrate the value of your PoC

Data Engineer

Input

  • Data Integration PoC Template
  • Data integration sample

Output

  • Approval for moving past the PoC stage

Materials

  • Data Integration PoC Template

Participants

  • Data Architect
  • Solution Designer

Advertise the data integration group’s successes during the PoC.

  • Share your group’s results internally:
    • Run your analysis by senior management and then share it across the organization.
    • Maintain a list of technologies that the working group has analyzed and solicit feedback from the wider organization.
    • Post summaries of the technologies in a publicly-available repository. The C-suite may not read it right away, but it will be easy to provide when they ask.
    • If senior management has declined to proceed with a certain technology, avoid wasting time and resources on it. However, include notes about why the technology was rejected.

Validation of the data integration pattern.

Info-Tech Insight

Cast a wide net. By sharing your results with as many people as possible within your organization, you’ll not only attract more attention to your working group, but you will also get more feedback and ideas.

Step 2: Design the Consolidated Data Model


This step will walk you through the following activities:

  • Use the Data Integration Mapping Tool to document your integration sources and targets.
  • Begin creating your integrated data model.

This step involves the following participants:

  • Data Architect

Outcomes of this step

  • Clarified source to target mapping.
  • The beginning of a comprehensive source to target mapping document for tracking and communicating the integration logic and solution.

Now is the time to scale up and define your data model

Data Engineer

Create your new data model.

Now that the feasibility of your planned data integration architecture is validated, it is time to get into the weeds of modeling your new integration environment. To do this, you need the following pieces of information:

  1. Sources of data.
  2. Targets of the data.
  3. Mappings between source and target. This includes any transformations occurring on the data.

Source → Transformations → Target

There are multiple types of data models that are useful for different reasons.

Feature              | Conceptual | Logical | Physical
Entity Names         | Yes        | Yes     | No
Entity Relationships | Yes        | Yes     | No
Attributes           | No         | Yes     | No
Primary Keys         | No         | Yes     | Yes
Foreign Keys         | No         | Yes     | Yes
Table Names          | No         | No      | Yes
Column Names         | No         | No      | Yes
Column Data Types    | No         | No      | Yes
Purpose              | Describes the data flows at a high level: the sources, the targets, and the required mappings. | Normalization occurs at the logical level. | Represents how the model will be built in the database.
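To make the three levels concrete, the table above can be sketched in code for a hypothetical Customer/Order model; every entity, attribute, and table name below is invented for illustration.

```python
# Hypothetical sketch: the same Customer/Order model at the three levels.
# All names are invented for illustration.

# Conceptual: entity names and relationships only.
conceptual = {
    "entities": ["Customer", "Order"],
    "relationships": [("Customer", "places", "Order")],
}

# Logical: adds attributes, primary keys, and foreign keys.
logical = {
    "Customer": {
        "attributes": ["CustomerId", "FirstName", "PostalCode"],
        "primary_key": "CustomerId",
    },
    "Order": {
        "attributes": ["OrderId", "CustomerId", "OrderDate"],
        "primary_key": "OrderId",
        "foreign_keys": {"CustomerId": "Customer.CustomerId"},
    },
}

# Physical: table names, column names, and column data types.
physical = {
    "CUSTOMER": {
        "columns": {
            "CUSTOMER_ID": "INTEGER",
            "FIRST_NAME": "VARCHAR(20)",
            "POSTAL_CODE": "VARCHAR(11)",
        },
        "primary_key": "CUSTOMER_ID",
    },
}
```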

Use Info-Tech’s Data Integration Mapping Tool for creating the new integration source to target mapping

Data Engineer

3.2.1 Data Integration Mapping Tool

What is a source to target (S2T) mapping document?

An S2T document contains the mapping of source system fields to the fields of a target system.

In addition to containing the mapping of fields from source to target, the document also captures the following important information:

  • Loading frequency for the mapping described by the document.
  • How source tables/files should be joined together to get the desired source data set.
  • Data types of both the source as well as the target fields.
  • Any conversion logic that is applied to convert between data types.
  • Any business rules that need to be applied.
  • Any slowly changing dimension attributes and logic.

Use the Data Integration Mapping Tool as a documentation resource to support the mapping and governance of the integrations.

This document will be your guide in creating the new integration solution. Follow the next set of slides to complete this document.

Use Info-Tech’s Data Lineage Diagram Template to document data sources and apps used by the business

Data Engineer

3.2.1 2 hours

Input

  • Data sources and applications used by the business unit

Output

  • Data lineage diagram

Materials

  • Data Lineage Diagram Template

Participants

  • Business Unit Head/ Data Owner
  • Business Unit SMEs
  • Data Engineers/ Architects

Map the flow and location of data within a business unit by creating a system context diagram.

Gain an accurate view of data locations and uses: engage business users and representatives with a wide breadth of knowledge of business processes and of how data is used by related business operations.

  1. Sit down with key business representatives of the business unit.
  2. Document the sources of data and processes in which they’re involved, and get IT confirmation that the sources of the data are correct.
  3. Map out the sources and processes in a system context diagram.

Use Info-Tech’s Data Lineage Diagram Template as a resource to complete the business unit’s data lineage diagram.

Sample Data Lineage Diagram

The image shows a sample data lineage diagram.

Build your business-driven consolidated data model for the new integration environment

Data Engineer

3.2.2 2 hours

Input

  • Business activities requiring data

Output

  • Consolidated data model

Participants

  • Data Architect
  • Solution Designer

Instructions:

  1. Identify the common business activities that the integration solution will support.
  2. Identify the data domains and subdomains that hold the required data elements.
  3. Choose the appropriate data elements as the source. These will be the master records from the system of record.
  4. Build the technical data dictionary and business data glossary for each element.
  5. Determine integration requirements for all input fields.
  6. Determine integration requirements for all output fields to support target activities.

The image shows a graphic with a dark blue rectangle in the centre with the text Consolidated Data Model, and above it the text Integration Activities. On either side are sections labelled Business Activities. At the bottom is the text Business Data Glossary, and Data Dictionary.

The image is a screen capture of the Business Activity Mapping tab, which contains a blank chart.
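For illustration, the paired technical and business entries produced for a single data element in steps 3 and 4 might look like this; all names and definitions are invented examples.

```python
# Invented example of the paired entries for a single data element.

# Technical data dictionary: how the element is stored.
data_dictionary_entry = {
    "element": "postal_code",
    "system_of_record": "CRM",
    "table": "customers",
    "data_type": "VARCHAR(11)",
    "nullable": False,
}

# Business data glossary: what the element means to the business.
business_glossary_entry = {
    "term": "Postal Code",
    "definition": "The mailing code of the customer's primary address.",
    "data_domain": "Customer",
    "steward": "Customer Data Owner",
}
```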

Document the sources and targets of the data integration

Data Engineer

3.2.3 1 hour

Input

  • Sources and targets of data

Output

  • Consolidated data model

Participants

  • Data Architect
  • Solution Designer

Instructions:

  1. Document the source fields that need to be mapped in the left-most column of the “3. S2T Mapping” tab.
  2. For each source field, input the:
    1. Field information
    2. Mapping rules
    3. Data formats
    4. Notes
  3. Document the target fields that the source data will be moved to and transformed.

The image shows two charts: the one on the left is labelled Sources and the one on the right is labelled Targets. They are filled in with sample information. At the bottom, there is a label that reads: Data Integration Mapping Tool, tab 3.

Step 3: Map Source to Target Model


This step will walk you through the following activities:

  • Identify the mappings that will correctly bring the data from source to target.
  • Understand transformations that occur to the data to ensure the correct data format is maintained.

This step involves the following participants:

  • Data Engineer

Outcomes of this step

  • Source to target mappings for the data model identified in step 3.2.

Document the transformations occurring

Data Engineer

When moving data from source to target, the following represent common transformations that occur to ensure that the data ends up in the correct format.

Copy - Copies data in source field to the target field.

Format - Formats the source data when writing to the target.

Transform - Changes the representation of the source data in the target.

Augment - Adds data to the source data before writing to the target.

Constant - Writes a constant value to the target.

Calculate - Calculates a value to write to the target.

Filter - Removes the input field and does not write it to the target.
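A minimal sketch of these seven transformation types, assuming invented example values; the rule names follow the list above, but the specific formats and lookups are illustrative assumptions.

```python
# Sketch of the seven transformation types. Rule names follow the list
# above; the specific formats and lookups are invented examples.

def apply_rule(rule, value=None, arg=None):
    if rule == "copy":        # copy the source value unchanged
        return value
    if rule == "format":      # reformat, e.g. trim and upper-case
        return value.strip().upper()
    if rule == "transform":   # change representation, e.g. code -> word
        return {"M": "Male", "F": "Female"}.get(value, value)
    if rule == "augment":     # add data to the source value
        return f"{arg}{value}"
    if rule == "constant":    # write a fixed value; ignore the source
        return arg
    if rule == "calculate":   # derive a value, e.g. apply a rate
        return round(value * arg, 2)
    if rule == "filter":      # drop the field; write nothing to target
        return None
    raise ValueError(f"unknown rule: {rule}")
```

For example, `apply_rule("format", " l7m3x8 ")` returns `"L7M3X8"`, while `apply_rule("filter", "anything")` returns `None` to signal that the field is not written to the target.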

Document transformations that occur in your new integration environment

Data Engineer

3.3.1 2 hours

Input

  • Source to target mapping

Output

  • Data transformations

Participants

  • Data Architect
  • Solution Designer

Instructions:

  1. For each of the source-to-target mappings, document the mapping rule associated with the integration.
  2. Provide sufficient details for the mapping rule being applied to the data.

The image shows a chart titled Mapping Rules, with sample data inputted.

Update the “3. S2T Mapping” tab

Info-Tech Insight

You need to know all available field types so that the right field and attributes can be chosen for each purpose, maximizing data quality at the point of data creation.

Step 4: Capture Integration Metadata


This step will walk you through the following activities:

  • Understand the importance of metadata in data integration.
  • Determine the types of metadata that should be captured for the integration solution.

This step involves the following participants:

  • Data Architect
  • Data Engineer

Outcomes of this step

  • Metadata captured for the integration solution.

Analyze your metadata for effective data integration

Data Engineer

To ensure that the data being integrated retains the quality of its sources, capture the metadata associated with the integrated data.

Info-Tech Insight

Metadata management is often a missed opportunity in a data integration project, and without proper metadata management, data integration can become unmanageable.

Field Type       | Table Field Name | User Field Name   | Content Type(s)      | Field Size(s) | Size Limit(s) | Sample Use(s)
Numeric          | Social Security  | Social Security # | Numeric Only         | 9 char        | 400           | 123987456
Alpha            | First Name       | First Name        | Alpha Only           | 20 char       | 400           | Broderick
VarChar          | Postal_Code      | Postal Code       | Alpha & Numeric      | 11 char       | 100           | L7M3X8 or 90210 or HM32
Special          | Password         | Password          | All Characters       | 20 char       | 100           | $%fFg87-KLhfrM543
Drop Down        | State            | State/Province    | List Select          | 3 char        | 100           | Choice is made from structured list
Pick One         | Gender           | Gender            | List Select          | 1 char        | 100           | Radio button single choice (Mandatory?)
Pick Many        | Preference       | Preference        | List Select          | 1 char        | 100           | Selection boxes, many choices (Mandatory?)
Pick One or Text | Industry         | Industry          | List Select or Other | 40 char       | 100           | Radio button single choice or enter new
Etc.

Discover metadata from existing data repositories and tools

Data Engineer

3.4.1 1 hour

Input

  • Metadata from data sources

Output

  • Metadata from data sources

Participants

  • Data Architect
  • Solution Designer

Collect metadata from the databases being integrated, including the types of data contained within them.
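As a sketch of this discovery step, the example below reads column-level metadata from an in-memory SQLite database standing in for a real source system; the table and column names are invented.

```python
# Sketch of metadata discovery, using an in-memory SQLite database as a
# stand-in for the real sources; table and column names are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE customers ("
    "  customer_id INTEGER PRIMARY KEY,"
    "  first_name  TEXT NOT NULL,"
    "  postal_code TEXT)"
)

# PRAGMA table_info returns one row per column:
# (cid, name, declared_type, notnull, default_value, pk)
metadata = [
    {"name": r[1], "type": r[2], "not_null": bool(r[3]), "pk": bool(r[5])}
    for r in conn.execute("PRAGMA table_info(customers)")
]
for column in metadata:
    print(column)
conn.close()
```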

Types of Data

Number

  • Min value
  • Max value
  • Digits before decimal
  • Digits after decimal
  • Thousand separator (displays a comma every three digits; does not alter the stored value)

String

  • Min length
  • Max length
    • Option to use database max
  • Allowed characters
  • Regular expression
    • Can be used to validate input, e.g. check for “@”
  • Failure message

Boolean

  • Initial value
    • True
    • False
    • Blank

Date

  • Mode
    • dd/mm/yyyy
    • mm/yyyy
    • dd/mm
  • Must be in future
  • Must be in past
  • No restriction on past/future

Enumeration

  • Enumeration options
    • Key
    • Label
  • Parent key

File

  • Max file size
    • Kb or Mb
  • Allowed file extensions
    • All types = blank
  • Failure message
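The captured attributes can then drive validation. The sketch below checks values against hypothetical number and string metadata records using the attributes listed above (min/max, length limits, a regular expression, and a failure message); the attribute names, rules, and messages are illustrative assumptions, not prescribed ones.

```python
# Sketch of validating values against captured field metadata; the
# attribute names and rules below are invented examples.
import re
from datetime import date

def validate(value, meta):
    """Return None if the value passes, else the failure message."""
    if meta["type"] == "number":
        if not (meta["min"] <= value <= meta["max"]):
            return meta["failure_message"]
    elif meta["type"] == "string":
        if not (meta["min_len"] <= len(value) <= meta["max_len"]):
            return meta["failure_message"]
        if not re.fullmatch(meta["regex"], value):
            return meta["failure_message"]
    elif meta["type"] == "date":
        if meta.get("must_be_in_past") and value >= date.today():
            return meta["failure_message"]
    return None

email_meta = {
    "type": "string", "min_len": 3, "max_len": 254,
    "regex": r"[^@]+@[^@]+",  # validate input, e.g. check for "@"
    "failure_message": "not a valid email address",
}
age_meta = {
    "type": "number", "min": 0, "max": 120,
    "failure_message": "age out of range",
}
```

Here `validate("a@b.com", email_meta)` passes (returns `None`), while `validate("nope", email_meta)` and `validate(200, age_meta)` return their failure messages.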

Phase 3 helped you identify an integration solution architecture

Data Engineer

For phase 3, the following list of success criteria should be achieved

  • Determined if a PoC approach is needed. If not, ensured that the know-how is in place to implement the solution.
  • Confirmed that the integration solution design is valid and feasible for the organization’s data environment.
  • Documented source-to-target mapping for the new integrations.
  • Captured metadata.

If you want additional support, have our analysts guide you through this phase as part of an Info-Tech workshop

Book a workshop with our Info-Tech analysts:

  • To accelerate this project, engage your IT team in an Info-Tech workshop with an Info-Tech analyst team.
  • Info-Tech analysts will join you and your team onsite at your location or welcome you to Info-Tech’s historic Toronto office to participate in an innovative onsite workshop.
  • Contact your account manager (www.infotech.com/account), or email Workshops@InfoTech.com for more information.

The following are sample activities that will be conducted by Info-Tech analysts with your team:

3.1.1 Determine if a PoC is needed for the data-centric integration solution

An Info-Tech facilitator will help your organization identify the need for an integration PoC based on their experience with implementing integration projects in the past, as well as the existence of clear requirements from larger projects.

3.2.1 Create your consolidated data model

An Info-Tech facilitator will help your organization determine the sources and targets of the integration, as well as map the transformations occurring in the data. Based on these inputs, you will end up with a consolidated data model for the integration project at hand.

Insight breakdown

Every IT project requires data integration.

  • Regardless of the problem at hand and the solution being implemented, any change in the application and database ecosystem requires the solving of a data integration problem.

Data integration problem solving needs to start with business activity.

  • After understanding the business activity, move to application and system integration to drive optimal data integration activities.

Data integration improvement must be backed by solid requirements that depend on the use case.

  • Info-Tech’s use cases will help to identify your organization’s requirements and the integration architecture ideal for your data-centric integration solution.

Summary of accomplishment

Knowledge Gained

  • Understanding of the value of data integration practices.
  • Insight into how data integration impacts and is impacted by other functions of data management.
  • Understanding of the different reference patterns for data integration.

Processes Optimized

  • The business’ varying data flows and integration points between databases and applications.
  • Implementation and maintenance of the organization’s different data integration patterns.

Deliverables Completed

  • Project Charter
  • Reference Architecture Patterns
  • Data Governance Planning
  • Implementation Plan

Research contributors

Internal Contributors

Info-Tech Research Group

  • Gopi Bheemavarapu, Director, Research & Advisory Services
  • Ben Dickie, Director, Research & Advisory Services
  • Valence Howden, Director, Research & Advisory Services
  • Ilia Maor, Director, Research & Advisory Services
  • Keith Tudor, Director, App Development
  • Andy Neill, Sr. Director, Research & Advisory Services

External Contributors

  • Wayne Regier, Director of Data Management, Husky Injection Molding Systems
  • Jason Bloomberg, President, Intellyx
  • Hamdan Ahmad, Principal Consultant, Slalom Consulting
  • Sanjay Pande, Co-Founder and Instructor, LearnDataVault.com
  • David Ormsby, Senior Manager, Liberty Utilities
  • Anonymous Contributors

Bibliography

Accenture. “Accenture Information Management Services Survey.” Accenture. 2007. Web. Accessed 1 Sept. 2017.

Agile Business Consortium. “MoSCoW Prioritisation.” Agile Business Consortium. 2014. Web.

Berners-Lee, T. “A Framework for Web Science (Foundations and Trends(R) in Web Science).” ACM Digital Library. 2006. Web. Accessed 1 Feb. 2019.

Blueprint Software Systems Inc. “10 Ways Requirements Can Sabotage Your Projects Right From the Start.” Blueprint Software Systems Inc. 2015. Web. Accessed 1 Sept. 2017.

Computerworld. “12 Things you Know About Projects But Choose to Ignore.” Computerworld. 12 Mar. 2017. Web.

Concept Searching. “The Business Value of Compound Term Processing and Automatically Generated Intelligent Metadata White Paper.” Concept Searching. Mar. 2015. Web. Accessed 1 Sept. 2017.

Data.com. “Data.com Clean.” Salesforce. 2016. Web. Accessed 1 Sept. 2017.

Dayal, U., et al. “Data Integration Flows for Business Intelligence.” HP Labs. Mar. 2009. Web. Accessed 1 Sept. 2017.

Deloitte Insights. “Dark Analytics: Illuminating Opportunities Hidden Within Unstructured Data.” Deloitte Insights. 7 Feb. 2017. Web.

DZone. “The DZone Guide to Enterprise Integration.” DZone. 2015. Web. Accessed July 2017.

Experian Data Quality. “The 2016 Global Data Management Benchmark Report.” Experian Data Quality. 2016. Web. Accessed 1 Sept. 2017.

Ghosh, S. “Taking the Mystery Out of Cloud-Based GIS.” CadCorp. 2017. Web. Accessed 1 Feb. 2019.

Gillin, P. “Another Way to Understand ROI.” iJetColor. 2018. Web. Accessed 1 Feb. 2019.

Information Age. “Harnessing the Power of Community in Analytics.” Information Age. 1 Mar. 2018. Web.

KPMG Capital. “Going Beyond the Data: Achieving Actionable Insights With Data and Analytics.” KPMG. 2014. Web. Accessed 1 Sept. 2017.

Matos, Vanessa. “Seven Data Integration Trends & Best Practices for 2017.” Virtual Logistics. Jan. 2017. Web. Accessed 1 Sept. 2017.

McKendrick, Joseph. “Moving Data at the Speed of Business.” IOUG. Feb. 2016. Web. Accessed 1 Sept. 2017.

Qiu, J., et al. “A Survey of Machine Learning for Big Data Processing.” EURASIP Journal on Advances in Signal Processing. 2016. Web. Accessed 1 Jan. 2019.

Rahm, E. “The Case for Holistic Data Integration.” Springer International Publishing Switzerland. 2016. Web. Accessed 1 Jan. 2019.

Ratnasamy, S., et al. “GHT: A Geographic Hash Table for Data-Centric Storage.” In Proceedings of the 1st ACM International Workshop on Wireless Sensor Networks and Applications. 2002. pp. 78-87. Web.

Schmoker, M. “Use of Data.” The University of Texas at Austin. Web. Accessed 1 Feb. 2019.

Semeniuk, Joel. “Data Is the New Oil.” Medium. 15 Dec. 2016. Web.

Software AG. “Digital Enterprise Success Stories.” Software AG. 2017. Web.

The Data Management Association (DAMA). “DAMA-DMBOK2 Framework.” The Data Management Association. 6 Mar. 2014. Web.

ViON. “Why Data Management Matters Right Now.” ViON. 2015. Web. Accessed 1 Sept. 2017.

Ziegler, Patrick, and Klaus R. Dittrich. “Data Integration — Problems, Approaches.” Database Research Technology Group. n.d. Web.

Integration architecture pattern – overall

The image shows the Integration Architecture Pattern graphic which has appeared earlier.

A business unit of an aviation organization implemented a new ERP system into its operations

CASE STUDY

Industry Aerospace

Source Info-Tech Research Group

Situation

A business unit of a global aerospace organization implemented a new ERP system into its work environment.

Role of Data Integration

The organization had to load its master files before going live, manage transaction information, and eventually migrate all necessary data from the legacy system.

Data Integration Steps

The ERP project team dedicated a full-time resource to monitor and ensure proper data integration. The team split the data integration steps of the project into three different projects within the larger project.

  1. Migrated its master files into the procured system so they were fully loaded prior to the go-live date.
  2. Migrated critical financial data and set up open balances for the system’s go-live date.
  3. Over a period of time, identified and migrated the types and amount of historical data required to be integrated into the new system.

Approach

Due to the responsive nature of data integration to elements of the ERP implementation, the project team employed an Agile approach to their data integration.

To support this process and make sure data integration requirements were properly considered, the team ensured the necessary priority and resourcing occurred around integration.

About Info-Tech

Info-Tech Research Group is the world’s fastest-growing information technology research and advisory company, proudly serving over 30,000 IT professionals.

We produce unbiased and highly relevant research to help CIOs and IT leaders make strategic, timely, and well-informed decisions. We partner closely with IT teams to provide everything they need, from actionable tools to analyst guidance, ensuring they deliver measurable results for their organizations.

Member Rating

9.0/10
Overall Impact

$22,159
Average $ Saved

10
Average Days Saved

After each Info-Tech experience, we ask our members to quantify the real-time savings, monetary impact, and project improvements our research helped them achieve.

Read what our members are saying

What Is a Blueprint?

A blueprint is designed to be a roadmap, containing a methodology and the tools and templates you need to solve your IT problems.

Each blueprint can be accompanied by a Guided Implementation that provides you access to our world-class analysts to help you get through the project.

Need Extra Help?
Speak With An Analyst

Get the help you need in this 3-phase advisory process. You'll receive 8 touchpoints with our researchers, all included in your membership.

Guided Implementation #1 - Collect integration requirements
  • Call #1 - Learn about the concepts of data integration and the common integration use cases.
  • Call #2 - Understand what drives the business to need improved data integration, and how to collect integration requirements.

Guided Implementation #2 - Analyze integration requirements
  • Call #1 - Determine the technical requirements for the integration solution.
  • Call #2 - Learn about and understand the differences between trends in data integration, as well as how they can benefit your organization.
  • Call #3 - Determine your ideal integration pattern.

Guided Implementation #3 - Design the data-centric integration solution
  • Call #1 - Start with a PoC to validate your integration design.
  • Call #2 - Learn about the source to target mapping tool, and how to create your own.
  • Call #3 - Learn about integration metadata and what metadata to capture.

Authors

Rajesh Parab

Steven Wilson

Contributors

  • Wayne Regier, Director of Data Management, Husky Injection Molding
  • Jason Bloomberg, President, Intellyx
  • Hamdan Ahmad, Principal Consultant, Slalom Consulting
  • Sanjay Pande, Co-Founder and Instructor, Learn Data Vault
  • Anonymous Contributors

Search Code: 75051
Last Revised: March 22, 2019
