Insights

Modernizing Information Technology: A Customer-Centric Delivery Model

August 5, 2025

Change of any kind can be tough, but making the customer an integral part of the process and decisions, and finding early adopters to serve as change agents, can help ease the transition, particularly when individuals feel like their world is being turned upside down.1 In fact, change can create opportunity and have a long-standing impact not initially imagined, as I will illustrate in this paper. This paper also includes insights from Kevin Ebberts, Allocore’s Executive Vice President of Product Innovation, on the benefits of a customer-centric approach.

Among other principles, a customer-centric model keeps the customer at the core of business decisions throughout the modernization effort to create the best experience for the customer. The overall delivery strategy starts with (1) understanding customer needs, followed by (2) articulating business requirements and functionality, (3) determining the right technology to solve the problem, (4) evaluating and monitoring, and (5) scaling to production.
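For readers who think in code, the five phases above can be sketched as an ordered checklist that enforces their sequence. This is purely illustrative, assuming a simple linear progression; the class and names here are my own, not part of any Allocore tooling.

```python
from dataclasses import dataclass, field

# The five phases of the customer-centric delivery model, in order.
PHASES = [
    "understand customer needs",
    "articulate business requirements and functionality",
    "determine the right technology",
    "evaluate and monitor",
    "scale to production",
]


@dataclass
class DeliveryEffort:
    """Tracks which phases of the delivery model have been completed."""
    completed: list = field(default_factory=list)

    def complete(self, phase: str) -> None:
        # Enforce the sequence: each phase builds on the one before it.
        expected = PHASES[len(self.completed)]
        if phase != expected:
            raise ValueError(f"expected phase '{expected}', got '{phase}'")
        self.completed.append(phase)

    @property
    def next_phase(self):
        # The next phase to work on, or None once all five are done.
        if len(self.completed) < len(PHASES):
            return PHASES[len(self.completed)]
        return None
```

The point of the sketch is the ordering constraint: a team cannot jump to "scale to production" before it has understood customer needs.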

Rather than focusing on replacing old technology and then campaigning to get customers to adapt, a customer-centric approach flips that traditional strategy on its head, placing the customer at the center of the solution so that they own the solution from the beginning, rather than possess a tool forced upon them.

To illustrate the benefits of a customer-centric delivery model, I will draw on a major modernization initiative in which I had the unique opportunity to be at the forefront: piloting a functional and technical design to solve an urgent problem that intelligence analysts faced.2 Under a tight time frame, the initiative brought together a project team across multiple disciplines, including experts in technology and design, intelligence analysis, and policy from multiple intelligence agencies, to demonstrate an innovative, customer-centric approach for improving intelligence production and dissemination. The end goal was to improve decision making for the warfighter through the collection of multi-intelligence information and enhanced analytics that yield better outcomes.

Understanding Customer Needs

Through interviews with intelligence analysts and regular collaboration with others on the project team, I gained an understanding and appreciation of the current functionality, operating environment, workflows, and data. I also learned about the barriers that hinder information sharing across intelligence agencies, the roles and responsibilities of those involved in the intelligence life cycle, and the capability gaps we would need to overcome.3 Equally important, I gained clarity on the impact and risk of not achieving the end goal, which helped the project team define success criteria for the demonstration objectives and secure stakeholder buy-in.

One way to achieve this rapid understanding of customer needs is by deploying Tiger Teams. Tiger Teams meet directly with the customers and stakeholders involved in the current process to understand the data and how it is used. The teams include technologists and customer engagement specialists who truly want to understand how to make customers’ lives better through technology and business process improvement. What types of problems are customers solving? What is the risk to the customer, project leads, and citizens if the business challenge is not solved? How can we develop a solution that increases productivity without creating fear of job replacement? The customer is involved from Day 1.

The most effective way to understand customer needs is to build relationships with them and meet regularly to identify their frustrations, pain points, and capability gaps.

Articulating Business Requirements and Functionality

Using inputs from the interviews with analysts, we began documenting the functional design and workflows, centered around the who, what, where, when, how, and why, and conceptualizing what we could achieve in terms of outcomes and capabilities for the demonstration. Framing the interviews around these questions helped us:

  • Identify the people or groups involved.
  • Clarify the data sources and the technology and tools needed to drive the new business outcomes and capabilities.
  • Specify the location or setting.
  • Explain the method, process, and way the activities would occur.
  • Explore the reasons, motivations, or purpose.
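One way to keep these framing questions honest is to structure every interview finding around them and flag the dimensions still unanswered. The sketch below is illustrative only; the field names simply mirror the who/what/where/when/how/why framing from the list above.

```python
from dataclasses import dataclass


@dataclass
class InterviewFinding:
    """One interview finding, framed around the six questions."""
    who: str    # people or groups involved
    what: str   # data sources, technology, and tools
    where: str  # location or setting
    when: str   # timing within the workflow
    how: str    # method, process, and way activities occur
    why: str    # reasons, motivations, or purpose


def missing_dimensions(finding: InterviewFinding) -> list:
    """Return the framing questions the interview has not yet answered."""
    return [name for name, value in vars(finding).items() if not value.strip()]
```

A finding with blank "what" and "how" fields, for example, tells the team exactly which follow-up questions the next session should cover.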

Too often, teams focus exclusively on the requirements for how the current process works, rather than the business outcomes and capabilities needed to achieve a goal. If we were designing a magical “black box,” what kinds of things would it need to be able to do?

We documented the challenges and constraints that could hamper success, not only for the demonstration but also for full-scale production, and worked with policy experts to address them. We came to realize that while many of the challenges and constraints could be addressed in the short term, others would have to be addressed over time.

The project team reimagined intelligence analysts working in a collaborative environment that facilitated gathering, analyzing, visualizing, and sharing multi-intelligence information. Analysts brainstormed hypotheses and underlying storylines without constraints from the current environment. This allowed the project team to develop a demonstration evaluation plan that included specific use cases, a software analytical suite, and a refined functional design to achieve the desired outcomes and capabilities.

Determining the Right Technology to Solve the Problem

The overall objective of the demonstration was to build a technical design architecture for the desired functional design and workflows and to refine the requirements specifications for testing. We applied an agile approach, both iterative and incremental, to refine work elements for the demonstration environment. The overarching expected outcome was to prove, in an operational environment, that working collaboratively with multi-intelligence information and analytic capabilities could provide added value for decision makers. We refined the demonstration evaluation plan to include specific evaluation metrics and delivered a minimum viable product for intelligence analysts to evaluate, allowing us to gather user feedback and observations.

Successful evaluation and monitoring of progress relies on a repeatable structure of measurement that is directly linked to business outcomes or value delivered such as improved experience for the public… as opposed to contextless metrics…
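The distinction between outcome-linked and contextless metrics can be made concrete: require every metric to name the business outcome it supports, and reject any that cannot. This is a minimal sketch of that idea, assuming a simple definition-time check; the names are illustrative, not a real measurement framework.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Metric:
    """A measurement that must be linked to a business outcome."""
    name: str
    unit: str
    business_outcome: str  # e.g. "improved experience for the public"

    def __post_init__(self):
        # A metric without a linked outcome is "contextless" and rejected.
        if not self.business_outcome.strip():
            raise ValueError(f"metric '{self.name}' has no linked business outcome")
```

Defining a metric such as "dashboard views" with no stated outcome fails immediately, which forces the conversation about value to happen before measurement begins.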

Evaluating and Monitoring the Demonstration

To emulate analysis production in the new environment, we evaluated the analysts during the demonstration as they performed multiple tests related to functionality, workflows, analytical capabilities, and production of intelligence. We documented our observations through interviews with analysts and through observations made by key stakeholders. We identified foreseen and unforeseen challenges and risks and quickly iterated by using innovative approaches, managing stakeholder expectations, and utilizing strategic partners to assist when we ran into snags. For example, when we learned of technology difficulties transmitting specific data, we partnered with another stakeholder who could provide an alternate means of connectivity in the short term that stakeholders could later replicate in the production environment. We also opted to use temporary software licenses in lieu of procuring permanent licenses where it was feasible to do so. Evaluating the alternative in the demonstration and establishing a strategic partnership agreement for the unanticipated connectivity issue assured us that we would have a repeatable process for full-scale production.

Full-Scale Production

In the end, analysts and key stakeholders were amazed by the results achieved with the minimum viable product. The product served as a catalyst for change in business and technology and set the stage for a scalable, customer-centric, multi-intelligence environment. Following the demonstration, experts went on to develop a migration strategy to lead the delivery of the pilot environment to full-scale production. The overall delivery model included the total cost of ownership to sustain a multi-intelligence environment in full-scale production (i.e., operations and maintenance costs).

Footnotes

1A change agent is someone who can transform ideas into scalable, market-ready businesses. From the perspective of an organization, change agents are the bridge between a groundbreaking idea and its practical implementation. They are the ones who challenge the status quo, question long-held beliefs and practices, and push the boundaries of what is considered possible. In doing so, they must exhibit a blend of strategic foresight and resilience to overcome the inevitable obstacles that arise during the change process.

2A pilot serves several key change management purposes: engaging user communities to increase adoption, addressing hesitancy (in this case, to share intelligence information across agencies), and preparing for a smooth transition to deployment.

3See https://www.fbi.gov/image-repository/intelligence-cycle-graphic.jpg/view for an illustration of the intelligence life cycle.