Phase 1 - Problem Assessment

The first phase is designed to help decision makers create the most value through their AI initiative. It includes tools to ensure that the initiative is designed to address a specific use case and advance mission goals, even if the resulting solution is not an AI solution. Input and output artifacts are organized into three categories: AI, Business Need, and Governance, Risk & Compliance (GRC).

Phase Inputs

Figure 3: Assessment phase (1) summary

Key Goals

Key Participants

“Do I need an AI?”

The following key assessment questions serve as a preliminary guide for those considering an AI approach. Afterwards, determine your total score to gain insight into the potential for a substantial Return on Operations (ROO, increased effectiveness) and Return on Investment (ROI, increased efficiency) from the development (integration) and application (implementation) of the proposed AI approach.

Note: This is a notional table, and the level of importance associated with each question may be tied to the use case being assessed. Assign points based on the Attribute Importance Rank (with suggested weighting); you may adjust the weight of questions as they apply to your use case (5 – critical, 4 – very high, 3 – high, 2 – moderate, 1 – slight, 0 – not at all).

  1. Does the use case clearly and accurately describe the problem to be solved?
  2. Does the use case accurately outline the current processes in place?
  3. Does the use case align goals and objectives with desired outcomes?
  4. Does the use case need greater insight from the data?
  5. Has sufficient data been identified for the use case?
  6. Does the use case identify what data is required, and whether it is available, accessible, and accurate?
  7. Is the data for the use case annotated and curated? (Does the data contain meta-information?)
  8. Does your use case largely need manual process automation? (That is, would robotic process automation [RPA] alone suffice?)
  9. Is there a predictive element to the use case? (Assumptions and testing based on prior data)
  10. Have other technologies been successfully applied to address elements of the use case? (Could an existing solution partly solve your use case?)
  11. Is the data fit for purpose (descriptive modeling), and is it operationally relevant (predictive modeling)?
  12. Are the authoritative data sources for the use case organized, structured, deconflicted, and matriculated?
  13. Could the result of the use case change how conformance requirements need to be applied (e.g., personally identifiable information [PII], classified information)?
  14. Does the use case involve ethical considerations, and is there potential for bias (in the data, algorithms, or aggregation process)?
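To make the weighting concrete, the scoring can be sketched as a short calculation. This is a hypothetical illustration, not part of the playbook: the per-question weights and the yes/unsure/no answer encoding below are assumptions (the weights are chosen so that a unanimous "yes" yields a 50-point maximum).

```python
# Hypothetical sketch of totaling the "Do I need an AI?" questionnaire.
# Assumption: each question has an importance weight (0-5) and an answer
# encoded as +1 (yes), 0 (unsure), or -1 (no); the total is the weighted sum.

weights = {  # illustrative importance ranks for questions 1-14 (sum to 50)
    1: 5, 2: 4, 3: 4, 4: 3, 5: 5, 6: 5, 7: 3,
    8: 2, 9: 4, 10: 2, 11: 3, 12: 3, 13: 2, 14: 5,
}

answers = {  # example answers for a hypothetical use case
    1: 1, 2: 1, 3: 1, 4: 1, 5: 0, 6: 1, 7: -1,
    8: -1, 9: 1, 10: 0, 11: 1, 12: 0, 13: 1, 14: 1,
}

total = sum(weights[q] * answers[q] for q in weights)
print(f"Total: {total} points")  # 30 for this example
```

Negative answer values let questions that argue against an AI approach (e.g., "RPA alone would suffice") pull the total down, which is consistent with the negative lower bound of the score bands below.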

Questionnaire Results

The total score serves as a preliminary assessment for those considering an AI approach. While useful, it is only a guide for consideration and further investigation; thorough engineering analysis and practices should still prevail. Assessing your score: the total indicates whether an AI approach is likely to be beneficial (high score) or less so (a low score may still be applicable but needs additional scrutiny).

Total: xx points


Score (-20 to 18 total points)

A score of 18 or below typically represents a small ROO/ROI and limited applicability for an AI approach. While the score may be low, your situation may still warrant deeper analysis, as there can be a compelling reason to continue with an AI approach that does not fall into the standard categorization.

Score (19 to 40 total points)

A score between 19 and 40 could typically be supported with an AI approach, but it is not an overwhelming natural candidate. These situations can have powerful reasons that still drive an AI approach, yet they might also have mitigating factors that make a traditional approach a better alternative. In these situations, a more thorough analysis is typically needed.

Score (41 to 50 total points)

A score of 41 or above typically represents a compelling ROO/ROI and strong applicability that would benefit significantly from an AI approach. It is strongly recommended to weigh the costs and benefits of an AI approach in these instances, while still considering other additive and mitigating factors in the organization, strategic direction, interdependencies, and related items.
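The three bands above can be folded into a small helper that maps a total score to its interpretation. A minimal sketch: the function name and return strings are ours; only the boundary values (18 and 40) come from the text.

```python
def assess_ai_applicability(total_score: int) -> str:
    """Map a questionnaire total to the playbook's three score bands."""
    if total_score <= 18:
        return ("Low: small ROO/ROI and limited applicability; "
                "deeper analysis may still be warranted.")
    if total_score <= 40:
        return ("Possible: supportable with an AI approach, but a more "
                "thorough analysis is typically needed.")
    return ("Strong: compelling ROO/ROI; weigh costs and benefits along "
            "with other organizational factors.")

print(assess_ai_applicability(45))
```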

Key Considerations

With the word AI being used everywhere, it is important to separate reality from hype when it comes to which use cases can actually benefit from an AI solution. Consider the following advice and best practices when evaluating AI for any use case.

Demonstration of Capabilities/Minimal Viable Product (MVP):

Set goals and objectives for each AI use case by defining a schedule for the MVP/proof-of-concept (POC) demo.

Set Your Foundation:

Business Capabilities and AI Capabilities:

Build AI Architectural Blueprint for Future Phases:

Develop a vision and a plan for the additional requirements and challenges that will need to be addressed if your solution moves into a prototype phase and subsequent operational pilot phases. This should encompass modernization and integration with legacy systems, in consideration of the infrastructure requirements to host AI applications. Currently, FedRAMP Authority to Operate (ATO) accreditation of the AI application is a viable option. Additional options should include the activities essential to major change management components. The viability of these opportunities should take into account policy, process, operational, and cultural requirements.

Building or Taking an Inventory:

Current algorithms, dashboards, questionnaire/checklist objective statements, and computer macros across an organization can be used as the basis for understanding how AI might map to business processes.

Build Once, Use Many:

Ultimately, the organization should examine the desired technologies and subsequent capabilities that can be enabled by the future state AI solution. Building a working blueprint of the technical architecture presents a powerful tool for defining the scope and phases of a comprehensive AI implementation. Strategic scaling will enable organizations to optimally address pain points and align stakeholders while tackling one priority area at a time. This will ultimately accomplish the transformational objectives that advance mission goals.

Emphasize ROI and Benefits:

Emphasize ROO/ROI while making an assessment. Examine the solution's common costs and benefits to provide increased effectiveness and deliver greater efficiency from the AI solution. Include design thinking based on personas and a prioritization matrix around value versus complexity. An MVP should prove the viability of an AI solution with ROO/ROI measures to ascertain potential operational gains (effectiveness) and resource savings (efficiency). Also important to consider is the reduction of risk to the organization's ability to meet its mission goals. Ultimately, the ROO/ROI considerations should include these measures of effectiveness, efficiency, and risk reduction.
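The value-versus-complexity prioritization matrix mentioned above can be sketched as a simple ranking. A hypothetical illustration: the use case names and the 1-5 ratings are invented for the example, not drawn from the playbook.

```python
# Hypothetical value-vs-complexity prioritization of candidate AI use cases.
# Assumption: an assessment team assigns each use case a 1-5 value rating
# and a 1-5 complexity rating; high value and low complexity go first.
use_cases = [
    {"name": "Forecast demand",  "value": 4, "complexity": 4},
    {"name": "Document triage",  "value": 5, "complexity": 2},
    {"name": "Chat assistant",   "value": 3, "complexity": 5},
]

# Sort by descending value, then ascending complexity as the tiebreaker.
ranked = sorted(use_cases, key=lambda u: (-u["value"], u["complexity"]))
for u in ranked:
    print(f'{u["name"]}: value={u["value"]}, complexity={u["complexity"]}')
```

A real assessment would score candidates through stakeholder workshops rather than fixed numbers, but the ordering rule is the same: tackle high-value, low-complexity work first.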

Incorporate Regulations/Mandates:

AI has the potential to traverse large swaths of data and generate new forms of data aggregation, requiring impact assessments against standards such as those from the National Institute of Standards and Technology (NIST), along with other legal and regulatory considerations (GDPR, HIPAA, personally identifiable information, data sensitivity). Organizations should review the use case to understand how standards apply to the use of AI, and develop risk management plans around the underlying technologies that support the use case as it relates to ethics, mission goals, and business objectives. As organizations seek to establish levels of governance and enforce assessment standards that drive new outcomes, the goals and objectives achieved by the AI use cases should be reviewed at each phase and iteration to assure that existing regulatory, legislative, and policy guidance surrounding the use cases is being fulfilled.

Key Activities

Management

People

Process

Technology

Acquisition

Key Outcomes

Engaged: At the executive level, integration of the AI use case is endorsed to facilitate the overarching potential to fulfill the strategic intent.

At the program/mission office, confidence that AI integration is possible and that an AI solution will enhance the outcomes of business processes and meet all ethical requirements. In the general workforce, acceptance of AI capabilities, preparation for future skills, and competency development to achieve the desired outcomes.

Defined: Careful consideration of the problem statement, in the context of the underlying business processes, will afford the means to appropriately apply existing technology within the environment and assure that AI solutions are appropriately applied.

Phase Outputs

The following artifacts generated during the Assessment Phase support the Organizational Readiness and the subsequent phases.

AI

Business Need

Decision Gate

Consider the score on the “Do I need an AI” assessment questionnaire as a guide to deciding whether to proceed with an AI approach.
