Implementation Guide

Use these questions as discussion prompts or as a guide for your individual or team efforts. Return to them as needed while you develop your solution, until each can be answered with confidence. Depending on your situation, you may use all of them or only some; they apply to a wide range of efforts, not only analytics.

Domain I: Business Problem (Question) Framing

  • Are you seeking to identify or solve a question, opportunity, or problem?
  • What is your business problem, and what is unclear or undefined?
  • What are the challenges and opportunities related to your problem?
  • What is the scope of your problem, and what is within your ability to solve?
  • Who has a vested stake in your effort, how would you categorize them, and are their perspectives incorporated into the business problem?
  • Have you clearly identified the expectations of your stakeholders, sponsors, and team members?
  • What types of measurable benefits, costs, risks, and direct or indirect consequences are associated with a solution?
  • Is this problem amenable to analytics?
  • What contingency plans have you put in place if one or more of the resources or permissions required for the business problem are no longer available?

Domain II: Analytics Problem Framing

  • Does your business problem lend itself to an analytics methodology (e.g., descriptive, diagnostic, predictive, prescriptive)?
  • How would you categorize the analytics approach (e.g., classification, optimization, etc.)?
  • Do you have multiple solutions to propose to your stakeholders?
  • What are the inputs and outputs associated with the problem, and what do you propose their relationship to be?
  • What is the current performance or baseline state of the problem area, and what order of magnitude of change is desired?
  • Thinking about how you will measure success, what are the assumptions, constraints, and risk mitigation strategies you might need to consider?
  • What work or research has been done in the problem area?

Domain III: Data

  • What data do you need for your solution, and does it exist?
  • Are there privacy, security, or data usage issues associated with the data that raise concerns or responsibilities?
  • Are there any data portability or data retention constraints affecting your data sources?
  • What data do you own, what data do you use, and what data do you produce?
  • How do privacy, security, and regulatory requirements constrain how the data can be accessed, used, shared, retained, or disposed of?
  • What are the four “V”s of your data (i.e., volume, velocity, variety, veracity)?
  • Have you documented the limitations of your data and assessed whether it is of sufficient volume and quality to create your solution?
  • For what purposes were your data sources originally created?
  • What steps have you taken to clean, refine, or synthesize your data, and have you documented the process of doing so?
  • Are there any potential biases in your data?
  • What visualizations exist for characterizing your data?
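The data-quality questions above (completeness, duplication, sufficient volume) can be answered in part with a lightweight profiling pass before any modeling begins. The sketch below is one minimal, illustrative way to do this in plain Python over a list of record dicts; the `profile` function and the sample rows are hypothetical, not part of any specific toolchain.

```python
from collections import Counter

def profile(records, fields):
    """Summarize row count, exact-duplicate rows, and per-field
    missing/distinct counts -- a first-pass data-quality check."""
    seen = Counter(tuple(sorted(r.items())) for r in records)
    report = {
        "rows": len(records),
        "duplicates": sum(c - 1 for c in seen.values()),
        "fields": {},
    }
    for f in fields:
        values = [r.get(f) for r in records]
        present = [v for v in values if v is not None]
        report["fields"][f] = {
            "missing": len(values) - len(present),
            "distinct": len(set(present)),
        }
    return report

rows = [
    {"id": 1, "region": "east", "sales": 100},
    {"id": 2, "region": None,  "sales": 250},
    {"id": 2, "region": None,  "sales": 250},  # exact duplicate row
]
print(profile(rows, ["region", "sales"]))
```

Documenting the output of a pass like this for each data source creates the record of limitations and cleaning steps that the questions above call for.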

Domain IV: Methodology (Approach) Framing

  • What are the various methods available to solve your problem, and what are their technical and practical strengths and weaknesses?
  • What methodological or IT constraints may affect the viability of your effort?
  • Do your problem type, time frame, project constraints, and available data align with your chosen method(s)?
  • Have you identified what technologies you will need for your solution that are not currently available, and have you documented how you will go about acquiring them?
  • What technologies are available to implement your problem-solving method, and what are their technical and practical strengths and weaknesses?
  • Do the problem type, time frame, project constraints, available data, and chosen method lend themselves to your planned technological implementation?
  • Does your approach work with existing or forthcoming policy?
  • Have you considered whether the intended users will be receptive to your solution and trust the analytic outputs?

Domain V: Analytics / Model Development

  • Have you defined and documented a process for designing, building, testing, and refining your solution?
  • How should the output of your model be interpreted?
  • How will you evaluate the performance and reliability of your solution?
  • What assumptions, limitations, and biases are associated with your solution, and how will you address them?
  • How will you communicate results to a non-technical audience?
  • What iterations of the solution have you gone through, and can you revert to a previous model if needed?
  • Is your process documented, and can you justify the rationale of your approach?
  • Can your process be replicated to create the same solution?
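Two of the questions above, measuring performance against a baseline and replicating the process to produce the same solution, can be illustrated with a deterministic evaluation harness. The sketch below is a minimal, assumed example: a fixed-seed train/test split plus a majority-class baseline, so that any candidate model can be compared against a floor and the run can be reproduced exactly. The function names and data are hypothetical.

```python
import random

def seeded_split(data, test_frac=0.3, seed=42):
    """Deterministic shuffle-and-split: the fixed seed means the
    same split (and thus the same evaluation) can be replicated."""
    rng = random.Random(seed)
    shuffled = data[:]
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * (1 - test_frac))
    return shuffled[:cut], shuffled[cut:]

def majority_baseline(train_labels):
    """Naive baseline: always predict the most common training label."""
    return max(set(train_labels), key=train_labels.count)

labels = ["churn"] * 30 + ["stay"] * 70
train, test = seeded_split(labels)
pred = majority_baseline(train)
accuracy = sum(1 for y in test if y == pred) / len(test)
print(f"baseline accuracy: {accuracy:.2f}")
```

Recording the seed, the split procedure, and the baseline score alongside each model iteration is one way to keep the process documented and revertible.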

Domain VI: Deployment

  • In which environments will your solution be developed, tested, and deployed?
  • What risks are associated with deployment?
  • Which parties need to be involved in bringing your solution into production, and what will be required of them?
  • What differences do you anticipate between deployment and testing, and what is needed for your solution to address these differences?
  • What tests will you design to validate the solution and its implementation in its new environment?
  • Who will be involved in user acceptance testing, and what criteria will be used for the solution to pass?
  • What training needs to be established to align with deployment?
  • Are you prepared to provide corrective action if deployment fails?
  • Has a failover and rollback plan been established prior to deployment?

Domain VII: Analytics Solution Lifecycle Management

  • Do you have a plan to train your model on new data after it is deployed?
  • How will you monitor and evaluate the ongoing effectiveness and business value of your solution?
  • Since your deployment environment may change over time, how might those changes alter the effectiveness of your solution?
  • What methods for anomaly or erroneous output detection are in place as part of solution performance management?
  • How might the original problem statement evolve over time, and how would those changes affect solution requirements?
  • What continuous monitoring is in place for privacy, access control, assurance, reliability, and security management?
  • How will you identify unforeseen consequences during ongoing deployment (e.g., security vulnerabilities, changes in user behavior)?
  • Has a plan been created to transition solution management and maintenance responsibilities to a new owner?
  • Can your solution be re-released with new capabilities as required (e.g., through a repeatable build and release process)?
  • What measures are in place to continuously assess the data received by the model?
  • How will your solution be archived, retired, or disposed of when it is no longer used, and how could it be reactivated if needed?
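Several of the questions above concern ongoing monitoring: detecting anomalous or erroneous output and continuously assessing the data the model receives. One simple, illustrative approach is a drift check that flags an incoming batch whose mean deviates too far from a historical baseline. The sketch below assumes a single numeric feature and a standard-error threshold; the threshold value and data are hypothetical, and production systems would typically monitor many features with more robust statistics.

```python
import statistics

def drift_alert(baseline, incoming, threshold=3.0):
    """Flag drift when the incoming batch mean sits more than
    `threshold` standard errors from the baseline mean."""
    mu = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    se = sigma / (len(incoming) ** 0.5)  # standard error of the batch mean
    z = abs(statistics.mean(incoming) - mu) / se
    return z > threshold, round(z, 2)

history = [10.0, 10.2, 9.8, 10.1, 9.9, 10.0, 10.3, 9.7]
stable  = [10.1, 9.9, 10.0, 10.2]   # consistent with the baseline
shifted = [12.5, 12.8, 12.4, 12.6]  # distribution has moved

print(drift_alert(history, stable))   # no alert expected
print(drift_alert(history, shifted))  # alert expected
```

Wiring a check like this into scheduled monitoring, with alerts routed to the solution's owner, gives a concrete answer to the performance-management questions above.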