Certified Analytics Professional

INFORMS prepares to launch first-of-its-kind program.

Analytics is the scientific process of transforming data into insight for making better decisions.
– INFORMS, 2012

By Scott Nestler, Jack Levis and Bill Klimack

Definitions are important; so is strategy.  As Andy Boyd (INFORMS vice president for Marketing, Communications and Outreach) pointed out recently, “INFORMS’ definition of analytics certainly doesn’t resolve the question, ‘What is analytics?’ It does, however, provide a thoughtful perspective to ponder. And it informs us of just where INFORMS stands” [1]. Just as INFORMS recently defined what the word analytics means to it as an organization, it continues the effort to define what it means to be a Certified Analytics Professional (CAP). Development of the CAP program, proceeding on schedule, represents one of INFORMS’ strategic efforts as first outlined in an interview in late 2011 with INFORMS President Terry Harrison [2] and in another OR/MS Today article earlier this year [3].

Demand for analytics certification – and interest in analytics in general – continues to grow throughout the business community. Since rebranding its “practice conference” as the “Conference on Business Analytics and Operations Research” in 2011, INFORMS has seen attendance surge. The 2012 event, held in Huntington Beach, Calif., drew a record number of attendees, 52 percent more than the 2010 event.

Eligibility for Certification

INFORMS has now published the eligibility criteria for the CAP certification at INFORMS Online (www.informs.org/Build-Your-Career/Certification); the criteria include the following:

  • a BA/BS (or higher) degree, and
  • at least five years of analytics work-related experience for BA/BS holders in a related area, or
  • at least three years of analytics work-related experience for MA/MS (or higher) holders in a related area, or
  • at least seven years of experience for those with a BA/BS in an unrelated area, and
  • verification of soft skills/provision of business value by the applicant’s employer.
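
Read together, these criteria combine a degree requirement and an employer attestation with an experience requirement that varies by degree level and field. For readers who want the logic spelled out, here is a minimal sketch of one reading of the criteria (the function, its inputs and its handling of unlisted cases are ours, for illustration only; it is not an official INFORMS tool):

```python
def cap_eligible(degree: str, related_field: bool, years_experience: float,
                 employer_verified: bool) -> bool:
    """One reading of the CAP eligibility criteria (illustrative only).

    degree: "BA/BS" or "MA/MS" (master's or higher)
    related_field: True if the degree is in an analytics-related area
    years_experience: years of analytics work-related experience
    employer_verified: employer has verified soft skills / business value
    """
    if not employer_verified:          # required in all cases
        return False
    if degree == "MA/MS" and related_field:
        return years_experience >= 3
    if degree == "BA/BS" and related_field:
        return years_experience >= 5
    if degree == "BA/BS" and not related_field:
        return years_experience >= 7
    return False  # cases the published criteria do not address

# Example: a master's holder in a related field with four years of experience
print(cap_eligible("MA/MS", True, 4, employer_verified=True))  # True
```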

Additionally, applicants must pass a certification exam (still in development). The plan is to make the first exam available in conjunction with the INFORMS Conference on Business Analytics and Operations Research in April 2013 in San Antonio, Texas, with an additional opportunity at the INFORMS Annual Meeting in October 2013 in Minneapolis.

Job Task Analysis Development

In general, a Job Task Analysis (JTA) is a comprehensive description of the duties and responsibilities of a profession, occupation or specialty area. The CAP approach consists of four elements: 1) domains of practice, 2) tasks performed, 3) knowledge required for effective performance on the job, and 4) domain weights that account for the importance of and frequency with which the tasks are performed. More specifically, the JTA for the CAP program can be viewed as an outline of a partial body of knowledge, as it represents a delineation of common or typical tasks performed and knowledge applied by analytics professionals, grouped together in a hierarchical domain structure. In the course of analytics work, these tasks may be performed multiple times with modifications based on data, findings and results, as part of ongoing feedback loops that are routinely a part of practice. The JTA serves as the test blueprint for exam development and links what is done on the job with what is measured by the certification examination. This linkage is necessary to establish a valid, practice-related examination. It is important to realize that the JTA is a dynamic document that will change in the future to reflect best practices and changes in the analytics profession.
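
For readers who think in data structures, the four elements can be pictured as nested records: domains carry weights and group tasks and knowledge statements. The sketch below is our own illustration of that hierarchy (the type and field names are not INFORMS artifacts), populated with a fragment of the Domain I content shown later in Figure 2:

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    """A goal-directed work activity (or group of closely related activities)."""
    label: str
    description: str

@dataclass
class Domain:
    """A top-level grouping in the JTA, weighted for exam construction."""
    name: str
    weight: float                                        # e.g., 0.15 for 15 percent
    tasks: list[Task] = field(default_factory=list)
    knowledge: list[str] = field(default_factory=list)   # statements not yet public

# A fragment of Domain I, taken from Figure 2 below
domain_1 = Domain(
    name="Business Problem (Question) Framing",
    weight=0.15,
    tasks=[
        Task("T-1", "Obtain or receive problem statement and usability requirements"),
        Task("T-2", "Identify stakeholders"),
        # ... T-3 through T-6 as listed in Figure 2
    ],
)
```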

The JTA outlined in this article was developed by the INFORMS Analytics Credentialing Job Task Analysis Working Group, composed of 12 subject matter experts (SMEs) (see box) who are: highly regarded in their field; diverse in geography, sector (public-private), organization type (e.g., large companies-smaller consulting firms, practice-academia, etc.) and application area (e.g., finance, logistics, software, consumer goods, etc.); and representative of the descriptive, predictive and prescriptive segments of analytics. Additionally, the Working Group has four members in common with the Task Force and two INFORMS directors, helping to ensure continuity with existing governance structures. Since CAP is designed to attract analytics professionals who are not currently members of INFORMS, the Working Group also includes some non-members.

In developing the JTA, members of the Working Group relied upon their knowledge of practice gained from years of experience, academic program content, corporate job descriptions in analytics and articles from professional and scholarly publications. As outlined in the earlier update, the JTA Working Group proposed, and the Credentialing Task Force and Board of Directors approved, that the CAP assess to some level of depth across the breadth of knowledge needed in analytics [4]. The evaluation of more detailed knowledge in specific areas or applications will be done later in “add-on” certifications, pending the successful development and deployment of CAP.

Domains Provide Top-Level Structure

The 36 typical tasks and 16 knowledge statements (not provided here) in the analytics JTA are organized in seven domains, as listed in Figure 1. Tasks are specific goal-directed work activities or groups of closely related work activities that describe identifiable behaviors, while knowledge is an organized body of information that, when applied, makes possible the competent and effective performance of the work activities described by a task.

Figure 1: Domains, descriptions and weights in the JTA.

Figure 1 also shows domain weights, which are based on the SMEs’ assessments of the importance of tasks and the frequency of their performance. Mean weights across the Working Group members were used as a starting point for discussion and debate that continued until consensus was reached. The weights will be used in the exam construction process to ensure the validity of the content mix.
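
To make the arithmetic concrete, the sketch below invents SME ratings (the actual rating scales and scores are not public) and shows both steps: averaging importance-times-frequency ratings into starting weights, and turning the final consensus weights of Figure 1 into per-domain question counts for a hypothetical 100-item exam form:

```python
# Invented ratings for illustration; not the Working Group's actual data.
# Each tuple is one SME's (importance, frequency) rating of a domain, on 1-5 scales.
sme_ratings = {
    "Business Problem Framing": [(4, 3), (5, 4), (4, 4)],
    "Analytics Problem Framing": [(5, 4), (4, 5), (5, 4)],
}

def starting_weights(ratings):
    """Mean importance x frequency per domain, normalized to sum to 1."""
    raw = {d: sum(i * f for i, f in rs) / len(rs) for d, rs in ratings.items()}
    total = sum(raw.values())
    return {d: round(v / total, 3) for d, v in raw.items()}

print(starting_weights(sme_ratings))   # starting point for SME debate

# The consensus weights from Figure 1 then set the exam blueprint:
consensus = {"I": 0.15, "II": 0.17, "III": 0.22, "IV": 0.15,
             "V": 0.16, "VI": 0.09, "VII": 0.06}
form_length = 100   # hypothetical number of scored items
blueprint = {d: round(w * form_length) for d, w in consensus.items()}
print(blueprint)    # e.g., Domain III ("Data") supplies about 22 of 100 questions
```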

Within each of these seven domains, a number of tasks must be performed by practitioners of analytics. The tasks identified by the JTA Working Group are listed in Figure 2.

Figure 2: Tasks identified by the JTA Working Group.
(15%) Domain I Business Problem (Question) Framing
T-1 Obtain or receive problem statement and usability requirements
T-2 Identify stakeholders
T-3 Determine if the problem is amenable to an analytics solution
T-4 Refine the problem statement and delineate
T-5 Define an initial set of business benefits
T-6 Obtain stakeholder agreement on the problem
(17%) Domain II Analytics Problem Framing
T-1 Reformulate the problem statement as an analytics problem
T-2 Develop a proposed set of drivers and relationships to outputs
T-3 State the set of assumptions related to the problem
T-4 Define key metrics of success
T-5 Obtain stakeholder agreement
(22%) Domain III Data
T-1 Identify and prioritize data needs and sources
T-2 Acquire data
T-3 Harmonize, rescale, clean and share data
T-4 Identify relationships in the data
T-5 Document and report findings (e.g., insights, results, business performance)
T-6 Refine the business and analytics problem statements
(15%) Domain IV Methodology (Approach) Selection
T-1 Identify available problem solving approaches (methods)
T-2 Select software tools
T-3 Test approaches (methods)*
T-4 Select approaches (methods)*
(16%) Domain V Model Building
T-1 Identify model structures*
T-2 Run and evaluate the models
T-3 Calibrate models and data*
T-4 Integrate the models*
T-5 Document and communicate findings (including assumptions, limitations and constraints)
(9%) Domain VI Deployment
T-1 Perform business validation of the model
T-2 Deliver report with findings; or
T-3 Create model, usability and system requirements for production
T-4 Deliver production model/system*
T-5 Support deployment
(6%) Domain VII Model Lifecycle Management
T-1 Document initial structure
T-2 Track model quality
T-3 Recalibrate and maintain the model*
T-4 Support training activities
T-5 Evaluate the business benefit of the model over time
* Note: these tasks are beyond the basic certification level and, therefore, are not addressed by the CAP exam. Potential future certifications may cover these more advanced areas.

Successful performance of the tasks listed in Figure 2 requires specific knowledge, which is what will be tested. At this time, the supporting knowledge statements are being used to develop items (questions) for the first exam and, as such, are not publicly available. We anticipate releasing them as part of the CAP Candidate Handbook and at INFORMS Online in October 2012.

JTA Validation

In order to ensure that the JTA Working Group had not missed anything important in the practice of analytics, the JTA and an associated questionnaire were sent to a random sample of INFORMS members and non-members. More than 200 analytics professionals from various regions of the United States, Europe and Asia/Pacific responded to the survey; approximately three-quarters of them were not INFORMS members. Non-member respondents included participants in the 2011 certification feasibility study and registered subscribers of Analytics magazine. Survey participants were asked to perform three tasks in their review of the draft JTA document:

  1. Identify those domains, tasks or knowledge statements they would like to remove, reword or revise.
  2. Suggest new domains, tasks, knowledge or skill statements that they would like to add.
  3. Confirm or suggest changes to the weights based on their ranking of importance and frequency.

After reviewing the results of the survey, including a thorough report prepared by the certification consultant, the JTA Working Group met again by telephone in February 2012 to clarify and improve the JTA. The agreed-upon changes primarily included the addition of examples of concepts and definitions to most of the knowledge statements in order to ensure understandability. INFORMS owes a big “thank you” to both the members of the JTA Working Group and those who responded to the survey.

Test Development

In May 2012, a call for question writers went out via the INFORMS subdivision (section and society) officers and to respondents to the 2011 feasibility study who wanted to help with development of an analytics certification program. Nearly 50 volunteers, a mix of academics and practitioners, initially responded; most, but not all, are INFORMS members. In early June 2012, a consulting psychometrician provided training to about 30 volunteers on how to write multiple-choice questions for the exam. The goal of the training was several-fold: to maximize the quality of the test; to ensure the accuracy, fairness and validity of the test; to minimize the measurement error of the test; and, most importantly, to minimize errors in the classification of candidates as certified or not. Item writers were asked to keep the following general guidelines in mind:

  1. Does the question test something relevant and non-trivial?
  2. Does the question reflect current best practice?
  3. Is the question stated clearly enough so that the knowledgeable candidate will be able to select the correct choice without undue hesitation?
  4. Are the context, setting and content of the question equally appropriate and familiar to all segments of the candidate population, including minority groups?
  5. Is the question free of language and/or descriptions that might be offensive to any segment of the candidate population?
  6. Is the content of the question free of language, descriptions or terminology that could reinforce common stereotypes concerning any segment of the candidate population?

Over a six-week period, the group worked individually to develop multiple-choice questions for the certification exam. In addition to the question itself, each writer provided a correct answer, three incorrect but plausible answers, a reference or citation, and a suggestion of which domain, task and knowledge statements the question was useful to assess (a minimal sketch of such an item record follows the quoted remarks below). In the first round, the group provided more than 200 questions, which were reviewed by 12 volunteers at a two-day, in-person meeting in late June 2012. The question writer and reviewer group included four members of the JTA Working Group, who helped to guide the review effort. The following comments from Carrie Beam of Carrie Beam Consulting, reflecting on her experience as a member of the question-reviewing team, summarize the review process well [5]:
“I was honored to be invited to Baltimore to help the exam writing committee. I expected to work hard … (and) to meet fellow INFORMS members. … What I didn’t expect was that it would be hard to actually write the exam! You see, over the years as a professor, I’ve written, given and graded a ton of exams. As an analytics practitioner, I’ve run into all sorts of weird datasets, unusual requests, special software packages and even a few problems involving garden-variety predictive analytics. Writing an exam for analytics practitioners? It should have been a piece of cake! However, it was anything but cake-like.

“I learned how to write a good multiple-choice question, with precisely one correct right answer and three very plausible “distracters.” I learned how to recognize a question that was out of scope, and how to identify a question in scope but way too detailed for a general practitioner exam. (When was the last time you needed to know the F-statistic threshold for something by heart?)

“We started with a rich flurry of submissions from the corners of the INFORMS community. We discussed, edited, wrote, re-wrote, graphed, spell-checked and double-checked the problems. We agreed on the right answer for each problem. When we were finished, we had crafted a well-rounded exam that will carefully and accurately test the boundaries of your analytics knowledge. It will test what you learn on the job as well as what you learn in school. And it will be able to certify to the business community that a person who passes this exam actually does know something about analytics.”
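
As noted before Beam’s remarks, every submitted question traveled with a fixed set of supporting pieces. A minimal sketch of such an item record follows (the structure, field names and sample content are ours, not drawn from the actual item bank):

```python
from dataclasses import dataclass

@dataclass
class ExamItem:
    """One multiple-choice item as described above (illustrative structure)."""
    stem: str             # the question text
    correct: str          # exactly one correct answer
    distractors: list     # three incorrect but plausible choices
    reference: str        # citation supporting the correct answer
    domain: str           # JTA domain the item assesses, e.g., "III"
    task: str             # JTA task, e.g., "T-4"
    knowledge: str        # knowledge statement assessed (not yet public)

item = ExamItem(
    stem="(illustrative question text)",
    correct="(the one correct choice)",
    distractors=["(plausible wrong choice)"] * 3,
    reference="(supporting citation)",
    domain="III", task="T-4", knowledge="(placeholder)",
)
assert len(item.distractors) == 3   # one key plus three distractors per item
```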

In conjunction with a certification consultant, the INFORMS Certification Task Force prepared a draft policies and procedures manual for the CAP program. Further work on an independent governance board for the program continues as well. Additionally, INFORMS staff is working on a branding and marketing plan for CAP. The marketing department developed a logo that was recently approved by the Certification Task Force and used in an advertisement for the CAP program in the June issue of OR/MS Today. Meanwhile, INFORMS is proceeding with trademark protection of both “Certified Analytics Professional” and “CAP.”

Future Steps

Test development efforts, including additional question writing, creation and review of two exam forms and a cut score study, are ongoing. Development of a detailed marketing and communications plan, a candidate handbook, marketing materials and a Website, as well as the back-end accounting and IT systems, are also underway. The INFORMS Certification Task Force recognizes the importance of making additional details available as soon as possible, since the first certification exam is scheduled for April 2013. For the latest information, check the CAP page at the INFORMS Website (www.informs.org/Build-Your-Career/Certification) from time to time, where updates will be posted as soon as they are available for public release. Similarly, keep reading OR/MS Today and Analytics magazine, as further updates will be shared there as well.

Scott Nestler is a colonel in the U.S. Army, currently attending the Army War College.

Jack Levis is the director of Process Management at UPS and the INFORMS VP for Practice Activities.

Bill Klimack is a decision analysis consultant at Chevron and the INFORMS VP for Meetings.

All three are members of both the INFORMS Certification Task Force and the INFORMS Analytics Certification Job Task Analysis Working Group.

References

  1. Boyd, A., “Revisiting ‘what is analytics’,” Analytics Magazine, July/August 2012, p. 6.
  2. Horner, P., “The State of INFORMS,” an interview with Terry Harrison, OR/MS Today, December 2011, pp. 34-40.
  3. Nestler, S., Levis, J., Klimack, W., and Rappa, M., “The Shape of Analytics Certification,” OR/MS Today, February 2012, pp. 34-36 (www.informs.org/ORMS-Today/Public-Articles/February-Volume-39-Number-1/The-shape-of-analytics-certification).
  4. Ibid.
  5. Beam, C., e-mail message to first author, Aug. 15, 2012.