GRAYASSOCIATES
July 2015
Program Selection
Objectives

Gray program selection work helps you achieve the following objectives:

1.  Set clear criteria for program evaluation
2.  Obtain data on student demand, job opportunities, and competitive intensity
3.  Use the data to profile existing programs and set targets for new programs, identifying programs that have:
    §  Strong student demand
    §  Reasonable levels of competition
    §  Sufficient job openings for graduates and long-term employment growth
4.  Develop a shared understanding among relevant managers of:
    §  Program evaluation data and methods
    §  Choices and options
5.  Draft a scoring system for institution-wide and campus-level program selection
6.  Identify priority programs:
    §  To create
    §  To cross-fertilize to other campuses
7.  Profile the selected new programs
Approach
To achieve these objectives, Gray combines leading-edge databases, advanced
analytics, and a collaborative process that ensures that program assessments reflect a
consensus of the relevant managers.
§  GrayData. As illustrated below, Gray has assembled a database on the market drivers
of a successful campus or program, including inquiries, applications, demographics,
competition, job openings, job postings, and placement rates. We have mapped all the
data down to the census tract level. We have built crosswalks to link related variables,
such as job openings and completions. We have invested in a BI tool (QlikView) that
enables us to rigorously screen hundreds of cities, locations, or programs.
§  Comprehensive, systematic analysis. Often, schools lack the resources to analyze
more than a few programs, so the selected programs may be only “good” rather than
“best.” Similarly, schools often rely on manual processes to pull data from industry and
regulatory databases, an approach that is slow, error-prone, and difficult (or impossible)
to scale up. These approaches also tend to rely on unnecessarily simplistic data, such as
competitor counts or lists that ignore IPEDS data on program size. It is also common to
use BLS data for only the most common occupation tied to a program, ignoring related
occupations that feed into that occupation or that may be appropriate for its graduates.
In contrast, Gray’s investments in data, tools, and processes enable us to score all
potential IPEDS programs for each individual local market using the best available data
on student demand, competitive intensity, and job opportunities.
§  Collaborative process. One risk with program assessments is that the client team
does not fully understand the data, methods, and conclusions reached, undermining
support for the new program (and risking that the conclusions do not reflect the insights
of the management team). To avoid that risk, we facilitate a workshop that allows the
senior team to understand the data, design its own scoring process, and apply its
judgment to select the best programs. All participants leave understanding what
programs were picked and why.
GrayData & Crosswalks draws on the following sources: Demographics (U.S. Census);
Job Postings (WANTED Analytics); Completions and Enrollments (IPEDS); Placement
Rates (Gray Research); Employment (BLS/O*NET); Industry Inquiries (GrayReports).
Methodology
Gray’s methodology builds toward a decision-making workshop. Leading up to the
workshop, we assemble all the data, create an “illustrative” set of scoring rules to make
sure that we have a good starting point for the workshop, and prepare all needed
materials. At the workshop, we work with the internal team to finalize the scoring rules,
validate them based on existing programs, identify high-scoring programs for the
relevant overall market, and then adjust the scoring logic to enable scoring of each
program for each local market.
1.  Set Scope: We work with the client to clarify the geographic scope for program
analysis for the area surrounding its campuses (e.g., within 60 miles of a campus); in
most cases, the client provides a file of starts (or completions) that Gray analyzes to
determine the radius that covers substantially all of its students.
2.  Pull Data: For the selected geographic area, Gray creates a dataset for student
demand, competition, and employment for all programs in IPEDS, using GrayData
and other sources such as BLS and IPEDS.
3.  Establish Illustrative Scoring: Gray creates a draft scoring system designed to
illustrate the interaction of the scoring process and the issues to consider in selecting
criteria and weights. Please note: the illustrative scoring system is not intended to
identify the optimal programs before it is refined in the workshop.
4.  Plan Program Selection Workshop: We work with you to select participants, dates,
and an agenda for a program selection workshop. Typically, that workshop requires
one and a half or two days.
5.  Develop Working Papers: Gray develops the documents needed to run the program
workshop. This includes live versions of the data and scoring logic.
6.  Facilitate Workshop: Gray facilitates the sessions, as appropriate, in the Program
Selection Workshop. Together, we refine and finalize the scoring logic for the total
relevant market. Once that is agreed on, we then scale the scoring rules to
appropriately score programs at each individual campus. Options are refined and
high-potential programs identified for further evaluation.
7.  Refine and Summarize Workshop Recommendations: After the workshop, Gray
completes the program scan summary document that identifies programs selected
and next steps. This includes a cross-campus and program matrix to allow an “at a
glance” view of program potential.
8.  Profile selected programs: We sum the data for all of your campuses to create one
total-relevant-market profile for each program selected at the workshop. These
profiles include additional information, such as employment market assessments and
details from WANTED Analytics.
Tasks and Deliverables
1.  Set Scope: Typically, we facilitate a kick-off meeting (conference call) to ensure a
shared understanding of the scope, objectives, planned deliverables, and
responsibilities.
As part of the preparation for or follow-up to that meeting, we clarify the
geographic scope for program analysis. That may involve discussions about
whether to exclude any campuses or include any anticipated campus sites. It
may also include analysis of client data to determine an appropriate radius
around each campus, or some other “rules” for defining the relevant market
areas.
In the example below, 92% of students come from within 35 miles of a campus. In
this case, we would use a 35-mile radius to define markets, since it would include
the vast majority of students without accidentally including potential students,
competitors, or jobs that may not really be part of the local market.
Deliverables at this step include an agreed-on statement of scope, market definitions,
and project schedule.
[Chart: Percentage of Students by Distance to Campus. Cumulative share of students
(29%, 57%, 74%, 83%, 92%, 95%, 97%, 98%, 98%, 99%, 99%, 99%) by miles from the
student’s home to campus.]
2.  Pull Data: For the selected geographic area, Gray creates a dataset for student demand,
competition, and employment for all programs in IPEDS, using GrayData and other
sources such as BLS and IPEDS.
Typically, we create a dataset that includes the market surrounding each campus. This
enables us to look at all campuses combined as well as each campus individually. For
online programs, we can look at the national market or at an appropriate geographic
region, depending on where the institution expects to draw its online students.
The metrics typically fall into four categories: student demand, employment
opportunities, strategic fit, and competition. Within each category, we use several
different metrics to cross-validate the information and provide a sufficiently broad view
of how each program rates. We can also construct additional or different metrics to
better align with a client’s view of its own market environment. For example,
different schools target very different mixes of award levels for their programs.
The table below shows metrics for student demand and employment opportunities.
Each column shows one metric, and the columns are grouped into similar categories
(color-coded in the original). Each row represents a percentile of the entire set of
IPEDS programs. For instance, in the top left corner, the on-campus (ground) program
with the highest number of inquiries had 894,089 inquiries. The median program
(#406 of the 813 programs in GrayData) had 629 ground inquiries.
Columns, grouped as in the original: Student Demand covers Inquiries (Ground; Online;
Year-over-Year Change; Inquiry Growth; % Assoc & Below; % Bach & Above) and
Completions (2013 Total; Change from 2012, in 000s). Employment Opportunities covers
Employment Rate (Cert.; Assoc.; Bach. and Above) and Employment (Annual Job
Openings; Job Openings per Grad; Growth).

Pctile |  Ground |  Online |  YoY Chg | Inq Growth | %A&B | %B&A | 2013 Total | Chg '12 | ER Cert | ER Assoc | ER Bach+ | Openings | Open/Grad | Growth
 100%  | 894,089 | 599,985 |   61,527 |      5270% | 100% | 100% |    179,843 |  15,018 |    100% |     100% |     100% |  192,635 |      19.2 |   7.7%
  95%  |  62,241 |  19,713 |    3,754 |       238% | 100% | 100% |      5,113 |     245 |    100% |     100% |     100% |   26,196 |       2.0 |   4.6%
  90%  |  20,788 |   7,297 |    1,637 |       117% | 100% | 100% |      2,704 |      92 |     93% |      93% |     100% |   10,457 |       1.5 |   3.8%
  80%  |   6,376 |   1,063 |      163 |        41% | 100% |  85% |      1,109 |      17 |     86% |      87% |      92% |    4,544 |       1.1 |   3.1%
  70%  |   2,863 |     247 |       12 |        18% |  93% |  63% |        561 |       1 |     82% |      83% |      85% |    2,042 |       0.8 |   2.6%
  60%  |   1,518 |      48 |       -2 |         0% |  74% |  40% |        329 |       0 |     78% |      80% |      83% |      935 |       0.7 |   2.1%
  50%  |     629 |       8 |      -14 |       -11% |  41% |  24% |        169 |       0 |     76% |      77% |      80% |      552 |       0.6 |   1.9%
  40%  |     226 |       0 |      -54 |       -23% |  19% |  15% |        100 |       0 |     74% |      75% |      75% |      293 |       0.5 |   1.6%
  30%  |      84 |       0 |     -199 |       -35% |   7% |   9% |         57 |       0 |     72% |      69% |      72% |      143 |       0.4 |   1.3%
  20%  |      21 |       0 |     -661 |       -52% |   2% |   5% |         27 |      -2 |     68% |      64% |      67% |       54 |       0.3 |   1.1%
  10%  |       6 |       0 |   -2,529 |       -76% |   0% |   1% |         11 |     -29 |     58% |      53% |      50% |       20 |       0.2 |   0.7%
   0%  |       1 |       0 | -311,866 |      -100% |   0% |   0% |          1 |  -8,682 |      0% |       0% |       0% |   -1,539 |       0.0 |   0.0%
This chart shows a set of metrics for Strategic Fit and Competitive Intensity.
Columns, grouped as in the original: Strategic Fit covers Wages (10th percentile wages)
and Percent of Employed (No College; Cert & Assoc; Bach; Grad+). Competitive Intensity
covers % of Completions (Cert & Assoc; Bach; Grad+), Number of Competitors, Cost per
Inquiry, and Completions per Capita.

Pctile | Wages 10th | No College | Cert&Assoc | Bach | Grad+ | Cmp C&A | Cmp Bach | Cmp Grad+ | Competitors | Cost/Inq | Cmpl/Capita
 100%  |      $88.1 |        83% |        83% |  64% |   95% |    100% |     100% |      100% |           1 |       $0 |        0.00
  95%  |      $56.1 |        63% |        56% |  52% |   58% |    100% |      90% |      100% |           1 |      $17 |        0.66
  90%  |      $52.2 |        55% |        51% |  47% |   51% |    100% |      77% |      100% |           1 |      $36 |        0.97
  80%  |      $45.5 |        38% |        42% |  44% |   36% |     85% |      55% |       98% |           2 |      $40 |        1.25
  70%  |      $40.3 |        27% |        35% |  39% |   24% |     46% |      40% |       82% |           2 |      $43 |        1.51
  60%  |      $36.6 |        18% |        31% |  33% |   16% |     20% |      21% |       57% |           3 |      $45 |        1.84
  50%  |      $32.8 |        14% |        28% |  27% |   10% |      8% |      10% |       32% |           4 |      $46 |        2.15
  40%  |      $29.4 |         9% |        23% |  20% |    5% |      2% |       0% |       13% |           8 |      $49 |        2.51
  30%  |      $27.3 |         4% |        15% |  14% |    3% |      0% |       0% |        2% |          12 |      $53 |        3.04
  20%  |      $24.6 |         1% |        10% |   9% |    2% |      0% |       0% |        0% |          23 |      $60 |        4.35
  10%  |      $21.2 |         0% |         4% |   4% |    1% |      0% |       0% |        0% |          48 |      $70 |        5.68
   0%  |       $6.2 |         0% |         0% |   0% |    0% |      0% |       0% |        0% |         953 |     $204 |       21.02
3.  Establish Illustrative Scoring: Gray creates a draft scoring system designed to
illustrate the interaction of the scoring process and the issues to consider in
selecting criteria and weights. This illustrative scoring system is not intended to
identify the optimal programs before it is refined in the workshop.
The scoring system assigns a value for every metric for each program. We sum
these values to calculate scores for each program for each of the four category
areas: student demand, employment opportunities, strategic fit, and competitive
intensity. We then sum those four values to get an overall score for each program.
The purpose of this illustrative scoring is to help the client understand how the
scoring works, and to provide a starting point for building the client-specific
scoring rules.
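The roll-up described above is simple to express in code. A minimal sketch, assuming metric scores have already been assigned; the program and metric names are illustrative, and the sample values are chosen so the category totals sum as 14 + 13 - 3 + 0 = 24, matching the Registered Nursing example later in this document:

```python
# Sketch of the score roll-up described above: metric scores sum to four
# category scores, which in turn sum to one overall score per program.

CATEGORIES = ("student_demand", "employment", "strategic_fit",
              "competitive_intensity")

def roll_up(metric_scores):
    """metric_scores: {category: {metric: score}} for one program.
    Returns (category_totals, overall_score)."""
    category_totals = {cat: sum(metric_scores.get(cat, {}).values())
                       for cat in CATEGORIES}
    return category_totals, sum(category_totals.values())

# Illustrative program (metric names invented for the sketch).
example = {
    "student_demand": {"ground_inquiries": 3, "yoy_change": 4,
                       "yoy_growth": 1, "completions_2013": 2,
                       "completions_change": 4},
    "employment": {"employment_rate_assoc": 6, "annual_openings": 3,
                   "jobs_per_grad": 0, "growth": 4},
    "strategic_fit": {"wages_10th_pct": 7, "workers_cert_assoc": -10},
    "competitive_intensity": {"num_competitors": 0, "cost_per_inquiry": 0},
}
totals, overall = roll_up(example)
print(totals, overall)  # overall is 24
```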
This is a section of a sample scoring table. It shows some of the rules used to
create a score, in this case a sub-score for student demand. The actual criteria and
scoring rules are refined collaboratively with the client institution.

Student Demand (minimum possible score: -7; maximum possible score: 17)

    Criteria                          Score
    2013 Completions > 3,000              0
    2013 Completions > 5,000              1
    2013 Completions > 20,000             2
    2013 Completions = 0                 -1
    Change in Completions < -100         -1
    Change in Completions > 200           1
    Change in Completions > 400           2
    Change in Completions > 2,000         4
    Student Inquiry = 0                  -1
    Student Inquiry > 20,000              1
    Student Inquiry > 40,000              2
    Student Inquiry > 60,000              3
    Student Inquiry Growth < -20%        -2
    Student Inquiry Growth < -10%        -1
    Student Inquiry Growth > 20%          1
    Student Inquiry Growth > 40%          2

4.  Plan Program Selection Workshop: We work with you to select participants,
dates, and an agenda for a program selection workshop. Typically, that workshop
requires one and a half or two days.

This is a sample workshop agenda:

Day 1
•  Workshop objectives
•  Overview of approach to program selection
•  Initial total region scoring outcomes
•  Initial local scoring outcomes
•  Refining scoring rules
   -  Breakout
   -  Full group
•  Refining total region and local scoring criteria
•  Initial decisions on top programs
•  Day 1 wrap-up: homework tasks and owners

Day 2
•  Review of Day 1 outcomes
•  Decisions on top programs – national
•  Decisions on top programs – local
•  Selection of “top 10” programs to create profiles for
•  Workshop wrap-up: next steps tasks and owners
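Threshold rules like those in the sample scoring table can be represented as data plus a small evaluator. This is a simplified sketch, not Gray’s actual implementation; in particular, treating overlapping thresholds as cumulative is an assumption made here for illustration:

```python
# Simplified sketch of threshold-based scoring rules like those in the
# sample table. Each rule maps a comparison on one metric to points;
# overlapping rules accumulate (an assumption of this sketch).

import operator

# (metric, comparison, threshold, points) -- drawn from the sample table.
RULES = [
    ("completions_2013", operator.gt, 3_000, 0),
    ("completions_2013", operator.gt, 5_000, 1),
    ("completions_2013", operator.gt, 20_000, 2),
    ("completions_2013", operator.eq, 0, -1),
    ("inquiry_growth", operator.lt, -0.20, -2),
    ("inquiry_growth", operator.lt, -0.10, -1),
    ("inquiry_growth", operator.gt, 0.20, 1),
    ("inquiry_growth", operator.gt, 0.40, 2),
]

def demand_score(metrics):
    """Sum the points of every rule the program's metrics satisfy."""
    return sum(points for name, op, threshold, points in RULES
               if name in metrics and op(metrics[name], threshold))

# A program with 22,000 completions (>3,000, >5,000, >20,000: 0 + 1 + 2)
# and 25% inquiry growth (>20%: +1) scores 4.
print(demand_score({"completions_2013": 22_000, "inquiry_growth": 0.25}))
```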
5.  Develop Working Papers: Gray develops the documents needed to run the program
workshop. This includes live versions of the data and scoring logic. By applying the
scoring logic to each program’s data – by market, if appropriate – we can see how
each program scores in detail. Here is an example for Registered Nursing:
Program: 51.3801 - Registered Nursing/Registered Nurse
Overall Score: 24

Student Demand (category score: 14)
    Inquiries, Ground: 248,290 (95th percentile; score 3)
    Inquiries, Online: 105,610 (95th percentile)
    Inquiries, YoY Absolute Change: 61,527 (100th percentile; score 4)
    Inquiries, YoY Growth: 33% (70th percentile; score 1)
    Completions, 2013 Total: 68,976 (95th percentile; score 2)
    Completions, Change from 2012: 8,318 (95th percentile; score 4)

Employment Opportunities (category score: 13)
    Employment Rate, Certificate: 94.9% (90th percentile)
    Employment Rate, Associate's: 81.4% (60th percentile; score 6)
    Employment Rate, Bachelor's+: 88.8% (70th percentile)
    Employment, Annual Job Openings: 150,278 (95th percentile; score 3)
    Employment, Jobs per Graduate: 0.7 (60th percentile; score 0)
    Employment, % Associate's & Below: 5% (20th percentile)
    Employment, % Bachelor's & Above: 53% (60th percentile)
    Employment, Growth: 3.3% (80th percentile; score 4)

Strategic Fit (category score: -3)
    Wages, 10th Percentile: $51 (80th percentile; score 7)
    Percent of Current Workers, No College: 1% (20th percentile; score 0)
    Percent of Current Workers, Certificate & Associate's: 43% (80th percentile; score -10)
    Percent of Current Workers, Bachelor's: 43% (70th percentile; score 0)
    Percent of Current Workers, Master's+: 13% (50th percentile; score 0)
    Percent of Completions, Certificate & Associate's: 9% (50th percentile; score 0)
    Percent of Completions, Bachelor's: 81% (90th percentile; score 0)
    Percent of Completions, Master's+: 10% (30th percentile; score 0)

Competitive Intensity (category score: 0)
    Number of Competitors: 409 (10th percentile; score 0)
    Cost per Inquiry: $41 (80th percentile; score 0)
    Completions per Capita: 2.89 (40th percentile; score 0)

Description: A program that generally prepares individuals in the knowledge,
techniques, and procedures for promoting health and providing care for sick,
disabled, infirmed, or other individuals or groups. Includes instruction in the
administration of medication and treatments, assisting a physician during
treatments and examinations, referring patients to physicians and other health
care specialists, and planning education for health maintenance.
Reading the table: the Overall Score at the top is the program’s total score. For each
criterion, the Value column shows the actual value; the Percentile column shows where
that value falls relative to every other program in the market (here, Baton Rouge, with
more than 800 programs scored); the Score column shows the points that value received;
and the Total Score column shows the sum for each of the four categories. In the
original chart, percentile cells are color-coded in bands: below 40%, 40-60%, 70%,
80%, and 90% and above.
This chart compares programs within a single market. A chart like this can be used to
compare existing programs, look at top candidates, or analyze other sub-groups of
programs. A program scan includes the data to create charts like this covering all
813 potential IPEDS programs in a market.

Program Scores by Quadrant (each bar stacks the four category scores, Student Demand,
Employment Opportunities, Strategic Fit, and Competitive Intensity, which sum to the
Overall Score):

    42.0101 - Psychology, General: 30
    14.0903 - Computer Software Engineering: 30
    51.0912 - Physician Assistant: 28
    14.1001 - Electrical and Electronics Engineering: 27
    26.0102 - Biomedical Sciences, General: 26
    51.2205 - Health/Medical Physics: 26
    30.1101 - Gerontology: 26
    51.3801 - Registered Nursing/Registered Nurse: 24
    14.2701 - Systems Engineering: 24
    51.2010 - Pharmaceutical Sciences: 23

6.  Facilitate Workshop: Gray facilitates the sessions, as appropriate, in the Program
Selection Workshop. Together, we refine and finalize the scoring logic for the total
relevant market. Once that is agreed on, we scale the scoring rules to appropriately
score programs at each individual campus. Options are refined and high-potential
programs identified for further evaluation.
7.  Refine and Summarize Workshop Recommendations: After the workshop, Gray
completes the program scan summary document that identifies programs selected
and next steps. For institutions with more than one geographic market, this
includes a cross-campus and program matrix to allow an “at a glance” view of
program potential.
8.  Profile selected programs: We sum the data for all of your campuses to create one
total-relevant-market profile for each program selected at the workshop. These
profiles include additional information, such as employment market assessments
and details from WANTED Analytics.

Gray Associates - Program Selection and Assessment

  • 1.
  • 2.
    GRAYASSOCIATES 2 1.  Set clearcriteria for program evaluation 2.  Obtain data on student demand, job opportunities, and competitive intensity 3.  Use the data to profile existing programs and set targets for new programs, identifying programs that have: §  Strong student demand §  Reasonable levels of competition §  Sufficient job openings for graduates and long-term employment growth 4.  Develop a shared understanding among relevant managers of: §  Program evaluation data and methods §  Choices and options 5.  Draft a scoring system for institution-wide and campus-level program selection 6.  Identify priority programs §  To create §  To cross-fertilize to other campuses 7.  Profile the selected new programs Objectives Gray program selection work helps you achieve the following objectives:
  • 3.
    GRAYASSOCIATES 3 Approach To achieve theseobjectives, Gray combines leading-edge databases, advanced analytics, and a collaborative process that ensures that program assessments reflect a consensus of the relevant managers. §  GrayData. As illustrated below, Gray has assembled a database on the market-drivers of a successful campus or program, including inquiries, applications, demographics, competition, job openings, job postings, and placement rates. We have mapped all the data down to the census tract level. We have built crosswalks to link related variables, such as job openings and completions. We have invested in a BI tool (QlikView) that enables us to rigorously screen hundreds of cities, locations, or programs. §  Comprehensive, systematic analysis. Often, schools lack the resources to analyze more than a few programs, so the selected programs may be only “good” rather than “best”. Similarly, schools often rely on manual processes to pull data from industry and regulatory databases, an approach that is slow, error-prone, and difficult (or impossible) to scale up. These approaches also lead to using unnecessarily simplistic data, such as competitor counts or lists, ignoring IPEDS data on size. Also, it is common to use BLS data for only the most common job type for a program, ignoring other fields that feed that field, and other fields that may be appropriate for graduates. In contrast, Gray’s investments in data, tools, and processes enables us to score all potential IPEDS programs for each individual local market using the best available data on student demand, competitive intensity, and job opportunities. §  Collaborative process. One risk with program assessments is that the client team does not fully understand the data, methods, and conclusions reached, undermining support for the new program (and risking that the conclusions do not reflect the insights of the management team). 
To avoid that risk, we facilitate a workshop that allows the senior team to understand the data, design its own scoring process, and apply its judgment to select the best programs. All participants leave understanding what programs were picked and why. GrayData & Crosswalks Demographics U.S. Census Job Postings WANTED Analytics Completions and Enrollments IPEDS Placement Rates Gray Research Employment BLS/O*NET Industry Inquiries GrayReports
  • 4.
    GRAYASSOCIATES 4 Methodology Gray’s methodology buildstoward a decision-making workshop. Leading up to the workshop, we assemble all the data, create an “illustrative” set of scoring rules to make sure that we have a good starting point for the workshop, and prepare all needed materials. At the workshop, we work with the internal team to finalize the scoring rules, validate them based on existing programs, identify high-scoring programs for the relevant overall market, and then adjust the scoring logic to enable scoring of each program for each local market. 1.  Set Scope: We work with the client to clarify the geographic scope for program analysis for the area surrounding its campuses (e.g., within 60 miles of a campus); in most cases, the client provides a file of starts (or completions) that Gray analyzes to determine the radius that covers substantially all of your students. 2.  Pull Data: For the selected geographic area, Gray creates a dataset for student demand, competition, and employment for all programs in IPEDS, using GrayData and other sources such as BLS and IPEDS. 3.  Establish Illustrative Scoring: Gray creates a draft scoring system designed to illustrate the interaction of the scoring process and the issues to consider in selecting criteria and weights. Please note: the illustrative scoring system is not intended to identify the optimal programs before it is refined in the workshop. 4.  Plan Program Selection Workshop: We work with you to select participants, dates, and an agenda for a program selection workshop. Typically, that workshop requires one and a half or two days. 5.  Develop Working Papers: Gray develops the documents needed to run the program workshop. This includes live versions of the data and scoring logic. 6.  Facilitate Workshop: Gray facilitates the sessions, as appropriate, in the Program Selection Workshop. Together, we refine and finalize the scoring logic for the total relevant market. 
Once that is agreed on, we then scale the scoring rules to appropriately score programs at each individual campus. Options are refined and high-potential programs identified for further evaluation. 7.  Refine and Summarize Workshop Recommendations: After the workshop, Gray completes the program scan summary document that identifies programs selected and next steps. This includes a cross-campus and program matrix to allow an “at a glance” view of program potential. 8.  Profile selected programs: We sum the data for all of your campuses to create one total-relevant-market profile for each program selected at the workshop. These profiles include additional information, such as employment market assessments and details from Wanted Analytics.
  • 5.
    GRAYASSOCIATES 5 Tasks and Deliverables 1. Set Scope: Typically, we facilitate a kick-off meeting (conference call) to ensure a shared understanding of the scope, objectives, planned deliverables, and responsibilities. As part of the preparation for or follow-up to that meeting, we clarify the geographic scope for program analysis. That may involve discussions about whether to exclude any campuses or include any anticipated campus sites. It may also include analysis of client data to determine an appropriate radius around each campus, or some other “rules” for defining the relevant market areas. In the example below, 92% of students come from within 35 miles of a campus. In this case, we would use a 35 mile radius to define markets, since it would include the vast majority of students, without accidentally including potential students, competitors, or jobs that may not really be part of the local market. Deliverables at this step include an agreed-on statement of scope, market definitions, and project schedule. 29% 57% 74% 83% 92% 95% 97% 98% 98% 99% 99% 99% 20%   30%   40%   50%   60%   70%   80%   90%   100%   110%   Miles  from  Student’s  Home  to  Campus   Percentage  of  Students  by  Distance  to  Campus  
  • 6.
    GRAYASSOCIATES 6 Tasks and Deliverables 2. Pull Data: For the selected geographic area, Gray creates a dataset for student demand, competition, and employment for all programs in IPEDS, using GrayData and other sources such as BLS and IPEDS. Typically, we create a dataset that includes the market surrounding each campus. This enables us to look at all campuses combined as well as each campus individually. For on-line programs, we can look at the national market or at an appropriate geographic region, depending on where the institution expects to draw its online students from. The metrics are typically aligned in four categories: student demand, employment opportunities, strategic fit, and competition. Within each category, we use several different metrics to cross-validate the information and provide a sufficiently broad view of how each program rates. We can also construct additional or different metrics to better align with a client’s view of their own market environment. For example, different schools target very different mixes of award levels for their programs. The table below shows metrics for student demand and employment opportunities. Each column shows one metric. The columns are color-coded to group similar categories of these metrics. Each row represents percentiles of the entire set of IPEDS programs. For instance, in the top left corner, the program with the highest number of inquiries for an on-campus program had 894,089 inquiries. The median program (#406 out of the 813 programs in GrayData) had 629 ground inquiries.     Student  Demand   Employment  Opportuni=es       Inquiries   Comple=ons   Employment  Rate   Employment       Ground   Online   Year-­‐ over-­‐ Year   Change   Inquiry   Growth   PCT   Assoc   &   Below   PCT   Bach  &   Above   2013   Total   Change   from   2012   (000)   Cert.   
Assoc   Bach   and   Above   Annual   Job   Openings   Job   Open   per   Grad   Growth   100%   894,089   599,985   61,527   5270%   100%   100%   179,843   15,018   100%   100%   100%   192,635   19.2   7.7%   95%   62,241   19,713   3,754   238%   100%   100%   5,113   245   100%   100%   100%   26,196   2.0   4.6%   90%   20,788   7,297   1,637   117%   100%   100%   2,704   92   93%   93%   100%   10,457   1.5   3.8%   80%   6,376   1,063   163   41%   100%   85%   1,109   17   86%   87%   92%   4,544   1.1   3.1%   70%   2,863   247   12   18%   93%   63%   561   1   82%   83%   85%   2,042   0.8   2.6%   60%   1,518   48   -­‐2   0%   74%   40%   329   0   78%   80%   83%   935   0.7   2.1%   50%   629   8   -­‐14   -­‐11%   41%   24%   169   0   76%   77%   80%   552   0.6   1.9%   40%   226   0   -­‐54   -­‐23%   19%   15%   100   0   74%   75%   75%   293   0.5   1.6%   30%   84   0   -­‐199   -­‐35%   7%   9%   57   0   72%   69%   72%   143   0.4   1.3%   20%   21   0   -­‐661   -­‐52%   2%   5%   27   -­‐2   68%   64%   67%   54   0.3   1.1%   10%   6   0   -­‐2,529   -­‐76%   0%   1%   11   -­‐29   58%   53%   50%   20   0.2   0.7%   0%   1   0   -­‐311,866   -­‐100%   0%   0%   1   -­‐8,682   0%   0%   0%   -­‐1,539   0.0   0.0%  
  • 7.
    GRAYASSOCIATES 7 Tasks and Deliverables Thischart shows a set of metrics for Strategic Fit and Competitive Intensity.     Strategic  Fit   Compe==ve  Intensity       Wages   Percent  of  Employed   %  of  Comple=ons   Number  of   CompeTtors   Cost  Per  Inquiry   Comple=ons  Per   Capita       10th  PercenTle   Wages   No  College   Cert  &  Assoc   Bach   Grad+   Cert  &  Assoc   Bach   Grad+   100%   $88.1   83%   83%   64%   95%   100%   100%   100%   1   $0   0.00   95%   $56.1   63%   56%   52%   58%   100%   90%   100%   1   $17   0.66   90%   $52.2   55%   51%   47%   51%   100%   77%   100%   1   $36   0.97   80%   $45.5   38%   42%   44%   36%   85%   55%   98%   2   $40   1.25   70%   $40.3   27%   35%   39%   24%   46%   40%   82%   2   $43   1.51   60%   $36.6   18%   31%   33%   16%   20%   21%   57%   3   $45   1.84   50%   $32.8   14%   28%   27%   10%   8%   10%   32%   4   $46   2.15   40%   $29.4   9%   23%   20%   5%   2%   0%   13%   8   $49   2.51   30%   $27.3   4%   15%   14%   3%   0%   0%   2%   12   $53   3.04   20%   $24.6   1%   10%   9%   2%   0%   0%   0%   23   $60   4.35   10%   $21.2   0%   4%   4%   1%   0%   0%   0%   48   $70   5.68   0%   $6.2   0%   0%   0%   0%   0%   0%   0%   953   $204   21.02   3.  Establish Illustrative Scoring: Gray creates a draft scoring system designed to illustrate the interaction of the scoring process and the issues to consider in selecting criteria and weights. This illustrative scoring system is not intended to identify the optimal programs before it is refined in the workshop. The scoring system assigns a value for every metric for each program. We sum these values to calculate scores for each program for each of the four category areas: student demand, employment opportunities, strategic fit, and competitive intensity. We then sum those four values to get an overall score for each program. 
The purpose of this illustrative scoring is to help the client understand how the scoring works, and to provide a starting point for building the client-specific scoring rules.
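The roll-up described above is simple arithmetic. A hypothetical sketch is below; the metric names and point values are illustrative placeholders, not Gray's actual scoring rules.

```python
# Hypothetical sketch of the scoring roll-up: metric-level points are summed
# into four category scores, and the category scores are summed into an
# overall program score. Metric names and point values are illustrative only.

metric_points = {
    "student_demand":        {"inquiries": 3, "yoy_change": 4, "completions": 2},
    "employment":            {"job_openings": 3, "jobs_per_grad": 0, "growth": 4},
    "strategic_fit":         {"wages": 7, "pct_current_workers": -10},
    "competitive_intensity": {"competitors": 0, "cost_per_inquiry": 0},
}

# One score per category, then one overall score per program.
category_scores = {cat: sum(pts.values()) for cat, pts in metric_points.items()}
overall_score = sum(category_scores.values())
```

Because the overall score is just a sum, the workshop discussion of criteria and weights reduces to deciding which metrics appear in each category and how many points each threshold is worth.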
This is a section of a sample scoring table. It shows some of the rules used to create a score, in this case a sub-score for student demand. The actual criteria and scoring rules are refined collaboratively with the client institution.

Sample scoring rules – Student Demand (minimum possible score -7, maximum possible score 17):

Criteria                          Value
2013 Completions > 3,000            0
2013 Completions > 5,000            1
2013 Completions > 20,000           2
2013 Completions = 0               -1
Change in Completions < -100       -1
Change in Completions > 200         1
Change in Completions > 400         2
Change in Completions > 2,000       4
Student Inquiry = 0                -1
Student Inquiry > 20,000            1
Student Inquiry > 40,000            2
Student Inquiry > 60,000            3
Student Inquiry Growth < -20%      -2
Student Inquiry Growth < -10%      -1
Student Inquiry Growth > 20%        1
Student Inquiry Growth > 40%        2

4.  Plan Program Selection Workshop: We work with you to select participants, dates, and an agenda for a program selection workshop. Typically, the workshop requires one and a half to two days. This is a sample workshop agenda:

Day 1
•  Workshop objectives
•  Overview of approach to program selection
•  Initial total-region scoring outcomes
•  Initial local scoring outcomes
•  Refining scoring rules (breakout, then full group)
•  Refining total-region and local scoring criteria
•  Initial decisions on top programs
•  Day 1 wrap-up: homework tasks and owners

Day 2
•  Review of Day 1 outcomes
•  Decisions on top programs – national
•  Decisions on top programs – local
•  Selection of "top 10" programs to profile
•  Workshop wrap-up: next-step tasks and owners
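The rules in the sample table read naturally as cumulative thresholds: a program earns the points attached to the highest threshold its value crosses. A minimal sketch for the "2013 Completions" rows follows; this reading of the table, and the function name, are our assumptions, not Gray's specification.

```python
# Minimal sketch of the threshold-based rules above: a program earns the
# points attached to the highest "2013 Completions" threshold it crosses.
# This reading of the table is an assumption; the actual rule engine may differ.

def score_2013_completions(completions: int) -> int:
    if completions == 0:
        return -1            # the "= 0" rule
    if completions > 20_000:
        return 2
    if completions > 5_000:
        return 1
    return 0                 # at or below 5,000 (the "> 3,000" row awards 0)

print(score_2013_completions(68_976))  # prints 2
```

Writing each rule as an explicit threshold keeps the workshop conversation concrete: changing a cutoff or a point value is a one-line edit whose effect on every program's score can be recomputed immediately.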
5.  Develop Working Papers: Gray develops the documents needed to run the program workshop, including live versions of the data and scoring logic. By applying the scoring logic to each program's data – by market, if appropriate – we can see how each program scores in detail. Here is an example for Registered Nursing:

Program: 51.3801 - Registered Nursing/Registered Nurse.  Overall Score: 24

Student Demand (category score: 14)
  Inquiries - Ground              248,290    95%    3
  Inquiries - Online              105,610    95%
  YoY Absolute Change              61,527   100%    4
  YoY Growth                          33%    70%    1
  Completions - 2013 Total         68,976    95%    2
  Change From 2012                  8,318    95%    4

Employment (category score: 13)
  Annual Job Openings             150,278    95%    3
  Jobs Per Graduate                   0.7    60%    0
  Growth                             3.3%    80%    4
  Employment Rate - Certificate     94.9%    90%
  Employment Rate - Associate's     81.4%    60%    6
  Employment Rate - Bachelor's+     88.8%    70%
  % Associate's & Below                5%    20%
  % Bachelor's & Above                53%    60%

Strategic Fit (category score: -3)
  Wages - 10th Percentile             $51    80%    7
  Current Workers - No College         1%    20%    0
  Current Workers - Cert & Assoc      43%    80%  -10
  Current Workers - Bachelor's        43%    70%    0
  Current Workers - Master's+         13%    50%    0
  Completions - Cert & Assoc           9%    50%    0
  Completions - Bachelor's            81%    90%    0
  Completions - Master's+             10%    30%    0

Competitive Intensity (category score: 0)
  Number of Competitors               409    10%    0
  Cost Per Inquiry                    $41    80%    0
  Completions Per Capita             2.89    40%    0

Description: A program that generally prepares individuals in the knowledge, techniques, and procedures for promoting health and providing care for sick, disabled, infirm, or other individuals or groups.
Includes instruction in the administration of medication and treatments, assisting a physician during treatments and examinations, referring patients to physicians and other health care specialists, and planning education for health maintenance.

Reading the scorecard:
•  The Overall Score at the top is the program's total score.
•  The Value column shows the actual value for each criterion.
•  The Percentile column shows where that value falls relative to every other program in Baton Rouge; for instance, Ground Inquiries rank at the top because this program has the most inquiries of the 800+ programs in Baton Rouge.
•  The Score column shows the points that value received.
•  The final column shows the total score for each of the four categories.
•  Cell colors represent percentile ranges: < 40%, 40%-60%, 70%, 80%, 90%+.
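The percentile column can be thought of as a simple rank of a program's value against the same metric for every other program in the market. A hypothetical sketch follows; the helper function and the sample market values are ours, for illustration only.

```python
# Hypothetical sketch of percentile ranking: a program's metric value is
# compared against the same metric for every other program in the market
# (the deck ranks against 800+ programs in Baton Rouge). Values illustrative.

def percentile_rank(value: float, market_values: list[float]) -> float:
    """Share of programs in the market with a strictly smaller value."""
    below = sum(1 for v in market_values if v < value)
    return below / len(market_values)

# Tiny hypothetical market of five programs' ground inquiries:
ground_inquiries = [248_290, 105_610, 61_527, 12_400, 830]
top_rank = percentile_rank(248_290, ground_inquiries)  # highest value in market
```

With percentiles computed this way, the color bands on the scorecard are just fixed cut points applied to each metric's rank.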
This chart compares programs within a single market. A chart like this can be used to compare existing programs, look at top candidates, or analyze other sub-groups of programs. A program scan includes the data to create charts like this covering all 813 potential IPEDS programs in a market.

Program Scores by Quadrant (Student Demand, Employment Opportunities, Strategic Fit, Competitive Intensity) – Overall Scores:
  30   42.0101 - Psychology, General
  30   14.0903 - Computer Software Engineering
  28   51.0912 - Physician Assistant
  27   14.1001 - Electrical and Electronics Engineering
  26   26.0102 - Biomedical Sciences, General
  26   51.2205 - Health/Medical Physics
  26   30.1101 - Gerontology
  24   51.3801 - Registered Nursing/Registered Nurse
  24   14.2701 - Systems Engineering
  23   51.2010 - Pharmaceutical Sciences

6.  Facilitate Workshop: Gray facilitates the sessions, as appropriate, in the Program Selection Workshop. Together, we refine and finalize the scoring logic for the total relevant market. Once that is agreed, we scale the scoring rules to appropriately score programs at each individual campus. Options are refined, and high-potential programs are identified for further evaluation.
7.  Refine and Summarize Workshop Recommendations: After the workshop, Gray completes the program scan summary document, which identifies the programs selected and the next steps. For institutions with more than one geographic market, this includes a cross-campus program matrix that provides an "at a glance" view of program potential.

8.  Profile Selected Programs: We sum the data for all of your campuses to create one total-relevant-market profile for each program selected at the workshop. These profiles include additional information, such as employment market assessments and details from WANTED Analytics like those below: