Evaluation in Higher Education 5:1 (June 2011)
1. Introduction
In recent years, higher education policy in Europe has been characterised
by a growing differentiation of the higher education system as a factor of
modernisation, driven by the catalytic forces of the Bologna Process towards
shifts in thinking and acting within universities. Meanwhile, universities are
being granted more autonomy, and in the resulting competitive situation (Hödl &
Zegelin, 1999) they are expected to become customer-oriented (Hansen, 1999;
Pausits, 2006), cost-aware, and sensitive to the needs of society. The approach
adopted by public authorities with regard to HEIs has essentially transformed,
and the shift towards enlarged “missions” has been strongly influenced by ideas
of the “entrepreneurial university” (Clark, 1998). “Universities are seen by many
to be increasingly significant sources of knowledge and capabilities in the
knowledge economy” (Arbo & Benneworth, 2007); this perspective provides a
framework for understanding what universities are really doing and the variety
of networks within which HEIs prosecute their missions. Policy-makers and analysts alike have
begun to pay more attention to the ways in which university-based capabilities
and activities can contribute to social and economic development. There is a
common understanding of the two core missions of universities: teaching and
research. These are at the heart of all activities and are therefore the engines of
institutional development, as well as core elements of university outputs.
However, in recent years, another mission has come to be considered in order
to reflect all the contributions of universities to society, generally known as the
‘Third Mission’ or Third Stream (Molas-Gallart, Salter, Patel, Scott, & Duran,
2002). While numerous ranking concepts exist for the first and second missions,
the Third Mission is not included as a core element in existing rankings. The
generally recognised ranking systems for the traditional missions of the
university -- such as the Academic Ranking of World Universities, commonly
known as the Shanghai Ranking, or the Times Higher Education World
University Rankings -- present indicators that assess excellence at universities
mainly through research and teaching.
While rankings can improve quality assurance by allowing institutions
to understand their own performance, develop best practices and provide
effective and efficient value to society, it is important that third mission activities
-- as components of institutional performance -- are also part of such
Marhl, Pausits: Third Mission Indicators for New Ranking Methodologies
rankings. However, there are no commonly agreed indicators or methodologies
to assess quality in Third Mission activities.
Universities have contributed directly and indirectly to decision-making in
wider society since their inception, but such activities have not been as essential
to their role as the first two missions of university activity -- research and
teaching. Now, developments in this field have reached the stage where
universities’ “Third Mission” contributions are seen as important and distinctive
in their own right, deserving specific policies and resources to ensure their
effective functioning.
We are at a key moment in the development of university Third Mission
activities. Various governmental initiatives are aimed at encouraging universities
to invest more in this area, bringing with them significant new funding
opportunities for universities. Many universities are seeking to gather
information on their Third Mission activities in order to ensure their effective
management and to underpin funding bids. Governments have shown signs of
making funding for Third Mission activities a permanent feature of the
university funding landscape. Such funding decisions also need to be based on
information about performance in Third Mission activities (Molas-Gallart et al., 2002).
First, drawing on the state-of-the-art literature, we discuss Third Mission
activities as one of the major topics for higher education institutions. Then we
describe a theoretical framework for the third mission and related activities,
organised into three dimensions as a classification model: “Continuing
Education”, “Technology Transfer & Innovation”, and “Social Engagement”.
From this conceptual framework, the different processes associated with each
dimension are described. The identification and definition of these processes
allow us to design a set of indicators for each dimension. Finally, the Delphi
method is used to obtain a selected set of indicators (relevant and feasible)
which form the basis of the ranking methodology criteria. This article is based
on a European Erasmus Lifelong Learning project called E3M.¹ The project
objectives are to create European standard indicators to measure the effectiveness
of Third Mission provision, as well as a ranking methodology to benchmark the
European Third Mission services of higher education institutions. The main
purpose is to generate a comprehensive instrument to identify, measure, and
compare Third Mission activities from a wide perspective.

¹ http://www.e3mproject.eu
2. Understanding Third Mission
The modernisation agenda of higher education is reflected in many
new policies and higher education laws across Europe. Governments are
pushing universities to become more accountable and responsible for the
resources they use and how they use them. At the same time, the increasing
number of ranking initiatives, in Europe as elsewhere, underlines the importance
of competition, value awareness and contribution. The value of universities, and
how to measure it, is central to many policy and academic discussions.
The third mission, as a new buzzword of higher education modernisation,
leads to differing understandings of the activities of universities. These
institutions were founded primarily on two sets of activities: teaching and
research. Already in the 1970s, the German “Bildungsrat” (Advisory Board for
Education) defined continuing education as the third pillar of the university --
besides teaching and research (Deutscher Bildungsrat, 1970).
This definition of third mission activities is very narrow and focuses only
on lifelong learning and continuing education as an additional goal besides
traditional degree-oriented undergraduate and graduate education. Lifelong
learning can be seen as a commonly agreed important issue for the development
of society and of individuals. Lifelong learning, or continuing education, as a
concept includes tertiary education and forces universities to enlarge their
teaching portfolio for adult learners and to offer postgraduate and professional
development programmes as well. On the other hand, in many European
countries, such as Germany, Austria or Hungary, continuing education has a
strong commercialisation aspect as a source of additional funding for
universities. In fact, both directions of argument for the development of LLL
activities at universities show the importance of education beyond traditional
undergraduate and graduate education, and such education is one part of
universities’ third mission.
According to the Russell Report, universities have always made
contributions to decision-making in wider society; this is their ‘third mission’.
Third mission activities are therefore concerned with the generation, use,
application and exploitation of knowledge and other university capabilities
outside academic environments (Molas-Gallart et al., 2002). Finally, the expert
group in the report agreed that the Third Mission is about the interactions
between universities and the rest of society. This general definition is too broad
and needs further investigation to identify the specific tasks and activities which
can be seen as third mission activities and which distinguish them from the other
two missions. The basic understanding and the framework developed in the
Russell Report (see Figure 1) see the third mission as based on the first two
missions and understand it as special services, built on university activities and
capabilities, offered for and to society.
In general, the third mission is the vehicle by which universities leave the
ivory tower and increase their collaboration and exchange with society.
Etzkowitz and Leydesdorff (1997) proposed the triple helix model of industry,
government and university collaboration (Etzkowitz, 2001; Etzkowitz &
Leydesdorff, 1997). In this concept, universities are essential actors in new
knowledge production, and the third mission is the key to becoming
entrepreneurial universities.

[Figure 1 depicts the Russell Report framework: university capabilities
(knowledge capabilities, facilities) and activities (research, teaching,
communication) feed, through exploitation and use, into associated Third
Stream activities such as advisory work and contracts, technology
commercialisation, entrepreneurial activities, student placements, learning
activities, curricula alignment, social networking, non-academic dissemination,
commercialisation of facilities, contract research, collaboration in academic
research, and staff flow.]

Figure 1. Conceptual Framework for Analysing Third Stream Activities.
Source: Molas-Gallart et al. (2002).

Almost a decade later, a network of European experts developed a “radar” of
third mission elements made up of 8 dimensions, proposing for each a core set
of indicators and/or descriptors (see Table 1). The goal of this initiative was to
characterise the contribution of universities to the third mission.²
Table 1. The “Radar” of Third Mission Elements Proposed
by the PRIME-OEU Project³
Issues Focus, main indicators and descriptors
1. Human resources Focus: Transfer of embodied knowledge in PhD students and graduates.
Comment: This axis screens the transfer of “competences trained through
research” to industry and “mission oriented” public services.
Indicators: The number and share of PhD diplomas going to industry and
public services (distinguishing between R&D and non-R&D positions).
2. Intellectual property Focus: Codified knowledge produced by the university and its management
(patents, copyright). Indicators concern not only patents owned by the
university, but university “inventors” (whatever the grantee is). Patent
numbers should be complemented by licences granted and fees received.
3. Spin offs Focus: Knowledge transfer through entrepreneurship.
Indicators: Simple counts are not enough; a typology of relationships
between spin-off firms and labs has to be considered (staff that left, staff
still involved, research contracts, licences granted...).
Descriptors are needed to characterise university involvement and support:
dedicated teams, incubator, funds provided (in whatever form, including
shareholding).
4. Contracts with industry Focus: Knowledge co-production and circulation to industry. This is
taken as the main marker of the attractiveness of universities for existing
economic actors.
Indicators: Number of contracts, amount as a share of total resources,
type of partners (global, large firms, SME) are the key aspects.
Level of concentration (sectoral and/or on a few partners), types of
contract (research, consultancy, services) and duration are important
complementary aspects.
Delineating in large labs the degree of concentration (thematic or on given
teams) is also often of strategic interest.
Comment: This is often complemented by a “soft” dimension which takes
account of membership of professional associations (and roles played in given
professional networks), professional publications, activities in continuing
training, consultancy activities (often not paid to the lab) and internships
(master students accepted in “stages”).
² See the final report of the OEU project at http://www.enid-europe.org or www.prime-noe.org
³ http://www.prime-noe.org
5. Contracts with public bodies
Focus: The “public service” dimension of research activities.
Indicators: Similar aspects as for contracts with industry apply, especially
differentiating between co-research and services.
Comment: It is important to complement contracts with non-market relations,
which are often critical when labs focus on social and cultural dimensions
(this often has important implications for identity building, but also for
economic activities such as tourism). This is also very present in health
research (with clinical trials for new therapeutic protocols...).
6. Participation in policy making
Focus: Involvement in the shaping and/or implementation of policies (at
different levels). This is often captured under the wording of “expertise”,
including policy studies, participation in the formulation of long-term
programmes or in ‘formalised’ debates on S&T&I policy, and involvement
in standard-setting committees and in committees and work on safety
rules.
Descriptors: The usual mode is to consider a description in the annual
report in order to build an indicator of presence and ‘relative importance’
(number of different activities and entities, number of persons involved).
7. Involvement in social and cultural life
Focus: Involvement of the university in “societal” (mostly “city”) life.
Comments:
A number of universities have lasting “facilities” that participate in the
social and cultural life of the city (museums, orchestras, sport facilities,
facilities like libraries open to schools or citizens...). Some involve
themselves in opening “social services” (like law shops).
Besides these “structural” investments, a number of labs involve
themselves in given social and cultural events (expos, concerts, urban
development projects...).
Descriptors: There is little accumulated knowledge on how to account
for such activities. Two approaches are being experimented with: accounting
for their relative importance in all university investments and/or activities,
and positioning them within their own environment (as can be done for
museums).
8. Public understanding of science
Focus: Interaction with society.
Comment: The choice has been to focus here only on “dissemination”
and interaction with the “general public”. All growing aspects of
involvement in public debates are considered part of dimension 6
(participation in policy making).
Descriptors: Follow the sets of activities deployed (open days, involvement
in scientific fairs and the like, contributions to the general press and popular
science journals, involvement in the different media, construction of
“dissemination” and “interactive” websites, and activities directed towards
children and secondary schools...). Differentiate between individual
initiatives and proactive policies of labs and of the university (as a whole or
through its departments).
The notion of third mission is quite ambiguous. How it is understood
depends upon the configuration of university activities and upon the
institution’s own understanding of them. The table above describes the third
mission as a result of research activities (issues 1 to 5) but also delivers
elements with stronger aspects of community and social outreach (issues 6 to
8). Many official
documents and policy papers describe universities as the “dynamos of growth”
in the knowledge-driven economy. In particular, universities are seen to have a
role in economic development based on the commercialisation of research. Yet
there is much more to the relationship between universities and the rest of
society than simply commercial activities. Universities contribute to
government, civil society and the private sector, not simply through financial
performance but also by improving quality of life and the value of public
services. A view of Third Mission activities that focuses purely on university
commercial activities is likely to miss large and important parts of the picture
(Molas-Gallart et al., 2002). The social dimension of third mission activities is
as important as the commercialisation of knowledge through spin-off activities
or professional development programmes.
Arbo and Benneworth (2007) note that the third mission was originally
conceived of as an addendum to universities’ primary tasks of teaching and
research, but that nowadays it is more and more expected to be an integrated
part of their mission and operation. On the other hand, Montesinos, Carot,
Martinez, and Mora (2008) identified three different driving forces of the
university third mission: social, enterprising and innovative. The social third
mission is a set of activities not based on economic revenues: a commitment to
services for society through volunteer contributions, social networking or
cultural activities open to the public on campus. As we have already discussed
the third mission as a commitment to an entrepreneurial university, it also
implies commercial benefits for universities as differentiated sources of funding.
In the enterprising dimension of the third mission, for example, activities such
as collaborative and contract research with industry, the commercialisation of
intellectual property, professional development programmes, or the renting of
campus facilities for fairs or conferences are services that create revenue.
Finally, the last dimension, in contrast to the second, does not have financial
benefits as its main driving force. Here the notion of improvement is the key
element; examples are regional innovation, networking with entrepreneurs,
patent exploitation, and government consulting. The innovation third mission is
the use of research for change.
Based on the scientific and professional approaches mentioned above,
the E3M network developed a specific set of third mission dimensions. All
dimensions are identified by individual activities; some activities are
crosscutting themes and therefore belong to more than one dimension. These
activities are described later in this paper. The first dimension is “continuing
education”: education organised and managed by the university as a service
positioned between work, leisure time and education. This includes degree and
non-degree education/training, with adults as the target audience. The second
dimension is “technology transfer and innovation”, which covers knowledge
exchange activities, especially in the context of the use of research. The third
dimension is “social engagement”: the collaboration between universities and
their larger communities (local, regional, national, and global) for the mutually
beneficial exchange of knowledge and resources in the context of a
not-for-profit relationship.
The first two dimensions are mainly driven by profit orientation and
the basic understanding of an entrepreneurial university, addressing the
interaction with society from an economic perspective. The last dimension is
more closely related to the role of the university as a provider of social services
for the community. State universities in particular, as not-for-profit entities, have
to have a commitment to social welfare and to activities which may not
automatically lead to financial returns, or which may even cause costs for the
university. But as a pillar of societal development, universities also have a social
responsibility and have to develop activities to fulfil this special role as well.
The origin of third mission activities can be seen in research as an engine
for creating new and better economic solutions for societies, as well as for
creating additional third-party funding for universities themselves. In the
meantime, continuing and adult education receives greater attention at
universities seeking to become players in the lifelong learning framework, to
attract adult learners and to develop the “partner for life” university. In most
European countries financial aspects are crucial, and continuing education is
therefore partly a business-driven phenomenon within third mission activities.
Only social engagement as part of the third mission seems less influenced by
financial aspects; it creates a social-entrepreneur environment for third mission
activities. Finally, we can see that the third mission as a service has strong links
to the idea of an entrepreneurial university. Business and social entrepreneurship
can be identified in universities as part of the service-oriented, modern and
competitive university, and they are significant elements of the third mission.
3. How to Measure Third Mission Activities
For measuring third mission activities we need indicators. They form
the skeleton and the basic framework enabling us to assess to what extent
universities fulfil their third mission. The question arises, however, of how to
define feasible indicators. First, we have to have a clear picture of the main
streams of third mission activities that universities usually conduct; we call
them the dimensions of third mission activities. Second, we define the
processes which characterise particular dimensions, and third, based on the
well-defined processes, we look for appropriate and feasible indicators.
3.1 Dimensions of the Third Mission
The dimensions of the third mission activities are the main features and the
most prominent entities of the activities which universities carry out in order to
fulfil their third mission. We analysed the usual activities of different
universities. Some universities are stronger and more active in one segment,
and some in another. It is not easy, and not at all trivial, to arrive at a unique set
of dimensions. It is also difficult to have completely independent dimensions,
since some activities are important not for only one dimension but are partially
involved in several. However, our analysis has shown that it is reasonable to
introduce three basic dimensions of university third mission activities:
Continuing Education (CE), Technology Transfer & Innovation (TTI), and
Social Engagement (SE).
3.2 Processes
Once the main dimensions are established, the obvious next step is to look
at the processes related to those dimensions. This enables us to recognize
possible indicators for measuring particular aspects of the given dimensions. It
is not a simple task to define, or even to recognise and identify, all the
processes. Let us present an example for the dimension Technology Transfer &
Innovation. At a minimum, we can look at the
• Entrepreneurial Process
• Structural Cooperation Process
• Network Process
The Entrepreneurial Process includes, for example, contract-based research
and consultancy, the protection of intellectual property rights (IPR), licensing,
the establishment of start-up, spin-off, and spin-out companies, university
business incubators, and scientific, discovery, and technology parks.
The Structural Cooperation Process includes, for example, the extent of
cooperation in research and development, the sharing of space, facilities and
equipment, cooperation in education, the mobility of people, etc.
The Network Process includes formal and informal networking. This may
sound rather trivial, but networking in all its forms is gaining ever more
importance, in particular with recent trends of globalisation and new
information channels -- above all the internet and all other possibilities of
communication.
3.3 Indicators
The analysis of the processes enabled us to produce a rather large set of
possible indicators. All these indicators needed to be evaluated, and we had to
apply suitable methods for selecting the most relevant ones. For this purpose
we applied the Delphi method, originally developed in the 1950s by the RAND
Corporation in Santa Monica, California (Cuhls, 2011). It is a survey method
based on experts’ opinions obtained in consecutive rounds: the information
obtained in the first round is used as the basis for the questionnaire in the
second round, and this procedure is repeated two or three times. We used three
rounds. During the first and second rounds we looked for common visions
among the experts, and in the third round we tried to obtain a more global view
of the whole set of indicators.
In the following we present the outcomes of the Delphi process according
to the particular steps of our procedure. We start with the selection of experts for
the Delphi process, and then separately present the basic frame of outcomes
for all three steps of the Delphi process. In particular, we present the selection
process and the way in which the most relevant indicators were reduced and
extracted in each of the three steps. At the end, the final results with the current
final set of indicators are presented separately.
3.3.1 Selection of Experts for the Delphi Process
First of all, relevant experts had to be selected for the Delphi process. We
collected proposals from all of the partners in the project. Our colleagues were
requested to make individual proposals for experts whom they believed to be
specialists in a particular dimension, whether CE, TTI, or SE. Once we had
received all the proposals, we selected the experts and established the final list.
In this selection process, three basic criteria were mainly considered: the
expert’s profile, the needs of the Delphi process, and the availability of funds.
3.3.2 Role of Experts
The questionnaires were sent to the experts, who were requested to give
their opinion on how the indicators were described and to express their general
view on the whole set of indicators, in order to achieve a consensus on the best
indicators for measuring third mission activities. Depending on their field of
expertise, the experts contributed to one, two or three dimensions. For each
dimension we developed a separate questionnaire. In the first round, the experts
also had the opportunity to suggest additional indicators if they felt something
was lacking in some aspect of the original proposal. The main role of the
experts was to select the most relevant indicators from the initial set, which
contained a rather large number of them.
3.3.3 The 1st Delphi Round
In the first round, three different questionnaires were elaborated according
to the three dimensions that we identified in the frame of third mission
activities. Experts were asked to evaluate the indicators, and they also had the
opportunity to propose additional indicators where they considered them to be
needed. However, since we already had more than one hundred indicators in the
first round, the experts were asked to follow the main idea and try to identify
the most relevant indicators, thereby actually reducing the number of indicators
in the initial set.
The evaluations obtained from the experts were analysed. For the analysis
we decided to take into account the following considerations:
• Treatment of missing values
• Selection of indicators -- organizing them into five categories
• Reformulation of indicators
(1) Treatment of Missing Values
On receiving the experts’ reports, we realized that some of the questions
had not been answered. However, since the rate of unanswered questions was
minimal and rather evenly distributed, i.e., not concentrated on specific items,
we decided not to take any corrective action and instead calculated the
descriptive statistics and the dispersion excluding the missing values.
(2) Selection of Indicators -- Organizing into Five Categories
The experts evaluated the indicators according to their “Relevance”,
“Validity”, “Reliability”, “Feasibility”, and “Comparability”. For each of
these attributes the experts gave an assessment on a 4-step scale:
“Unimportant”, “Slightly important”, “Important”, and “Very important”.
On the basis of the experts’ assessments we calculated, for each indicator,
the percentage of answers in which the attribute was marked as “Important” or
“Very important”. With these values the following decision criteria were
established:
1. If more than 66% of answers for the attribute “Relevance” were “Important”
or “Very important”, the indicator was maintained without any further
consideration.
2. If less than 66% of answers for the attribute “Relevance” were “Important”
or “Very important”, the other attributes (“Validity”, “Reliability”,
“Feasibility”, and “Comparability”) and the experts’ comments were
considered in order to decide whether the indicator should be maintained or not.
3. For all indicators that were to be maintained, all aspects (attributes and
comments) were revised once again in order to decide whether an indicator
could be kept with or without modifications.
According to the procedure described above, the indicators were classified
into five categories:
• Category 1 -- “Unchanged”: The indicator is maintained without changes
• Category 2 -- “Modified”: The indicator is maintained with some changes
• Category 3 -- “Doubtful”: The indicator is still undecided
• Category 4 -- “Deleted”: The indicator has been removed
• Category 5 -- “Added”: A new indicator is proposed
Indicators classified into Category 1 and Category 2 were maintained
without changes or with some changes proposed by the experts, respectively.
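As a rough sketch, the keep-or-drop part of the decision criteria might be expressed as follows. The function name, the data shapes, and the boolean standing in for the manual review of comments are illustrative assumptions, not the project's actual implementation; the subsequent manual revision (criterion 3 and the “Modified”/“Doubtful”/“Added” categories) is not modelled here:

```python
THRESHOLD = 0.66  # share of "Important"/"Very important" answers

def classify(relevance_share, other_attribute_shares, experts_support_keeping):
    """Apply the first-round decision criteria to one indicator.

    relevance_share         -- share of positive answers for "Relevance"
    other_attribute_shares  -- shares for Validity, Reliability,
                               Feasibility and Comparability
    experts_support_keeping -- overall reading of the experts' comments
                               (a stand-in for the manual review in criterion 2)
    """
    # Criterion 1: high relevance keeps the indicator outright.
    if relevance_share > THRESHOLD:
        return "maintained"
    # Criterion 2: otherwise the remaining attributes and comments decide.
    if all(s > THRESHOLD for s in other_attribute_shares) and experts_support_keeping:
        return "maintained"
    return "deleted"

print(classify(0.80, [0.5, 0.5, 0.5, 0.5], False))  # kept on relevance alone
print(classify(0.50, [0.7, 0.8, 0.7, 0.9], True))   # rescued by other attributes
print(classify(0.50, [0.4, 0.5, 0.3, 0.6], False))  # deleted
```

In the actual process, an indicator surviving this step was then re-read against the comments to decide between the “Unchanged” and “Modified” categories.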
The attributes of Category 3 had to be re-evaluated, since the experts
had not reached a consensus on them in the first round. Indicators classified
into Category 4 were deleted and not proposed for any further consideration in
the next round. Category 5 contained the new indicators proposed by the
experts; these indicators were to be evaluated in the next round.
(3) Reformulation of Indicators
For indicators in Categories 2 and 3 we considered all the suggestions and
comments given by the experts. In particular, we took precise account of the
comments, criticism and suggestions related to the formulation and terminology
used in the indicators’ descriptions. According to this analysis, appropriate
corrections were made.
(4) Results of the 1st Round
The quantitative results, in terms of the number of remaining indicators,
are shown in Table 2. The reduction of the initially proposed indicators is
clearly visible. The rate of reduction is around 25% for the Continuing
Education and Social Engagement dimensions, whereas the drop in the
Technology Transfer and Innovation dimension is more than 47%.
Table 2. Reduction of the Number of Indicators in the First Round of the Delphi
Process. The Results Are Shown Separately for All Three Dimensions
First round
Dimension Initial number of indicators Final number of indicators
Continuing Education 28 21
Social Engagement 31 23
Technology Transfer and Innovation 36 19
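As a quick arithmetic check of the reduction rates quoted above (the dictionary below simply restates Table 2; the variable names are ours):

```python
# Initial and final indicator counts from Table 2 (first Delphi round).
first_round = {
    "Continuing Education": (28, 21),
    "Social Engagement": (31, 23),
    "Technology Transfer and Innovation": (36, 19),
}

for dimension, (initial, final) in first_round.items():
    reduction = (initial - final) / initial
    print(f"{dimension}: {reduction:.1%}")
# Continuing Education and Social Engagement drop by roughly 25%,
# Technology Transfer and Innovation by just over 47%.
```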
3.3.4 The 2nd Delphi Round
For the second round we again prepared three different questionnaires, one
for each of the three dimensions: CE, SE, and TT&I. The main goal of the
second round was to further assess the indicators selected by the procedure
in the first round. The evaluation in the second round was more specific. In
particular, a decision had to be taken for those indicators on which no consensus was
15. Marhl, Pausits: Third Mission Indicators for New Ranking Methodologies 57
reached during the first round. In other words, the indicators classified into
Category 2 and Category 3 in the first round were re-evaluated. In addition, the
new indicators proposed by the experts in the first round were now evaluated in
the second round.
The experts' responses in the second round were analysed and summarised
as part of a decision-making process, in order to determine whether a given
indicator should be retained or not.
The quantitative results, in terms of the number of remaining indicators,
are shown in Table 3. The reduction of the initially proposed indicators is
again clearly visible; in the second round, however, a very high degree of
consensus was achieved: the rate of reduction is only around 15% in all three
dimensions (Continuing Education, Social Engagement, and Technology Transfer
and Innovation). A further positive outcome of the second round is that the
majority of the experts' comments concerned the terminology and interpretation
of a rather small subset of the indicators.
Table 3. Reduction of the Number of Indicators in the Second Round of the Delphi
Process. The Results Are Shown Separately for All Three Dimensions

Dimension                            Initial number of indicators   Final number of indicators
Continuing Education                              21                           18
Social Engagement                                 23                           20
Technology Transfer & Innovation                  19                           16
3.3.5 The 3rd Delphi Round
The third round of the Delphi process was structured differently from the
first and second rounds. In the two previous rounds the experts evaluated the
indicators independently of each other. In contrast, in this round they were
requested to give a global view and opinion on the whole set of indicators
across all three dimensions: Continuing Education, Social Engagement, and
Technology Transfer & Innovation. The experts were requested to assess the
importance and the feasibility of each indicator on a rating scale of 1 to 7,
from the least to the most important and
feasible. The importance ratings provide the basis for identifying the relative
significance of each indicator, while the feasibility ratings provide a contrast
element for further phases of the study.
In the third (in our case the final) round of the Delphi process, all the
experts were requested to rate the importance and the feasibility of the complete
set of 54 indicators, corresponding to the output of the second round. In this
case, too, the questionnaire was divided into three parts according to the three
dimensions.
We analysed the experts' responses. For each indicator we calculated the
mean of the individual values given by the experts, separately for importance
and for feasibility. These mean values can be interpreted directly on the same
1-to-7 scale of the questionnaire, from the least to the most important and
feasible. The results are interesting: all the indicators considered in the third
round are rated highly. All mean values exceed 4, i.e., more than 57% of the
assessment range (1-7). Consequently, no indicator was dropped, and the number
of final (output) indicators equals the number of initial (input) indicators, as
shown in Table 4.
Table 4. The Number of Indicators in the Third Round of the Delphi Process

Dimension                            Initial number of indicators   Final number of indicators
Continuing Education                              18                           18
Social Engagement                                 20                           20
Technology Transfer & Innovation                  16                           16
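The mean-rating step of the third round can be sketched as follows, with hypothetical expert ratings (the study's actual per-expert ratings are not reproduced here). Each indicator's mean is compared against the scale midpoint of 4, which all 54 indicators exceeded:

```python
from statistics import mean

# Hypothetical ratings (1-7 scale) from six experts for ONE indicator;
# illustrative values only, not data from the study.
importance_ratings = [5, 6, 4, 7, 5, 6]
feasibility_ratings = [4, 5, 5, 6, 4, 5]

SCALE_MIN, SCALE_MAX = 1, 7
midpoint = (SCALE_MIN + SCALE_MAX) / 2  # 4.0, the middle of the 1-7 scale

# Mean values are computed separately for importance and feasibility.
imp_mean = mean(importance_ratings)
fea_mean = mean(feasibility_ratings)

# Sketch of an above-midpoint check; in the study, all 54 indicators had
# mean values above 4 and were therefore all retained.
retained = imp_mean > midpoint and fea_mean > midpoint
```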
3.4 Results -- The Current Set of Indicators
The final results of the third round of the Delphi process provided us with
54 indicators: 18 in the dimension Continuing Education, 20 in the dimension
Social Engagement, and 16 in the dimension Technology Transfer & Innovation.
In the following we give the complete list of indicators, organised in three
lists, one for each dimension (Final Report of Delphi Study, 2011).
The following indicators were selected for the CE dimension:
• CE is included in the mission of the HEI
• CE is included in the policy and/or the strategy of the HEI
• Existence of an institutional plan for CE in the HEI
• Existence of a quality assurance procedure for CE activities
• Total number of CE programmes active for implementation in that year
• Number of CE programmes delivered which carry a major award under the higher
education system
• Number of CE programmes delivered in partnership with public and private
businesses in that year
• Percentage of international CE programmes delivered in that year
• Percentage of funded CE training projects delivered in that year
• Total number of the ECTS credits of the delivered CE programmes
• Number of ECTS credits enrolled
• Number of registrations in CE programmes in that year
• Percentage of CE ECTS credits enrolled relative to the total ECTS credits enrolled
• Percentage of qualifications issued relative to total CE registrations
• Student satisfaction
• Key stakeholder satisfaction
• Completion rate for all programmes (on average)
• Percentage of CE programmes with external accreditations
The following indicators were selected for the TTI dimension:
• TTI is included in the mission of the HEI
• TTI is included in the policy and/or strategy of the HEI
• Existence of an institutional action plan for TTI in the HEI
• Number of licences, options and assignments (active and executed, exclusive
and non-exclusive) to start-ups or spin-offs and existing companies
• Total budget coming from revenues from commercialisation of HEI
knowledge
• Number of start-ups and spin-offs established
• Number of creative commons and social innovation projects that HEI
employees are involved in
• Number of R&D sponsored agreements, contracts and collaborative projects
with non-academic partners
• Percentage of HEI budget coming from income of R&D sponsored contracts
and collaborative projects with non-academic partners
• Number of consultancy contracts
• Percentage of postgraduate students and postdoctoral researchers directly
funded or co-funded by public and private businesses
• Number of created (co-funded) or shared laboratories and buildings
• Number of companies participating in continuous professional development
courses (CPD)
• Number of HEI employees with temporary positions outside of academia
• Number of non-academic employees with temporary positions at HEIs
• Number of postgraduate theses or projects with non-academic co-supervisors
• Number of joint publications with non-academic authors
• Number of academic staff participating in professional bodies, networks,
organizations and boards
• Number of external organizations or individuals participating in advisory,
steering, validation, or review boards of HEIs, institutes, centres, or taught
programmes
• Number of prestigious innovation prizes awarded by business and public
sector associations or funding agencies (national and international)
The following indicators were selected for the SE dimension:
• SE is included in the mission of the HEI
• SE is included in the policy and/or strategy of the HEI
• Existence of an institutional action plan for SE in the HEI
• Budgetary assignment to SE
• Percentage of academics involved in volunteering advisory activities
• Number of events open to community/public
• Number of research initiatives with direct impact on the community
• Number/cost of staff/student hours made available to deliver services and
facilities to the community
• Number of people attending/using facilities
• Number of projects related to educational outreach
• Number of faculty staff and students involved in educational outreach activity
• Percentage of HEI budget used for educational outreach
• Number of community participants in educational outreach activity
• Number of activities specifically targeting disadvantaged students/community
groups
• Number of community representatives on the boards or committees
• Amount of grants/donations/contracts arising from engaged partnerships
4. Conclusions
In this paper the Third Mission of universities is described as a strategy
for developing entrepreneurial universities based on activities in three different
dimensions. In particular, the concepts and current state of the European
project E3M are discussed in relation to Third Mission frameworks. The current
results of the E3M project come from the recently finalised Delphi process,
which was used to formulate the final set of indicators for measuring the
Third Mission activities of universities. Moreover, the Delphi process also
served to demonstrate the usefulness of the method for refining the initial
collection of indicators. It should be emphasised, however, that the purpose
of the Delphi process was not merely to limit the number of indicators and
make a reasonable selection from the initial set. As a result of the procedure,
the final indicators were also rated above the scale midpoint with respect to
their importance. This is in fact a natural consequence of the three-round
Delphi process we used. At the same time it is clear that our approach is
a qualitative one and therefore needs further field testing. Most of the
indicators are descriptive, which is partly the result of an international
discussion about the Third Mission and is influenced by the project goals. Our
work should be seen as an additional step towards a further understanding of
the Third Mission, not as a final manifesto. Our motivation is to give the
scientific discussion, as well as practical views on the Third Mission, a firmer
foundation for completing the "puzzle of the Third Mission". The list of
dimensions and related indicators therefore provides a first operative and
process-oriented perspective, which is a sign of work towards a common
understanding. Readers may debate the set of indicators because the list misses
some important elements. Elements such as the contribution of popularising
science through popular journals, or the contribution of the arts and
humanities, would also be important additions. Further definitions, such as
what continuing education means in different national systems, also need
clarification. For the future, collecting and providing more data and
information on the Third Mission, within HEIs as well as from HEIs, will be an
important task.
To summarise, we would like to stress the very positive experiences and
results of our study. In particular, different properties of indicators, such as
relevance and feasibility, have proved useful for rating different aspects of the
interest and importance of the information handled. On the other hand, we still
see some gaps and weaknesses, which need to be carefully considered within our
project and in any future studies. Looking at all three dimensions (CE, SE, and
TTI), there is general agreement that the CE indicators are the most feasible,
whereas there are some doubts about the feasibility of some SE indicators.
Moreover, the CE indicators are in most cases well defined and well accepted;
the SE indicators, on the other hand, are not always easy to define, and some
aspects of SE activities are not even easy to formulate. In addition, some
aspects are difficult to measure and quantify. We would therefore like to
emphasise that quantification should not be placed in the foreground; we should
prevent a situation in which particular aspects, including some very important
issues of the Third Mission of universities, are avoided only because they are
not easy to measure. As our results reflect, the aim of our study was to
establish a well-defined set of indicators for the Third Mission of
universities. The relevance of these indicators should not be judged on the
basis of their quantifiability, i.e., whether a particular indicator can easily
be measured and quantified or not. The indicators should instead be evaluated
and assessed on the basis of their importance and feasibility.
The aim of further activities in the field is to establish a comprehensive
way of assessing the Third Mission activities of universities. We cannot be
satisfied with a partial set of indicators that would provide us with only
partial and mainly quantitative measures. The intention must be to evaluate the
Third Mission activities of universities in a broader sense and to include
qualitative aspects as well, even though they may require additional effort to
be assessed appropriately. In the future we also need to be aware that the
aspects being assessed, e.g., in the form of indicators and other measures,
have to be continuously improved. The most obvious task is looking for missing
items and completing the current set of indicators. This should be treated as a
flexible process of improvement. Therefore, at the current stage of the E3M project,
additional aspects are continuously discussed. Through institutional visits,
international conferences, publications, and other forms of peer-oriented
communication, we are improving the current set of indicators and looking for
new ideas. Ideas currently under consideration include, for example, additional
dimensions of social engagement contributing particularly to the arts and
humanities and to regional/national society, or the contribution of popularising
science (e.g., through publications in popular journals and newspapers). In the
dimension of technology transfer, the cooperation between universities and
industry in co-patents, the institution's licence income, and similar aspects
are also important. Many other issues are being discussed at present, and we
need additional analyses and further peer discussions in order to be able to
create a professional and comprehensive way of measuring the Third Mission
activities of universities. But there is no doubt that we need it.
References
Arbo, P., & Benneworth, P. (2007). Understanding the regional contribution of
higher education institutions: A literature review. [Online] Retrieved June,
2011, from http://www.oecd.org/dataoecd/55/7/37006775.pdf
Clark, B. (1998). The entrepreneurial university: Demand and response. Tertiary
Education and Management, 4, 5-16.
Cuhls, K. (2011, April 17). Delphi method. [Online] Retrieved June, 2011, from
http://www.unido.org/fileadmin/import/16959_DelphiMethod.pdf
Deutscher Bildungsrat. (1970). Empfehlungen der Bildungskommission:
Strukturplan für das Bildungswesen. Stuttgart: Dt. Bildungsrat.
Etzkowitz, H. (2001). Innovation in innovation: The triple helix of university-
industry-government relations. Social Science Information, 42, 293-337.
Etzkowitz, H., & Leydesdorff, L. (1997). Introduction: Universities in the global
knowledge economy. In H. Etzkowitz & L. Leydesdorff (Eds.), Universities
and the global knowledge economy: A triple helix of university-industry-
government relations (pp. 1-8). London: Pinter.
Final Report of Delphi Study. (2011). E3M project -- European indicators
and ranking methodology for university Third Mission (Project No.
143352-LLP-1-2008-1-ES-KA1-KA1SCR). Valencia, Spain: Center for
Quality and Change Management.
Hansen, U. (1999). Die Universität als Dienstleister: Thesen für ein
leistungsfähigeres Management von Hochschulen. In B. Stauss, I.
Balderjahn, & F. Wimmer (Eds.), Dienstleistungsorientierung in der
universitären Ausbildung: Mehr Qualität im betriebswirtschaftlichen Studium
(pp. 369-385). Stuttgart: Schäffer-Poeschel Verlag.
Hödl, E., & Zegelin, W. (1999). Hochschulreform und Hochschulmanagement.
Marburg: Metropolis Verlag.
Molas-Gallart, J., Salter, A., Patel, P., Scott, A., & Duran, X. (2002). Measuring
third stream activities -- Final report to the Russell Group of Universities.
Brighton, UK: University of Sussex, SPRU.
Montesinos, P., Carot, J. M., Martinez, J.-M., & Mora, F. (2008). Third Mission
ranking for world class universities: Beyond teaching and research. Higher
Education in Europe, 33, 259-271.
Pausits, A. (2006). Student relationship management. Flensburg: Universität
Flensburg.