The three PCAs were conducted on Likert-type scales ranging from 1 to 5.

Two types of scales were used:

1. Not important to very important (PCA
on functions)

2. Totally disagree to totally agree (PCA
on embeddedness and on project
performance)

The reliability of the factor data was tested using Cronbach's alpha; values greater than 0.6 were considered reliable (Cronbach, 1951), and factor loadings greater than 0.5 were considered significant (Hair et al., 2010). Items not meeting these two threshold levels were considered orphans and are listed in the respective tables but were excluded from further analyses. The Kaiser-Meyer-Olkin (KMO) measure was used to assess sampling adequacy. All KMO values are above 0.7 (Hair et al., 2010).
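The reliability check described above can be sketched in a few lines. This is a minimal illustration assuming numpy; the response matrix is hypothetical placeholder data, not the study's survey responses:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) matrix of scale scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 1-5 Likert responses: 30 respondents x 4 items sharing a
# common component (illustrative only, not the study's data)
rng = np.random.default_rng(0)
base = rng.integers(1, 6, size=(30, 1))
noise = rng.integers(-1, 2, size=(30, 4))
responses = np.clip(base + noise, 1, 5).astype(float)

alpha = cronbach_alpha(responses)
print(f"alpha = {alpha:.3f}")  # compared against the 0.6 threshold
```

A scale whose alpha falls below the 0.6 cutoff would, per the procedure above, be flagged rather than carried into further analysis.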

The second type of multivariate method comprised two nonparametric methods:

1. Nonparametric analysis of variance: one-way analysis of variance using the Kruskal-Wallis test (Kerlinger & Lee, 2000). This method was performed to see whether differences between the four types of projects are globally significant.
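The Kruskal-Wallis comparison across the four project types can be sketched with `scipy.stats.kruskal`. The scores below are hypothetical placeholders, not the study's data:

```python
from scipy.stats import kruskal

# Hypothetical performance scores for four project types (illustrative only)
type_a = [3.2, 3.8, 4.1, 3.5, 3.9]
type_b = [2.9, 3.1, 3.4, 2.8, 3.0]
type_c = [4.2, 4.5, 3.9, 4.4, 4.1]
type_d = [3.0, 3.3, 3.6, 3.1, 3.2]

# H-statistic and p-value; a small p suggests the four groups do not
# all share the same distribution of scores
stat, p = kruskal(type_a, type_b, type_c, type_d)
print(f"H = {stat:.2f}, p = {p:.4f}")
```

Because the test ranks all observations jointly, it makes no normality assumption about the underlying Likert-derived scores, which is why it suits ordinal survey data.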

Results from Factor Analysis

Three different factor analyses were performed for the purpose of summarization and reduction: functions, embeddedness, and project performance.

Factor Functions

Functions constitute one component of the conceptual framework (see Figure 1) and refer to what the PMO is performing. Two different sets of questions were asked regarding the 27 functions: the importance and the degree of realization. Several factor analysis strategies were performed using these two sets of answers. In our judgment, the framework that better fits the context of this research is based on the question of the degree of realization (rather than importance). However, five functions had to be excluded from this analysis for three reasons: (1) some functions did not load into a factor; (2) some functions loaded into more than one factor; and (3) some functions loaded into a factor that made interpretation much more difficult. Ultimately, 22 functions loaded into five factors, each comprising a group of functions (as shown in Table 3).
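The loading-based screen described above (items dropped for not loading, or for cross-loading, at the 0.5 cutoff) can be sketched as follows. This is an illustration on random placeholder data with 8 items rather than the study's 27 functions; the loading computation assumes principal components extracted from the covariance matrix:

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical 1-5 ratings: 60 respondents x 8 functions (illustrative only)
rng = np.random.default_rng(1)
X = rng.integers(1, 6, size=(60, 8)).astype(float)

pca = PCA(n_components=3)
pca.fit(X)

# Loadings as item-component correlations:
# eigenvector * sqrt(eigenvalue) / item standard deviation
loadings = pca.components_.T * np.sqrt(pca.explained_variance_)
loadings /= X.std(axis=0, ddof=1)[:, None]

# Apply the 0.5 screen used in the text
for i, row in enumerate(loadings):
    strong = np.flatnonzero(np.abs(row) >= 0.5)
    if strong.size == 0:
        print(f"function {i}: orphan (no loading >= 0.5)")
    elif strong.size > 1:
        print(f"function {i}: cross-loads on components {list(strong)}")
```

With random data most items fall into the "orphan" bucket; in the study, items in either bucket were excluded before interpreting the retained factors.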

The first group, labeled 'Project performance and portfolio management,' brings together the monitoring and controlling of a single project with other functions associated with the management of a portfolio of projects. Unger, Gemünden, and Aubry (2012) have related this function to a controller role in the PMO. In the public context, it seems that there is no locus for individual project monitoring and control outside portfolio management. This observation is complemented by the exclusion of two orphan functions: 'Implement and operate a project information system' and 'Develop and maintain a project scorecard.' Interpretation of this result puts into question the quality of the monitoring and control of individual projects in public administration. Alternately, the inclusion of benefits management aligns with a view of portfolio management as a means of delivering expected benefits within a strategic process (Zwikael & Smyrk, 2012).

The second group of functions, 'Methodologies and competencies,' is quite coherent with previous models (e.g., Hobbs & Aubry, 2007), where two different types of activities load together: activities related to standardization and activities related to the development of project management competencies. The combination of both makes sense because training people in methodologies and standards will certainly lead to their better use. Noteworthy is that networking and environmental scanning loaded into this group; it is quite interesting that the public administration includes openness to external views that can incorporate innovation into more conventional approaches to projects. In this particular public setting, there is a strong community of practice dedicated to project management, where practices are consistently discussed among practitioners and where researchers are invited to present their results. One function