• Each Standard comprises sub-components that are evaluated and
scored against expectations:

■ Strong (7.5-10)

■ Fair (5.0-7.4)

■ Weak (2.5-4.9)

■ Critical (0.0-2.4)

• Weighted sum of sub-components within each Standard equals up to
10 points

• No penalty if unable to score a sub-component, or if a whole Standard is deemed not to apply to the JV

• Time dedication – Do members of the
asset team dedicate enough time to
do their work well, to understand the
bigger picture surrounding the asset,
and to feel real accountability as part
of a team – or is everyone involved
while nobody is accountable?

• Continuity – Do the members of the
asset team stay involved with the asset
long enough to nurture effective relationships with peers in the Operator,
and understand the deeper nuances
of the asset – or do they rotate so often
that the Operator feels whiplash, and
nobody has a holistic picture of the
asset?

• Location – Are team members located
close enough to collaborate consistently and effectively as a team? Are
they also located close enough to the
Operator to spend real time there, rather than trying to influence from another continent via scratchy video
conferencing?

• Orientation – Across the year, do key
asset team members (e.g., asset manager, technical leads) actually spend
real time engaging with the Operator,
and not purely focused on internal
meetings and information reporting?

• Reporting relationship – Does the asset
manager actually have a team reporting to them? Is the asset manager fully
accountable for setting goals and objectives for team members who spend
more than half their time on the asset,
even if they belong to another
function?

When applied against our Standards,
the answers to these questions can be
surprising. One mining major found it
difficult to exert any influence in Operating Committee Meetings (OCMs) as a
non-operator. The cause? It tended to
appoint OCM representatives from junior
commercial staff, while the Operator and
other non-operators sent senior experts
capable of complex technical discussions.
Or consider an IOC where teams generally
spent no more than 10% of their time
actually working with the Operator’s
team, which is below the industry benchmark and far below the 25% minimum
that we test for. This IOC had a voracious

ASSET TEAM PROFILE

Of the 15 Standards, one relates to the profile of the non-operated team – its size,
functional mix, seniority, time dedication, location, reporting relationships, etc. Team
profile can make all the difference in the effectiveness of a non-operator. A small,
technically oriented team with relatively senior resources and strong communication
and influencing skills usually outperforms a team that is twice as large, but half as
capable and spread across locations.

When we test non-operated asset teams against this Standard, we look at eight
different components to reveal the full picture (Table 2). In the most general sense,
these components each answer a different question:

• Size – Does the number of FTEs (Full Time Equivalents) allocated to the asset reflect
its current value and risk profile, and the company's ability to influence – or is it
a product of historical needs or other less relevant factors?

• Functional mix – Given the Operator’s relative capabilities, the profile of upcoming
projects in the asset, and the company’s own strengths, does the asset team have
the right mix of functional support to do things that would generate influence and
add value?

• Seniority – Does the team’s seniority profile match up with the Operator’s team
profile, our influencing objectives, and the technical complexity of the asset (e.g., if
the Operator is senior-heavy, are we sending junior people to try and influence
them)?