The primary aim of this study is to investigate the contributions of the CEFR linking process, as stipulated by the Manual for Relating Examinations to the Common European Framework of Reference for Languages: learning, teaching, assessment (Council of Europe, 2003), to the validation argument of a university-level English language proficiency examination. It also aims to explore the impact of the linking process on the pre-determined, or desired, level of the examination under study. The study uses both qualitative and quantitative methods to address these aims and comprises three phases. Phase 1 explores each stage of the CEFR linking process as it is carried out, drawing on field notes, interviews, questionnaires and statistics, in order to investigate how well the Manual's suggestions capture aspects of validity and guide users in this respect. Phase 2 focuses on an overall investigation of the process through a questionnaire administered after all stages of linking, viz. familiarisation, specification, standardisation and empirical validation, have been conducted. Finally, Phase 3 examines the Manual itself and its suggestions with respect to validation through a critical analysis of the Manual, a questionnaire and interviews. The study showed that the CEFR linking process helps users focus particularly on the context, cognitive and scoring aspects of validity at all stages, but most strongly at the standardisation stage. Provided that data are accumulated systematically across the different stages, those undertaking a linking study can, at the end of the process, put forward a complete validation argument for the examination in question. However, the Manual fails to provide a model that guides users in this respect. The process also highlights areas to be considered should users set out to design an examination, or modify an existing one, to measure at the set desired standards.