BACKGROUND Arguably the most important opportunity to acquire the standards and norms of any discipline, and to develop researchers' judgement, is the peer review process, and this is probably particularly true in an emerging discipline such as engineering education. Ironically, research in many disciplines has established that the review process is deeply flawed in conception as well as (often) in operation, with the American Medical Association asserting that if peer review were a drug it would never be allowed onto the market. And yet university ranking systems for published research, on which all of our careers depend, rely on this flawed instrument. With this in mind, we have been examining how members of our community (AAEE) give and respond to reviews, with a view to making the process more useful.
PURPOSE Reviewing is an inexact and subjective process, so it would be misguided to think that inter-rater reliability or some notion of objective 'truth' can somehow be attained. Instead, we ask what reviewers need to do to provide helpful advice that can shape norms and standards in the field.
DESIGN/METHOD Previous work (Willey et al., 2011; Jolly et al., 2011) suggested a need for well-expressed criteria that would guide authors on what a publication should contain and guide reviewers in how judgements should be made. With the help of a Delphi panel of 12 international researchers in the field, a set of criteria was developed. Volunteers were then sought to apply the criteria to sample texts in an online tool (SPARKplus). Individual interviews with some respondents were then used to clarify participants' understandings and goals.
RESULTS The criteria developed by the Delphi panel are those being used for this conference. The panel members particularly approved the 'comments' accompanying the criteria, which were intended primarily as guidance to authors about acceptable practice.
Anecdotal evidence to date suggests that authors find the criteria and comments clarify expectations, but the matter of standards will remain. The use of the criteria in the second stage, and the analysis of the discussions in particular, will produce information both about present expectations and practices and about visions of future growth and improvement.
CONCLUSIONS Our analysis of the stage 2 data will aim to describe consensus on research quality and on how the peer review process can help attain it, in the form of recommendations for the future application of the criteria in journals as well as at conferences. The international experts from our Delphi panel have expressed an interest in being involved in stage 2 and in being informed about the outcomes, so the potential also exists for this community to develop best-practice peer review in engineering education through the sharing of their expertise in this way.


dc.format

Shannon Brown


dc.publisher

Swinburne University of Technology


dc.relation.ispartof

Proceedings of the 23rd Annual Conference for the Australasian Association for Engineering Education - The Profession of Engineering Education: Advancing Teaching, Research and Careers

