The Research Excellence Framework (REF)


Overview: Purpose of the REF
The REF is a process of expert review. It replaces the RAE as the UK-wide framework for assessing research in all disciplines. Its purpose is:
- To inform research funding allocations by the four UK HE funding bodies (approximately £2 billion per year)
- To provide accountability for public funding of research and demonstrate its benefits
- To provide benchmarks and reputational yardsticks

Overview: The assessment framework
Overall quality is assessed through three weighted elements:
- Outputs (65%): maximum of 4 outputs per researcher
- Impact (20%): impact template and case studies
- Environment (15%): environment data and template

Overview: Guidance and criteria
Comprehensive information and guidance is set out in:
- Assessment framework and guidance on submissions (July 2011): sets out the information required in submissions and the definitions used
- Panel criteria and working methods (January 2012): sets out how panels will assess submissions; refined following consultation in 2011
These documents set out the official guidelines for the REF. These slides provide a summary of key points but do not provide or replace the official guidelines.

Overview: Submissions
Each HEI may submit in any or all of the 36 units of assessment (UOAs). Each submission in a UOA provides evidence about the activity and achievements of a submitted unit, including:
- Staff details (REF1a/b/c)
- Research outputs (REF2)
- Impact template and case studies (REF3a/b)
- Environment data (REF4a/b/c)
- Environment template (REF5)
A submitted unit may, but need not, comprise staff who work within a single department or organisational unit.

Overview: Publication of results
The primary outcome of the REF is an overall quality profile awarded to each submission, e.g. 23% 4*; 57% 3*; 20% 2*. Further reports and feedback will be provided:
- Overview reports by panels
- Concise feedback on submissions, to the heads of HEIs
- The output, impact and environment sub-profiles for each submission will be published
- A report by the Equality and Diversity Advisory Panel
Submissions will be published (except for confidential or sensitive information).

Overview: Example of a quality profile
The overall quality profile is the aggregate of the weighted sub-profiles produced for outputs, impact and environment.

% of research activity at each quality level:

Profile        Weight    4*     3*     2*     1*    U
Overall          -       12     37     41     10    0
Outputs         65%      12.8   32.8   43     11.4  0
Impact          20%      20     45     35     0     0
Environment     15%      0      40     40     20    0
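The arithmetic behind the overall profile is a weighted sum of the sub-profiles at each quality level. A minimal Python sketch, using the illustrative figures from the slide above (not an official REF tool):

```python
# Minimal sketch: combining REF sub-profiles into an overall quality
# profile by weighted sum. Figures are the illustrative ones from the
# slide above.

WEIGHTS = {"outputs": 0.65, "impact": 0.20, "environment": 0.15}

SUB_PROFILES = {
    #               4*     3*    2*    1*    U
    "outputs":     [12.8, 32.8, 43.0, 11.4, 0.0],
    "impact":      [20.0, 45.0, 35.0,  0.0, 0.0],
    "environment": [ 0.0, 40.0, 40.0, 20.0, 0.0],
}

def overall_profile(subs, weights):
    """Weighted sum of the sub-profiles at each quality level."""
    levels = len(next(iter(subs.values())))
    return [
        sum(weights[name] * profile[level] for name, profile in subs.items())
        for level in range(levels)
    ]

print([round(x) for x in overall_profile(SUB_PROFILES, WEIGHTS)])
# -> [12, 36, 41, 10, 0]; the slide's overall row shows 37 at 3*, so its
#    sub-profile figures were presumably rounded before aggregation.
```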

Overview: Timetable
2011:
- Panels appointed (Feb)
- Guidance on submissions published (Jul)
- Draft panel criteria for consultation (Jul)
- Close of consultation (5 Oct)
2012:
- Panel criteria published (Jan)
- HEIs submit codes of practice (by Jul)
- Pilot of submissions system (Sep)
- HEIs may request multiple submissions (by Dec)
- Survey of HEIs' submission intentions (Dec)
2013:
- Launch of REF submissions system (Jan)
- Additional assessors appointed to panels
- Staff census date (31 Oct)
- Submissions deadline (29 Nov)
2014:
- Panels assess submissions
- Publish outcomes (Dec)

REF panels: Main and sub-panel roles
There are 36 sub-panels working under the guidance of 4 main panels. Membership is published at www.ref.ac.uk
Sub-panel responsibilities:
- Contributing to the main panel criteria and working methods
- Assessing submissions and recommending the outcomes
Main panel responsibilities:
- Developing the panel criteria and working methods
- Ensuring adherence to the criteria/procedures and consistent application of the overall assessment standards
- Signing off the outcomes

REF panels: Main Panel B
- 7 Earth Systems and Environmental Sciences
- 8 Chemistry
- 9 Physics
- 10 Mathematical Sciences
- 11 Computer Science and Informatics
- 12 Aeronautical, Mechanical, Chemical and Manufacturing Engineering
- 13 Electrical and Electronic Engineering, Metallurgy and Materials
- 14 Civil and Construction Engineering
- 15 General Engineering

REF panels: Main panel working methods
- Each main panel has developed a consistent set of criteria for its group of sub-panels
- Each main panel will guide its sub-panels throughout the assessment phase, ensuring adherence to the published criteria and consistent application of the overall standards of assessment
- Main panels will undertake calibration exercises and keep the emerging outcomes under review
- Main panel international and user members will be engaged at key stages across the sub-panels

REF panels: Sub-panel working methods
- Sub-panels will review their expertise to ensure appropriate coverage
- Work will be allocated to members/assessors with appropriate expertise
- Each sub-panel will run calibration exercises for outputs and impacts, guided by the main panels
- All outputs will be examined in sufficient detail to contribute to the formation of the outputs sub-profiles
- Each case study will normally be assessed by at least one academic and one user
- Graduated sub-profiles will be formed for each aspect of submissions

REF panels: Additional assessors
Additional assessors will be appointed to extend the breadth and depth of panels' expertise:
- Both academic assessors (to assess outputs) and user assessors (to assess impacts) will be appointed
- Assessors will play a full and equal role to panel members in developing either the outputs or impact sub-profiles. They will be fully briefed, take part in calibration exercises and attend the relevant meetings
- Some appointments will be made in 2012 where a clear gap has already been identified
- Further appointments will be made in 2013, in the light of the survey of institutions' submission intentions

Outputs: Citation data
- Main Panel B will make use of citation data to assist assessments
- Citation data will be used as a minor component to inform peer review
- HEIs will be provided access to the Scopus data via the REF submission system
- The funding bodies do not sanction or recommend that HEIs rely on citation data to inform the selection of staff or outputs for their REF submissions
- Google Scholar data will NOT be used in the assessment and should not be included in additional information

Outputs: Assessment criteria
- The criteria for assessing the quality of outputs are originality, significance and rigour
- Each panel provides further explanation of how it will interpret these criteria
- Panels will assess the quality of outputs, not the contribution of individual researchers to the submission
- Panels will examine all outputs in sufficient detail to contribute to the formation of a robust outputs sub-profile that represents all the outputs listed in a submission

Outputs: Assessment criteria
The criteria for assessing the quality of outputs are originality, significance and rigour.*
- Four star: Quality that is world-leading in terms of originality, significance and rigour
- Three star: Quality that is internationally excellent in terms of originality, significance and rigour but which falls short of the highest standards of excellence
- Two star: Quality that is recognised internationally in terms of originality, significance and rigour
- One star: Quality that is recognised nationally in terms of originality, significance and rigour
- Unclassified: Quality that falls below the standard of nationally recognised work, or work which does not meet the published definition of research for the purposes of this assessment
* Each main panel provides a descriptive account of the criteria

Impact: Definition of impact
Impact is defined broadly for the REF: "an effect on, change or benefit to the economy, society, culture, public policy or services, health, the environment or quality of life, beyond academia"
- Panels recognise that impacts can be manifest in a wide variety of ways, may take many forms and occur in a wide range of spheres, in any geographic location
- Panels provide examples of impact relevant to their disciplines, intended to stimulate ideas, not as exhaustive or prescriptive lists

Impact: Some examples of impact
- Improved health or welfare outcomes
- Improved quality, accessibility or efficiency of a public service
- Changes to the design or delivery of the school curriculum
- Policy debate or decisions have been influenced or shaped by research
- Organisations have adapted to changing cultural values
- Enhanced corporate social responsibility policies
- A new product has been commercialised
- Enhanced professional standards, ethics, guidelines or training
- Production costs have been reduced
- Jobs have been created or protected
- Levels of waste have been reduced
- More effective management or workplace practices
- Enhanced preservation, conservation or presentation of cultural heritage
- New forms of artistic expression or changes to creative practice
- Improved risk management
- Improved business performance
- Research has enabled stakeholders to challenge conventional wisdom
- Improved access to justice, employment or education
- Research has informed public understanding, values, attitudes or behaviours
- The policies or activities of NGOs or charities have been informed by research
- Changes in professional practice
- Public debate has been shaped or informed by research
- A social enterprise initiative has been created
- Improved forensic methods or expert systems
- Changes to legislation or regulations
- Improved management or conservation of natural resources
- Enhanced technical standards or protocols

Impact: Submission requirements
Impact template (REF3a), 20% of the impact sub-profile. Sets out the submitted unit's general approach to supporting impact from its research:
- Approach to supporting impact during the period 2008 to 2013
- Forward strategy and plans
Case studies (REF3b), 80% of the impact sub-profile. Specific examples of impacts already achieved that were underpinned by the submitted unit's research:
- 1 case study per 10 FTE staff submitted (plus 1 extra); see the sketch after this list
- Impacts during 2008 to 2013, underpinned by research since 1993
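As a rough illustration of the headcount rule exactly as stated on this slide (one case study per 10 FTE submitted, plus one extra), a minimal Python sketch. Note that the official guidance expresses the requirement as FTE bands, which can differ from this simplification near band boundaries, so treat this as illustrative only:

```python
import math

def case_studies_required(fte: float) -> int:
    """Rough count of impact case studies for a submission, reading the
    slide literally: one case study per 10 FTE submitted, plus one extra.
    NOTE: the official REF guidance defines this via FTE bands, which may
    differ near band boundaries; consult it for real submissions."""
    if fte <= 0:
        raise ValueError("FTE must be positive")
    return math.floor(fte / 10) + 1

# e.g. a unit submitting 30 FTE would prepare around 4 case studies
print(case_studies_required(30))  # -> 4
```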

Impact: Case studies
Each case study should:
- Clearly describe the underpinning research, who undertook it and when
- Provide references to the research and evidence of its quality
- Explain how the research led or contributed to the impact
- Clearly identify the beneficiaries and define the impact
- Provide evidence/indicators of the impact
- Provide independent sources of corroboration
All the material required to make a judgement should be included in the case study. Submitted case studies need not be representative of activity across the unit: pick the strongest examples.

Impact: Evidence of impact
- Case studies should provide a clear and coherent narrative linking the research to the impact, including the evidence most appropriate to the case being made
- Evidence may take many different forms, including quantitative (where possible) and qualitative. Panels provide examples, which are not exhaustive or prescriptive
- Key claims should be capable of verification. Independent sources of corroboration should be listed, to be used for audit purposes

Impact: Assessment criteria
- The criteria for assessing impact are reach and significance
- In assessing a case study, the panel will form an overall view about the impact's reach and significance taken as a whole, rather than assess each criterion separately
- Reach is not a geographic scale. Sub-panels will consider a number of dimensions to the reach, as appropriate to the nature of the impact
- In assessing the impact template, the panel will consider the extent to which the unit's approach is conducive to achieving impacts of reach and significance

Impact: Assessment criteria
The criteria for assessing impacts are reach and significance.*
- Four star: Outstanding impacts in terms of their reach and significance
- Three star: Very considerable impacts in terms of their reach and significance
- Two star: Considerable impacts in terms of their reach and significance
- One star: Recognised but modest impacts in terms of their reach and significance
- Unclassified: The impact is of little or no reach and significance; or the impact was not eligible; or the impact was not underpinned by excellent research produced by the submitted unit
* Each main panel provides a descriptive account of the criteria

Environment: Assessment criteria
The criteria for assessing the environment are vitality and sustainability.*
- Four star: An environment that is conducive to producing research of world-leading quality, in terms of its vitality and sustainability
- Three star: An environment that is conducive to producing research of internationally excellent quality, in terms of its vitality and sustainability
- Two star: An environment that is conducive to producing research of internationally recognised quality, in terms of its vitality and sustainability
- One star: An environment that is conducive to producing research of nationally recognised quality, in terms of its vitality and sustainability
- Unclassified: An environment that is not conducive to producing research of nationally recognised quality
* Each main panel provides a descriptive account of the criteria

Further information
- www.ref.ac.uk (includes all relevant documents)
- Enquiries from staff at HEIs should be directed to their nominated institutional contact (see www.ref.ac.uk for a list)
- Other enquiries to info@ref.ac.uk

Outputs Submitted to Computer Science and Informatics UoA in RAE2008

Type          Number      %     4*    3*    2*    1*
Journal         4970   66.3%   22%   47%   27%    4%
Conference      1990   26.5%   16%   40%   33%   11%
Chapter          199    2.6%    5%   38%   37%   20%
Internet Js      155    2.1%    8%   58%   28%    5%
Book              75    1.0%   49%   35%    9%    7%
Software          33    0.4%   39%   45%   12%    3%
Exhibition        19    0.3%   11%   53%   26%   11%
Patent            18    0.2%   22%   39%   17%   17%
Ed Book            9    0.1%   11%   22%   22%   44%
Overall         7492           20%   45%   28%    6%

Number of different journals submitted: 1247
Number of journals with <5 outputs: 976
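The Overall row is (to rounding) the count-weighted average of the per-type quality distributions. A minimal Python check, using the figures from the table above:

```python
# Check that the Overall row follows from the per-type rows above as a
# count-weighted average (figures from the RAE2008 table; the per-type
# percentages are themselves rounded, so expect small discrepancies).

rows = {  # type: (count, [4*, 3*, 2*, 1*] percentages)
    "Journal":     (4970, [22, 47, 27, 4]),
    "Conference":  (1990, [16, 40, 33, 11]),
    "Chapter":     (199,  [5, 38, 37, 20]),
    "Internet Js": (155,  [8, 58, 28, 5]),
    "Book":        (75,   [49, 35, 9, 7]),
    "Software":    (33,   [39, 45, 12, 3]),
    "Exhibition":  (19,   [11, 53, 26, 11]),
    "Patent":      (18,   [22, 39, 17, 17]),
    "Ed Book":     (9,    [11, 22, 22, 44]),
}

total = sum(count for count, _ in rows.values())
overall = [
    sum(count * dist[i] for count, dist in rows.values()) / total
    for i in range(4)
]
print(total, [round(x) for x in overall])
# -> 7492 [20, 45, 29, 6]: within rounding of the table's Overall row
#    (20, 45, 28, 6), since the per-type percentages were rounded first.
```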

Recent Developments
- The survey of submission intentions indicates 93 institutional submissions to UoA 11
- 15% rise in Category A staff over RAE2008
- Hence we are expecting about 9200 outputs and 300 case studies
- The existing panel of 21 members will be increased by 3, plus 9 additional impact assessors ("users")

Submissions
Additional information for each output should include a number in angle brackets indicating the main ACM classification of the output, e.g. "<01> This paper.." or "<18> This paper..". The list of topics is on the REF web site: http://www.ref.ac.uk/subguide/submissionsystemdatarequirements/
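As a quick illustration of this markup (the tag format is as described above; the helper name and sample text are hypothetical), a minimal Python sketch for pulling the classification number out of an output's additional-information field:

```python
import re

# Hypothetical helper: extract the leading ACM classification tag,
# e.g. "<18>", from an output's additional-information text, as
# described on this slide.
TAG = re.compile(r"^\s*<(\d{2})>")

def acm_classification(additional_info: str) -> str | None:
    """Return the two-digit ACM classification code, or None if absent."""
    match = TAG.match(additional_info)
    return match.group(1) if match else None

print(acm_classification("<18> This paper presents ..."))  # -> "18"
print(acm_classification("No tag here"))                   # -> None
```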

Panel Working Methods
- Early January 2014: calibration meeting based on real REF submissions
- Six other formal meetings, some lasting multiple days, during 2014
- Each output read by 3 people (expertise selected by ACM classification), automatically allocated
- Do use the additional information section to point to originality, significance and rigour!

Conclusion
- The REF is a quality assessment, and we are used to undertaking quality assessments!
- Environment: go through the list in the panel criteria and working methods document (over 30 items) to arrive at a quality profile
- Impact: broad assessment of reach and significance
- Outputs: make good use of the additional information (300 words) and check the criteria

UKCRC Report
UKCRC is an expert panel of the Institution of Engineering and Technology and the BCS for computing research in the UK. Its members are leading computing researchers from academia and industry.

UKCRC Executive Committee
- J S Sventek (Chair), Professor of Communication Systems, University of Glasgow
- Anthony G Cohn, Professor of Automated Reasoning, University of Leeds
- Chris Hankin, Professor of Computing Science, Imperial College London
- Ursula Martin, Professor of Computer Science, Queen Mary University of London
- Ron Perrott, Visiting Professor, Oxford e-Research Centre
- Dave Robertson, Professor of Computing, University of Edinburgh
- Tom Rodden, Professor of Computing, University of Nottingham
- Morris Sloman, Professor of Distributed Systems Management, Imperial College London
- Martyn Thomas, Independent Consultant Software Engineer
- Martin Loomes, CPHC Representative
- Paul Davies, IET Representative
- Bill Mitchell, BCS Representative

Research Funding and Policy
Elected members of the UKCRC Executive Committee met several times with the EPSRC ICT team to discuss the Shaping Capability activity informally, and met with the same team to discuss the Centres for Doctoral Training call. UKCRC continues to monitor the activities leading up to Horizon 2020.

Membership
The membership of UKCRC has grown slowly during the year, increasing by approximately 5 members. The Executive Committee and Membership Committee continue to recruit new members actively. Changes to the web site should increase UKCRC's attractiveness to industrial experts, with the hope that we can attract more of them into the Committee.

Consultations and Submissions
- RCUK Capital Investment Consultation (led by Dave Robertson)
- Scottish Government consultation on a Scotland-wide Data Linkage Framework for Statistics and Research (led by Michael Fourman)
- BIS Inquiry into the Government's Open Access Policy (led by Dave Robertson)
- Cabinet Office Consultation on the Definition and Mandation of Open Standards for Software Interoperability, Data and Document Formats in Government IT (led by Dave Robertson)
- HEFCE call for advice on open access (in progress)