CAREER GUIDANCE - DEVELOPING THE EVIDENCE BASE FOR POLICY MAKING
REPORT OF A MEETING AT IAEVG ANNUAL CONFERENCE
24 AUGUST 2006
PARTICIPANTS: 24 practitioners, trainers, service managers, researchers, policy developers
COUNTRIES: Canada, Denmark, Finland, France, Germany, Italy, New Zealand, Sweden, UK
Chairperson: Dr John McCarthy, Director, ICCDPP
Presenters: Raimo Vuorinen (FI), Teija Felt (FI), Bryan Hiebert (CA), Lester Oakes (NZ)
1. MOTIVATION OF PARTICIPANTS:
- Learn from other countries' experiences (methods, ideas)
- International collaboration as a means to make national actions and outcomes stronger
- Develop strategies to articulate the value of guidance provision to policy makers
- Form a cooperation network
- Help improve research
- Linking data collection with ICT
2. INTRODUCTION TO THE SESSION (J. McCarthy)
The reviews of policies for career guidance undertaken by OECD, European Commission, and the World Bank had shown that the evidence base for policy making for guidance provision was weak. Data collection for policy making fell into three categories:
- Level 1: basic statistical data for administrative and accountability purposes e.g. usage of services;
- Level 2: data for improving the quality of the services provided;
- Level 3: data to show the efficacy of certain interventions and measures.
The issue of the evidence base for guidance was receiving increasing attention from researchers, managers and policy developers. At the International Symposium on Career Development and Public Policy in Sydney in April 2006, an Interest Group was formed around this topic. Participants had been contacted subsequently to identify what was happening in their countries with respect to this issue. The meeting today would hear some of the responses to that follow-up; this meeting could also be viewed as a follow-up to the meeting in Sydney.
3. PRESENTATIONS (Raimo Vuorinen, Teija Felt)
In Finland the evaluation of guidance provision at all levels of education was considered important, but no sustainable mechanism for collecting data was yet in place. The provision of guidance was a responsibility of municipalities and higher education institutions; adult educational guidance provision was being planned. A pilot project existed in higher education to test e-portfolios and their usefulness for data collection. For adult educational guidance it was planned to have a sustainable database for data collection. The development of sustainable data collection to support national policy making will be a key concern of the new Centre for Lifelong Guidance and Expertise at the University of Jyväskylä.
In the labour market sector, the Finnish Public Employment Service was strong on Level 1 data, did some work on Level 2 data, and did very little on Level 3.
Issues raised in the discussion:
- The possibility of a cross-discipline meta-analysis of efficacy data being undertaken by the new Finnish Centre;
- Cross departmental support at the university for the work of the new Centre;
- The need to map the evidence base across countries and to share such information.
4. PRESENTATION (Bryan Hiebert)
The recent Canadian survey of evaluation practice showed that while most guidance service managers and practitioners agreed on the importance of collecting an evidence base for guidance, very little was being done. Practitioners were particularly concerned that outcomes such as client empowerment be captured in such data collection.
An evaluation framework was currently being developed based on an input-process-output model. Indicators of client change were the starting point of this approach. Process is linked with outcomes and deliverables; distinctions have to be made between the outcomes of generic versus specific interventions; client satisfaction can be seen as a process variable. Inputs refer to the resources used, and input data are akin to Level 1 data.
Two trends can be discerned in the Canadian approach: an evidence-base focus (efficacy: what works) and an outcomes focus (how do I know/tell?).
One of the conclusions of the researchers to date is that there is a need to develop a different mindset among practitioners (to think outcome as well as process) and to reconceptualise the professional identity of practitioners to a role of being applied scientists.
The next steps for the Canadian research group include:
- To validate the framework using existing and new agency data;
- To get support for research proposals in order to validate the model across countries;
- To build up an evaluation culture among agencies and practitioners.
Issues raised in the discussion:
- Action in Canada is being undertaken by researchers working on a voluntary basis; they receive no funding from policy makers; they have the voluntary support of CCDF;
- Output and impact measures are not linked to any public policy goals; the research group made a deliberate choice to build the framework around client change;
- The importance of distinguishing between outputs and outcomes;
- The challenge of growing practitioners into the role of local scientists;
- Increasing numbers of practitioners are becoming interested in this issue as evidenced by participation in workshops at conferences;
- The need to develop simple tools for practitioners to collect data, and to field-test them. A workbook of tools and resources is being built and is accessible on the CCDF web site.
5. PRESENTATION (Lester Oakes)
It is important to make connections between what practitioners do, what client changes result from this, and how such results link with public policy goals.
Level 1 Data (New Zealand): Important starting points are: why collect it? Who benefits? Such data is useful for overviews of usage at different times of the year, by different sectors, and by different target groups, and is helpful for planning services. A quarterly report, on time, well presented, in plain language, with observations, gives career guidance services credibility. The challenge is to get practitioners to collect the data, and to make it difficult for them not to, by providing them with user-friendly software. Level 1 data shows that the work is done; it is the foundation for Levels 2 and 3.
Level 2 Data (New Zealand): is important for improving the quality of the services and is also linked with Level 3. It is important that services are evaluated against wider government goals e.g. employment, integration of immigrants, successful transitions – these are where government interests lie. It is important to build the cost of evaluation into the service's annual budget. In New Zealand the evaluation task is undertaken by an external research agency.
It is important from an evaluation perspective to consider how to deliver guidance to those who do not access or use the services, to find out why, especially concerning the government’s primary target groups. Level 2 data should answer: Is what we are delivering the most efficient and most effective way of doing so? Are we the best delivery source? How should we build bridges to others who can deliver the service?
Level 3 Data (New Zealand): This is new ground for the NZ Careers Service. Users will be followed up 12 to 18 months after using the service. The Careers Service will also try to piggyback on other research being undertaken.
Issues raised in the discussion:
· Given their changing nature, is it wise to link public policy goals to outputs and outcomes? Answer: it is important for services to influence policy development; such evaluation can help feed back unmet needs and gaps in provision to policy makers;
· The problems of measurement when several actors are involved: differential impact versus impact of working together in partnership;
· The power of stories: the importance of collecting practitioner stories of client improvement/changes.
6. OPEN DISCUSSION: POINTS RAISED
Level 1 Data:
· How to agree basic data and definitions, e.g. of drop-out, special needs;
· Mandatory versus non-mandatory data collection (EU Focus Group Report by CEDEFOP 2005);
· Importance of reporting statistics that are relevant to funding decisions;
· How to piggyback on other research data collected e.g. student drop-out;
· To collect learner journey data.
Level 2 Data:
· Quality assurance data for which purpose?
· Client expectations versus delivery – importance of follow-up;
· Basic questions: what do we ask that is of real value? Who asks (internal vs external)? When do we ask? How many are asked? How do I ask?
· Importance of market research;
· Relevant resources: the CEG publication on the evaluation of guidance services; Careers Wales research undertaken by staff and published on its web site;
· Evaluating counsellor performance: inspection, supervision, CPD, organisational standards.
Level 3 Data:
· Methodological problems: moving from standardised to narrative data. Is there a standard way to collect narrative data? What is acceptable evidence?
· The importance of relationships between services and funders: the need for agreement on what is acceptable evidence of client change, especially before a programme of intervention starts – the need for a list of acceptable outcomes e.g. employment, enrolment;
· Client expectations versus outcomes;
· Soft outcome data e.g. adapting policy to include distance travelled (personal career journey) as an outcome;
· Ability of a client to self-evaluate as an outcome (also as a guidance intervention);
· Identifying data that is currently collected but is not analysed or used.
7. CONCLUSIONS (J. McCarthy)
Many good examples and different approaches have been highlighted in this discussion. To my mind there are three general needs arising from the discussion:
- To collect information on what is happening on this topic across countries and to share this information internationally;
- To share and exchange data collection instruments, to adapt questions, and to develop common questions;
- To share information on how to identify non-users of services and on effective marketing strategies to reach them.
ICCDPP can play a key information collection and dissemination role with regard to this important issue for the development of careers services worldwide.