
What Works for Children and What Works in Research Implementation? Experiences from a Research and Development Project in the United Kingdom

Kristin Liabo
Research Fellow
Child Health Research and Policy Unit
Institute of Health Sciences
City University, London


Abstract

Service planners increasingly recognise the need to develop more effective ways of implementing evidence-based practice and improving research utilisation. A key question is how we base services for children on the best available evidence in the context of competing and sometimes conflicting priorities and needs. For research evidence to make a difference to end users, those who plan and deliver services need to be in a position to apply research findings. While service planners increasingly demand evidence of need, less attention is paid to evidence on what to do about that need once it has been identified. Influences such as practice experience, current priorities, pressures to spend funds in particular ways and common sense can be both more immediate and more easily available than research evidence on effectiveness. This paper draws on experiences from the What Works for Children? project, based in the United Kingdom, which seeks to influence policy and practice through (1) making relevant research evidence more accessible and usable to practitioners, and (2) exploring and identifying the research needs of service planners and practitioners. Some of the methods used by the What Works for Children? project to address potential barriers to implementation are discussed.


Introduction

Interventions in early childhood have been found to make a difference to important outcomes in later life (Hertzman and Wiens 1996, Roberts 1997). In the last few years, government programmes in the United Kingdom have increasingly paid attention to this research (Glass 1999) and have issued guidance that services be based on the best evidence of what works (Department of Health 1998, Nutley et al. 2003). However, it is one thing to advertise the “what works” message in publications, and quite another to work out how to influence front-line practice effectively. In an attempt to facilitate research use in practice and increase understanding of the processes of research implementation, various initiatives have been set up in the United Kingdom (CEBSS 2003, Making Research Count 2004, Research in Practice 2003).

The What Works for Children? project (WWfC)1 – a collaboration between City University in London, the United Kingdom children’s charity Barnardo’s and the University of York – was established in 2001 with funding until March 2005 from the United Kingdom’s Economic and Social Research Council (ESRC). The project was one of seven in the ESRC EvidenceNetwork, set up to facilitate developments in the evidence-based policy and practice field. This paper, by one of the research fellows working on the project, looks at some of the lessons learned and their relevance to the current social policy agenda. One finding is that to be successful, research implementation strategies must respond to the needs of service planners, who are caught between national priorities and local context issues. Also identified in this work is the need for national policy makers to base guidelines on research evidence and commission research studies that produce evidence relevant to practitioners on the ground.


What Works for Children?

The research and development behind WWfC built in part on findings from work on research implementation, which showed that if research is to have an impact on practice, dissemination needs to be targeted to suit practitioners' needs (Barnardo's Research and Development Team 2000, Kitson et al. 1998). Simply disseminating the message of "what works" may not be useful to practitioners, who need to know what works for whom, where and at what cost. With an emphasis on implementation rather than the primary-research production end of the evidence-based spectrum, we employed an implementation officer to work directly with practitioners and service planners. Her remit was to work with practitioners on issues related to the adaptation and replication of interventions recommended by research. At the same time, a research team was set up to facilitate access to and understanding of research through a range of paper, face-to-face and web-based materials, including evidence summaries, a guide to the evidence, and training days. The implementation officer and the researchers worked in partnership, as one team, but the implementation officer also worked directly with practitioners and service planners on a day-to-day basis, and shared offices with some of them. With a background in both education and research, she held a key position in terms of bridging the traditional gap between research and practice (Stevens et al. 2005).

For the direct implementation work, we knew that we were more likely to succeed if we were pushing on a door that was at least partly ajar. In this respect, we were fortunate that our project coincided with a new policy initiative – the Children’s Fund – which gave us an opportunity to work with multi-disciplinary teams charged with setting up new services. Links were established with six Children’s Fund programmes in the North of England.


Children’s Funds: Local Initiatives, National Priorities

Children’s Fund programmes were set up to manage earmarked government funding to develop local services for children aged 5–13. They aim to reduce poverty and increase opportunities for children and young people who live in deprived neighbourhoods. The Children’s Fund partnership boards are made up of local representatives from both voluntary and statutory sectors in health, education and social care. Potential Children’s Fund projects apply for funding to work directly with children and young people in a variety of ways; Children’s Fund programmes help projects prepare proposals and evaluations; and the Children’s Fund partnership boards make funding decisions (see Table 1).

Table 1 Organisation of Local Children’s Fund Initiatives

Partnership boards – comprise representatives from voluntary and statutory agencies, who:
  • make funding decisions
Programmes – comprise paid staff, who:
  • develop service plans
  • help projects prepare proposals
  • support projects with evaluation and service delivery
Projects – comprise statutory and voluntary service organisations, which:
  • deliver services

The Children’s Fund programmes received guidance from central government on how to design proposals for services (Children and Young People’s Unit 2001). Key requirements were that projects would:

  • focus on those service users perceived to be at most risk of social exclusion
  • build on existing services and fill in service gaps
  • promote participation of service users in the design and working of programmes
  • build on existing partnerships
  • be sensitive to local needs.

The guidance stated:

Partnerships are not confined to evidence-based services, but if a service is included in your proposal that has not been evaluated we would like to know the basis on which you believe it will be successful. (Children and Young People’s Unit 2001)

Over time, it was hoped, services would both focus on what children and families say they need and develop an evidence base to show why their work makes a difference.

Like any initiative set up by central government and implemented locally, the Children’s Fund programmes were operating between two sets of priorities. Service plans had to incorporate key national priorities; for example, 25% of services had to be targeted at young people at risk of or involved in offending. At the same time, programmes had to build their service plans on findings from local needs assessments. Once projects were up and running, service evaluation was next on the agenda. Evidence-based practice, in the academic sense of choosing services on the basis of systematic reviews and randomised trials demonstrating an effect, was not high on their list of priorities.

In this situation, the WWfC team was faced with two major challenges. First, how do we support the use of research evidence when the practice priorities have already been set? Second, what do we do when no research has been conducted on the types of services practitioners are about to deliver? We hypothesised that change would be more achievable in these newly established programmes than in traditional or more established welfare services, where ways of working were more deeply entrenched. But we were also aware that the Children’s Fund programmes, projects and partnerships had their own agendas, as well as the government’s priorities and local needs to respond to. It was evident from the start that the support we provided had to build on their needs as much as on our hopes to make research findings integral to their service planning and provision.


Implementing Research

It has been suggested that interventions to improve care need to operate at the individual (practitioner), group/team (Children’s Fund projects), organisational (Children’s Fund programmes) and larger system/environment levels (the national and local government agencies promoting the Children’s Fund) (Ferlie and Shortell 2001). WWfC’s work focused on both Children’s Fund programme staff and individual project workers. Through this approach, we aimed to create both a trickle-down and a bottom-up effect as programme staff worked at the interface between projects and partnership boards, preparing service plans and evaluations. Meanwhile, project workers were invited to training events and received research summaries.

Our aims were to:

  • raise research awareness
  • enhance service planners’ and practitioners’ understanding of what research might have to offer them
  • enable them to use research in practice.

Some service planners may have a deeply sceptical view of research; when working with them, raising research awareness may be the most realistic aim. Implementation of research findings – building them into project planning – may be more achievable with those more open to research. It is one thing to acknowledge that research is of interest to a service, and quite another to see it as important information in decision making, including decisions about changing programmes.

We adapted a Canadian instrument that assigns four stages to the research implementation process (Canadian Health Services Research Foundation 2002):

  • acquiring research
  • assessing research
  • adapting research
  • applying research.

The project has aimed to address potential barriers, and identify and build on facilitators at each stage.

To support the implementation officer in her work, the WWfC team developed a range of tools for research implementation. Some of these tools were produced at the initiative of WWfC and others were developed in collaboration with the Children's Fund programmes. All of them were aimed at addressing one of the four stages of research implementation.

WWfC was designed as both a theory-generating and theory-based exercise, and a pilot project for research dissemination. Data were collected by the implementation officer through literature review, document analysis and reflective practice within the research team, and in discussions with Children’s Fund staff. In January 2004 the author of this paper carried out in-depth interviews with 10 staff from the six Children’s Fund programmes with whom we were working. The interviews explored their views on evidence-based practice in general, and more specifically how WWfC had worked, or not, for their programme.


Using Research in Practice

The Children’s Fund programmes varied in both size and staff skills. Some had a designated evaluation officer, while one programme only employed a part-time programme manager. Some were already up and running, and had commissioned a set of services by the time our implementation officer came into post. Others were only at the planning stage. These differences presented the implementation officer with a range of needs, and the opportunity to try out various approaches. None of the programmes used all of the WWfC resources, but every programme used at least one. In addition, the implementation officer was used at various stages as a sounding board for service development, and she was constantly balancing meeting the needs of the Children’s Fund staff with working from the evidence-based agenda set by the WWfC project at the start.

We adopted a push-and-pull strategy to influence developments in the Children’s Fund programmes with which we worked. The implementation officer was based in the open-plan office of one of the programmes, which enabled her to experience some of their day-to-day pressures and work out how we could help them while pushing the evidence-based agenda forward. Three particular collaborations with the Children’s Fund programmes illustrate this work, and may be relevant to policy makers and service planners elsewhere wishing to adopt an evidence-based approach.

Evidence Nuggets

The Evidence Nuggets are summaries of research evidence on a particular subject, based on systematic reviews where these are available.2 The nuggets are designed to meet service planners’ and practitioners’ needs for succinct information, avoiding specialised language and including available details on the components of effective interventions, such as training, costs and background theory. They also pay attention to the quality of the included research and what it can tell us about the likelihood of an intervention producing change (Brocklehurst and Liabo 2004). We produced six nuggets in all, and these are available from www.whatworksforchildren.org.uk.

The subjects for the nuggets were chosen in a trawling exercise, where academic colleagues were asked for research recommendations3 and practitioners and policy makers were asked for their priority topics for research.4 Staff in the participating Children’s Fund programmes were also consulted. One of the most frequently mentioned questions from the service planners was about research on volunteer mentoring for disaffected youth. It was decided that one of the nuggets would focus on mentoring to reduce offending behaviour in young people.

A systematic and comprehensive search was carried out to identify relevant studies, but we found little evidence that volunteer-delivered mentoring can reduce offending behaviour, improve conduct or enhance academic achievement and school attendance (Brewer et al. 1995, DuBois et al. 2002). There were, in fact, findings that mentoring may in some circumstances produce harm, particularly within the client groups targeted by the Children’s Funds (Grossman and Rhodes 2002, O’Donnell et al. 1979). These findings did not prove that mentoring will never work, and for some young people it is a positive intervention. However, we were concerned that too much might be expected of existing mentoring models, and felt that the evidence of potential harm had to be taken seriously. It was clear that further development and research were needed before mentoring could be recommended as an evidence-based intervention (Roberts et al. 2004). Drafts of this and other nuggets were distributed to Children’s Fund staff for feedback.

Having read the mentoring nugget, one manager contacted the implementation officer to discuss how he might take those findings forward to inform developments in his area. This Children’s Fund programme had received earmarked funding to set up a mentoring scheme, and the manager was keen to implement findings from the nuggets while the programme fulfilled its commitments to the Children’s Fund. In collaboration with the implementation officer, the programme drew up an alternative plan to reduce youth offending, encompassing a range of services shown to be effective elsewhere. Rather than scrapping mentoring altogether, the new scheme included a parenting support component and cognitive behavioural therapy training for key staff and mentors. This decision was based on research findings looking in more detail at the components of mentoring programmes that have produced promising effects (Davidson et al. 1987, DuBois et al. 2002).

The implementation officer also helped the Children’s Fund programme map local services to see how the new initiative would fit with these, and whether local experiences of mentoring mirrored those in the research literature. The decision on whether funding could be shifted to this approach was made by the local Children’s Fund partnership board, and the implementation officer helped the manager prepare for this meeting and came with him on the day.

Researcher: “Did any [nuggets] change original plans?”
Manager: “They definitely did. I mean, they shifted us from an initial idea about developing a crime-prevention mentoring programme… [to] a more holistic parenting school-based programme, with mentoring as part of that.”

Research implementation within the Children’s Fund needed to consider both the local context and national priorities. The implementation officer had the time and skills to help the programme manager develop a service that met local needs, responded to central policy strategies and was built on findings from research. We were fortunate that the mentoring nugget was disseminated in time to change service plans, and that the particular manager involved was committed to basing practice on research findings. The information supplied by the research team contributed to the use of research in the service-development stage. As acknowledged by the programme manager, finding time to read research and appraise it was difficult and the WWfC team helped overcome this barrier:

“You can use the summary or the nugget and present it but you can be confident that the time has been spent to look at some of the background to that, ‘cause you know we wouldn’t have the time to do that.” (Children’s Fund Programme Manager)

The Evidence Request Service

As well as consulting at an early stage, WWfC wanted to respond to the Children’s Fund programmes’ needs on an ongoing basis. Following an increasing number of requests to the implementation officer for research evidence, we set up an evidence request service where Children’s Fund staff could ask for research on specific topics or interventions. The researchers would carry out a search, appraise relevant articles and present a list of these in summarised form, pointing out strengths and weaknesses. Information on further resources and the websites of specialist organisations was also provided (Stevens et al. 2005).

One important finding that emerged from this service was the gap between the questions raised by service planners and the research available to answer their questions. Furthermore, when research studies were found, they seldom addressed the question of effectiveness, let alone applied methods to reduce bias in reporting results. For example, one of the requests read: “Opportunities for learning for travellers’ children – not necessarily academic achievement – learning to allow them to have the lives they want – culturally specific.” A WWfC researcher contacted the programme manager and the request was narrowed down to: “Learning opportunities on traveller sites for traveller children (specifically mobile homework clubs or mobile computer schemes/youth clubs).” Hardly any relevant research publications were found, but the service planner did not necessarily see that as negative:

“And sometimes the fact that you can’t find anything for us doesn’t matter. I guess [that] as a practitioner you always think somebody else must have come up with a better answer somewhere ... and I think that sometimes, like with the travellers work, asking questions and nobody has published anything precisely relevant to what I was looking at you think, ‘oh well my answer is as good as anybody else’s then.’” (Children’s Fund Programme Manager)

The service also highlighted service planners’ and practitioners’ need for implementation research (“How do we put research into practice?”), as well as effectiveness studies (“What kinds of intervention make a difference to outcomes?”). Detailed information on costs and staffing, and practical tips on challenges and obstacles in setting up and running a programme were almost as important as information on the intervention’s ability to produce change. This meant that in addition to answering the question “Does x intervention work?” the researchers would look for details within the studies about the setting in which the intervention was delivered, characteristics of the service users for whom it was effective and whether certain components were crucial to success.

Another concern was relevance, and whether findings from research studies carried out elsewhere would “work” for children in other localities, sometimes with very different contexts from the research sites.

“I think that it has to be acknowledged that just ’cause it’s evidence that something works in one place, doesn’t mean it will work anywhere else. Like I’ve got a brilliant travellers project going now, and it works because of the key worker who’s running it. And you could try and replicate it somewhere else and it wouldn’t work unless you’ve got someone very like that doing it. So it’s not always just what it says on paper.” (Children’s Fund Programme Manager)

Some social interventions, such as parenting and home-visiting programmes, have shown consistently positive results across national borders (Barlow 1998, Bull et al. 2004, Liabo et al. 2004, Liabo and Lucas 2004). Systematic reviews can be a useful tool in demonstrating the relevance of an intervention, but at present relatively few are published in user-friendly format. The evidence request service indicated a lack of outcome research relevant to current practice. When relevant research was found, important information on implementation was frequently absent, a gap that hinders the adaptation of research into practice.

The Project Planning and Review Tool

From an early stage it became clear to the implementation officer that there was a large gap between researchers’ views of evidence-based practice and the way Children’s Fund services were funded, planned and run. Where service planners focused on needs and target groups, researchers focused on interventions, outcomes and effect size (Stevens et al. 2005). For example, implementation plans quoted evidence on local needs for a service rather than arguing for how and why the service would change outcomes for users. “Research” in these plans normally referred to local data on children and families eligible for Children’s Fund services, or in-house user surveys.

Local evaluation was a key principle in the establishment of the Children’s Fund, and services were required to monitor their own progress. Having considered the messages on evidence-based practice from the implementation officer, one programme came to her for advice on how they could focus on outcomes in their local service planning and evaluation. This seemed to be a way of starting to bridge the gap in the research base identified by the evidence request service. The Project Planning and Review Tool, devised by the implementation officer in collaboration with Children’s Fund staff, introduced them to the concept of outcomes-based service planning and how this could help improve the quality of evaluation, and ultimately their service. When completing the tool, projects were asked, “How do you know that the specific aims and objectives of the project have been achieved?” and “What information do you need?”

“[The Project Planning and Review Tool] has been invaluable. And working with [the implementation officer] on the process of developing that was really, you know, a very good process… very positive.” (Children’s Fund Programme Manager)

The Project Planning and Review Tool was piloted and further adapted to local needs in collaboration with the Children’s Fund programme and quickly incorporated as a tool for all their projects. The regional Children’s Fund office commended use of the tool, and other Children’s Funds in the area expressed interest in learning from the experiences of the initial tool users. The implementation officer also ran a number of evaluation workshops for Children’s Fund projects, as well as informally supporting the evaluation officers working for the Children’s Fund programmes.

“And there was some follow-up work with that as well, in terms of projects going away and rewriting their objectives… see what the outcome should be. And I think projects went away and did that and then they were on the phone and I was saying [to the implementation officer] what do you think about this question?” (Children’s Fund Development Officer)

The Project Planning and Review Tool is another example of how evidence-based practice may be used to meet the needs of local service planners within the framework of national policy priorities. Again, this tool was developed in response to a request by the Children’s Fund programmes we worked with, rather than being produced in a research setting and passively disseminated. A follow-up tool to the Project Planning and Review Tool is now being developed, and the implementation officer is working closely with programme staff on this.


Discussion

Evidence-based policy initiatives require evidence-based practice to make a difference on the ground, but building research evidence into local service plans is not straightforward. Service planners may lack the techniques and resources to carry through evidence-based interventions, and may fall back on providing services they are familiar with, irrespective of the evidence (or lack of it) on positive outcomes (Randall et al. 2000). WWfC has tried to overcome obstacles to research implementation (Barnardo’s Research and Development Team 2000, Randall et al. 2000) by providing service planners and practitioners with a dedicated person for support in adapting research to their local context.

Social policy initiatives will always be driven by a range of agendas at different levels of government and service planning. In recent years, there has been an increased commitment to evidence-based policy and practice in health, education and social policy (Nutley et al. 2003). This may not have increased the use of research evidence, but it has highlighted the importance of outcomes as well as process in our work with disadvantaged children.

Our work with the Children’s Fund programmes provides a practice example of how evidence-based practice can play an important role in marrying national policy priorities with the local context. Key to this work is the use of a range of resources that enables flexible support in response to the needs of service planners and practitioners. We do not know in the short term whether the use of outcome-focused evaluation will improve services for children, nor whether the mentoring approach that adapted evidence to local needs will succeed. However, our experiences provide an example of how researchers and service planners can meet and collaborate.

The tools used by WWfC can be used or adapted to provide a good starting point for similar initiatives elsewhere. Research summaries of systematic reviews, minimising the use of technical terminology and including implementation details, can inform service development. Practitioners often lack access to relevant research, and a research request service may start to meet this need, to identify gaps in the evidence base, and to help improve research skills in the practice community.

This paper illustrates how the local evidence base may be improved by use of evaluation tools that focus on outcomes as well as process. When combined with the implementation of evidence-based programmes, this type of evaluation may inform our understanding of context issues in research implementation. Where no research has been conducted in a particular service area, outcome-focused evaluation may help to inform further research and provide preliminary indicators of a service’s ability to produce change.

Evidence-based policy making can make an important contribution to services on the ground, and help the children and families they work with. However, support, education and technical training are needed to enhance policy makers’ understanding of practice, practitioners’ understanding of research and researchers’ understanding of how things work “in the real world”.


Concluding Remarks

Evidence-based practice is still a relatively new concept in social care, and WWfC is at the start of what we hope will be a range of similar initiatives, some of which will build on our initial findings. As with many research and development initiatives, this work leaves us with as many new questions as those we attempted to answer. Lessons to take forward from this work that are of particular relevance to social policy analysts at a national level include the following.

  • In order for research to be used in service planning and practice, central departments need to commission more effectiveness research on topics relevant to current social policy. Intervention planning needs to be carried out in consultation with user groups. We need trials of research implementation in social care to assess its impact on outcomes for service users. A good trial requires strong foundations of theoretical and practical work, and some of the work described above may help provide these foundations.
  • At present there is no standardisation in social-care service delivery equivalent to that in health. For example, the United Kingdom’s National Health Service has established a National Institute for Clinical Excellence (NICE), which makes recommendations for treatments and care based on the best available evidence (National Institute for Clinical Excellence 2004). Although there are gaps in our knowledge as to what works in social care for children, recommendations based on high-quality research evidence may help practitioners on the ground when making decisions on what service to deliver.
  • The role service users can play in evidence-based practice is still relatively unexplored. As the group with the most to gain from research-based services, they have a role to play in choosing the types of interventions tested for effectiveness and in choosing the interventions they feel best suit their needs.
  • Resource limitation is a problem in social care. Identifying a cost-effective model to support evidence-based practice presents a challenge.

References

Barlow, J. (1998) “Parent-training programmes and behaviour problems: findings from a systematic review” in A. Buchanan and B. L. Hudson (eds.) Parenting, Schooling and Children’s Behaviour, Ashgate, Aldershot, pp. 89-109.

Barnardo’s Research and Development Team (2000) What Works? Making Connections: Linking Research and Practice, Barnardo’s, Ilford.

Brewer, D.D., J.D. Hawkins, R.F. Catalano and H.J. Neckerman (1995) “Preventing serious, violent and chronic juvenile offending: A review of evaluations of selected strategies in childhood, adolescence, and the community” in J.C. Howell, B. Krisberg, J.D. Hawkins and J.J. Wilson (eds.) A Sourcebook: Serious, Violent and Chronic Juvenile Offenders, Sage, London.

Brocklehurst, N. and K. Liabo (2004) “Evidence nuggets: promoting evidence based practice” Community Practitioner, 77:292-296.

Bull, J., G. McCormick, C. Swann and C. Mulvihill (2004) Ante- and Post-natal Home-visiting programmes: A Review of Reviews, Evidence Briefing, Health Development Agency, London.

Canadian Health Services Research Foundation (2002) Self Assessment Tool

CEBSS (2003) The Centre for Evidence-Based Social Services

Children and Young People’s Unit (2001) Children’s Fund: Part Two Guidance, CYPU, London.

Davidson, W.S., R. Redner, C.H. Blakely, C.M. Mitchell, J.G. Emshoff (1987) “Diversion of juvenile offenders: an experimental comparison” Journal of Consulting Clinical Psychology, 55:68-75.

Department of Health (1998) Modernising Social Services, Cm. 4169, The Stationery Office, London.

DuBois, D.L., B.E. Holloway, J.C. Valentine and H. Cooper (2002) “Effectiveness of mentoring programs for youth: A meta-analytic review” American Journal of Community Psychology, 30:157-197.

Ferlie, E.B. and S.M. Shortell (2001) “Improving the quality of health care in the United Kingdom and the United States. A framework for change” Milbank Memorial Fund Quarterly, 79:281-315.

Glass, N. (1999) “Sure Start: the development of an early intervention programme for young children in the United Kingdom” Children and Society, 13:257-264.

Grossman, J.B. and J.E. Rhodes (2002) “The test of time: Predictors and effects of duration in youth mentoring programs” American Journal of Community Psychology, 30:199-206.

Hertzman, C. and M. Wiens (1996) “Child development and long-term outcomes: a population health perspective and summary of successful interventions” Social Science and Medicine, 43:1083-1095.

Jadad, A., D. Cook and G. Browman (1997) “A guide to interpreting discordant systematic reviews” Canadian Medical Association Journal, 156:1411-1416.

Kitson, A., G. Harvey and B. McCormack (1998) “Enabling the implementation of evidence based practice: A conceptual framework” Quality in Health Care, 7:149-158.

Liabo, K., J. Gibbs and A. Underdown (2004) Group-Based Parenting Programmes and Reducing Children’s Behaviour Problems, Highlight, 211, National Children’s Bureau, London.

Liabo, K. and P. Lucas (2004) Home Visiting and Childhood Injury, Highlight, 213, National Children’s Bureau, London.

Making Research Count (2004)

National Institute for Clinical Excellence (2004) NICE website

Nutley, S., H. Davies and I. Walter (2003) “Evidence-based policy and practice: cross-sector lessons from the United Kingdom” Social Policy Journal of New Zealand, 20:29-48.

O’Donnell, C.R., T. Lydgate and W.S.O. Fo (1979) “The Buddy System: Review and follow-up” Child Behaviour Therapy, 1:161-169.

Randall, J., P. Cowley and P. Tomlinson (2000) “Overcoming barriers to effective practice in child care” Child and Family Social Work, 5:343-352.

Research in Practice (2003) Research in Practice website

Roberts, H. (1997) “Socioeconomic determinants of health: Children, inequalities, and health” British Medical Journal, 314:1122-1125.

Roberts, H. (2002) What Works in Reducing Inequalities in Child Health? Barnardo’s, Barkingside.

Roberts, H., K. Liabo, P. Lucas, D.L. DuBois and T. Sheldon (2004) “Mentoring to reduce anti-social behaviour in childhood” British Medical Journal, 328:512-514.

Stevens, M., K. Liabo, S. Frost and H. Roberts (2005) “Using research in practice: A research information service for social care practitioners” Child and Family Social Work, 10:67-75.

Woolfenden, S., J. Williams and J. Peat (2002) “Family and parenting interventions for conduct disorder and delinquency: A meta-analysis of randomised controlled trials” Archives of Disease in Childhood, 86:251-256.


Footnotes

1 www.whatworksforchildren.org.uk

2 Systematic reviewing is a method of critically appraising, summarising and reconciling research findings concerning a particular problem or treatment (Jadad et al. 1997, Roberts 2002).

3 The key question was: If you could recommend one piece of research evidence to policy makers or practitioners that might be helpful for children aged 5–13, what would it be?

4 The key question was: You have a pot of money to spend on children’s services, no strings attached. What would you choose to do?

