The findings reported here are based on a comprehensive and systematic review of research and evaluation reports on outreach legal services to disadvantaged people in Australia and overseas. A systematic review is a methodology for selecting and synthesising the results of relevant research and evaluation studies in order to provide practitioners with practical information that is based on the best available research on a specific question. This methodology:

The outcome of the process is a set of synthesised findings from research studies on the topic, which can then form the basis of best practice. Systematic reviews are increasingly being used by government policy makers to inform decision making (Government Social Research Unit, 2007).

Systematic reviews have traditionally been undertaken in the health sector, focusing on the meta-analysis of experimental studies such as randomised controlled trials.2 More recently, however, there has been work in the health sector to develop methodologies for systematically reviewing data collected using other research methods. In contrast, methodologies and software for reviewing and synthesising research data in the field of social research are less developed.

For this reason we based our methodology on an appraisal system developed by the Joanna Briggs Institute (JBI) to review health-related research data.3 As well as conducting quantitative systematic reviews as part of the Campbell Collaboration, JBI have developed rigorous systematic review methodologies for qualitative data, economic (cost-benefit) data and narrative text (expert opinion). Their rationale for developing these methods is as follows. Randomised controlled trials were traditionally the predominant form of research undertaken by medical researchers and were usually seen as the ‘gold standard’ for medical evidence of ‘effectiveness’. However, as argued by JBI and others, different research methods are required to answer different research questions (Government Social Research Unit, 2007). For instance, qualitative research is the most appropriate methodology for exploring the experiences of a participant in an intervention, while an economic study best addresses the cost of the intervention. A mixed method approach may be used to explore why an intervention worked or not. JBI further argue that:

Our methodology draws upon the JBI review process for reviewing qualitative data. We took this approach for several reasons. First, nearly all of the research available on outreach legal services had a qualitative component. Second, JBI does not provide a specific review and appraisal methodology for quantitative methodologies (e.g. surveys) which are not experimental designs. Finally, as the qualitative review process is designed to be used with a range of (qualitative) methodologies, we considered that it was broad enough to provide a rigorous ‘checklist’ for high quality mixed method research, and an appropriate methodology for reviewing and synthesising the findings from these studies.

As qualitative synthesis is an evolving methodology, some debate remains about the power of, and best methods for, synthesising qualitative data (JBI, 2008, p. 29). Similar questions arise when considering the synthesis of findings from mixed method research. However, in our view the JBI approach gave us a robust, transparent and replicable method to trial for the review and synthesis of mixed method data. As we discuss in Appendix 1, we will learn from this review as we refine a methodology to appraise and synthesise mixed method socio-legal research evidence.

The systematic review process

In broad terms, our methodology involved four stages:

  1. developing a research protocol which clearly articulated the research questions, inclusion criteria and search strategy (Research protocol)
  2. using search terms drawn from these pre-defined inclusion criteria to guide the literature search and to select relevant studies (Literature search)
  3. having two researchers independently assess the quality of all studies retrieved against defined and consistent criteria (see Table 2). Only those studies which pass an agreed quality standard are included in the review (Appraisal)
  4. having two researchers independently identify key findings with supporting evidence in each report, categorising these findings, then drawing these categories into broader ‘synthesised’ findings that can be applied in practice (Data extraction and synthesis).

As an additional quality control this report was peer reviewed prior to publication.

The methodology is outlined below. Additional details are provided in Appendix 1. Readers should note that we have reviewed evaluation or research reports on outreach programs, not the programs themselves.

1. Research protocol

Before commencing the review, we developed a research protocol which defined the parameters of the systematic review, outlining the research questions, the inclusion criteria and our search strategy. The inclusion criteria identified are as follows.

Target group — disadvantaged people with complex needs

The review was restricted to studies on outreach services which targeted disadvantaged people with complex needs. ‘Disadvantaged people with complex needs’ were defined in this review as ‘people who have multiple problems, including legal problems, but for whatever reason, are not able to access the range of social services/institutions that can be accessed by the majority of the population’ (Schetzer et al, 2002, p. 15, based on a 1996 ABS definition). These disadvantaged people may also be described in the literature as ‘socially excluded’ or ‘hard-to-reach’.4 Such groups often include Indigenous people, CALD communities, people with disability, homeless people, prisoners and those in very remote locations.

Intervention — face to face outreach legal services

Only studies on outreach legal services were included in this review. ‘Legal services’ include legal advice and assistance services, which may or may not provide representation. We included money and debt related advice services within our definition, where the advice provided was not just ‘financial planning’ but advice and legal assistance in managing existing debts.

We defined ‘outreach’ as ‘face to face assistance (primarily advice but also minor assistance) delivered away from the primary legal service/office’, in places used by target groups (e.g. homes, community (health) centres, welfare agencies, rural towns). Outreach by video-link was included within the search criteria, but telephone outreach was excluded. No stand-alone studies on the effectiveness of video-link outreach services were found, though video-link services were included in one study.

Types of studies

Initially, our literature search was not limited by study design. However, we found no randomised controlled trials or case-control studies examining the legal outcomes of outreach services. As a result, our search focused on evaluation studies, including mixed method studies that used in-depth qualitative interviews, observational methods and surveys. Some studies also drew on service data to inform their results. One study, a cost-effectiveness study, used only economic and administrative data. The review was restricted to reports written in English and published after 1998. This date was selected to focus the review on the most recent and relevant studies. Studies or documents which did not meet the criteria to be included as ‘research’ (e.g. documents which simply described programs or were ‘expert opinion’) were excluded from this review.

Outcomes of interest

We looked for evidence about the ‘effectiveness’ of outreach models and about the elements of these programs which were found to contribute to success (or otherwise). We defined ‘effectiveness’ in terms of whether outreach services:

2. Literature search

The search strategy used, including search terms drawn from our inclusion criteria, is in Appendix 1. The databases and websites examined are listed in Appendix 2.

Our initial search of relevant databases and websites returned hundreds of potentially relevant documents. Through a staged process of rigorous assessment against the inclusion criteria described above, these were narrowed down to 16 studies. These were the only original research or evaluation reports that we located on outreach legal services to disadvantaged people with complex needs.

We believe that our search strategy was broad and deep enough to have located most published material which fits our criteria. However, while some unpublished studies were included in the review, there may be other unpublished reports that we did not identify. Table 1 below illustrates how we selected (filtered) studies through both the search and appraisal processes.

Table 1: Summary of process for selecting studies

Initial search

Broad search against search criteria: Hundreds of studies identified at first search, using the search terms listed. 98 studies which appeared to be relevant to outreach legal or similar advice services were entered onto an EndNote database.

Targeted search and selection

Targeted search and review of retrieved studies against inclusion criteria: 98 studies reviewed against the inclusion criteria; 39 retrieved in full text. These 39 studies were reviewed again in more detail; 16 met all of the inclusion criteria.

Review of studies against appraisal criteria: We ‘appraised’ the methodology of the 16 studies which met all of the inclusion criteria, to assess whether they qualified as rigorous research or evaluation (see Table 2 below) and, in the case of Smith and Patel (2008), quality ‘economic’ research. 11 studies which met the criteria as quality research were selected for inclusion in the review.

3. Appraisal

The 16 studies that met all of the inclusion criteria (that is, were identified as relevant to the topic) were then separately appraised by two researchers for methodological quality, using the standard JBI critical appraisal criteria outlined in Table 2 below.5 On the basis of this appraisal they were then either included in the review or excluded.

Table 2: Criteria against which qualitative studies were assessed

1. There is congruity between the stated philosophical perspective and the research methodology.
2. There is congruity between the research methodology and the research question or objectives.
3. There is congruity between the research methodology and the methods used to collect data.
4. There is congruity between the research methodology and the representation and analysis of data.
5. There is congruity between the research methodology and the interpretation of results.
6. There is a statement locating the researcher culturally or theoretically.
7. The influence of the researcher on the research, and vice versa, is addressed.
8. Participants, and their voices, are adequately represented.
9. The research is ethical according to current criteria or, for recent studies, there is evidence of ethical approval by an appropriate body.
10. Conclusions drawn in the research report do appear to flow from the analysis, or interpretation, of the data.


Ideally, the studies selected would have met all of these criteria. However, as discussed further in Appendix 1, had we required studies to meet all of these criteria, we would have excluded every study. Instead, we identified what we believed were the key criteria for methodological quality and only included studies which met this standard. To be included in the final review, each study had to score a ‘yes’ for at least criteria two to five and criterion ten. While no studies met all ten criteria, six met eight or nine of them.
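The inclusion rule above amounts to a simple set-membership check. As an illustrative sketch only (the actual appraisal was a manual, independent judgement by two researchers; criterion numbers follow Table 2), it could be expressed as:

```python
# Illustrative sketch only: the inclusion rule described above.
# Criterion numbers follow Table 2; the actual appraisal was a manual,
# independent judgement made by two researchers.
REQUIRED_CRITERIA = {2, 3, 4, 5, 10}

def meets_inclusion_standard(criteria_met):
    """Return True if a study satisfies every required appraisal criterion."""
    return REQUIRED_CRITERIA.issubset(criteria_met)

# A study meeting criteria 2-5, 8, 9 and 10 passes the standard...
print(meets_inclusion_standard({2, 3, 4, 5, 8, 9, 10}))  # True
# ...but a study missing criterion 10 does not.
print(meets_inclusion_standard({1, 2, 3, 4, 5}))  # False
```

The rule is deliberately conservative: a study missing any one of the five required criteria is excluded, however many of the other criteria it meets.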

Of the 16 studies critically appraised against these criteria, eleven were included in this review. These are listed in Table 3.

Table 3: Summary of studies included in the review

Author: Buck et al, 2007
Title: Putting money advice where the need is: evaluating the potential for advice provision in different outreach locations
Country: England & Wales
Outreach location(s): 100+ outreach locations – welfare agencies, credit unions, housing offices, schools, community centres, family and children centres, Citizens Advice Bureaus (CABs) and prisons
Legal issues: Credit and debt
Evaluation: Phase 1 of the evaluation, to assess different outreach locations. Conducted early in the 3 year life of the projects. Interviews at 25 randomly selected outreach sites (5 of each location type); 563 interviewees. Frequency statistics from closed questions; thematic analysis of open-ended questions.

Author: Day, Collard & Hay, 2008
Title: Money advice outreach evaluation: qualitative outcomes for clients
Country: England & Wales
Outreach location(s): As above
Legal issues: Credit and debt
Evaluation: Impact evaluation nearly two years into the project. Included semi-structured interviews with 41 clients and 8 people from the target group (people eligible for, but not using, the service).

Author: Day, Collard & Davies, 2008
Title: Money advice outreach evaluation: the provider and partner perspectives
Country: England & Wales
Outreach location(s): As above
Legal issues: Credit and debt
Evaluation: Process evaluation over 18 months, 3-9 months after the start of service delivery. Semi-structured interviews with 5 key policy stakeholders, a telephone survey of the 22 outreach pilots, follow-up interviews with 30 project coordinators and partners, in-depth case studies of 8 pilot projects, and a dissemination (of interim evaluation report findings) and information exchange seminar. Thematic analysis of data. Collection and analysis of statistical monitoring information and demographics.

Author: Dimos, 2008
Title: Civil law ALS Outreach review
Country: NSW, Australia
Outreach location(s): Outreach to 7 Aboriginal Legal Service offices by Legal Aid
Legal issues: Civil law
Evaluation: Evaluation after 12 months of operation. Semi-structured interviews with staff and other stakeholders (not service users), service data and a literature review.

Author: Gillespie et al, 2007
Title: Money advice for vulnerable groups: final evaluation report
Country: Scotland
Outreach location(s): 11 outreach projects – surgeries, home and hospital visits, JCP offices, a community flat in the locality, CAB offices, learning disability and welfare rights centres, youth services
Legal issues: Credit and debt
Evaluation: A synthesis of evaluation reports on the 11 outreach projects, covering the 2 years and 5 months development and life of the projects. Methods included 111 interviews with service users and 55 with staff, focus groups and project data analysis, with follow-up interviews. Consultation with service users and staff.

Author: Goldie, 2003
Title: PILCH Homeless Persons’ Legal Clinic: evaluation report
Country: Victoria, Australia
Outreach location(s): 6 homeless persons’ services
Legal issues: Civil law
Evaluation: Evaluation after 18 months of operation. Semi-structured interviews with 10 clients and with 40 staff and host agency representatives, document/file analysis, case studies and costings.

Author: Hartlepool New Deals for Communities, 2004
Title: Evaluation report on the Money Advice and Debt Counselling Service Project
Country: England
Outreach location(s): Based at a CAB office, but includes home visits and outreach services at other venues
Legal issues: Credit and debt
Evaluation: Evaluation after nearly 3 years of operation. Semi-structured interviews with 7 stakeholders and 6 clients, observations, a focus group, a survey of service users, and analysis of file data, monitoring information and other data.

Author: Kilner, 2007
Title: Housing Legal Clinic final evaluation report
Country: South Australia
Outreach location(s): 4 homeless persons’ services
Legal issues: Civil law
Evaluation: Evaluation at the start of the project and over its first 18 months, against outcomes, KPIs and the service agreement. Interviews with the project coordinator, managers and pro bono partners; a postal survey of 80 clients; in-depth case studies of 20 clients; workshops with clients from two clinics; surveys of and workshops with volunteer lawyers; a postal survey of other stakeholders; and analysis of program data and statistics.

Author: Sherr, 2002
Title: A stitch in time: accessing and funding welfare rights through Health Service Primary Care. An evaluation of primary care based specialist welfare rights advice provision in Lambeth, Southwark and Lewisham
Country: England (3 London boroughs)
Outreach location(s): Approx. 80 doctors’ surgeries and health care services
Legal issues: Welfare entitlements, credit and debt
Evaluation: Evaluation after 15 months of operation. Questionnaires completed by 79 host agency staff and 153 practitioners. Focus groups of clients (1 group) and other ‘target’ users of the health service (2 groups). Compared surgeries with an advice clinic to some without.

Author: Smith & Patel, 2008
Title: Money advice outreach evaluation: cost and effectiveness of the outreach pilots
Country: England & Wales
Outreach location(s): 100+ outreach locations – welfare agencies, credit unions, housing offices, schools, community centres, family and children centres, Citizens Advice Bureaus (CABs) and prisons
Legal issues: Credit and debt
Evaluation: Analysis of 12 months of data commencing after most projects had been running for at least 6 months. Data included: monthly monitoring data from participating agencies; closed case data from 17 of the 22 agencies, including 4,885 client records; and records of 90,560 clients provided advice under mainstream Legal Services Commission contracts (comparison group).

Author: Westwood Spice, 2005
Title: Evaluation report: Homeless Persons’ Legal Service
Country: NSW, Australia
Outreach location(s): 6 homeless persons’ services
Legal issues: Civil law
Evaluation: Evaluation after just over a year of operation. Methods: program evaluation, face to face interviews, telephone interviews, written surveys, focus groups, and document analysis, including statistical reports. Consulted with clients, lawyers, welfare agency staff and other service providers.

Note: All of these studies reviewed face to face outreach legal services. See reference list for the full citation of each study.

The fact that studies only had to meet at least half of the criteria above to be included in our review reflects a number of issues. First, these criteria were designed to assess qualitative studies reported in the academic literature in the health field. Nearly all of the evaluation reports and studies we located were mixed method studies written for policy makers in the legal sector, rather than for an academic audience. So, for instance, when assessing whether there was congruity between the stated philosophical perspective and the research methodology (criterion one, above), some papers stated no philosophical perspective, and we therefore could not make this judgement. However, the number of criteria accepted for inclusion also reflects the paucity of rigorous research on outreach legal services, particularly in Australia. While we are confident that the studies included in the review represent the best available evidence on outreach legal services to disadvantaged people, the scale and quality of the studies we included did vary.

Some of the studies included in the review were evaluation reports on single services, while other studies reviewed outreach programs over a range of sites, with a variety of client groups. It should be noted that four studies (Buck et al, 2007; Day, Collard & Hay, 2008; Day, Collard & Davies, 2008; Smith & Patel, 2008) were part of an extensive and well funded series of evaluation research projects which separately evaluated different aspects of the same very large scale program of outreach services across the UK. We made the decision to include all four, as the studies were discrete and examined quite different aspects of this program of services.

Smith and Patel (2008) is a cost-effectiveness study, drawing on economic and administrative data. We therefore also assessed the quality of this study against the JBI criteria for economic data (see http://). Smith and Patel (2008) met the key criteria in both sets.

4. Data extraction and synthesis

Once the eleven studies had been selected for inclusion in our systematic review, we undertook the following data analysis to produce the findings presented in this report. First, each researcher independently identified and extracted the key findings nominated by the studies’ authors, together with the evidence supporting those findings. Key findings are conclusions reached and reported by the author, derived from a thematic analysis of the data. Evidence is the data from which the findings are derived. A quote or data item from the report was recorded against each finding to illustrate the best evidence presented in support of that finding. The evidence for each finding was ranked according to three categories: unequivocal, credible or not supported.

Any findings which were not supported by evidence were excluded from the review. Findings ranked only as credible were used solely in conjunction with unequivocal findings.

We then grouped the findings from all the studies into thematically similar categories. The next step was to synthesise these categories into broader ‘synthesised findings’, which could then be used to derive best practice principles. The data extraction and synthesis were undertaken using the Qualitative Assessment and Review Instrument (QARI), a standardised JBI data extraction software tool for qualitative data (see ).

A total of 226 findings were extracted from these eleven studies, and then grouped into 53 categories. A ‘thematic’ meta-synthesis, based on the amalgamation of related categories, was then used to produce 19 synthesised findings (see Table A.1 in Appendix 1 for a list).
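The extraction-to-synthesis pipeline described above (individual findings grouped into thematically similar categories, and related categories amalgamated into broader synthesised findings) can be illustrated with a minimal sketch. All finding, category and theme names below are hypothetical, and in the review itself each step was an analytic judgement performed with the QARI tool, not an automated process:

```python
from collections import defaultdict

# Illustrative sketch only: every name below is hypothetical, and the
# actual grouping and synthesis were analytic judgements made in QARI.
extracted_findings = [
    ("Clients valued informal, familiar settings", "trusted locations"),
    ("Host agency referrals increased uptake", "partnership with hosts"),
    ("Co-location with welfare services aided referral", "partnership with hosts"),
]

# Step 1: group individual findings into thematically similar categories.
categories = defaultdict(list)
for finding, category in extracted_findings:
    categories[category].append(finding)

# Step 2: amalgamate related categories into a broader synthesised finding.
synthesised = {
    "Outreach works best in trusted, well-connected host settings":
        ["trusted locations", "partnership with hosts"],
}

print(f"{len(extracted_findings)} findings -> {len(categories)} categories "
      f"-> {len(synthesised)} synthesised finding(s)")
```

In the review itself the same funnel ran at much larger scale: 226 findings into 53 categories into 19 synthesised findings.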

It should be noted that only one study specifically examined the cost-effectiveness of outreach advice services compared to other models of service delivery. While we have included findings from this study in this review, we also highlight cost-effectiveness as a gap in the research, particularly in Australia.