Cleveland Legal Aid’s Use of Outcomes

The Cleveland Legal Aid Society uses outcomes measures to more effectively and efficiently meet clients’ needs. Key components of its outcome measures system are highlighted in the following sections:

Why Cleveland uses outcomes measures

How Cleveland developed its outcomes data system

Major outcomes categories and metrics Cleveland uses

Systems Cleveland uses to collect, compile and analyze outcomes data

Examples of the ways Cleveland uses outcomes measures

Cleveland’s lessons learned about using outcomes data


Why Cleveland uses outcomes measures

Prior to 2007, the data Cleveland collected was limited to outputs data such as LSC Case Services Reporting (CSR) data, LSC other services (matters) data, and similar information required by specific funders. Very few funders (most notably United Way) required outcomes data at the time.  Cleveland did collect some outcomes data for its own purposes, but its value was limited: the data was not compiled in a consistent fashion and did not capture the most significant outcomes of the program’s client services.

In 2007, Cleveland’s management determined that the data it was collecting was inadequate because it did not allow the program to meaningfully analyze and assess the results of its work.  CSR case closing data and similar output information documented the volume and levels of the program’s work in different substantive law areas and its community engagement and partnerships, but could not show the ways in which, and the extent to which, that work benefited the client community.

These limitations led the program to develop a more comprehensive outcomes measures system.  The program did not replace CSR-type data and other outputs and descriptive data with outcomes data; instead, it uses the two in combination.

Cleveland uses outcomes data to better accomplish four fundamental, interrelated objectives.

Objective 1: Assess the program’s success in achieving its mission and strategic goals

Cleveland’s core mission is to secure justice and resolve fundamental problems for those who are low income and vulnerable by providing high quality legal services and working for systemic solutions.

Its strategic goals – the desired end results – for client services are to:

  • Improve Safety and Health;
  • Promote Education and Financial Stability;
  • Secure Decent, Affordable Housing; and
  • Ensure that the Justice System and Government Entities are Accountable and Accessible.

Cleveland determined that data about the results of its work – outcomes data – were essential to effectively assess whether and to what extent it was accomplishing its mission and goals.  Without outcomes data, the program could not determine how well its work was aligned with its mission and goals, nor could it identify how its work might need to change to stay most effectively focused on them.

Objective 2: Allocate resources and develop strategies

Cleveland also found that the outcomes data it collects gives the program a more complete understanding of the results and impact of its work, which, in turn, enables it to allocate its limited resources more efficiently.  For example, the program needed outcomes data to assess whether its allocation of resources among different substantive law areas, or among different regions within the service area, was appropriate.  Similarly, outcomes data were essential for assessing the effectiveness of advocacy strategies and for developing the strategies that would provide the greatest benefits to the client community.

Objective 3: Fundraising and “telling our story”

The program determined that CSR data and other output data did not demonstrate the value of its work or convince other potential funders that supporting civil legal aid is a good investment, especially when the program was competing with many other social services organizations for limited resources.  In fact, some funders had a “So what?” reaction to proposals with deliverables that were limited to CSR or similar outputs data.  Funders wanted to know what impact their investment would make in clients’ lives.

The use of CSR and other outputs data had similar limitations when the program tried to tell its story to the local media or community partners.  Anecdotes about the ways the program helped individual clients could be very compelling, but the program needed more comprehensive data to effectively demonstrate that the benefits for individual clients were examples of the broad benefits it provided the community.

Objective 4: Improve organizational performance and staff engagement

Cleveland staffers were skeptical when the program’s leadership first began discussing outcome measures.  They expressed concerns that the use of outcomes would divert resources from client services to bureaucratic record-keeping; drive the program to “cherry-pick” cases that were easy; have limited benefits; or be used as an inappropriate staff evaluation tool.

Program management successfully addressed these concerns in the development and implementation of the system.  (For example, the outcomes data is not used to evaluate individual employees’ performance.)

Staff have come to embrace and value the outcome measures system.


How Cleveland developed its outcomes measures system

Cleveland spent about a year developing its outcomes measures system.  It had to develop the system’s substantive and technical components and ensure that these components were effectively integrated.  The program recognized that the staff had to be integrally involved in the development process.  The board was also interested in the program developing more meaningful measurements of achievement, and supported the development of the new system.

The development process was not as linear as the description below implies.  Instead, it was an on-going iterative process in which proposals would be developed, analyzed by staff stakeholders, and then revised.

Outcomes are collected only for extended service cases.

Phase 1:  Articulate the Vision and Engage Staff

The program’s executive director and deputy director led and coordinated the effort. The first phase in the process was for management to articulate to the staff its vision for the system as clearly and as frequently as possible.  Management focused on how the system would enhance the program’s effectiveness in serving clients as well as how it would help staff identify and improve their own work.  Staff expressed their views about the potential system’s benefits and pitfalls and also suggested how the system could be most effectively developed and implemented.  The concerns and ideas of all program staff were essential to the successful development and implementation of the system.

Phase 2:  Identify Substantive Outcomes

After management set out its vision and actively solicited feedback and input from the staff, management turned to members of the program’s intake unit and four substantive law practice groups:

  • Consumer, including foreclosure (LSC CSR problem codes 01-09)
  • Housing, excluding foreclosure (problem codes 60-69)
  • Family (problem codes 30-39), and
  • Health, Education, Work, Income and Immigration (problem codes 11-19, 21-29, 41-49, 51-59, 71-79 and 81-89)

The practice groups were tasked with developing the substantive elements of the system.  Specifically, each group was charged with identifying the outcome measures and indicators it thought would best assess and demonstrate the benefits of its work to clients.

The practice groups developed their outcome measures using the same framework. Outcomes needed to be defined from the client’s perspective.  The key questions they asked were:

  • Why are we doing this work?
  • What do we want to accomplish in each substantive law area?
  • What data would we need (and can obtain) that would enable us to know what we were accomplishing?
  • What do we need to report to funders?

In addition, the outcomes needed to be based on the LSC CSR substantive issue categories (“problem codes”) – e.g. Consumer/Finance, Family, Health and Housing.  Outcome questions are specifically associated with each of these groups, so that the number of questions required for any case is only a small subset of the entire list of questions.

A major challenge was developing a list of outcomes (and associated indicators) that would not be too long and detailed, but at the same time would provide enough information to effectively assess and demonstrate the impact of the advocates’ work.  The outcome measures the groups ultimately developed are identified in the Major outcomes categories and metrics section below.

This process took about six months.

Phase 3:  Develop the Technical Requirements

The third phase of the process involved developing the technical requirements for collecting the outcomes measures and indicators.  The program’s technology staff led this effort. They had to develop the functionalities in the case management system (CMS) and other data systems that would: (a) enable advocates to easily and efficiently input the results of their case work; (b) provide for effective data compilation and analysis; and (c) produce the reports the program needed to assess and show the impact of its work to different audiences.  And, as indicated above, these functionalities needed to be based on and integrated with the LSC CSR problem codes.

The program’s CMS lacked the necessary functionalities.  Therefore, Cleveland staff worked with staff of the Ohio Legal Assistance Foundation to develop these functionalities.  These are discussed in the Systems to collect, compile and analyze outcomes data section below.

This process took about four months.

Phase 4:  Staff Training

The fourth phase of the process was to train staff on how to use the outcome measures and the CMS.  All staff using the system were trained.  The training was coordinated by management, IT staff and the practice group leaders.  Rather than formal training, small group conversations were conducted with members of different practice groups.  After an overview of the system functions and content, IT staff and managers engaged with staff to respond to questions, provide necessary guidance on specific issues, identify areas for additional system improvements and further assistance, etc.  An initial training period of about two months was followed by an ongoing process of assessment and adjustment of the system to enhance its effectiveness.

This process took about two months.



Major Outcomes Categories and Metrics

This section identifies the major outcomes categories Cleveland uses as well as the types of indicators and methods on which these measures are based.  The outcomes correspond to three of Cleveland’s strategic goals:

  1. Improving safety and health;
  2. Promoting education and economic stability; and
  3. Securing decent, affordable housing.

Cleveland is able to track its achievements related to these goals by aggregating (combining) the outcomes data from different substantive law case types.  See Cleveland’s Strategic Goal Outcomes for the list of outcomes that are combined to calculate the total outcomes for each of these goals.
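As a rough illustration of this roll-up, the sketch below aggregates “yes” outcomes from different case types under each strategic goal. The goal-to-outcome mapping and outcome names here are made up for illustration; Cleveland’s actual mapping is the one in its Strategic Goal Outcomes list.

```python
# Hypothetical mapping: which case-level outcomes (by substantive area)
# count toward each strategic goal. These entries are illustrative only.
GOAL_OUTCOMES = {
    "Improving safety and health": [
        ("family", "obtained protection from domestic violence"),
    ],
    "Securing decent, affordable housing": [
        ("housing", "prevented eviction"),
        ("consumer", "obtained/restored utilities"),
    ],
}

def goal_totals(closed_cases):
    """Aggregate 'yes' outcomes from different case types per goal."""
    totals = {goal: 0 for goal in GOAL_OUTCOMES}
    for area, answers in closed_cases:  # answers: outcome -> yes/no/n-a
        for goal, pairs in GOAL_OUTCOMES.items():
            for goal_area, outcome in pairs:
                if area == goal_area and answers.get(outcome) == "yes":
                    totals[goal] += 1
    return totals

cases = [
    ("housing", {"prevented eviction": "yes"}),
    ("consumer", {"obtained/restored utilities": "yes"}),
    ("family", {"obtained protection from domestic violence": "no"}),
]
print(goal_totals(cases))
# {'Improving safety and health': 0, 'Securing decent, affordable housing': 2}
```

The point of the design is that one strategic-goal total can draw on outcomes recorded in several different substantive law areas.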

There are two basic outcome categories:

Category 1:  Outcomes that describe the result of the case or matter

These outcomes are associated with particular CSR problem codes (e.g., consumer/finance, housing, family). For example, the outcomes for the consumer/finance area include, but are not limited to:

  • “obtained monetary claim,”
  • “reduced/avoided debt,”
  • “increased assets,” or
  • “obtained/restored utilities.”

Each of these potential outcomes requires the advocate to indicate “yes,” “no,” or “not applicable,” so Cleveland can track the percentage of success for each outcome indicator.  Requiring responses to every question also helps ensure data integrity: Cleveland Legal Aid does not have to wonder whether the advocate picked only the first outcomes on the list and failed to record the most important outcomes, or all of the outcomes achieved.
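A minimal sketch of how a success percentage can be computed from these yes/no/not-applicable responses, with “not applicable” answers excluded from the denominator. The function and sample data are illustrative, not Cleveland’s actual code.

```python
# Each closed case records "yes", "no", or "n/a" for every outcome
# indicator in its problem-code area; "n/a" answers are excluded from
# the denominator when computing the success rate.
def success_rate(responses):
    """Percentage of applicable cases in which the outcome was achieved."""
    applicable = [r for r in responses if r != "n/a"]
    if not applicable:
        return None  # the indicator never applied
    return 100.0 * applicable.count("yes") / len(applicable)

# e.g. "reduced/avoided debt" recorded across five closed consumer cases
rate = success_rate(["yes", "no", "yes", "n/a", "yes"])
print(rate)  # 75.0
```

Because every question must be answered, the denominator reflects every case where the indicator applied, not just the cases the advocate chose to record.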

View the complete lists of outcomes collected for each of the substantive law areas.

Category 2:  Financial outcomes secured for the client (where relevant)

These include data regarding changes in a client’s monthly income, assets, or debts that can be attributed to Cleveland’s work.  The two basic questions for including and calculating financial data are: “If Legal Aid had not been involved, what would the client’s [value/amount of income/asset/debt] have been at the time the case was closed?” and “What was the client’s [value/amount of income/asset/debt] at the time the case was closed?”
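The arithmetic implied by those two questions can be sketched as the difference between the actual closing value and the estimated counterfactual value. The helper below is a hypothetical illustration, not Cleveland’s actual calculation.

```python
# Sketch: the financial outcome attributed to a case is the closing
# value minus the estimated value had Legal Aid not been involved.
def attributed_change(without_legal_aid, at_closing):
    """Change in a client's income/assets/debt attributable to the case."""
    return at_closing - without_legal_aid

# Monthly income preserved: would have been $0 without representation,
# was $674 at closing -> +674/month attributed to the case.
print(attributed_change(0, 674))      # 674
# Debt: would have been $5,000, reduced to $1,200 -> -3800 (debt avoided).
print(attributed_change(5000, 1200))  # -3800
```

For income and assets a positive difference is a benefit; for debts the benefit shows up as a negative change (debt reduced or avoided).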

Here is the form an advocate completes at case closing, which shows the financial and non-financial outcomes for a consumer case:

CLAS Consumer Outcomes




Systems Cleveland uses to compile, analyze and report outcomes data

Cleveland LAS expanded the functionalities of its case management system (CMS) in order to compile, analyze, and report outcomes data effectively and efficiently.

In-house staff and staff of Ohio Legal Assistance Foundation (OLAF) made the necessary CMS changes.  The original development, testing, and refinements of the system took about four months.  Given the programming and other technology tools developed since then, this process would be much easier and quicker today.  (A skilled programmer could probably complete the requisite tasks in about two weeks.)

Functionalities that were added to the CMS include:

  • The ability for case handlers to record the appropriate outcomes data for closed cases. On the front end, case handlers see the questions they need to answer when they close the case.  Here is a screenshot with the questions for Consumer cases.
  • Fields that hold these answers and relate them to other CMS fields, so that outcomes can be linked to type of case, level of service, and other key fields.
  • The report formats that are most useful for staff, board, management and funders.
  • The Crystal Reports required to create those reports.
  • The Excel spreadsheets required to present the data in charts and graphs.

Cleveland uses Crystal Reports to compile data and generate a variety of reports with outcomes data.  A sample of these reports is highlighted below.

Some reports are automatically generated and routed to appropriate staff (e.g., executive director, practice group managers) on a monthly basis. The data in these reports can also be readily compiled for different time periods (e.g., quarterly, annually). Examples of these reports include:

Other reports combine different outcomes and other data. 

  • Reports with outcomes and other data corresponding to the program’s strategic goals. Here is a report which highlights data regarding the Health, Education, Work, Income and Immigration (HEWII) practice group: CLAS HEWII Health and Safety Strategic Goal
  • Reports comparing clients’ and advocates’ assessments of case results: Client-Advocate Outcomes Comparison
  • Reports with historical trend data showing the success rates for different case types and the amount of financial benefits provided to clients: CLAS Historical Trend Report
  • Funder reports that combine CSR-type data, outcomes data, and quantitative and descriptive data about the program’s outreach and other community work.  See an example: Cleveland Evidence of Success-St. Lukes.



Examples of the ways Cleveland uses outcomes measures to improve its services

There are many ways that Cleveland uses its outcome measures to enhance its organizational performance and client services.

  • Resource allocation. At the height of the foreclosure crisis, case outcomes data showed that a percentage of clients were unable to keep their homes; many clients would inevitably lose their homes because of their inadequate resources.  Data analysis indicated that unless a client’s income was at least 75% of the poverty line, the client would likely lose the home.  Therefore, the program decided to stop taking foreclosure cases where the client’s income was less than 75% of the poverty line.  Instead, the program concentrated its resources on helping those clients who would be able to keep their homes if they had the program’s assistance.  Cleveland LAS could not help all of the clients with incomes above 75% of the poverty line who needed the program’s help.  However, a high percentage (about 70%) of the clients the program did help were able to keep their homes.
  • Resource development. The use of outcomes data significantly increased Cleveland’s fundraising success. By using outcome data in their grant proposals and reports, Cleveland was able to translate the impact of its work into a language that funders understand. Most funders do not fund lawyers or access to justice. They fund programs that, for example, provide housing, stabilize families, or increase access to healthcare.  By combining outcomes data with CSR-type data and information regarding the program’s work with community groups, Cleveland was able to concretely demonstrate the impact legal advice and counsel could have on achieving those outcomes. View an example of a funder report that incorporates all of these data types: Cleveland’s Evidence of Success-St. Lukes.
  • Staff morale. Cleveland’s leadership did not initiate the use of outcomes to improve staff morale. However, that has been a very important and valuable unintended consequence.  The outcomes data profiled in the various reports generated by the program demonstrate to the staff the value and benefits of their work to clients.
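The 75%-of-poverty triage rule described in the resource-allocation bullet above can be expressed as a simple check. The poverty-line figures below are placeholders, not the HHS guidelines in effect at the time.

```python
# Illustrative: the 75%-of-poverty triage rule for accepting foreclosure
# cases. The poverty guideline varies by household size and year; the
# figures below are made-up placeholders, not actual HHS guidelines.
POVERTY_LINE = {1: 11_000, 2: 15_000, 3: 19_000, 4: 23_000}

def accept_foreclosure_case(annual_income, household_size):
    """Accept only if income is at least 75% of the poverty line, since
    the data showed clients below that level would likely lose their
    homes regardless of representation."""
    return annual_income >= 0.75 * POVERTY_LINE[household_size]

print(accept_foreclosure_case(18_000, 3))  # True  (18,000 >= 14,250)
print(accept_foreclosure_case(10_000, 3))  # False (10,000 < 14,250)
```

The rule is an example of outcomes data feeding back into intake criteria, not a general eligibility standard.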



Cleveland’s lessons learned about using outcomes data

Since it began using its outcomes system, Cleveland has learned a range of lessons about how outcome measures can be most useful.  The most significant are discussed below.

  • Many important aspects of the program’s work cannot be quantified. The program’s management and staff have concluded that critical aspects of its work cannot be meaningfully quantified.  These include the quality of an advocate’s casework; the impact of the program’s work on a client’s sense of well-being (e.g., from avoiding homelessness or escaping an abusive partner) or empowerment; the impact of the program’s work with community organizations; and whether and how its work increases the accountability and accessibility of the justice system and government entities to the low income population.
  • Every measure is imperfect. The program recognizes that the outcome measures it uses reflect assumptions and practices that can limit their validity and value. Questions include: Are they measuring the right things? Are the definitions of success appropriate? Are the measures based on the appropriate indicators? With regard to the financial outcomes, is the assessment of what the clients’ finances would have been without the program’s intervention valid? Is the method used to combine (“roll-up”) various outcomes to determine success in achieving strategic goals valid?  Are the advocates accurately reporting case outcomes?

The program addresses these issues by acknowledging these limitations, refining the measures to overcome them, and being transparent about them.

  • Consistency is important. Because there may be different legitimate answers to each outcome question, it is important that staff members use consistent definitions and assessments in responding.
  • Outcome measures are necessary, but not the only tool to assess effectiveness. Outcomes provide valuable insights about the impact and value of Cleveland’s work.  However, outcomes must be used in combination with a wide range of other assessment tools and data sets to effectively analyze and maximize the effectiveness of program operations and benefits for clients. These include CSR data; Census data and data from other federal and state agencies about the client population’s characteristics (e.g., geographic location, economic and demographic data, education and employment status); on-going engagement with client and community groups; surveys and interviews with staff and community members; and GPS or similar data to align services with need.
  • On-going evaluation and improvement of the system is essential. The program’s management and staff engage in a continuous assessment of the outcomes system to identify ways it is – and is not – working.   A key aspect of this process is to ensure that the system does not distort the program’s work, such as by driving it to “cherry-pick” cases that can be easily measured and result in high success rates.


The Virginia Model

LSC grantees in more than ten states use similar outcomes measures systems developed by Interest on Lawyers’ Trust Accounts (IOLTA) programs, bar foundations, or similar state funders (collectively, “IOLTA funders”).

These IOLTA funders require grantees to collect and report similar types of data, which include outcomes and outputs data as well as other quantitative, qualitative and narrative data.  These funders use these data as part of their grantmaking and grant oversight processes.

Grantees have participated in the design and implementation of these data collection and reporting systems to ensure that they were useful for grantees and not unduly burdensome.

Legal Services Corporation of Virginia (LSCV) implemented an outcomes reporting system in 1998.  All of its grantees – including six LSC grantees – now use this system.  See the sections below to learn how the LSCV data reporting system works and how one of these grantees, Blue Ridge Legal Services (BRLS), uses the system.

Why LSCV and BRLS use outcomes measures

How LSCV and Virginia’s legal services programs developed the LSCV outcomes data system

Major outcomes categories and metrics

Components of the outcomes reporting system

Examples of the ways BRLS and LSCV use outcomes measures

Lessons learned about using outcomes data in Virginia


Why LSCV and BRLS use outcomes measures

LSCV engaged BRLS and its other grantees in the development and implementation of its outcome measures system to ensure that the data collected was useful for both LSCV and the grantees.

Outcomes data are collected only for extended service cases (LSC CSR case closing codes F-L), not for limited services cases (LSC CSR case closing codes A and B).

Although LSCV does not make funding decisions based on outcomes data, those data have enabled LSCV to enhance its grantmaking decisions and grant oversight functions, better ensure grant funds are used effectively and efficiently, and successfully demonstrate the value and benefits of state funding of legal services.

For example, identifying and documenting the results of grantees’ work improves:

  • Grantees’ ability to formulate more concrete and measurable objectives in their funding proposals;
  • The capacity of LSCV and grantees to assess programs’ success in accomplishing these objectives;
  • Assessments of the alignment of grantees’ advocacy work and their objectives and priorities; and,
  • The identification of areas for program improvement.

LSCV also uses outcomes data to improve its ability to show the impact legal services programs have in the community and to demonstrate the value and cost-effectiveness of legal services programs’ work to the public and the Virginia General Assembly.  This is essential since the General Assembly funds legal services through filing fees and general appropriations.  LSCV has used outcomes data in its publications, including the 2012-2013 Report to the General Assembly and its recent report, Economic Impacts of Civil Legal Aid Organizations in Virginia.

BRLS has used the LSCV outcomes data system since it was first implemented in 1998. It has found that its outcome metrics are increasingly valuable for both external and internal purposes, because outcomes data enable the program to “tell the story” of the benefits of its work much more powerfully than outputs data alone.

BRLS reports that outcome measures help improve its client and program services by:

  • Helping it better align its advocacy work with its strategic goals and priorities;
  • Generating reports that show the accomplishments of the entire program, each office and each case handler;
  • Assessing the results of advocacy strategies and identifying approaches to improve the benefits the program provides clients and the client community; and
  • Informing the Board’s decision-making by profiling the focus and accomplishments of the program’s work.

More detailed information about the range of reports with outcomes data that LSCV and BRLS produce and assessments of the value of these data are in Components of the outcomes reporting system and Specific examples of the ways BRLS and LSCV use outcomes measures.

It should be emphasized that outcomes data alone do not meet LSCV and BRLS’ data needs. Both rely on the combination of a wide range of measures and data sets.  Both also use data such as CSR data and numbers-of-persons data; Census data and data from other federal and state agencies regarding the client population’s characteristics (e.g., geographic location, economic and demographic data, education and employment status); and financial and staffing data.  Equally important are BRLS’ on-going engagement with client and community groups, surveys and interviews with staff and community members, and GPS or similar data to align services with the needs of the client community.



How LSCV and Virginia’s legal aid programs developed the outcomes measures

LSCV partnered with its grantees to develop the LSCV outcomes system.  LSCV initiated what ultimately became a year-long process in 1997.  Throughout this process, LSCV and its grantees worked with the staff of The Resource for Great Programs, which facilitated the process and played the lead role in designing a system that LSCV and its grantees agreed would be most effective.  The Resource had previously worked with New York’s Interest on Lawyers’ Accounts (IOLA) program to develop an outcomes system for its grantees. At the same time that it was working with LSCV, it was working with the Maryland Legal Services Corporation (MLSC) to develop a similar outcomes system in that state. The Resource subsequently worked with other states to develop similar systems.

The development process began with a meeting of the LSCV executive director, the executive directors of Virginia’s legal aid programs and the director of The Resource. At the meeting, the group reviewed the outcome categories used in the NY IOLA system and discussed the benefits and shortcomings of using a similar system.  Over the next several months, Virginia’s program directors and LSCV staff developed suggestions for ways the NY IOLA system could be modified to best meet the needs of LSCV and the Virginia programs. LSCV then worked with The Resource staff to develop a system that incorporated these elements.

During this period, management of the individual programs engaged their staff in a process to identify how the system would be most useful and efficient.

In addition to developing the substantive content of the system (see Major outcomes categories and metrics and Components of the outcomes reporting system), the partners also had to identify the case management system (CMS) or other data systems that would be most effective in capturing the data.  About two years after the implementation of the outcomes system, LSCV purchased the same CMS for all of its grantees.  The CMS had all of the capacities needed for effective and efficient collection, analysis and reporting of outcomes data.



Major outcomes categories and metrics used by Virginia legal service programs

The LSCV outcomes data system used by Virginia’s legal aid programs has four major outcomes data categories.  Outcomes data are collected only for extended service cases (LSC CSR case closing codes F-L), not for limited services cases (LSC CSR case closing codes A and B).

Outcomes Category 1: Major Benefits from Direct Legal Representation of Individuals

The system collects outcomes for cases closed in twelve areas that correspond to the LSC CSR legal problem categories (e.g. consumer/finance, education, employment, family, housing).  The number of possible outcomes in these categories varies.

For example, the consumer/finance category has 20 different possible outcomes.  Two examples of the outcomes in this category are: “Obtained federal bankruptcy protection” and “Stopped or reduced debt collection activity.”  The family category has 32 possible outcomes.  Outcomes in this category include: “Obtained or maintained custody of children,” “Obtained protection from domestic violence,” and “Obtained, preserved, or increased child support.”  Information is also provided for the “Number of persons directly affected” for each of these outcomes. See a full list of the possible outcomes in each of the twelve categories.

Outcomes Category 2: Direct Dollar Benefits for Clients

These are for “affirmative dollar awards” to clients, not cost savings from judgments or payments avoided, which are reported separately (see the next category).  Affirmative benefits are reported for the amounts of “lump sum awards/settlements” and “monthly benefits.”  Specific financial benefits categories include, but are not limited to, “Social Security, SSI,” “Child Support,” and “Affirmative consumer judgments.”  In addition, the “Other Benefits” category can include “Insurance settlements” and “Housing allowances.”  See a list of the affirmative dollar award categories.

Outcomes Category 3: Dollar Savings for Clients

These outcomes are for the “amount of dollar savings achieved for clients through judgments or payments avoided,” such as “Defensive Consumer Law Matters” (e.g., bankruptcy, garnishment).  See a list of the dollar savings categories.

Outcomes Category 4: Benefits From Direct Legal Representation of Groups

Only one outcome is captured in this area:  “Obtained incorporation/tax exempt status.”  All other data collected in this area are for outputs, e.g., “Obtained assistance with regulatory/licensing issues,” “Obtained assistance with other structural or governance issues.”  See the list of data collected for group representation.

Blue Ridge Legal Services uses the LSCV outcomes data categories and supplements them with the results of its on-going surveys of clients served by the program.  It uses three separate surveys:

There are English and Spanish-language versions of all of these surveys.  The surveys are emailed to clients who provide an email address; others receive the survey via U.S. mail (with a BRLS-addressed, postage-paid envelope). Each survey is coded with the case number to match the client’s responses with other case data.
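A minimal sketch of how that case-number coding lets survey responses be matched back to case records. The field names are illustrative, not BRLS’ actual data layout.

```python
# Each returned survey is coded with its case number, so responses can
# be joined back to the case record. Field names are illustrative only.
def match_surveys(case_records, survey_responses):
    """Join survey responses to case data by case number."""
    by_case = {c["case_id"]: c for c in case_records}
    return [
        {**by_case[s["case_id"]], "survey": s["answers"]}
        for s in survey_responses if s["case_id"] in by_case
    ]

case_records = [
    {"case_id": "14-0007", "problem_code": "63", "outcome": "yes"},
]
survey_responses = [
    {"case_id": "14-0007", "answers": {"satisfied": "yes"}},
]
print(match_surveys(case_records, survey_responses))
```

The join is what makes comparisons such as client-versus-advocate assessments of the same case possible.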



Components of Virginia legal services programs’ outcomes reporting system

The Case Management System that LSCV purchased for its grantees provides three key functionalities. The CMS:

  • enables case handlers to input required outcomes data about the case at the time of closing;
  • generates basic reports for the grantees’ use; and
  • allows the grantee to seamlessly report the information to LSCV.

At case closing, BRLS advocates complete a case closing memo. Once advocates enter the outcomes and other data into the CMS, the program can seamlessly transfer the necessary data into an Excel file to submit outcomes and other data to LSCV, as well as “slice and dice” the data in a variety of ways to produce reports for its own purposes.
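A rough sketch of that export step using only the Python standard library. The column names are illustrative, not LSCV’s actual file layout.

```python
# Sketch: export closing-memo outcomes to a flat file for submission.
# Column names are illustrative only; the same rows can then be pivoted
# ("sliced and diced") by office, case type, etc. in a spreadsheet.
import csv
import io

def export_outcomes(case_rows):
    """Write one row per closed case in CSV form."""
    buf = io.StringIO()
    writer = csv.DictWriter(
        buf, fieldnames=["case_id", "problem_code", "office", "outcome"])
    writer.writeheader()
    writer.writerows(case_rows)
    return buf.getvalue()

rows = [
    {"case_id": "14-0001", "problem_code": "63",
     "office": "Harrisonburg", "outcome": "Prevented eviction"},
]
print(export_outcomes(rows))
```

Writing to a flat, column-per-field format is what makes the later pivoting by office or case type trivial in Excel.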

See examples of the reports that BRLS produces.



Specific examples of the ways BRLS and LSCV use outcome measures

BRLS’ Use of Outcomes Data

BRLS uses outcomes data – along with outputs and other data – to produce reports to assess and improve various aspects of its operations.  Examples of these reports are profiled below.

  • Reports are also generated for individual casehandlers. When BRLS first initiated the use of outcomes data, some staff were concerned that the data would be used inappropriately in performance evaluations.  However, staff now embrace the use of these data.  Having information that more fully shows the results of their work has increased their morale and has also led them to take on cases that provide the greatest benefits to clients.  They are not “cherry picking” cases that can yield high “success rates.” Instead, they are addressing more complicated issues and increasing the number of extended-service cases that have greater impact on the lives of their clients.
  • BRLS considers outcomes data critical to its fundraising success in its largely rural service area. There are nine relatively small United Ways in BRLS’ service area and a limited number of small private foundations.  These funders want to assess the impact their limited funding has on the community, not just outputs such as cases closed.  Outcomes data more fully demonstrate to funders how supporting legal services is a cost-effective investment. The program estimates that the use of outcomes data has significantly increased the funds it receives from the United Ways.  For example, in 2013, a United Way allocations panel awarded BRLS twice the amount it had requested because it was impressed by the outcomes BRLS was able to document.  For some funding requests, BRLS will go beyond the requirements of the application and include a supplement that shows more fully the impact of their work.
  • BRLS has also used outcomes data to help inform difficult decisions to adjust its allocation of resources to ensure the program’s advocacy provides the greatest benefits to its client community. For example, the analysis of case outcomes and information from other sources indicated that certain types of family law cases do not provide the same level of client benefits as cases in other substantive law areas. This led the program to reduce its volume of family law cases while increasing its volume of housing and consumer cases.

LSCV’s Use of Outcomes Data

LSCV does not make funding decisions based on outcomes data, but it does use these data along with a range of other data in desk reviews and associated grant oversight activities.  LSCV staff reports that outcomes data improve the ability of grantees and LSCV to assess a program’s success in accomplishing grant goals and to identify strategies to better achieve these goals.

The outcomes data LSCV collects from grantees are integral to LSCV’s Annual Reports to the VA General Assembly and provide the foundation for a comprehensive analysis of the Economic Impacts of Civil Legal Aid Organizations in Virginia.

LSCV cannot quantify how important its use of outcomes data has been to its success in sustaining and increasing these appropriations.  It believes that these data contribute greatly to LSCV’s efforts to make the “business case” for legal services – and may have increased bipartisan support for legal services. Even though it cannot affirm that outcomes data have been instrumental in sustaining support in the General Assembly, LSCV has concluded that it would be remiss if it did not marshal as much data as possible to demonstrate the need for and value of civil legal services for low income people.

Back to top


Lessons learned about using outcome data in Virginia

  • You get what you measure. The use of outcomes and other performance measures can have the unintended consequence of driving program services rather than providing a means of evaluating and improving program operations.  BRLS management recognized that the measures the program used could significantly shape staffers’ views of what is – and is not – important and influence the types of cases they took and the strategies they pursued, and BRLS’ executive director John Whitfield has analyzed different aspects of this issue.
  • Outcome measures are necessary, but not the only tools needed to assess effectiveness. Both BRLS and LSCV stress that programs need a wide range of data, not just outcomes data, to assess and enhance program operations and improve client services. Both use outputs data such as CSR data and numbers of persons served; Census data and data from other federal and state agencies regarding the client population’s characteristics (e.g., geographic location, economic and demographic data, education and employment status); and financial and staffing data.  BRLS also considers essential its on-going engagement with client and community groups; surveys and interviews with staff and community members; and GPS or similar data to align services with the needs of the client community.
  • The outcome data gap for limited services is substantial. BRLS is striving for better outcomes data for cases involving only advice and brief counsel.  Conducting the follow-up necessary to obtain sufficient data for these cases could require more time and resources than were needed to provide the services in the first place.  The program has sought to overcome this limitation by revising its client satisfaction surveys for advice and counsel clients to include:
    1. Whether the advice provided answered their questions;
    2. Whether they understood the advice provided them; and
    3. Whether it was helpful in resolving their legal problem.

The staff maintains these data in the CMS along with the other outcomes data, so that it can generate customized reports for a particular funding source. (See Major outcomes categories and metrics and Components of the Outcomes Reporting System.)

  • On-going evaluation and improvement of the system is essential. Since the system’s establishment, LSCV, BRLS and VA’s other legal services programs have engaged in a continuous assessment of the outcomes system to identify ways it is – and is not – working.  Based on its own needs and feedback from grantees, LSCV has “tweaked” its system nearly every year.  And BRLS has modified its reports and generated new reports to fit its needs.
  • Many important aspects of the program’s work cannot be quantified. As with other legal services programs, LSCV and BRLS recognize that critical components of a program’s work cannot be meaningfully quantified.  Limitations include: the quality of an advocate’s casework; the impact of the program’s work on a client’s sense of well-being (e.g., from avoiding homelessness, escaping from an abusive partner) or empowerment; and the impact of the program’s work with community organizations.
  • Every measure is imperfect. LSCV and BRLS recognize that the outcome measures they use reflect assumptions and practices that can limit their validity and value. Questions they ask when assessing what measures to use include:
    1. Are we measuring the right things?
    2. Are the definitions of success appropriate?
    3. Are the measures based on the appropriate indicators?
    4. Are the factors used to calculate the financial benefits valid and appropriate?
    5. Are the advocates accurately reporting case outcomes?

As a quality assurance mechanism, when supervising attorneys review the case file at case closing, they ensure that the case results information in the case closing memo is consistent with the case outcomes data entered into the CMS.
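This review step is, in effect, a consistency check between two records of the same case. A minimal sketch of such a check, with hypothetical field names and sample data:

```python
# Illustrative sketch of the closing-review consistency check; field names
# and records are hypothetical, not the actual CMS schema.
def find_mismatches(closing_memos, cms_records):
    """Return case IDs where the memo's outcome differs from the CMS entry."""
    cms_by_id = {r["case_id"]: r["outcome"] for r in cms_records}
    return [
        m["case_id"]
        for m in closing_memos
        if cms_by_id.get(m["case_id"]) != m["outcome"]
    ]

memos = [
    {"case_id": "2013-0104", "outcome": "Custody obtained"},
    {"case_id": "2013-0105", "outcome": "Garnishment stopped"},
]
cms = [
    {"case_id": "2013-0104", "outcome": "Custody obtained"},
    {"case_id": "2013-0105", "outcome": "Debt discharged"},  # data-entry discrepancy
]
```

Any case IDs the check returns would be exactly the discrepancies a supervising attorney resolves before the data are reported.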

LSCV and BRLS address these concerns through their on-going evaluation of the system and their recognition and transparency about methodologies used to calculate outcomes and the limitations of these methodologies.