

Program Evaluation

Document outcomes and program implementation, collect critical information and improve program design.

Organizations may require assistance with one or more of the services that comprise a comprehensive evaluation study; research firms may also contract for any of these services individually:
  • Instrument Development: Design and/or selection of evaluation instruments and protocols.
  • Data Collection: Collection of evaluation data through focus groups, online surveys, in-depth interviews, observations, document review, or mixed methods.
  • Data Analysis: Synthesis and analysis of evaluation data, qualitative and quantitative, as required.
  • Report Preparation: Compilation of supporting data and analysis resulting in relevant findings and an extensive report covering evaluation objectives.
  • Findings Interpretation: Explanation of evaluation findings and their implications for program design. This may include a series of meetings or workshops with high-level staff, funder meetings or presentations, etc.
 

Clients:

Work to date in this area includes the following:
  • AED: Various projects include:
    • Data analysis and report preparation for a Middle Start report, detailing 2007-08 student progress at several schools.
    • Case study data collection and analysis for six new small schools that received support from the Bill and Melinda Gates Foundation through an intermediary organization, culminating in an AED report, Small High Schools at Work: A Case Study of Six Gates-Funded Schools in NYC. The project aimed to help inform the Bill and Melinda Gates Foundation, the New York City Department of Education, and the public at large about the effects of small schools on students’ experiences and performance and about the practices that may differentiate these schools from other schools in the system.
  • Asia Society, Texas Center:  In October 2013, conducted a focus group of Advisory Board members to gather longstanding stakeholders’ views about mission, fundraising, programs, and marketing. Also designed, executed, and analyzed a comprehensive random-sample Member Survey to gather information to help improve programming and fine-tune support and to document participant attitudes and outcomes.

  • Communities in Schools Texas Joint Venture (CIS), Arlington, TX:  Two year-long formative and summative evaluations (2014-15 and 2015-16) of two CIS–ACE program cycles operating in a total of 20 schools in southeast Harris and Brazoria Counties. The 2016-17 evaluation is currently ongoing.

  • Exploring the Arts: Four year-long evaluation projects of the Film and Media Arts Program at Frank Sinatra School of the Arts (FSSA) in Queens, including:
    • In 2008-09, working with program stakeholders to investigate the process of delivering the first year of the film and media arts program developed by Educational Video Corp (EVC), measure specific indicators of progress toward student outcomes, and monitor planned program implementation. In addition, we looked closely at the challenges associated with implementing the EVC curriculum, the intended learning objectives themselves, and how the program fit within the larger culture of the school.
    • In 2009-10, the design and execution of an implementation and outcomes study of the second year of the Film and Media Arts program at FSSA.
    • In 2010-11, the ongoing management of internal evaluation efforts of the third year of the Film and Media Arts program at FSSA.
    • In 2011-12, the ongoing management of internal evaluation efforts of the fourth year of the Film and Media Arts program at FSSA; completion of the Four-Year Film and Media Arts Evaluation Findings Report; and refinement of comprehensive ETA evaluation instruments.
  • The Lower Eastside Girls Club: The design and execution of a case study on preliminary outcomes associated with participation in the organization's programs. The project aimed to achieve a brief immersion into the lives of the LESGC students, to gain insight into the issues that are important to them, and to ultimately understand how participation in the LESGC affects their lives.
  • MOUSE: Data collection and analysis for a series of focus groups sponsored by Best Buy in 2007.
  • PENCIL: A variety of surveys and case study work including:
    • The design, execution and analysis of a pre/post-survey of PENCIL participants (principals and volunteers) during the 2007-08 school year; survey results described the various projects undertaken by PENCIL partners and documented satisfaction rates of participating principals and volunteers.
    • In the spring of 2009, Carmody Consulting expanded upon the 2007-08 work and conducted two random-sample surveys (one for Principals and one for Business Partners) to gather information to help improve programming and fine-tune support and to document participant outcomes in the three “impact areas” in which PENCIL defines its partnership work. The surveys of randomly sampled participants generated statistically representative results, applicable to the entire PENCIL network from which they were sampled.
    • In the spring of 2009, Carmody Consulting designed and conducted case studies of four of PENCIL's partnerships to document how the success of these unique relationships affected student learning and school capacity building. The case study findings revealed that well-structured, long-standing partnerships appear to result in 1) positive student progress, both academically and in soft skills; 2) increased school capacity, particularly with regard to school branding, attendance, and parent involvement; and 3) improved organizational management and leadership. A separate, internal report detailed best practices and recommendations for PENCIL staff going forward.
  • Teaching Matters: Evaluation, survey and data analysis projects, including:
    • The design, execution and analysis of a post-survey of Teaching Matters participants (principals and teachers) in the spring of the 2009-10 school year; survey results described the perceived outcomes of Teaching Matters professional development interventions and documented satisfaction rates of participating principals and teachers.
    • The analysis of individual student Writing Matters rubric scores (pre and post) received from five schools for 2009-10 and 2010-11 across multiple grade levels.
    • The design, execution and analysis of a 2010-11 process evaluation to assess TMI staff utilization of the newly developed database (TMICONNECT) and to obtain formative feedback from participants and relevant stakeholders as the first full year of use unfolded.
    • The design, execution and analysis of a random-sample survey of Teaching Matters participating teachers in the spring of the 2010-11 and 2011-12 school years. The surveys sought to: 1) compare participants' perceptions of teaching and student learning at the beginning of the school year with their perceptions at the end of the school year; and 2) capture specific information about program participants' attitudes toward the TMI consultants deployed in their schools and toward TMI in general.