Centering Community
This legislation includes a funding formula that provides proportionally more support to communities with a high prevalence of HIV, as well as to high-poverty communities with relatively high housing costs.
Centering Community
This program requires grantees to collect data on demographic characteristics and to report on experiences, outcomes and impacts for groups that have been underserved. For example, the program requires plans to “gather and synthesize data that will be used to identify specific social determinants of health and risk/protective factors where inequities are leading to disproportionately high rates of sexual violence within populations and communities.”
Centering Community
This grant prioritizes applications based on the extent to which they will serve groups that have been underserved, have been subject to discrimination or have experienced unfavorable outcomes. The NOFO states: “Projects should demonstrate, to the extent possible, that outcomes should target at least 40 percent of benefits towards disadvantaged communities, including low-income communities, communities underserved by affordable transportation, or overburdened communities.”
This program also encourages authentic community engagement with communities affected by the grant activities, with the goal of shaping program goals and design. For example, the NOFO calls for “Public engagement activities, including community visioning or other place-based strategies for public input into project plans.”
Centering Community
This program awards points if “the application presents clear evidence of shared demographics or characteristics between organizational leadership, team members, and community members [...] in the defined service area.”
Define Desired Outcomes
The NOFO requires grant recipients to use specific outcome measures and provides detailed guidance on how to measure and report them. AmeriCorps requires different outcome measures in different focus areas.
Education measures include: number of children demonstrating gains in school readiness; number of students graduating from high school on time
Health measures include: number of individuals with improved access to medical care; number of individuals with improved health knowledge
Define Desired Outcomes
The NOFO requires grant recipients to track and report on outcomes established by the Workforce Innovation and Opportunity Act (WIOA). These include education and employment rates and median earnings following participation, attainment of credentials, gains in measurable skills and effectiveness in serving employers. The Department provides detailed guidance on how to measure and report these outcomes.
Prioritize Evidence
The program provides a competitive preference of up to 3 points (out of 105) for applications presenting evidence at the moderate level, based on reviews by the What Works Clearinghouse.
Prioritize Evidence
This program allocates up to 20 points (out of 100) for evidence presented in grant applications. Up to 12 points depend on the evidence tier (Strong, Moderate, Preliminary and Pre-Preliminary) determined by study design and findings. Up to 8 points depend on the quality of the evidence and the extent to which it supports the proposed program design.
Prioritize Evidence
This program includes a 10% set-aside for evidence-based interventions to address the needs of individuals with early serious mental illness.
Prioritize Evidence
This program is required by law to reserve 75% of grant funds for replicating evidence-based approaches, while 25% may be used for developing and testing innovative approaches.
Build Evidence Through Evaluations
HUD’s requirements for financial assistance awards include: “As a condition of the receipt of the award under a NOFO, the recipient is required to cooperate with all HUD staff, contractors, or designated grantees performing research or evaluation studies funded by HUD.”
Build Evidence Through Evaluations
The office provides technical assistance to grantees “to ensure their evaluations are designed and implemented to meet research quality standards. OPA offers evaluation TA through a variety of mechanisms including individual TA, group training, webinars, and written documents.”
Build Evidence Through Evaluations
The Buffering Toxic Stress Consortium included six research grantees partnering with Early Head Start service grantees. The grantees, in collaboration with federal staff, met regularly to share information and identified common measures for use in all their studies.
Implement Performance Management in Grants
The program’s NOFO requires grant recipients to “collect and report information on program implementation and program outcomes through a common set of performance measures.” Data elements include, among others, attendance, reach and dosage provided, as well as participants’ “characteristics, behaviors, program experiences, and perceptions of effects (through participant entry and exit surveys).”
Implement Performance Management in Grants
The program’s NOFO requires grant recipients to collect data demonstrating their ability to meet performance standards established in the program’s authorizing legislation and detailed in regulation. Grant recipients must collect and report these data through a federally sponsored management information system. Among other data elements, the system includes information about client characteristics, length of stay and the type of residence they move to when they leave the program.
Clearly Define Evidence
This grant program uses the following evidence tiers:
(i) Demonstrates a statistically significant effect on improving student outcomes or other relevant outcomes based on— (A) Strong evidence from at least one well-designed and well-implemented experimental study; (B) Moderate evidence from at least one well-designed and well-implemented quasi-experimental study; or (C) Promising evidence from at least one well-designed and well-implemented correlational study with statistical controls for selection bias; or
(ii)(A) Demonstrates a rationale based on high-quality research findings or positive evaluation that such activity, strategy, or intervention is likely to improve student outcomes or other relevant outcomes; and (B) Includes ongoing efforts to examine the effects of such activity, strategy, or intervention.
(Page 5)
Clearly Define Evidence
ARP guidance features three tiers of evidence:
Strong evidence requires “one or more well-designed and well-implemented experimental studies conducted on the proposed program with positive findings on one or more intended outcomes.”
Moderate evidence requires “one or more quasi-experimental studies with positive findings on one or more intended outcomes OR two or more non-experimental studies with positive findings on one or more intended outcomes.”
Preliminary evidence requires at least one non-experimental study.
Clearly Define Evidence
ESSA includes four levels of evidence, determined by study design, study results, negative findings from related studies, sample size and setting, and the match between the study’s population and setting and the population and setting for implementation.
Clearly Define Evidence
This program considers models to be evidence-based depending on the quality of studies and the breadth of impacts. Service models are considered evidence-based if “At least one high- or moderate-quality impact study of the model finds favorable, statistically significant impacts in two or more … outcome domains,” or “At least two high- or moderate-quality impact studies of the model … find one or more favorable, statistically significant impacts in the same domain.”