Reporting Requirements
Agenda
OJJDP programmatic requirements
CCUSA programmatic requirements
CCUSA financial reporting
DAP Implementation
Supporting Documents
Sub-Recipient Support
Questions
Objectives
Provide guidance on programmatic reporting requirements for the Catholic Charities National Mentoring Program
Introduce the DCTAT log-in
OJJDP Outcomes I
Performance Measures:
- % increase in the number of program mentors recruited
- % of program mentors successfully completing training
- % of trained program mentors with increased knowledge in program area
- Mentor retention rate

Applicants' Understanding:
- Funding is provided to expand mentoring services
- Before being assigned to youth, each mentor must undergo a criminal background check and training in youth development and mentoring
- Programs must demonstrate initial and ongoing training provided for mentors
- Youth have better outcomes when the mentoring relationship lasts at least 12 months or an entire school year

Data Collection Methods:
- Maintain a roster of mentors, including the date of recruitment and the completion of background checks and training; report baseline and growth periodically
- Record pre- and post-test scores for all mentor training
- Record the beginning and ending dates of each mentoring relationship
- Maintain a roster of participating youth; report baseline and growth periodically
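As a sketch of how the roster data above could feed these measures, the snippet below computes a percent increase in mentors recruited and a 12-month retention rate. The field names, sample records, and 365-day threshold are illustrative assumptions, not an OJJDP-prescribed format.

```python
from datetime import date

# Hypothetical mentor roster rows; field names are illustrative, not prescribed by OJJDP.
mentors = [
    {"recruited": date(2013, 1, 15), "match_start": date(2013, 2, 1), "match_end": date(2014, 3, 1)},
    {"recruited": date(2013, 7, 10), "match_start": date(2013, 8, 1), "match_end": date(2013, 12, 1)},
    {"recruited": date(2014, 1, 5),  "match_start": None,             "match_end": None},
]

def pct_increase(baseline: int, current: int) -> float:
    """Percent increase in mentors recruited relative to the baseline count."""
    return (current - baseline) / baseline * 100

def retention_rate(roster, min_days: int = 365) -> float:
    """Share of ended matches that lasted at least min_days (roughly 12 months)."""
    ended = [m for m in roster if m["match_start"] and m["match_end"]]
    retained = [m for m in ended if (m["match_end"] - m["match_start"]).days >= min_days]
    return len(retained) / len(ended) * 100

print(pct_increase(baseline=20, current=25))  # 25.0
print(retention_rate(mentors))                # 50.0
```

Keeping recruitment and match dates as real date fields (rather than free text) makes both the baseline/growth report and the retention calculation a one-line aggregation.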
OJJDP Outcomes II
Performance Measures:
- % of mentoring programs with active partners
- # of program youth served
- % of program youth successfully completing program requirements
- % of program youth who offend or reoffend
- % of program youth exhibiting desired change in targeted behaviors

Applicants' Understanding:
- Mentoring programs require active community partners committed to the program and offering resources to achieve optimal success. Programs will engage, retain, and document active partners and the services they provide to participating youth.
- Each youth enrolled in the program will undergo an intake/assessment process that results in a formal record. A service plan, developed with the participation of the youth, the mentor, and the youth's family, if appropriate, will document goals and objectives. Program staff will regularly record participation and progress toward goals, as well as setbacks, should they occur.

Data Collection Methods:
- Each agency will develop MOUs with community partners and will send a PDF of each to CCUSA
- Maintain a spreadsheet of partners' services and resources utilized
- Maintain electronic and/or hard-copy client charts and aggregate rosters
- Record the reason for termination for all youth and tabulate the % of successful completions
- Maintain a spreadsheet with a roster of participating youth, including main goals, milestones, and setbacks
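The successful-completion measure can be tabulated directly from the recorded termination reasons. The roster below and the reason labels are illustrative assumptions; only youth with a recorded termination reason count toward the rate.

```python
# Illustrative youth roster; "termination_reason" values are examples, not an official code list.
youth = [
    {"id": 1, "termination_reason": "completed"},
    {"id": 2, "termination_reason": "moved away"},
    {"id": 3, "termination_reason": "completed"},
    {"id": 4, "termination_reason": None},  # still enrolled, excluded from the rate
]

def completion_rate(roster) -> float:
    """% of exited youth whose recorded termination reason is successful completion."""
    exited = [y for y in roster if y["termination_reason"] is not None]
    completed = [y for y in exited if y["termination_reason"] == "completed"]
    return len(completed) / len(exited) * 100

print(round(completion_rate(youth), 1))  # 66.7
```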
Scroll down and click the plus sign next to "Juvenile Mentoring and ARRA Mentoring Grant Programs".
Open the Performance Measures Grid.
Under Training, read the FAQs and review the DCTAT Overview (go here first!).
Open the Data Collection Form: Juvenile Mentoring Sub-grantee.
11. Choose Hamilton Fish Institute or Other
12. If Other: Rhodes, J. E. (2005). A model of youth mentoring. In D. L. DuBois & M. J. Karcher (Eds.), Handbook of Youth Mentoring (p. 32). Thousand Oaks, CA: Sage Publications.
13. Model of youth mentoring (Rhodes, 2005)
Evidence-based programs and practices are those that have been shown, through rigorous evaluation and replication, to be effective at preventing or reducing juvenile delinquency or victimization, or related risk factors. Evidence-based programs or practices can come from many valid sources (e.g., Blueprints for Violence Prevention, OJJDP's Model Programs Guide). Evidence-based practices may also include practices adopted by agencies, organizations, or staff that are generally recognized as best practice, based on the research literature and/or the degree to which the practice rests on a clear, well-articulated theory or conceptual framework for delinquency or victimization prevention and/or intervention.
OJJDP Timeframes
Reporting Periods: January 1 to June 30 and July 1 to December 31
Reminders will be sent out:
- 15 days prior to the end of the reporting period
- At the end of the reporting period
- Five days prior to the data input deadline
- One day prior to the data input deadline
If you have not begun, please note that under the reporting period.
Please ensure that you are entering information into the correct quarter; if a mistake was made last quarter, please correct it.
MOUs are required for NEW relationships.
Reporting Deadlines
April 10, July 14, October 10, and January 10
- 15 days prior to the end of the reporting period
- At the end of the reporting period
- Five days prior to the data input deadline
- One day prior to the data input deadline
- Does not need signatures
- Approver name required
- Fill in the budget
- For most, if not all, agencies: ignore Column 5
- Training travel resolved
- For those who submitted after 5/13: pay Q3
Developmental Asset Profile Survey: Referencing Assets
Reference assets throughout the match
Design activities that directly impact assets
Explain how developing assets leads to program goals
Encourage real-life examples of utilizing assets
Developmental Asset Profile Survey: 2nd DAP Implementation
Post-mentoring DAP implementation
Second-to-last meeting
Administered with the mentor
Flexible administration
We will send the link for the 2nd DAP implementation in April 2014, or sooner by request.
Sub-Recipient Support
Support activities include, but are not limited to:
Reporting and reviewing of programmatic performance data and reports
Site visits and follow-up on policies and procedures
Inquiries and feedback regarding all aspects of program activities
Training and technical assistance
Questions?
Out of Office
June 22 to July 7