Advisory Committee on Before and After School Programs Meeting Minutes
June 13, 2007
Approve Minutes from the May 23, 2007 Meeting
Advisory Committee Chair Report - Assembly Bill 1685
Outcomes and Evaluation Subcommittee Report
Final Recommendations for the Independent Evaluation of Before and After School Programs – Action
Outcomes and Evaluation Subcommittee Report (continued)
Definition of "After School Teacher" - Action
CDE Staff Report
Regional System of Technical Support
Quality Self Assessment Tool Update
After School Program Update
After School Network Report
Workforce Development Subcommittee Report
Advisory Committee Future Meeting Schedule
Agenda Items for the Next Meeting
Frank Pisi, CDE staff to the Advisory Committee
Sandra McBrayer, Committee Chair, convened the June 13, 2007, Advisory Committee meeting at 1:00 p.m.
The Chair noted her appreciation of staff for providing additional information. Frank Pisi thanked Paulette Pacheco for her transcription services. The Chair asked the committee to review the minutes from the May 23, 2007 meeting. A motion to approve the minutes was made by Ana Campos, seconded by Steve Amick, and unanimously approved as submitted.
The Chair stated that she was happy to announce to the committee that Assembly Bill 1685 (AB 1685) passed out of the Senate Education Committee this morning. Because there have been no changes, once the bill gets off the Senate Floor it does not need to go back to the Assembly. There is no apparent reason for this bill not to be chaptered and signed in the next two weeks. The Chair also reported that, per the motion at the last meeting, a letter of support was prepared for the Packard Foundation regarding the After School Network grant application. Frank Pisi clarified that the letter has not yet been sent, pending the written proposal being submitted to the Packard Foundation by the After School Network; the letter of support will accompany the proposal. Please see the draft letter dated May 29, 2007, included with the meeting materials packet.
Renee Newton stated that the two reports being presented to the committee were the culmination of the year's work and effort, the Outcomes and Evaluation Subcommittee having convened for the first time on June 22, 2006. There have been eleven meetings and approximately 30 hours of testimony and subsequent discussion. The Subcommittee reviewed and heard testimony from over 15 individuals who not only provided direct testimony, but also supplemental materials in the form of literature, research articles, sample evaluation tools, and reports. Over the past year, the Subcommittee has focused on two separate endeavors: first, the independent evaluation, and second, the outcome measures. From the discussion there was consensus on 19 recommendations for the independent evaluation as well as 8 recommendations for the outcome measures. The last action taken by the subcommittee was to adjourn for the summer to allow the CDE-appointed workgroup to begin developing the tools and instruments that will be put into use with grantees, presumably this fall. The Subcommittee anticipates reconvening at the end of the summer or in early September to review the content of the workgroup's efforts. Ms. Newton announced that Frank Pisi would present a PowerPoint presentation on each set of recommendations.
Frank Pisi provided an overview and gave context to the mandates of the independent evaluation and the related responsibilities of the advisory committee. Mr. Pisi reviewed the legislatively mandated components of each independent evaluation (one for the high school program, the other for the elementary/middle school program).
The recommendations of the subcommittee, after hearing testimony for a year of monthly meetings, are as follows:
- CDE clearly states the purpose of the evaluation.
- Evaluator collects and reports information on the full range of program activities and characteristics.
- Evaluator collects and reports data on program administration and organizational structure.
- Evaluator collects and reports data on program staffing patterns, composition, pre-service, and ongoing staff development.
- Evaluator describes funding sources and levels in relation to the scope and frequency of services offered by programs.
- Evaluator reports on local partnerships.
- Evaluator collects and reports data on "customer" satisfaction.
- Evaluator reports on local evaluation practices.
- Evaluator makes data available to programs for purposes of continuous improvement.
- Evaluator provides stipends or incentives to local staff participating in evaluation activities.
- Evaluator collects and reports data on access and equity issues.
- Evaluator includes a blend of objective and anecdotal, or narrative data.
- Evaluator addresses issues related to sample size and validity.
- Evaluator adopts an operational definition of "comparison group."
- Evaluator uses technological or electronic means of data collection.
- Evaluator compares the dosage effect on students within a program.
- Evaluator includes a longitudinal analysis spanning multiple years.
- Evaluator collects and reports subgroup data.
- Evaluator collects data from youth participants on critical features of programs that support healthy youth development.
Mr. Pisi commented that customer satisfaction should go beyond asking whether participants and others are satisfied; it is just as important to know why someone is satisfied as why they are not. Frank outlined the next steps: secure approval from the full advisory committee and then present these recommendations to the CDE and to the State Board of Education at its September meeting. In September 2007, as an information item, the Advisory Committee's recommendations will be presented to the State Board along with the CDE's recommendations, as well as the research questions that the evaluator will be responding to. In November the State Board will take this up as an action item and approve the research questions. From that, the CDE can work with the evaluator to develop the design of the evaluation. Frank Pisi complimented the subcommittee members for their diligence and hard work on these recommendations. The Chair asked for questions from the committee. Al Cortes asked about the issue of evaluators looking into other streams of funding sources. Mr. Cortes stated he had mixed feelings about this topic, as he felt it was important to see how programs operate on state funding alone. Mr. Cortes would like to make sure that the evaluator makes a distinction as to how a provider runs the program on state funding alone. Carla Sanger stated that what an organization has by way of alternative funding doesn't matter unless that alternative funding goes directly to support these programs, and if so, in what way. She commented that, in her opinion, the collection of funding source data is useless. Ms.
Sanger stated that she didn't see the usefulness of Recommendation 5. The Chair commented that she thought the purpose of Recommendation 5 was to gain descriptive data so that programs may learn from one another: if we see what other people are doing with other resources and funding streams and using them effectively, how do we model after those programs? Gary Moody voiced the concern that some programs with very limited resources might be judged against programs that have many resources on the same plate. To that end, collecting descriptive data regarding a program's other funding sources might be beneficial. Steve Amick stated that the bottom line is that one element of the evaluation is finding out how people are meeting the required match. He suggested limiting the scope of Recommendation 5 to that. There is a good point to be made about the difference between programs that meet the match requirement in a minimal way and those that bring in hard cash match, and the difference between those programs. Michael Funk offered that it was his understanding that Recommendation 5 was concerned with looking only at match money. It wasn't worded that directly, but the wording refers to funding related to the scope of services offered by the program. If a particular program has a demonstrated scope and quality of service with a richness that really stands out, it is important to know that they achieved that because they went out and found match money that went directly to that program. It was decided to look at each recommendation in sequence:
There was no comment on Recommendations 1, 2, 3, 4, 6, 9, 11, 12, 14, 15, 18, and 19. Recommendation 7, customer satisfaction: acceptable as long as a broad definition of customer is used. Recommendation 8, reporting on programs' evaluation processes: Carla Sanger stated that intellectually this makes perfect sense, but as a practical matter it would not work. Organizations have a lot of content-area evaluation, and there may be issues with proprietary ownership of evaluation products. Ms. Sanger did not want to compel programs to turn all of their evaluation products over to the state. Frank Pisi clarified that this recommendation was not intended to result in what Ms. Sanger suggested, but rather to collect data on programs' evaluation practices. Ms. Sanger suggested that the Committee revise this recommendation to allow programs to voluntarily provide this information. The Chair stated that so far two changes were noted: the first is to change Recommendation 5 to look only at how programs are satisfying the match requirement, and the second is to include "may choose to" or similar language in Recommendation 8. Recommendation 10, providing stipends for evaluation participants: Ana Campos noted that this recommendation could be read two ways. One reading is that grantees are being asked to pay stipends to their staff for evaluative responses; the other is that the CDE will provide the grantee a stipend. Frank Pisi clarified that the recommendation is that the CDE will provide the stipend, but reminded the Committee that the CDE will present the committee's recommendations alongside its own; if there is an Advisory Committee recommendation that the CDE doesn't agree with, the CDE will articulate this to the State Board. Recommendation 13, issues related to sample size and validity: It was stated again that the issue here is the distinction between size and scope of programs throughout the state. There was discussion regarding including reference to sample size.
Frank Pisi stated that it is the CDE's intent to leave decisions regarding sample size and validity to the experts. Recommendations 16 and 17, dosage effect on students and the longitudinal nature of the study: Ana Campos voiced a concern about collecting data related to dosage when programs might not have the infrastructure in place. If programs are required to maintain records documenting the number of hours that students participate without building an infrastructure to support that, the data would be suspect. Frank Pisi clarified that this requirement (number of hours of participation per student) applies to high school programs only; elementary and middle school programs will not submit data in this way. The Chair stated that it makes sense that if the program design and requirements are different for high schools, their reporting requirements regarding attendance should match. The Chair recapped the changes, the first of which is to Recommendation 5, which will refer to the match requirement, not total funding. The second is to Recommendation 8, which will add language such as "may," "can," or "if willing," to make clear that participation is an option programs can choose. Frank Pisi asked to clarify the changes: the recommendation still recognizes that the evaluator will look at evaluation systems, and that programs may participate. The Chair confirmed that notion. The Recommendation 10 change will make it clearer that it is the CDE who should provide stipends or incentives to research participants. Recommendation 16 recognizes that dosage is measured differently between high school and K-8, in that K-8 uses days and high school uses hours per week. The Chair asked for any other changes. There being none, the Chair asked for a motion to accept this report for submission to the CDE. A motion was made to accept with revisions by Gary Moody, seconded by Amy Christianson, and unanimously approved by the committee.
Report of consultation with the CDE regarding outcome measures. Renee Newton explained that this report is being forwarded to the advisory committee for information. The Chair complimented the Outcomes and Evaluation Subcommittee on its enormous amount of hard work. Frank Pisi asked the committee to review the Executive Summary and gave an outline of the full document, which discusses each of the different measures representing the testimony received throughout the year. Frank explained to the committee that the first four outcome measures are for elementary and middle school programs; for high school programs there is a fifth measure they can also select from, which is performance on the CAHSEE and graduation rates. The important piece is that all grantees must demonstrate effectiveness based upon outcome measures. They can select one, some, or all of the measures. Based upon their choice, they must report the data relative to those outcome measures to the CDE. Failure to show results based on these outcome measures over several years could result in the grantee losing funding. However, the CDE may not base the decision to terminate a grant on only one outcome measure. Per Education Code, the CDE must develop, in consultation with the Outcomes and Evaluation Subcommittee, tools and procedures to collect outcome measure data. The Subcommittee has solicited and received testimony from numerous local, statewide, and national researchers, evaluation experts, and program practitioners in an effort to seek guidance on how to create meaningful statewide outcome measures. Everyone who provided testimony was asked the same three questions:
- What is your experience in evaluating programs based upon the measures outlined in law?
- Are there additional measures you would recommend?
- How can a tool or protocol be designed to measure the outcomes in question for a statewide effort?
Based upon the testimony and internal discussions, the CDE offered the following recommendations to the Subcommittee:
- Convene a workgroup of evaluators, practitioners, and CDE staff to develop the tools and procedures.
- Provide grantees with the ability to "reconfirm" their outcome measure selections and change them, if necessary.
- Provide grantees the ability to submit (with CDE approval) supplemental data (in addition to the mandated tools and procedures) on the outcome measures for consideration.
Based upon the testimony provided at the Subcommittee meetings, the following eight recommendations for the CDE as it develops its tools and procedures have been identified:
- Support the creation of a workgroup to develop tools and procedures. One or two Outcomes and Evaluation Subcommittee members should participate in this workgroup, as well as local practitioners and CDE staff.
- Standardize the tools and procedures as much as possible so that all programs are measured against a standard, not against each other.
- Consider target populations of after school programs.
- Consider the first year as a trial or pilot. This is to be cognizant of the fact that in the first year, the tools were not yet available and programs were struggling to get up and running.
- Develop tools and procedures that employ technology to facilitate ease of use.
- Provide technical assistance to grantees to ensure reliable use of tools and reliable results. Frank Pisi offered that it is really important that the CDE provide good technical assistance to the field practitioners and to the people who will be administering these tools and procedures.
- Adopt standard definitions of specific measures. If we are talking about positive behavioral changes, what does that mean? Many different recommendations have been heard from researchers, so a standard definition is needed.
- Support the ability of grantees to “reconfirm” their outcome measure selection.
There was discussion about the relationship between the independent evaluation and outcome measure collection. Carla Sanger asked: if the CDE is not interested in collecting local program evaluation data, which is correlated to nothing standard across the state for the outcome measures, why is it recommended to be collected as part of the independent statewide evaluation? Michael Funk commented that the earlier recommendation to look at local evaluation was part of the independent evaluation, which is different from this effort, which is about outcomes for program performance and compliance. The research experts had advised that if you are looking at outcomes and accountability, you cannot get there by looking at data that is not standardized across the system. This led the Subcommittee to look for a standardized set of tools and questions that allow as much flexibility as possible, but it could not recommend something totally flexible and localized. Carla expressed that she didn't understand why such a line of separation between outcome measures and evaluation was being made. In her estimation, it all goes to the same end: do these programs make a difference or not? Michael Funk responded that outcome measures are being collected for the purpose of determining whether or not a program is re-funded, not for the purpose of showing the Legislature how effective these programs can be; the latter is the role of the independent statewide evaluation. So, viewed in a narrow scope, these recommendations are only used to determine whether a grantee should get its money again. In that sense, the Subcommittee thought that keeping it simpler was the cleanest approach. Frank Pisi was asked to clarify the data that would be collected for the 2006-07 year. Frank replied that since the outcome measure tools have not yet been developed, the only thing that can be collected for 2006-07 is STAR data.
Frank was asked to discuss the timeline for when tools will actually be developed and disseminated to grantees. Frank responded that the plan is that once the CDE approves this document, the workgroup will be created. Members are being identified now; the workgroup will convene from July through September to use the research that has been collected, complete the work over the summer, and present it back to the subcommittee in late summer. Ana Campos asked the members to note that there should be consideration for year-round schools. Frank explained that the workgroup would develop the protocols and give guidance on how to roll them out to the field. It was reinforced to the group that these outcomes apply only to grantees that voluntarily selected one of them as the outcome by which to measure their program effectiveness. These are not evaluation outcomes that will be placed on all grantees across the board, only on those who have selected an outcome.
The Chair reported that this agenda item came from the Outcomes and Evaluation Subcommittee via a letter from Senator Torlakson to Renee Newton, Subcommittee Chair. Renee Newton reported that the subcommittee made a motion at its last meeting to forward its recommendation that the CDE honor the legislative intent of the definition of teacher as referenced in Education Code. The Chair informed the group that this issue is with specific language that came out of SB 638. In that bill, the term "after school teacher" was included as a possible report of student information to the CDE. In Education Code, the term "teacher" has a specific meaning: someone who is a certified, credentialed teacher. The letter from Senator Torlakson explained that it was his intent that the term be interpreted to include any after school staff, credentialed or not. CDE legal counsel has rendered an opinion that this would not be consistent with established practice. The Chair offered her opinion that what is written in law, not intent, is what the CDE has to follow. She offered that the Advisory Committee may put itself in a quagmire, because it could recommend that the CDE honor legislative intent, but the CDE has to go by what is written in law. The Chair recommended that the easiest solution would be to change the language to read "after school staff," and noted that this is an action item. Carla Sanger made a motion to recommend to Senator Torlakson that there be clean-up language in SB 638 to clarify "after school staff," not "after school teacher." The motion was seconded by Renee Newton and unanimously approved. The Chair then directed the CDE staff to prepare a letter to Senator Torlakson's office recommending that SB 638 carry language that would clean up this issue to mean after school staff.
The Chair recognized John Malloy and thanked him for his patience. John announced to the committee that he had three items to present and would entertain questions during the presentation.
The CDE is still working with the California County Superintendents Educational Service Association (CCSESA), both with its Curriculum and Instruction Steering Committee (CISC) and its Student Services Support Committee (SSSC), to further the development of the regional lead network. For the 2007-08 grant, the CDE anticipates that all of the county offices of education that currently act as regional leads will work with a regional advisory committee to develop technical assistance work plans that build capacity to support all programs in the region. At the last advisory committee meeting it was mentioned that most of the regions already have such committees in place. Most important, the CDE is exploring exactly how to increase accountability within the system, especially accountability as it pertains to services provided to programs. The CDE wants to know exactly what the system needs to look like to more fully address the needs of the people in the field. The Chair asked for confirmation that the CDE is looking at system funding but will not require that the regional lead county pass through a set amount of funding to other counties. John Malloy confirmed that for the 2007-08 fiscal year, that is correct. Decisions regarding how services would be provided in a given region would be left to regional advisory committees. The Chair stated that, because of this, regional advisory committees may look different across the state. A question was asked about who oversees the regional system as a whole. John informed the group that the CDE is responsible for this. While the CDE provides grants to eleven county offices, the responsibility is to provide services to all counties in the region. Frank Pisi added that this is consistent with other regional systems that the CDE administers. A grant is provided to one county in the region; for example, the Butte County Office of Education (BCOE) accepts the funds on behalf of Region 2.
The guidance from the CDE is to serve and provide technical assistance to all grantees and all counties. The Superintendent of the Butte County Office of Education designates a lead person for this project. As a matter of protocol, the CDE deals with that lead on day-to-day issues. If there is an issue with any particular regional lead, the CDE would talk to their supervisor. Ultimately, the CDE would talk as a grant manager to the Superintendent of the Butte County Office of Education, saying you have agreed to do certain things and they are not happening. Frank Pisi stated that the plans are required to be collaborative throughout all the counties, so there will actually be a sign-off by all the county superintendents. Another step is that funds will not continue to be granted until the work plan is approved and there is evidence that it is collaborative and that all counties will receive services. Once a plan is approved in that way, the funds can be released. Carla Sanger stated that, as a practical matter, and while she appreciates all the diligence on the work plan, what about user satisfaction? Is there any attention to this item? John Malloy answered that this is exactly what the CDE is trying to find out: looking at accountability from the perspective of the people in the field, whether that is the grantee, the grant manager, or the front-line staff; those are the questions being looked at specifically. It is a question the CDE does not yet know how to answer because it is a very complicated one. Staff are working diligently on figuring out exactly how to increase accountability from everyone's view. Ana Campos stated that the CDE must pay attention to making sure that what is asked of the lead counties is commensurate with the funding allocated to a region. Until regions know what their funding level will be, it will be impossible to plan adequately.
John was asked if regions would be required to complete their plans in the absence of knowing their funding level. He replied that they have to do preliminary work prior to the funding level being established, but they will be able to modify their work plans based on the funding allocated. Frank Pisi added that a portion of funding will go out soon to provide for a continuation of services. Al Cortes reiterated the concern that regional funding levels be commensurate with the distribution of programs. For example, a region may have 30% of the state's programs but receive only 18% of the regional funding. John Malloy stated that the CDE has worked with CCSESA and the other regional lead systems in the state and has taken their recommendations. The After School Programs Office is working diligently right now to come up with several funding scenarios to consider. It was stated that John Malloy was in a difficult spot reporting on activities under the development of the After School Programs Office, given that he is the administrator of the After School Policy and Evaluation Office. John replied that the two offices together constitute After School Partnerships. There may be questions that he cannot answer, but it is important to keep continuity of service. Carla Sanger stated that she wanted this on record: Is it by design or coincidence that Jane Ross never makes these meetings? John replied that working with the Advisory Committee is specifically under the purview of the After School Policy and Evaluation Office. Frank Pisi stated that he is employed in John's office, so it is John's responsibility to attend these meetings. The Chair stated that she hears from the committee a recommendation that, as discussion of the Regional System of Technical Support continues, Jane Ross come and address the work related to it.
The recommendation was supported, and it was noted that Jane Ross would be asked to address the committee. John Malloy continued his report: with reference to the Healthy Snacks Project and the Physical Activities Guidelines, applications to be considered for the advisory committees were sent out on May 25 and are due today (June 13). Participant requests were sent out through the regional leads, CASRC, the After School Network, and a variety of other partner-related distribution lists. Regarding the After School Demonstration Site program, applications will be released in mid-August, and the CDE is working with the Southwest Educational Development Laboratory (SEDL) on this project. Site visits of the finalists should be conducted in October and November 2007.
This tool is being developed by the CDE using research-based criteria identified by SEDL. It is just one part of the evaluation process; it does not provide the hard data that looks at outcomes, but it is a process evaluation. The tool is organized around nine core program areas:
- Program design
- Program environment
- Leadership management
- Linkages with the regular day
- Staff development
- Youth development
- Community partnerships
- Program finance
- Program accountability
John noted that the handout presented today was only a one-page draft representative of the elements to be included in the tool. The results of this tool will help the regional leads prioritize needs for technical assistance. The timeline is tentative and is tied to the After School Demonstration Sites; the CDE hopes to make the full tool available in the fall of 2007. Concern was raised about the four qualifiers of performance level: advanced, adequate, minimal, and inadequate. John replied that these were developed with SEDL and look much like the rubrics used when grants are actually read. John confirmed that a rubric will be included to assist in completing the tool, as "adequate outdoor space" is going to look a whole lot different at the 29th Street School than it does in Thermalito. Lou Fernandez clarified that the tool is basically for individual programs' use and is not a monitoring tool. John agreed and stated that it is a self-assessment and self-monitoring tool; it is not for auditing purposes. Amy Christianson asked if this tool would be used in unison with the site visit. John replied it could be. John was asked if this tool would be incorporated into Categorical Program Monitoring (CPM). John Malloy replied that it certainly would not. One should not confuse CPM with technical assistance. Everything on the CPM document references Education Code sections, etc. The Quality Self Assessment Tool is about assessing programs relative to what researchers have shown are indicators of a quality program.
Lou Fernandez made the point that if this tool is to be used to determine technical assistance needs, the CDE must ensure that regional leads and others will be able to provide the assistance. Michael Funk applauded the notion of self-assessment and any kind of tools that help support program reflection. He emphasized that sometimes reflecting on an area of a program that needs improvement does not necessarily trigger formal technical assistance. To that end, Michael raised the concern that the assessment tool feels a little institutional; this might cause programs not to fill it out honestly because they are not sure where its results will go. The language and performance-level indicators might not promote honesty: program coordinators at a site will wonder if they should check "inadequate" because they are not sure of the repercussions. Michael stated that he did not want to judge the whole document, as only a small slice of it was provided. Frank Pisi informed the committee that once the tool is developed, it will be put on the Web for public comment for at least two weeks. This will provide an opportunity for anybody and everybody to look at the tool, scrutinize it, and send their comments and recommendations back to the CDE. From that, the CDE intends to develop a real consensus document. If there is a recommendation that the document is too institutional and would not spur the user to answer honestly, there will be an opportunity to give the CDE specific recommendations. Ana Campos raised the concern that there has not been practitioner input in the development of this draft tool. While there will be an opportunity for public comment, it seems that the development is backward. Carla Sanger asked the committee to briefly return to a topic previously addressed.
It was stated earlier that applications for the Healthy Snacks and Physical Activity projects went out on May 25 and were due today. The Chair added that this arose from Ana's question about how the advisory committees are formed. At the last meeting, Frank Pisi reported that these committees had been formed and others were in process. Carla stated that she missed reading this in the previous meeting's minutes. Carla added that she had just talked to three big programs in Los Angeles, and not one of them had received that information from their regional leads. She took responsibility for not reading the minutes more carefully, but noted that the bigger issue is programs not receiving information from the regional leads. Continuing the conversation on the self assessment tool, Renee Newton asked John to invite Jane Ross to the next meeting to talk about the tool, as there are many questions that could not be answered here. The Chair summarized that what she was hearing across the board (speaking to John Malloy) was a recommendation that the CDE quickly engage the field as this moves forward, well prior to the tool going up on the Web, so that whoever is in charge will reach out and start engaging people, both advisory members and others, to get this moving.
John Malloy reported that staff sent notices to all 21st CCLC grantees who were not funded for part or all of their programs that the appeal date has been extended to June 18, 2007. A major project the CDE is engaging in is consolidating the multiple ASES grants. Given that ASES is now a continuous appropriation, there will no longer be cohorts. This cannot be done for 21st CCLC because it runs on cohorts and is a time-limited program. Steve Amick commented that unsuccessful 21st CCLC applicants were sent only a letter, without scores or comments; applicants had to contact their regional consultant to receive them. Had they received comments and scores in their packet, many applicants might not have filed an appeal, and others would have had more time to craft theirs. Steve suggested that the CDE consider including readers' comments and scores in the unsuccessful applicants' correspondence. Carla asked John about a situation where a school's demographics have changed since first receiving a grant: if a school's free and reduced lunch rate has improved, is its funding in jeopardy? John stated that there is nothing in Education Code that would cause the school to lose funding. Michael Funk voiced a concern regarding new 21st CCLC grantees that also received ASES Cohort 5 funding. To that issue, he asked that the CDE take a serious look at how attendance reporting will be handled next year. Because the federal funds are designed to supplement and not supplant state funding, he wants to ensure that there is a plan in place so that programs that are double funded exhaust their state funding before they start tapping into their federal funding, and that there is an attendance system in place that clearly instructs people to do it that way, so that all funds are appropriately accounted for.
John Malloy assured the committee that he will make sure to look at that, because it is an item the federal government will be coming to examine. He agreed that the CDE needs to make absolutely clear that this is handled. The Chair thanked John Malloy for his presentation.
Lynn DeLapp provided this report. She informed the Committee of new leadership coming on board, the success of the Network's quality assessment workshop, and a new research committee. Andee Press-Dawson, former executive director of Sacramento START and, before that, the founding director of Kids on Campus, will be joining the staff as Executive Director on August 1st. On another note, Lynn informed the Committee that the leadership team and executive committee have undergone some changes. The new co-chairs of the leadership team, who will start July 1st, are Steve Amick and Lindsay Callahan. Members of the Executive Committee are Jennifer Peck, Amy Sharf, Michael Funk, Frank Pisi, and, of course, the ex-officio members, Renee and the executive director. The next leadership team meeting is scheduled for September 27th in Southern California. Lynn expressed that all are very excited to now have a leadership team in place, and the new executive team will be in place very shortly to make sure that everything in the strategic plan gets underway. The Quality Assessment Workshop was a huge success. This was the inaugural public event for the After School Network. There were over 70 participants from the after school field and academia. Evaluations were outstanding, with 97 percent reporting that the presentations were relevant to their needs and 89 percent reporting that the information could be used right away. Resources from the Quality Assessment Workshop are available from the Network, and the video of the presentation will be posted tomorrow. More community-level forums will follow: the Network plans to go to local communities around the state to sponsor workshops that address local needs on research topics. Finally, the After School Network has established a new Research Committee, which will oversee field research. The Network wants to know what the after school field's concerns and needs are, and to gather its input on issues.
It will be capably co-chaired by Michael Funk. It should be a very exciting committee that will work with the university researchers and very strong field representation to find out more about what the field wants and make sure that good research gets out to them.
There being no questions, the Chair thanked Lynn for her presentation.
Frank Pisi reported that the committee has not met since the last Committee meeting and we are looking at a late July, possibly early August date. The Chair thanked Frank Pisi for his report.
The Chair noted that there is a recommendation that the committee go dark in July and August and reconvene in September. She reminded everyone that the original intent was to meet every other month; for a short time, until release of the Proposition 49 and 21st CCLC monies, meetings were held monthly. The proposal is to go back to every-other-month meetings, again until something requires more oversight. The Chair acknowledged that this membership is very busy with other committees and subcommittees, and that monthly meetings are also costing a lot of agencies a lot of money. She then opened the floor for suggestions. Al Cortes recommended going back to every other month; this will allow the subcommittees to meet during off months, and if something comes up, the monthly schedule can always be revisited. Amy stated she was in agreement with going dark in July and August. She expressed concern, however, over waiting all the way to September given everything that is coming out, such as tools development, and recommended that the committee meet again in late August. The Chair stated that one option, because Labor Day falls in the first week of September, is to meet during the second week of September. Ana Campos stated that because the whole regional lead system issue is very important, she was concerned about waiting until September, as the committee would not then have an opportunity for a real discussion; it would be one of those after-the-fact comment meetings again. She opposed waiting until September and proposed that the Committee meet again in August. The Chair stated that staff will look at available dates for this room (Conference Room 1101, CDE Headquarters) in mid-August. The committee really needs to use this room so that there is input from the public and adequate space.
The Chair reviewed the standing agenda items:
- Chair report
- CDE staff report
- Workforce Development Subcommittee report
- Outcomes and Evaluation Subcommittee report
- California Afterschool Network report
The following additional items were identified:
- Update on the Regional Technical Assistance System from Jane Ross
- Quality Assessment Tool
- Demonstration sites project
- Recommendations for the Independent Evaluation
- TA projects advisory committee update (Physical Activity Committee, Nutrition Subcommittee, and the eventual Youth Development subcommittees)
- Finance Project to present on the funding guide they've developed
- Results of the Statewide Technical Assistance Needs Assessment
Katie Brackenridge introduced herself as being with the Bay Area Partnership for Children and Youth. Katie commented on the discussion of whether STAR test scores are going to be used for accountability in the first year, before outcome measure tools become available. She encouraged the Advisory Committee to take a stance or make a recommendation to the CDE that STAR test scores not be used for accountability for programs that did not select them as outcome options. Frank Pisi clarified that STAR test data will be reported by everyone statewide, but will be used as an accountability measure only for those programs that selected it as an outcome. Katie acknowledged the clarification and withdrew her suggestion. The Chair asked for the record to show that the committee made a member of the public happy. Lindsay Callahan, Co-Chair of the Quality Committee of the After School Network, provided the next public comment. She reported that the Network Quality Committee is taking a look at the Quality Self Assessment Tool. In response to Ana Campos' earlier comment on the lack of practitioner input in the early development of the tool, she noted that some public comment is under way, especially with Frank Pisi's leadership. Ms. Callahan asked for clarity on the timing of the self assessment, particularly regarding the demonstration sites project and the need for public comment: if the applications are due in August, it is not clear whether the self assessment is part of the application or part of the review process. Frank Pisi commented that he will be sure this is addressed at the next Advisory Committee meeting. The Chair stated that, for the good of the order, the committee would reconvene in mid-August. The meeting was adjourned at 3:25 p.m.