Frequently Asked Questions: August 12, 2011

Questions and Answers related to the California English Language Development Test Request for Proposals, 2012-2015.
Section 3.0 – Scope of the Project

Section 4.0 – General Proposal Information

Section 5.0 – Proposal Specifications

Section 7.0 – Contract Terms and Requirements

Section 11.0 – Rating Criteria and Evaluation Forms


Section 3.0 – Scope of the Project

Section 3.1 (Task 1) – Project Management, Meetings, and Project Deliverables

  1. Page 8 – Section 3.1.A.1 – Overlap of Contracts and Continuity of Assessment: The last sentence on this page, "Camera-ready tests must be itemized separately in the Cost Proposal," refers to itemizing camera-ready tests in the cost proposal. To which year does this refer: the first year in which the successful bidder receives camera-ready tests, or the last year in which the successful bidder would be preparing them for the next contractor?

    The last year. See Errata #3, Section 3.3.A, Test Development Plan, page 24.

  2. Page 9 – Section 3.1.A.2 – Table 1. Major Deliverables by new CELDT Contractor by CELDT Edition: Training Workshop Materials for 2015–16 indicates that these will be prepared by the next contractor after the upcoming contract (RFP2) but that cell is not shaded. Please confirm that this is correct.

    Yes; all deliverables marked as RFP2 must be completed by the next contractor.

  3. The “RFP*” code in Table 1 under the “Training Workshops Materials” column for 2012–13 could be interpreted to mean that the camera-ready training materials will be provided by the current contractor. This contradicts section 3.5.B.1 that states the successful bidder will develop and deliver the training materials in 2012–13. Is it correct that the current contractor is not providing camera-ready materials and the successful bidder must develop the training materials for the 2012–13 training workshops?

    Yes; see Errata #3, Table 1, p. 9.

  4. Page 11 – Section 3.1.B – Orientation, Annual Planning, and Transition Meetings: Is it correct that there is no annual planning meeting in July of 2012?

    Yes; there will be only an orientation meeting in 2012.

  5. Page 12 – Section 3.1.F – Narrative Schedule and Timeline: To be sure that we understand and are correctly using the same terminology for schedule representations, could the CDE please further describe and/or provide examples of the timelines and detailed narrative schedule and explain how these two deliverables differ?

    The bidder needs to provide a chronological timeline with sufficient narration to address the requirements in Section 3.1.F. The narrative schedule and timeline are one and the same.

  6. Page 14 – Section 3.1.I – Project Deliverables and Final Document Specifications: Does the CDE own the http://www.celdt.org URL, and if so, will it and all of the code and content hosted there be provided to the successful bidder?

    No, CDE does not own the URL; it was established by the current contractor. The URL may be available to the successful bidder during the transition.

Section 3.2 (Task 2) – Item Development

  1. Page 15 – Section 3.2.A – Item Development Plan: Would the CDE please provide item maps for each domain and grade span for the 2011–12 edition, showing the item type, field test or operational designation, and score range for CR items? Something similar to the item maps in the 2009–10 technical report would be very helpful in understanding the number of each item type to be field tested each year.

    No, CDE cannot provide these. The 2009–10 technical report is the most recent report available.

  2. Since the total number of items to be field tested each year appears to be the same, is it the CDE’s intention that the 2011–12 structure for field testing be duplicated in each of the three following years under the new contract?

    Yes.

  3. Page 16 – Section 3.2.B – Item Specifications: Are there existing item specifications that will be provided to the successful bidder to maintain? If so, are they currently sufficiently detailed or would the successful bidder need to enhance them further?

    The successful bidder will receive a copy of the CDE-owned item database containing item specifications. Currently, that database does not contain the dimensions of knowledge and skills being assessed by each item; adding them would be an enhancement.

  4. Page 17 – Section 3.2.C.1a – Selection of Item Writers: With regard to the selection of item writers, will the CDE provide a list of subject matter experts they have worked with in the past, or is the successful bidder to recruit qualified item writers on their own?

    According to Section 3.2.C.1.a, the bidder must describe how item writers will be selected and trained.

  5. Page 17 – Section 3.2.C.2 – Item Writing Logistics: Would online (WebEx, etc.) item writer training be acceptable?

    No.

  6. Page 18 – Section 3.2.D – Item Inventories: Does CDE have a standard or preferred overage percentage to be used to determine item development needs (vis-à-vis the number of items required for field testing), or is the percentage of development overage to be suggested by the vendor?

    The bidder must make that decision considering attrition during CDE content reviews and bias and sensitivity reviews.

  7. Page 19 – Section 3.2.F – Released Test Questions: Is the entire 30% of operational items replaced each year released in the Released Test Questions (RTQ)?

    No.

  8. Is it correct that the successful bidder identifies items for the Released Test Questions document only in spring of 2014 and not for the other years?

    Yes. The successful bidder must identify items and provide sample student responses, so the CDE may produce an updated document.

  9. Will the successful bidder be responsible for providing the CDE with the sample student responses for the released items such as those that are present in the current January 2011 Released Test Questions document (e.g., p. 24)?

    Yes; see Errata #3, Section 3.2.F, Released Test Questions, page 19.

  10. Page 19 – Section 3.2.G – Item Database: The Suggested RFP Language for use of the APIP document states that, currently, the “required components of the APIP Item Standard include the provision of accessibility information for text only, graphic only, text and graphic, and non-visual audio representation of item content, and Braille representation of item content.” In addition, there are several optional components for item modifications. Please clarify which components the vendor should plan to implement.

    We do not know at this time; refer to Section 3.2, paragraph 2, p. 15.

  11. Is the vendor responsible for APIP compliance only for the newly developed items or is it CDE’s intent to make all existing items APIP compliant as well?

    The vendor is responsible for the newly developed items.

  12. Page 20 – Section 3.2.G.3 – Item Passages: Does the CDE intend to enter item passages into the CELDT item database or would this also be a responsibility of the successful bidder?

    It is the responsibility of the successful bidder.

  13. Pages 20-21 – Section 3.2.H – Field Testing: What is the CDE’s minimum required sample size for printing and distributing each field test form? 

    As described in Section 3.2.H, the bidder must describe statistically sound sampling strategies for field testing that will ensure appropriate distribution and data gathering among the state's diverse student population.

  14. What is the CDE’s minimum required sample size for scoring each field test item, by item type?

    The field test plan must allow for robust item performance statistics that are representative of students' different ability levels. (An illustrative sketch of such item statistics appears at the end of this section.)

  15. Are all annual testers in selected field test districts or schools required to take the assigned field test form or can participation be limited to a subset of those students?

    Participation may not be limited to a subset of students; all annual testers in the selected field test districts or schools must take the assigned field test form. School districts and schools are not allowed to opt out of field testing once the contractor determines their participation.

  16. How many items are administered operationally with each reading passage?

    As stated in 3.2.H, pages 20-21, a minimum of six items, approved by the CDE, must be field tested for each reading comprehension passage. The intent is to have no fewer than four items associated with each operational reading passage.

  17. Page 21 – Section 3.2.I – Internal and External Item Review: What is the current schedule for Content and Bias and Sensitivity Review Panel meetings each year?

    Due to budget cuts to the current contract, content and bias and sensitivity reviews are not currently being held.

  18. Page 22 – Section 3.2.I.2a – Content Review Panel: Please confirm that Content Review Panel meetings will occur in Fall 2012, 2013, and 2014 under this contract.

    Yes, these are the correct years. See Errata #3, Section 3.2.I.2a, Content Review Panel, page 22.
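
Note on item performance statistics (see questions 13 and 14 above): the RFP does not prescribe particular statistics or software for evaluating field test items. Purely as an illustration, the minimal sketch below computes two classical indices often examined for field-tested multiple-choice items, item difficulty (proportion correct) and a corrected item-total point-biserial correlation. The function name and data in the sketch are hypothetical.

    # Illustrative sketch only; not part of the RFP.
    # Assumes dichotomously scored (0/1) responses: rows = students, columns = items.
    import numpy as np

    def classical_item_statistics(responses):
        """Return per-item difficulty and corrected item-total point-biserial."""
        responses = np.asarray(responses, dtype=float)
        totals = responses.sum(axis=1)               # each student's total score
        difficulty = responses.mean(axis=0)          # proportion correct per item
        point_biserial = np.empty(responses.shape[1])
        for j in range(responses.shape[1]):
            rest = totals - responses[:, j]          # total score excluding item j
            point_biserial[j] = np.corrcoef(responses[:, j], rest)[0, 1]
        return difficulty, point_biserial

    # Example with simulated data: 500 students, 20 items.
    rng = np.random.default_rng(0)
    simulated = (rng.random((500, 20)) > 0.4).astype(int)
    p_values, correlations = classical_item_statistics(simulated)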

Section 3.3 (Task 3) – Test Development

  1. Page 24 – Section 3.3.A.1 – Overall Test Development: Are costs for the 2015–16 camera-ready CELDT edition to be included in the total base bid cost?

    Yes.

  2. Is it correct that the development of a camera-ready 2015–16 CELDT Edition should be included in the "TOTAL amount" referenced in section 5.5.A on page 82? If yes, what do you mean by "cost out separately from the overall contract cost"?

    The answer to the first question is yes.  For the second question, see Errata #3, Test Development Plan, page 24.

  3. Page 24 – Section 3.3.A.2 – Separating Kindergarten Tests: Should the retention of the vertical scales for the K–2 tests be part of the consideration in the redesign of grades 1 and 2 and in the separation of kindergarten for Reading and Writing?

    Yes.

  4. Page 25 – Section 3.3.B – Test Specifications: Are there existing test specifications to be maintained or do new ones need to be created?

    The successful bidder will receive a copy of the 2012–13 Edition test specifications from the current CELDT contractor and will be responsible for maintaining/creating the specifications for the upcoming editions.

  5. Regarding the second to last sentence of the paragraph, can we assume the schedule refers to 20 working days prior to the selection of items, not to the delivery of item writer training materials?

    No. See Errata #3, Test Specifications, page 25.

  6. Pages 25-26 – Section 3.3.C – Test Form Production: Does the CDE require that field test statistics be generated only from annual assessment students or can a mix of annual assessment and initial assessment students be used?

    The field test data is collected during the AA window from all test takers.

  7. Please provide page counts for each document to be produced, including Test Books, Answer Books, Administration Manuals and Scoring Guides, and Test Coordinator Manuals, for each grade span and form (1 versus 2-6).

    See the following table.
Test Materials: Number of Pages
(Counts apply to the 2011–12 Edition and do not include covers.)

Test Material                     K-1     Grade 2    Grades 3-5    Grades 6-8    Grades 9-12
Examiner’s Manual, Form 1         118     77         83            79            85
Examiner’s Manual, Forms 2-6      257     139        151           149           155
Answer Book, Form 1               22      40         6             6             6
Answer Book, Form 2               26      46         6             6             6
Answer Book, Form 3               22      42         6             6             6
Answer Book, Form 4               22      48         6             6             6
Answer Book, Form 5               22      46         6             6             6
Answer Book, Form 6               26      46         7             7             7
Test Book, Form 1                 n/a     n/a        28            31            30
Test Book, Form 2                 n/a     n/a        32            33            32
Test Book, Form 3                 n/a     n/a        23            31            30
Test Book, Form 4                 n/a     n/a        30            35            34
Test Book, Form 5                 n/a     n/a        30            33            34
Test Book, Form 6                 n/a     n/a        29            33            32
Test Coordinator’s Manual         39      39         39            39            39
  1. Which of the following three requirements regarding the development of Braille special test versions is correct?

The first bullet is correct. For the second bullet, see Errata #3, Test Form Production, page 26. For the third bullet, see Errata #3, Special Test Versions, page 28, Section 3.3.G.

  1. Page 27 – Section 3.3.E – Test Form Planners: The test form planners described here seem to be identical to the test specifications described on page 26 (prior to the bulleted list). Are these the same or how do they differ?

    Once a test form planner is completed, it will be identical to the test specifications.

  2. Page 28 – Section 3.3.G – Special Test Versions: The information in the first bullet conflicts with Table 3 on page 26, where special test versions are also indicated for 2015–2016. Which is correct?

    See Errata #3, Special Test Versions, page 28.

  3. Pages 28-29 – Sections 3.3.G & 3.3.I – Special Test Versions & Comparability of Test Forms: Have the special version items from the current CELDT item bank been calibrated and placed on the same scale? If so, has the scale been linked to that for the regular items? Will information about the current research design and method(s) be provided?

    The answer to the first question is yes, for the entire population of students. The answer to the second question is yes. The answer to the last question is yes; Section 2.5, p. 7, contains a link to the CELDT Web site, which includes a technical report. (For illustration, a common scale-linking approach is sketched at the end of this section.)

  4. Page 29 – Section 3.3.J – Test Booklet Production: Does the CDE prefer test booklets to be printed in black and white, or must they be printed in color?

    In color; see Section 3.3.K.1.b.

  5. CELDT Live! presentations posted on celdt.org indicate that K-1, grade 2, and K-2 test items and administration directions are contained in the Examiners Manual, and that the scannable Answer Book contains only spaces for recording students’ responses and/or teacher-assigned scores. Please confirm that this is correct and is desired for the next contract.

    Yes, that is correct.

  6. What is the average total order quantity as a percentage of the number of tests scored per grade span each year? Do many or most districts choose to reuse test books for grade spans 3-12?

    CDE does not have the total order quantity percentage. However, in 2010-11, there were approximately 1.6 million students tested (1.6M includes AA, IA, Unknown, and AA Outside the Window), and the bidder needs to propose sufficient quantities to assess all students. See the DataQuest Web site (link is on p. 33 of the RFP) or CBEDS for actual numbers. Table 5 on p. 33 provides test book order estimates that represent an upper end of possible orders if the number of ELs should increase in the future. Yes, many districts do re-use them.

  7. Page 30 – Section 3.3.K – Answer Books: Would it be possible for the CDE to provide prospective bidders with secure copies or access to a current set of test materials, including test books, answer books, and administration manuals?

    No, only the successful bidder may have access to these secure test materials.

  8. Page 30 – Section 3.3.K.1a – Demographic Data Fields: This section refers to a “pre-printed SSID.” Does this mean a pre-ID label or is the SSID information to be printed directly on the Answer Books?

    This refers to a Pre-ID barcode label.

  9. Page 30 – Section 3.3.K.1d – Space for Written Responses: How much space, as a percentage of the total page, is allocated to each 0-3 and 0-4 item in the writing section of the 3-12 Answer Books?

    Currently, a maximum of three "Sentences" items fit on one page. "Short Composition" items use a whole page.

  10. Page 30 – Section 3.3.K.2 – Annual Update: Please confirm that the dates in this section should be 2015–16 and March 31, 2015.

    Yes. See Errata #3, Annual Update, pages 30-31.

  1. Is the same Answer Book used for all forms 1-6 within a grade span or does each form have an Answer Book corresponding to the specific field test item configuration (MC, DCR, CR, etc.) within that form? Does this section refer only to the demographic information captured on each Answer Book within an administration year?

    Each form has an Answer Book: Form 1 is operational; Forms 2-6 contain field test items specific to each domain (Form 2-Listening; Form 3-Speaking; Forms 4 and 5-Reading; Form 6-Writing). As to your second question, the answer is no; the number of field test items and their location vary by domain.

  2. Page 31 – Section 3.3.L – Test Administration Manuals: Are the Examiner’s Manuals different for each grade span and form, or does one manual encompass multiple forms and/or grade spans?

    Currently, one examiner’s manual contains operational Form 1, and the other manual contains the field test Forms 2-6.

  3. Do the first two lines in Table 4 indicate the number of manuals to be included in each school’s materials? Do the following three lines indicate how many extra copies of each manual are to be included in the district overage? If so, is this the only quantity of manuals provided in the district overage or does it also include some percentage of the total number of manuals distributed to the school(s) in that district?

    The answer to the first question is yes.  The answer to the second question is yes.  The answer to the third question is yes; this is the only quantity of manuals to be provided per Table 4.

  4. Which test administration materials are considered non-secure, to be posted on the CELDT Web site?

    Only the Test Coordinator’s Manual is a non-secure material.
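
Note on comparability of test forms (see the question on Sections 3.3.G and 3.3.I above): the actual linking of special test versions to the regular scale is described in the CELDT technical report, and the RFP does not prescribe a method. Purely for illustration, one common approach, the mean/sigma method, computes a linear transformation from the difficulty parameters of items common to two calibrations,

    A = \frac{\sigma(b_{base})}{\sigma(b_{new})}, \qquad B = \mu(b_{base}) - A\,\mu(b_{new}),

and then places ability and item parameters from the new calibration onto the base scale with \theta_{base} = A\,\theta_{new} + B and b_{base} = A\,b_{new} + B.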

Section 3.4 (Task 4) – Test Administration

  1. Pages 32-33 – Section 3.4.B – Ordering of Test Materials: Please provide a breakdown of the percentage of AA and IA tests scored each month, on average.

    CDE does not have this data; however, the majority of students, approximately 1.3 million, are tested by the close of the AA window, October 31 of each year.

  2. Do the fixed costs to be reimbursed per this paragraph include things like materials composition, packaging, shipping, etc.?

    Yes, everything described in 3.4.B.1. on page 34.

  3. Page 34 – Section 3.4.B.2 – Web-based Ordering System: How many historical data file orders are placed each year?

    CDE does not keep this data; the contractor fills requests on a fee-for-service basis.

  4. Page 34 – Section 3.4.B.4 – Excessive Ordering Prevention: How much on average are districts charged in total each year for excess orders, excess materials, excess special test versions, and excess pickups?

    CDE does not keep this data.

  5. Page 35 – Section 3.4.C.1 – Collection and Handling of Secure Test Materials: On average, what percentage of all materials is returned to the contractor for secure destruction each year?

    In 2010–11, about 40% of materials were returned, about 10% were held by LEAs that did not respond to repeated inquiries, and the balance was destroyed locally (this includes LAUSD).

  6. Page 36 – Section 3.4.D.2a – Pre-ID Data Entry: Please clarify whether this means that actual answer documents must be “pre-bubbled” or whether this means all demographic data contained on the answer document must be able to be associated with the student at scanning time (for example, by use of barcode labels).

    LEAs that participate in Pre-ID receive individual student labels that can be adhered to a pre-determined area on the cover of the answer book. The label contains student demographic information encoded in the barcode.

  7. Page 36 – Section 3.4.D.2b – Accuracy of Pre-ID: Approximately how much in total do districts pay each year for pre-ID services?

    CDE does not keep this data. The contractor fills requests on a fee-for-service basis. The current contractor charges a set-up fee of $200 and approximately $0.40 per student label.
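
For illustration only, using the current contractor's Pre-ID fees quoted above (a new contractor's fee structure may differ), a district ordering labels for 2,500 students would pay approximately

    $200 + (2,500 × $0.40) = $200 + $1,000 = $1,200

per administration.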

Section 3.5 (Task 5) – Training Materials and Workshops

  1. Page 40 – Section 3.5.B.1a – Workshop Logistics: Please describe the current training materials (e.g., domains/components covered, minimum number of sample student responses for each domain/component, number of pages, number of sections, ancillary materials, etc.).

    Training Resources DVD includes:

Administration and Scoring Video DVD includes:

Training Binder includes:

  1. Page 40 – Section 3.5.B.1b – Sample Speaking and Writing Responses: Are these sample student responses the same as would appear in the Released Test Questions (RTQ)?

    No. The samples in the Released Test Questions are student responses to items that have been retired. The samples provided in the training are student responses for secure items that will be administered in the upcoming edition.  The sample responses provided during training are much larger in number. For Speaking, they are recorded audio answers and transcripts of those answers. For Writing, they are pictures (PDF files) of scanned student responses.

  2. Page 40 – Section 3.5.B.2 – Web-based Trainings: With regard to the Scoring Training Workshops and Web-Based Trainings, are the Web-based training sessions for CELDT coordinators to be live sessions (i.e., Webinars) or self-paced sessions that a participant could take at their convenience?

    Live sessions that are archived for up to one year.

  3. Page 40-41 – Section 3.5.C – Training Materials: Does the CDE provide CDs for administrators to administer the speaking tests or to train on scoring the speaking tests?

    No. Examiners read the directions and questions to students. The DVDs and CDs provided during training contain training videos and audio samples of student responses for use in calibration exercises.

  4. Do the required workshop training materials include a printable materials CD containing PDFs of Form 1 and all the Examiner’s Manuals? If so, how many are required and are they to be encrypted?

    The answer to the first question is yes. The answer to the second question is: approximately 1,500 (15 workshops × 100 participants), and yes, they must be encrypted.

Section 3.6 (Task 6) – Test Security

  1. Page 42 – Section 3.6.B.2 – Documentation of Security Measures: Please describe the contents of the monthly audit reports referred to in this section.

    See Section 3.1.G., pages 12-13, for specific content to be included in the monthly audit reports.

Section 3.7 (Task 7) – Test Scoring

  1. Page 44 – Section 3.7.A – Scoring Specifications Plan: This requirement implies that the stimuli will be read to students by the test administrators. In light of the variability that this introduces into the testing situation by individual administrators, what measures are currently implemented to reduce the potentially negative impact on score reliability and validity? Would CDE consider alternative options such as a CD-delivered listening section?

    The answer to the first question is:  test administration standardization is addressed during the training of examiners. Only trained examiners may administer the CELDT.  The answer to the second question is no.

  2. Page 44 – Section 3.7.A.2 – Scoring Student Results by Domain: Please provide the specific examiner’s instructions for recording the students’ speaking item scores and sending them to the contractor.

    Student responses for the Speaking items are not recorded by the examiners. Students are scored locally, in real time, by examiners. The examiner scores the response and enters the score on the Answer Book.

  3. Item maps in the 2009–10 technical report indicate that there are two constructed response (CR) items scored on a 0-3 scale in the K-1 reading assessment. Does the bidder’s commitment to score the reading domain and writing domain as stated in this paragraph refer to contractor handscoring for these K-1 reading CR items?

    No. The examiner enters all the scores for Reading on the Answer Book. In the case mentioned, the examiner counts how many letters the student has read correctly. Please refer to the CDE Web document Released Test Questions (PDF; 5MB).

  4. Page 46 – Section 3.7.C.2 – Web-based Electronic Local Scoring: Does the current local scoring tool meet all of the requirements specified in this section or are enhancements desired? Will documentation and source code be provided to the successful bidder?

    No; the RFP specifies that the bidder will develop and provide the electronic scoring tool.

  5. Is it correct that the electronic scoring tool needs to be available by July 1, when the new edition becomes available to LEAs, instead of June 1 as in the RFP?

    No, it is June 1 as written.

  6. Page 47 – Section 3.7.E – Scoring Speaking Constructed-Response Items: Since the speaking items in the field test and operational administration are not recorded and are scored by the test administrator, it appears that the only responses available for the range finding will be from the pilot testing. Is this a correct assumption? If so, since the size of the pilot sample is specified (Section 3.2.H) as only 5-10 students per item, is the expectation that the “full range of ability” will be obtained across multiple items rather than for each individual item?

    The answer to the first question is no; the successful bidder will have to arrange for special administrations of Speaking items with school districts in order to make audio recordings of sample student responses. The answer to the second question is: range finding must be supported with a variety of recorded and transcribed student responses for each item and each score point. These audio recordings are made in addition to pilot testing, although pilot testing responses are not precluded from becoming sample student responses if no changes are made to the item.

  7. How many student Speaking responses per grade span are typically produced from range-finding? How are spoken responses captured for range-finding review and how are they provided for local training and scoring purposes?

    Between 40 and 100 responses per item (e.g., Oral Vocabulary would need fewer responses; 4-Picture Narrative would need more). Sample student responses are collected during special Speaking administrations at local school districts. These responses are audio recorded and kept in a database where they are associated with each individual item. These recordings are later compiled and presented during a calibration exercise in the workshop. Workshop participants also receive a disk containing all sample responses.

  8. Page 48 – Section 3.7.F – Scoring Writing Constructed-Response Items: Please confirm that all of the Writing constructed responses will be scored locally. Then the responses will be shipped to the vendor. The vendor will score all of the Writing constructed responses for the purposes of reporting.

    The response to the first statement is: "Local scoring of writing constructed-response items is necessary for immediate instructional placement of students …" (Section 3.7.F, p. 48 of the RFP). The response to the second statement is yes; LEAs send all answer books to the CELDT contractor (see Section 3.3.J, last bullet at top of p. 30). The response to the third statement is yes; the successful bidder must score all writing constructed responses. "The successful bidder is responsible for processing all answer books submitted …" (Section 3.3.K, p. 30); also: “The proposal must indicate the bidder's commitment to score the reading domain and the writing domain and that the bidder is to produce the official listening, reading, and writing scores through its scoring procedures and then uses the speaking raw score provided by the LEA to produce the scale score" (Section 3.7.A.2, pp. 44-45).

Section 3.8 (Task 8) – Analysis of Test Results

  1. Page 49 – Section 3.8: Is it correct to assume that "June 30, 2013" should be "June 30, 2015"?

    Yes. See Errata #3, Analysis of Test Results, page 49.

  2. Page 49 – Section 3.8.A – Analysis Plan: What software is currently used for calibration, scaling, and equating, for differential item functioning analyses, and for other classical item analyses? Can any proprietary software be used to facilitate the process of item selection and form assembly in test development?

    Examples of current software are commercially available products such as Multilog, Parscale, and Polyequate. The answer to the second question is no; see Section 3.8.A, last sentence: "no proprietary software can be used for statistical analyses." (For illustration, the kind of model such calibration software fits is shown at the end of this section.)

  3. Page 50 – Section 3.8.E – Braille Versions: Is there a similar requirement for Large Print versions?

    All versions of the CELDT are developed based on the overall population of students.
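
Note on calibration software (see question 2 above): the RFP does not specify which item response theory model is to be used. Purely for illustration, programs such as Multilog and Parscale fit logistic IRT models; for a dichotomously scored multiple-choice item, the three-parameter logistic model takes the form

    P_i(\theta) = c_i + \frac{1 - c_i}{1 + e^{-1.7\,a_i(\theta - b_i)}},

where a_i, b_i, and c_i are the item's discrimination, difficulty, and pseudo-guessing parameters and \theta is the examinee's ability. Constructed-response items are typically calibrated with a polytomous model, such as a graded-response or partial-credit model.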

Section 3.9 (Task 9) – Reporting of Test Results

  1. Page 53 – Section 3.9.B.1 – Student Performance Level Report and Label: Are Student Performance Level Reports and Labels for annual assessments returned in August or September to be held until the end of the AA window and reported with all other AA results or are they to be reported each month?

    Monthly reports and labels are to be shipped within 6-8 weeks of receipt by the contractor.

  2. Page 53 – Section 3.9.B.1a – Design of Reports: Is it true that the successful bidder must provide two paper copies of the Student Performance Level Report for each student to LEAs: one for sending to parents and one for the student’s permanent record?

    Yes, two copies. See Errata #3, Design of Reports, page 53.

  3. Is the parent/guardian address required to be present on the Student Performance Level Report?

    See Errata #3, Design of Reports, page 53.

  4. Page 54 – Section 3.9.B.2a – Summary Results: When are the state level AA, IA, and combined aggregate results to be provided?

    AA:  files are to be provided approximately in mid-March for the May reports. IA and Combined:  files are to be provided approximately in mid-August for the October report.

  5. Pages 54-55 – Section 3.9.B.2b & 3.9.B.2c – Content of Student Performance Level Reports & LEA-level and School-level Reports: Which reports are to be provided in downloadable electronic format? Does this include Student Performance Level Reports?

    The LEA and school level reports are electronic, as specified in Sec. 3.9.B.2.c.  As to the second question: No, see Errata #3 LEA-level and School-level Reports, page 55.

  6. Page 55 – Section 3.9.B.2c – LEA-level and School-level Reports: Is it correct that the successful bidder needs to deliver the summary reports (the Performance Level Summary Report and the Roster) to the districts every six to eight weeks? If this is correct, can you verify that these reports would contain a summary of just the students scored within each monthly batch?

    The answer to the first question is yes.  For the second question, see Errata #3 LEA-level and School-level Reports, page 55.

  7. The production, printing, and shipping of LEA-level and school-level reports on a monthly basis seems to be a new requirement for this contract. If this is a new requirement, are these monthly summary reports in addition to or instead of the printed AA summary reports produced, printed, and shipped at the end of the AA testing window (post-DRM), and the IA and the AA/IA combined summary reports produced, printed, and shipped at the end of the IA testing window?

    No, this requirement is not new.

  8. Page 55 – Section 3.9.B.3 – Replacement of Student Performance Level Reports: What is the approximate total annual district billing for replacement reports?

    CDE does not keep this information; the contractor fills requests directly.

  9. Page 55 – Section 3.9.B.4 – Test Results Interpretation Guides: Will this requirement be adjusted when Grade K is separated from Grade 1?

    See Errata #3, Test Results Interpretation Guides, page 55.

  10. Please clarify which of the following three interpretations for the requirement related to the translations of the Test Results Interpretation Guides (TRIGs) into the top three languages is correct:

See Errata #3, Test Results Interpretation Guides, page 55, for clarification.

  1. Page 56 – Section 3.9.C.3 – Historical Data Files: What is the approximate total annual district billing for historical data files?

    CDE does not keep this information; the contractor fills requests on a fee-for-service basis.

  2. Page 57 – Section 3.9.E.1 – Annual Reporting: Is it correct that 3.9.E.1 requires the printing and distribution of annual summary-level reports for AA and AA/IA combined results to LEAs, in addition to the production and delivery to the CDE of the electronic data file for posting to the DataQuest Web page?

    No, this section refers to the state-level electronic data files.

Section 3.10 (Task 10) – Customer Support System

  1. Page 61 – Section 3.10.A – CELDT Support Center: Are employees of the customer support center allowed to service other customers during low-volume periods of CELDT customer support activity or is it the CDE’s intent to pay for 100% of these employees’ time year-round, including peak staffing, even during slack times?

    The intent is to provide dedicated customer support staff who are trained and knowledgeable enough to answer complex CELDT questions. There must be a dedicated CELDT toll-free number, fax number, and email address per Section 3.10.A (paragraph 2).

Section 3.11 (Task 11) – English-Only Study

  1. Pages 63-64 – Sections 3.11.A & 3.11.C – Sampling and Recruitment Plan & Data Analysis: To effectively replicate the 2010–2011 study and to compare the results between that study and the current one, it is important to obtain key information about the study design and research results from the 2010–2011 study at an early stage of research planning and preparation. Who will be responsible for providing such information? When and in what manner will the information be made available?

    See Errata #3, Test Materials and Data Collection, page 63.

  2. Page 63 – Section 3.11.C – Data Analysis: The "comparison between yearly gains" suggests that we would have test scores for students from more than one test administration. Is there an expectation that students will be tested multiple times in support of the study?

    See Errata #3, Test Materials and Data Collection, page 63.

Section 4.0 – General Proposal Information

  1. Pages 64-65, 86, and 89 – Sections 4.2 (Funding), 7.3 (Funding Contingency Clause), and 7.5 (Contracts Funded by the Federal Government): These paragraphs indicate that the Contract may be terminated or amended in the event of non-appropriation of funds. Can the Contractor assume that the CDE will notify the Contractor of such non-appropriation of funds in writing and that the Contractor will be paid for any acceptable materials and services delivered to the CDE up to the date of such notification?

    The contractor will be notified of the State's right to terminate in accordance with Section 7, Contract Terms and Requirements, provision 7.25, Right To Terminate. Additionally, the State will reimburse the Contractor for any acceptable materials and services delivered to the CDE up to the date of such notification.

Section 5.0 – Proposal Specifications

  1. Pages 74 and 90 – Sections 5.4.A (Cover Letter) and 7.7 (Ownership of Materials): It is our understanding that in the event that the Contractor uses or furnishes any pre-existing Contractor intellectual property (including revisions or derivative works) in performing its obligations under the Contract, all intellectual property rights in and to such Contractor Intellectual Property will remain the sole property of the Contractor and will only be licensed to CDE. All newly developed materials under the Contract will be the sole property of the CDE. Is this the CDE’s understanding?

    Section 5.4A.1. and Section 7.7 address different issues.

    Section 5.4A.1 addresses ownership of the particular copies of written and electronic materials that bidders and/or subcontractors will submit to CDE in response to the Request for Proposal. The intent is that bidders acknowledge that CDE has the right to the copies of materials that are submitted, that these copies become the property of CDE and may be used or disclosed by CDE, and that the materials submitted need not be returned to bidders or subcontractors, whether or not the bid is successful. 

    As to the issue of ownership of intellectual property rights, that is a separate issue addressed in Section 7.7. Section 7.7 states “… CDE acknowledges that any materials and proprietary computer programs previously developed by the contractor or its subcontractors (meaning prior to commencement of work conducted pursuant to a contract with CDE) shall belong to the contractor or its subcontractors.” Section 7.7 further states “All materials developed under the terms of this agreement are the property of CDE.” Thus, all materials developed pursuant to a contract with CDE, including any derivative works, will be solely the property of CDE and CDE reserves the exclusive right to copyright such materials as set forth in Section 7.7.

  2. Page 82 – Section 5.5.A – Cover Sheet: Please explain the CDE’s expectations as it relates to per-pupil rates, number of test takers stated in Section 2.3.B.3 (which does not appear to exist in the RFP), and fixed costs.

    See Errata #3, Cover Sheet, page 82.

Section 7.0 – Contract Terms and Requirements

  1. Page 90 – Section 7.7 – Ownership of Materials: IP Ownership provides that Contractor will retain ownership of previously developed materials or proprietary computer programs. May we presume that the CDE agrees that Contractor will also retain ownership of any derivative works or modifications to the previously developed materials or proprietary computer programs which may be made by the Contractor during the term of any contract awarded as a result of this RFP?

    As set forth in Section 7.7, any materials and proprietary computer programs previously developed by the contractor or its subcontractors (meaning prior to commencement of work conducted pursuant to a contract with CDE), shall belong to the contractor or its subcontractors. However, all materials developed under the terms of an agreement with CDE, including any derivative works, will be solely the property of CDE and CDE reserves the exclusive right to copyright such materials as set forth in Section 7.7. For example, if the contractor had previously developed a test question, but the contractor later modified that test question as a result of work performed pursuant to the contract, then CDE would have ownership of the modification but the contractor, or its subcontractor as appropriate, would continue to retain ownership rights to the original test question.  Thus, the contractor would need CDE’s permission to use the modified question for other purposes but would be free to use the original question for other purposes.

  2. Page 91 – Section 7.11 – IT Requirements: IT Requirements for Source Code:  Section 7.11 requires that the Contractor must provide Web application or Web site source code for contracts that require the Contractor to develop, modify or maintain a Web site. Section 7.7 on page 90 provides that proprietary computer programs previously developed by the Contractor will remain the property of the Contractor. Question: May the Contractor presume that Section 7.11 is not intended to require the Contractor to provide application source code if it utilizes a previously developed Web application, software or computer programs to provide services required by the RFP?

    If the contractor uses proprietary software as part of the Web site, the contractor may place the source code at a third-party source code repository instead of providing the code to CDE.  CDE will be given the address to the source code if the contract is terminated prior to the scheduled end of the contract.

Section 11.0 – Rating Criteria and Evaluation Forms

  1. Page 109 – Step I, Part 2, Section 3.2 (Task 2) – Item Development Evaluation: In Step 1, Part 2 of the Technical Evaluation, the final bullet refers to “two required updates” of the Released Test Questions document. This seems to conflict with the requirements in section 3.2.F, which requires one update. Is it correct that only one update is required?

    See Errata #3, Item Development, page 109.

  2. Page 112 – Step I, Part 2, Section 3.4 (Task 4) – Test Administration Evaluation: In Step 1, Part 2 of the Technical Evaluation, the final bullet refers to “the capability to provide LEAs with Historical Data Files.” Should this bullet be in the evaluation criteria for Task 9 instead of Task 4?

    See Errata #3, Test Administration, page 112.

Questions: CELDT Team | celdt@cde.ca.gov | 916-319-0784 