
Text-based Responses? The Jury is Still Out.


Note: A special thank you to the third-grade teachers who helped us with our little experiment!

This week we have looked closely at the EBSR, PARCC’s two-part Evidence-Based Selected Response prompts, for third-grade ELA. The exploration has been interesting, to say the least!

Two days ago, after Kim’s fourth-grade Nathaniel (Nathan) and Jan’s fourth-grade Nathaniel (Natie) read the passage How Animals Live, we watched as they answered the prototype questions. We were struck by how little their success (Nathan) and failure (Natie) seemed to depend on the actual text. This made us wonder: does one have to read the text to answer the questions?

So we pulled in our other sons, and Jan recruited her husband. All three answered both questions correctly (separately) without reading the text. It seems that, by not reading the text, one might actually be better able to filter out the distractors among the possible answers. This only works if the test taker understands the relationship between main ideas and supporting details and knows something about how this testing format works. With these givens, however, looking for the pair of answers that work together across the two items seems to be an easier route to the correct answer than actually reading the text.
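For readers who like numbers, here is a back-of-the-envelope sketch (in Python) of why the pairing might reward test-wiseness. The option counts and the number of “coherent” main-idea/evidence pairs below are assumptions we made up for illustration; they are not PARCC’s actual item specifications.

    # A rough sketch of why paired items might help a test-wise guesser.
    # Option counts and the number of "coherent" pairs are assumed for
    # illustration, not taken from PARCC's actual item specifications.

    from fractions import Fraction

    PART_A_OPTIONS = 4  # assumed number of Part A (main idea) choices
    PART_B_OPTIONS = 6  # assumed number of Part B (evidence) choices

    # Blind guessing: every A/B combination is equally likely.
    blind_guess = Fraction(1, PART_A_OPTIONS * PART_B_OPTIONS)

    # Test-wise guessing: suppose only 3 of the 24 combinations read as a
    # plausible main-idea/supporting-detail pair, and the keyed answer is
    # among them. A guesser who considers only coherent pairs fares far better.
    COHERENT_PAIRS = 3  # assumed
    test_wise_guess = Fraction(1, COHERENT_PAIRS)

    print(f"Blind guess, both parts correct:     {blind_guess}")      # 1/24
    print(f"Coherence-based guess, both correct: {test_wise_guess}")  # 1/3

Under these made-up numbers, a student who never reads the passage but checks which answers hang together moves from a 1-in-24 chance to a 1-in-3 chance, which is the effect we suspect we saw with our impromptu test takers.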

These findings, though unscientific, are interesting, especially given David Coleman’s statement that 80% of the questions teachers ask can be answered without reading the text closely, an observation that appears to have led to Common Core Shift #4.

In support of Shift #4, PARCC’s explanation of how its assessments are aligned to the Common Core states:

“In regards to the ELA/Literacy assessments, this means PARCC will include:

  • Texts worth reading: The assessments will use authentic texts worthy of study instead of artificially produced or commissioned passages.
  • Questions worth answering: Sequences of questions that draw students into deeper encounters with texts will be the norm (as in an excellent classroom), rather than sets of random questions of varying quality.”

Of course, the two teenagers and one grown man in our “study” all read far above third-grade level. So, given our tiny sample size and the age difference between our test takers and the intended audience, what might our pseudo-results mean? Can third graders answer these questions without reading the text? We don’t know.

So we set out to gather more data. We contacted several third-grade teachers. We asked some of them to give their students the question prototype along with the passage; the remaining teachers administered the prototype without letting students read the passage. The results are interesting, even though we need more information to draw any kind of conclusion.

Results for 3rd Graders Who DID NOT Read the Passage

Yesterday, 43 third graders answered the ELA EBSR prototype for their grade level. Of those, none gave the correct answer for Part A and only one gave the correct answer for Part B, which means none of the students would receive any points for their answers. Once we get more details about the students’ actual answer combinations, we will have more insight into this result.

We are interested in whether students understood that there should be a connection between the two questions. Their three years of experience with standardized tests have taught them that multiple-choice questions function independently of one another. We asked the teacher to administer the “test” without any explanation. We are concerned that the results might indicate an issue with the test format rather than validly indicating that students have difficulty identifying main ideas and citing evidence to support their answers. Once we get a chance to look at the individual answer combinations, we hope to learn whether students were connecting the two answers or treating them as unrelated.

Results for 3rd Graders Who DID Read the Passage

As it turns out, however, to some extent we (you and us) have been testing this text-based question throughout this blog series, even when Nathan and Natie read the passage. Thank you to the two Burkins & Yaris readers who pointed out that there is actually a page missing from the passage! So even students who read the passage before answering the questions never really had a chance to read the complete text closely.

It seems that our earlier criticisms of the text may prove irrelevant. We hope so! We are optimistic that inserting the missing page will make the text much more cohesive, although it is hard to imagine what text could fill that missing page and tie together the pages we have!

So, as an indicator of whether or not reading the passage before taking the “test” makes a difference, our field test is completely invalid because the passage was incomplete. It seems noteworthy, however, that of the 72 third graders who read the incomplete version of How Animals Live, 21% got both Part A and Part B correct, despite the absence of the middle third of the text.

So, here are our accumulating concerns:

1. (from Tuesday’s post) The test makers don’t seem to understand text complexity. Please note: the confusing text was due in part to the missing page, which we did not notice until two of our readers pointed it out! Even with that page, we can’t imagine that the text fits our understanding of text complexity.

2. (from Wednesday’s post) We are skeptical about the validity of the test.

3. (from today’s post) The paired items may actually make the questions less text-based. Because the two questions depend on each other, the process of elimination may be easier. We suspect that students may be able to get more of these questions right without reading the passage than they could if each were a single test item.

 

 

