The bottom line is that these tests are abysmal

  • Leonie Haimson
    Message 1 of 15, Apr 1, 2014

      Reposted from here:

      http://testingtalk.org/response/book-1-nys-common-core-test-ela-8

      Book 1 — NYS Common Core Test — ELA 8

      • Author: anonymous, Teacher
      • State: NY
      • Test: State test: Pearson
      • Date: April 1 at 4:17 pm ET

      NYS Common Core Test — Day 1 — Book 1 — ELA 8 — April 1, 2014 — 42 repetitive multiple-choice questions …

      Here’s an example of a typical question:

      One question asked, “Which sentence best connects two central ideas of the article?” As adult readers, we do not give texts such scrutiny, especially if we’re reading a non-fictional text. We do not say to ourselves as we are reading: Hey, now there’s a sentence connecting two central ideas! Nor, as writers, do we say to ourselves as we’re writing: I will now use this specific sentence to connect two central ideas. We would never read a text in the ways that these multiple-choice questions are forcing us to read them.

      Another question asked: “What is the most likely reason for including information about the Smithsonian laboratory in Panama?” “To emphasize…,” “to illustrate…,” “to point out…,” “to provide…” I’m thinking: does it really matter? A student, in a non-testing context, could easily grasp why the author includes that information. He or she would use the information or simply move on. “Asking test-takers to respond to text passages with multiple-choice questions induces response processes that are strikingly different from those that respondents would draw on when reading in non-testing contexts” (from How Assessing Reading Comprehension with Multiple-Choice Questions Shapes the Construct: A Cognitive Processing Perspective by Rupp, Ferne & Choi).

      Why are the multiple-choice questions more difficult than the actual texts? Most of the texts do not warrant such nitpicky multiple-choice questions. We’re taking relatively easy concepts (main idea, evidence, etc.) and distorting these concepts through the assessment. It’s detrimental.

      Too many of the questions are overly concerned with HOW the Pearson-selected texts are written and structured and not at all concerned with content and comprehension. The argument will be that the questions are assessing close reading skills, but many would argue that this is NOT what they are doing.

      After reading all of the questions and looking at the possible answers, my colleagues and I simply look at each other and say, who really cares. I bet the authors of the texts would even scratch their heads as to why such questions were being posed to an 8th grader (or posed at all for that matter).

      In my classroom or out in the real world, we would be reading these texts for information, for understanding, reading to integrate, reading to develop an argument, reading for entertainment, etc. We wouldn’t necessarily be reading them to discover why the author uses the word “stimulate” in a non-fictional article. In class, if I ask my students what “stimulate” means in the context of the article we are reading, they would nail it, but the testing context is something altogether different, and the multiple-choice format is something altogether different.

      The same goes for reading to discover “which lines support the author’s claim” or “which lines develop a key concept” — In the real world, in my classroom, these would be straightforward tasks that my students could do with no problem, but the multiple-choice questions on these state exams turn the straightforward into a muddied mess.

      Overall: Book 1 is still a slog. Long passages. Many of the multiple-choice questions were quite involved, requiring students to flip back and forth a number of times and re-read multiple times. Again, the actual texts do not warrant such scrutiny. Three academically strong students didn’t finish and simply guessed on 10-12 questions.

      The bottom line is that these tests are abysmal. For book 1, why don’t they simply include one poem, one fictional text, and one non-fictional text with fifteen multiple-choice questions and be done with it? These multiple-choice assessments do not connect to the real world of reading, thinking, and writing. They’re simply not important.

      We need to move away from multiple-choice questions and away from these large test manufacturers, and bring in more local and regional assessments that incorporate a variety of assessment strategies: projects, portfolios, open-ended questions, writing, short-answer, artistic expression, etc.

    • nealhugh17
      Message 2 of 15, Apr 1, 2014
        Many thanks!!! Great info!!! Neal

        Sent from my BlackBerry 10 smartphone.
        From: Leonie Haimson
        Sent: Tuesday, April 1, 2014 5:13 PM
        To: nyceducationnews@yahoogroups.com; changethestakes-open-forum@...
        Reply To: nyceducationnews@yahoogroups.com
        Cc: cts-internal@...
        Subject: [nyceducationnews] The bottom line is that these tests are abysmal

         

      • Laura@...
        Message 3 of 15, Apr 1, 2014
          Whatever teacher had the guts to do this, thank you.  My son is in 8th grade.

          How is this testing reading comprehension? How is this testing critical thinking? Without reading the passages, it seems just tedious and confusing. They were not thinking; they were rummaging through the passage to find the answer that most closely matched the choices. My son said he had to refer back to the passages several times. Also, he shared that one of the questions about the Panama Lab was designed to trick them: choices C and D referenced a locust from the Panama Lab, but the lab was about bees, not locusts.

          Are they going to release the tests this year or keep it a secret like last year?  

          Laura E. Timoney
          (O) 718.987.6411
          (C) 917.667.2711
          Laura@...



        • nealhugh17
          Message 4 of 15, Apr 1, 2014
          We must see the tests!!! This sounds like gross incompetence: "dumb" and, as said, abysmal! Pathetic...

          Sent from my BlackBerry 10 smartphone.
          From: Laura@...
          Sent: Tuesday, April 1, 2014 8:10 PM
          To: nyceducationnews@yahoogroups.com
          Reply To: nyceducationnews@yahoogroups.com
          Subject: Re: [nyceducationnews] The bottom line is that these tests are abysmal

           

        • Leonie Haimson
          Message 5 of 15, Apr 1, 2014

            They seem to be trying to trick kids to refer only to the passage & not to any background knowledge, a la David Coleman’s “close reading” BS.  From a parent of an 8th grader:

             

            My daughter said there was a passage about "proving the existence of Bigfoot." She was troubled because the passage made it seem as if there is an actual debate about whether Bigfoot exists or not. She said the passage was an article about how there is a great deal of evidence suggesting Bigfoot is real!?!

             

            Meanwhile here is a photo of a classroom door at a school on testing day:

             

             


          • Patrick Sullivan
            Message 6 of 15, Apr 1, 2014

              Yes, my son said the passage was offered as an unbiased article with experts offering evidence that bigfoot did or did not exist. 

              On Apr 1, 2014 8:37 PM, "Leonie Haimson" <leonie@...> wrote:
               

They seem to be trying to trick kids into referring only to the passage and not to any background knowledge, a la David Coleman’s “close reading” BS.  From a parent of an 8th grader:

               

My daughter said there was a passage about “proving the existence of bigfoot.” She was troubled because the passage made it seem as if there is an actual debate about whether bigfoot exists or not. She said the passage was an article about how there is a great deal of evidence suggesting bigfoot is real!?!

               

              Meanwhile here is a photo of a classroom door at a school on testing day:

               

               

              From: nyceducationnews@yahoogroups.com [mailto:nyceducationnews@yahoogroups.com] On Behalf Of Laura@...
              Sent: Tuesday, April 01, 2014 8:10 PM
              To: nyceducationnews@yahoogroups.com
              Subject: Re: [nyceducationnews] The bottom line is that these tests are abysmal

               

               

              Whatever teacher had the guts to do this, thank you.  My son is in 8th grade.

How is this testing reading comprehension? How is this testing critical thinking? Without reading the passages, it seems just tedious and confusing. They were not thinking; they were rummaging through the passage to find the answer that most closely matched the choices. My son said he had to refer back to the passages several times. Also, he shared that one of the questions about the Panama Lab was designed to trick them: choices C and D referenced a locust from the Panama Lab, but the lab was about bees, not locusts.

Are they going to release the tests this year, or keep them secret like last year?

              Laura E. Timoney
              (O) 718.987.6411
              (C) 917.667.2711
              Laura@...


nealhugh@...
Sent by: nyceducationnews@yahoogroups.com
04/01/2014 06:51 PM
Please respond to: nyceducationnews@yahoogroups.com
To: nyceducationnews@yahoogroups.com, changethestakes-open-forum@...
Cc: cts-internal@...
Subject: Re: [nyceducationnews] The bottom line is that these tests are abysmal

              Many thanks!!! Great info!!! Neal


              Sent from my BlackBerry 10 smartphone.


            • Patrick Sullivan
Message 7 of 15, Apr 1, 2014

                And Mug Root Beer was back. This time as a puddle under the table. The busboy missed it.

              • lorna_feeney
Message 8 of 15, Apr 1, 2014
                  I for one think we need to demand more accountability in our schools. We need to raise the bar, and demand higher standards from our curriculum and test publishers. 

                  Perhaps small groups of teachers, editors, parents -- hell, anyone with half a brain -- can form Publishing Excellence Committees to review curriculum and testing materials before they ever make it to a classroom. Publishers who consistently don't meet proficiency standards will pay fines and restitution to districts, schools and states. Eventually, the lowest performing publishers will be drained of their revenue and shut down. 

It'll be tough at first. We all know change is hard. John King and Merryl Tisch might be finding out for the first time that they gave tens (hundreds?) of millions to a company that basically robbed us blind -- not a brilliant move, and that's scary. 

                  If we want to compete globally, we must demand more for our tax dollars. Because if we continue to accept the status quo, then aren't we giving in to the soft bigotry of the low expectations that publishers have for us? 

                  Lorna

                • nealhugh17
Message 9 of 15, Apr 1, 2014
With 1.1 million public school students, United Parents of New York City would be powerful!!! Since some of us do org development and fundraising, and since active parents with school PTA/PAs HAVE made big differences: can there be a meeting? Surely there are a few foundations (Starr, NY, NY Comm Trust, Ford!... :) etc. that could help sponsor and pay expenses. Do it midtown NY NY: Hilton, Yale Club, etc. In 2003 we ran the conference Private Funding for Public Education (endowments, etc.) at Columbia U Low Library in the rotunda... 

                    All the good ideas here need ORGANIZATION, thereby POWER! 

                    Jus' sayin'... but for real. Best, Neal 


                    Sent from my BlackBerry 10 smartphone.


                  • Ann Kjellberg
Message 10 of 15, Apr 1, 2014
I am kind of a professional reader, and I don't think the way that the ELA and its related curriculum teach kids to read is at all meaningful. No writer or grown-up reader thinks this way. I'm not sure I'm ready to go all the way with Hirsch's national standardized curriculum, but I think he's right (see for instance Huff Post here) that trying to gauge "reading comprehension" minus any content can't have any value. It is all about the "how" and not about the "what," but the "how" is a kind of invented, theorized form of reasoning very far from the actual experience of learning from what you read.

                      My daughter (7th grade) is a bright kid capable of taking in some pretty sophisticated stuff, but it seems like she's groping in the dark with these questions, just trying to figure out what they're after and not actually analyzing the material.  




                    • Norm Scott
Message 11 of 15, Apr 1, 2014
                        Are you kidding? Bigfoot DOESN'T exist? Biggest blow for me since I was 10 and found out there was no Santa.

Coming soon to a Pearson test near you: a passage about what a hero Eva Moskowitz is for taking on big bad de Blasio.


                        On Tue, Apr 1, 2014 at 9:58 PM, Patrick Sullivan <patk.j.sullivan@...> wrote:
                         

                        Yes, my son said the passage was offered as an unbiased article with experts offering evidence that bigfoot did or did not exist. 

                        On Apr 1, 2014 8:37 PM, "Leonie Haimson" <leonie@...> wrote:
                         

                        They seem to be trying to trick kids to refer only to the passage & not to any background knowledge, a la David Coleman’s “close reading” BS.  From a parent of an 8th grader:

                         

                        My daughter said there was a passage about "proving the existence of bigfoot". She was troubled because the passage made it seem as if there is an actual debate about whether bigfoot exists or not. she said the passage was an article about how there is a great deal of evidence suggesting bigfoot is real!?!

                         

                        Meanwhile here is a photo of a classroom door at a school on testing day:

                         

                         

                        From: nyceducationnews@yahoogroups.com [mailto:nyceducationnews@yahoogroups.com] On Behalf Of Laura@...
                        Sent: Tuesday, April 01, 2014 8:10 PM
                        To: nyceducationnews@yahoogroups.com
                        Subject: Re: [nyceducationnews] The bottom line is that these tests are abysmal

                         

                         

                        Whatever teacher had the guts to do this, thank you.  My son is in 8th grade.

                        How is this testing reading comprehension? How is this testing critical thinking? Without reading the passages, it seems just tedious and confusing. They were not thinking, they were rummaging through the passage to find the answer that most closely matched the choices.   My son said he had to refer back to the passages several times.  Also, he shared one of the questions about the Panama Lab was designed to trick them.  C&D referenced a locust from the Panama Lab but the lab was about bees and not locusts.

                        Are they going to release the tests this year or keep it a secret like last year?  

                        Laura E. Timoney
                        (O) 718.987.6411
                        (C) 917.667.2711
                        Laura@...


                        nealhugh@...
                        Sent by: nyceducationnews@yahoogroups.com

                        04/01/2014 06:51 PM

                        Please respond to
                        nyceducationnews@yahoogroups.com

                        To

                        nyceducationnews@yahoogroups.com, nyceducationnews@yahoogroups.com, changethestakes-open-forum@...

                        cc

                        cts-internal@...

                        Subject

                        Re: [nyceducationnews] The bottom line is that these tests are abysmal


                        Many thanks!!! Great info!!! Neal


                        Sent from my BlackBerry 10 smartphone.

                        From: Leonie Haimson
                        Sent: Tuesday, April 1, 2014 5:13 PM
                        To: nyceducationnews@yahoogroups.com; changethestakes-open-forum@...
                        Reply To: nyceducationnews@yahoogroups.com
                        Cc: cts-internal@...
                        Subject: [nyceducationnews] The bottom line is that these tests are abysmal



                         

                        Reposted from here:

                        http://testingtalk.org/response/book-1-nys-common-core-test-ela-8

                        Book 1 — NYS Common Core Test — ELA 8

                        • Author: anonymous, Teacher
                        • |
                        • State: NY
                        • |
                        • Test: State test: Pearson
                        • |
                        • Date: April 1 at 4:17 pm ET

                        NYS Common Core Test — Day 1 — Book 1 — ELA 8 — April 1, 2014 — 42 repetitive multiple-choice questions …

                        Here’s an example of a typical question:

                        One question asked, “Which sentence best connects two central ideas of the article?” As adult readers, we do not give texts such scrutiny, especially if we’re reading a non-fictional text. We do not say to ourselves as we are reading: Hey, now there’s a sentence connecting two central ideas! Nor, as writers, do we say to ourselves as we’re writing: I will now use this specific sentence to connect two central ideas. We would never read a text in the ways that these multiple-choice questions are forcing us to read them.

                        Another question asked: “What is the most likely reason for including information about the Smithsonian laboratory in Panama?” “To emphasize…,” “to illustrate…,” “to point out…,” “to provide…” I’m thinking: does it really matter? A student, in a non-testing context, could easily grasp why the author includes that information. He or she would use the information or simply move on. “Asking test-takers to respond to text passages with multiple-choice questions induces response processes that are strikingly different from those that respondents would draw on when reading in non-testing contexts” (from How assessing reading comprehension with multiple-choice questions shapes the construct: a cognitive processing perspective by Rupp, Ferne & Choi).

                        Why are the multiple-choice questions more difficult than the actual texts? Most of the texts do not warrant such nitpicky multiple-choice questions. We’re taking relatively easy concepts (main idea, evidence, etc.) and distorting these concepts through the assessment. It’s detrimental.

                        Too many of the questions are overly concerned with HOW the Pearson-selected texts are written and structured and not at all concerned with content and comprehension. The argument will be that the questions are assessing close reading skills, but many would argue that this is NOT what they are doing.

                        After reading all of the questions and looking at the possible answers, my colleagues and I simply look at each other and say, who really cares. I bet the authors of the texts would even scratch their heads as to why such questions were being posed to an 8th grader (or posed at all for that matter).

                        In my classroom or out in the real world, we would be reading these texts for information, for understanding, reading to integrate, reading to develop an argument, reading for entertainment, etc. We wouldn’t necessarily be reading them to discover why the author uses the word “stimulate” in a non-fictional article. In class, if I ask my students what “stimulate” means in the context of the article we are reading, they would nail it, but the testing context is something altogether different and the multiple-choice format is something altogether different.

                        The same goes for reading to discover “which lines support the author’s claim” or “which lines develop a key concept” — In the real world, in my classroom, these would be straightforward tasks that my students could do with no problem, but the multiple-choice questions on these state exams turn the straightforward into a muddied mess.

                        Overall: Book 1 is still a slog. Long passages. Many of the multiple-choice questions were quite involved, requiring students to flip back and forth a number of times and re-read multiple times. Again, the actual texts do not warrant such scrutiny. Three academically strong students didn’t finish and simply guessed on 10-12 questions.

                        The bottom line is that these tests are abysmal. For book 1, why don’t they simply include one poem, one fictional text, and one non-fictional text with fifteen multiple-choice questions and be done with it? These multiple-choice assessments do not connect to the real world of reading, thinking, and writing. They’re simply not important.

                        We need to move away from multiple-choice questions and away from these large test manufacturers and bring on more local and regional assessments that incorporate a variety of assessment strategies: projects, portfolios, open-ended questions, writing, short-answer, artistic expression, etc.


                        --
                        Have a good day

                        Norm Scott
                        normsco@...
                        917-992-3734

                        On Twitter:  @normscott1

                        Education Notes Online
                        ednotesonline.blogspot.com/
                         
                        GEM, Grassroots Education Movement
                        gemnyc.org
                         
                        The Inconvenient Truth Behind Waiting for Superman - Now Online
                        http://gemnyc.org/our-film/
                         
                        Education Editor, The Wave
                        http://www.rockawave.com/

                        Norms's Robotics blog
                        http://normsrobotics.blogspot.com/
                      • moxico2003
                        Message 12 of 15 , Apr 2, 2014
                          'Another questions asked: “What is the most likely reason for including information about the Smithsonian laboratory in Panama?” “To emphasize…,” “to illustrate…,” “to point out…,” “to provide…” I’m thinking does it really matter. '

                          Obviously, any of these responses could be correct, especially from the way the question is worded. The phrase 'most likely' calls for speculation. And the responses are differently worded variations on a similar theme. There is clearly something else going on and it's not just that it's a badly written test. 



                        • Ann Kjellberg
                          Message 13 of 15 , Apr 2, 2014
                            I do feel a little sad sometimes for the people writing these tests.  They must be getting so many contradictory instructions and they are trying to do stuff that would be interesting to kids.  Sara had a section on Tibetan Yaks, which would normally be right up her alley, but, she said, "they managed to make it boring."  And then there was the time they got in the hole using the great Daniel Pinkwater...


                            On Apr 2, 2014, at 7:29 AM, seymourella@... wrote:

                             

                            'Another questions asked: “What is the most likely reason for including information about the Smithsonian laboratory in Panama?” “To emphasize…,” “to illustrate…,” “to point out…,” “to provide…” I’m thinking does it really matter. '

                            Obviously, any of these responses could be correct, especially from the way the question is worded. The phrase 'most likely' calls for speculation. And the responses are differently worded variations on a similar theme. There is clearly something else going on and it's not just that it's a badly written test. 





                          • pgarrity10025
                            Message 14 of 15 , Apr 2, 2014
                              At the risk of saying something nice about the ELA test, and Bigfoot aside, my 9th grade son said that he found a section about Amerigo Vespucci very interesting, and he had a lot of questions for me and wants to read more about it.  All about how maps work, how the idea of latitude was created, etc.
                            • Leonie Haimson
                              Message 15 of 15 , Apr 2, 2014

                                This wasn’t a state test if your son is in 9th grade.

                                 

                                From: nyceducationnews@yahoogroups.com [mailto:nyceducationnews@yahoogroups.com] On Behalf Of pgarrity@...
                                Sent: Wednesday, April 02, 2014 4:44 PM
                                To: nyceducationnews@yahoogroups.com
                                Subject: RE: [nyceducationnews] The bottom line is that these tests are abysmal

                                 

                                 

                                At the risk of saying something nice about the ELA test, and Bigfoot aside, my 9th grade son said that he found a section about Amerigo Vespucci very interesting, and he had a lot of questions for me and wants to read more about it.  All about how maps work, how the idea of latitude was created, etc.
