
[Net-Gold] Physics Education Research - Not Widely Known in Higher Education #2

  • Posted to Net-Gold by David P. Dillard, May 1, 2011

      Date: Sat, 30 Apr 2011 13:49:34 -0700
      From: Richard Hake <rrhake@...>
      Reply-To: Net-Gold@yahoogroups.com
      To: AERA-L@...
      Cc: Net-Gold@yahoogroups.com
      Subject: [Net-Gold] Physics Education Research - Not Widely Known in Higher
      Education #2

      .

      .


      If you reply to this long (25 kB) post please
      don't hit the reply button unless you prune the
      copy of this post that may appear in your reply
      down to a few relevant lines, otherwise the
      entire already archived post may be needlessly
      resent to subscribers.

      .

      ******************************************

      .

      ABSTRACT: In response to my post "Physics
      Education Research - Not Widely Known in Higher
      Education" [Hake (2011a) <http://bit.ly/iT4YsN>],
      a discussion-list subscriber wrote to me
      privately, asking for references on instructional
      methods that had been used in physics to produce
      relatively high average normalized gains <g> in
      students' conceptual understanding of mechanics.

      .

      In this post I give the titles and references to
      seven of the more popular "Interactive
      Engagement" (IE) methods discussed in
      "Interactive-engagement vs traditional methods: A
      six-thousand-student survey of mechanics test
      data for introductory physics courses" [Hake
      (1998a,b)] and (following Heller, 1999
      <http://bit.ly/mNz2Q9>) their relationship to
      learning theories from cognitive science.

      .

      The common features of those methods are
      reflected in the *operational* definition of IE
      courses given in Hake (1998a): " 'IE courses' are
      those designed at least in part to promote
      conceptual understanding through active
      engagement of students in heads-on (always) and
      hands-on (usually) activities WHICH YIELD
      IMMEDIATE FEEDBACK through discussion with peers
      and/or instructors, all as judged by their
      literature descriptions."

      .

      Thus a hallmark of IE courses is their use of
      "formative assessment" as defined by Black &
      Wiliam <http://bit.ly/kuDmNX>: "All those
      activities undertaken by teachers -- and by their
      students in assessing themselves -- THAT PROVIDE
      INFORMATION TO BE USED AS FEEDBACK TO MODIFY
      TEACHING AND LEARNING ACTIVITIES."

      .

      BTW: (1) I think this post is relevant to
      instructors in ANY subject that requires higher
      order thinking skills, not just Newtonian
      mechanics. (2) After transmitting "Physics
      Education Research - Not Widely Known in Higher
      Education" [Hake (2011a)], I was reminded that
      Peggy Maki <http://www.peggymaki.com/> is one of
      the few assessment gurus in higher education who
      is both knowledgeable and appreciative of Physics
      Education Research - see e.g. "Assessing for
      Learning" [Maki (2011) <http://bit.ly/j1hTeW>],
      especially Chapter 4.

      .

      ******************************************

      .

      In response to my post "Physics Education
      Research - Not Widely Known in Higher Education"
      [Hake (2011a)] a discussion-list subscriber wrote
      to me privately, asking for references on
      instructional methods that had been used in
      physics to produce relatively high average
      normalized gains <g> in students' conceptual
      understanding of mechanics.

      .

      In "Interactive-Engagement Methods in
      Introductory Mechanics Courses" [Hake (1998b)], I
      discussed the methods and materials that were
      used in Interactive-Engagement (IE) courses that
      were surveyed in "Interactive-engagement vs
      traditional methods: A six-thousand-student
      survey of mechanics test data for introductory
      physics courses" [Hake (1998a)].

      .

      However, more compactly, the following excerpt
      from "Design-Based Research in Physics Education
      Research: A Review" [Hake (2008a)] gives the
      titles and references to seven of the more
      popular "Interactive Engagement" (IE) methods
      discussed in Hake (1998a,b) and their
      relationship to learning theories from cognitive
      science following Heller (1999) [bracketed by
      lines "HHHHHH. . . . ."; see that online article
      for references other than Heller (1999); my
      inserts at ". . . . .[[insert]]. . . ."]:

      .

      HHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHH

      .

      For the 48 interactive engagement courses of
      Figure 1. . . . [[same as Fig. 1 of Hake
      (1998a)]]. . . ., the ranking in terms of number
      of IE courses using each of the more popular
      methods follows:

      .

      1. COLLABORATIVE PEER INSTRUCTION (Johnson,
      Johnson, & Smith, 1991; Heller, Keith, &
      Anderson, 1992; Slavin, 1995; Johnson, Johnson, &
      Stanne, 2000): 48 (all courses) [[[CA]]] - for
      the meaning of "CA" and similar abbreviations
      below within the triple brackets "[[[...]]]", see
      the three paragraphs at the end of this list.

      .

      2. MICROCOMPUTER-BASED LABORATORIES (Thornton &
      Sokoloff, 1990, 1998; Thornton, 1995): 35 courses
      [[[DT]]].

      .

      3. CONCEPT TESTS (Mazur, 1997; Crouch & Mazur,
      2001; Fagen, Crouch, & Mazur, 2002; Lorenzo,
      Crouch, & Mazur, 2006; Rosenberg, Lorenzo, &
      Mazur, 2006): 20 courses [[[DT]]]; such tests for
      physics, biology, and chemistry are available on
      the Web, along with a description of the peer
      instruction method, at the Mazur group's Galileo
      (2007) and Interactive Tool Kit (2007) sites. . .
      . .[[more recently see e.g., "Confessions of a
      Converted Lecturer" [Mazur (2009)] and "The Case
      for Classroom Clickers - A Response to Bugeja"
      [Hake (2008b)]]]. . . . .

      .

      4. MODELING (Halloun & Hestenes, 1987; Hestenes,
      1987, 1992; Wells, Hestenes, & Swackhamer, 1995):
      19 courses [[[DT + CA]]]; a description is on the
      Web at <http://modeling.la.asu.edu/>.

      .

      5. ACTIVE LEARNING PROBLEM SETS OR OVERVIEW CASE
      STUDIES (Van Heuvelen, 1991a, 1991b, 1995): 17
      courses [[[CA]]].

      .

      6. PHYSICS EDUCATION RESEARCH-BASED TEXT
      (referenced in Hake, 1998b, Table II) or no text:
      13 courses.

      .

      7. SOCRATIC DIALOGUE INDUCING LABORATORIES (Hake
      1987, 1991, 1992, 2000, 2002d, 2007b; Tobias &
      Hake, 1988; Hake & Wakeland, 1997): 9 courses
      [[[DT + CA]]]; a description and laboratory
      manuals are on the web at
      <http://www.physics.indiana.edu/~sdi>. . . . . .
      [[More recently see "Helping Students to Think
      Like Scientists in Socratic Dialogue Inducing
      Labs" [Hake (2010b)]]]. . . . . .

      .

      The notations within the triple brackets [[[. . .
      ]]] follow Heller (1999) in associating loosely
      the methods with learning theories from cognitive
      science. Here "DT" stands for "developmental
      theory," originating with Piaget (Inhelder &
      Piaget, 1958; Gardner 1985; Inhelder, deCaprona,
      & Cornu-Wells, 1987; Phillips & Soltis, 1998);
      "CA" stands for "cognitive apprenticeship"
      (Collins, Brown, & Newman, 1989; Brown, Collins,
      & Duguid, 1989).

      .

      All the methods recognize the important role of
      social interactions in learning (Vygotsky, 1978;
      Lave & Wenger, 1991; Dewey, 1938/1997; Phillips &
      Soltis, 1998). . . . .[[and "formative
      assessment" in the Black & Wiliam (1998) sense -
      see below]]. . . .

      .

      It should be emphasized that the above rankings
      are by popularity within the survey and have no
      necessary connection with the effectiveness of
      the methods relative to one another. In fact, it
      is quite possible that some of the less popular
      methods used in some survey courses, as listed by
      Hake (1998b), could be more effective in terms of
      promoting students' understanding than any of the
      popular strategies noted above.

      .

      HHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHH

      .

      In "Interactive-engagement vs traditional
      methods: A six-thousand-student survey of
      mechanics test data for introductory physics
      courses" [Hake (1998a)] I defined Interactive
      Engagement (IE) courses *operationally* [even
      despite the anti-positivist vigilantes (Phillips,
      2000; Hake, 2010a)] so as to reflect the common
      features of all the above relatively effective
      methods:

      .

      " 'IE courses' are those designed at least in part to promote conceptual
      understanding through active engagement of students in heads-on
      (always) and hands-on (usually) activities which yield immediate
      feedback through discussion with peers and/or instructors, all as
      judged by their literature descriptions". . . . .

      .

      . . . . . . . . . . . . . . . . . . . . . . . .

      .

      (1)

      .

      Definition "1" indicates that IE methods rely on
      "FORMATIVE ASSESSMENT" in the sense employed by
      Black and Wiliam (1998) in their famous paper
      "Inside the Black Box: Raising Standards Through
      Classroom Assessment," and more recently by
      Shavelson et al. (2008) in "On the Impact of
      Curriculum-Embedded Formative Assessment on
      Learning. . . .".

      .

      CAUTION! Black and Wiliam's definition of
      "formative assessment" differs from "formative
      evaluation" as defined by JCSEE: "Formative
      evaluation is evaluation designed and used to
      improve an object, especially when it is still
      being developed." For a recent discussion of
      "formative" in the JCSEE sense see "ETS Pushes
      For FORMATIVE Assessment?" [Hake (2011b)].

      .

      Black and Wiliam's meaning of "formative
      assessment" is explained by them in this segment
      from "Inside the Black Box" [bracketed by lines
      "B&W-B&W-B&W-. . . . ."; my CAPS]:

      .

      B&W-B&W-B&W-B&W-B&W-B&W-B&W-B&W-B&W-B&W

      .

      We start from the self-evident proposition that
      TEACHING AND LEARNING MUST BE INTERACTIVE.
      Teachers need to know about their pupils'
      progress and difficulties with learning so that
      they can adapt their own work to meet pupils'
      needs -- needs that are often unpredictable and
      that vary from one pupil to another. Teachers can
      find out what they need to know in a variety of
      ways, including observation and discussion in the
      classroom and the reading of pupils' written work.

      .

      We use the general term assessment to refer to
      ALL THOSE ACTIVITIES UNDERTAKEN BY TEACHERS --
      AND BY THEIR STUDENTS IN ASSESSING THEMSELVES --
      THAT PROVIDE INFORMATION TO BE USED AS FEEDBACK
      TO MODIFY TEACHING AND LEARNING ACTIVITIES. Such
      assessment becomes formative assessment when the
      evidence is actually used to adapt the teaching
      to meet student needs.

      .

      B&W-B&W-B&W-B&W-B&W-B&W-B&W-B&W-B&W-B&W

      .

      In footnote #2 to the last paragraph above, B&W
      wrote: "There is no internationally agreed-upon
      term here. 'Classroom evaluation,' 'classroom
      assessment,' 'internal assessment,'
      'instructional assessment,' and 'student
      assessment' have been used by different authors,
      and some of these terms have different meanings
      in different texts."

      .

      B&W, in a section "Commenting on 'A poverty of practice' " wrote:

      .

      B&W-B&W-B&W-B&W-B&W-B&W-B&W-B&W-B&W-B&W

      .

      There is a wealth of research evidence that the
      everyday practice of assessment in classrooms is
      beset with problems and shortcomings, as the
      following selected quotations indicate. . . . . .

      .

      . . . . . . . . . . . . . . . . . . . . . . . . .

      .

      . . . . . . . . . . . . . . . .

      .


      "The assessment practices outlined above . . . .
      .[[consistent with those advocated by B&W]]. . .
      . . are not common, even though these kinds of
      approaches are now widely promoted in the
      professional literature," according to a review
      of assessment practices in U.S. schools [Neill
      (1997)] . . . . .[[now updated to Neill (2008);
      Monty Neill <http://bit.ly/mz70zZ> is Executive
      Director of the National Center for Fair & Open
      Testing (FairTest) and is a prolific contributor
      to FairTest's ARN-L discussion list with archives
      at <http://bit.ly/jeiTPm>]]. . . . . .

      .

      B&W-B&W-B&W-B&W-B&W-B&W-B&W-B&W-B&W-B&W

      .

      The abstract of "Interactive-engagement vs
      traditional methods: A six-thousand-student
      survey of mechanics test data for introductory
      physics courses" [Hake (1998a)] reads [bracketed
      by lines "HHHHH. . . . ."; I have inserted
      references]:

      .

      HHHHHHHHHHHHHHHHHHHHHHHHHHHHHHH

      .

      A survey of pre/post test data using the
      Halloun-Hestenes Mechanics Diagnostic test
      [Halloun & Hestenes (1985a,b)] or more recent
      Force Concept Inventory [Hestenes et al. (1992)]
      is reported for 62 introductory physics courses
      enrolling a total number of students N = 6542.

      .

      A consistent analysis over diverse student
      populations in high schools, colleges, and
      universities is obtained if a rough measure of
      the average effectiveness of a course in
      promoting conceptual understanding is taken to be
      the average normalized gain <g>. The latter is
      defined as the ratio of the actual average gain
      (%<post> - %<pre>) to the maximum possible
      average gain (100 - %<pre>). . . . .[[Where here
      and below angle brackets <. . . > indicate class
      averages]]. . . .

      .

      Fourteen "traditional" (T) courses (N = 2084)
      which made little or no use of
      interactive-engagement (IE) methods achieved an
      average gain <g>T-ave = 0.23 ± 0.04 (std dev). In
      sharp contrast, forty-eight courses (N = 4458)
      which made substantial use of IE methods achieved
      an average gain <g>IE-ave = 0.48 ± 0.14 (std
      dev), almost two standard deviations of <g>IE-ave
      above that of the traditional courses.
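      The "almost two standard deviations" remark can be checked
      directly from the figures just quoted:

```python
# Average normalized gains and IE standard deviation from Hake (1998a):
g_T = 0.23    # traditional (T) courses, <g>T-ave
g_IE = 0.48   # interactive-engagement (IE) courses, <g>IE-ave
sd_IE = 0.14  # std dev of <g> for IE courses

# Separation of the two means, in units of the IE standard deviation:
separation = (g_IE - g_T) / sd_IE
# 0.25 / 0.14, roughly 1.79 standard deviations of <g>IE-ave
```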

      .

      Results for 30 (N = 3259) of the above 62 courses
      on the problem-solving Mechanics Baseline test of
      Hestenes-Wells imply that IE strategies enhance
      problem-solving ability.

      .

      The conceptual and problem-solving test results
      strongly suggest that the classroom use of IE
      methods can increase mechanics-course
      effectiveness well beyond that obtained in
      traditional practice.

      .

      HHHHHHHHHHHHHHHHHHHHHHHHHHHHHHH

      .

      Thus a major conclusion of the above survey was:

      .

      " 'Interactive-engagement' (IE) courses result in average pre-to-posttest
      normalized gains <g> in conceptual understanding of mechanics which
      are high relative to those achieved by 'traditional' (T) courses
      (on average, about two standard deviations
      above)." . . . . . . . . . . . . . . . . . . . .

      .

      (2)

      .

      Conclusion "2" is meaningful because in Hake
      (1998a) IE courses are *operationally* defined as
      in "1" above and 'traditional' (T) courses are
      *operationally* defined as:

      .

      " 'T courses' are those reported by instructors to
      make little or no use of IE methods, relying
      primarily on passive-student lectures, recipe labs,
      and algorithmic-problem exams"

      .

      . . . . . . . . . . . . . . . . . . . . . . .

      .

      (3)

      .

      Two BTW's:

      .

      (1) I think this post is relevant to instructors
      in ANY subject that requires higher-order
      thinking skills, not just Newtonian mechanics.

      .

      (2) After transmitting "Physics Education
      Research - Not Widely Known in Higher Education"
      [Hake (2011a)], I was reminded by Beverley Taylor
      <http://bit.ly/k5JpAW> that Peggy Maki
      <http://www.peggymaki.com/> is probably unique
      among assessment gurus in higher education in
      appreciating the importance of Physics Education
      Research - see e.g. "Assessing for Learning"
      [Maki (2011) <http://bit.ly/j1hTeW>], especially
      Chapter 4 "Raising and Pursuing Open-Ended
      Research or Study Questions to Deepen Inquiry
      Into and Improve Student Learning."

      .

      .

      .

      Richard Hake, Emeritus Professor of Physics, Indiana University
      Honorary Member, Curmudgeon Lodge of Deventer, The Netherlands
      President, PEdants for Definitive Academic References which Recognize the
      Invention of the Internet (PEDARRII)
      <rrhake@...>
      <http://www.physics.indiana.edu/~hake>
      <http://www.physics.indiana.edu/~sdi>
      <http://HakesEdStuff.blogspot.com>
      <http://iub.academia.edu/RichardHake>

      .

      .

      .


      "Physics educators have led the way in developing
      and using objective tests to compare student
      learning gains in different types of courses, and
      chemists, biologists, and others are now
      developing similar instruments. These tests
      provide convincing evidence that students
      assimilate new knowledge more effectively in
      courses including active, inquiry-based, and
      collaborative learning, assisted by information
      technology, than in traditional courses."
      Wood & Gentile (2003)

      .

      .

      .

      REFERENCES [All URLs shortened by
      <http://bit.ly/> and accessed on 30 April 2011.]

      .

      .

      .

      Black, P. & D. Wiliam. 1998. "Inside the Black
      Box: Raising Standards Through Classroom
      Assessment," Phi Delta Kappan 80(2): 139-144,
      146-148; online as a 168 kB pdf at
      <http://bit.ly/kuDmNX>. Note that Black & Wiliam
      do not use "formative" in the same sense as the
      "Joint Committee on Standards for Educational
      Evaluation" [JCSEE (1994)].

      .

      Hake, R.R. 1998a. "Interactive-engagement vs
      traditional methods: A six-thousand-student
      survey of mechanics test data for introductory
      physics courses," Am. J. Phys. 66: 64-74; online
      as an 84 kB pdf at <http://bit.ly/d16ne6>.

      .

      Hake, R.R. 1998b. "Interactive-engagement methods
      in introductory mechanics courses," online as a
      108 kB pdf at <http://bit.ly/aH2JQN>. A
      crucial companion paper to Hake (1998a). Rejected
      :-( by an AJP editor who thought the very
      transparent Physical-Review-type data tables were
      "impenetrable."

      .

      Hake, R.R. 2008a. "Design-Based Research in
      Physics Education Research: A Review," in Kelly,
      Lesh, & Baek (2008); a prepublication version of
      Hake's chapter is online as a 1.1 MB pdf at
      <http://bit.ly/9kORMZ>.

      .

      Hake, R.R. 2008b. "The Case for Classroom
      Clickers - A Response to Bugeja," online as a 716
      kB pdf at <http://bit.ly/gOonp8>.

      .

      Hake, R.R. 2010a. "Education Research Employing
      Operational Definitions Can Enhance the Teaching
      Art," invited talk, Portland AAPT meeting, 19
      July; online as a 3.8 MB pdf at
      <http://bit.ly/aGlkjm>.

      .

      Hake, R.R. 2010b. "Helping Students to Think Like
      Scientists in Socratic Dialogue Inducing Labs,"
      submitted to The Physics Teacher on 19 August;
      online as a 446 kB pdf at
      <http://bit.ly/99yb7p>. Accepted for
      publication; to be published in Fall 2011.

      .

      Hake, R.R. 2011a. "Physics Education Research -
      Not Widely Known in Higher Education" online on
      the OPEN! AERA-L archives at
      <http://bit.ly/iT4YsN>. Post of 27 Apr 2011
      17:07:07-0700 to AERA-L and Net-Gold. The
      abstract and link to the complete post were
      transmitted to various discussion lists and are
      also on my blog "Hake'sEdStuff" at
      <http://bit.ly/msoLwx> with a provision for
      comments.

      .

      Hake, R.R. 2011b. "ETS Pushes For FORMATIVE
      Assessment?" online on the OPEN! AERA-L archives
      at <http://bit.ly/hQHID2>. Post of 24 Apr 2011
      14:04:36 -0700 to AERA-L and Net-Gold. The
      abstract and link to the complete post were
      transmitted to various discussion lists and are
      also on my blog "Hake'sEdStuff" at
      <http://bit.ly/h2qVbW> with a provision for
      comments.

      .

      Halloun, I. & Hestenes, D. 1985a. "The initial
      knowledge state of college physics," Am. J. Phys.
      53(11): 1043-1055; online at
      <http://bit.ly/b1488v>, scroll down to
      "Evaluation Instruments." Contains the
      "Mechanics Diagnostic" test (omitted from the
      online version), precursor to the widely used
      "Force Concept Inventory" [Hestenes et al.
      (1992)].

      .

      Halloun, I. & D. Hestenes. 1985b. "Common sense
      concepts about motion," Am. J. Phys. 53(11):
      1056-1065; online at <http://bit.ly/b1488v>,
      scroll down to "Evaluation Instruments."

      .

      Halloun, I., R.R. Hake, E.P. Mosca, & D.
      Hestenes. 1995. "Force Concept Inventory (1995
      Revision)," online (password protected) at
      <http://bit.ly/b1488v>, scroll down to
      "Evaluation Instruments." Currently available in
      20 languages: Arabic, Chinese, Croatian, Czech,
      English, Finnish, French, French (Canadian),
      German, Greek, Italian, Japanese, Malaysian,
      Persian, Portuguese, Russian, Spanish, Slovak,
      Swedish, & Turkish.

      .

      Heller, K.J. 1999. "Introductory physics reform
      in the traditional format: an intellectual
      framework," AIP Forum on Education Newsletter
      (Summer): pages 7-9 (the entire newsletter is
      online as a 1.1 MB pdf at
      <http://bit.ly/in4bGv>); and as a brilliant talk
      at the Minnesota Physics Education website in the
      form of an 864 kB pdf at <http://bit.ly/mNz2Q9>.

      .

      Hestenes, D., M. Wells, & G. Swackhamer. 1992.
      "Force Concept Inventory," The Physics Teacher
      30(3): 141-158; online as a 100 kB pdf at
      <http://bit.ly/foWmEb> [but without the test
      itself]. For the 1995 revision see Halloun et
      al. (1995).

      .

      JCSEE. 1994. Joint Committee on Standards for
      Educational Evaluation, "The Program Evaluation
      Standards," 2nd ed., Sage. A glossary of
      evaluation terms from this publication is online
      at <http://bit.ly/efXea6>. JCSEE defines
      "Formative" in its evaluation sense as:
      "Formative evaluation is evaluation designed and
      used to improve an object, especially when it is
      still being developed." For a recent discussion
      that emphasizes this use of "formative" see "ETS
      Pushes For FORMATIVE Assessment?" [Hake (2011b)].

      .

      Kelly, A.E., R.A. Lesh, & J.Y. Baek. 2008.
      "Handbook of Design Research Methods in Education:
      Innovations in Teaching." Routledge
      Education; publisher's information at
      <http://bit.ly/dkLabI>. Amazon.com information at
      <http://amzn.to/flJaQ9>.

      .

      Maki, P.L. 2011. "Assessing for Learning:
      Building a Sustainable Commitment Across the
      Institution." Stylus, 2nd edition, publisher's
      information at <http://bit.ly/j1hTeW>. Amazon.com
      information at <http://amzn.to/jIAawZ>. Note the
      searchable "Look Inside" feature. Unfortunately,
      as of 30 April 2011, the Amazon search had not
      been updated and was for the earlier 2004 edition.

      .

      Mazur, E. 2009. "Confessions of a Converted
      Lecturer" talk at the University of Maryland on
      11 November 2009. That talk is now on YouTube at
      <http://bit.ly/dBYsXh>, and the abstract, slides,
      and references - sometimes obscured in the YouTube
      talk - are at <http://bit.ly/9qzDIq> as a 4 MB
      pdf. As of 30 April 2011 09:40-0700 Eric's talk
      had been viewed 36,045 times. In contrast,
      serious articles in the education literature (or
      even exciting posts such as this one) are often
      read only by the author and a few cloistered
      specialists, creating tsunamis in educational
      practice equivalent to those produced by a pebble
      dropped into the Pacific Ocean.

      .

      Neill, M.D. 2008. "Transforming Student
      Assessment," among Fairtest's
      <http://fairtest.org/> "authentic assessment"
      papers, online at <http://bit.ly/m6AClE>,
      evidently a more recent version of the article
      referenced by Black & Wiliam (1998) and published
      by Neill in the "Phi Delta Kappan" of September,
      1997, pp. 35-36.

      .

      Phillips, D.C. 2000. "Expanded social scientist's
      bestiary: a guide to fabled threats to, and
      defenses of, naturalistic social science." Rowman
      & Littlefield; publisher's information at
      <http://bit.ly/fj2P1E>. Amazon.com information
      at <http://amzn.to/cstR0B>.

      .

      Shavelson, R.J., D.B. Young, C.C. Ayala, P.R.
      Brandon, E.M. Furtak, M.A. Ruiz-Primo, M.K.
      Tomita, & Y. Yin. 2008. "On the Impact of
      Curriculum-Embedded Formative Assessment on
      Learning: A Collaboration between Curriculum and
      Assessment Developers" Applied Measurement in
      Education 21(4): 295-310, abstract online at
      <http://bit.ly/cVMvye> - to access the ENTIRE
      article click on the pdf icon.

      .

      Wood, W.B., & J.M. Gentile. 2003. "Teaching in a
      research context," Science 302: 1510; 28
      November; an abstract is online at
      <http://bit.ly/9qGR6m>.

      .

      .