Past Efforts in Computer-Based Questioning and Testing (Q&T) in Physics

 

Tom Dickinson

Professor of Physics and Materials Science

Washington State University

 

Computer-based Q&T is often linked strongly with efforts in computer-based learning systems.  For example, the PLATO system was developed at the University of Illinois in the 1960s and 1970s and subsequently incorporated into the Cyber-Prof and NovaNET programs at the same institution.  PLATO’s strengths include its modular structure and ease of course administration.  As currently conceived, it spans a wide range of topics and grade levels (primarily 6-12).  Hardware limitations were an early barrier to portability.(1)  Significantly, student evaluations in introductory college physics rate the current NovaNET material as more helpful than professors and texts.(2)  The software is written in the TUTOR language and is available for purchase by educational institutions.  The system is still largely mainframe-based, which limits its portability to other institutions.  In terms of testing, it claims to provide “powerful testing, assessment, student management, record-keeping, and communications tools,” which unfortunately we were not able to access for evaluation.

       Bork based an early deployment of educational software on the self-paced Personalized System of Instruction, similar to the Keller plan.(3)  This software included 27 end-of-module quizzes as well as examinations, all computer-graded.  As described,(3) the testing involves multiple choice and numerical fill-in questions.  Among the noted difficulties was the tendency of students to make several attempts to pass a given module, rather than learning the material and testing out the first time.  Although elements of this work were employed at several universities, a major handicap was that much of the course administration was hardware-specific and not readily disseminated.

       Lea, Thacker, Kim and Miller of the University of North Carolina, the Ohio State University, and Miami University developed a testing engine using the cT programming language.(4)  This engine presents a graphical interface in which answers are chosen from menu lists.  The questions are typically drawn from Physics by Inquiry materials(5) and are designed for use in a laboratory environment.  Student explanations of their results are input into a text file and evaluated for completeness by checking against a keyword list.  Student responses to the graphical input are compared with human evaluations of the text input.
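A keyword-based completeness check of the kind described above can be sketched in a few lines.  This is an illustration only; the keyword list, the scoring rule, and the threshold are our own assumptions, not the matching rules of the cT engine:

```javascript
// Sketch of keyword-based completeness scoring for a free-text explanation.
// The keyword list and the simple fraction-of-keywords score are
// illustrative assumptions, not taken from the cT engine described above.
function completenessScore(responseText, keywords) {
  const text = responseText.toLowerCase();
  const hits = keywords.filter(k => text.includes(k.toLowerCase()));
  return hits.length / keywords.length; // fraction of required keywords present
}

const keywords = ["normal force", "friction", "net force", "acceleration"];
const answer = "The net force is zero, so the acceleration is zero " +
               "and friction balances the applied force.";
console.log(completenessScore(answer, keywords)); // 0.75
```

A score below some instructor-chosen threshold would flag the explanation as incomplete; as the text notes, such automated scores are best checked against human evaluation.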

       A recent implementation of computer-assisted learning is the PALS (Personal Assistants for Learning) project at Carnegie Mellon.(6)  Written in Macromedia Authorware, these are well-designed tutorials on acceleration and Newton’s Laws.  Although not designed for Web use, they can be accessed over the Web by students who download the appropriate Authorware reader.  These tutorials are effective, but require hours to complete and cover little of the material in a typical introductory physics course.

       Multiple choice (radio button and check box) and numerical answers are readily graded over the Web.(4, 7)  Grading software for these formats is found at many locations.  Examples at a variety of levels can be found with almost any search engine by searching on the key words:  quiz test physics.  All the grading services we found employed multiple choice, numerical fill-in-the-blank, or equivalent inputs.
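The grading logic behind these formats is straightforward; the following is a minimal sketch, with an illustrative answer key and numerical tolerance of our own choosing (none of the surveyed services publishes its exact rules):

```javascript
// Sketch of grading for the two common Web formats: multiple choice
// (exact match on the selected option) and numerical fill-in (match
// within a relative tolerance). The 2% tolerance is an assumption.
function gradeMultipleChoice(selected, correctOption) {
  return selected === correctOption;
}

function gradeNumeric(entered, correctValue, relTol = 0.02) {
  const x = parseFloat(entered);
  if (Number.isNaN(x)) return false; // non-numeric input is marked wrong
  return Math.abs(x - correctValue) <= relTol * Math.abs(correctValue);
}

console.log(gradeMultipleChoice("b", "b"));  // true
console.log(gradeNumeric("9.7", 9.81));      // true (within 2%)
console.log(gradeNumeric("10.5", 9.81));     // false
```

The simplicity of this logic is precisely why these formats dominate Web-based grading, and why richer response formats remain rare.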

       Electronic mail and chat sessions have proven to provide useful communication for administering coursework, for getting help with particular problems, and for submitting written assignments.(8-10)  Force diagrams can be entered by choosing vectors of appropriate length and direction from a menu and moving them to the appropriate point in a graph or diagram.(6)  Some very clever interactive Java applets (Physlets) have been developed which require answers from students, often relating to problems found in standard texts.(11)  These are portable and available for non-commercial application; we plan to use some of them in our material.

       Whole courses have been offered over the web,(8) leading some to think in terms of virtual departments.(12)  Tutorials, homework assignments, and quizzes are offered over the Internet.(2)  Homework grading services, such as WebAssign (North Carolina State University), will grade homework and maintain a gradebook; but here again, student responses are limited to numerical input (the outcome of a number of equation solving and substitution operations) and multiple choice. 

       Problems and Difficulties.  Major problems and difficulties stem from the formats used in current computer-graded quizzes and exams, e.g., multiple choice.  Shea found that students did 40% better on multiple choice tests than on fill-in-the-blank tests over the same material.  Students can recognize key words and phrases in a multiple choice format with little understanding of what these words actually mean.  Steinberg et al. found that roughly 30% of students who chose the correct option on selected questions from the multiple choice Mechanics Baseline Test(13) showed evidence of serious misconceptions when required to defend their choice in a short paragraph.(14)  Software that grades a response as correct despite student misconceptions is a serious problem in teaching environments because it reinforces, instead of challenges, student misunderstandings.  Having students write statements and equations at various stages of problem solving elevates the dialog and probes understanding at a higher level.  Non-numerical answers encourage a more universal appreciation of a problem, often nudging the student toward some insight into nature.  Follow-up questions regarding interpretation can help.  (Example: the center-of-mass acceleration of a rolling hoop vs. a solid sphere down the same incline of angle θ is a_cm = (1/2) g sin θ vs. (5/7) g sin θ.  Follow-up questions can ask about the absence of mass and radius dependence.  What happens at θ = 90°?  What happens when g triples?)  Students seeking numbers rather than equations for answers generally lose sight of such insight.  The format we are developing is born of a desire to achieve a higher level of student response.
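The rolling-body example above can be made explicit.  A short, standard derivation, writing the moment of inertia about the center of mass as I = βmr², shows why both mass and radius drop out of the answer:

```latex
% Rolling without slipping down an incline of angle \theta,
% with I = \beta m r^2 (\beta = 1 for a hoop, \beta = 2/5 for a solid sphere):
\begin{align*}
  m g \sin\theta - f &= m a
    && \text{(Newton's second law along the incline)} \\
  f r = I\alpha = \beta m r^2 \frac{a}{r}
    \;\Rightarrow\; f &= \beta m a
    && \text{(torque about the center of mass, } a = \alpha r\text{)} \\
  a &= \frac{g \sin\theta}{1 + \beta}
    && \text{(mass and radius cancel)}
\end{align*}
% Hoop (\beta = 1): a = (1/2) g \sin\theta.
% Solid sphere (\beta = 2/5): a = (5/7) g \sin\theta.
```

At θ = 90° the no-slip assumption fails (there is no normal force to supply friction), and tripling g simply triples a, which is exactly the kind of interpretive follow-up the text advocates.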

       The dissemination of educational software can also be a problem.  Software can be difficult to adapt to changing pedagogical needs.(15)  Hardware incompatibilities can be a factor, as in the case of early PLATO and some of the early personalized instruction materials.(1, 3)  Nevertheless, the PLATO system has been revised at least twice and is still in use at the University of Illinois after thirty years.(1)  Successful implementation often requires significant teacher training.  Since our approach is limited in scope (it is not a full learning system) and utilizes machine-independent, web-based software (Java, JavaScript), we anticipate fairly straightforward dissemination to other instructors and institutions.

 

 

References

 

1.     J. F. Wallin, Comput. Phys. 322-327 (1998).

2.     L. M. Jones, D. J. Kane, Am. J. Phys. 62, 832-836 (1994).

3.     A. Bork, J. College Sci. Teach. 10, 141-149 (1980).

4.     S. M. Lea, B. A. Thacker, E. Kim, K. M. Miller, Comput. Phys. 8, 122-127 (1994).

5.     L. C. McDermott, Physics by Inquiry (University of Washington, Seattle, WA USA, 1992).

6.     F. Reif, L. A. Scott, Am. J. Phys. 67, 819-831 (1999).

7.     A. P. Titus, L. W. Martin, R. J. Beichner, Comput. Phys. 12, 117-123 (1998).

8.     R. C. Smith, E. F. Taylor, Am. J. Phys. 63, 1090-1096 (1995).

9.     J. D. Finch, L. N. Hand, Am. J. Phys. 66, 914-919 (1998).

10.    G. M. Novak, E. T. Patterson, A. D. Gavrin, W. Christian, Just-in-Time Teaching:  Blending Active Learning with Web Technology (Prentice Hall, Upper Saddle River, NJ 07458 USA, 1999).

11.    W. Christian, A. Titus, Comput. Phys. 12, 227-232 (1998).

12.    D. J. Suson, L. D. Hewett, J. McCoy, V. Nelson, 67 6 (1999).

13.    D. Hestenes, M. Wells, Phys. Teach. 30, 159-166 (1992).

14.    R. N. Steinberg, M. S. Sabella, Phys. Teach. 35, 150-154 (1997).

15.    W. Christian, Comput. Sci. Education 1, 13-15 (1999).