Tyree, Alan L; Rawson, Shirley --- "Cost-Effective Computer Assisted Learning" [1993] JlLawInfoSci 11; (1993) 4(1) Journal of Law, Information and Science 155

Cost-Effective Computer Assisted Learning

by

Alan L. Tyree and Shirley Rawson[*]

Abstract

We describe a new and simple form of computer tutorial known as CRES. The CRES method has the advantage of accepting free form short answers thus freeing it of one of the major objections to the use of CAL methods in law. We also describe different teaching models which integrate CRES tutorials into existing courses. Finally, we describe an experimental program SAGES which automatically marks free form short answers.

___________________________

1. Introduction

In a recent article in this Journal,[1] Allen and Robinson present a bleak picture of the state of computer assisted learning (CAL) in legal education. They identify costs and lack of efficient distribution channels as factors which have inhibited the growth of CAL. They also point to the structure of traditional CAL programs as a limiting factor.

Traditional CAL programs have been designed as "pseudo-Socratic" teaching machines.[2] Using the ordinary branching capabilities of the computer, these programs attempt to guide a student through complex material by branching on student replies to questions. Unfortunately, the designer of the system must anticipate every possible student response and, perhaps worse, the most likely reason for each incorrect response.

The construction of a "pseudo-Socratic" machine is costly. Best estimates of costs range between 100 and 400 hours of teacher time to build a one-hour tutorial.[3] Once built, the programs are generally not easy to modify, making them unsuitable for many areas of law teaching.

The results can be effective. Kulik et al used a "meta-analysis" to study the results of 59 independent experiments which compared CAL with other teaching strategies.[4] Their findings were that computer-taught students did marginally better than students taught by other methods, while requiring only about two thirds of the time. The reduced time probably reflects the one-on-one aspect of computer assisted learning.[5]

2. Must the Computer do everything?

In our view one major problem with the traditional approach to CAL has been the attempt to build machines that are complete teachers.[6] Why should we discard excellent textbooks and other familiar teaching materials? Suppose instead that we begin by asking what the machine can do well, and then see whether some tasks can be assigned to the machine, others to books and printed materials, and still others to verbal interaction between teacher and student. In other words, let's explore a systems view of the teaching process.

It was considerations such as these that led us to explore the implementation of a Keller Plan course using computers to administer the heavy test burden that such courses require.[7] Although we found the Keller Plan to be an extremely effective method of teaching and to be very well received by students who did the courses, we also encountered ideological opposition to the method.[8] Keller Plans also produce rather flat marking curves which do not fit well into courses characterised by high levels of student competitiveness for marks. Finally, most of the teacher's work for a Keller Plan course must be completed before the course begins. Colleagues who have enjoyed a summer vacation mutter darkly about the Keller Plan teacher who seems to have nothing to do while others are lecturing.

The idea that the computer can do some educational tasks better than others led Park and McGregor-Lowndes to develop a series of multiple-choice questions as a basis for tutorials in a corporate law course for business students.[9] The machine can administer the questions, record results and maintain complete records. It is a limited but effective use of computer assisted learning.

3. CRES Tutorials

Multiple choice questions have a number of perceived limitations.[10] They are difficult to write and are not generally satisfying to either teacher or student when testing any but the most routine knowledge. When used in a Keller Plan course, we found that students tended to cycle through the exams quickly. In our Keller Plan courses, where the tests are administered by computer, we called the result the "arcade effect", for it seemed that many students reached a stage where the tests were treated more like an arcade game than as a serious academic exercise.

It was the "arcade effect" in the Keller Plan courses which originally led us to consider the use of more traditional problem-type questions. The technical problem, of course, is how to handle the free form response from students. The solution that we have adopted is so simple as to be embarrassing: after the student's answer has been "submitted", the computer asks the student a number of simple yes/no questions about the answer. The practical effect of these "critical review" questions is that the student marks their own answer. The "critical review" questions may be arranged in a tree structure so that a variety of possible student answers will result in a pass,[11] thus facilitating the use of questions which have no "right" answer.
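By way of illustration only, the critical review tree can be represented very simply. The following Python sketch is not the actual system: the names, sample questions and feedback below are invented for the example. Each node asks a yes/no question about the student's own answer, and different paths through the tree may terminate in a pass, so that a question with no single "right" answer can still be marked.

# Minimal sketch of a critical review tree (all names and questions invented).
# Each node asks a yes/no question about the student's own answer; leaves
# record a pass or fail, so several answer paths can all lead to a pass.

from dataclasses import dataclass
from typing import Union


@dataclass
class Outcome:
    passed: bool
    feedback: str


@dataclass
class Review:
    question: str                      # yes/no question about the answer
    if_yes: Union["Review", Outcome]   # next node when the student answers "yes"
    if_no: Union["Review", Outcome]    # next node when the student answers "no"


def run_review(node, ask):
    """Walk the tree, asking each yes/no question, until an outcome is reached."""
    while isinstance(node, Review):
        node = node.if_yes if ask(node.question) else node.if_no
    return node


# Example tree: two different lines of reasoning can both earn a pass.
tree = Review(
    question="Did your answer state that the treaty must be ratified to bind the state?",
    if_yes=Review(
        question="Did you discuss the effect of ratification in domestic law?",
        if_yes=Outcome(True, "Good answer."),
        if_no=Outcome(True, "Pass, but review incorporation into domestic law."),
    ),
    if_no=Outcome(False, "Re-read the section on treaty ratification."),
)

if __name__ == "__main__":
    result = run_review(tree, ask=lambda q: input(q + " (y/n) ").strip().lower() == "y")
    print("PASS" if result.passed else "FAIL", "-", result.feedback)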

As mentioned above, CRES (Critical Review Examination System) was originally conceived as an assessment device to be used in conjunction with a Keller Plan course.[12] It was the students who first alerted us to its tutorial potential when we learned that a preferred strategy was to use the examining system as a tutorial early in the study of a module.[13] Conversations with students convinced us that using CRES purely as a tutorial system was both feasible and desirable. We were also encouraged in this approach by the literature on "mastery learning" which suggested that short, criterion-referenced tests could be used in conjunction with other teaching methods to provide substantial improvements in student performance.[14]

The first use of CRES as a pure tutorial system was in International Law at the University of Sydney in 1992. Students were given a choice of using the CRES tutorial system or attending ordinary tutorials. Almost exactly half of the students chose computer tutorials.[15] The 'computer' students did marginally but consistently better in two examinations.[16]

The authors are part of a team which has received a National Teaching Development Grant from the Committee for the Advancement of University Teaching to construct CRES tutorials in additional subjects.[17] Current projects are contract, company, evidence, legal research, real property and intellectual property.

4. Integrating CAL - IP93

Our view is that the construction of computer tutorials is only a part, perhaps not the largest part, of the problem of computer assisted learning. The more challenging question is to find the 'best' way of using the tutorials in an overall integrated method of teaching. We are exploring several models which we will discuss in order from least to most 'radical'. Considering a range of options is necessary since there is in our Faculty a certain amount of political and ideological opposition to the use of computer assisted learning in any form.

The easiest approach is to simply use the tutorials as a 'bolt-on accessory' to an existing class.[18] This might take the form of offering tutorials in classes where financial constraints preclude the use of human tutors, or, as in International Law, offering the tutorials as an optional alternative to human tutorials. It is difficult (but, alas, not impossible) to object to the use of computer tutorials in classes which currently offer no tutorial assistance at all. We have encountered some resistance to offering computer tutorials as an optional alternative, but student reaction to the option has been so favourable that the opposition is unlikely to be successful or sustained.

A better method is to incorporate the tutorials as part of a "mastery learning" strategy.[19] This requires the teacher to identify clear "behavioural objectives" for each section of the course. The CRES tutorials would then be keyed to the objectives, and the CRES feedback should direct the student to alternative materials for reviewing areas of weakness. Mastery learning methods can result in a significant improvement in student performance as measured by final examination marks.[20]
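By way of illustration, the "keying" need be no more than a table linking each tutorial question to an objective and each objective to review material, so that a failed question points the student back to the relevant alternative materials. The Python sketch below is illustrative only; the objective codes, question identifiers and materials are invented and do not describe an existing implementation.

# Hypothetical sketch of keying CRES questions to behavioural objectives
# (all names invented).  A failed question directs the student to the
# review material listed for its objective.

# Each behavioural objective maps to alternative review material.
REVIEW_MATERIAL = {
    "IP-3.1": "Study guide module 3, pp 12-18; textbook ch 7",
    "IP-3.2": "Study guide module 3, notes on the fair dealing cases",
}

# Each CRES question is tagged with the objective it tests.
QUESTION_OBJECTIVES = {
    "q7": "IP-3.1",
    "q8": "IP-3.2",
}

def feedback_for_failure(question_id: str) -> str:
    """Direct a student who failed a question to the material for its objective."""
    objective = QUESTION_OBJECTIVES[question_id]
    return (f"Objective {objective} not yet mastered. "
            f"Review: {REVIEW_MATERIAL[objective]}")

print(feedback_for_failure("q7"))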

A different way of using computer tutorials is to make them a compulsory prerequisite to meeting with human tutors. We have all had the experience of the aggressive unprepared student who monopolises tutorial time to learn elementary material that he or she should have mastered before attending the tutorial. Requiring the completion of the computer tutorial before attending a conventional tutorial substantially enhances the quality of the face to face contact time. We are planning to explore this model in Company Law during the second semester of 1993.

Our current course in Intellectual Property illustrates the most radical departure from standard teaching models. The course consists of ten 'modules'. Each module has four components:

1. A directed self-study component: students are given a detailed 'study guide' which leads them through the reading material of the module. The study guide contains a list of behavioural objectives and notes which assist the student in places where the textbook is difficult;

2. 'Cooperative Learning Group' meetings: students are required to form CLGs of between 5 and 8 students. Each group meets to discuss the issues raised by the module and to identify any problems that should be discussed with the teacher; there are two CLG meetings per module;

3. Computer tutorials: these CRES tutorials are keyed directly to the stated behavioural objectives. Students must make a 'serious attempt' at the tutorials; there are two computer tutorials per module;

4. Small Group Meetings with the Teacher: provided the student attends the CLG meetings and makes a 'serious attempt' at the computer tutorials, he or she qualifies for a meeting with the teacher. These meetings are in groups of fewer than 10 students and are held fortnightly.

Experience so far with the course has been extremely positive. Student morale is high and participation rates have been very high. The Small Group Meetings have been very productive since the students are prepared and have resolved the easier issues in the Cooperative Learning Groups.[21] The computer tutorials ensure that no significant issues are overlooked. We believe that it is important that the prerequisite conditions be enforced: to allow unprepared students to attend the Small Group Meetings rewards lazy students and punishes those who have fulfilled the requirements.

5. Making CAL smarter

Student feedback on CRES tutorials has been very positive, but we feel that the system would be better if the CRES response were tailored to the individual student.[22] In order to keep the system 'cost effective', we are interested only in methods which require no additional costs in tutorial construction. We are exploring a method which automatically marks the student's answer. The basic idea is that a student who has passed the tutorial question should not necessarily be required to progress through all of the critical review questions.

SAGES (Short Answer General Examination System) is a system of automatic marking which depends upon the existence of a database of model pass and fail answers. SAGES works by defining a measure of similarity between two answers. The mark assigned by SAGES to a student's answer is then the mark of the 'closest' answer in the database.[23] The performance of SAGES can be good with a database of between 10 and 20 model answers. For example, experiments with International Law questions have shown agreement with the teacher in over 80% of all questions marked. In the Intellectual Property course, SAGES agreed with the teacher in 78% of all cases examined.[24] Note that with questions of this type it would be unlikely that the agreement rate between two human markers would exceed 90% - there are simply too many borderline answers.
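By way of illustration only, the nearest-neighbour marking step might look as follows. The Python sketch assumes a simple word-overlap (Jaccard) measure of similarity purely for the example; it is not a description of the measure actually used by SAGES, and the model answers shown are invented.

# Illustrative sketch of nearest-neighbour marking (assumed similarity measure).

def words(text: str) -> set:
    """Reduce an answer to a set of lower-case words."""
    return set(text.lower().split())

def similarity(a: str, b: str) -> float:
    """Word-overlap (Jaccard) similarity between two answers."""
    wa, wb = words(a), words(b)
    return len(wa & wb) / len(wa | wb) if (wa | wb) else 0.0

def nearest_neighbour_mark(student_answer: str, model_answers: list) -> str:
    """Assign the mark ('pass' or 'fail') of the closest model answer."""
    closest = max(model_answers, key=lambda m: similarity(student_answer, m[0]))
    return closest[1]

# An invented database of model answers, each marked "pass" or "fail".
models = [
    ("A treaty binds the state only after ratification and entry into force.", "pass"),
    ("A treaty binds the state as soon as it is signed.", "fail"),
]

print(nearest_neighbour_mark(
    "The state is bound once it ratifies and the treaty enters into force.", models))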

The reader will recall that we originally designed the CRES method for use in a Keller Plan course. The module examinations in such a course 'count' towards the final mark in the course, so we thought it necessary to monitor the system to ensure student honesty. SAGES was employed as one of the means of supervision. When the SAGES result differed from the student's own mark, the teacher was sent an e-mail message informing him of the difference. He could then examine the student's answer and act as a final arbiter.
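The supervision step is equally simple in outline. The following sketch is illustrative only: the function and field names are invented, and the real system delivered the notification to the teacher by e-mail rather than printing it.

# Illustrative sketch: flag any disagreement between the student's
# self-assigned mark and the SAGES mark for the teacher's attention.
# All names are invented.

def check_for_discrepancy(student_id, question_id, self_mark, sages_mark, notify):
    """Call notify() with a short report whenever the two marks differ."""
    if self_mark != sages_mark:
        notify(f"Student {student_id}, question {question_id}: "
               f"self-marked {self_mark!r}, SAGES suggests {sages_mark!r}; "
               f"please review the answer.")

# Here the notification is simply printed.
check_for_discrepancy("s123", "q7", "pass", "fail", notify=print)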

As a watchdog SAGES performed admirably. Students knew that the system was in operation, and would often send the teacher an e-mail message explaining an answer when they were concerned that their replies to the critical review questions had been close to the margin. The teacher did not find a single instance of 'cheating' in a course where nearly 10,000 questions were administered by the CRES system.[25]

SAGES is presently a very crude (but computationally cheap) method. It remains to be seen whether it can be developed into a workable scheme for providing more tailored feedback in CRES tutorials. Issues for further research include: the appropriate definition of the distance between answers, the optimal construction of a database, and the way in which SAGES calculations might be suitably incorporated into the CRES system.

6. Acknowledgements

Construction of CRES tutorials has been facilitated by a National Teaching Development Grant from the Committee for the Advancement of University Teaching. We are also grateful to the Law Foundation of New South Wales for its support of the Keller Plan courses.


[*] Alan Tyree is Landerer Professor of Information Technology and Law; Shirley Rawson is a senior lecturer in the Faculty of Law, University of Sydney.

[1] Allen, T and Robinson, W "The Future of Computer Assisted Learning in Law" (1992) 3 JLIS 274.

[2] The terminology is due to Graham Greenleaf.

[3] Most of the time is spent in drawing the flow charts. A language such as Andrew Mowbray's LES can be used to code the flow chart very quickly.

[4] Kulik et al, "Effectiveness of Computer-based College Teaching: a Meta-analysis of Findings" (1980) 50(4) Review of Educational Research 525-544.

[5] One-on-one tutorial methods are generally more effective than other teaching methods; see Bloom, BS, "The 2 Sigma Problem: The Search for Methods of Group Instruction as Effective as One-to-One Tutoring" [1984] Educational Researcher 4-16.

[6] Having said that, we should confess that part of the National Teaching Development Grant supports the construction of a "teaching workstation" based on combining the concepts of this paper with those developed for the DataLex Workstation.

[7] For a description of our earliest Keller Plan courses, see Rawson and Tyree "Fred Keller Goes to Law School" [1991] LegEdRev 12; (1990-91) 2 Legal Education Review 253; later developments may be found in Fred Keller Studies Intellectual Property, paper presented at the ALTA Conference, QUT, 1992 (also available from our ftp site: sulaw.law.su.oz.au).

[8] It has never been clear whether this political opposition is to the Keller Plan per se or the Keller Plan with computer administered testing.

[9] McGregor-Lowndes, M "Computer Tutorials for Business Law" ALTA Conference Proceedings, 1990.

[10] We say "perceived" because the law teacher's view of them is not shared by experts in assessment: see, for example, Heywood, J, Assessment in Higher Education, 2nd ed, 1989, Wiley; Ebel, RL and Frisbie, DA, Essentials of Educational Measurement, 4th ed, 1986, Prentice-Hall.

[11] See appendix I for a sample CRES question.

[12] Rawson and Tyree "Fred Keller Goes to Law School" [1991] LegEdRev 12; (1990-91) 2 Legal Education Review 253

[13] There is no penalty for failing a module examination in a Keller Plan course.

[14] Bloom, BS, Madaus, GF and Hastings, JT, Evaluation to Improve Learning, 1981, McGraw-Hill; Stice, JE, "PSI & Bloom's Mastery Model: A Review & Comparison" [1979] Engineering Education 175-180.

[15] This result is interesting in itself. The only information available to the students was a short in-class demonstration of CRES.

[16] The result was not statistically significant if the null hypothesis was 'no difference'. However, the results are almost exactly those found by the Kulik et al study mentioned above. We also note that students at our university find the concept of an 'insignificant' increase in marks to be meaningless!

[17] The other member of the team is Don Rothwell of the Faculty of Law at the University of Sydney.

[18] Bloom, BS, Evaluation to Improve Learning, 1981, McGraw-Hill.

[19] Bloom, BS, "The 2 Sigma Problem: The Search for Methods of Group Instruction as Effective as One-to-One Tutoring" [1984] Educational Researcher 4-16.

[20] Bloom, BS, "The 2 Sigma Problem: The Search for Methods of Group Instruction as Effective as One-to-One Tutoring" [1984] Educational Researcher 4-16.

[21] For information on "cooperative learning" methods, see Slavin, RE Cooperative Learning: Theory Research and Practice, 1990, Allyn & Bacon; Johnson, DW and Johnson, RT, Learning Together and Alone: cooperative, competitive and individualistic learning, 1987, Prentice-Hall; Sharan, S (ed) Cooperative Learning: Theory and Research, 1990, Praeger.

[22] Interestingly enough, we know of no educational research which lends support for this idea.

[23] This idea has been used successfully in information retrieval systems: see Salton G, SMART Retrieval System: Experiments in Automatic Document Processing, 1971, Prentice-Hall.

[24] This result is particularly encouraging: due to a lack of time, the database for most questions consisted of a single model pass. This pass answer was automatically "deformed" into a model fail.

[25] SAGES was not the only method used for surveillance of the system. The teacher also examined questions which were statistically deviant in other ways, and routinely examined questions at random.

