The world of computer-assisted instruction (CAI) can appear at first to be one of complicated jargon and equipment. While getting started may indeed be difficult, however, the results of CAI are extremely rewarding. The purpose of this article is to provide a step-by-step account of a CAI project in music fundamentals developed at the University of Nebraska-Omaha and the University of Minnesota, along with conclusions based on the results.

The set of computer lessons developed is known as MUSFUND—a group of computer instructional programs in the basic elements of music theory. The first step in developing MUSFUND was to establish that there was indeed a need for computer-assisted instruction in a music fundamentals course. While the authors felt strongly that such a need existed, it seemed important to demonstrate on paper that CAI would bring a significant instructional improvement. An evaluation of a traditional class in music fundamentals was therefore undertaken to obtain information about instructional needs. The measurement tool selected was TABS (Teaching Analysis by Students),1 chosen because of its concentration on questions that measure teaching effectiveness. Particularly relevant were items addressing the pace at which students came to understand the material and the effectiveness with which information was delivered.

Careful analysis of the TABS results made one major problem apparent: the diversity of backgrounds among the students in the class. Some felt that the pace of the course was too fast; others indicated that it was too slow. While the survey showed no problems with other aspects of the course, it was evident that the extreme range of musical experience each student brought to an introductory music course posed a serious difficulty. Based on this information it was decided to initiate MUSFUND, a CAI project in music fundamentals.

Next the authors faced a situation in which the computer equipment of their respective institutions differed to the extent that programs developed on one campus would not run on the other. (PLATO served as instructional hardware at Nebraska, while the Cyber 73 of the Minnesota Educational Computing Consortium served at Minnesota.) The basic instructional design, however, would not require extensive revision. After considering different approaches it was decided to combine an existing textbook-workbook2 with computer-delivered drills. In this way MUSFUND could build on pedagogical tradition as well as provide students with material to study between sessions at the terminal. Another way to build on previous effort was to recommend existing computer drills in addition to those developed for this project.

The programming for MUSFUND was done by Dr. Gross, with assistance from programmers at the University of Minnesota; Dr. Foltz previewed and evaluated each lesson. In addition, some advanced music students tried the lessons and provided student reaction. Such evaluation before the intended student use was particularly valuable since new programs were being developed, but previewing would have been useful in any event to ensure a successful implementation.

The resulting course3 consists of twenty drills and two practice tests and covers music notation, scales, intervals, chords, and terminology. Each question in the computer drills is either multiple choice or answerable with a single typewriter character. Such a one-character response helps prevent spelling or typing errors, which would register as wrong answers even when the student knew the correct answer. If a student types in the answer itself, the program accepts the content of the correct choice as well as its letter. For example, if the correct answer on a multiple-choice item is "a. rhythm," the computer will accept either "a" or "rhythm."
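The answer-matching behavior just described can be sketched in modern terms. The following Python fragment is purely illustrative (MUSFUND itself was written for PLATO and the Cyber 73, long before Python existed), and all names in it are the present editor's inventions, not the original program's:

```python
def check_answer(response, choices, correct_letter):
    """Return True if the response matches the correct choice.

    `choices` maps option letters to their text, e.g. {"a": "rhythm"}.
    Either the letter itself or the full text of the correct option
    counts as right, mirroring the behavior described in the article.
    """
    response = response.strip().lower()
    return response in (correct_letter, choices[correct_letter].lower())

choices = {"a": "rhythm", "b": "melody", "c": "harmony"}
check_answer("a", choices, "a")        # accepted
check_answer("rhythm", choices, "a")   # also accepted
check_answer("melody", choices, "a")   # rejected
```

Matching on either the letter or the content keeps the one-character convenience while forgiving students who type out the answer in full.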

To illustrate the course, consider the following case from an exercise in note reading. First, the computer presents directions. Then an arrow points to a note selected at random from the screen display of music notation. The student answers with a letter from A to G and receives an immediate evaluation. Not all drills require music notation. For example, the identification of musical terms is done in a straightforward multiple-choice format. To alleviate the boredom and impersonality of working at a computer terminal, some choices are intended to supply humor rather than a meaningful option, e.g., "Beats me!"
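The note-reading exercise can likewise be sketched hypothetically. Everything here is an assumption for illustration — the set of on-screen notes, the function names, and the feedback wording are not from the original program:

```python
import random

# Illustrative pool of notes shown in the notation display;
# here, for example, the line notes of the treble staff.
NOTES_ON_SCREEN = ["E", "G", "B", "D", "F"]

def pick_note(rng=random):
    """Choose the note the arrow will point to, selected at random."""
    return rng.choice(NOTES_ON_SCREEN)

def evaluate(note, answer):
    """Give an immediate evaluation of a one-letter response from A to G."""
    answer = answer.strip().upper()
    if len(answer) != 1 or answer not in "ABCDEFG":
        return "Please answer with a letter from A to G."
    return "Correct!" if answer == note else f"No, that note is {note}."
```

Restricting the reply to a single letter from A to G follows the same one-character principle as the multiple-choice drills.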

After each item has been asked once, any questions not correctly answered on the student's first try are asked again. At the end of the lesson, students receive a score expressed as a percentage. If the score is 90% or above, the terminal displays a congratulatory message, "Good! You're on the ball!" Scores below 70% are accompanied by a recommendation that the student study more.
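The retry-and-scoring scheme described above can be sketched as follows. This is again a hypothetical modern rendering: the assumption that the score reflects first-try accuracy is the editor's (the article does not say explicitly), and the study-more message is paraphrased; only the congratulatory message is quoted from the article.

```python
def run_drill(questions, check):
    """Ask every question once, re-ask those missed on the first try,
    and return a percent score with the end-of-lesson feedback.

    `questions` is a sequence of items; `check(q)` presents the item
    to the student and returns True for a correct response.
    """
    missed = [q for q in questions if not check(q)]
    for q in missed:                  # second pass over first-try misses
        check(q)
    score = round(100 * (len(questions) - len(missed)) / len(questions))
    if score >= 90:
        feedback = "Good! You're on the ball!"            # quoted in the article
    elif score < 70:
        feedback = "Please study this material further."  # paraphrased
    else:
        feedback = ""
    return score, feedback
```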

At Nebraska MUSFUND was integrated into a music fundamentals course by replacing one 50-minute class period per week with one half-hour individual session at the computer. In addition, the class met for two 50-minute sessions per week. The entire course was worth three credit hours. At Minnesota MUSFUND was treated as optional work done outside of the classroom, not replacing any class time.

Once feasibility had been determined and the CAI project implemented, it was time for the final step: evaluation. This was undertaken in three stages: (1) administering a pre/post test in music fundamentals, (2) giving a PLATO evaluation (at Nebraska only), and (3) conducting personal interviews with students and studying enrollment figures. The pre/post test was based on one developed by Professors Gross and John Anderson of the University of Minnesota.4 Results of this test are still being studied and compared with scores from non-CAI fundamentals tests over a three-year period. However, preliminary indications suggest a dramatic improvement in understanding of fundamentals in the CAI course as compared with non-CAI courses.5 The second evaluation tool is one developed specifically for PLATO at the University of Delaware.6 It has proved extremely valuable in that it helps to isolate specific problem areas within a CAI course.

The final evaluation tools were personal interviews and a study of enrollment figures. Here student responses were extremely supportive of CAI. Students enjoyed the personalized instruction and felt that time spent on the computer was the single most effective activity outside class for studying fundamental materials. This is supported at Nebraska by a comparison of enrollment figures over the past three years, in which attrition rates for music fundamentals classes averaged 20-30%. With the use of MUSFUND, the attrition rate dropped to less than 10%. It seems increasingly clear that personalized instruction and graphic screen display do improve student learning and attitude.

From the authors' experience with CAI, it has become apparent that implementation and evaluation of music CAI must proceed in the context of each school's instructional goals and limitations. Just as individuals differ, so do institutions, so there is no one ideal system for everyone. Yet the following conclusions seem to apply to all music programs.

1. The system must be affordable. This includes not only the initial purchase or rental price but also the cost of operation and lesson development or acquisition.

2. Students should improve as a result of the CAI. If they do not, the lessons may be at the wrong level of difficulty, students may not be using them enough, or the course may not lend itself to computer drills. In the last case, the incompatibility may be due to the subject matter itself or to discrepancies between the methodology used in class and that of the authors of available CAI.

3. Lessons must be easy to use. Besides being unfamiliar and possibly uneasy with computers, many music students have trouble handling the terminal's typewriter-like keyboard. On the other hand, students who do know how to type will find it hard to adjust to a program in which one typewriter character is used to represent another.

4. While not intended as entertainment, music CAI should not be so uninteresting that students stop using it. At first the novelty will keep many students involved. To ensure continued interest, lessons should include "human" elements such as humor and personalization, and also as much actual sound as possible.

5. Evaluation should not be limited to studies after students complete a course. Effective evaluation begins before students even see the lessons, in the form of needs assessment and instructor previews of completed courseware. Such "dry runs" can reveal problems ranging from missing computer parts to faulty programming or simply confusing test items. Advance evaluation is therefore a necessity if a program is to be properly reviewed.

6. Attitude and achievement should be measured separately, as previous studies in music CAI7 and in other fields have shown that students may express greater satisfaction with programs which are teaching them less.8 Therefore, simultaneous attention to both attitude and achievement will result in effective music CAI that students will want to use.

In conclusion, the authors hope that this article has provided ideas and material that might help others implement and evaluate music CAI. In the period between writing and publication a whole new generation of computers can surface; therefore, information of a technical nature has been omitted from this article. Yet, armed with clear musical and instructional objectives, and with access to existing research and development in CAI, college instructors have the materials before them to incorporate computers into their own instructional strategies with relative ease.



Financial support for the Minnesota programs was provided by grants from the University of Minnesota Center for Educational Development and the University of Minnesota Council on Liberal Education Small Grants Program. Funding for development at Nebraska was provided by a grant from the University of Nebraska Computer Network to Professors Roger E. Foltz and Warren F. Prince.

1Teaching Analysis by Students, developed by the Center for Instructional Resources and Improvement, University of Massachusetts, adapted by Marilyn Leach, University of Nebraska-Omaha, 1977.

2Frank W. Hill and Roland Searight, Study Outline and Workbook in the Elements of Music (Dubuque: Wm. C. Brown Co., 1977).

3Dorothy Gross and Scott Kallen, "MUSFUND," computer program written on PLATO, 1979.

4Dorothy Gross, Computer-Assisted Music Course: User's Guide (University of Minnesota, 1979).

5Dorothy Gross, "A Computer-Assisted Music Course and its Implementation," in Computing in the Humanities, ed. Peter C. Patton (D.C. Heath and Co., in press).

6Fred Hofstetter, "Evaluation of a Competency-Based Approach to Teaching Aural Interval Identification," Journal of Research in Music Education XXVII (1979), 202-213.


8Esther Steinberg, "A Review of Student Control in Computer-Assisted Instruction," Journal of Computer-Based Instruction III (1977), 84-90.
