
ASC Proceedings of the 38th Annual Conference
Virginia Polytechnic Institute and State University - Blacksburg, Virginia
April 11-13, 2002, pp. 71-78

 

Computer-Assisted Tutorials for Structures Instruction

 

Anoop Sattineni and Roger Killingsworth

Auburn University

Auburn, Alabama

 

In Fall semester 2000 Auburn University made the transition from quarters to semesters. University administration mandated a limit of 120 semester hours for all programs. To meet this limit the construction management faculty had to delete all repetition from the undergraduate curriculum, including the review of physics principles at the beginning of the structures sequence. As a replacement, computer-assisted tutorials were developed for students to use outside the classroom. The tutorials covered basic topics using interactive questions, pictures, diagrams and video clips. The students’ grasp of the principles was measured by a quiz at the end of each unit; the quizzes comprised 5% of the term grade to encourage participation. A questionnaire, tutorial-generated data and a comparison of quarter grades with semester grades were used to provide a preliminary evaluation of the tutorials. The preliminary evaluation suggested that, while some improvements were needed, the tutorials were successful in providing the needed review. The process of creating these tutorials and the results of the survey conducted are detailed in this manuscript.

 

Key Words: Undergraduate Education, Structures Instruction, Computer-Assisted Instruction, Intelligent Tutoring Systems, Video Clips, Quizzes.

 

 

Introduction

 

In the fall of 2000 Auburn University made the transition from the quarter system to the semester system. In preparation for this change the university administration mandated that all programs be limited to 120 semester hours so that students could realistically expect to complete an undergraduate degree in four years. This caused many problems for the technical programs of the university, particularly for the construction management unit.

 

The construction management unit was required to reduce the equivalent of 136 semester hours to the mandated 120 while still meeting all accreditation requirements of the American Council for Construction Education (ACCE). To meet this goal the unit eliminated all of the redundancies and some of what was felt to be necessary repetition in course content. One area particularly affected by this process was the structures sequence: four quarter courses (Mechanics of Structures, Strength of Materials, Reinforced Concrete and Applied Structures) had to be compressed into three semester courses.

 

Mechanics of Structures and Strength of Materials were the two courses with the most repetition in this sequence. It was common practice to provide a thorough review of the physics concepts underlying structures at the beginning of both courses, and a review of Mechanics at the beginning of Strength of Materials. Because of this repetition the structures faculty decided to combine these two courses and leave the Reinforced Concrete and Applied Structures courses essentially intact.

 

Past experience, however, indicated that this approach had problems. First, to cover material from two courses of 50 contact hours each in one class of 45 contact hours, not only would the repetition have to be eliminated but the pace of the course would have to be very brisk. Second, the course would be based on physics concepts that, past experience indicated, had not been well learned. Third, not all of the students would have had physics the preceding semester, which would make grasping the basic concepts even more difficult. Fourth, this incomplete grasp of the background concepts would hinder the students’ understanding of the structural applications. And, fifth, there was no time in the single course to provide even a rudimentary review. What was needed, therefore, was some method of providing a review of these structural concepts outside of class.

 

Inoue reports that the tutorial format of computer-assisted instruction is particularly suitable for teaching basic concepts (Inoue, 1999). Inoue also reports that the strengths of this format are that it explains the basic concepts, tests the understanding of those concepts and guides the student through a logical sequence in learning the material. Therefore, the structures faculty decided to use interactive computer tutorials to provide the needed review.

 

This paper will describe the development of the tutorials and of an evaluation questionnaire, present the results of the questionnaire, and discuss those results along with recommendations for modification and future applications.

 

 

Methodology

 

Tutorial

 

The purpose of the project was to provide, outside the classroom, a review of the basic structural concepts from physics needed to prepare students to better understand the mathematical applications presented in the beginning structures class. The topics considered important were forces, trusses, stress, strain, Poisson’s Ratio, temperature stress, stress concentration, center of gravity, moment of inertia, and beams. To accommodate different learning styles, it was decided to present the concepts in writing, through visual illustrations and through video clips. Further, recognizing that different students react differently to different professors, a choice of video clips from different professors was offered for each topic.

 

Macromedia Flash was used to create the tutorials for delivery through Web-CT, an internet-based course material presentation program. Web-CT suited the tutorials and quizzes well: students could review the material at their convenience, the program recorded all grades automatically, and the setup was a one-time effort for the instructor. The interactive nature of the tutorials would also be difficult to duplicate on paper. The tutorials were divided into three sections: an interactive tutorial, video clips of explanations, and a unit quiz. Web-CT also provided the means to record quiz performance and the time spent on each tutorial. The interactive tutorial consisted of simple, interactive, context-sensitive, often humorous, multiple-choice questions, each accompanied by an illustration (see Figure 1). Any choice took the student to a screen with feedback on that choice at the top, and an explanation of the concept with an illustrative picture at the bottom (see Figure 2). The oral explanations were provided through video clips of professors in an office setting; these clips were encoded in RealMedia format and placed on a streaming server to enable fast downloads. A multiple-choice quiz was given at the end of each unit with immediate feedback on the student’s performance.

 

Figure 1: An example of a tutorial question

Figure 2: An example of an answer to a question in the tutorial
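
The original tutorials were Macromedia Flash movies delivered through Web-CT, so their actual source cannot be reproduced here. Purely as an illustration of the question-and-feedback flow shown in Figures 1 and 2, the minimal Python sketch below stands in for that logic; all names and the sample question are hypothetical.

    # Minimal Python stand-in for the Flash question/feedback flow.
    # Illustrative only; names and the sample question are hypothetical.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class TutorialQuestion:
        prompt: str         # multiple-choice question text (as in Figure 1)
        choices: List[str]  # answer options shown to the student
        correct: int        # index of the correct choice
        explanation: str    # concept explanation shown after any choice (Figure 2)

    def respond(q: TutorialQuestion, picked: int) -> str:
        """Build the feedback screen: a verdict on the student's choice at
        the top, the concept explanation at the bottom, whatever the answer."""
        verdict = "Correct!" if picked == q.correct else "Not quite."
        return verdict + "\n" + q.explanation

    # Hypothetical sample question in the spirit of the tutorials:
    q = TutorialQuestion(
        prompt="A book rests on a table. What is the net force on the book?",
        choices=["Zero", "Its weight", "The normal force from the table"],
        correct=0,
        explanation="The table's upward normal force balances the book's "
                    "weight, so a body at rest carries zero net force.",
    )
    print(respond(q, picked=1))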

 

Each tutorial was designed to be completed in 15 to 30 minutes, including viewing of the video clips. To encourage use of the tutorials, the unit quizzes were made to comprise 5% of the student’s term grade. The students were given a schedule indicating when each tutorial should be viewed and its quiz taken for credit, prior to that subject being covered in class. After the scheduled end date the tutorial could still be viewed but the quiz could no longer be taken; a missed quiz resulted in a grade of 0. The quiz grades were recorded and saved by the Web-CT program.
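
As a concrete illustration of this grading rule, a minimal sketch follows, assuming the thirteen unit quizzes listed in Table 5, each scored out of 5, and the 50-of-1000-point allocation mentioned in the survey questions; the function and its inputs are hypothetical.

    # Sketch of the quiz component of the term grade: 13 unit quizzes
    # scored out of 5, worth 50 of 1000 term points (5%), with a missed
    # quiz counted as 0. Illustrative only; names are hypothetical.
    QUIZ_POINTS = 50   # of 1000 term points, i.e. 5% of the term grade
    N_QUIZZES = 13     # one quiz per tutorial unit (see Table 5)
    QUIZ_MAX = 5       # each quiz is scored out of 5

    def quiz_points_earned(scores):
        """Scores for quizzes actually taken; a quiz not taken before its
        scheduled end date simply contributes nothing."""
        return QUIZ_POINTS * sum(scores) / (N_QUIZZES * QUIZ_MAX)

    # A student who took 12 of the 13 quizzes, averaging 4 out of 5:
    print(quiz_points_earned([4] * 12))   # about 36.9 of the 50 available points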

 

Evaluation

 

Inoue reports that there is no formally validated method to evaluate the performance of a computer-assisted instruction tutorial. Therefore, the professors developed their own evaluation strategy.

 

Four areas were considered to be of particular importance in evaluating the tutorials: the reaction of the students, quiz performance, time spent on the tutorials and a comparison of grades between the semester course and the quarter courses.

 

A specialist in questionnaire development and evaluation from the University’s College of Education was employed to develop and evaluate a questionnaire to determine the reaction of the students. The specialist attended the class to learn its format and methodology, and interviewed students and professors to determine their concerns. Using this information the consultant developed a questionnaire of 42 questions, most in a multiple-choice format; he tested the validity of the questions and eliminated four. The first 16 questions measured the students’ opinion of the overall quality of the web site, the tutorials, the video clips, and the tutorial quizzes. Question 17 gave the students the opportunity to type in comments. The next 15 questions were used to gauge the general concerns that the professors and students had expressed during the interviews and to validate the students’ responses to the first 16 questions. Finally, the students were asked if they would have completed the tutorials and quizzes had the quiz results not been part of the course grade. The questions are given in Appendix A.

 

The survey was administered using Web-CT. The program recorded that a student had completed the questionnaire but did not correlate responses with individual students, so confidentiality could be maintained while still allowing extra credit to be given to encourage participation.

 

A comparison of the percentage of letter grades in the semester course with those in the quarter courses was used to provide a preliminary, rough evaluation of effectiveness. The number of students taking the courses during the last year of quarters and the first year of semesters was tallied by course grade, and percentages were calculated based on the total number of students for the year.
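
A minimal sketch of such a tally (not the authors’ actual procedure) might look like the following, with a hypothetical roster:

    # Tally letter grades for a year's sections and convert the counts
    # to percentages of the year's total enrollment (illustrative only).
    from collections import Counter

    def grade_percentages(grades):
        counts = Counter(grades)
        total = len(grades)
        return {g: round(100 * counts[g] / total, 1) for g in "ABCDF"}

    # Hypothetical ten-student roster:
    print(grade_percentages(list("AABBBCCCDF")))
    # -> {'A': 20.0, 'B': 30.0, 'C': 30.0, 'D': 10.0, 'F': 10.0}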

 

The questionnaire was completed and ready for use in Fall Semester 2001. The data for students’ reaction, quiz performance, and time spent on the tutorials was limited to one section of 26 students. Also, the only year of grade data for the semester course was the transition year, in which it was anticipated that the unsettled atmosphere and the necessary changes to the courses would adversely affect student performance. It was decided that an evaluation at this time would give a preliminary indication of the success of the project, but that more data would have to be collected for a definitive evaluation.

 

Twenty-four of the twenty-six students in the course answered the survey. The sample was relatively small and was collected from only one group. A close inspection revealed that the data conformed to a Gaussian distribution. Since the data was being used to describe only one group, the mean and standard deviation were considered sufficient for statistical evaluation and were calculated for all pertinent questions. A detailed statistical analysis, including correlating the students’ responses, was not performed because the survey had only twenty-four respondents. After conducting the survey for at least another year, the authors expect to make a more elaborate statistical analysis.
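
The descriptive statistics involved are simply the mean and standard deviation of each item’s responses; a short sketch with hypothetical ratings follows (the paper does not state whether the sample or population standard deviation was used; the sample form is assumed here):

    # Mean and standard deviation of one survey item's responses
    # (hypothetical ratings; sample standard deviation assumed).
    from statistics import mean, stdev

    responses = [3, 4, 4, 5, 3, 4, 2, 4, 5, 3, 4, 4]
    print(round(mean(responses), 2), round(stdev(responses), 2))  # 3.75 0.87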

 

 

Results and Discussion

 

As stated above, four specific areas were identified as important for evaluation: the reaction of the students, quiz performance, time spent on the tutorials and a comparison of grades between the semester course and the quarter courses. The students’ reaction was determined through the questionnaire. Students were asked to rate the tutorials, quizzes and videos on a scale of 1 to 5 (5 being most effective and 1 least effective). The responses tabulated in Table 1 below indicate that the online videos were rated the least effective of the three.

 

Table 1

 

Summary of student responses rating the three components of the web site

Component                  Mean    SD
Interactive Tutorials      3.70    0.88
Online Quizzes             3.43    1.04
Online Videos              2.26    1.25

 

The students were asked the same four questions about each of four components: the interactive tutorials, the online quizzes, the online videos and the overall web content. Their responses to these sixteen questions are summarized in Table 2. Each question was answered on a scale of one to seven, with one the most favorable answer and seven the least favorable.

 

Table 2

Specific questions about the web components

            Overall Web      Interactive      Online           Online
            Content          Tutorials        Quizzes          Videos
Question    Mean    SD       Mean    SD       Mean    SD       Mean    SD
1           1.42    0.72     1.35    0.88     1.48    0.73     2.00    1.38
2           2.00    1.10     2.22    0.90     1.52    0.79     2.00    0.98
3           2.96    1.19     2.91    1.16     2.74    0.92     3.83    1.70
4           2.87    1.14     2.91    1.08     2.96    1.07     3.74    1.60

Notes:

Question 1: How often did you complete / review this component on the web?

Question 2: In general, how seriously did you treat this particular component?

Question 3: How helpful was this component in helping you be better prepared for the upcoming classes?

Question 4: How well did this particular component cover the material for the upcoming classes?

 

Furthermore, the students were encouraged to write comments about the questions. The comments presented in Table 3 are summarized from the written responses; similar comments were grouped together and counted.

 

Table 3

Representative comments from students

Comment                                                                        N
"The tutorials were helpful in explaining basic concepts"                     21
"The content in the videos was too repetitive and most often already
covered in the tutorials"                                                     18
"The tutorials were too basic in their depth of material covered"             20
"The tutorials should cover mathematical procedures by solving problems
in a step by step manner"                                                     24
"The quality of the streaming video was poor"                                  5
"The quiz questions sometimes got a little confusing"                          5

Note:
"N" refers to the number of similar comments written by students.

 

The results in Tables 1 and 2 and the comments in Table 3 indicate that the students regularly completed the tutorials, quizzes and videos. They also indicate that the students found the quizzes and tutorials more helpful than the videos, and felt that the material was better covered in the quizzes and tutorials than in the videos. As can be seen, the students believed the basic objective of the tutorials was met; that is, the tutorials helped in the review of the basic concepts. However, the students suggested improvements to the video content, to the depth of material in the tutorials and to the quiz questions. They also indicated that it would be beneficial to cover mathematical procedures.

 

Fifteen questions were used to validate the students’ responses to the preceding questions and to address general concerns of the students and the professors. The results for these questions are given in Table 4.

 

Table 4

Results for Questions 18 through 32

Question                                                                 Mean    SD
I enjoyed completing the tutorials that were associated with
this class.                                                              2.26    0.62
I thought the tutorials were too simple, or over simplified.             2.30    0.97
I thought the tutorials reached an appropriate level of depth
for the material.                                                        3.48    0.85
The interactive quizzes were easy to understand and follow.              1.87    0.63
I think the tutorials have helped my performance in this class.          2.78    0.74
I think the tutorials should introduce more theory.                      2.22    0.80
I think the tutorials should provide more explanation as to the
"why" of the content being covered.                                      2.00    0.95
I think the tutorials should work through some problems step by step.    1.22    0.42
The videos were easy to understand.                                      2.57    1.20
The videos were too long in length to hold my attention.                 3.04    1.07
The tutorials took too long to complete.                                 3.78    0.60
The quizzes were a fair test of the knowledge introduced during
the tutorial.                                                            2.30    0.82
The quiz point allocation (50/1000) was an appropriate relevant
percent.                                                                 2.35    0.83
The quiz point allocation (50/1000) should have been higher.             3.13    0.81
I would have completed these tutorials even if they did not count
towards my final grade.                                                  2.96    1.19

Notes:
The students had five response options for the above questions:
1. Strongly Agree, 2. Agree, 3. Neutral, 4. Disagree, 5. Strongly Disagree

 

The responses confirmed that the material could have been covered in more depth, that math calculations should be included, that the videos needed improvement, and that most of the students found the quizzes fair and appropriate. The students did not feel strongly that the tutorials helped course performance. Most students felt that the amount of time spent on web-based content was reasonable and that the material itself was easy to follow. The students reported that five percent of the course grade for the online quizzes was reasonable. Only half of the students reported that they would have completed the online quizzes had they not been part of the grade.

 

Web-CT allows the course instructor to view certain web statistics, such as the total number of hits per page, the total amount of time spent by all students on a particular page and the average amount of time spent per hit. These statistics, along with the number of students taking each quiz and the average score on each quiz, are presented in Table 5.

Table 5

Summary of time spent on each tutorial and the quiz average

Tutorial Topic            Hits    Total Time    Avg Time/Hit      N    Quiz Average (out of 5)
Forces1                    193      17:45:46            5:31     26    4.5
Beams                       51       4:27:23            5:14     21    4.1
Forces2                    108       9:21:18            5:11     23    3.9
Poisson's Ratio             65       5:20:03            4:55     21    4.4
Moment of Inertia           56       4:11:29            4:29     24    4.2
Center of Gravity           57       4:06:53            4:19     20    4.0
Trusses: Part II            74       5:15:48            4:16     25    4.4
Introduction to Strain      47       3:04:22            3:55     23    3.45
Trusses: Part I             76       4:33:21            3:35     25    4.4
Introduction to Stress      75       4:17:35            3:26     23    3.45
Stress and Strain           56       1:41:47            1:49     23    3.45
Stress Concentration        52       1:25:43            1:38     23    3.45
Temperature Stress          60       1:11:58            1:11     21    3.7
Mean                     74.62       5:07:57            3:48  22.92    3.95
Standard Deviation       39.02       4:20:25            1:26   1.80    0.41

Note:

The value "N" refers to the number of students who took the quiz.

 

The mean and standard deviation for the number of hits are skewed because students visited the first two tutorials, Forces1 and Forces2, more than the others. The authors believe this may be attributed to the students getting used to the system. If the first two tutorials are excluded, the mean number of hits was 60.82 (SD 10.25) and the mean total time spent was 3:36 hours (SD 1:30 hours). With an overall average of just under 4 minutes per hit, the tutorials took much less time than anticipated. The data also indicates that most students reviewed the tutorials at least twice before taking the quiz and that most of the students completed the quizzes. The quiz average, 3.95 out of a possible 5, indicates that the tutorials were successful in providing a review of the material.
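
The adjusted figures quoted above can be reproduced directly from the hit counts in Table 5; a short sketch follows (the sample standard deviation is assumed, which matches the reported value):

    # Recompute the adjusted hit statistics from Table 5, dropping the
    # first two tutorials (Forces1 and Forces2), whose counts were
    # inflated while students got used to the system.
    from statistics import mean, stdev

    hits = {
        "Forces1": 193, "Beams": 51, "Forces2": 108, "Poisson's Ratio": 65,
        "Moment of Inertia": 56, "Center of Gravity": 57,
        "Trusses: Part II": 74, "Introduction to Strain": 47,
        "Trusses: Part I": 76, "Introduction to Stress": 75,
        "Stress and Strain": 56, "Stress Concentration": 52,
        "Temperature Stress": 60,
    }
    adjusted = [h for t, h in hits.items() if t not in ("Forces1", "Forces2")]
    print(round(mean(adjusted), 2), round(stdev(adjusted), 2))  # 60.82 10.25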

 

A preliminary comparison of the class grades under the quarter and semester systems is presented in Table 6. The grades are from the last year of the quarter system and the first year of the semester course.

 

Table 6

Comparison of quarter and semester grades

                          A      B      C      D      F    Total
Quarters                 59     81     84     35     29      288
Quarter Percentages    20.5   28.1   29.2   12.2   10.1      100
Semesters                33     47     36     13     13      143
Semester Percentages   23.2   32.9   25.2    9.1    9.1      100

 

As can be seen, the preliminary indication is that the tutorials were as successful in providing the needed review as the in-class review given in the quarter courses. It is possible that, with some modification of the tutorials, student performance will show that the computer tutorials provide a superior review to the one given in the quarter courses.

 

 

Summary and Conclusion

 

In response to the need to provide a review of basic principles from physics for a new structures course, a series of computer-based tutorials was developed for use outside of class. A preliminary evaluation of the tutorials was conducted using a questionnaire, tutorial-generated data and a comparison of grades made in the new course with those of the old courses. The preliminary results indicated that, while some improvement was needed, the tutorials were successful in providing a needed review of basic principles. Suggestions for improvement included greater depth of coverage of the topics, the addition of mathematical procedures, improvement of the video clips and revision of the quiz questions to make them more easily understood. Because of the limited population involved, however, a definitive evaluation cannot yet be made. It is recommended that these revisions be made, that the study be continued to provide a definitive evaluation, and that the use of this tool as a method of providing review of prerequisite material be further investigated.

 

 

References

 

Inoue, Yukiko (1999). Evaluating intelligent tutoring systems. Information Analysis, 70, 1-9.

 

Appendix A

 

List of questions in the student survey

1. How often did you complete the tutorials and associated quizzes?

2. In general, how seriously did you treat those tutorials you completed?

3. How helpful were the tutorials in helping you be prepared for upcoming classes?

4. How well did the tutorials cover the material for upcoming classes?

5. How often did you complete the interactive question sections?

6. In general, how seriously did you treat those interactive questions you completed?

7. How helpful were these interactive questions in helping you be prepared for upcoming classes?

8. How well did the interactive questions cover the material for upcoming classes?

9. How often did you view the associated videos?

10. In general, how seriously did you attend to those videos you watched?

11. How helpful were these videos in helping you be prepared for upcoming classes?

12. How well did the videos cover the material for upcoming classes?

13. How often did you complete the quizzes?

14. In general, how seriously did you attend to the quizzes you completed?

15. How helpful were these quizzes in helping you be prepared for upcoming classes?

16. How well did the quizzes cover the material for upcoming classes?

17. Please write any additional comments you have regarding questions 1 through 20.

18. I enjoyed completing the tutorials that were associated with this class.

19. I thought the tutorials were too simple, or over simplified.

20. I thought the tutorials reached an appropriate level of depth for the material.

21. The interactive quizzes were easy to understand and follow.

22. I think the tutorials have helped my performance in this class.

23. I think the tutorials should introduce more theory.

24. I think the tutorials should provide more explanation as to the "why" of the content being covered.

25. I think the tutorials should work through some problems step by step.

26. The videos were easy to understand.

27. The videos were too long in length to hold my attention.

28. The tutorials took too long to complete.

29. The quizzes were a fair test of the knowledge introduced during the tutorial.

30. The quiz point allocation (50/1000) was an appropriate relevant percent.

31. The quiz point allocation (50/1000) should have been higher.

32. I would have completed these tutorials even if they did not count towards my final grade.

33. Please write any additional comments you have regarding questions 22 through 36.

34. When did you take the course Physics 1?

35. Rank the various components of the course: Tutorials, Quizzes and Videos.

36. Rate the Interactive Questions in the tutorials of the web-site.

37. Rate the Video clips in the web-site.

38. Rate the quizzes in the web-site.