



Table of contents

  • Summary
  • Introduction
  • The Setting
  • Learning Problems and Goals
  • Creating the Learning Environment
  • Outcomes
  • Implementation
  • Conclusion

Additional materials

  • Reader's Guide
  • Discussions
  • Resources
  • Glossary
  • References
  • Endnotes

IMPULSE: The Integrated Math, Physics, Undergraduate Laboratory Science, and Engineering Program

Picture of Nick Pendergrass
Professor Nick Pendergrass


In September 1998, we began an integrated, first-year engineering curriculum. Assessment data showed a major impact on the attrition rate, on the proportion of students passing two semesters of physics and calculus on schedule, and on student performance on common final exams in calculus.


"Integration--the fact that students are learning in a cross-disciplinary way--is a key goal. So when they're doing math, they're also doing physics. They realize that these subjects somehow connect to each other...Technology is a part of it; the teaming and the active learning are very important."

Picture of John Dowd
Professor John Dowd


Introduction
    Here we describe the IMPULSE program, what it is doing with technology and why. We briefly present information about student learning outcomes, and explain why it's not just the technology that has made this a successful program.

The Setting

    In this section, we introduce you to the IMPULSE faculty and provide information necessary to understand the context within which they strive to achieve their goals for student learning.

Learning Problems and Goals

    Here we examine, first, the learning problems that the IMPULSE faculty faced, problems that ultimately motivated them to change their curriculum; then, we take a look at the goals they have set for student learning.

Creating the Learning Environment

    In this section, we look closely at how the IMPULSE faculty created their new learning environments--the tools they use, the activities they assign, and how they assess their success. This section is deeply informative and includes links to discussions of learning activities, as well as information on specific assessment tools and activities.

Summative Outcomes Data

    How well is it working? Find out in this section, which tracks the achievement of IMPULSE students.

Implementation

    Wondering about the logistics? The IMPULSE faculty share how they did it: from acquiring the necessary resources (time, space, money, etc.), to networking, to overcoming their own ingrained ideas about teaching and learning.

Conclusion

    You have to generate in students a physical intuition...


Introduction

The context and goals of IMPULSE
Since the 1980s, colleges of engineering have been responding to forces for change in engineering education. In particular, the National Science Foundation and the Accreditation Board for Engineering and Technology (ABET) have articulated the need for, and provided resources to support, changes that enable students to learn concepts that they can apply in novel settings, and to problem-solve effectively in multidisciplinary teams. In 1998, the College of Engineering at the University of Massachusetts Dartmouth (UMD) successfully acted in response to these forces. Led by Nick Pendergrass of the Electrical Engineering Department, and with funding from the Davis Educational Foundation and subsequently as a member of the NSF-sponsored Foundation Coalition, UMD launched the Integrated Math, Physics, Undergraduate Laboratory Science, and Engineering (IMPULSE) program.

The goals of the IMPULSE program are to improve retention rates in engineering while also improving students' learning of engineering fundamentals, teamwork, and communication skills. This case study recounts how, in a short time, this program led to significant documented improvement in retention rates of engineering students.


What is the IMPULSE program?
The Integrated Math, Physics, Undergraduate Laboratory Science, and Engineering (IMPULSE) program is "a learning community that combines an integrated curriculum with active collaborative learning, teamwork, and the latest technology" (http://www.umassd.edu/specialprograms/impulse/).

It offers a two-semester sequence (11 units per semester) for first-year engineering and physics majors that integrates mathematics, physics and introduction to engineering. The sequence includes the following courses: Physics for Scientists and Engineers I & II, Introduction to Applied Science and Engineering I & II, and Calculus for Applied Science & Engineering I & II. Each course is taught by a faculty member, assisted by an undergraduate teaching assistant (a junior or senior hired by the professor) and technical support staff. Students progress through the sequence in cohorts. Each professor assigns the grades for his or her own course; when the faculty give assignments that span more than one course, each professor incorporates outcomes from those projects into the grade for his or her individual course.

At the time of our visit[a] to UMD (Spring 2000), 87 students who started IMPULSE in the fall were continuing with the second semester, and another 41 (who had to take precalculus in the fall) had just begun the IMPULSE program. (Since then, the program has grown by about 10 percent.) UMD physics majors and approximately 80 percent (all but Civil Engineering) of the engineering majors participate in the program. If IMPULSE students with undecided majors choose to major in Civil Engineering, the IMPULSE credits are accepted.


What problems does IMPULSE address?
Like many other universities nationwide, UMD had a poor retention rate in its engineering programs. Forty percent of its first-time, full-time freshmen dropped out between enrollment and the sophomore year. In addition, students did not see the need for the lower division science courses, showed weak interest in engineering content, and failed to see connections between introductory courses and engineering majors and careers.


What goes on in the IMPULSE program?
The IMPULSE program combines:

  • a curriculum that is informed by research about how students learn physics and mathematics, and that integrates calculus, physics and engineering;

  • a student team-based approach to learning;

  • a "studio" classroom for 48 students, equipped with networked engineering workstations set up with up-to-date math, physics, and engineering software, and reserved specifically for this program.

    Research-based curriculum.
    The IMPULSE faculty designed a curriculum in which students take calculus and calculus-based physics simultaneously so that the faculty can use physics to motivate and enhance students' intuition for calculus. Using the research-based Real Time Physics methods developed by David Sokoloff, Ronald Thornton, and Priscilla Laws, and the Harvard Calculus approach, the faculty make ample use of engineering applications to teach calculus. Likewise, they include an engineering course each semester to motivate students to learn science and math fundamentals and also to provide engineering foundations. Just as they use physics to motivate the learning of calculus, the IMPULSE faculty use projects in the engineering course to help the students make sense-through application-of what they are learning in calculus and physics. The IMPULSE faculty also use assessment tools (like the Force Concept Inventory test) that have been shown to accurately measure conceptual learning.


    Teaming.
    Teaming, making use of collaborative learning methods, is another essential part of the IMPULSE program. These faculty found that in the past, when working alone, students often could not complete assigned projects. In teams, however, they complement each other's skills and as a result are more likely to be successful in completing projects. Designing their classes so that teams (not individuals) are the units that perform experiments, ask and answer questions, solve problems, debate and brainstorm, the faculty devote a substantial proportion of class time to student teamwork. They also structure their course assessment practices to foster a "team ethic." For example, teams are "responsible for ensuring that each member contributes and learns," and while many assignments are graded individually, some are graded on a team basis. The instructors assign students to teams (taking care that teams that include women have at least two women) and change team composition at their discretion in order to give students an opportunity to meet and work with new people.
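
    The pairing rule is simple to state but takes a little bookkeeping to honor when rosters are drawn up. The sketch below is purely our illustration (a hypothetical Python helper, not the instructors' actual procedure): it seeds teams with pairs of women so that no team ends up with a lone woman, then fills the remaining seats.

        # Hypothetical sketch of the team-assignment rule described above
        # (not the IMPULSE instructors' actual procedure).
        def make_teams(women, men, team_size=4):
            teams = []
            # Seed teams with pairs of women so that no woman is placed alone.
            for i in range(len(women) // 2):
                teams.append(list(women[2 * i:2 * i + 2]))
            if len(women) % 2 == 1:
                if not teams:                  # only one woman in the class: the rule cannot apply
                    teams.append([])
                teams[0].append(women[-1])     # otherwise the extra woman joins an existing pair
            # Fill the remaining seats, then form the rest of the teams.
            pool = list(men)
            for team in teams:
                while len(team) < team_size and pool:
                    team.append(pool.pop())
            while pool:
                teams.append([pool.pop() for _ in range(min(team_size, len(pool)))])
            return teams

        print(make_teams(["W1", "W2", "W3"], ["M1", "M2", "M3", "M4", "M5"]))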

    The IMPULSE professors, their undergraduate teaching assistants and technical support people work as members of a team, interacting frequently about the curriculum and the progress of their students. The faculty find that it's easy for their undergraduate assistants (who are paid at student hourly rates in the neighborhood of eight dollars per hour) to relate to the freshmen. They rely on their undergrad assistants to provide role models and to keep communication channels open. The faculty also have noticed that, as a result of their teaching experience in IMPULSE, the undergraduate assistants learn a great deal while performing their role, and some begin to consider a future as a faculty member.


    Studio classroom.
    UMD designed and built studio classrooms much like those developed at Rensselaer Polytechnic Institute[1]. In these classrooms, the IMPULSE instructors use computer-based technology to get students involved in key scientific processes: modeling, real-world and real-time acquisition of data, locating information, and communicating. Sometimes the faculty perform experiments and simulations for the students, and often the students work in their teams to carry out experiments and perform data analysis. As John Dowd (physics) explained, "In the physics section, we adopted lock, stock and barrel the curriculum developed by Priscilla Laws at Dickinson College. Students do experiments. The experiments are videotaped. The images are digitized and the data is extracted. They plot the data and perform several calculations on the raw data in real time."

A picture of the old lab.  It shows old computers and various electronic equipment strewn around and set up for individual work stations rather than group work stations.

A picture of the IMPULSE lab.  It shows new computers, very little electronic equipment, and designed for group work rather than individual work.


Why is learning technology important in IMPULSE?
The IMPULSE faculty believe that the speed and other capacities of computer-based technology are critical to the success of this program. In particular, the technology enables instructors to perform laboratory demonstrations sufficiently rapidly that there is time for discussion of concepts and interpretation of results. The technology also enables students to carry out experiments and perform data analysis within a reasonable amount of class time. For example, as Nick Pendergrass (electrical and computer engineering) explained,

    A student goes in front of the class and drops a ball. A camera records the path of the ball; a frame grabber records the activity onto a computer; the data is transferred into an Excel worksheet. Students plot position versus time, calculate quantities like velocity, acceleration, and so forth. In one class period, they can do this five times if they want. That's what technology does in the classroom - it allows you to accelerate the learning process.
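
Pendergrass's ball-drop example boils down to numerical differentiation of a position-versus-time table. The sketch below is our illustration only (Python, with an assumed frame rate and idealized data); the IMPULSE classes did the same arithmetic in an Excel worksheet.

    # Minimal sketch: estimate velocity and acceleration from frame-by-frame
    # position data, as a spreadsheet would with difference columns.
    import numpy as np

    fps = 30.0                             # assumed camera frame rate
    t = np.arange(0, 0.5, 1.0 / fps)       # time stamp of each captured frame (s)
    y = 0.5 * 9.81 * t**2                  # idealized fall distance per frame (m)

    v = np.gradient(y, t)                  # finite-difference velocity
    a = np.gradient(v, t)                  # finite-difference acceleration

    print(f"estimated g ~ {a.mean():.2f} m/s^2")   # should come out close to 9.81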


But does it work?
There is hard evidence to support the success of this program. Outcomes on common exams indicate higher levels of conceptual understanding and performance in physics and calculus for IMPULSE students. In addition, the freshman to sophomore year attrition rate for IMPULSE students is 17 percent, a dramatic improvement from the 40 percent recorded before the program began. IMPULSE students acknowledge that, while the program is demanding, it goes a long way toward making them much more competitive as future interns and permanent employees of industry. Moreover, the UMD Admissions Office is very happy with the IMPULSE program, which has become an important recruitment factor for UMD.


The Setting

Note: For useful tips and information on how this case study is organized, please see the Reader's Guide. To read a brief overview of the activities of the IMPULSE program, see the Introduction.

The IMPULSE program operates within the University of Massachusetts Dartmouth (UMD) College of Engineering, which includes the Physics Department. (For more on UMD, see Resource A. Institutional Context.) UMD's IMPULSE program initially was funded in part by the Davis Foundation, which provides grants to institutions of higher education in the New England area in support of educational improvement efforts, and by UMD. Additional funding to launch the program was provided by a second grant from the Davis Foundation and by the NSF-funded Foundation Coalition, one of several NSF-funded multi-campus coalitions designed to improve student retention and learning in engineering. The Foundation Coalition, like the other coalitions in this NSF program, takes as one of its key principles the use of collaborative learning[2]. For a very useful history of the IMPULSE Program, see Discussion 1. History of the IMPULSE Program.


Dramatis Personae
The IMPULSE faculty work as a team to provide an integrated learning experience for their students. To this end, they maintain a high level of interaction with each other, meeting weekly to plan curricular modifications and to discuss student progress, and occasionally stopping in on each other's IMPULSE courses.

In Spring 2000, when we studied their efforts, these faculty included Nick Pendergrass (electrical and computer engineering), John Dowd (physics), Renate Crawford (physics), Robert Kowalczyk (mathematics), and Raymond Laoulache (mechanical engineering).

Dr. N. (Nick) A. Pendergrass is presently Department Chair and Professor in the Electrical and Computer Engineering Department at the University of Massachusetts Dartmouth. A leader in the development and implementation of innovative educational programs and courses, he has been instrumental in getting the external and internal funding needed to effect significant change. While at the University of Massachusetts Dartmouth, Dr. Pendergrass has been the Associate Dean for Undergraduate Programs, the Assistant Dean for Industrial Relations and the Chair of the University Graduate Council. His current research interests include adaptive equalization with application to underwater acoustic telemetry, as well as digital signal processing including applications to adaptive noise reduction. Dr. Pendergrass received a B.S. in Electrical Engineering from the University of Missouri at Rolla in 1967, an M.S. in Electrical Engineering from Purdue University in 1969, and a Ph.D. in Electrical Engineering from the University of California, Berkeley, in 1975.


Dr. John P. Dowd is a Chancellor Professor (Emeritus) at the University of Massachusetts Dartmouth. He received his B.S. and Ph.D. from the Massachusetts Institute of Technology. His research interest is in experimental particle physics. He was the principal investigator for the UMD particle physics group in their experimental program at Brookhaven National Laboratory and was part of the physics team that discovered the first credible evidence for the existence of exotic mesons. He has had a long-standing interest in pedagogical issues associated with introductory physics, particularly the laboratories, and introduced UMD's first computer-based lab for introductory physics in 1983.


Renate J. Crawford received her Ph.D. in Physics from Kent State University in Ohio in 1993, working at the Liquid Crystal Institute. She started at the University of Massachusetts Dartmouth in 1996 as a lecturer within the Physics Department and became an Assistant Professor in the department in 1998. She is actively involved in teaching innovations and outreach in addition to her research in soft condensed matter. She has published 16 refereed research publications, plus two invited and seven refereed education and outreach oriented publications. She has also edited educational outreach newsletters and conducted countless workshops in science education outreach to K-12 students and teachers. She has been actively involved in teaching and innovation in IMPULSE since its first semester.


Dr. Robert E. Kowalczyk is a Professor of Mathematics at the University of Massachusetts Dartmouth. He received his B.A. in Mathematics from the University of Massachusetts Dartmouth and his Ph.D. in Applied Mathematics from Brown University. Dr. Kowalczyk's research interests include the development of educational software and the integration of technology into the mathematics classroom. He is the co-author of the mathematics software package TEMATH. His teaching philosophy stresses the use of active learning strategies in a cooperative learning environment.


Dr. Raymond N. Laoulache is an Associate Professor of Mechanical Engineering at the University of Massachusetts Dartmouth. He received his Ph.D. from Brown University and his M.S. and B.S. from Northeastern University. Dr. Laoulache's research interests are in the area of multiphase fluid flow. He developed the engineering component of the integrated curriculum in the IMPULSE program, and served as the program director.


Learning Problems and Goals

A. Problems

The IMPULSE instructors pointed to three concerns that motivated them to consider making improvements in their engineering program:

  1. high attrition rate,

  2. weak student engagement, and

  3. poor student performance.

1. High Attrition Rate
In spite of university services like tutoring, withdrawal and failure rates were fairly high for students seeking to major in engineering at UMD. As in many other engineering programs, students were required to take several semesters of calculus, physics and chemistry as pre-requisites to the first engineering courses. In calculus courses at UMD, attrition rates were in the 40 percent range, while in chemistry courses they ranged up to 52 percent. High rates of attrition in these courses meant that many students left before being exposed to a single engineering course (see Figure 1)[b]. Their short experience with the pre-engineering curriculum left them bitter and discouraged about science and engineering[c].

A graphic of retention rates as entering classes progress through their academic careers. The graph covers 1991 to 1997 and shows a precipitous drop between the 1st and 2nd years. The drop in retention continues until the 4th year, when it is between 35 and 50%.
Figure 1

Of note, many of the students who left the engineering program had entered with high SAT scores--which are viewed as predictors of success in engineering programs (Figure 2). Moreover, these students did not switch to science majors[d]. According to Nick Pendergrass (electrical engineering), the faculty who eventually developed IMPULSE were distressed that many faculty in the UMD College of Engineering--like those in many other engineering programs--viewed these high attrition rates as a sign of their rigor. The IMPULSE instructors, by contrast, did not believe that high attrition rates were simply a sign of the program's success and rigor. They felt compelled to investigate the reasons why students were dropping out[e].

A graphic of SAT math scores versus Quality Points earned in the 1st year. There doesn't seem to be a correlation between high SAT math scores and whether students earned a high number of quality points in their 1st year. Students who dropped out had SAT scores comparable to those staying in the program.
Figure 2

2. Weak Student Engagement
While retention was the primary challenge that motivated faculty to implement IMPULSE at UMD, they also identified student engagement as an impediment to an effective learning environment[f]. Many students' only motivation for taking the basic foundation courses was to fulfill requirements--a motivation that often led to disastrous performance on tests, even for the best-prepared students[g]. This lack of motivation occurred even when caring instructors tried using grades as an incentive to encourage participation and engagement[h].

Nick Pendergrass described the problem in terms of students' complaints about the curriculum and, in particular, about the ways the pre-requisite, or foundation, science and math courses were taught[i]. Students, he explained, felt that these courses offered only a complex display of mathematics and science concepts and did not offer any linkages to applications. They saw no connection between the content of these courses and real life problems faced by engineers in the workplace[j]. Even an attempt by faculty in the Electrical Engineering Department to create a hands-on technology course was unsuccessful in addressing the problem.

In the past, science and engineering instructors taught basic concepts using simple canonical models, promising students that they would later recognize and experience their applicability and usefulness in the real world, if not during their university studies. For two reasons, this does not work well for most of today's students, according to Renate Crawford (physics). First, today's students firmly believe that the world in which they live is totally different from that described in physics and math courses. Second, they are much less inclined to take their instructors' word for it; they require tangible evidence that the concepts they are asked to learn--particularly in their prerequisite courses in physics and math--will be useful to them as engineers and also will relate meaningfully to their own experience[k].


3. Student Performance

Weak student performance was another concern at UMD, particularly in physics and math courses where engineering students' motivation and interest were noticeably low, but also in the engineering courses. The poor performance of the engineering students was also evident on the physics Force Concept Inventory (FCI) test[3]. The FCI is a multiple-choice test designed to assess the understanding of basic concepts in Newtonian physics; recently, it has become one of the more popular methods for assessing the effectiveness of physics instruction. UMD students' average gain on the FCI was about 20 percent, comparable to typical traditional courses but significantly lower than the gains in classrooms that had implemented active learning. Renate Crawford, a physics instructor, explained:

    This is why we are interested in active learning (pointing to the pink data points on Figure 3). We're right there...where you'd expect for the lecture courses-- the lecture courses have very little gain, about twenty percent. So, while we weren't any worse than anybody else, we were pretty pitiful. The gains achieved by students in these active learning courses are more significant.

Renate also related students' performance to an apparent lack of engagement (see "problem 2" above) that, in turn, she linked to the traditional lecture-based classroom. She remembered that, as a student herself,

    it can be incredibly boring, no matter how interesting the subject matter may be, to just sit there for 50 minutes or 90 minutes and just listen. And a lot of times it would happen that while you [as a student] are writing something down, the professor is talking about the next subject. So you never actively participated.

She recalled this when, as a professor herself, she began using lectures and observed that although her students wrote down her words, the meaning "didn't go through their brain."

Bob Kowalczyk (mathematics) saw that this kind of faculty dissatisfaction with student performance was a good reason to start considering other approaches: "Engineering had a pretty high dropout rate, and the students who made it through were doing okay, but not as well as we would expect. So this gave us a chance to look for other pedagogical techniques that could help us retain the students as well as do a better job teaching them."


B. Goals

The UMD instructors explained that, to address their concerns about student learning[l], they settled on a set of key strategies that are supported by evidence from around the country and that they believed would enable them to:

  • understand the reasons for poor retention, so that they could change the critical problematic factors,

  • help students understand why the foundation courses are important to success in engineering majors and future engineering careers,

  • help students relate the foundation courses to their own current experiences, and

  • provide learning experiences that are engaging and active and that result in greater gains in student understanding.

Their key strategies were to design a learning environment that incorporates the following elements:

  • integration of math, physics and engineering,

  • technology that supports conceptual and skill-based learning,

  • application of principles to novel situations, and

  • collaborative learning

See Discussion 2, "Making the connections: A discussion of the key principles used in the IMPULSE program," for the instructors' reasons for choosing these strategies to achieve their goals for improving student learning.


Creating the Learning Environment

Instructors at UMD and many other institutions of higher education across the country are now designing and implementing their courses as learning environments[m]. The Institute for Learning Technology researchers consider these instructors to be bricoleurs[n]. As bricoleurs, they use what we view as a three-step process:
  1. They address specific problems (e.g., high attrition rate; weak student engagement; and poor student performance).

  2. They articulate an overall set of goals (see Goals), and settle on a set of elements for designing an effective learning environment (e.g., integration of math, physics and engineering; technology that supports conceptual and skill-based learning; application of principles to novel situations; and collaborative learning).

  3. They evaluate, adapt and integrate a variety of teaching and learning activities (computer-enhanced and otherwise).

As with all the case studies appearing in the LT2 site, the IMPULSE activities are organized into three categories:

  1. Computer-dependent activities that faculty believe simply would not be possible, or at least not feasible, without computers.

  2. Computer-improved activities that faculty believe work incrementally better with technology but can still be implemented without it. (The UMD faculty did not give us examples of this type of activity.)

  3. Computer-independent activities that can be done without technology.

In this section, we devote relatively little space to "computer-independent" teaching strategies. This is because the key computer-independent teaching strategies used in IMPULSE--the use of student teams, the very limited use of lecturing--are difficult to present independent of the computer-dependent methods. Student teams and the "instructor as guide on the side" approach to teaching are critical to the effective implementation of the computer-intensive methods that characterize a studio classroom-based program like IMPULSE.

The IMPULSE instructors (faculty, undergraduate student assistants and technical staff) believed that it was critical for them to work together as a cohesive team in order to sustain the success of their program. They focused on a common goal, and continuously refined the strategies they used to achieve these goals. Their weekly meetings provided opportunities to consider the different types of information each instructor had about the successes and problems the students were experiencing, and to make necessary adjustments. At these meetings they also planned how to continue to integrate topics across the three courses, and how to integrate their computer-based and other teaching methods to produce as effective a learning environment as possible.


A. Computer-dependent Activities

Computer-dependent learning activities are those that are significantly enhanced through the use of technology. In the IMPULSE program, these activities involve the following key scientific processes: real-world and real-time acquisition of data, modeling, locating information, and communicating. At UMD, the use of these activities is greatly facilitated by the fact that computers are readily accessible in the classroom. However, the critical factor is that these computers make it possible to undertake these activities rapidly, right in the classroom. It is the combination of the convenience and speed of the computer technology utilized in IMPULSE that allows students to engage in these important scientific activities.


1. From the Instructors' Point of View

The IMPULSE faculty observed that one reason why using computers to investigate scientific problems in class works is that it captures students' interest. As Renate Crawford put it, "Having them work in teams, using a lot of the new technology works because the students are impressed by it." As important as this factor may be, however, the IMPULSE faculty were more inclined to stress that their use of computers enables students to learn by using the following scientific processes.

Real-time acquisition and analysis of data
At the center of the IMPULSE reform at UMD is the opportunity for students to do science in the classroom. In the studio classroom, students are able to generate and capture data, and to analyze, calculate and explain the relevance of parameters like velocity and acceleration. John Dowd (physics) explained:

A picture of a monitor showing the experimental data the software captures.  Shown are data tables, graphing of these data, and a video tape of the experiment itself (a ball travelling through the air).


    We use software that Priscilla Laws developed. We throw a ball through the air, videotape it and then digitize the video. Laws has software that lets you put little cross hairs on each frame. Frame by frame you advance the video and put a little cross hair on the ball. You analyze the path of the ball and find the trajectory; you can find the acceleration, the velocity, the initial velocity; you can fit curves to the trajectory, do all kinds of things. The idea is to not do that with a canned presentation, but to have the students do it. One student throws a ball and another student drops a ball, or shoots a rocket up in the air. In fact, Ray [Laoulache] had the students in his engineering class design these little soda bottle rockets. Well, they videotaped their rockets. They figured out the height of the rocket and its trajectory, and curve fit parabolas across the trajectory.
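
What the students do with those digitized frames amounts to fitting a parabola to the extracted points. The sketch below is our illustration only (Python, with invented data points); in class this step was done with Priscilla Laws' video-analysis software and spreadsheets.

    # Minimal sketch: fit y(t) = c2*t^2 + c1*t + c0 to digitized trajectory points.
    import numpy as np

    t = np.array([0.0, 0.1, 0.2, 0.3, 0.4, 0.5])          # frame times (s), hypothetical
    y = np.array([1.00, 1.95, 2.80, 3.56, 4.21, 4.77])    # digitized heights (m), hypothetical

    c2, c1, c0 = np.polyfit(t, y, 2)
    print(f"acceleration ~ {2 * c2:.1f} m/s^2")    # 2*c2, roughly -9.8 for free flight
    print(f"initial velocity ~ {c1:.1f} m/s")      # c1 estimates the launch speed
    print(f"launch height ~ {c0:.2f} m")           # c0 estimates the starting height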

The faculty found the use of graphing to be particularly important. It helps students see the relevance of concepts that they so often think are purely abstract. Instead of just plugging numbers into formulas and obtaining values that are meaningless to them, students can provide scientifically sound explanations for these values. Renate Crawford (physics) commented on how using computers and real data enables this more meaningful kind of learning:

    They have to write reports. They are required to use Excel spreadsheets to analyze the data and then put those results into a Word document. We're doing a lot with graphing. Before I started teaching IMPULSE, I did very little graphics. It's become plainly evident to me, and you probably noticed it today too from their confusion, that they have absolutely no problem with word problems: "The initial velocity is this, the angle is that, where does the ball land?" They plug in all the right values, and get values that are just numbers. But having them analyze the data today makes them realize that they don't really understand what those values mean. Now that we're having them analyze that data, they realize that they are so confused, which is good because they work through their confusion. When everybody left today, I think 11 out of the 12 groups felt confident that they now understood the relationship between the three graphs.

Modeling
Computer technology use in IMPULSE is linked to the use of models to provide real-world relevance to content taught in physics, mathematics and engineering. An example of modeling that Bob Kowalczyk uses in his calculus class is taking data to produce the temperature profile of a cup of hot coffee: "I'll bring a cup of hot coffee to class, we'll put a temperature sensor in it, we'll watch the temperature data being plotted and then we'll try to model the data mathematically." (A sketch of this kind of fit appears after the quotation below.) In the calculus sections, "Maple" is the software of choice. However, as Bob Kowalczyk explained, because Maple is not user-friendly and has excessively complex syntax, it hinders students' easy visualization of concepts and interpretation of results. To overcome these problems, IMPULSE students frequently use their calculators in combination with Maple for graphing functions[o]. In engineering, modeling is used in the design, construction, testing and analysis of a makeshift water rocket. Raymond Laoulache (mechanical engineering) explained:

    Take the water rocket project: both the concept of momentum and numerical schemes were covered. The students have to design fins for a rocket for stability purposes. They were given access to NASA data on what would be a good length for the rocket, what would be the fin design. They used Mechanical Desktop for all those things. So what happened? To make it work, you have to teach the students numerical methods at the freshman level for a period of time.
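
Returning to the coffee-cup activity mentioned above, the modeling step amounts to fitting Newton's law of cooling to the sensor readings. The sketch below is our illustration only (Python with SciPy, invented readings); the IMPULSE sections did this kind of fit with Maple and graphing calculators.

    # Minimal sketch: fit T(t) = T_room + (T0 - T_room) * exp(-k*t) to cooling data.
    import numpy as np
    from scipy.optimize import curve_fit

    def cooling(t, T_room, T0, k):
        return T_room + (T0 - T_room) * np.exp(-k * t)

    t_min = np.array([0, 2, 4, 6, 8, 10])                     # minutes after the sensor goes in
    temp_C = np.array([90.0, 81.0, 73.5, 67.0, 61.5, 57.0])   # hypothetical sensor readings

    (T_room, T0, k), _ = curve_fit(cooling, t_min, temp_C, p0=(20.0, 90.0, 0.1))
    print(f"room temperature ~ {T_room:.1f} C, cooling constant k ~ {k:.3f} per minute")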

In physics, videos are used to provide models of two- and three-dimensional motion. In response to our questions about how she uses technology in IMPULSE, Renate Crawford (physics) said:

    I use technology in almost everything we do. Like today, students were modeling data. They were using Video Point, because two-dimensional motion is something very difficult to study if you use the regular traditional equipment, whether it's Airtrack or this other low-friction track. Two or three dimensions are almost impossible to analyze unless you videotape the experiment. To save time, we used a preexisting video of projectile motion and let them analyze it.

The IMPULSE faculty found that the combination of readily available computer resources and the team approach is effective. Bob Kowalczyk explained,

    I relied on technology a lot before IMPULSE to include using real data and modeling that data. So I sort of brought those ideas with me. Now in the IMPULSE environment, it's a lot easier to implement these activities because you have the students working in teams of four and you have the computers in the classroom. The necessary equipment is readily available.

Locating Information and Communicating
At UMD, the use of computers in the IMPULSE program provides instructors and students the means to share information conveniently, whether in the classroom or outside. Raymond Laoulache (mechanical engineering) explained:

    The computers are essential. Absolutely essential, because in the past if I wanted to provide the information to the students I would have to print on paper. Now it is on the Internet. If I'm working on a design problem in the classroom, I ask them to turn on their monitors and start doing the work. Today with technology we are able to do graphics and accomplish much more in a semester.

Students also pointed out that they commonly used web-based communication tools like ICQ and Netscape AOL messenger to communicate among themselves. Students also used email frequently to communicate with their instructors.

Summary
Overall, the IMPULSE computer-enhanced classrooms allow students to experience modeling, data collection, analysis and interpretation in real time. The experience gives them a first-hand appreciation of the importance of laboratory devices involved in the inquiry processes central to science, and helps them learn how to interact with and rely upon each other. Renate Crawford (physics) explained:

    The thing I like about the computer-interfaced labs (the students still don't realize this all the time) is that the data is showing up for them in real time. As they are taking the data, each point is appearing. So a lot of times I can tell them while they are taking data, "Hey, look what is happening to your graph." And they look at their cart, or their graph, and they realize there's a little discrepancy--"My motion sensor sees me but not the cart"--this kind of thing. If you don't have computers, how are you going to bring that across to them? Can you do this in a lecture type of mode? No. I've tried.

At the same time, the IMPULSE instructors are mindful of the risks of over-reliance on computers. They bear in mind, for example, that when they bypass steps so that students have time to focus on the more important parts of the experiment (analysis of data and interpretation of results), important details in the analysis may escape students who have only been exposed to computer-based methods. In this regard, Raymond Laoulache (mechanical engineering) commented:

    I could show you how we took the equation of the rocket, which turned out to be a non-linear equation. Unfortunately, to solve the problem you have to do a numerical analysis. So I showed the students how painstaking it is to do it by hand in a single trial. And they were given an assignment to do it by hand for the next time period, you know, when you step things in time. Now they solve the details of the process. Then they take those equations and put them in an excel spreadsheet and make the process faster. But I refuse to let them rely on technology 100%. That's my style.
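
The "stepping in time" that Laoulache describes is a simple forward (Euler) scheme: compute the acceleration from the current state, update the velocity and height, and repeat, exactly as a spreadsheet does row by row. The sketch below is our stand-in for that process (Python, with an invented drag model and parameters), not the actual rocket equations used in the course.

    # Minimal sketch: Euler time steps for a body with quadratic drag
    # (a non-linear stand-in for the water-rocket equation).
    g, c_over_m, dt = 9.81, 0.02, 0.1      # gravity, drag coefficient per unit mass, time step
    v, y = 30.0, 0.0                       # hypothetical launch speed (m/s) and height (m)

    for step in range(10):                 # each pass corresponds to one spreadsheet row
        a = -g - c_over_m * v * abs(v)     # non-linear: drag always opposes the motion
        v = v + a * dt                     # Euler update of velocity
        y = y + v * dt                     # Euler update of height
        print(f"t={(step + 1) * dt:.1f} s  v={v:6.2f} m/s  y={y:6.2f} m")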


2. From the Students' Point of View

From the students' point of view, the use of computers facilitates learning. It permits the virtual simulation of experiments that would be difficult to carry out within the constraints of classroom space and time. As one student told us, "A lot of stuff was perhaps easier to learn. Like they had a program, such as Chemlab or whatever it was. If the professor wasn't able to show something in person because it had, like, deadly gases or something, you could actually see the whole experiment simulated on the computer." Students also appreciated the ability to design engineering systems right there in class. And, students were aware that being in the IMPULSE program gave them training and skills that seem to appeal to industry:

    Student #1: Also things like AutoCAD are pretty huge. Learning all that right there and being able to design things right in front of you, that was neat. We did a project making bottle rockets, and we had to design the fins for it and everything, and we just used AutoCAD.
    Student #2: And everybody was responsible for that.
    Student #3: Ice scraper, assemble handles, screws - It was actually a good job-related skill. We used AutoCAD Version 14, which was the most powerful at the time.

The students we interviewed also made clear that computers add to the quality of their experience by providing convenience and speed, as well as helping them with conceptual understanding. A group of students reflected on the benefits of using computer technology in the classroom:

    Student #1: Yeah, if you're sitting there with little spring scales trying to figure out the force in Newtons, I mean, we tried that for a little while and we realized that everybody hated it and moved on. It took me so much paperwork trying to catch up. I know we learned a lot more in physics with the computers because it sped things up a thousand times.
    Student #2: All of the sudden everybody got the concepts like really quickly.
    Student #3: It's more like a convenience. It speeds up calculations, except for calculus where it slowed them down. Because we used Excel for a lot of physics stuff, it would just do everything quickly. It just gets it done instead of having to draw it out.


B. Computer-independent Activities

In this section, we consider teaching and learning strategies recognized nationally as catalysts of student learning that do not necessarily depend on computer use. According to the IMPULSE instructors, two strategies--using student teams, and shifting the faculty role from that of a lecturer to that of a coach--are critical to the success of the program. We present their views on "teaming" here. For a discussion of the shift to a faculty "coach" role, see Discussion 6, Changes in the instructor role. The use of teaming helped the program become a real learning community, according to Nick Pendergrass (electrical engineering):

    I believe what's going on is that the students are getting connected with each other. They develop relationships that they don't want to break and those relationships are built around performance in a classroom. They learn from each other. I believe that the single most important thing that happened is that a learning community occurred. We watched 48 students and five faculty members teaching those five courses weld together. They became responsible for each other. And once that got going, then that's what made it work.

Setting up and facilitating these teams takes effort and skill, however. Renate said:

    Change the teams, if they need to be changed, because some people just do not get along and maybe they'd do better a second time around. Keep an open dialogue with the students as much as possible, which is, you know, sometimes hard; you get a great connection with some students, and some don't want to participate.

In IMPULSE, individual student accountability does not disappear because of the teamwork approach. IMPULSE instructors work hard to keep all members of the group involved, maintaining the expectation that any student may be called on at any time to give a report. Nick explained:

    It's most important that you understand that there is individual accountability - that students must know it themselves. If somebody does the work and the rest of the group doesn't figure it out, they're going to lose. Individual accountability is enforced by not accepting reports from the whole group (so that the one student who always knows the answer is doing everything), but by selecting somebody in the group to report for the rest of them. The team has to make sure that everyone has the story right.

The IMPULSE instructors acknowledged that there is some distribution of tasks among students in their teams: students tend to perform tasks that they are best at and more comfortable doing. And while some instructors expressed concern about uneven individual contributions, it seems that performance of students at all levels has gone up[p]. The students with whom we talked also found teamwork valuable. But they also expressed more concern than the faculty about unequal contributions by group members--an understandable concern, given that rewards are allocated on an individual basis in academic culture.

Summary
According to the IMPULSE faculty, the speed and other capacities of computer-based technology are critical to their program. It allows students to carry out experiments and perform data analysis within a reasonable amount of class time, and enables instructors to perform demonstrations sufficiently rapidly that they have time to draw students into thoughtful discussion of concepts and interpretation of results. While the IMPULSE instructors were quick to emphasize the value of these computer-based features of their learning environment, they were equally emphatic that the technology must be combined with other key features in order to achieve the kinds of student learning outcomes they were observing. These other features are:

  • an integrated curriculum that is informed by research about how students learn physics and mathematics, and that integrates calculus, physics and engineering;

  • a student team-based approach to learning;

  • properly designed physical spaces in which a workable number of students can use the right kind of equipment in a collaborative way;

  • instructors trained to use a student team-based and largely "lectureless" approach to teaching.

In the next section (Outcomes), we consider in some detail what these UMD instructors learned about how well their IMPULSE learning environment achieved their goals for student learning. In the subsequent section, "Implementation," we present their accounts of activities and factors that were critical to their process of creating this apparently very successful learning environment. Their accounts include, among the many factors we present under "Implementation," information about how they learned to work across disciplines and use a new approach to teaching, and how their efforts were received by colleagues locally and nationally.


Outcomes

One of the major goals of this project is to identify the conditions that make it possible for computer-dependent activities used effectively in one institution to be adapted elsewhere. To that end, evidence that these activities are effective in enhancing student learning is important. At UMD, we obtained diverse initial assessment information from students and others about the learning resulting from the IMPULSE program.


1. Performance on Exams
One indication of the effectiveness of the IMPULSE program is student performance on the Force Concept Inventory (FCI) physics test. The "FCI gain," defined as (Posttest - Pretest)/(100 - Pretest), assesses students' understanding of basic concepts in mechanics. In Figure 4, the Gaussian fit to the histogram of the FCI gains is plotted. John Dowd (physics) commented on his students' performance and also provided interesting insights on implementation of active learning methods:

    At the beginning of the semester we gave the students in the three lecture mode classes and the IMPULSE students the same exam, the Force Concept Inventory. And we gave the same exam again as an exit exam. The IMPULSE students did considerably better than previous lecture classes. In the first semester of implementing active learning we ended up there [points to the vertical line closest to .3 in Figure 4]. Now, the interesting thing is that those two lines are two different courses. One of these is the IMPULSE course. The other vertical line is the regular physics lecture with the Real Time Physics labs. And actually, the Real Time Physics is a little bit higher, although there is no statistical difference. David Sokoloff claims--and our data so far certainly doesn't contradict--that you don't need a heck of a lot of interactive learning to make a big difference.

Performance on Physics Exams: Gaussian fit to histogram of FCI gains in traditional, tutorial and Workshop Physics classes at eight institutions.  The Workshop Physics shows the greatest gain, followed by tutorial and then traditional classes.
Figure 4
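
For readers who want to check the arithmetic, the normalized FCI gain defined above is straightforward to compute. The sketch below is our illustration (Python, with hypothetical pre- and post-test scores), not part of the IMPULSE assessment tools.

    # The normalized FCI gain defined above: (Posttest - Pretest) / (100 - Pretest).
    def fci_gain(pretest, posttest):
        return (posttest - pretest) / (100.0 - pretest)

    print(fci_gain(40.0, 58.0))   # 0.30 -- the class recovered 30% of what it was missing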


IMPULSE sections performed better than traditional sections, but not as well as would be expected using Priscilla Laws' Workshop Physics model. John Dowd (physics) attributed this to having reduced the program from six to four hours of class time, and to their lack of experience in applying these new methods[q].

As in physics, calculus students in IMPULSE performed better on exams than their counterparts enrolled in traditional sections. Bob Kowalczyk (mathematics) explained that in 1998 they selected a control group[r] and compared the results on a common final exam taken by the students in IMPULSE, the students in the control group, and all the other students in courses using traditional methods. The results appear in Figure 5. Referring to this graph, Kowalczyk emphasized that,

    Thirty-odd percent of students in the traditional classes never even took the final-- they'd been weeded out. Whereas in IMPULSE, 46 of the 48 students who enrolled took the final. And if you look at the mean scores, IMPULSE performed 14 points better than the control group. The IMPULSE students did almost 2 full grade points better than the control group and all the standard sections.

1st semester calculus performance: Histogram of exam scores for an IMPULSE group, a fall 1998 control group, and all students enrolled.  Data are for a common exam and the final.  The IMPULSE group had the highest score (95.8), then the all enrolled group (78), and the control group (72).
Figure 5

2nd semester calculus performance: Histogram of exam scores for an IMPULSE group, a fall 1998 control group, and all students enrolled.  Data are for a common exam and the final.  The IMPULSE group had the highest score (97.5), then the all enrolled group (77), and the control group (70.8).
Figure 6


He also pointed out that most students passed physics and calculus on schedule, as shown in Figure 8:

    And look, 81 percent passed physics on schedule. Now, this is one semester sooner than the traditional students would've taken physics because of the calculus sequence. Almost 70 percent of the calculus students had passed Calc 1 and 2 on schedule, as opposed to less than half of the other students.

Credits earned in the 1st semester: Histogram of credit earned for an IMPULSE group, a fall 1998 control group, and a fall 1997 control group.  The IMPULSE group had the highest credits (15.8), then the fall 1997 control group (12.5), and then the fall 1998 control group (10.6).
Figure 7

Students passing physics and calculus on schedule: percentages for the IMPULSE group compared with students in traditional sections.
Figure 8


We asked whether there were areas or questions within the common calculus final exam that were better suited to the IMPULSE method of learning than others. We were particularly interested in the dichotomy that some instructors have noted between learning skills and concepts. Bob Kowalczyk answered that the IMPULSE students performed comparatively less well on topics that traditional courses focus on, like limits, but much better on "application-type problems"[s].


2. Attendance
Recorded attendance is another indication of student engagement. Students not only attended class, but remained enrolled until the end of the semester. This kind of dramatic change in behavior was typical of the IMPULSE math courses. Bob Kowalczyk (mathematics) said:

    I think the biggest success of the IMPULSE program is that the good students are now coming to class because of the teaming and everything; they are involved and we're retaining these students. And of course I think this is helping to keep some of the weaker students in the program too. Last year the excitement for me was when I went to give the final exam and there were 47 students there. And I started with 48. That has never happened before in my 25 years of teaching. In a traditional calculus section, we would end up with anywhere from 30 to 50 percent fewer students.

    The other excitement is that the students are usually there all the time. Last year, there were five courses that were integrated, and we had them from 8 till 12 every day in that room. They felt almost like prisoners there. Every student was there every day unless they were sick; we had perfect attendance.


3. Observation
Instructors have traditionally gathered a great deal of assessment information about their students through observation and interaction. At UMD, instructor observation is made easy because the studio classrooms are designed with large window panels that allow anyone to observe classroom activities with minimal interruption. John Dowd (physics) explained how one can clearly observe active participation in IMPULSE classrooms:

    If you're a visitor and you look in through that fish tank window in the corridor wall, you see the kids talking to each other. You see them adjusting the apparatus and playing with the screen, and you know they're working. I don't know if they're working in a more intelligent way, but at least it's better to work than not to work. They're participating. It's called active learning, which means active participation by the people who should be learning. You can see it happening in the studio classroom.


4. Student responses
Another way to measure the effectiveness of IMPULSE is to listen to what students have to say about it. On the negative side, students complained that they should have a choice about participation in the program. They also identified the increased workload as a problem. On the positive side, they noted that their relationships with their instructors were better. When asked if they felt that they were better prepared than if they had been in the traditional sections, they unanimously said, "Yeah." (See Discussion 2, "Student views on the effectiveness of the IMPULSE program.") Students also noted that their confidence level had increased by going through the IMPULSE program.


Summary
The IMPULSE instructors were hesitant to claim success. For example, Raymond Laoulache (mechanical engineering) interpreted the early outcomes cautiously: "Regarding the success of the program, at least from a quantitative point of view, the most useful thing to tell you is that we have gotten students' attention." And Bob Kowalczyk (mathematics) suggested that because the program has been in place for only a limited time, they need to take a "wait and see" attitude when considering the performance improvement recorded by the students of IMPULSE[t].

On the other hand, it is quite evident that something significant is underway. For example, upon reviewing the favorable results shown here, we asked the faculty this important question: Is the success of IMPULSE caused by factors such as amount of funding and the attention that both instructors and the institution have invested in the program, or is it the methods? Bob answered:

    My guess is it's because of the program. I think the students working together, solving problems in class, talking to each other, make a difference. They are talking about math. I would like to believe that this interaction, the working together of students, is what was causing some of this performance to be better. Yesterday, you called ten of my students out of my class for an interview, and one of them said, "Why do I have to go? I'm gonna miss this stuff." If anything, I would think the attention of observers has detracted from their learning. I give all my students attention. Even in the standard class I work very hard; I'm there all the time for them. But I can't teach the same way there as I teach in this IMPULSE classroom, because the standard class is not suited for teaming; I don't have computers. Now we do have a new computer classroom, so we're starting to do more of this in the traditional class. I've always been excited about using technology in the classroom.

Of note, Bob's belief that it is the IMPULSE methods, not the special attention, that makes the difference in student learning is substantiated by his comment that he has begun using these methods in his non-IMPULSE courses.


Implementation

It is important to keep in mind that implementing the kinds of changes made in the IMPULSE program requires hard work, planning, and access to good resources. During our interviews with the IMPULSE instructors, we explicitly asked "how" questions, such as:
  • "What kinds of new resources did you need?"

  • "What were the nitty-gritty tasks and problems you faced when you were just getting started?"

  • "How did you deal with the stresses that come with change?"

We also asked them for advice they would give to others who are about to embark on this path--things they would have appreciated knowing before they got started. Drawing on their responses to these questions, we present IMPULSE faculty insights and advice on how to implement the kind of learning environment they have developed. We start with the personal resources and processes for learning from colleagues across the nation that the IMPULSE faculty believe are crucial for "getting going." We then consider technical resource issues, which can pose major problems for faculty at many institutions. We finish by presenting a set of issues pertaining to the cultural and institutional factors that shape faculty teaching and student learning practices.


A. Processes for Getting Going

1. Personal Resources

Leadership
So far, we have presented the key players in the IMPULSE program as if they performed similar functions. In fact, they had different roles. Nick Pendergrass played the key leadership role in conceptualizing, funding and administering IMPULSE. The other members of the group--John Dowd, Raymond Laoulache, Renate Crawford, and Robert Kowalczyk--took care of the classroom and curriculum issues.

As the leader of a program of this type, Nick had valuable personal resources. He drew on his two positions within the university--member of the Electrical and Computer Engineering faculty and Associate Dean for Undergraduate Programs in the College of Engineering--to promote this programu. We also learned that Nick has charisma and vision, and is an astute leader. For example, he orchestrated one of the key processes that enabled this change at UMD, the formation of a working group of faculty across many disciplines who already had demonstrated a strong interest in reformv. As one of these instructors, Bob Kowalczyk (mathematics), explained, "I think what brought me into it was that I'd been using technology and teaching mathematics for quite a while, and when Nick talked about this new kind of idea--using technology with engineers and even a computer classroom--that sounded very exciting to me." (For more on Nick's role as a leader, see Discussion 5. The role of organizational savvy. Additional insights into the role of leadership in launching this program can be found in Discussion 1. History of the IMPULSE Program.)

Reform-ready, technology-savvy frame of mind
Key personal resources that we noted among all the IMPULSE faculty are a remarkable willingness to cooperate with each other, and a strong desire to accommodate the engineering program.

Interest in integration across disciplines
Another important personal resource that characterizes the IMPULSE faculty is their interest in integrating topics across physics, math and engineering. For example, Bob Kowalczyk (mathematics) explained:

    Another thing I like about this program is the integration of topics between physics and engineering. In the traditional classroom, I'm over there in the other building and I teach the topics without regard for how and when students use them in other courses. But with IMPULSE, I get together with the physicists, and sort of know what they are teaching that needs calculus. So I can redesign the calculus curriculum and the syllabus in such a way that the topics I teach now are in step with what physics is doing. Engineering is also becoming more integrated with calculus. But it's not an easy task to integrate all these subjects.

These reformers' commitment to the integrated curriculum of IMPULSE is also evident in their wholesale reorganization of the content they teach. Bob Kowalczyk described how he adjusted topic order in his syllabus to meet the needs of engineering students taking introductory physics:

    In the third week of the semester, students are using vectors in physics. Now, in traditional calculus, we don't teach vectors until third semester calculus. So in the past engineering students wouldn't see vectors in calculus until their sophomore year. But of course they'd been using vectors in physics all along as freshmen. It never made sense, but we never did anything about it. So now, if you look at my syllabus, in the second or third week I'm skipping from chapter one to chapter ten, and doing vectors with the engineers: we're talking about dot products, cross products, components of vectors, and applications. The students are getting the content from a calculus point of view and go into physics and use it right away. So it's not like we are teaching them in the abstract.

    In the traditional calculus course we'd spend most of the first semester doing differentiation; we'd do all of the rules and applications. Well now, we do some of the differentiation rules and by week six, we're starting integration; the students need some integrals in physics. So in the new syllabus, we do a little bit of differentiation, then we introduce integrals and do a little bit of integration, so the physicists can use it. And then we return to differentiation again including the more advanced rules and applications.
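
To make concrete the kind of immediate payoff Bob describes when vectors move up to week three, consider two standard first-semester mechanics quantities (an illustration we add here, not an example drawn from the IMPULSE materials): the work done by a constant force is a dot product, and the torque about a point is a cross product,

    W = \vec{F} \cdot \vec{d} = |\vec{F}|\,|\vec{d}|\cos\theta
    \vec{\tau} = \vec{r} \times \vec{F}, \qquad |\vec{\tau}| = |\vec{r}|\,|\vec{F}|\sin\theta

A student who meets dot and cross products in calculus in week three can use them in physics that same week, rather than waiting until third-semester calculus as the old syllabus required.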

2. Learning from colleagues across the nation and from each other

Academic institutions are very traditional enterprises. Even in disciplines like engineering that are generally very responsive to technological changes, wholesale curriculum changes are difficult to achieve. Getting educational institutions to undertake change requires not only leadership and other personal characteristics (such as a reform-ready, technology-savvy frame of mind), but also the establishment of organizational processes that can get the project going and sustain it. Key among the organizational processes that the UMD faculty identified as critical for them is a period of preparation during which they learned from colleagues across the nation.

The IMPULSE faculty identified the following reasons why learning from colleagues across the nation is critical for launching a program such as theirs. External expertise:

  • often lessens the probability of personal/departmental conflicts and jealousy, and thus increases the chance that new reform ideas will be adopted;

  • validates new methods that are being promoted;

  • shortens the trial and error period for designing a successful curriculum to act as the backbone of the program;

  • enables faster adjustment to the challenges of implementing a truly integrated curriculum and team-based pedagogy; and

  • enables adapters to make more informed decisions when designing expensive new technology-based classrooms.

For more on this topic, see Discussion 4. How learning from external experts was critical when getting started. See also Discussion 1. History of the IMPULSE Program.


B. Material Resources

1. Special project funding

See Discussion 1. History of the IMPULSE Program, for information about the external funding, and special internal support that the IMPULSE faculty obtained in order to launch this program.

2. Technical Resources

Implementing a program like IMPULSE at UMD, where students perform experiments, and then capture, analyze and graphically represent their data, requires significant technical resources. We consider the (1) hardware, (2) software and (3) technical support resources that these faculty needed to achieve their pedagogical objectives. We address these three topics separately, although in many cases, they are connected.

Hardware
Nick explained that the move to a studio classroom environment entailed a heavy initial hardware investment but also enabled a fundamental change in how basic physics, math and engineering material is learned in IMPULSE, a change that showed promising outcomes in student learning (see the section on Outcomes), and positive changes in classroom management activity for the instructors. He made this point as follows:

    It is a fundamental philosophical change to move from laboratories that had 12 students to ones that have 48. And it's remarkable how this works. Physics education researchers have discovered that students learn better when you tell them to work with each other, to question each other, and to ask a question only when the team cannot solve its own problem. And with teams, you have 12 pretty well organized problems coming at you, rather than 48. So it's not the same classroom management problem. What's remarkable is the students are getting a very personal experience. They believe that they're better connected with the instructor. We've picked this up in some of our surveys.

    You have to get the resources for the equipment for 48, but it's worth it--I went through the numbers: You abandon the lecture-recitation-lab design, and you lose all the scheduling nightmares that go with it--80 to 150 students in the lecture, 24 in each recitation, 12 in each lab. In its place you put 48 students in each studio class. We will merge the class and lab and meet four hours per week like the RPI model. It will be team-based, and we will put the instructor in there with the undergraduate TA and they'll go around and help. The philosophical change meant that we had to equip 12 tables serving 48 students as opposed to four stations in a 12-student lab. That involved a bigger initial investment.

In the early stages, there were technical challenges for faculty who were expected to be "Jacks and Jills of all trades" and have answers for any problems, technical and otherwise. They had learned by observing at RPI that having a faculty member and an undergraduate teaching assistant in the room at all times was critical.

The number of technical problems declines over time, but it is still necessary to have a strategy for coping with equipment failure in the steady state. The IMPULSE instructors keep spare equipment on hand so that the classroom stays running without waiting on the notoriously slow university repair process. Nick explained that when every seat in the class is occupied, every piece of equipment must work, so they arranged for a spare computer in each of the classrooms that can immediately replace a defective machine. "It takes five minutes to pull out the tables, plug it in and put it back together. The hard part is to keep the spare from becoming somebody's machine." While this approach works fairly well, particularly if the classes are not enrolled to capacity (48), it is still necessary to anticipate irritating and expensive problems with the hardwarew.

Another central hardware-related issue is the need to keep the equipment current. In order to minimize loss and upgrade their systems at modest cost, the IMPULSE faculty chose to use desktop, rather than laptop, modelsx.

Software
The studio classroom computers are loaded with the mathematical and engineering software packages (e.g., AutoCAD, Maple and MATLAB) that are used in the IMPULSE program. The effective functioning of IMPULSE depends on effective management and upgrading of the software on all these computers. This management work, including clearing hard drives and loading software onto individual machines (which are connected to an NT server), is done using Ghost, a program that permits the creation of an image from a standard computer configuration, including the operating system and applications. With Ghost one can easily copy this image to one or more machines. At UMD, Ghost allows technicians to, as one interviewee put it, "take an entire room completely down and back up with everything on the hard drive replaced in seven minutes. So now we can do it every two or three days."

While managing the application and operating system installation is cost efficient, keeping the application software current is not, and it is not easy to find the resources to purchase upgrades. As Nick explained, upgrading AutoCAD alone requires a substantial financial commitment--"$4,000 or $5,000 a year, or every time they do a major upgrade." Nick was worried about how to cover these costs, especially when the concept of the program itself was one he was still selling to the administration. The technicians working in IMPULSE also were concerned: they knew of no plans to cover future costs for updating the hardware and software, or for providing sufficient staff support. Given this program's heavy reliance on experimental and computer-based equipment, this could jeopardize long-term sustainabilityy.

Technical Support
At UMD, the shift from a lecture-recitation-lab approach to a studio-type classroom was not accompanied by an increase in technical support. Over the last few years, as IMPULSE grew from one studio classroom to four, the technical support remained the same. Raymond Laoulache (mechanical engineering) expressed his frustration about this, stating that with program growth, they need a technician dedicated full time to the program--someone to "troubleshoot the computers immediately, and load software immediately." Other IMPULSE instructors felt that the equipment so far had been pretty reliable, although they knew that this might be due to the relative newness of the studios. Overall, the growth of the program was stretching the capacity of the technical support staffz. This situation was made worse by the fact that technicians often have only the 10- to 15-minute period between classes to act on an equipment failure.


C. Organizational & Cultural Issues

The implementation of IMPULSE at UMD entailed not only a fairly drastic shift in instructional processes, but also integration across different disciplines. Such an effort carries many challenges at organizational and cultural levels. Here we consider "culture change" challenges and opportunities that the IMPULSE faculty encountered, and how they responded to them. In particular, we present the challenges/opportunities associated with (1) territoriality and resistance to change, (2) rewards, and (3) adapting to team- and assessment-based teaching. For a more in-depth understanding of these issues, see Discussion 6. Changes in the instructor role.

1. Resistance

The IMPULSE program at UMD has the support of the majority of the College of Engineering and Mathematics Department faculty and relevant members of the administration. As expected, however, the attention and publicity generated by the program brought with them resistance and resentment on the part of some faculty. Political situations of this type frequently are encountered when implementing significant changes, and often pose major obstacles. As one IMPULSE faculty member explained:

    I think that among many faculty members, there's gratitude because this program is getting so much attention in the high schools; it's incredible. I get a call from guidance counselors periodically, different ones. "What's IMPULSE? I've got a senior all excited and talking about it because they heard about it from one of your students." You can't pay for that kind of advertisement. But at the same time there are some people who are very threatened that this should be working and would love to see this fail.

Indeed, while the program has achieved an impressively close collaboration among some engineering, physics and mathematics faculty, adoption of IMPULSE reform methods is not unanimous. For example, within the College of Engineering, the Department of Civil Engineering has elected not to participate in IMPULSE. We did not have an opportunity to talk with a civil engineering faculty representative, and the faculty and administrators with whom we talked chose their words carefully when asked about the civil engineering position on IMPULSE. For example, one faculty member said: "Civil has its own intro course. They like their course and see no reasons to put their majors in IMPULSE." When asked if he would consider the civil engineering faculty "passive participants" in IMPULSE, this faculty member answered in a manner that suggested emotional tension was involved in adopting the program. He said, "Yes. And that gets into a whole set of the problems of trying to get one of these things going. That part was truly amazing; I still have a few emotional scars from that experience."

A potential adapter of IMPULSE methods may ask how to proceed in a way that minimizes conflicts among colleagues. While fundamental reforms like IMPULSE will likely generate friction stemming from a departure from the institution's established processes, Nick Pendergrass identified the ability to quickly move IMPULSE out of the experiment stage into a more permanent format as the one crucial factor leading to its success. Based on his observations at other institutions, he stated that a prolonged stay in the pilot stage is likely to lead to polarization of the facultyaa.

2. Tenure and other Rewards

The workload involved in teaching in the program presents an on-going challenge. The participating faculty with whom we spoke all reported a significant increase in workload. Bob Kowalczyk said:

    There's a lot more work in this kind of teaching and departments have to be made aware of this. If they can't come up with some kinds of incentives, as the program expands and more people get involved, some faculty may be reluctant to participate, to do this. But I find the extra work is worth it when I see the rewards and students doing better. I mean that's what we're here for, right? So I think I get my rewards from the improvement in learning.

At UMD, the effects on tenure and promotion associated with participation in a reform program like IMPULSE depend on the department. In the math department, participating in the program is considered a sign of creative pedagogy and is generally supportedbb. In engineering, the ability to become heavily involved in teaching-related activities seems dependent upon first achieving tenurecc. Renate Crawford, an untenured assistant professor in physics at the time of our interview, was unsure of how her department would view her participation in IMPULSE when reviewing her for tenure.

Another type of reward is release time to develop or improve courses. Renate Crawford's department gave her small amounts of "assigned time units" to work in the IMPULSE program. However, the amount of assigned time is less than the actual time spent on the courses, as faculty are constantly being asked to evaluate, review and make changes to the curriculumdd.

3. Adapting the Team-based Studio Approach

The IMPULSE faculty faced another challenge when implementing collaborative methods in the classroom. They initially felt that they were not prepared to help students work effectively in teamsee. In order to learn how to use a team-based computer-dependent studio approach to teaching, the IMPULSE faculty had to:

  • learn to work together

  • integrate assessment into all aspects of the program

  • come to terms with the "coverage" issue.

Once they made these adjustments, the IMPULSE faculty found that using collaborative methods definitely resulted in richer and more frequent interactions among students and between students and instructorsff. The instructors also noted that the close working relationships among all parties in the IMPULSE program led to much better advising of students. As one of the technical support people explained, because they are working together with a cohort of 48 students, the professors are able to identify problems with groups or individuals at an earlier stage than with the traditional lecture arrangement. Thus they are better able to intervene in real time and provide options, like tutoring, for students experiencing difficulty with the contentgg.

For more on how the IMPULSE faculty met the challenges of learning to use a team-based computer-dependent studio approach to teaching, see Discussion 7. Learning how to use a team-based computer-dependent studio approach to teaching.

4. Ensuring Sustainability

We asked the IMPULSE faculty to tell us what things would have made the implementation process easier "if only we had known that at the outset." Here we present several of their valuable "lessons learned" in bullet form.

  • Secure support at all levels of the institution

  • Plan a way to survive turnover in faculty

  • Get training from colleagues elsewhere in the nation.

  • Make the learning problems evident to departmental colleagues, but be diplomatic!

  • Make adaptations as necessary to suit your students.

For a discussion of these lessons, see Discussion 8. Tips for ensuring sustainability.


Summing Up

The IMPULSE instructors told us that high attrition rates, weak student engagement, and poor student performance motivated them to make improvements in their engineering program. To address these problems, they settled on a set of key strategies that are supported by evidence from around the country and that they believed would enable them to:

  • understand the reasons for poor retention, so that they could change the critical problematic factors,

  • help students understand why the foundation courses are important to success in engineering majors and future engineering careers,

  • help students relate the foundation courses to their own current experiences, and

  • provide learning experiences that are engaging and active and that result in greater gains in student understanding.

Their key strategies were to design a learning environment that incorporates the following elements:

  • integration of math, physics and engineering,

  • technology that supports conceptual and skill-based learning,

  • application of principles to novel situations, and

  • collaborative learning.

Another key strategy, at least in the view of the case study researchers, is the collection and use of high quality assessment information on what students are learning and why. As this case study indicates (Outcomes), these strategies correct the problems that motivated these faculty to develop the IMPULSE program. Data from the first-year pilot program showed that it more than halved the attrition rate of first-year engineering students, and nearly doubled the percentage of students passing two semesters of physics on schedule. Assessment data also showed that the IMPULSE students were engaged with, and more effectively learning, the concepts presented in their math, science and engineering courses: the percentage of students passing calculus on schedule increased by 40 percent, and IMPULSE students, taking common final exams in calculus, scored on average more than a grade point and a half higher than comparison students--despite the fact that the percentage of enrolled students who actually took the final was much higher for the IMPULSE program.

Why does this program work so well? At the outset, we learned from John Dowd that a key strategy of the IMPULSE program is that "students are learning in a cross-disciplinary kind of way... So that when they're doing math, they're also doing physics." He also named other key strategies: "The technology is a part of it; the teaming and the active learning are very important." And then he gave us his underlying reason for why these strategies work: they enable students to "realize that these subjects somehow connect to each other."

In our experience, the IMPULSE Program's goals, strategies and student learning outcomes are valued by the vast majority of the nation's science, mathematics and engineering faculty. We find it notable how quickly and effectively this program was institutionalized at UMD. We look forward to new information about the long-term outcomes of the UMD program. Finally, we hope to learn that others around the country find this account of their efforts useful.


Reader's Guide for the LT2 Case Studies

Special terms appear in the Glossary. The first time one of these terms occurs in a major section, it appears underlined and the definition is available in a mouse-over box. (These definitions also appear as lettered footnotes.)

All citations to which the case study refers are listed in the References.

Technical asides are indicated by a numbered footnote marker and available to the reader in a mouse-over box. These asides also can be found in the Endnotes.

Lengthy quotes from participants that illustrate a point often are available in mouse-over boxes (also as lettered footnotes), for the benefit of the reader who prefers to read the participants' own words.

Various topics introduced in the study are developed at greater length in Discussions (specified by number) to which the reader is referred at relevant points.

The reader is referred at relevant points to various other Resources (specified by letter). Among these is a short description of the Methods Used to Produce this Case (Resource B).

Of note for users of the web version: Clicking the "previous page" button will take you to the previous linear section of the case study, not necessarily to the page which you last visited. Clicking the "back" button of your web browser will return you to the section last visited.

In the quoted material, the researchers are identified as "interviewer" the first time their voice appears in an interview segment. Lengthier quotes appear in italics.

The instructors and administrators who are identified in the case study read the document and gave us permission to use the quotes we attribute to them. These University of Massachusetts Dartmouth readers also affirmed that this case study conveys the essence of what they were doing in the Spring of 2000.


Discussions


  • Discussion 1. History of the IMPULSE Program

  • Discussion 2. Making the connections: A discussion of the key principles used in the IMPULSE program
    "Integration--the fact that students are learning in a cross-disciplinary kind of way--is a key goal."

  • Discussion 3. Student views on the effectiveness of the IMPULSE program
    "I know for me personally IMPULSE helped me develop a really good work ethic"

  • Discussion 4. How learning from external experts was critical when getting started
    "Do you sit down and invent something, or do you look around and see what other people are doing? My expertise is in chasing physics particles"

  • Discussion 5. The role of organizational savvy
    "Everybody I've talked to has told me that you have to institutionalize it."

  • Discussion 6. Changes in the instructor role
    " If you put up a wall between you and the students, you've killed the program"

  • Discussion 7. Learning how to use a team-based computer-dependent studio approach to teaching
    "...in terms of concepts, the students know more. Now, you would think that if you know the concepts better you should solve the problems better, but..."

  • Discussion 8. Tips for ensuring sustainability
    "The first thing you need is to find a group of people that will communicate with each other."

  • Discussion 9. ASEE Conference paper by Pendergrass, Laoulache, & Fowler
    "Can an Integrated First-Year Program Continue to Work as Well after the Novelty Has Worn Off?"


Discussion 1. History of the IMPULSE Program

During 1994, Nick Pendergrass (electrical and computer engineering), John Dowd (physics), Bob Kowalczyk (mathematics), Jim Golen (chemistry) and Bill Nelles began discussing the increasing difficulty in motivating and educating freshmen. They read papers by Johnson, Johnson and Smithhh and by Foundation Coalition leaders, and decided together that they wanted to adapt the best ideas from these papers to improve the situation for first-year engineering majors. As do many faculty who are motivated to make major changes, they planned to seek external funding, and began thinking about a proposal to submit to the Davis Foundationii.

In late June of 1995, while at the annual meeting of the American Society for Engineering Education, Nick attended sessions that focused on the freshman year and how to conduct assessment to demonstrate whether reform efforts were achieving their goals for student learning. He also spent as much time as he could with these presenters, some of whom responded by inviting him to visit their institutions. So it was that he and the other four UMD faculty visited Texas A&M University, where they learned a great deal about freshman integrated curriculum programs.

    Nick explained: We were at Texas on a Friday. It was remarkable to watch these teams of students in class, and to just walk up to them and have them explain why the program was--and was not--working for them. We were impressed with what we saw and we figured, "We can do this at UMD, and maybe do it even better!" On Saturday morning we outlined our proposal to Davis [Foundation]. Then we went to an A&M football game, and came back and finished the proposal. Talk about a bonding experience!

    If anyone is interested in doing a program like this, they should go into one of these classrooms and observe here at UMD or at some other Foundation institution.

Knowing that the Davis Foundation insisted on good assessment, Nick decided to also attend the upcoming Frontiers in Engineering conference. There he connected with Gloria Rogers, an assessment expert at Rose-Hulman Institute of Technology and member of the Foundation Coalition, and Karen Frair, a Foundation Coalition leader from the University of Alabama. He incorporated ideas and methods they were using into UMD's Davis Foundation proposal. Nick and his colleagues submitted their proposal on November 15, 1996, and learned by late December that it was funded for $180,000.

During the next 20 months (January 1997 through August 1998), they used Davis Foundation and some campus funding to prepare for their fall 1998 pilot program. During this time, the following important events took place:

  • A studio classroom for 48 students was built;

  • Ray Laoulache (mechanical engineering) joined the group;

  • UMD invited experienced people from the Foundation Coalition to help them learn new teaching methods and change management strategies; and

  • UMD was invited to participate in the Foundation Coalition's proposal to the NSF for a second five years of funding.

The Foundation Coalition NSF proposal would provide support to UMD for assessment, course development time to expand the IMPULSE approach into sophomore- and junior-year courses, "change management" activities, and travel to conferences where IMPULSE could present their efforts and continue to learn from an emerging national learning community of faculty who were exploring the value of combining technology, teamwork, and decentered teaching. The Coalition proposal was funded as of October 1998. Among other things, this enabled them to engage an assessment specialist, Emily Fowler.

During the fall of 1998, IMPULSE began with an initial cohort of 48 first-semester freshmen who were calculus-ready and had passed the English placement examjj. In February 1999, when their initial assessment results were available, Pendergrass began a bid to institutionalize IMPULSE within the College of Engineering. This bid succeeded in May 1999, when all but one department adopted a modified version of the program. This meant that some 80 percent of the next cohort of freshmen would enroll in IMPULSE.

In March 1999, aware of the likelihood of expansion, Nick submitted to the Davis Foundation a second grant proposal for the funds for two more studio classrooms. In July 1999, Davis provided another $150,000 for these classrooms. UMD needed one of these to be ready by fall 1999. Due to this challenging implementation schedule, Nick decided he needed to release himself from the role of UMD PI for the Foundation Coalition and focus on getting the classroom built and on other campus-level matters associated with scaling up the IMPULSE program. Thus, he asked Paul Fortier, a member of his department (Electrical and Computer Engineering), to assume the PI role. In summer 1999, Fortier (at the time untenured) agreed, freeing Nick to accept the role of Associate Dean for the college and focus on the classrooms. Nick remained active in the Foundation Coalition as a member of its advance planning team.

With the second classroom built in time for the second IMPULSE cohort of 87 students in fall 1999, Nick turned his attention to the third classroom, needed by January 2000 both to extend the ideas of IMPULSE into the sophomore and later years of the engineering program and to provide scheduling flexibility: 41 pre-calculus students needed "trailer sections" of IMPULSE classes in the spring semester. By January 2000, they had three studio classrooms operating.

By fall 2000, IMPULSE was serving 144 new freshmen students, and the IMPULSE leaders found themselves having to compete with other professors for time in the studio classrooms. As Nick explained, "Once the faculty found out what you can do in these rooms, you can't keep them out." Currently, these classrooms are in such high demand that it is difficult for the technical support staff to get in to repair machines.

The original group of IMPULSE faculty realized that it is important for IMPULSE faculty to be very effective with freshmen. They also decided it would be important to continually include new members in their group, both to renew the group with fresh ideas, and to replace people who retired or left for sabbaticals or other reasons. They learned that keeping the group resilient also depends on an effective process for integrating new members. John Dowd provided an effective prototype for integrating new members. Preparing to retire, he hand-picked and trained a successor. The training consisted of co-teaching his course with his successor (Renate Crawford) for a semester before he retiredkk.

Nick observed that the resilience of the core faculty is also significantly enhanced by timely changes in leadership. For example, in order to ensure that the program took on a life of its own and to avoid its becoming too strongly associated with an initial founder, Nick decided he would step out of the leadership role. Of course, once on their own, the new course leaders had to experience some of their own "organizational learning." But it is precisely this kind of experience that enables the new leaders to "own" a program. Fortunately, the continuing faculty already had begun using formative feedback regularly and had learned the importance of getting together every week to discuss new information about the program, and make needed adjustments. As a result, the program is getting better, as demonstrated by student outcomes data as well as the robust group of faculty participants.

At the time this case study was completed (spring 2002), IMPULSE continues to thrive, serving 80 percent of the incoming first-year engineering students plus physics majors. In addition, the IMPULSE faculty and others have begun using many of the methods that have been found successful with the first-year students in their upper-division courses.


Discussion 2. Making the connections: A discussion of the key principles used in the IMPULSE program
"Integration--the fact that students are learning in a cross-disciplinary kind of way--is a key goal."

As noted in the section on Problems and Goals, the IMPULSE faculty focused on four key principles for designing an effective learning environment:
  • Integration of math, physics and engineering;

  • Technology that supports conceptual and skill-based learning;

  • Application of principles to novel situations; and

  • Collaborative learning

Integration of math, physics and engineering
The IMPULSE faculty chose to integrate math, physics and engineering in order to connect concepts, applications, and methods. In the traditional classroom setting, these subjects are typically taught in an isolated manner, leaving students with the responsibility of drawing the connections and relationships between them. The end result is that, while students spend hours learning derivatives, for example, they are unable to use them in a different context.

    John Dowd (physics) commented: Integration--the fact that students are learning in a cross-disciplinary kind of way--is a key goal. So that when they're doing math, they're also doing physics. They realize that these subjects somehow connect to each other. In the past, I've had students come to me who know how to take the derivative of x squared; it's 2x. I ask them: "What's the derivative of t squared?" "I don't know; we didn't study that." They're doing mechanics. Many of the derivatives and integrals are over time. They don't make the conceptual jump, going from one symbol to another. They don't realize the symbol can stand for anything. The technology is a part of it; the teaming and the active learning are very important.
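
Dowd's point about symbols can be made explicit with a minimal illustration (ours, not part of the interview): the differentiation rule does not care what the independent variable is called,

    \frac{d}{dx}(x^2) = 2x, \qquad \frac{d}{dt}(t^2) = 2t, \qquad \frac{d}{du}(u^2) = 2u.

In mechanics the variable is usually time, so the same rule that students memorize as "the derivative of x squared is 2x" is the one that takes a position x(t) to a velocity v(t) = dx/dt, and a velocity to an acceleration a(t) = dv/dt.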

Technology that supports conceptual and skill-based learning
Technology use is an integral part of the reform. IMPULSE instructors understood that today's students see computer use as a sign of a modern, up-to-date curriculum. Nick Pendergrass (electrical engineering) explained that using state-of-the-art equipment in the classroom helps get both student and faculty attention and interest:

    Our objective was to use technology to accelerate and improve the learning process and help motivate students to do what we wanted to get done. Technology somehow helps bring relevance to the classroom for the student. In the mind of today's student, if computers are involved and if a lot of video things are involved, they seem to be more willing to pay attention, and believe that this is a forward-looking environment that they're in. So it has a motivational factor for the students. There's no doubt about that. The other thing is that it's much easier with these classrooms to attract really first-rate faculty in to teach freshmen.

Learning content is always important to IMPULSE instructors, and they believe that the computer-based studio classroom, combined with this new pedagogy, helps students learn to apply content.

    As Bob Kowalczyk (mathematics) said: First of all, I hope they learn calculus better. So whether I teach in a traditional course or in this course, the underlying goals are the same: to teach them good insight into calculus as well as how to apply calculus. It's just the way we go about trying to impart the knowledge to the students and how they learn is completely different in the IMPULSE environment. And you know, this is still an experiment. We got some assessment data from last year. We compared the calculus students in IMPULSE to students in the traditional section, and it seems that with this new pedagogy they seem to be learning better.

Application of principles to novel situations
Getting students to understand basic concepts was a clear goal of IMPULSE. As noted by researchers in the national physics reform movements (e.g., Workshop Physics), students may be skilled in the mechanics of applying memorized formulas, but unable to relate these calculations and their results to the concepts. At UMD, IMPULSE instructors want their students to be both conceptual and problem-solving learners.

    Renate Crawford (physics) explained: I have really high hopes that get crushed a lot of times--that they really understand the fundamentals. We need a way to measure both their conceptual understanding and their problem solving skills. These are different types of skills. I would really like for them to understand the relationship between the two. Sometimes I give them a problem on one of the worksheets. They throw a ball up into the air and I ask them, "How long does it take to come down? How high does it get in the air, and what is the velocity?" They plug in all the right values at each point. Then I ask them to graph it. Disaster. They think the acceleration is a sine function; but they understand that whenever they see "a" (acceleration) in a free-fall problem they should plug in -9.8 m/s². But they don't really understand what they have just done. So that's what I'm hoping, that they understand the concepts, that they can do the problem solving and see the relationship between the two. And it hasn't happened yet, so I'm redesigning yet again.
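
For readers outside physics, the relations behind Renate's example are the standard constant-acceleration (free-fall) equations the students are plugging numbers into (a clarifying sketch added here, not part of the interview). Taking upward as positive,

    y(t) = y_0 + v_0 t - \tfrac{1}{2} g t^2, \qquad v(t) = v_0 - g t, \qquad a(t) = -g \approx -9.8\ \mathrm{m/s^2}.

The acceleration is a constant, the velocity is a straight line, and the height is a parabola; a graph of anything else (a sine curve, for instance) signals that a student has memorized where to plug in -9.8 without grasping what the symbols describe.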

Collaborative learning
Another important strategy for the program is to use collaborative learning methods, which UMD faculty called "teaming." IMPULSE faculty are very aware of the team approach to projects in industry. Their intent is to get students to function in collaborative settings that research shows facilitate learning, and that resemble working environments in the real world.

    John Dowd (physics) explained: For the engineers, teaming in itself was a goal. As Nick pointed out so many times, engineers work in teams when they get out. The companies don't want ivory tower type workers, who work by themselves in a corner; it's not how companies function. They work in teams; sometimes, a team works on a design project for six months or a year, then it is broken up. Then comes a new project and they rearrange the people depending on who's available. That was very important for the engineers.

A technician who works in the program gave his perspective on the need for collaborative learning:

    The whole emphasis was collaborative learning, okay, in a computer-based environment. In the traditional lecturing system, there seems to be some gaps, and the students aren't necessarily picking up the material, and they aren't necessarily relating it in between the individual courses. So they wanted to get the professors talking to each other, so that you would have a more cohesive program, and also get the students to work as groups, because we're training engineers. We educate students on a basis of individual performance, and then they get a job at a company like Boeing, they immediately are dropped into a team environment, because there is very little you can accomplish in engineering by yourself. So IMPULSE wanted to emphasize collaborative learning from a very, very early point, and also use this as an opportunity to bring in computers, and integrate the computers into the curriculum.


Discussion 3. Student views on the effectiveness of the IMPULSE program
"I know for me personally IMPULSE helped me develop a really good work ethic"

The following interview excerpt provides insights into why IMPULSE students believe this program prepares them for engineering better than the traditional approach.
    Student #1: Because you have a better relationship with your instructor. It seems like they work a lot harder. At first they put a lot into it and they wanted everything to go smooth. Some things were a little bumpy like the equipment didn't work as they expected it or whatever.

    Student #2: I know for me personally IMPULSE helped me develop a really good work ethic because I just had so much homework that now I feel the need to do homework. In high school I didn't do homework. I still did well in high school, but I never did my homework. But when I came here I just had to do my work because there was just so much of it; it was just overwhelming; they did give us too much, but it helped me out in the long run.

    Student #3: Also most of the time you didn't want to let your teammates down. In my group I saw the one kid letting the whole team down, and that really bothered me; so I didn't want to be the one that was doing that.

    Student #4: Working in groups does bring your average down. So when I go home I work hard.

    Student #5: The projects that you do in class help you learn a lot, too. The different concepts that you try on paper or see in the book, you can't quite understand until the physics teacher gets up in front of the class and does a little demonstration; and you're like, "Oh yeah, that's how it works." I felt that IMPULSE helped me absorb concepts and help me understand and learn things better, so in my classes now I can understand and learn things quicker.

    Student #6: We did more projects than the other people that weren't in IMPULSE. Like in engineering we did the cardboard boats and the bottle rockets, and that helped us learn so much stuff. It was fun, and no one else in the other engineering classes had projects that they had to do.


Discussion 4. How learning from external experts was critical when getting started
"Do you sit down and invent something, or do you look around and see what other people are doing? My expertise is in chasing physics particles"

Nick Pendergrass knew that, in order to succeed in piloting and establishing the IMPULSE Program in a reasonable length of time, it would be necessary to learn from others nationally who already had tried many of the methods that interested the IMPULSE faculty. He took the lead in this regard by introducing a multidisciplinary group of faculty colleagues to national reform trends and arranging workshop sessions in which national experts trained them in, for example, teaming and collaborative learning methods. He reasoned that in the university environment, seeking external expertise often lessens the probability of personal/departmental conflicts and jealousy, and also increases the chance that new reform ideas will be adopted. It validates new methods that are being promoted.

    John Dowd described his experience of learning from other universities: So I started going with Nick and with other faculty members to different universities that were doing this. Texas A&M, for example was one. Also, we went to a couple of meetings at Arizona State. We also went to hear an expert in interactive learning give a typical teacher seminar. And then I started planning the curriculum and also the layout of the room. We did everything, from what equipment to buy to the nuts and bolts.

Another important process in which they engaged with significant help from colleagues across the nation was to select the curriculum that would constitute the backbone of IMPULSE. While the UMD faculty were "reform-ready," they did not have the expertise needed to design an 11-unit curriculum based on active learning methods. They sought assistance from Dr. Priscilla Laws at Dickinson College, and Dr. David Sokoloff at the University of Oregon, leading national experts who helped develop Real-Time/Workshop Physics4 and microcomputer-based laboratory materials that are designed to promote inquiry-based learning in physics. One of the IMPULSE faculty, John Dowd (physics), explained how they first linked up with the leaders of Workshop Physics:

    I was trying to figure out what to do. Do you sit down and invent something, or do you look around and see what other people are doing? My expertise is in chasing physics particles. I had no particular expertise in this [matter of active learning]. So I went to one of the Chautauqua NSF workshops at the University of Oregon, taught by David Sokoloff and Priscilla Laws. I went there particularly because I had read about what these people were doing.ii The next year, both Renate, who was at that time a relatively new faculty member, and I went to a follow-up workshop with Priscilla Laws at Dickinson College in Pennsylvania. So we had a pretty good idea of what this approach is capable of doing.

    I decided to adopt Priscilla's Workshop Physics as a basis for our course. Within our particle physics research group, one of our collaborating institutions is Rensselaer Polytech and one of our collaborators there taught in the RPI studio course. So a couple years before we instituted IMPULSE, Nick and I and two or three other people got the school van and went out there and spent a day at RPI looking at Studio Physics.

The faculty participants also committed substantial time to learning from external experts and then talking with each other and working out the integration of topics.

    Bob Kowalczyk (mathematics) explained: It probably took about two years of preparation. We visited all these other schools to see what they were doing, to see what the idea of teaming is. They have this whole philosophy of teams and cooperative learning. We went to workshops with Johnson and Johnson, and with P.K. Imbrie at Texas A&M. He's very good at this and did a lot of teaming workshops for us.

    Then we had to sit down with a physicist and determine how we were going to integrate calculus with physics. For example, in the standard calculus program, engineering students take calculus in the first semester, and physics the second semester, so they've had differentiation, integration and more. In the IMPULSE program they're taking physics at the same time they are taking Calculus 1.

Choosing the studio classroom design and deciding how many students each room should accommodate were also critical factors in the planning stages of IMPULSE. The faculty had to decide among several models, from one designed for 12 to 16 students (like the one Priscilla Laws uses at Dickinson College) to one for over 100 students resembling a large lecture hall (like the one used at Texas A&M). John Dowd (physics) explained that the factors important to him in choosing a classroom design were that it (1) promoted cooperative interaction and group dynamics among the students and (2) was feasible to implement in UMD's environment. They decided on a room that accommodates 48 students (more than their old labs, but fewer than their lecture halls), and that has big tables around which student groups sit. This size optimized a number of factors. Nick explained that they could tell administrators, faculty and potential funding agencies that, in spite of a large initial investment, the program offers savings in the long run, as Dr. Wilson at RPI had demonstrated:

    Jack Wilson at RPI started building this idea of the studio physics program. He discovered that it would actually be cheaper to deliver instruction that way, so they piloted it. I'm not sure that they did it all at one time because it's pretty expensive to build all this stuff. They discovered that the students responded, they did the work, and liked it better.


Discussion 5. The role of organizational savvy
"Everybody I've talked to has told me that you have to institutionalize it."

In this Discussion, we consider the organizational savvy and leadership skills that led to the conception and successful implementation of the IMPULSE program at UMD. We primarily focus on the role of two reformers--Nick Pendergrass and John Dowd--who showed a great deal of the administrative and curriculum leadership instrumental in getting IMPULSE going. To use a political metaphor, Nick Pendergrass (electrical engineering) was like the campaign manager for the program. His contributions were numerous, from helping conceive a project that could reform freshman education in engineering at UMD, to making a case to the institution for support and acceptance, to garnering external funding and encouraging its faculty. John Dowd (physics, now retired), on the other hand, took the lead with regard to curriculum design and pedagogical change issues.

Developing a community of reformers
High on the list of Nick's priorities was developing a community of like-minded reformers among peers in the disciplines of engineering, math and physics. Nick accomplished this by focusing the conversation on a subject that many faculty members could relate to--student learning, or the lack thereof:

    I got involved with some other faculty on campus. We were talking about how different the students coming in were; that students were looking at things in little stovepipe kind of cells. The thing that actually got the conversation going was a physics faculty who said, "Yeah, you know they've had calculus before they come to Physics 1. And I put up an integral and say you (the students) all know what this is? And they're staring at me blankly and it took me a while to figure out that the reason was because I had a "t" in it instead of an "x", under the integral." He had to reconstruct calculus to some degree in physics. And we started talking about that and realized that's a darn good place to learn calculus, by working with physics problems. Isn't that how Newton kind of got into it? It was particle motion. And so the conversation got going among several of us as faculty concerned about how students were learning today.

We asked Nick how he recruited specific faculty members from physics, math, and engineering--all disciplines with different traditions--to participate in IMPULSE.

    It was easy. Sitting around a table talking about our challenges. This is a small university. At the time physics wasn't in the College of Engineering. Neither was computer science. Math is in the College of Arts and Sciences. But those of us who were sort of interested in curriculum reform and making education better, we'd find each other and we started talking. I would go over and say, "I think we should put a proposal to the Davis Foundation to build a classroom that looks like this. Would you like to be involved?" That was easy. I also knew many of the faculty, like John Dowd in physics, for example. So we had exposure to each other. Then I talked to somebody in physics about "Well, who do you think might be really interested in picking up the calculus portion?" Bob [Kowalczyk] has been working for years on something called TEMATH, which is a software program that helps calculus students understand derivatives and integrals. And so there's a person who might want to move into a technology-equipped classroom to teach calculus. It turns out when I started talking to him, he was really frustrated that they didn't have a room like that, and he couldn't ever get one in the College of Arts and Sciences because of space and budgetary limitations. And so he jumped into this as a way to get his room. Turns out, he did get his room in the Math Department patterned after the model of the IMPULSE classroom.

Garnering financial support
If developing a community among peers at UMD was important, making a case for external support was crucial. As Nick pointed out, internal support only comes after the ideas have been blessed in kind by external agencies:

    We could show the Davis Foundation that it lowered the cost of instruction. The Davis Educational Foundation wants to improve education but also to lower its cost and so we were well-tuned to their particular criteria and they gave us $180K to launch this program. The university came up with something more than $180K to renovate the classroom. Our budget from the Davis Foundation included equipment and getting people like you to come and talk in order to get people to think that change is not bad.

However, not everyone gets to ask for and receive this kind of financial support from external foundations, or for that matter his or her own institution. How did Nick manage to convince these groups to invest money in his idea?

    It's a whole lot of going around, showing slide shows, and telling people that it's a good idea: "Look what we can do with this." What really helped a lot was we had things from RPI and Foundation Coalition schools like Texas A&M where we could show attrition rate curves that looked good. What I could help them understand is that we're not doing a bold experiment. We're going to build on the best programs in the country.

In order to garner support from within UMD, a certain amount of political skill was required. Nick realized that a dean-level position could help him significantly in gathering support for redesigning classrooms for an inquiry-based program:

    I started to hit some grants in that direction. The Dean decided to change my title to Associate Dean for Undergraduate Programs so that building classrooms could be done more easily. And it would add a little bit more credibility to the idea... At some point, I started devoting 100 percent of my time, other than my teaching load, to worrying about undergraduate programs, especially the freshman year.

John Dowd (physics), who is now retired, was an original member of the team. He explained the importance of getting buy-in from all segments of the institution for a program as ambitious as IMPULSE. He also commented on Nick's skills in crafting this coalition and selling the program:

    Everybody I've talked to has told me that you have to institutionalize it. You have to have buy-in from everybody from the president of the university to the faculty members who are going to be teaching, to the TAs and to the students who are going to be in it. And you can't do that if you rely exclusively on one person. I had planned to retire. So I went into this knowing that in two years I'm going to retire. I'm going to get this thing going and then walk out the door after it runs the first year. So I was aware that I had to do as much as I could to involve other people. Having it spread over three different classes was a big help with that.

    Nick was fantastic in getting buy-in from the top --- the Engineering School, his compatriots in the Engineering School, and also the higher-level administration. The administration saw IMPULSE as something they could show off on the campus. That's one of the reasons for having the fish tank (large glass panels allowing passers-by to look into the classroom). They could bring the Trustees or other VIPs for them to look at how hard the students work. This is one of the things the fish tank window does, despite the fact that the kids hate it.

It certainly did not hurt having the former Dean, now Provost, as part of that supportive coalition:

    I had a dean who was not at all reluctant to change things, as long as we didn't get the faculty really unhappy. We verged on that last spring. He was proud of being able to show these statistics to the provost's office and so on. So, he was an advocate for doing it. Because we knew that typically if the Dean comes out for something, the faculty become against it, he had to quietly be in the background helping with resource issues and so on. At lunch the other day, he looked at me and he says, "I still don't know where you got all that money." He provided some of it. He may not even be aware he provided some of it. But, it worked.

Designing the program
Lastly, there were programmatic and curriculum issues that needed to be tailored to the institution's culture. One key strategy here was to run a "pilot" program. Nick explained that the term "pilot" helped alleviate anxiety that can accompany an abrupt change:

    We were able to get people to buy into it as a pilot. It would have been virtually impossible to start this program without the word pilot in it. Let us try it. We will have firm milestones. We will prepare and then we will review the program. Our plan was to review it in the late spring and into the summer after one year of operation. We ran it the first semester and we were already seeing, even by midterm, remarkable results.

A second key strategy was reducing the number of credit hours allotted to IMPULSE, which meant re-writing material being adapted by the program.

    John Dowd (physics) explained: I realized I was going to have a problem using Priscilla Laws' material. She had a six-hour block of time that she uses. I forget exactly how it's broken up, but it's six hours a week. One of the great drawbacks of bringing the engineering faculty up to RPI is that they discovered that physics supposedly can be taught in four hours. That was the lesson from RPI-- that physics can be taught in four hours. That established the parameter. I didn't expect that to happen, but nonetheless, that's what happened. We had been teaching physics in six hours, but after all was said and done engineering decided that if it could be done in four hours, so much the better. I'm exaggerating, but nonetheless, that was a hard and fast parameter to sell the program, I think, to the engineering college. The students wouldn't spend that much time in it, but we were then trying, I kid you not, to have them learn as much or more than in the traditional six hour courses.

And a third strategy was weaving the idea of assessment into the program and getting colleagues to accept the validity of the resulting assessment data. This task required some doing on Nick's part:

    We knew we wanted to do the assessment. We also knew that faculty would, if they had some clues about the program, start to say, "Well, if students self-selected their way in there, they must be the better students." So we decided that we wouldn't. We would simply randomly select. And that's what we did. So, the students showed up at orientation. They were told they were IMPULSE students and we took them in the classroom. We were also concerned about faculty reaction to the assessment data. So we hired an assessment specialist and we put the entire data gathering into institutional research.


Discussion 6. Changes in the instructor role
" If you put up a wall between you and the students, you've killed the program"

We asked the IMPULSE faculty and staff how the use of computer-based technology changed their role as instructors. In general, they felt that shifting from a lecture-based style to a collaborative environment did not come naturally and easily to faculty whose entire teaching and learning experiences were based on the lecture method. The following is a summary of their responses to our questions.

The shift to the "coach" role
The IMPULSE faculty made clear that in order for this program to be successful, they needed to play the role of coach to a greater extent than any of them were used to.

    Nick explained, For example, both the physics and calculus instructors are frequently in the mode where students are showing up at their door asking for advice about how to approach an engineering problem that involves some physics or math. To the extent that we're able to pull that off, to be able to recast the instructor as a mentor and coach instead of being this person who lectures to you and demands that somehow you got to redo it on the exam, we are successful.

In part because it was challenging for the faculty to adopt a coach role with students, they found themselves more inclined to learn from one another.

    As Nick put it, I don't think we've exploited this coaching method as far as we can. It started to work especially well last semester, as the faculty progress on this learning curve, and they learn to work together. We're not used to sequencing courses and worrying about projects together. That's an interesting thing for faculty to get involved with. As they've learned together, we're getting better at it.

Integration of subjects was slow in coming, in part due to struggles faculty faced in applying collaborative learning methods in an inquiry-based classroom.

    Ray Laoulache (mechanical engineering) explained: The problem we faced is that originally we did not integrate the coursework. We could not integrate the subjects, because it was the first time ever in our lives we walked into a classroom where we are going to tell the students it was active learning. So I had to recondition my whole thinking from being an active professor on a blackboard to being a mentor, OK? So it took me a while, it took me a semester basically to bring myself up to speed, learn about all the aspects of teaming. There's a learning curve for everyone. Now that I've got that under control, now we are spending more time on integrating.

In fact, as IMPULSE faculty discovered, promoting teamwork was easier said than done. According to Ray, it takes a special kind of teacher to be successful in a team-based classroom:

    I'm able to walk into the aisles and talk to the students, pat them on the shoulder, be involved so they feel I am part of the team. When the instructor walks into the classroom and he or she still has the attitude "I am on the blackboard," then he or she is doomed. If you put up a wall between you and the students, you've killed the program.

Ray believes that the success of the program depends heavily on the ability of the participating instructors-a concern that has significant implications for whether a program will outlast a particular group of faculty participants:

    My opinion is that, in large measure, the success of IMPULSE-like programs depends on the personality of the instructors. Let's say you take two instructors and send them to a workshop by Johnson & Johnson on active learning, and they both understand the material and what it takes to develop the curriculum. You put them in an IMPULSE program, and you could have one person doing a hell of a job with the students and another who did not get through. It is a fine rope to walk on when you are dealing with teaming, because if you don't find what it takes to trigger an interest in the students, you will fail to teach the students.

Students with whom we spoke appreciated a more personable relationship with their instructors. The general sense we got from students is that IMPULSE faculty genuinely cared about them and about their success in the program. The following is an exchange between the case researchers and a group of students discussing the impact of IMPULSE reform on the teacher-student interaction:

    Student #1: Well the teachers we had, they were more personable. They tried to get very close to you and not be professors. "Professor Laoulache" was "Ray." The other ones weren't that relaxed, but they weren't just standing up there.

    Student #2: They all know who you are, not just because you're the geeky kid that sits in the front and is always kissing up to them, or you're the stupid kid that's always coming in late and getting in trouble. They just know who you are, so that when you go to talk to them, you don't feel intimidated or anything like that.

    Interviewer: Does the relationship you developed with the instructor continue on in other classes?

    Student #3: He's always asking how everybody is doing whenever you run into him. He's a mechanical engineering faculty. I'm not going to have him again ever for a class.

    Student #4: When you see them in the hallway they're like, "Oh hi, how are you doing? How are classes?"

    Interviewer: Do you think all of these positive things could have been achieved without computers?

    Student #5: Yes, but I don't think the teachers would go for that style. You're sitting there working on the computer, the teacher doesn't have to stand up front lecturing because you're loading a program that runs for you off the computer, the teachers do have the time to go around and talk to you individually.

Managing the increased workload
Almost unanimously, faculty at UMD pointed to an increase in their workload as a result of their participation in IMPULSE. Ray reflected on his own erroneous impression that it is easy to work in an inquiry-based environment:

    I was told that when you are dealing with an IMPULSE active learning classroom, you don't lecture at all, you are free from this preparation, you go to the class and you sort of relax, you ask questions. But I am working twenty times harder in this course than I used to in a traditional course. It turned out that there is more work in IMPULSE.

The increase in workload has to do in part with a shift from lecture-style teaching to one where the teacher constantly interacts with his or her students. The interaction is a carefully guided one, in which faculty encourage students to explore science and let them do the exploration.

    Renate Crawford (physics) explained: It's a completely different environment. It's a completely different energy level that you need for it. In a lecture class you are completely in control. You go in there with prepared notes, which you probably have used before. I always encourage questions, but still it's completely different than walking from group to group, figuring out what they are doing, where they are making mistakes, where are they getting hung up. You have to ask them the right questions without telling them too much which is so tempting, because you feel like, "Please now understand that you do it this way." You want to just sit and do it for them. So it's a completely different energy level. A lot of the times when I come out of there, I feel like I'm drained. You have to give it so much because if you go in there and you just sit back and let it happen, they are not going to get the same learning experience; so you have to walk around; you have to be very interactive with them.

A fair amount of work was also required to adapt the material from RealTime Physics so that it could be used in a four-hour time slot at UMD.

    John Dowd explained: Also, you don't have to but we all produced a lot of written material. I essentially edited Priscilla Laws' book from one designed for six hours a week to one designed for four hours a week, and put it out every week. I did two labs a week. With fourteen weeks, that's twenty-eight labs. They're all nine or ten pages long. That's a lot of work.

We did find one instructor in the program who felt that the IMPULSE methods, particularly use of technology in the curriculum, were quite in line with the kind of teaching he had been doing for years.

    Bob Kowalczyk said: Well, I don't think they were at all surprised, because, as I mentioned earlier, another professor and myself have written a math software package that was published back in 1991; so for at least ten years already, he and I have been using technology in the classroom. We've been bringing a computer into the classroom; before it was LCD panels and now the projectors, so we've always been using technology in the classroom. It was either [another professor] or I who had the option to teach in this program. I won (or lost, depending on how you look at it!) and I became the first instructor in IMPULSE. I don't think anyone in my department was surprised that I became enrolled in this program because we were the two instructors out of the whole department that were using technology the most. That has changed. Now a lot more instructors in the math department are using technology because we have our own computer classroom.


Discussion 7. Learning how to use a team-based computer-dependent studio approach to teaching
"...in terms of concepts, the students know more. Now, you would think that if you know the concepts better you should solve the problems better, but..."

In order to learn how to use a team-based computer-dependent studio approach to teaching, the IMPULSE faculty had to:
  • learn to work together,

  • integrate assessment into all aspects of the program, and

  • come to terms with the "coverage" issue.

Learning to work together effectively
The process of learning to work together as an effective teaching team involves keeping communication lines open and being flexible about accommodating each other's topics. Bob Kowalczyk (mathematics) explained that working together means being flexible about changing one's course curriculum to fit the needs of others:

    I think if you want it to work you have no choice. It seems like physics has pretty much a set curriculum and both engineering and calculus had to really adapt to the physics content. So it wouldn't make sense for me to be doing differentiation while the physics instructor is doing integration. If I change the order of topics a little bit, maybe I can teach them integration before they actually see it in physics. When students see integration the week after in physics, they say, "Oh, gee, I just learned this in calculus so now I'm seeing it in physics." It makes more sense.

Bob Kowalczyk also explained that once communication is established, it is easier to synchronize the content in ways that reinforce the concepts through real-world applications.

    The main thing is that the engineering, math and physics faculty have to talk to each other. We try to meet during the semester and see how things are going. The engineering professor last year said he was doing this rocket project and asked if I could do some linear approximation in my calculus class. So the next class, I went in and fortunately I was near that material, and I did some linear approximation so he could use it in his engineering project. So, there's that kind of interdependency between the faculty. We probably should do even more of that because the more we talk and work together the better things get. But of course sometimes it's not that easy. Besides my students, I think I need to be in contact with the engineer and the physicist.

Integrate assessment into all aspects of the program
In addition, the IMPULSE faculty had to integrate assessment more fully into the structure of their courses.

    Nick explained: I believe we are better prepared for accreditation because IMPULSE has continuous improvement loops built-in. This gets us to build an assessment way of thinking among the faculty. You do one of these courses; you're assessing it.

Come to terms with the "coverage" issue
Another difficulty encountered by all faculty who adopt collaborative learning methods is the "coverage" issue. Nick explained that there is a serious on-going debate about whether to focus more on the content presented or on the content learned:

    You know, the first question that comes up is, "Wait, with cooperative learning, I don't think I can cover as much." Well, the answer is, "Yeah, you can't cover as much when you expect them to really learn it." On the one hand, people feel like they have to cover all that stuff--write it on the board, make sure they at least see it, even if they don't understand or learn it. We've been through these arguments, and this is a debate we're going to have forever. The only way to deal with that debate is to decide what it is that's important in your classroom and then go measure it.

In addition to the view that some lecturing should be retained, some instructors clearly feel that students should be proficient in solving rote problems, particularly given that these students will take more traditional courses after their IMPULSE experience.

    Indeed, as Nick explained, physics had made some adjustments to that effect: Physics decided this year to actually run a recitation with this classroom, but it's 48 students in a one-hour recitation. They wanted to do this because the lab is merged into IMPULSE and they were uncertain that the students were actually developing their own understanding, and they felt they needed to work the more rote problems to manipulate the symbols and work the problems. So they decided this year to have an extra hour of recitation. And I think that's worked in their favor. But the total number of class hours went from six to four during the pilot, then to five.

In this regard, John Dowd (physics, now retired) suggested that future adopters turn to education research to address the question of why students who shine at learning concepts in these inquiry-based environments do not show comparable gains in problem solving:

    There's no question that, in terms of concepts, the students know more. Now, you would think that if you know the concepts better you should solve the problems better, but they don't practice solving problems as much. If I was in this game longer, I would investigate if conceptual learning helps you do better in problem solving. Apparently, it doesn't hurt, but I think it should help somehow.

And, as would be expected, there is a portion of faculty and staff who hope for a return of the old ways.

    For example, a computer support person stated: I don't particularly feel that IMPULSE trains the students. Once they leave this place, they're not going to be using virtual probes or virtual equipment; they're going to have to know how to use a piece of equipment that happens to be sitting in the company that they're working with. And I don't think it's going to be all computer-driven. If what we're doing is preparing students to go out in the working world, I don't think that's a proper way to do it. I'm still for the old school way. My general feeling is they're pushing more towards the engineering and design end of the education and from what I've seen, and I guess this is true everywhere, it depends more on what the student wants to do. If a student has a mindset to go further, they go further.


Discussion 8. Tips for ensuring sustainability
"The first thing you need is to find a group of people that will communicate with each other."

Secure support at all levels of the institution
As noted above ("Rewards"), the IMPULSE faculty all reported that implementing this kind of program requires an increase in workload. It is unlikely that a program requiring this increased level of effort can be sustained without support and recognition. Ray Laoulache put it bluntly: "If you do not have the technical support and the institutional commitment to support you unconditionally, don't even think about it." All the IMPULSE faculty we interviewed confirmed that it is imperative that at least department chairs, and perhaps deans, recognize this extra commitment.
    Bob Kowalczyk explained: I think within the department you need to get a lot of support from your chair, because teaching in this environment is a lot more work than in a traditional classroom. In a traditional classroom I can just go in and lecture for fifty minutes and I'm done. Whereas in this classroom it's much more difficult to do that, because you've got all these students in teams and if you try to lecture too much, students would just start socializing, and whatever. I've been hearing from the other instructors that have been teaching in IMPULSE this year that they are finding teaching in this program is more work than in the traditional classroom.

In this regard, some IMPULSE faculty were beginning to ask for release time to participate in the program.

On the managerial side, it is important to have good technical support as well as somebody who will be responsible for all technological program details.

    A computer support person told us: You need a strong project manager if you're going to do this. I have no project management training, and I don't claim to be a project manager. The biggest thing that this project lacked was someone to sit in on all the meetings and make sure that all the details were taken care of. If you're going to do it, make sure the technical expertise is in place because the faculty does not necessarily understand the implications of some of their decisions. Make sure that there is a budget for technical support as well as money to update hardware and software.

Plan a way to survive turnover in faculty
It's an old story: a really innovative instructor gets going, changes a course, leaves it, and the course drops back to the status quo, as if nothing had happened. Nick explained that they made a big investment in this program and didn't want that to happen. He explained that to keep it going, the following "team" factors are critical:

    The team effect among the faculty makes it much more stable. If there is a team effect among the students, that makes it much more stable too. If you've got a faculty member who decides just to lecture through the math class (and it did happen some this semester), the students push back at her a little bit. The students will start to correct the things the faculty didn't do. So those are the exciting things, and at the same time they also deal with the sustainability problem.

This said, IMPULSE faculty also made clear that thoughtful selection of the central group of participating faculty, whether those who start out with the program or those who sustain it over time, is of paramount importance. The instructors should not only be "reform-ready" and willing to collaborate across disciplines but also be good communicators.

    Bob Kowalczyk (mathematics) explained: The first thing you need is to find a group of people that will communicate with each other. If the engineers, the physicists, and the mathematicians don't talk to each other, you can get into a lot of trouble. I think it would become a traditional program in a different kind of classroom. So get whatever support you can to be sure that the different colleges and departments communicate with each other.

The original group of IMPULSE faculty realized that it is important for IMPULSE faculty to be very effective with freshmen. They also decided it would be important to continually include new members in their group, both to renew the group with fresh ideas and to replace people who retired or left for sabbaticals or other reasons. They learned that keeping the group resilient also depends on an effective process for integrating new members. John Dowd provided an effective prototype for this process. Preparing to retire, he hand-picked and trained a successor. The training consisted of co-teaching his course with his successor (Renate Crawford) for a semester before he retired.[ll]

Nick observed that the resilience of the core faculty is also significantly enhanced by timely changes in leadership. For example, Nick decided that, in order to ensure that the program took on a life of its own and did not become too strongly associated with an initial founder, he would step out of the leadership role. Of course, once on their own, the new course leaders had to experience some of their own "organizational learning." But it is precisely this kind of experience that enables the new leaders to "own" a program. Fortunately, the continuing faculty already had begun using formative feedback regularly and knew the importance of getting together every week to discuss new information about the program and make needed adjustments. As a result, the program is getting better, as demonstrated by student outcomes data as well as by the robust group of faculty participants.

Get training from colleagues elsewhere in the nation
Many IMPULSE instructors stressed the need to attend workshops and be trained in the teaching methods. John Dowd said, "I'd advise them to go to some workshops. Go see some places that have done it and see if you think you can do it." Ray Laoulache emphasized the value of training in collaborative learning methods: "On pedagogical issues, you need to learn how to ask questions for teams, not for individuals; otherwise, you will be in deep trouble." According to Nick Pendergrass, the topic of how to manage the dramatic change process should be addressed in these workshops.

    Nick explained: I wish I had had time to get together with Karan Watson at Texas A&M and learn about the change process before we started. Karan came in six months after we had decided to implement IMPULSE and did a change workshop... She told us what was going to happen. For example, "You are going to have some senior faculty come out of the woodwork and try to stamp it out. It's going to happen." If we had known that, we might not have agonized as much. We would have known that it wasn't us, it was a process that inevitably would have resistance. We still would have moved fast, and pushed people, because we also learned that there is an optimum implementation time. If you go too long, things will polarize and you'll never get there--you'll be in "permanent pilot." So you need to move fast if you want to succeed. But we wouldn't have responded so personally to the resistance.

Make the learning problems evident to departmental colleagues, but be diplomatic!
Another instructor said it is important to get support from department colleagues, and that a good way to do so is to make evident that current methods are not resulting in adequate student learning. The process of making this evident can be dicey, however, as he clearly explains:

    You've got to have support from your department and your college before you do it. One way you can get support is if you can show that current methods aren't working. That's a very tricky thing to do with some people because they take it personally.

      "Can I come in your lecture and give this pre-test and this post-test?"
      "Sure, sure John, come in."

    And then at the end of the semester,

      "Oh! your students only knew 40 percent of the material going in and when they came out they only knew 50 percent. So you only taught them 10 percent. You didn't teach them much at all."

    People don't like that. So you have to be a little diplomatic about it and hope that people realize that it isn't an indictment of them, but of the method that we've all grown up with and used. It's not the best method for teaching a large number of people.

Make adaptations as necessary to suit your students
Several IMPULSE instructors expressed concerns about adopting a purely inquiry-based format (i.e., an environment where students explore science and math with virtually no lectures). Ray (mechanical engineering) felt that this approach can be catastrophic if the instructor does not have the necessary skills:

    I'm afraid that if you take an instructor who doesn't understand this approach, you could have a disaster. And the disaster can be on an exponential rather than linear scale. Why? Because if you don't know how to pull this together, to make that inquiry learning successful, you can have an exponential failure. You do not deliver; the student did not learn the material; it's a catastrophe on both sides.

Some IMPULSE instructors responded to this concern by adapting the purely inquiry-based approach to fit their freshman students, who they believe lack the maturity to take full responsibility for their learning.

    Renate (physics) explained: I would say don't completely go to purely inquiry-based classroom. I know the research shows wonderful results if you use only inquiry-based methods and you're only there as a mentor. It's not my experience. Maybe that'll work at a junior level, but my experience is that students just coming out of high school need more guidance, need more hand holding.

Some IMPULSE instructors adapted by including mini-lectures to provide structure in the inquiry-based environment and to ensure that even if students do not learn some of the basic concepts, they are at least exposed to them. According to Ray (mechanical engineering), leaving students to develop the concepts entirely on their own is too risky: "There is nothing wrong with lecturing, as long as we don't turn a lecture into a movie style presentation where we are completely passive."


Discussion 9. ASEE Conference paper by Pendergrass, Laoulache, & Fowler

"Can an Integrated First-Year Program Continue to Work as Well after the Novelty Has Worn Off?"

by

N. A. Pendergrass, Raymond N. Laoulache, Emily Fowler
University of Massachusetts Dartmouth

From: Proceedings of the 2001 American Society for Engineering Education Annual Conference & Exposition, Session 2453
Copyright © 2001,
American Society for Engineering Education


Abstract

The University of Massachusetts Dartmouth (UMD) began a successful, integrated, first year engineering curriculum in September 1998. This new program dramatically changed the freshman year and was initially very successful. Data from the first year pilot program was very positive. Assessment showed that it

  • more than halved the attrition rate of first-year engineering students

  • nearly doubled the percentage of students passing two semesters of physics on schedule

  • increased the percentage of students passing calculus on schedule by 40%

  • increased performance of students on common final exams in calculus by more than a grade point and a half, despite having a significantly higher percentage of students actually take the final.

By September 1999, the new curriculum had become the required program for approximately 80% of first-year engineering majors at UMD. Expansion produced some unexpected challenges and the paper will show assessment data indicating both positive and negative changes in performance in various aspects of the program. We will give insight into the problems and opportunities that developed as the program grew. We will also describe how assessment provided feedback to help decision making.


I. Introduction

After several years of development, the University of Massachusetts Dartmouth (UMD) began a successful, integrated, first year engineering curriculum in September 1998. This new program was called IMPULSE (Integrated Math, Physics and Undergraduate Laboratory Science, and Engineering). The new curriculum dramatically changed the freshman year because it included

  • integrating multiple subjects

  • teaching and using teamwork among students and faculty

  • using technology-assisted classrooms to accelerate learning

  • using active and cooperative learning[mm]

  • encouraging formation of a learning community of students and faculty

  • using rigorous assessment to evaluate and improve performance.

Forty-eight calculus-ready engineering students began the pilot curriculum in September 1998 and by midterm it was obvious that the program was having a remarkable effect. Only one student had dropped any course and most of the time all remaining students were in every class every day. Details of assessment data will be discussed later in this paper but, after the first semester, results were so positive that a modified IMPULSE program was made the required curriculum for all electrical, computer and mechanical engineering and physics majors.

As a result, IMPULSE was expanded the following year. Eighty-seven calculus-ready students started IMPULSE in September 1999. Then forty-one students started it in January because they had taken precalculus in the fall and could not enter earlier. The same pattern was repeated the third year but with approximately 10% more students.


II. The IMPULSE Pilot Versus the Traditional Program

Table I shows the basic structure of the traditional program for most engineering majors. Each major had its introductory course in the first semester and specified additional unique courses during the first year. Classes typically involved large amounts of straight lectures and, depending on the particular instructors, various amounts of hands-on activity and the use of technology in the classroom.

Table I.
The Traditional Curriculum

 
                                            Credits
Freshman Courses                          Fall     Spring
    Classical Physics I                              4
    Principles of Modern Chem. I, II        3        3
    Critical Writing and Reading I, II      3        3
    Anal. Geom. and Calculus I, II          4        4
    Program Specific                       4-6      2-4
Total Credits                             14-16    16-18

Fundamental Sophomore Courses
    Anal. Geom. and Calculus III            4
    Classical Physics II                    4
 

The pilot IMPULSE first-year curriculum is shown in Table II. We will summarize some relevant points here. Additional details about the new courses and their innovations can be found in previous papers on IMPULSE.[nn][oo][pp]

A fundamental difference between IMPULSE and the traditional program was that it integrated and sequenced nearly all courses carefully together. The integrated courses are marked with an asterisk in the table. Physics was used to motivate and enhance students' intuition for calculus and to allow a calculus-based physics course to be taken at the same time as calculus. An engineering course in each semester was also integrated to motivate learning of science and math fundamentals while providing engineering foundations. Engineering problems were developed that required knowledge and methods from the other courses. For example, calculus was sequenced to provide "just-in-time" development of the mathematics needed for physics and engineering problems. In addition, papers were required in the technical subjects and these were worked on and graded jointly in the English course.

In order to keep students' loads reasonable, the first chemistry course was revised to reduce the number of hours students spent in class. IMPULSE chemistry met three hours per week, had two wet lab experiences and used computer tools extensively for exercises, activities and visualization. Traditional chemistry had the usual lecture classes, recitations and laboratories totaling seven hours per week.

Students in the pilot could not drop any IMPULSE course except chemistry because of the integration of subjects. Chemistry was more loosely integrated so that most of its content was not necessary for the other courses.

Table II.
The IMPULSE Pilot Curriculum

 
                                              Credits
IMPULSE Freshman Courses                    Fall     Spring
    Physics for Sci. & Engr. I, II *          4        4
    Principles of Modern Chem. I, II          3        3
    Intro. to Applied Chem. II                0        1
    Critical Writing and Reading I *          3        0
    Intro. to Applied Sci. & Engr. I, II *    3        2
    Calc. for Applied Sci. & Engr. I, II *    4        4
IMPULSE Total Credits                        17       14
    Program Specific (not integrated)         0        3
Total Credits                                17       17

IMPULSE Sophomore Courses
    Calc. for Applied Sci. & Engr. III        4

(* = integrated course)
 

Another important element of IMPULSE was that students were put in teams of three to four, and these teams were used across all courses in a single semester. Students were taught how to work in teams and to be responsible for their own education and that of their teammates. In order to encourage teamwork and group problem solving outside of class, those IMPULSE students who lived on campus were also co-located in the dorm.

Each subject was taught by a faculty member from that discipline and separate grades were issued in each course. Nonetheless, some assignments were made and worked on in more than one course. Those results were sometimes included in grading for all courses involved.

Integration of subjects required faculty effort during development of the syllabus and throughout the semester. During the pilot, IMPULSE instructors met once per week to coordinate the integration of their subjects. In this way, they could point to material from the other courses and expect students to use it. They could also set up problems so that another course could provide "just-in-time" learning. Weekly meetings were also used to coordinate strategies to resolve student and team performance problems.

The engineering course was the motivating force in the curriculum, and in that sense it was critical to students' perception and activity. In addition, the engineering instructor provided leadership to the program and called meetings and coordinated schedules between subjects as needed. This instructor also served as advisor and counselor to all of the students in the course, taking the lead to ensure that student or team performance issues got attention. Sometimes these issues required coordinated activity with the other instructors; sometimes they could be handled directly.

The Engineering Courses in the Pilot:
The first engineering course emphasized problem-solving skills in engineering mechanics in the fall semester. Students met every other day for a total of five hours a week: two contiguous class hours on each of the first and second days, and one hour on the third day. Occasional special events were also arranged outside class times.

The two-hour classes were used for development of skills in analysis, solid modeling, or design and for integrated problems with Newtonian physics and calculus. The one-hour class was designated for special events such as team building activities or presentations by professionals from industry. Professionals presented on topics such as what engineers do on a day-to-day basis, what research engineers do, how engineers deal with patents and invention, women in engineering, and ethics in engineering. The students showed a profound interest in these topics. During the first pilot semester, eight engineers were invited.

During the semester there were impromptu engineering design projects. That is, students were not notified in advance about the nature of a design problem but were challenged to come up with a design spontaneously in a limited time period. For example, in one early project in a special extended class period, students were told to bring swimsuits but were not told why. In class they were given a limited amount of materials and were asked to design a cardboard canoe. They had two hours to design and build it. That was followed by a competition in the university's swimming pool. All in all, this brainstorm-design-build-compete activity took five hours. The students developed a feel for design, what can fail, and the consequences of failure.

Afterward, the students were asked to work as a team on an improved version of their first design and get ready for a second competition in two weeks. Aside from camaraderie and team building, the students learned first-hand about the importance of understanding and using relevant concepts in their design.

In the second course offered in the spring semester, students met for four hours a week. The course had a mechatronics theme and emphasized development of problem solving skills in AC and DC circuits, electromagnetism, software tools, measurement and controls. The course was structured with two-hour classes on two alternating days. The course involved considerable design with hands-on activity in class but it did not have the extra activities like those that occurred in the first semester.


III. The Expanded IMPULSE Program

Table III shows the modified version of the IMPULSE program that was adopted as the required first year for all students majoring in mechanical, electrical and computer engineering and physics. Three courses remained tightly integrated: physics, calculus and engineering, as marked with an asterisk in the table. Integration of the English course was dropped because some entering calculus-ready students do not pass the placement tests and are assigned to remedial English. That would have prevented some technically competent students from going into IMPULSE in the first semester. Nonetheless, as the program continues to grow, that decision will likely be revisited. Integration of significant writing and oral presentation experiences across the technical courses, with support from an English instructor, provided powerful motivation for students to learn to communicate well.

Table III.
The Expanded IMPULSE Program as Adopted

 
                                              Credits
IMPULSE Freshman Courses                    Fall     Spring
    Physics for Sci. & Engr. I, II *          4        4
    Intro. to Applied Sci. & Engr. I, II *    3        2
    Calc. for Applied Sci. & Engr. I, II *    4        4
    Principles of Modern Chem. I, II          3        3
    Intro. to Applied Chem. II                0        1
    Critical Writing and Reading I            3        0
IMPULSE Total Credits                        17       14
    Program Specific (not integrated)         0        3
Total Credits                                17       17

IMPULSE Sophomore Courses
    Calc. for Applied Sci. & Engr. III        4

(* = integrated course)
 

Chemistry was never integrated fully and its independence was formalized in the expanded program. The first course, however, used cooperative learning and contained other innovations as described above. It was kept the same as the pilot except that its students were not necessarily also in IMPULSE and it developed its own teams. The second IMPULSE chemistry course was changed back to the traditional one because it had a similar basic structure with its associated formal wet lab. It did not appear to be necessary to maintain a separate course in that case.

The traditional courses in physics and calculus continue to be available in parallel with IMPULSE each semester. They are needed by science and math majors. Therefore transfer students and others who already have taken one or more of the three integrated IMPULSE courses have been placed in the traditional courses as necessary. This is also true for students who failed only one or two of the integrated courses in IMPULSE.


IV. The Impact of Rapid Expansion

The IMPULSE program was adopted in late spring of 1999 and it was decided to implement it at full scale in the fall semester of the same year. Such rapid expansion generated considerable stress on faculty and administration to get everything in place for fall classes. Another technology-oriented classroom was constructed so that the engineering, physics, and calculus courses could run for two cohorts for a maximum of ninety-six students. A trailing section in the spring semester would also add another cohort of up to forty-eight. As a result, the number of faculty members involved in the program more than doubled and their other pre-existing teaching schedules forced some creative scheduling of IMPULSE courses.

Since students were scheduled as a cohort for three courses and these were widely separated in time, it became very difficult to do many of the special events. For example, carrying out the cardboard canoe design project became impossible because of scheduling problems.

During the first semester, a common hour was not available for the two different sections of IMPULSE. Fewer professionals could be arranged to speak to each cohort because of the difficulty of scheduling them for both sections.

Most of these problems extended into the second semester of the second year of IMPULSE and into the third year as the program continued to expand for additional cohorts. To resolve some of these issues, two more technology classrooms have been built to handle IMPULSE-style courses, making a total of four in the College of Engineering.

When the pilot of IMPULSE began, all of the instructors involved had worked hard to construct an effective program and they knew a great deal about the methods they were going to use. For example, in preparation they had attended several workshops on active-cooperative learning and using teaming in the classroom. These faculty members met weekly to discuss issues related to management of the program, integration, teaming efficacy, frequency of lecturing, impact of technology on pedagogy, and how to cope with dysfunctional teams.

As the program expanded and more instructors became involved, some with little warning, training and coordination became issues of importance. The added faculty often had little training or experience with active-cooperative learning and teaming. The approach used to compensate was to have the experienced professors mentor the new ones in their subject as the semester got underway. This had varied results depending on the personalities of the people involved. The most effective approach, however, was used by a physics professor who was retiring after the first year. He arranged for his replacement to attend class and assist two or three times during the semester before his retirement. The new instructor experienced the reality of cooperative learning in a team-based class. This was the most effective training model.

In the second year, the integrated subjects were not coordinated together as tightly as during the pilot. Faculty members met less often with those teaching the other integrated subjects for their cohort. In addition, the newest faculty to IMPULSE, and the most inexperienced in cooperative learning, met the least often and were less able to handle student problems and dysfunctional teams.


V. Assessment Results

From the beginning, considerable effort has been put into assessing the IMPULSE program so that good decisions could be made about its expansion or modification. Measurements are being made on an ongoing basis using a variety of devices.

The data summarized here includes results from the first two years of IMPULSE, including the pilot. Previous papers[qq] should be consulted for more details on the pilot program assessment. The information will continue to be updated and included in the conference presentation.

After a study of the factors that correlated with academic performance of first-time-full-time freshman engineering majors from 1997-98, we developed comparison groups. These groups were matched for their calculus placement entrance test score (CP) and high school GPA as follows:

  • IMPULSE I - 48 calculus-ready engineering majors in the pilot, CP=70.4%, H.S.GPA=3.03

  • F'98 control - 42 science, math and engineering majors in traditional courses, CP=69.2%, H.S.GPA=3.01

  • F'97 control - 38 engineering majors in traditional courses, CP=69.2%, H.S.GPA=2.99.

They also matched closely in SAT math and verbal scores. The F'97 control would have been IMPULSE students if the program had started a year earlier.

The F'98 control group contained very few engineering majors. We have made several studies of calculus data from 1997 and 1998 and the analysis supports the use of science majors for comparison groups to assess the performance of engineering students. None of these studies indicates that engineering freshmen perform differently than science or math majors in calculus.

IMPULSE students were randomly selected from the calculus-ready population of first-time-full-time engineering majors. All of those selected started the program.

For comparison the expanded IMPULSE II in 1999-2000 had the following scores:

  • IMPULSE II - 87 calculus-ready engineering majors, CP=64.4%, H.S.GPA=3.16

Calculus:
As shown in figure 1, IMPULSE I and II students scored almost a grade and a half higher than the F'98 control group on 18 common exam questions on the final exam for all sections of the first calculus course. Only 4% of IMPULSE students in each year did not take this final compared to 28% of the F'98 control.

Figure 1

Physics:
Fair comparison with traditional physics courses is difficult. IMPULSE students are the only students who were taking physics during the first semester of their freshman year. Comparison is further complicated because the IMPULSE development caused changes in the way traditional physics classes were being taught. Active learning techniques were first introduced in the spring of 1998, and exercises similar to those in IMPULSE physics were introduced in standard physics courses in the fall of 1998. For these reasons we chose the comparison group:

  • S'97 physics class - 74 students who took PHY 113 that semester (72% were engineering majors, 82% were freshmen)

The Force Concept Inventory (FCI) was used for comparison of learning in first semester physics courses through 1998 and IMPULSE I. This test uses conceptual questions to determine the depth of understanding of Newtonian mechanics. It was given as a pre-test at the beginning of the first semester of physics and again as a post-test at the end. A normalized gain was computed by taking the gain from pre- to post-test and dividing by the maximum possible gain on the post-test (perfect score minus pre-test score).
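Written out, the normalized gain g is simply the ratio described above. The pre- and post-test averages in the worked example below are hypothetical values chosen only to illustrate the arithmetic; they are not IMPULSE data.

\[
  g \;=\; \frac{\text{post-test average} \;-\; \text{pre-test average}}{100\% \;-\; \text{pre-test average}}
\]

For example, a class that averaged 40% on the pre-test and 58% on the post-test (hypothetical numbers) would have g = (58% - 40%) / (100% - 40%) = 0.30, i.e., a 30% normalized gain.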

As shown in figure 2, IMPULSE I students had a normalized gain on the FCI of 30% for the pre-test/post-test pair. In comparison, the S'97 physics class, using traditional methods, had an 18% normalized gain.

Figure 2

Beginning with IMPULSE II, assessment of first semester physics was done with the Force and Motion Conceptual Evaluation[rr] (FMCE), which in many ways is similar to the FCI. Physics faculty members at UMD believed it was more comprehensive than the FCI because it had a larger number of questions and they were more directly related to the laboratory modules used in first semester physics. IMPULSE II students scored a 32.5% normalized gain on the FMCE.

In evaluating the results above, it should be noted that 98% of the students in both years of IMPULSE took the first semester physics final. While the number of students actually taking the final was not recorded for the comparison S'97 class, traditional physics classes typically have from 65% to 85% take the final.

Engineering:
As discussed above, the IMPULSE courses were so different from the department specific courses that no direct comparison of course results was attempted. Assessment of these courses is currently being developed and is directed toward continuous improvement of the program.

Student Success Rates in the First Semester:
As shown in figure 3, students in IMPULSE have consistently earned substantially more credits during the first semester than the control groups. Students in IMPULSE I and II were also more successful in passing the entire first-year calculus and physics sequences on time, as shown in figure 4. Nonetheless, there was a drop in the number of IMPULSE students passing the physics sequence on schedule during the second year of the program.

Figure 3

Figure 4

The larger number of credits earned by IMPULSE students, and their higher success in passing both physics and calculus courses on time, are even more significant given the improved performance on common exams noted above. In addition, IMPULSE students were taking three very difficult courses at the same time - physics, chemistry and calculus - while the control groups took chemistry and calculus but not physics. In the traditional programs, engineering majors typically take physics in their second semester and most science majors take it in their third semester.

Retention:
As shown in figure 5, the retention of IMPULSE students has been substantially improved over those going through the traditional program; however, the one-year retention for the IMPULSE pilot was much greater than for the expanded version. Another notable feature of the graph is that retention of the pilot group dropped fastest between the first and second years. This was after they went through a largely traditional sophomore year.

Figure 5


VI. Discussion

Data from the first year pilot program was very positive. Assessment showed that it

  • more than halved the attrition rate of first-year engineering students

  • nearly doubled the percentage of students passing two semesters of physics on schedule

  • increased the percentage of students passing calculus on schedule by 40%

  • increased performance of students on common final exams in calculus by more than a grade point and a half, despite having a significantly higher percentage of students actually take the final

  • produced performance improvements in physics and English relative to the traditional program.

The second year of IMPULSE saw similar results to the pilot in terms of student success and performance in calculus and physics on common exams. The areas that dropped were the first year retention rate and the percent of students passing the physics sequence on schedule.

The drops in retention rate and in physics success in the second year of IMPULSE may be attributable to changes that occurred in the expansion. We will look for assessment tools, such as exit interviews, to try to get further insight. Without further information at this time, however, we can infer that we should try to return to former practices from the pilot program where it is reasonable to do so. For example, we should:

  • give new IMPULSE faculty members good training in the methods, ideally also having them assist in the classroom in a prior semester

  • meticulously have routine weekly meetings among professors for the same cohort to coordinate on student and team problems and discuss teaching methods and topics

  • schedule a common class time among all IMPULSE cohorts for presentations from professionals, and have lots of such events

  • have team building projects and special events that also have a social aspect to bond students to each other and to the program.

Our experience with IMPULSE so far indicates that an integrated first-year program can continue to work well as the novelty wears off. Nonetheless, in order for that to happen, assessment must be done and effort must be applied to maintain and improve it.

The IMPULSE program, like others that use technology in the classroom and involve integration of subjects, is a highly ordered system. It requires coordination and cooperation and has special issues of scheduling and technology support. We all know that systems will return to a lower ordered state if allowed to do so (the forces toward chaos are relentless!). We believe that the lowest ordered state at a university is when each faculty member independently just lectures from old, worn notes to sleeping students. Our experience with IMPULSE so far indicates that its demonstrated performance justifies the will and effort needed to maintain it as a highly ordered program.


Acknowledgements

The authors especially wish to thank:

  • The Davis Educational Foundation, the National Science Foundation and the Foundation Coalition for their support.

  • Dr. Judy Sims-Knight for assistance in the assessment activity.

  • The IMPULSE team for impressive effort to build an excellent program.

  • Many faculty members, administrators and staff at UMD who have given generously of their time and resources to assist the development, implementation and assessment of the IMPULSE program.

N. A. PENDERGRASS
Dr. Pendergrass is presently Department Chair and Professor in the Electrical and Computer Engineering Department at the University of Massachusetts Dartmouth. He has been a leader in the development and implementation of innovative educational programs and courses, and he has been instrumental in getting the external and internal funding needed to effect significant change. Dr. Pendergrass received a B.S. in Electrical Engineering from the University of Missouri at Rolla in 1967, an M.S. in Electrical Engineering from Purdue University in 1969, and a Ph.D. in Electrical Engineering from the University of California, Berkeley, in 1975.

RAYMOND N. LAOULACHE
Dr. Raymond N. Laoulache is an Associate Professor of Mechanical Engineering at the University of Massachusetts Dartmouth. He received his Ph.D. from Brown University and his M.S. and B.S. from Northeastern University. Dr. Laoulache's research interests are in the area of multiphase fluid flow. He developed the engineering component of the integrated engineering curriculum (IMPULSE). Currently, he serves as the Director of the IMPULSE program.

EMILY FOWLER
Emily Fowler is Assessment Specialist at the University of Massachusetts Dartmouth's Office of Institutional Research. She received an undergraduate degree in psychology from Harvard University and a master's degree in health care administration from the University of North Carolina at Chapel Hill. She now applies both qualitative and quantitative assessment methods to new initiatives and curricular changes at UMass Dartmouth's College of Engineering. She recently served as the Strategy Director for Assessment and Evaluation for the National Science Foundation's Foundation Coalition.


Resources



Resource A. Institutional Context


The University of Massachusetts Dartmouth (UMD), a regional comprehensive university, is one of five campuses of the University of Massachusetts System, the largest university system in New England. UMD began in 1962, when the state legislature created the Southeastern Massachusetts Technological Institute. In 1969, it became a comprehensive university and was renamed Southeastern Massachusetts University. In 1991, the institution joined the University of Massachusetts System and was renamed the University of Massachusetts Dartmouth. At the turn of the 21st century, UMD served some 6,000 students (largely commuters) and had 300 faculty organized into five colleges: the College of Arts and Sciences, the College of Business and Industry, the College of Engineering, the College of Nursing, and the College of Visual and Performing Arts. It offers a Ph.D. degree in electrical engineering and several other Ph.D. degrees in conjunction with other UMass campuses. With a strong teaching mission, faculty carry a teaching load of three courses (9 hours) per semester and are evaluated primarily on their teaching, measured chiefly through student evaluations.

The UMD College of Engineering, which houses the IMPULSE program, has between 70 and 80 faculty members and enrolls roughly 1,000 students. Typical graduation time for undergraduate students enrolled in the College is four and a half years. While UMD is primarily an undergraduate institution, the College of Engineering offers several off-campus master's degrees, some available as far away as the Far East.


Resource B. Methods used to produce this case study


Jean-Pierre Bayard and Marco Molinaro, researchers for the Institute on Learning Technology, conducted interviews and observed labs and classrooms at the University of Massachusetts Dartmouth during spring 2000. At the time of their visit, the IMPULSE faculty and undergraduate teaching assistants were several weeks into the semester. They interviewed the following faculty, who are named throughout the case study: Nick Pendergrass, John Dowd, Raymond Laoulache, Renate Crawford, and Bob Kowalczyk. They also interviewed a group of 12 IMPULSE students, one university administrator, and two computer technicians, and observed IMPULSE classrooms.

The interviews were guided by the protocols used in all the Learning Through Technology case studies and were taped and transcribed. Jean-Pierre Bayard analyzed the interview material and wrote the case study, with editorial help from Susan Millar, Charlotte Frascona, and Sharon Schlegel.


Acknowledgements:
The authors thank the University of Massachusetts Dartmouth faculty, staff, and students who participated in this study. These individuals very graciously responded to our request for their time and attention. In particular, the authors thank Professor Nick Pendergrass for the many hours and the thoughtful attention he dedicated to the improvement of this document.

This case study is based on a deeply collaborative analysis and planning process undertaken by the NISE's Learning Through Technology "Fellows" group.

The Fellows, in turn, benefited substantially from members of the College Level One Team: Andrew Beversdorf, Mark Connolly, Susan Daffinrud, Art Ellis, Kate Loftus-Fahl, Anthony Jacob, Robert Mathieu, and Sharon Schlegel.


Glossary


Assessment -- What do faculty who are experimenting with interactive learning strategies (see constructivism) mean by "assessment"? In the simplest terms, assessment is a process for gathering and using data about student learning and performance. The LT2 web site distinguishes between two types of assessment, formative and summative.


Bricoleur -- A French term for a person who is adept at finding, or simply recognizing in the surrounding environment, resources that can be used to build something he or she believes is important, and at putting those resources together to achieve his or her goals.


Constructivism -- According to Schwandt, constructivism is a "philosophical perspective interested in the ways in which human beings individually and collectively interpret or construct the social and psychological world in specific linguistic, social, and historical contexts" (1997, p.19). During the last 20 or so years, cognitive psychologists (James Wertsch, Barbara Rogoff, and Jean Lave, among many others) have found that constructivist theories of how people construct meaning are closely aligned with their observations of how people learn: knowledge is mediated by social interactions and many other features of cultural environments.


Learning activity -- As used in the LT2 case studies, learning activity refers to specific pursuits that faculty expect students to undertake in order to learn. Thus, "Computer-enabled hands-on experimentation is a useful way to get students to take responsibility for their own learning" is a statement of belief that a particular learning activity (experimentation) helps realize a particular teaching principle.


Learning environment -- According to Wilson, a learning environment is a place where learners may work together and support each other as they use a variety of tools and information resources in their pursuit of learning goals and problem-solving activities (1995). This definition of learning environments is informed by constructivist theories of learning.


Microcomputer-Based Laboratories (MBL) -- A set of laboratories that involve the use of (1) electronic probes or other electronic input devices, such as video cameras, to gather data that students then feed into computers, which convert the data to digital format and which students analyze using graphical visualization software; and (2) a learning cycle process, which includes written prediction of the results of an experiment, small group discussions, observation of the physical event in real time with the MBL tools, and comparison of observations with predictions.
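
As a minimal sketch of the analysis-and-comparison step in an MBL learning cycle, the Python snippet below plots a student's written prediction against probe measurements so the two can be compared directly. Real MBL setups read from electronic probes or video; the "probe" data here are simulated stand-ins, and all numbers are hypothetical.

    import numpy as np
    import matplotlib.pyplot as plt

    # Simulated stand-in for probe data from a falling object (no real probe here).
    t = np.linspace(0, 2, 200)                        # time in seconds
    predicted = 0.5 * 9.8 * t**2                      # student prediction: ideal free fall
    rng = np.random.default_rng(1)
    measured = 0.5 * 9.4 * t**2 + rng.normal(0, 0.02, t.size)   # slower fall plus sensor noise

    # Graphical comparison of prediction and observation, as in the learning cycle.
    plt.plot(t, predicted, "--", label="written prediction")
    plt.plot(t, measured, label="probe data")
    plt.xlabel("time (s)")
    plt.ylabel("distance fallen (m)")
    plt.legend()
    plt.show()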


Seven Principles for Good Practice in Undergraduate Education -- These principles, published in "Seven Principles for Good Practice in Undergraduate Education" by Zelda Gamson and Arthur Chickering, were synthesized from their research on undergraduate education (1991). According to their findings, good practice entails:

  1. Encouraging student-faculty contact.
  2. Encouraging cooperation among students.
  3. Encouraging active learning.
  4. Giving prompt feedback.
  5. Emphasizing time on task.
  6. Communicating high expectations.
  7. Respecting diverse talents and ways of learning.


Teaching principles -- Teaching principles refer to a faculty member's more general beliefs about, or philosophy of, learning. For example, the idea that "students should take responsibility for their own learning" is a teaching principle. It is general and informed by a theory of learning. It does not refer to something specific that one might actually do in a course.


References



Endnotes


  1. See www.dcr.rpi.edu/colclass.html and www.rpi.edu/locker/13/000513/rpi-itesm/Contents.htm for information on the studio classrooms at Rensselaer Polytechnic Institute.

  2. See http://www.foundationcoalition.org/ for more information.

  3. The Force Concept Inventory exam is described at length in the case study featuring physics at Joliet Junior College. See www.wcer.wisc.edu/nise/cl1/ilt/case/joliet/summative_outcomes.htm and www.wcer.wisc.edu/nise/cl1/ilt/case/joliet/RD_pre_post_tests.htm.

  4. See http://www.wiley.com/Corporate/Website/Objects/Products/0,9049,39944,00.html.




a. See Resource B. Methods Used to Produce this Case Study.

b. Bob Kowalczyk (mathematics) explained, "The main issue for the College of Engineering was retention. I know that in the traditional program, at the end of the first semester, anywhere between 30% and 40% of the students wouldn't complete my calculus course. They'd either withdraw ahead of time or they'd fail the final. And then during the second semester, of those students who passed, you'd lose maybe another 30 percent or so. So there was a pretty high dropout rate."

c. Pointing to Figure 1, Nick Pendergrass explained, "This graph shows the attrition rate of engineering majors. And by the way, it doesn't matter whether you present these data as the percent who stay in engineering or as the percent who stay in science, engineering, and math. They weren't leaving to go to math. They weren't leaving to go to physics. They were leaving science. And that's a tragedy, when you get right down to it."

d. As Pendergrass explained, "Look at the blue dots [in Figure 2]. We were worried about that. We actually went back to find out if they just get lured away to another university before they started in the program here. No, they didn't. They actually enrolled here as students. They just flew out of the system before the end of the first semester---dropped everything, or got to the next semester and dropped everything. We were dismayed that there were people who were clearly capable--people in the lower right quadrant--who were simply dropping out. So this tells you, at least, in light of one measure, that we were not weeding out the least capable. And these data correlate with the typical complaints we got in exit interviews: `The course was boring and it was poorly taught. I couldn't figure out how to get through it. I couldn't connect with anybody.'"

e. Referring to the attrition rate, Pendergrass stated: "But our program, like many programs around the country, had become almost proud of this attrition. People could say, `See, it's a good program. Look how many people can't make it. We are doing our jobs. We are weeding out the chaff.' But I can show you another curve, a scatter-plot of SAT versus GPA. Or I can show you SAT Math versus cumulative quality points [see Figure 2]. And you get this snowstorm plot. You can see these dots where people dropped out of the program. And you look at their SAT scores, and find that many students with high SATs dropped out of the program. SATs are probably a fairly noisy measure of capability, but if somebody has done really well on the SAT, they should be capable of doing this program. So if they leave, what happened?"

f. Illustrating this point, Nick said that in pre-IMPULSE times, "We were doing the standard lecture and lab thing, with 80 students in a lecture hall. Half of them weren't there. That's very common [around the country]. When we started looking at ways to fix this attendance problem, we went to RPI, Texas A&M and Arizona State. At RPI, a physics professor looked at me and said that what they had seen was that half the students weren't coming to the lectures, and more than half weren't coming to the recitations and the labs. It was clear that those who did show up weren't particularly motivated. So what do you do?"

g. As Bob Kowalczyk (mathematics) explained, "I had students in my traditional classes with 700s on the math SATs. They were obviously very bright, but they had trouble, maybe, adjusting to college life. They start attending classes and then they say, `Oh, I had calculus in high school; I know all this stuff, why bother coming to class.' Before you know it, they were digging a big hole. They'd come to the first test and flunk the test. You try to wake them up, you bring them into your office, but it is too late."

h. Raymond Laoulache (mechanical engineering) commented: "One thing I've found also is that when you ask the students to read, even if you do a writing assessment test the next day, they don't necessarily read. They don't care about writing an assessment test, whether they get a grade of 0 or 4. So these are the real issues we deal with on a daily basis."

i. Pendergrass: "Students would come to my door to sign out of engineering. I'd say, `Well, gee, how come?' They'd answer, `Well, I thought I was going to be an electrical engineer, but I'm in this physics course and if that's what electrical engineering is, I just don't want to do it.' Well, that's not electrical engineering, for heaven's sake. They go from DC and Ohm's Law to Schrodinger's wave equation in a very short time. Students just weren't getting turned on to engineering." See also footnote e.

j. According to Nick, it was not that the content of these foundation courses was obsolete or inadequate for engineers per se; rather, it was that students often did not realize until much later that these courses were providing a fundamental part of their engineering knowledge base. "Students," he explained, "tended to just learn what they needed to get through the next exam; they didn't see the reasons why they needed to know the content," or bother to attend class on a regular basis. A junior in mechanical engineering, a member of the curriculum committee, commented that if only he had known he was really going to need calculus, he would have studied it better. Thus, a group of faculty decided that if they could motivate students to want to know the content, they would make better progress.

k. Renate Crawford explained: "In physics courses, you want students to be able to see the big picture. You don't want them to see physics as a bunch of formulas where you have to just plug numbers in... [Physics] is a whole picture and way of looking at the world that you have to get the students into. I mean, take the idea of building models...You build these very simplified pictures of the world. There's no friction, there's no air resistance and the cannon ball flies a perfect parabola and all that sort of thing. We have to get students to realize that it's not all baloney, that it's not the physics world and their world. Students say, "What answer do you want? I've read this. Do you want how things really are, or how they're supposed to work in physics?"...Students live in a world of friction. They don't live in this idealized world. If you're not careful, you can get the students to think that physics is a bunch of formulas that don't really apply to the real world and that they have to learn to get through this damn course in order to become engineers."

l. For a number of articles and books that provide evidence of the effectiveness of inquiry-based small-group learning, see the References.

m. A learning environment is a place where learners may work together and support each other as they use a variety of tools and information resources in their pursuits of learning goals and problem-solving activities (Wilson 1995).

n. A French term for a person who is adept at finding, or simply recognizing in their environment, resources that can be used to build something she or he believes is important and then putting resources together in a combination to achieve her or his goals.

o. Bob Kowalczyk (mathematics) commented on the lack of compatibility between software platforms, and gave a hint about students' comfort with Maple: "I'm using Maple in the IMPULSE Program. Once in a while, I'll bring in a Macintosh computer and use a program called TEMATH. A colleague and I wrote this software, a very visual and dynamic software package that we use a lot for modeling; you can easily input data and mathematically model the data. Unfortunately the software was written just for the Macintosh. The engineers are all using PCs, so we can't use TEMATH extensively. So for everyday use, it's Maple. However, the students are sometimes more comfortable using their graphing calculators. When you ask them to graph something, rather than use Maple, they use their calculators. So I let them use whatever technology they are comfortable with. I find they use their graphing calculators in addition to using Maple."

p. Nick Pendergrass explained: "They have to do verbal presentations. What we've discovered is that if you don't do that, then you've got two or three people who just say, `Well, let them do it.' And that responsible person does it. We have seen division of labor. But it's interesting that it actually leads to higher success rates in the performance of students on common exams. That was a frequent occurrence, even when we had all five courses integrated together. The person who was a better writer tended to lead the group on the writing stuff. The person who was really a good chemist tended to do some of the pre-work on all the homework and get the idea of how it ought to go. The instructors worried a lot about that during the first semester we were in this. `Look what's happening. You know, that person isn't facing that homework blindly. They're getting a lot of help from this other student before they get into it.' But the net result was that performance for all of them went up--both for the better students and the under-performing students."

q. John Dowd said, "So what happened is that both of our sections performed similarly, which was amazing to me. But they didn't do as well as the full Workshop Physics in other places using six hours. On the other hand, I talked to Priscilla, who said it takes two or three years just learning the process. So while we've made a big jump, we're still at the beginning of this. You want to move yourself over to here (center of red curve in Figure 4). I'm hoping that as time goes on, that kind of improvement will be made."

r. Bob Kowalczyk explained, "We wanted to assess the effectiveness of calculus and so at the beginning of the program we tried to come up with a control group that closely matched the students in the IMPULSE section. Now the problem was that there were 53 engineers who passed our calculus placement exam, which placed them as calculus ready. We wanted to put 48 of those engineers in the IMPULSE program. So that only left 5 engineers for a control group, which meant we had to use scientists for the control group, you know, physics majors, chemistry majors, math majors, computer science majors, and the remaining engineers. We did some cluster analysis and we tried to match the characteristics of the 48 engineers to 48 science students. We matched them in terms of high school GPA, SAT scores, and our calculus placement exam and came up with a control group. If you look at the average GPA scores, SAT scores, and the calculus placement exam scores, the two groups' averages were within decimal places of each other. So we felt that the groups were pretty evenly matched. Of course the engineers said we had no control group at all because the control group didn't have engineers. Well, there were no engineers to pick from. So we did the best we could do."
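
As a hedged sketch of how such a matched control group might be assembled, the Python snippet below pairs each IMPULSE student with the closest available comparison student on standardized high school GPA, SAT, and placement-exam scores. It uses one-to-one optimal assignment rather than the cluster analysis the team actually performed, and all names and numbers are hypothetical.

    import numpy as np
    from scipy.spatial.distance import cdist
    from scipy.optimize import linear_sum_assignment

    def match_control_group(impulse, pool):
        """Pair each IMPULSE student with a distinct comparison student,
        minimizing total distance on standardized matching variables."""
        combined = np.vstack([impulse, pool])
        mean, std = combined.mean(axis=0), combined.std(axis=0)
        cost = cdist((impulse - mean) / std, (pool - mean) / std)  # pairwise distances
        _, matched = linear_sum_assignment(cost)                   # optimal one-to-one pairing
        return matched                                             # indices into the pool

    # Hypothetical data: columns are [high school GPA, SAT math, placement score].
    rng = np.random.default_rng(0)
    impulse = rng.normal([3.2, 620, 70], [0.3, 60, 10], size=(48, 3))
    pool = rng.normal([3.1, 600, 68], [0.4, 70, 12], size=(120, 3))
    control_group = match_control_group(impulse, pool)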

s. Bob Kowalczyk said, "For example, we have some questions on limits. Now, in the IMPULSE program, we use the Harvard Reform calculus book, which tends to do things in a less rigorous manner, especially topics like limits. It treats limits very intuitively; you'll build tables with numbers and you get a graph; where in the standard course, we have all these theorems and limits; the limit of the sum is the sum of limits, and you go through all of this, spending days doing all of these limits. In IMPULSE we sort of do that intuitively and then we build on that idea of a limit to do derivatives in calculus. The students in the standard section have seen lots of limits problems, and they did well. In IMPULSE, we didn't do that much with limits, so the IMPULSE students didn't do as well in that section. But then, take a problem like this one [shows interviewer something]: "The price of a certain baseball card in 1990 can be approximated by a quadratic function from the date of purchase. Calculate the derivative of this function at some value; write a sentence interpreting the meaning of the derivative in practical terms." The IMPULSE students wiped out the standard students. The standard students could calculate any derivative you'd give them, but if you asked them, "Well, what does it mean if I calculate the derivative of this function at the point x=2? What does that number mean?" No clue. But the IMPULSE students could interpret it well, noting that the value of the card is increasing at this rate, at that point in time. So maybe the more interpretive application type problems are more aligned with what we're doing in IMPULSE."
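
As a hedged illustration of the interpretive problem Kowalczyk describes (the actual card-price function is not given in the case study, so the quadratic below is a made-up stand-in):

    \[
      P(t) = 25 + 3t + 0.5\,t^{2} \qquad \text{(price in dollars, } t \text{ years after purchase in 1990)}
    \]
    \[
      P'(t) = 3 + t, \qquad P'(2) = 5
    \]

In practical terms, P'(2) = 5 says that two years after purchase the card's value is rising at roughly five dollars per year, which is the kind of plain-language interpretation on which the IMPULSE students did well.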

t. Bob Kowalczyk said, "This is one group of students, one test; we have some data, and you have to keep building on that. Whether that proves that IMPULSE is better than the standard sections would be difficult to claim, especially as a statistician. One sampling does not substantiate that finding; but all the data we've taken is positive. There hasn't been really anything that says the IMPULSE students are doing worse than the standard students."

u. Nick explained how he took advantage of both his faculty and administrative positions to get things done for IMPULSE: "I found that using the title of Associate Dean for Undergraduate Programs outside of the College of Engineering was important to get things done because the organization is sensitive to who's asking. There is, whether we like it or not, a kind of caste system. And so, outside the organization, I needed a title. But inside the organization, I don't use the title much. I've found that being a faculty member is very important in trying to make changes within the university. I would rather be one of the troops, to try to get things done and only on occasion did I choose to use the Dean title to write a letter or provoke an action that would encourage people to take the program seriously."

v. An example of an instructor who was already using active learning methods is John Dowd (physics), who explained: "I taught freshman physics on and off for quite a while, so I was aware of a lot of the problems. And I had been interested in interactive learning, just from reading literature in physics. In fact, before the IMPULSE program, I started to give the freshmen in my class, and in some other freshman lecture classes that other professors agreed to, entrance and exit exams, like the Force Concept Inventory, to establish a baseline to see how well we were doing with the lecture model, as a department, versus other places. And of course, we were doing just as well, or as poorly, as everybody else."

w. An example of this type of problem that the IMPULSE program encountered is small parts that are defective and not covered under the manufacturer's warranty: "Support for on-going problems is almost nonexistent...On the new group of computers that we got, these beautiful P3500s, they have new graphics cards in them, 3DFX cards, because that's what they're shipping with Gateways now. We don't need that, but that's what they're shipping. Well, they run very hot. I've had at least eight burned out during the last semester, which means that the computer is not usable. So I take the computer out of the classroom, I send it down to the technician, who services all the computers. I know the fix for this. The fix is to buy these video coolers. They cost $18 apiece. $18 times 25 isn't an unreasonable sum, but when I asked to have that purchased, the answer is, 'sorry, we don't have any funds.'"

x. Nick explained: "We did not buy laptops. There was a reason. You go to Texas A&M, you pick up a laptop that's been in one of those classrooms for a year, and things rattle around inside it. Probably the display won't stay up by itself anymore. When one of those breaks--and they are going to break--you sort of throw it away and go get another one. And we have discovered that if we have a really ugly cabinet and it doesn't look new, nobody steals it. They don't know that it's a 600 Megahertz Pentium inside there. So in the oldest room, which is now about four years old, we have older machines. We can bring them up within one step of the latest generation and put a 20 Meg hard drive in there for $500. So it costs us $500 for the equipment plus the technician time, which isn't really technician time since it's a work-study team, trained by our technician, and we're not even paying for it. So it's not expensive to keep these up, but it depends upon how you set up the process."

y. A technician commented: "As for the computer hardware, I don't know what the long-term plan is. People don't tend to adequately estimate the costs of computers. Take a look at a company like Boeing: at Boeing they estimate it costs $4,000/year to put a PC on somebody's desk. Microsoft estimates $6,000-7,000/year. The reason is they keep up with the latest hardware, and they factor in the cost of the hardware, the software, and the professionals that keep those computers running on a daily basis. We sort of use a group approach on a lot of the problems that we have associated with computers. I don't try to fix individual computers. We create a backup on the server of what the ideal computer looks like. If something is wrong with a computer, we just Ghost it and bring the image back on. So I wipe out everything on the machine, restore it, give it back its TCP/IP identity, and it's set and ready to run again. But if we want to update from AutoCAD 3.0 to AutoCAD 4.0, or from MathCAD 3.0 to MathCAD 4.0, then I need to do the work on individual computers. I need to move it up onto the server, and then I need to move the image down from the server onto all of the individual computers. That's very difficult and very time consuming. They've booked the lab very, very tightly, to get the maximum utilization out of the lab, but it's gotten better as we've put additional labs up. But as far as cost is concerned, how long are we going to be able to run the latest version of AutoCAD? Maybe another year, maybe two? I don't know where we'll get the money for future updates."

z. The stress on the computer support staff was evident in talking with the individual whose job is to support the faculty in the physics department. He told us, "Now, when a faculty member comes to me with a request, I'm much slower in servicing my own department because of all the computers I'm taking care of in the IMPULSE labs. That's a great disservice to the department and I feel very badly about it. I'm constantly being hounded by the IMPULSE professors. I keep going back and forth: I have to ignore IMPULSE for a while because a faculty member needs his computer rebuilt, or I ignore a faculty member who might have a minor computer problem because I'm servicing IMPULSE. So, from my standpoint, it's a net loss to the department because they didn't hire anyone in addition, they just sort of said, 'Oh, well, you can do all of this,' and then they kept adding on to it and on to it."

aa. Nick explained, "We had to move to adopt the program or potentially stay in the experiment stage forever. In other innovative programs what often happened was that you could watch the program start to polarize the faculty. As data started to come in, you could hear people say, `I don't think that program is any darn good. You know, I talked to this student who said it's no darn good.' Other faculty would find something good about the program and you could see it polarizing. At some point in time, the middle ground where compromise could have been found disappears. I think that would probably be sometime in the middle of the second year of the experiment. I think that happened at the University of Alabama and at Rose-Hulman. Karen Watson kept it from polarizing so much at Texas A&M. So if one of these programs goes for two or three years as a pilot, the evidence seems to indicate that it is going to continue to polarize and lock. Then it is very likely to stay an experiment."

bb. Bob Kowalczyk explained, "For a new colleague, participation would be positive, because they would be trying to do something innovative--new pedagogical ideas, integrating technology--and that seems to be pretty heavily weighted now in instruction. The math department considers the use of technology in teaching a good thing. It's pretty progressive. There are a few people in the background who might not use technology as much or still think that every theorem in the book needs to be proved, but in general our department is pretty supportive of innovative pedagogy."

cc. When asked if he would worry about participating in a program like IMPULSE if he were not tenured, or were new to the institution, an engineer said, "Oh, you bet... I would not participate unless I had the blessing of the department."

dd. A technician commented: "As far as the actual professors are concerned, I hear a lot of grumbling from individual physics professors because they have to redo their entire curriculum, okay. This is a totally new way of teaching and they don't necessarily get a lot of release time to develop this whole new curriculum. Once you've run through once, you don't need as much release time, but it still isn't like a course that they've taught many times, that they can just have an hour of preparation and then they're fine for it. They have to revise the things that didn't go right the first time. So it's difficult for the professors."

ee. Bob Kowalczyk elaborated as follows: "The teaming part, I'm still working on. As an instructor I'm finding that it is the most difficult part of this program. You'd expect if you had four students sitting across from each other like this and you give them a project or problem, they'd automatically start talking to each other, solving it. In calculus you give them a problem, and everybody goes off and tries to solve the problem by him or herself. So you are thinking that there's not that much collaboration but then if one student is not getting it and another student is getting it, they'll start talking and I think that's where teaming comes in; there's more of a tutoring going on in the teams, where they are sort of helping each other out to try to understand the problem. But I'd like to see even more, more teaming work with the students, more working together."

ff. Bob Kowalczyk (mathematics) observed: "Students are much more active in this class. In the traditional class, you go there and everyone is sitting in their own chair and maybe they'll talk to their neighbor or whatever. If you ask a question, you may get some responses. But here people are used to talking to each other, so if you ask questions, it's nothing new for them to answer your question, or for them to ask questions themselves. I see that there's a lot more dialogue between the professor and the student. In that sense for me it's a lot more exciting."

gg. This technician elaborated on this point as follows: "Well, it's a totally different atmosphere when you have 48 students and you see them through multiple courses... In the traditional lecture style, a lot of the professors do not take attendance; it's almost pointless. So the only time they realize that a student may be having problems is after one or two exams, and that student might just be a social security number, a name, and a bunch of grades in a grade book. So, in that sense, faculty in the traditional class are totally removed from students. And yeah, the professor can say, `Listen, you're having trouble, you can take advantage of the science and engineering tutoring center,' or when the exams are passed back, talk individually with the student, saying, `I'd like you to come to my office to meet with me about that;' and then tell them the advantages of going over to participate in the peer-to-peer tutoring through the science and engineering tutoring center. But this is much more readily done when they're in the classes, doing this collaborative learning, and you can simply pull them aside or say you want to meet with the entire group. It's much, much easier."

hh. Johnson, D., Johnson, R., Active Learning: Cooperation in the College Classroom, Interaction Book Co., 1998.

ii. The Davis Foundation provides grants to institutions of higher education in the New England area in support of educational improvement efforts.

jj. For this pilot group, they included English and chemistry courses, for a total of 17 IMPULSE credits. But when they scaled up to the full first-year course, they had to address questions raised by students who do not place into calculus and/or introductory English. They decided to drop the English requirement, and to offer a spring-starting "trailer" section of IMPULSE to students who took pre-calculus in the fall. They also decided to drop the chemistry course because its curriculum was not integrated with the other courses. (For a full description of the pilot IMPULSE course and other topics, see Discussion 6, which consists of a paper presented by Pendergrass, Laoulache, and Fowler at the 2001 Annual Meeting of the American Society for Engineering Education.)

kk. The effectiveness of the process that Dowd used to hand-off his course to Crawford is affirmed by research on faculty adaptation processes. See Penberthy & Millar, 2002.

ll. The effectiveness of the process that Dowd used to hand-off his course to Crawford is affirmed by research on faculty adaptation processes. See Penberthy & Millar, 2002.

mm. Johnson, D., Johnson, R., Active Learning: Cooperation in the College Classroom, Interaction Book Co., 1998.

nn. Pendergrass, N. A., Laoulache, Raymond N., Dowd, John P., and Kowalczyk, Robert E., "Efficient Development and Implementation Of An Integrated First Year Engineering Curriculum," Proceedings of the Frontiers in Education Conference, Tempe AZ, November 1998.

oo. Pendergrass, N. A., Kowalczyk, Robert E., Dowd, John P., Laoulache, Raymond N., Nelles, William, Golen, James A., and Fowler, Emily, "Improving First-year Engineering Education," Proceedings of the Frontiers in Education Conference, San Juan, Puerto Rico, November 1999.

pp. Kowalczyk, R. and Hausknecht, A., "Using Technology in an Integrated Curriculum Project IMPULSE," Proceedings of the 11th Annual International Conference on Technology in Collegiate Mathematics, Addison-Wesley Publishing Co., 1999.

qq. See references in previous footnote, plus: Hestenes, D., Wells, M., and Swackhamer, G., "Force Concept Inventory," The Physics Teacher, 30 (3), 141-158, 1992.

rr. Thornton, Ronald K., and Sokoloff, David R., "Assessing Student Learning of Newton's Laws: The Force and Motion Conceptual Evaluation and the Evaluation of Active Learning Laboratory and Lecture Curricula," Am. J. Phys., 66 (4), 338-352, April 1998.
