
Learning Across Disciplines

Assessment of Learning

Why? Discover how students learn and use the resources we develop, and how to adapt the resources to improve learning. Measure the effectiveness of given resources and strategies for transferable products. Obtain peer review of the final products.

 

Lessons from Current Practice: Georgia Tech's Center for the Enhancement of Teaching and Learning (CETL) conducts web-based surveys in all courses, similar to surveys used at many universities. The survey centers on the instructor, and its key metric is the "Teacher Effectiveness Rating", so the subject of the survey is the instructor's popularity far more than the learning that actually occurred in the course. In recent years, student participation in this survey, as in similar institutional surveys across the nation, has become erratic, and the sample is often too small to be useful. These surveys will nevertheless continue to be conducted in every course. Table 3 lists lessons learned from using these surveys, and our approach to applying the lessons.

 

Table 3: EXTROVERT assessment techniques compared to standard course-instructor evaluation

| Present course evaluation process | EXTROVERT approach |
| --- | --- |
| Focused on instructor popularity | Focus on learning effectiveness |
| Anonymous survey devalues thinking | Learners asked to participate as team members; open-ended questions seek new thinking |
| Web-based surveys eliminate the need for repetitive signatures and disclaimers, and reduce overhead | Adopt web-based survey creation resources accepted in the assessment community |
| Low participation in end-of-course surveys, especially web-based ones, given the lack of reward or recognition for the learner's time and effort | Engage the participant by integrating the survey into modules throughout the course; end-of-course results are derived from the learner's evolution through the semester |
| Customer-service focus weighted towards complaints, driving toward minimum change and surprise | Encourage initiative; recognize the value of dealing with innovative thinking |
| Focus on popularity discourages excellence | Reward excellence in learner value addition |

 

The Requirements Definition for the assessment plan is more extensive. One requirement is approval by the Institutional Review Board, which checks the plan against the criteria of the Belmont Report. A second is that the plan must also support the institution's overall assessment practices and the ABET (accreditation board) review process. While these are important from the institution's point of view, the sustainability of the assessment plan requires buy-in from instructors and users (learners): in practice, the assessment effort must be integrated into the learning process to be seen as valuable. We include these requirements in the design of the plan.

 

Recent work on assessment related to multidisciplinary learning provides several excellent ideas. Richerson and Suri discuss metrics for ABET Criterion 3, which concerns graduates' ability to function on a multidisciplinary team. They relate experience from a collaboration between bioengineering and software engineering students, tracking the evolution of the team members' opinions of each other and of the team process over several weeks. Ocon shows how to promote creative thinking through teamwork, and gives brief survey items for assessing creativity in student evaluations. Ball and Flowers give technology assessment techniques that also apply to assessing the progress of the EXTROVERT system; their work also validates the importance of integrating policy and economics into the multidisciplinary project environment. Bilen et al. quantified the impact of hands-on space-related projects on career choices, from anecdotal evidence and responses to questions on a graduating senior project.

 

Assessment strategies specific to this program:

Metrics for this program were developed after an expert panel consultation involving the team members, who between them account for nearly 50 teacher-years in large and small engineering classes at some of the most demanding institutions in the world. Table 4 summarizes them; a sketch of tabulating scores against these metrics follows the table. Assessment will be performed against each metric for each of the Tasks and products discussed in the following pages.

Table 4: Metrics

Metrics for Learning:
  1. Depth of understanding of material (from tests and projects)
  2. Ability to grasp key concepts in new fields (from assignments)
  3. Ability to use contributions of others in team work (from team project segments)
  4. Ability to make and refine estimates for concept development (from assignments)
  5. Ability to formulate problems in quantitative terms from customer requirements (from projects)

Metrics for Resources:
  1. Depth
  2. Accuracy
  3. Lucidity
  4. Breadth
  5. Learner adaptation

Metrics for Learning Methods:
  1. Ease of access
  2. Intensity of challenge
  3. Relevance to work
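To make the use of Table 4 concrete, here is a minimal sketch, in Python, of tabulating rubric scores against the learning metrics for one task. The metric names paraphrase Table 4; the 1-5 rubric scale, the function name, and the sample data are hypothetical illustrations, not part of the EXTROVERT specification.

```python
# Minimal sketch of tabulating assessment results against the Table 4
# learning metrics for one task. The 1-5 rubric scale and all sample
# data are assumptions for illustration only.
from statistics import mean

LEARNING_METRICS = [
    "depth_of_understanding",     # from tests and projects
    "grasp_of_new_concepts",      # from assignments
    "use_of_team_contributions",  # from team project segments
    "estimation_and_refinement",  # from assignments
    "quantitative_formulation",   # from projects
]

def task_summary(scores: dict[str, list[float]]) -> dict[str, float]:
    """Average the rubric scores collected for one task, per metric."""
    return {metric: round(mean(vals), 2)
            for metric, vals in scores.items() if vals}

# Example: rubric scores gathered for one course module / task.
module_scores = {
    "depth_of_understanding": [4, 3, 5, 4],
    "grasp_of_new_concepts": [3, 3, 4],
    "quantitative_formulation": [4, 5],
}
print(task_summary(module_scores))
```

The same tabulation would be repeated for each task and product, giving a metric-by-task matrix that can be tracked over the life of the program.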

Recently, we surveyed the relationship between course assessment and grade-inflation problems as seen by faculty around the nation, as well as current techniques for addressing the issues. Our own earlier work has shown that when students are asked to write, in their own names, about how they learn and what they found useful, they provide excellent and thoughtful information. However, we had found no way to incorporate this into quantitative metrics through a process that would satisfy the criteria set by the assessment offices.

 

In this project we propose a practical solution: assess learning at the end of each module. This merges the data-collection function with that of identifying needs for improved comprehension, in near-real time, and provides feedback to both the instructor and the learner. Professor Saleh has tried this in his courses, both in large undergraduate core courses and in smaller graduate electives. Samples are shown in Figures 1 and 2; a sketch of turning such ratings into a per-topic signal follows Figure 2.

 

 

 

TOPICS COVERED DURING LECTURES

(Ratings of quality of coverage, pace, and self-assessment: How well did you know this topic? How well do you now know the topic?)

  • Mechanical systems modeling and dynamics
  • Time domain analysis
  • Frequency domain analysis
  • Introduction to state-space modeling

Comments / suggestions:

Figure 1: Student evaluation of topic coverage used in AE3515, Systems and Controls

HOW USEFUL WERE THE FOLLOWING?

  • (Standard commercial math analysis/simulation package)
  • Review of analytical tools (ODEs, Laplace transforms, complex variables)
  • Evening review sessions
  • Problem sets
  • Overall subject content and organization

Other stuff that helped you learn the material. Please specify:

Figure 2: Student evaluation of resources used in AE3515, Systems and Controls
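As an illustration of how the Figure 1 ratings could feed the near-real-time feedback loop described above, here is a minimal sketch that converts the pre/post self-assessment ratings into a per-topic learning-gain signal. The topic names come from the survey itself; the 1-5 scale, the flagging threshold, and the function name are assumptions for illustration.

```python
# Minimal sketch: convert pre/post self-assessment ratings from a
# module-end survey into a per-topic learning-gain signal, flagging
# topics that may need improved coverage. Scale and threshold assumed.
from statistics import mean

TOPICS = [
    "Mechanical systems modeling and dynamics",
    "Time domain analysis",
    "Frequency domain analysis",
    "Introduction to state-space modeling",
]

def topic_gains(responses):
    """responses: list of dicts mapping topic -> (pre, post) ratings, 1-5."""
    gains = {}
    for topic in TOPICS:
        pairs = [r[topic] for r in responses if topic in r]
        if pairs:
            gains[topic] = mean(post - pre for pre, post in pairs)
    return gains

# Hypothetical module-end responses from two learners.
responses = [
    {"Time domain analysis": (2, 4), "Frequency domain analysis": (1, 2)},
    {"Time domain analysis": (3, 5), "Frequency domain analysis": (2, 2)},
]
for topic, gain in topic_gains(responses).items():
    flag = "  <- needs improved coverage" if gain < 1.0 else ""
    print(f"{topic}: mean gain {gain:+.1f}{flag}")
```

Because the survey runs at each module boundary rather than once at the end of the course, a low gain can prompt remediation while the course is still in progress.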

 

In addition to the mandatory course evaluation mentioned above, we will conduct the following, with approval from the Institutional Review Board. We propose to use one of the existing web-based survey development interfaces, such as www.surveymonkey.com, for this purpose; a sketch of retrieving results from such a service follows.
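As a sketch of how results might be pulled from such a service programmatically: SurveyMonkey publishes a v3 REST API with bearer-token authentication. The endpoint paths below follow that API's documented conventions, but the token handling and response field names should be treated as assumptions to verify against the current documentation before use.

```python
# Minimal sketch of pulling survey data from a web-based survey service.
# Endpoint paths reflect SurveyMonkey's published v3 API; verify against
# current documentation before relying on them.
import requests

API = "https://api.surveymonkey.com/v3"
ACCESS_TOKEN = "YOUR_TOKEN_HERE"  # issued per-app by the service
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

def list_surveys():
    """Return (id, title) pairs for surveys owned by this account."""
    resp = requests.get(f"{API}/surveys", headers=HEADERS, timeout=30)
    resp.raise_for_status()
    return [(s["id"], s["title"]) for s in resp.json()["data"]]

def survey_responses(survey_id: str):
    """Fetch the full response payloads for one survey."""
    url = f"{API}/surveys/{survey_id}/responses/bulk"
    resp = requests.get(url, headers=HEADERS, timeout=30)
    resp.raise_for_status()
    return resp.json()["data"]
```

Using an established service keeps the survey instrument within tools already accepted by the assessment community, per the EXTROVERT approach in Table 3.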

 

 

 

Development of Transferable Products

Why? This is the process of developing lasting products from this project, beyond the web-based learning resources themselves.

 

Richerson, S., Suri, D., "Strategies of Assessing Multi-Disciplinary Collaborative Experiences". Paper AC 2008-236, ASEE National Conference, June 2008.

Ocon, R., "Teamwork and the Creative Process: Promoting Creative Thinking Through Teams". Paper AC 2008-879, ASEE National Conference, June 2008.

Ball, M., Flowers, J., "Technology Assessment: A Graduate Course to Build Decision-Making Skills". Paper AC 2008-285, ASEE National Conference, June 2008.

Bilen, S., Schuurman, M., Brown, L., Wheeler, T., Urbina, J., "Addressing Aerospace Workforce Needs: The Impact of Hands-On Space Systems Project Experiences on Career Choices". Proceedings of the ASEE National Conference, Pittsburgh, PA, June 2008.

Komerath, N.M., "Excellence or Disaster? A Thought Experiment on Grading, Teaching and Learning in Engineering School". Proceedings of the ASEE Annual Conference, Pittsburgh, PA, June 2008.

Craig, J., "Teaching Undergraduate Aerospace Engineering Students to Communicate Effectively". Paper 2006-1943, ASEE National Conference, June 2006.

 

 

What type of Learner are you? Try any of the following; see below for explanations, and a sketch of routing by learner type after the list.

1. Astronaut: A thorough, sequential presentation of notes. Systematic, guided approach, with logical but often simplified presentations of the material.

2. Eagle: A bird's-eye view of the entire subject matter through lists and site-maps, with the freedom to pick precise items quickly.

3. Barnstormer: A problem-solving based approach to the subject matter: web-based "calculators", on-line computer programs and simulations, and databases. Trial and error expected.

4. Rocket Scientist: Hyperlinked derivations of the material, with no-holds-barred usage of mathematics and a rigorous graduate-school-level presentation of theory.
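As a minimal sketch of how the interface might route a learner to the matching presentation of a topic: the four types and their descriptions come from the list above, while the path names and function are hypothetical illustrations.

```python
# Minimal sketch: route each of the four learner types to a different
# presentation of the same topic. Path names are hypothetical.
PRESENTATIONS = {
    "astronaut":        "notes/sequential/",   # thorough, guided notes
    "eagle":            "sitemap/",            # lists and site-maps
    "barnstormer":      "calculators/",        # problem-solving tools
    "rocket_scientist": "derivations/",        # rigorous hyperlinked theory
}

def entry_point(learner_type: str, topic: str) -> str:
    """Map a learner's preferred style to the matching view of a topic."""
    base = PRESENTATIONS.get(learner_type.lower())
    if base is None:
        raise ValueError(f"unknown learner type: {learner_type!r}")
    return base + topic

print(entry_point("Eagle", "frequency-domain-analysis"))
```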
 


Acknowledgements: "STS_Land" and "Wright-Flyer" from image files supplied by Deneba Canvas. "Eagle" photo, origin unknown. Rocket launch: "Lj2lnch.JPG", www.boeing.com, historical archives.

"ExTro_VERT" and the interface are Copyright 2000, Komerath, N.M. & Smith, M.J., Georgia Institute of Technology, School of Aerospace Engineering.
 
 

 

 
