Monthly Archives: May 2014

Speed Business Center to Open July 1

On July 1, 2014, Speed School will open its new business center. SpeedBC will consolidate the school’s unit business managers and the research support group (RASS) into one office handling finance, accounting, payroll, hiring, procurement, grants management, and a wide array of related business services. This strategic initiative aims to:

  1. Support the vision and strategic goals of the school;
  2. Significantly improve the quality and timeliness of business processes;
  3. Adhere to best practices of the profession;
  4. Provide accountability to those served;
  5. Begin returning a significant share of research infrastructure funding to the departments;
  6. Meet the university mandate to improve financial controls.

The center will be led by Ms. Laura Newton and will be housed on the second floor of the Vogt Building. It will also serve as a pilot for UofL to examine how it might implement similar shared service centers across the university as part of its 21st Century Initiative.

We plan to host an Open House soon after July 1 so everyone can stop by and see the new center.


Students travel to Botswana on Service Learning Program


Marcella Kennedy

The UofL International Service Learning Program (ISLP) began in 1997 with a program in Barbados. Since then, it has extended to the Philippines, Botswana, Belize, Croatia, and Trinidad and Tobago. The ISLP sends students to countries around the world to teach younger students about their respective disciplines.

This year the Botswana program was open to engineering students for the first time. Nine attended: Marcella Kennedy, Stephanie Coffey, Ian Van Lierop, Lily Yang, Nick Hudson, Alec Thompson, Gabrielle Hamilton, Kayla Meisner, and Rachel O’Connor, led by civil engineering department head Dr. J.P. Mohsen. The students spent the spring semester preparing lesson plans for junior secondary students at various schools in Gaborone, Botswana.

After much preparation, the students traveled to Gaborone from April 29 through May 8. The group had three lesson plans prepared: designing and building straw bridges held together with masking tape, building simple solar cookers from cardboard and aluminum foil, and assembling small solar-powered cars. The students left excited for the opportunity to share their passion with the young people of Botswana.

Throughout the next week, these lesson plans were taught in classrooms at four schools: Nanogang, Maoka, and Kwena-Sereto junior secondary schools, and Hope Mission. These activities ignited a passion for science and problem solving in many of the students in Gaborone. Additionally, the Speed School students had time to discuss what engineers do, how to become an engineer, and why being an engineer is awesome.

In addition to teaching junior secondary students about engineering, the UofL representatives also had the opportunity to engage in cultural exchange events with students from Botho University, representatives from the Botswana Ministry of Education, and faculty from many schools in Gaborone. In their spare time, students also had the pleasure of a trip to a game reserve in South Africa! It was an enriching first ISLP trip for engineering students, and many of them plan to return next year!

For more information on the UofL International Service Learning Program, please visit their website.

Student Evaluations vs. Quality Matters in Online Learning

High-quality course design and delivery are important to the ongoing growth and success of the Speed School’s online programs. End-of-term student evaluations provide one way of assessing both online and face-to-face courses across various factors. In addition to asking about departmental learning objectives aligned with ABET accreditation standards, these surveys seek student feedback on the following items.

  1. Were the goals of this course established?
  2. Were the goals of this course met?
  3. To what extent has your knowledge and/or skill been increased as a result of taking this course?
  4. Rate the effectiveness of this instructor’s classroom presentation (e.g., preparedness, delivery, format, use of illustrative examples, etc.).
  5. Rate instructor’s attitude toward students (e.g., availability outside class, adequacy of office hours, adherence to scheduled class times, adherence to syllabus, general attitude toward students, etc.).
  6. Rate the overall effectiveness of this course.
  7. Rate the overall effectiveness of this instructor.

The average ratings for each of these items can be useful when comparing courses across sections, instructors, departments and delivery modes, but they must be used with caution, especially when the number of respondents is low. One of the most helpful aspects of these student evaluations is the written comments, which provide additional insights about outcomes, strengths, areas for improvement and overall satisfaction.
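The caution about low respondent counts can be illustrated with a small sketch (not part of any official evaluation tooling; the ratings below are made up). Two sections with the same average rating can carry very different levels of certainty, as a rough 95% confidence half-width shows:

```python
import math

def mean_with_interval(ratings):
    """Return the mean of a list of 1-5 ratings and a rough 95% confidence
    half-width (1.96 * standard error). A wide half-width signals that the
    average comes from too few respondents to be trusted on its own."""
    n = len(ratings)
    mean = sum(ratings) / n
    # Sample variance (n - 1 in the denominator).
    variance = sum((r - mean) ** 2 for r in ratings) / (n - 1)
    half_width = 1.96 * math.sqrt(variance / n)
    return mean, half_width

# Same average rating of 4.0, very different reliability:
small_section = [5, 3, 4]           # 3 respondents -> wide interval
large_section = [5, 3, 4] * 10      # 30 respondents -> much narrower interval
print(mean_with_interval(small_section))
print(mean_with_interval(large_section))
```

The point of the sketch is simply that a 4.0 from three students and a 4.0 from thirty students should not be compared as if they were equally solid.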

There are several shortcomings of student evaluations, especially when it comes to online learning. These include their backward-looking nature (feedback collected after much or all of a course has been delivered), the expertise of the evaluators (student participants in a course instead of experts in course design and delivery) and their focus on traditional face-to-face delivery (which does not account for the unique demands of effective online courses). In response to these and other concerns, a new evaluation tool was developed ten years ago based on national standards of best practice, research findings and instructional design principles.

As stated in the 2011-2013 Quality Matters Rubric Workbook for Higher Education, “Quality Matters (QM) is a faculty-centered, peer review process that is designed to certify the quality of online and blended courses. QM is a leader in quality assurance for online education and has received national recognition for its peer-based approach and continuous improvement in online education and student learning.” QM subscribers include the University of Louisville’s Delphi Center for Teaching and Learning, as well as hundreds of post-secondary, K-12 and other academic institutions.

At the heart of the Quality Matters toolset and process is the QM Rubric’s eight general standards (see below) and 41 specific review standards, of which 21 are identified as essential (three points each), 12 as very important (two points each) and eight as important (one point each). In order for a UofL online course to be “Delphi Certified” using the QM Rubric, it must meet all 21 essential standards and achieve a total score of at least 81 out of 95 possible points.

  1. Course Overview and Introduction
  2. Learning Objectives (Competencies)
  3. Assessment and Measurement
  4. Instructional Materials
  5. Learner Interaction and Engagement
  6. Course Technology
  7. Learner Support
  8. Accessibility
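The scoring arithmetic described above (21 essential standards at three points, 12 very important at two points, 8 important at one point) can be sketched as follows. The function names are illustrative only and are not part of the Quality Matters tools:

```python
# Point values as described in the article.
STANDARDS = {
    "essential": (21, 3),        # must ALL be met for certification
    "very important": (12, 2),
    "important": (8, 1),
}

def max_score():
    """Maximum possible score: 21*3 + 12*2 + 8*1 = 95 points."""
    return sum(count * points for count, points in STANDARDS.values())

def is_delphi_certified(essential_met, total_score):
    """Certification per the article: all 21 essential standards met
    AND a total score of at least 81 out of the 95 possible points."""
    return essential_met == STANDARDS["essential"][0] and total_score >= 81

print(max_score())  # 95
```

Note that meeting the 81-point threshold alone is not enough: a course that scores 95 points but misses even one essential standard would not be certified under these rules.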

Quality Matters is an important element of the Delphi Center’s training of current and future online faculty. This spring, eight Speed School instructors participated in the Survivor’s Guide to Online Teaching and Learning, a 12-hour program over four weeks in the Vogt Building. Congratulations and thank you to Dr. Antonio Badia (CECS), Dr. Ahmed Desoky (CECS), Mr. Gary Eisenmenger (ENGR), Dr. Jeff Hieb (ENGR), Dr. Ibrahim Imam (CECS), Dr. Anup Kumar (CECS), Mr. Michael Losavio (CECS) and Dr. Olfa Nasraoui (CECS) for their attendance and participation. There are seven more Speed School faculty who will be completing a one-week face-to-face or eight-week online session of Delphi U this summer. All of these instructors have received or will receive QM training that will help them develop a new online course or improve an existing one. After teaching the new or improved course once or twice, all will be encouraged to seek a formal Quality Matters review, which we hope will eventually lead to a “Delphi Certified” designation for all of Speed’s online courses.

Moving forward, we will rely on both Quality Matters and student course evaluations as we strive for continuous improvement in online teaching and learning. QM emphasizes the use of research-based best practices for course design using a faculty-driven, peer review process, whereas student course evaluations focus on effective curriculum delivery and achievement of desired learning outcomes. Responding positively to feedback from both assessments will help Speed School faculty, staff and students to design, deliver and benefit from exceptional online learning experiences.