Instructor
Marcie Bober, Ph.D.
Office: NE-282
bober@mail.sdsu.edu

Using Observations to Measure Instructional Value

It's been a long, arduous process, but the new (course, module, workshop, tool, website) has finally rolled out. So--what now? How will you know whether the intervention on which you've worked so long and hard is "usable" or instructionally effective? In this course, we'll focus on the role observation plays in determining instructional impact and quality--strategies for conducting it; tools for capturing the data you're interested in (without putting your learners at risk); ways to reduce bias and improve objectivity and rigor; and techniques for analyzing and reporting results.
Meets 2/3 & 2/17, 8:30am to 4:30pm. Schedule # 10846.


Overview

This is a practical course--designed to help you use a key evaluative technique (observations) to assess instructional impact, effectiveness, and quality. Observation is common, of course, as these three examples suggest.

  • Lurkers--we feel their presence, even when we post to the sites (blogs, forums, listservs) we most trust. We'll distinguish between casual and structured lurking--and what the technique reveals.
  • Secret shoppers--we know they're around ... at the grocery store, Target and Nordstrom, Home Depot, even your local post office. We'll cover how this observational strategy helps organizations monitor how training is integrated into everyday performance. [And check out available jobs in this area ... and the technique's "impact."]
  • Driver's license--no way to drive without one. But explicit observation can affect performance, even performance that's automatic and "natural." We'll explore ways to put those we observe more at ease.

Take a look at a set of tools (room layout, data form) that EDTEC grad Caroline Willi developed for an engagement study that we conducted in a special classroom that ITS manages.

Audience

This course targets anyone interested in learning more about using observations to evaluate the interventions we design, manage, or facilitate--whether the intent is formative or summative. The instructor assumes basic familiarity with Microsoft Word and Excel.

Learning Outcomes

At the end of this course, you'll know:

  • important observation "lingo."
  • how perspective and intent inform instrument design and the formality of the process itself.
  • how setting (including virtual contexts that may or may not be synchronous) makes a difference.
  • the contextual factors that observers face (logistical, technical, conceptual, ethical)--and ways to account for or address them.
  • techniques to reduce observer bias and subjectivity--without destroying intuition.
  • how to create an array of different instruments (both qualitative and quantitative in nature)--and ways to implement them.
  • key techniques for analyzing, interpreting, and reporting observational data.

Resources

Breeze presentations (available January 29)

ECR readings, both book chapters and journal articles (available by January 29)--from such experts as Good and Brophy, Robinson and Robinson, Hale, Patton, and Rosenberg. [The password is observations.]

And these websites will prove useful as well!

And we'll have a course blog to which you'll contribute during class--and that, with some forethought, can become a resource available to the larger SDSU/EDTEC community.


Course Outline

Session 1: February 3, 2007 -- Observation 101

We'll cover basic terminology as we attend to the key drivers that shape an observation: what to observe (and how formally); whose perspective to take; which constraints and obstacles to control (or at least recognize); how "labor" factors into tool or instrument design; how technology can both assist and detract from the process; and ways to produce reader-friendly analyses on which real people can act.

We'll also join the debate about the credibility of eyewitness testimony.

But first ... let's explore some optical illusions!

This session is hands-on, with a focus on techniques you might use in a traditionally delivered workshop or course ... as well as in the field. You'll complete a number of practices--independently and with a partner--that "personalize" the tips and strategies covered during the day.

Between Sessions

You'll plan and conduct a "traditional" observation based on one of three general scenarios that I'll introduce in class. Prior to gathering data, you'll forward a data collection plan organized as follows:

  • Overview and purpose
  • Methodology (sample/subject selection, description of the tool and how it "works," strategy for use, how tested)
  • Tool itself

Both your classmates and I will critique your work--using a checklist I'll create in Zoomerang. We'll use Quick Topic to share work and garner feedback--and a course blog to share ideas, resources, tips ... and more!

Due date: to be announced

Session 2: February 17 -- The virtual world ... and where observation fits in

Here we transition to the virtual world--covering common situations that you might face as a performance technologist. What happens when the nuances of interaction that we take for granted (body language, voice intonation, eye contact, etc.) simply aren't present or don't unfold naturally?

This session is hands-on, with a focus on techniques you might use when the setting is virtual--although not necessarily synchronous. You'll complete a number of practices--independently and with a partner--that "personalize" the tips and strategies covered during the day.

You'll once again:

  • plan and conduct an observation based on one of three general scenarios that I'll introduce in class, and ...
  • forward a data collection plan that I'll approve prior to formal data gathering.

Both your classmates and I will critique your work (via a Zoomerang checklist) and we'll use Quick Topic and our course blog to garner feedback, and share ideas, resources, and tips.