Goal Analysis

Introduction Connect Apply Reflect Extend


If there is a "secret" to the success of Instructional Systems Design, it probably has something to do with the careful way in which we define instructional goals, or conduct "goal analysis."

You may recall that behavioral psychology has had an important influence on the field of educational technology. The behaviorists insisted on measuring observable behaviors rather than on making inferences about thought processes.

This insistence on observable behaviors found its way into ISD by specifying outcomes in terms of behavioral goals and objectives. That means that we judge learners' success by comparing statements about what learners should be able to do with what they actually can do after they've participated in the instruction.  That process might sound familiar as you reflect on the analysis process (optimals - actuals = gaps).

We can also use goals and objectives to help gauge the success of the instruction itself. We judge it successful to the extent that learners' performances following instruction correspond with the instructional goals and objectives.

Educational technologists have, over time, somewhat softened the strict behavioral interpretation of instructional outcomes. For example, a common instructional goal in education is that students will develop higher level thinking skills. "Higher level thinking skills," indeed any thinking skills, would be, if you will excuse the expression, unthinkable to the behaviorists.

To avoid confusion, we'll talk about instructional or performance goals and objectives rather than behavioral goals and objectives. But we mean the same thing as most people who use the older terms. That is, "What will learners be able to do after instruction?" Examining this question is the focus of Module 5.


Module 05: Connect

In this module:

Instructional goal analysis involves transforming broad, "fuzzy" goals into observable, measurable performances. This, in turn, means thinking about the range of possible performances.

The answer to the question, "What will learners be able to do after instruction?" will always fall into one of two broad categories. Learners will be able to remember specific things from the instruction, and/or they will be able to apply them.

What will be done with the knowledge?

Remember or apply? This is an important distinction. Who doesn't remember Einstein's famous formula E=mc²? But how many of us can apply it to solve a problem? Sure, I recall that it has something to do with the conservation of matter and energy (the fact that they can change into one another, but can't be created or destroyed), and I even remember that, in English, the formula reads "Energy equals mass times the speed of light, squared." That's about the extent of what I remember about E=mc². Ask me to apply the formula to solve a problem, however, and I couldn't do it without additional instruction.

The same distinction holds true in corporate and military training situations. A manager or officer might be able to list or even describe, say, seven types of decision-making processes, but might not be able to actually implement any of them. Using or applying what we've learned requires more mental processing and more practice than just remembering it. That means more, longer, and more expensive instruction.

Sometimes just remembering is enough. If you can remember the principles on which the impressionist painters based their styles, that may be sufficient if you are, say, a museum guide. In this instance, a little bit of training, just enough to help you remember and explain the principles, is probably adequate. There is no need to devote extensive resources to train you to be an impressionist painter. Should you wish to apply the principles, however, and paint impressionistically, you will need to spend a good deal more time understanding and practicing.

In addition to determining whether your learners only need to remember or actually need to be able to use or apply their knowledge, you can classify knowledge in terms of four types: facts, concepts, procedures, and principles.

Facts, Concepts, Procedures, and Principles

Educational technologists like to classify knowledge into these categories for a simple reason: we learn each of these in different ways. If we know what kind of knowledge we're trying to teach, we can more easily prescribe learning strategies. When analyzing instructional goals, we want to start thinking about what facts, concepts, procedures, and principles we want learners to know so that, later, when we specify instructional strategies, we can do so systematically.

[Note that some theorists have also identified process as a separate, unique classification. In EDTEC 540, we limit our discussion to the following four classifications.]


Facts are unpredictable associations among objects, symbols, or events. For example, see if you can predict what the first word of the text of the next chapter is. Now check to see if you were correct. Without astonishing luck, you were wrong. But it's not because you're stupid, it's because there is no method or rule for accurately predicting the first word of book chapters. You just have to know what it is. You could try any number of ways of predicting it, but eventually you'll have to throw up your hands and admit that, "I don't know why it is what it is, it's just a fact."

Names, dates, and quantities are all facts. In 1770 you couldn't have predicted that the name of the first President of the United States would be George Washington. If we blindfolded you and dropped you onto a street in a town you have never visited, you would be unable to predict the name of that street, without seeing a street sign. You could accurately predict the year of the next international Olympics, because there is a rule to follow--the Olympics are held every four years. But you couldn't accurately predict the finishing times of the first three contestants in the 100 meter dash. Those are facts.

A good test to see whether something is a fact is to ask yourself, "If I didn't already know it, is there a rule or procedure that would let me accurately predict it?" If the answer is no, it's probably a fact.


A concept is a group of objects, symbols, or events that share common characteristics. "Computers" is a concept. There are many sizes, shapes, and types of computers but they all share some characteristics, such as having input and output devices and the ability to process information. You can think of a concept as a "definition."

"Elections" is a concept. Elections are held in many different countries, under different circumstances, and for different reasons, but they have some characteristics in common, such as campaigns, ballots, polling places, and so forth. One definition of an election is "choosing for office by vote."

If you're not sure whether or not something is a concept, you can ask yourself, "Is this a group?" My car, for example, would not be a concept, because it isn't a group. "Cars" would be a concept, though, because all cars share characteristics like wheels, motors, and frames. Could you predict what kind of car I drive? No? Then it's probably a fact that I drive a Mazda Musketeer, California license plate number "1 4 ALL."

You could also ask yourself, "Could I find a definition of this in the dictionary?" You would be hard pressed to find a definition of "Bob's car" in the dictionary, but you could easily find a definition of "car."


Procedures answer the "How" questions. If someone asks, "How do you conduct an effective sales meeting?" or "What's the best way to troubleshoot this toaster?" they're probably looking for a procedure.

A procedure is a step-by-step sequence for accomplishing a specific task. The procedure for brushing your teeth includes the steps: (1) locate toothbrush and toothpaste tube; (2) uncap toothpaste tube; (3) squeeze toothpaste onto toothbrush; and so on. This procedure only works for brushing teeth; it doesn't apply to, say, brushing a coat. Procedures apply only to specific performances.


Principles are guidelines or rules that can be applied in a range of situations. For example, the principle for spelling English words with i and e in them advises students to use "i before e, except after c." You can apply this rule to any number of English words, including ones you've never heard or seen before. Unlike a procedure, it applies across a range of situations.

Principles can also be used to predict things. "Red sky in the morning, sailors take warning; Red sky at night, sailors delight" is a guideline for predicting the weather. It doesn't explain how to do anything, it just helps you understand patterns or cause-and-effect.
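The "is it predictable?", "is it a group?", and "is it a step-by-step sequence?" tests above can be sketched as a small decision helper. This is purely illustrative: the function name and the yes/no parameters are our own invention, not part of any standard instructional design tool.

```python
# Illustrative sketch of this section's heuristics for classifying
# knowledge as a fact, concept, procedure, or principle.

def classify_knowledge(is_predictable, is_group, is_step_sequence):
    """Classify content using the section's test questions.

    is_predictable:   could a rule or procedure let you predict it?
    is_group:         is it a group/category you could define?
    is_step_sequence: is it a step-by-step way to do one specific task?
    """
    if not is_predictable and not is_group:
        return "fact"        # arbitrary association; you just have to know it
    if is_group:
        return "concept"     # a class of things sharing characteristics
    if is_step_sequence:
        return "procedure"   # ordered steps for one specific task
    return "principle"       # a rule or guideline spanning many situations

# "Bob drives a Mazda Musketeer": no rule predicts it, not a group
print(classify_knowledge(False, False, False))  # fact
# "Cars": a group sharing wheels, motors, and frames
print(classify_knowledge(False, True, False))   # concept
```

Applied to the earlier examples, "i before e, except after c" would come back as a principle (predictable, not a group, not a single-task sequence), while brushing your teeth comes back as a procedure.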

The content/performance matrix

Merrill (1971, 1986) and others have combined these four types of instructional content--facts, concepts, procedures, and principles--with the two types of performances--remember and apply--into a content/performance matrix (Figure 5.1).  We have modified this a bit for EDTEC 540, your foundation course in Educational Technology.  As your graduate journey continues, you will build upon this foundation frequently.








             Fact                    Concept               Procedure           Principle
Apply        --                      Classify examples     Perform the steps   Apply a rule
Remember     Recall an association   Recall a definition   List the steps      State a rule

Figure 5.1. Content/performance matrix, adapted from Merrill (1971, 1986) and Clark (1994).

The cells of the matrix contain learner performances that represent apply or remember performances for each type of knowledge. For example, if you are teaching social studies students about the differences between republics and popular democracies, you might ask them to remember the definitions of each of these concepts, but if you really want them to be able to apply the concepts, give them a series of descriptions of real or imagined governments and ask them to classify each one as either a republic or a popular democracy. The students would be applying their understanding of the concepts.

Similarly, if your real estate sales agents are trying to learn to apply principles for matching a family with a neighborhood, you could give them some profiles of families and ask them to predict the best neighborhood to show them, based on the rules or principles. That would take a little more practice than just remembering the rules themselves.

Notice that there is no apply facts category in the content/performance matrix, since facts, by definition, are arbitrary associations, solitary instances. You can't classify instances of them, they are instances. You can't make inferences based on them, because facts only refer to themselves. They're certainly not rules, because they don't apply to a range of situations.
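The matrix in Figure 5.1 can be sketched as a simple lookup table. The dictionary below is illustrative only; the names `MATRIX` and `performance_for` are our own. Note the deliberately missing ("apply", "fact") entry, mirroring the absent cell in the matrix.

```python
# The content/performance matrix of Figure 5.1 as a lookup table.
# There is intentionally no ("apply", "fact") key: facts are arbitrary
# associations and can only be remembered, not applied.

MATRIX = {
    ("remember", "fact"):      "Recall an association",
    ("remember", "concept"):   "Recall a definition",
    ("remember", "procedure"): "List the steps",
    ("remember", "principle"): "State a rule",
    ("apply", "concept"):      "Classify examples",
    ("apply", "procedure"):    "Perform the steps",
    ("apply", "principle"):    "Apply a rule",
}

def performance_for(performance_type, knowledge_type):
    """Return the matrix cell, or explain why none exists."""
    try:
        return MATRIX[(performance_type, knowledge_type)]
    except KeyError:
        return "No such cell: facts cannot be applied, only remembered"

print(performance_for("apply", "concept"))  # Classify examples
print(performance_for("apply", "fact"))     # explains there is no such cell
```

A designer could use a table like this as a checklist: once a goal is tagged with a knowledge type and a performance type, the cell names the kind of evidence of mastery to ask for.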

Clarifying "fuzzy" goals

Goal analysis is essentially the process of taking what Mager (1984) calls vague, "fuzzy" goals, and operationalizing them. That just means stating your goals as observable performances. This is important for several reasons.

First, if your goals are "fuzzy," so will your instruction be. Suppose you decided you wanted your paratroopers to "know how to parachute safely." The resulting instruction may match that goal perfectly, but if the goal isn't appropriate to the performance and stated explicitly, the instruction will miss.

A second reason for stating goals as observable performances is that it helps the learners budget their resources. If they know they only need to learn to "list the five steps to parachute safety," they can muster a modest effort. On the other hand, if they know they will need to demonstrate those steps, they will budget additional time and effort to master the performance.

In the beginning of a project, you and your sponsor will almost certainly discuss the instructional goals in broad, fuzzy terms. But during the goal analysis phase, you'll want to assign specific performances to those goals, and state them explicitly. Later, in the objectives and test item writing stage, you'll use these same performances as a basis for assessing mastery.

In this section, we've adapted, with modifications, Mager's five-step method of goal analysis. The mnemonic "LACCE" may help you remember the five steps: (1) List the goals; (2) Assign performances; (3) Consolidate; (4) Complete statements; and (5) Evaluate. Here's how to do each step.
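The LACCE steps can be sketched as a minimal data shape: a goal record that starts as a fuzzy statement (List), collects performances (Assign), and can be merged with overlapping goals (Consolidate). The class and function names below are our own invention, offered only to make the flow of the method concrete.

```python
# Illustrative sketch of a goal moving through the first three LACCE
# steps: List the goals, Assign performances, Consolidate.

from dataclasses import dataclass, field

@dataclass
class Goal:
    fuzzy_statement: str                               # step 1: List
    performances: list = field(default_factory=list)   # step 2: Assign

def consolidate(goals):
    """Step 3: merge goals whose assigned performances overlap."""
    merged = []
    for goal in goals:
        match = next((g for g in merged
                      if set(g.performances) & set(goal.performances)), None)
        if match:
            match.performances = sorted(set(match.performances) |
                                        set(goal.performances))
        else:
            merged.append(goal)
    return merged

quality = Goal("Products should be high quality",
               ["Separate good and poor quality product"])
consistency = Goal("Quality should be consistent",
                   ["Separate good and poor quality product",
                    "Explain the importance of consistent product quality"])
print(len(consolidate([quality, consistency])))  # 1 -- the two goals merge
```

Steps 4 and 5 (Complete statements, Evaluate) are judgment calls made with your sponsor, so they are left out of the sketch.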

List the goals

Together with your sponsor, client, staff--anyone who may have a stake in the instructional goals of the project--make a comprehensive list of everything you think you want the learners to accomplish. Don't worry about how they're stated at this point, just get them all down on paper. These are what Mager refers to as "fuzzy" goals.

A while back one of our Educational Technology students worked on a training video for line workers in a tuna processing company. The company had several processing plants, and some of them were more efficient and successful than others. The goal of the program was to help the less organized plants adopt the practices of their more successful counterparts, and to reduce training costs associated with high turnover of employees (would you want to clean tuna eight hours a day for the rest of your life?). They had multiple goals for the project. Their initial list might have looked something like this:

Workers will follow safe work practices.
Workers will maintain sanitary conditions.
Workers will recognize a quality product.
Product quality will be consistent.
Workers will clean tuna quickly.
Workers will take pride in their work.

Sometimes people get a little long-winded and come out with phrases like, "We want that the quality of our products are consistent and of extremely high quality." That's fine. Just jot it down the way it comes out at first. The rest of the LACCE method will smooth out the bumps.

Assign performances

Here's where the content/performance matrix (Figure 5.1) helps you. For each "fuzzy" goal, list one or more performances that you would consider evidence of mastery. Do this with your sponsor or client, if possible. At the very least, submit the list to them for approval.

Refer to the content/performance matrix for ideas about appropriate performances. For our tuna canning example, above, the list might look something like this (with content/performance matrix categories at right):


Demonstrate safe knife handling -- Apply Principle
List and justify rules for walking instead of running and for cleaning up spills -- Remember Principle
Demonstrate appropriate dress including safe shoes, no jewelry, and no loose clothing -- Apply Principle

Demonstrate appropriate dress, including hair net, long sleeves, surgical gloves -- Apply Principle
List important rules for washing hands after using toilet, and for handling minor injuries -- Remember Principle

Identify and describe the characteristics of a quality product -- Remember Concept
Separate good and poor quality product based on the characteristics -- Apply Concept

Explain the importance of consistent product -- Remember Principle
Discriminate between acceptable and unacceptable products -- Apply Principle

Clean at least 3 tuna per minute -- Apply Procedure

Notice that several of the goals involve subjective feelings, or "affective" states, rather than performances. You can't observe them, let alone measure them. That's OK. You can use what we call "indicator performances." Think of things people might do that would indicate to you that they were feeling or thinking this thing or that. For example:


Express empathy and support for coworkers -- Apply Principle

Recite company slogan -- Remember Fact

You might protest that reciting the company slogan, for example, does not really prove that a person takes pride in their work or in their role in the company. You could argue that the only way to really find out whether workers feel pride is through extensive interviews or even subtle psychological tests, or perhaps through indirect means such as tracking whether they take part in non-compulsory company activities.

The problem is that these alternatives are expensive to conduct. You have to balance the importance of a given goal against the cost of a specific indicator performance. Remembering the company slogan may not be a very accurate indicator of pride, but not remembering it might indicate that the individual is unaware of organizational values. That might be enough to satisfy you and your sponsor.


Consolidate

The next step in the instructional goal analysis process is to consolidate goals and performances that seem redundant. For example, in the tuna canning list, you might decide that "quality" and "consistency" are really two sides of the same coin. When you look at the performances, you realize that if people can discriminate quality product then they are also ensuring consistency in their work. You might consolidate the two goals like this:



Identify and describe the characteristics of a quality product -- Remember Concept
Separate good and poor quality product based on the characteristics -- Apply Concept
Explain the importance of consistent product quality -- Remember Principle

You can use the process of consolidating duplicate performances to further ensure that you're stating them in an "observable" way.

Complete statements

Now you're ready to write complete sentences that clearly and succinctly state what you and your sponsor want learners to be able to do following instruction. Each sentence should include "who" is doing "what" and how well they need to do it. For example, the "apply concept" performance for the "Quality" goal above might be written as:

"Cannery workers will be able to separate all the good and poor quality product based on the characteristics presented in the instruction."

Notice we've added the word "all." That qualifier makes the performance a little more specific than it was. Until we put that word in, we didn't know whether discriminating between good and poor quality product fifty percent of the time was acceptable or whether more consistency was needed. Now we know: we want them to separate all of them. If they miss one, they need more help. If they can separate them all, they pass the test. "How well" depends on the situation. Airline pilots "always" need to be able to land an aircraft. History students may only need to state three of the four causes of the American Civil War.
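The "who / what / how well" structure of a complete statement can be sketched as a simple template. The function below is illustrative only; the pieces are the ones named in this section, and the example reassembles the cannery statement above.

```python
# Illustrative sketch: assembling a complete goal statement from the
# "who," "what," and "how well" pieces described in this section.

def goal_statement(who, what, how_well):
    """Combine the three pieces into one complete sentence."""
    return f"{who} will be able to {what} {how_well}."

print(goal_statement(
    "Cannery workers",
    "separate all the good and poor quality product",
    "based on the characteristics presented in the instruction"))
```

Writing each piece separately makes it easy to spot a missing qualifier: if you can't fill in "how well," the statement isn't complete yet.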


Evaluate

This step is a review--a final check to see whether your observable goals stand up to close scrutiny. Look over the set of performances for each goal, and ask yourself, "If the learner actually does this, am I willing to say he or she has met the goal?"

If you are, then the performance is a valid indicator. If you're not, then you should go back and revise that performance or rethink the goal itself.


Instructional goal analysis involves restating the broad, "fuzzy" goals (with which you and your sponsor or client inevitably started) as specific, observable performances. Most projects involve either remembering or applying four types of knowledge, summarized in the content/performance matrix (Figure 5.1). Following the five steps of instructional goal analysis results in clear goal statements specifying the performances you and your sponsor are willing to accept as indicating mastery of instructional goals.

Module 05: Apply

 Content/Performance Matrix

Test your knowledge by completing the quiz. The link is in the 'Content Modules' page in Blackboard, just below this Module 05 link.

Module 05: Reflect

Presenting Results of Your Analysis

Wherever you are in your Performance Analysis process, I want to make sure you have the opportunity to ask questions. So, I've established a performance analysis coaching discussion board topic. One question that often comes up is this: "Is there one 'best' way to gather data and present results?" The answer is no. It all depends on the initiating challenge, findings, and envisioned solution. Presenting your results in a report is, in fact, an instructional design effort. Successful reports "educate" the reader. So, this too is an opportunity for you to think about your audience (me--or, if helpful, think supervisor) and your content (data analysis and recommendations) and bring it together into a successful report.

Feel free to visit the board to post your challenges and ideas, and I'll add my advice too.

Module 05: Extend

Overview of this section:

People in action
Main points
Next step
For more information

People in action

When we last left Roberto, the vice president was upset because performance rating forms were not being completed correctly at the retail stores, and insisted that "some sort of training program" be set up to "correct these errors." Rather than blindly accepting the word of the vice president, Roberto convinced her that a goal analysis would be a better starting point.

They began by calling a meeting of six people: Roberto and the vice president, a store supervisor, and three managers from the camping, water sports, and biking departments at three different stores. Their first step was to list the goals. They decided to concentrate on two: the forms should be complete, and they should be turned in on time.

Assigning and consolidating performances turned out to be easy for the goal of turning reports in on time (a remembering procedure and applying procedure skill), but a bit harder for the goal of completing the forms. As they listed the behaviors, it turned out that this goal not only meant completing all empty spaces (remembering concept and applying procedure), but that the information included in the form needed to accurately reflect the actions, attitudes, and performances of the employee being evaluated (a remembering and applying principles skill).

Statements generated to identify if the goal had been achieved included the following. The department manager will: 1) provide the performance rating rubric to all employees within one week after hire; 2) discuss the components of the form with employees, and obtain the employee's signature that they understand the form and its significance; 3) rate the performance of the employee on each of the six categories in the 5-point evaluation rubric; 4) submit performance evaluations on February 1 and July 1 for all employees within the supervisor's department; and 5) schedule and conduct an individual meeting with each employee to discuss their evaluation within 15 days after the due dates.

The goal analysis process took the group most of one morning to complete. At the conclusion, the vice president, store manager, and department managers all felt that these performances would indicate that the managers were accomplishing the original goal of "filling out these forms correctly." Questions remained as to what sort of rubric would have to be generated for use during the evaluation, but all felt that a new form was needed to replace the one the company had been using.

Main points of Module 05

  1. Instructional goals and objectives are used to measure the success of the learner, as well as the instruction, by checking what learners can do versus what they should be able to do.
  2. The purpose of the instructional goal analysis is to reduce broad, fuzzy goals into observable, measurable performances.
  3. Performance goals and objectives are stated in terms of what the learner will be able to remember or apply after instruction.
  4. Remembering is easier than applying, and therefore requires less instruction.
  5. The knowledge that needs to be remembered or applied will be one of four types: facts, concepts, procedures, or principles.
  6. Knowledge types are classified by instructional designers because each type is learned differently, and therefore requires different strategies during instruction.
  7. Facts are unpredictable associations among objects, symbols, or events, such as names, dates, and quantities.
  8. Concepts are a group of objects, symbols, or events that share common characteristics. Their definition is often found in a dictionary.
  9. Procedures are a step-by-step sequence for accomplishing a specific task. They answer "How" questions.
  10. Principles are guidelines or rules that can be applied to a range of situations, or predict cause-and-effect events.
  11. The content/performance matrix, consisting of the two performance types (remember and apply) by the four knowledge types (facts, concepts, procedures, and principles), can be used to define most learning events within seven of its eight cells (we can't apply facts).
  12. Without identifying the specific performances you want your learners to accomplish, fuzzy goals often lead to fuzzy instruction.
  13. Specific performances to be accomplished also let the learner know how much mental effort to assign to the learning task.
  14. The goal analysis can be viewed as a five-step process: List goals, Assign performances, Consolidate, Complete statements, and Evaluate (LACCE).
  15. Listing the goals should be done, without critique, with as many stakeholders as can be justified.
  16. After the goals have been identified, performances that would be considered mastery of the goal are assigned.
  17. Indicator behaviors are often used for goals that involve subjective feelings.
  18. Consolidation is the combining of goals or performances that are redundant in their meaning.
  19. Statements are then written in complete sentences to clearly and succinctly state what the learners are to do following the instruction.
  20. The evaluation phase concludes the goal analysis by verifying that if a learner were to do these performances, they would be considered to have met the goal.

Next step

Before instruction is identified, not only do the performances need to be identified, but it is also critical to get a handle on the audience. How old are they? How much experience do they have? What language do they speak? How do they like to learn? These and other variables need to be identified before we begin selecting strategies for an intervention.

For more information

Clark, R. (1994). Developing technical training. Phoenix, AZ: Buzzards Bay Press.

Mager, R. (1984). Goal analysis. Belmont, CA: Lake Publishing Company.

Merrill, M. D. (1971). Necessary psychological conditions for defining instructional outcomes. In M. D. Merrill (Ed.), Instructional design: Readings. Englewood Cliffs, N.J.: Prentice Hall.

Merrill, M. D. (1986). Component display theory. In C. M. Reigeluth (Ed.), Instructional-design theories and models: An overview of their current status. Hillsdale, N.J.: Lawrence Erlbaum Associates.


Page authors: Bob Hoffman & Donn Ritchie. Last updated, Marshall: Spring 2006.

All rights reserved