Learning from Our Buildings: A State-of-the-Practice Summary of Post-Occupancy Evaluation (2001)

Chapter: 5 Post-Occupancy Evaluations and Organizational Learning

Previous Chapter: 4 Post-Occupancy Evaluation Processes in Six Federal Agencies
Suggested Citation: "5 Post-Occupancy Evaluations and Organizational Learning." National Research Council. 2001. Learning from Our Buildings: A State-of-the-Practice Summary of Post-Occupancy Evaluation. Washington, DC: The National Academies Press. doi: 10.17226/10288.

5
Post-Occupancy Evaluations and Organizational Learning1

Craig Zimring, Ph.D., Georgia Institute of Technology; Thierry Rosenheck, Office of Overseas Buildings Operations, U.S. Department of State

Federal building delivery organizations face intense pressures. Not only must they provide buildings on time and within budget, but they must also meet rising expectations. They are called on to deliver buildings that are more sustainable, accessible, maintainable, responsive to customer needs, capable of improving customer productivity, and safer. In many cases, they must achieve these goals with fewer staff.

Some organizations have faced these pressures proactively by creating formal processes and cultural changes that make their own organizational learning more effective. In this chapter we adopt the approach to organizational learning of Argyris (1992a), Huber (1991), and others. By this we mean that organizations are able to constantly improve the ways in which they operate under routine conditions and to respond to change quickly and effectively when needed (Argyris, 1992a). Learning is “organizational” if it concerns the core mission of the organization and is infused throughout the organization rather than residing in a few individuals. More simply, in the words of Dennis Dunne, chief deputy director for California’s Department of General Services, learning organizations “get it right the second or third time rather than the seventh or eighth.” By being more systematic about assessing the impact of decisions and using those assessments in future decision-making, building delivery organizations can reduce the time and cost of delivering buildings and increase their quality.

Some of the best models come from private sector organizations. For example, Disney evaluates everything it does and has been doing so since the 1970s. Disney has at least three evaluation programs and three corresponding databases: (1) Disney tracks the performance of materials and equipment and records the findings in a technical database. (2) Guest services staff members interview guests about facilities and services, focusing on predictors of Disney’s key business driver: the intention of the customer to return. (3) A 40-person industrial engineering team conducts continuous research aimed at refining programming guidelines and rules of thumb. The industrial engineering team explores optimal conditions: What is the visitor flow for a given street width when Main Street feels pleasantly crowded but not oppressive? When are gift shops most productive? This research allows Disney to make direct links between “inputs” such as the proposed number of people entering the gates and “outputs” such as the width of Main Street.

The Disney databases are not formally linked together but are used extensively during design and renovation projects. They have been so effective that the senior industrial engineer works as a coequal with the “Imagineering” project manager during the programming of major new projects.

Disney is a rare example. Its evaluation programs carry out the processes that theorists identify as central to organizational learning (Huber, 1991):

  1. monitoring changes in the internal and external business environment,

1 For their generous and thoughtful input we would like to thank Stephan Castellanos, Dennis Dunne, Gerald Thacker, Lynda Stanley, Polly Welch, and Richard Wener.

  2. establishing performance goals based on internal and external influences,

  3. assessing performance,

  4. interpreting and discussing the implications of results,

  5. consolidating results into an organizational memory,

  6. widely distributing findings and conclusions,

  7. creating a culture that allows the organization to take action on the results,

  8. taking action based on organizational learning.

Although POE potentially provides a methodology for all of these processes, POE practice has historically focused more narrowly on assessing performance and interpreting results: on supporting and analyzing individual projects rather than on building a body of lessons learned. POE has often been used as a methodology for assessing specific cases, while the other processes are seen as part of strategic business planning. Even when evaluators have created databases of findings, these have often been used to benchmark single cases rather than to develop more general conclusions.

Structured organizational learning is difficult. It requires the will to collect data about performance and the time to interpret and draw conclusions from the data. More fundamentally, learning involves risk and change. Learning exposes mistakes that allow improvement but most organizations do not reward exposing shortcomings. Learning brings change and organizations are usually better at trying to ensure stability than at supporting change.

In this chapter, we explore how a variety of public agencies and some private ones have used POE successfully for organizational learning. We discuss the “lessons-learned” role of evaluation rather than the project support and analysis role. We have examined written materials from 18 POE programs and interviewed participants wherever possible. We explored whether POE-based organizational learning appeared to be going on, whether the organizations had established support for learning, and the nature of the learning.

Did POE-Enabled Organizational Learning Occur?

In looking for evidence of organizational learning, we asked the following questions:

  • Are participants in building projects, including internal project managers, consultants, and clients, aware of POEs or POE results, either from personal participation or from written results?

  • If so, were POE results consciously used in decision-making about buildings? For example, are they used for programming, planning, design, construction, and facilities management?

  • Can we see evidence that POE results are part of reflection and discussions about how to do a good job, among peers and with supervisors?

  • Are POE results consciously used to refine processes for delivering buildings in terms of either formal process reflected in manuals or informal rules of thumb and customs?

  • Are the people who set building policy, in forms such as policy directives, design guidelines, and specifications, aware of POEs?

  • If so, is POE explicitly used in formulating policy?

Were the Conditions for Organizational Learning Present?

As we attempted to document organizational learning we were trying to understand the conditions that foster or thwart it:

  • Does the organization have an infrastructure for learning? For example, are results from POEs consolidated in some way, such as in reports or databases? Is this consolidated information distributed, either internally or to consultants or the public?

  • Is there a mechanism for ensuring that this information is kept current?

  • If lessons are made available, do they support the kinds of decisions that are made by the organization? Are they likely to seem authentic and important to decision-makers? Are the implications of results made clear, or do busy decision-makers need to make translations between results and their needs?

  • Are there incentives for accessing the data, using the results, and contributing to the lessons-learned knowledge base? For example, are internal staff or consultants evaluated on their use of lessons learned? Are staff rewarded in some way for participation? Are consultants rewarded for participation, or for good performance as judged by the POE?

  • Are there disincentives for participating in lessons-learned programs? If an innovative initiative receives a negative evaluation, is it treated as an opportunity for organizational learning or as a personal failure?

  • Is there a perception of high-level support for the lessons-learned program? Many organizations create frequent new initiatives, and seasoned staff often perceive these as the management “infatuation du jour”: wait a day and it will change.

How Can Organizations Develop Useful Learning Content?

We also assessed the content of the organizational knowledge. We asked the following:

  • Has the organization produced a shared view of what makes a good building, in terms of either process or product? For example, has the organization been clear about key design and programming decisions and about how these decisions link to the client’s needs? Does POE contribute to this shared view?

  • Has the organization created an organizational memory of significant precedents? Are these precedents described, analyzed, or evaluated in meaningful ways?

  • Is this view tested and refined through POE or similar processes?

In this chapter, we briefly report our findings and analysis. We discuss four topics:

  1. What is post-occupancy evaluation? What is its history, and how has this contributed both to its potential for and difficulties in achieving organizational learning?

  2. Do organizations do POE-enabled organizational learning?

  3. How have organizations created the appropriate conditions for learning through POE?

  4. How have they created a knowledge base for building delivery and management?

BRIEF INTRODUCTION TO POST-OCCUPANCY EVALUATION

Post-occupancy evaluation grew out of the extraordinary confluence of interests among social scientists, designers, and planners in the 1960s and 1970s (see, for example, Friedmann et al., 1978; Shibley, 1982; Preiser et al., 1988). Early POE researchers were strongly interested in understanding the experience of building users and in representing the “nonpaying” client (Zeisel, 1975). Many early POEs were conducted by academics focusing on settings that were accessible to them, such as housing, college dormitories, and residential institutions (Preiser, 1994).

During the 1980s, many large public agencies developed more formal processes to manage information and decisions in their building delivery processes. As planning, facilities programming, design review, and value engineering became more structured, agencies such as Public Works Canada and the U.S. Postal Service added building evaluation as a further step in gathering and managing information about buildings (Kantrowitz and Farbstein, 1996).

This growth of POE occurred while politicians and policy analysts were advocating the evaluation of public programs more generally. Campbell and many others had been arguing at least since the 1960s that public programs could be treated as social experiments and that rational, technical means could contribute to, or even replace, messier political decision-making (Campbell, 1999). A similar argument was applied to POE: statements of expected performance could be viewed as hypotheses that POE could test (Preiser et al., 1988).

The term post-occupancy evaluation was intended to reflect that assessment takes place after the client has taken occupancy of a building; this was in direct contrast to some design competitions, where completed buildings were disqualified from consideration, or to other kinds of assessment, such as “value engineering,” that review plans before construction. Some early descriptions focused on POE as a stand-alone practice aimed at understanding building performance from the users’ perspectives. Some methodologists have advocated the development of different kinds of POEs, with different levels of activity and resource requirements (Friedmann et al., 1978; Preiser et al., 1988). For example, Preiser advocated three levels of POE: brief indicative studies; more detailed investigative POEs; and diagnostic studies aimed at correlating environmental measures with subjective user responses (Preiser, 1994). Although there was little agreement about specific methods and goals, most early POEs focused on systematically assessing human response to buildings and other designed spaces, using methods such as questionnaires, interviews, and observation, and sometimes linking these to physical assessment (Zimring, 1988).

Over the years, many theorists and practitioners have grown uncomfortable with the term POE; it seems to emphasize evaluation done at a single point in the process. Friedmann et al. (1978) proposed the term “environmental design evaluation.” Other researchers and practitioners have suggested terms such as “environmental audits” or “building-in-use assessment” (Vischer, 1996). More recently, “building evaluation” and “building performance evaluation” have been proposed (Baird et al., 1996). Nonetheless, for historical reasons the term post-occupancy evaluation remains common, and we use it in this chapter for clarity.

Other discussions of evaluation emphasized the importance of embedding POE in a broader program of user-based programming, discussion, and design guide development, proposing terms such as “pre-occupancy evaluation” (Bechtel, 2000), “process architecture” (Horgen et al., 1999), and “placemaking” (Schneekloth and Shibley, 1995). As early as the 1970s, the Army Corps of Engineers conducted an ambitious program of user-based programming and evaluation that resulted in some 19 design guides for facilities ranging from drama and music centers to barracks and military police stations (Schneekloth and Shibley, 1995; Shibley, 1982, 1985). More recently, POE has been seen as part of a spectrum of practices aimed at understanding design criteria, predicting the effectiveness of emerging designs, reviewing completed designs, and supporting building activation and facilities management (Preiser and Schramm, 1997). With growing concerns about health and sustainability, several programs have also linked user response to the physical performance of buildings, such as energy performance (Bordass and Leaman, 1997; Cohen et al., 1996; Leaman et al., 1995) or indoor air quality (Raw, 1995, 2001).

POE methodologists and practitioners have identified several potential benefits of POE (Friedmann et al., 1978; McLaughlin, 1997; Preiser et al., 1988; Zimring, 1981):

  • A POE aids communications among stakeholders such as designers, clients, end users, and others.

  • It creates mechanisms for quality monitoring, similar to using student testing to identify under-performing schools, where decision-makers are notified when a building does not reach a given standard.

  • It supports fine-tuning, settling-in, and renovation of existing settings.

  • It provides data that inform specific future decisions.

  • It supports the improvement of building delivery and facility management processes.

  • It supports development of policy as reflected in design and planning guides.

  • It accelerates organizational learning by allowing decision-makers to build on successes and not repeat failures.

This chapter focuses primarily on the use of POE for improving organizational learning.

DO ORGANIZATIONS DO POE-ENABLED ORGANIZATIONAL LEARNING?

As discussed above, we reviewed materials from some 18 organizations that are currently doing POEs or have done so in the past. In organizations with active POE programs, we found that members of project teams, including project managers, consultants, and clients, tend not to be aware of POEs unless a special evaluation has been conducted to address a problem the team is facing. Where they are aware of POEs, team members often do not have reports from past POEs at hand and apparently do not use POE results in daily decision-making.

Mid-level staff tend to be more aware of POEs. In particular, staff responsible for developing guidelines and standards are often aware of POE results. For example, in the U.S. Postal Service, the staff who maintain guidelines also administer POEs; the POEs conducted by the Administrative Office of the U.S. Courts are used directly by the Judicial Conference to test and update the U.S. Courts Design Guide.

We were not able to find situations where senior management used POEs for strategic planning. POEs have the potential for supporting “double-loop learning” (Argyris and Schon, 1978)—that is, not only to evaluate how to achieve existing goals better but also to reflect on whether goals themselves need to be reconsidered. However, we were not able to find cases where this actually occurred.


We were not able to find many compilations of POE findings, although several organizations such as the U.S. Army Corps of Engineers, U.S. Postal Service, Administrative Office of the U.S. Courts, General Services Administration, and others have incorporated POEs into design guides. Disney and the U.S. Department of State have incorporated POE into databases of information. These are discussed in more detail below.

POEs do not appear to be used to their full potential for organizational learning. In particular, we were not able to find many circumstances where POE was part of an active culture of testing decisions, learning from experience, and acting on that learning. There are two major reasons for this:

  1. Learning is fragile and difficult, and many organizations have not created the appropriate conditions for learning. If learning is to be genuinely “organizational,” a large number of staff must have the opportunity to participate and to reflect on the results in a way that enables them to incorporate the results into their own practice. Potential participants must see the value for themselves: there must be incentives for participating. Also, evaluation will sometimes reveal that building performance does not reach a desired standard. This is, of course, the value of POE, but many organizations punish people when innovations do not work. In addition, many organizations simply do not make information available in a format that is clear and useful to decision-makers.

  2. Many organizations have not created a coherent, integrated body of knowledge that is helpful in everyday decision-making. Knowledge tends to be informal and individual.

WAYS TO CREATE THE APPROPRIATE CONDITIONS FOR LEARNING THROUGH POE

Create Broad Opportunities for Participation and Reflection

Our research suggests that POE-based knowledge is not widely shared within most organizations. One way to achieve this sharing is through direct participation in evaluations. Seeing how a facility works while hearing directly from users is a memorable experience. Also, the process of analyzing and writing up the results from an evaluation can help decision-makers reflect on the implications of the results and make links to their own practice.

A group of evaluators in New Zealand developed a “touring interview” methodology to allow decision-makers to actively participate in evaluations with little training and only a modest commitment of time (Kernohan et al., 1992; Watson, 1996, 1997). For example, in an active evaluation program with more than 80 completed evaluations, Bill Watson, a consultant to public and private clients, takes building users on a tour of the building and asks open-ended questions—for example, “What works here?”—as well as more specific probes about the functions of spaces and systems. POE reports are mostly verbatim comments by users, sorted into categories such as “action for this building” or “change in guidelines for future buildings.” This approach is quite inexpensive and can be completed with several person-days of effort. The experience is vivid for the participants and produces results that are imageable and articulate. It also allows participants to discuss relative priorities and values. However, because each touring interview group varies, it is more difficult to compare evaluations of different settings.
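The sorting step in a touring-interview report can be sketched as a small grouping routine. The category names follow the examples in the text; the comments and the tagging scheme are our own illustrative assumptions, not the actual New Zealand instrument:

```python
from collections import defaultdict

# Report categories named in the touring-interview programs described above.
CATEGORIES = (
    "action for this building",
    "change in guidelines for future buildings",
)

def sort_comments(tagged_comments):
    """Group verbatim user comments under the report categories.

    tagged_comments: iterable of (comment, category) pairs, where the
    category is assigned during the tour debrief.
    """
    report = defaultdict(list)
    for comment, category in tagged_comments:
        if category not in CATEGORIES:
            raise ValueError(f"unknown category: {category}")
        report[category].append(comment)
    return dict(report)

# Invented example comments from a hypothetical tour.
tour = [
    ("The loading dock door sticks", "action for this building"),
    ("Meeting rooms need more power outlets",
     "change in guidelines for future buildings"),
]
report = sort_comments(tour)
```

Keeping the comments verbatim, as the New Zealand evaluators do, preserves the vividness of the users' own words while the categories make clear who should act on each one.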

This kind of participatory evaluation can be an extension of existing processes for receiving feedback from customers. Project managers in Santa Clara County, California, were tired of receiving a storm of requests from users after they moved into a building. These were difficult to direct to contractors, suppliers, and others. They contracted with consultants Cheryl Fuller and Craig Zimring to create a Quick Response Survey (QRS) aimed at organizing and prioritizing user needs about three months after buildings were occupied. All building users fill out a one-page questionnaire, and project managers follow up with a half-day walk-through interview of the building with the facility manager and staff representatives. The project managers then prioritize requests and meet with the client organizations. The State of California Department of General Services is further developing the QRS and will have evaluators enter results into a lessons-learned database.
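The QRS workflow above, collecting one-page questionnaires and then prioritizing and routing requests, amounts to a simple tally. A minimal sketch follows; the record fields, routing categories, and example issues are invented for illustration and are not the actual Santa Clara County instrument:

```python
from collections import Counter
from dataclasses import dataclass

# Hypothetical QRS-style record: one request raised by a building user
# during the post-move-in survey, tagged with the party it should be
# routed to (contractor, supplier, facility manager, ...).
@dataclass(frozen=True)
class Request:
    issue: str
    responsible_party: str

def prioritize(requests):
    """Collapse duplicate issues and rank them by the number of users
    reporting each, so project managers can route the most widespread
    problems first."""
    counts = Counter((r.issue, r.responsible_party) for r in requests)
    return [
        {"issue": issue, "route_to": party, "reports": n}
        for (issue, party), n in counts.most_common()
    ]

# Invented survey responses.
survey = [
    Request("conference room too warm", "contractor"),
    Request("glare at workstations", "facility manager"),
    Request("conference room too warm", "contractor"),
]
ranked = prioritize(survey)
```

Ranked output like this is also a natural feed into the kind of lessons-learned database the State of California is developing from the QRS.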

A lessons-learned program initiated in 1997 to examine the success of school projects in New York City was aimed at participation by consultants. The School Construction Authority (SCA), whose membership is appointed by the governor, the mayor, and the New York City Board of Education, was charged with the program. To get the program approved, SCA, under the leadership of consultant Ralph Steinglass, adopted a simple methodology: require the architect or engineer of record to conduct the POE. The rationale was that this would guarantee that designers confront how users responded to their designs and would force a lessons-learned loop into the design process. About 20 POEs have been completed. To ensure reliability, SCA reviewed the results before approving the POEs. In some cases, the architects or engineers had to reschedule their interviews when they were suspected of introducing bias, or continue their investigation if they had failed to include critical areas required in the study.

The three programs described above involve evaluation by the people who designed and managed the project. As such, participatory evaluation is well suited to supporting learning by in-house project managers and consultants. Whereas the New Zealand projects are led by consultants, the quick-response projects and the SCA projects are conducted entirely by the consultants or project managers.

Create Incentives for Participation

Most building professionals are interested in doing a good job and see value in POE. However, as time-management consultant Stephen Covey has argued, things that are merely important often lose out to things that are urgent: general benefits such as long-term learning give way when professionals face the pressing matters of everyday life. Offering more specific incentives often increases participation in a POE program.

The drug company Ciba-Geigy has used direct monetary incentives. The architectural and engineering firm HLW and the contractor Sordoni Skanska Construction put their design and construction profits ($300,000 and $1.2 million, respectively) at risk based on performance on schedule, cost, and user satisfaction for Ciba-Geigy Corporation’s new $39 million Martin Dexter Laboratory in Tarrytown, New York. One-third of the profits was based on user satisfaction responses to 14 survey questions covering heating, ventilation, air conditioning, acoustics, odor control, vibration, lighting, fume-hood performance, quality of construction (finishes), building appearance, and user-friendliness. The questions were binary-choice (acceptable/not acceptable), and the building had to reach 70 percent satisfaction to pass the test. Some aspects, such as sound transmission, were also assessed using physical measures; if the user satisfaction measures did not reach criterion, physical measures could be substituted (Gregerson, 1997). The designers and contractors consulted the scientists throughout the process, showing them alternatives for the façade design and full-scale mockups of the fume hoods. The building passed on all criteria except satisfaction with the fume hoods, which were modified after the evaluation in response to user input. Although some aspects of this testing process might be questioned—Should an entire building be evaluated on 14 yes-no scales? Should maintenance and operating experience be included?—the process gained from participation throughout. The design firm, contractors, management, and scientists all participated in establishing the criteria at the outset, and the financial incentive encouraged the contractor and designers to consult the users at every step in the process.
It is difficult to document the learning benefit of this process, but the contractor, Sordoni Skanska, has since used POE-based incentives in several other projects and has refined the way in which it delivers buildings.
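The pass/fail rule in the Ciba-Geigy incentive scheme reduces to simple arithmetic: each binary-choice criterion passes when the share of "acceptable" responses meets the 70 percent threshold. A minimal sketch, with invented response data for illustration:

```python
def criterion_passes(responses, threshold=0.70):
    """Evaluate one binary-choice POE criterion.

    responses: list of True (acceptable) / False (not acceptable) answers.
    Returns True when the fraction of acceptable answers meets the threshold.
    """
    if not responses:
        raise ValueError("no responses collected for this criterion")
    return sum(responses) / len(responses) >= threshold

# Invented data: 8 of 10 users find acoustics acceptable (80% >= 70%),
# while only 6 of 10 find the fume hoods acceptable (60% < 70%).
acoustics = [True] * 8 + [False] * 2
fume_hoods = [True] * 6 + [False] * 4
```

Under the scheme described above, a failing criterion like the fume-hood example would put the at-risk profit share in jeopardy unless substituted physical measures reached criterion instead.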

The California Department of General Services is planning to include the results of POEs in reviews of qualifications when selecting consultants and contractors; this has strongly increased interest in POEs among participating firms. We are unaware of any POE programs that provide incentives for internal staff members to participate in evaluations, though several programs have discussed possibilities such as a free vacation day as a reward for adding data to the knowledge base or a mini-sabbatical for participating in evaluations or a lessons-learned program. Disney provides a powerful, if indirect, incentive: knowledge. Only the industrial engineers have access to key data, and they therefore become valuable members of the design team.

Reduce Disincentives: Create Protected Opportunities for Innovation and Evaluation

Organizational learning consultants have long pointed to an inherent contradiction in many organizations: although most organizations espouse innovation and learning, they behave in ways that limit both. We recently participated in a meeting where an organization had used an innovative building delivery strategy with which it was not familiar and had left out a key review step. When this became clear, a senior manager turned to the project manager and said: “We would have expected someone at your level to do better.” The message to everyone in the room was clear: avoid innovation and avoid evaluation! This syndrome—focusing on the individual rather than the performance, blaming the innovator rather than learning from the innovation—is pervasive among organizations (Argyris, 1992b; Argyris and Schon, 1978). However, some building-delivery organizations have used POE to at least partially overcome it.

Some organizations have done this by explicitly sanctioning “research” with the attendant acknowledgment that innovations might not succeed. For example, the General Services Administration’s (GSA) Public Buildings Service has recently appointed a director of research. The first director, Kevin Kampschroer, has a budget to conduct, synthesize, and distribute research, including POE. The use of the term “research” carries with it the understanding that not all efforts are successful, and the budget provides some time for reflection about findings. To date, much of the research is conducted by academic consultants who bring outside learning into GSA. However, GSA is also looking at ways to broaden internal ownership of the research program.

GSA has also created an active “officing” laboratory in its own headquarters building. The lab, supervised by Kampschroer, is one floor of actual workspace that includes an innovative raised-floor heating, ventilating, and air-conditioning system and several brands of modular office furniture systems. It also explores design to support teamwork, with many small conference rooms and meeting areas. The workers are frequently surveyed and observed, and the lab also serves as a place where clients can see alternative office layouts.

The U.S. Courts and the General Services Administration Courthouse Management Group are considering developing a different kind of laboratory: a full-scale courtroom mockup facility where new courtroom layouts and technologies can be tested and refined at relatively low cost and risk. This facility, to be constructed at the Georgia Institute of Technology, would allow mock trials to be conducted and would provide training for judges, staff, and lawyers.

Another way to reduce the personal and organizational cost of experimentation is to start small with projects that have an experimental component; the innovation can then be evaluated and considered for broader adoption. For example, the U.S. Department of State Office of Overseas Buildings Operations (OBO) tries out innovations on a limited number of projects before rolling them out to the larger organization. The office has recently used building serviceability tools and methods (Davis and Szigeti, 1996) for programming and design review for the new embassies in Dar es Salaam and Nairobi.

The State of Minnesota Department of Natural Resources has used POE to evaluate two innovative regional centers. In each of these cases the organizational learning effort provided some additional resources for data collection and reflection as well as the clear designation that this was an innovative effort that might not be fully successful.

In many organizations it is risky to be the first to try an innovation. Massachusetts Institute of Technology organizational consultant Edgar Schein has proposed that while organizations may benefit greatly from consultants, they often find the experience of peers more helpful when they actually move to implement an innovation. Schein has called for “learning consortia” where people can get advice from peers in other organizations and learn from their experience (Schein, 1995). He argues that although such learning consortia may be effective at all levels of an organization, they are particularly effective among chief executive officers (CEOs) and upper- to mid-level managers. Although the prototype, laboratory, and learning-consortium efforts are quite different, all reduce the disincentives for innovation and evaluation by allowing both at relatively low personal and organizational cost.

Provide Access to Knowledge for Different Audiences

The simplest barrier to using POE for organizational learning arises when POE results are not available to decision-makers. Many organizations produce POEs as case study reports that are not widely distributed. Part of this may be due to the history of POE, which has focused on single case studies, and part may be due to perceived disincentives to distributing information that might be seen as critical of internal efforts or individuals. Part of the problem is the simple technical difficulty of distributing printed information, which has become much easier with the Internet, intranets, and virtual private networks. The National Aeronautics and Space Administration makes its lessons-learned database available to all authorized staff and contractors. In the United Kingdom, Adrian Leaman and Bill Bordass have created an interactive Web site for the 18 buildings they have evaluated as part of the PROBE project. Funded by the Building Research Establishment and Building Services Journal, PROBE stands for post-occupancy review of buildings and their engineering.

Some organizations have addressed these issues by creating design guides and databases of POE information. Agencies such as the Administrative Office of the U.S. Courts, the U.S. Postal Service, and the General Services Administration have created design guides that are widely distributed.

As we have suggested, the problem with organizational learning is only partially technical. The tools for creating Web sites and databases are now widely available and inexpensive. A useful Web site requires the initiative to collect the information, the time to make sense of it, and the will to share it.

Part of the problem with building delivery organizations and design projects is that they represent many different professional cultures. Engineers tend to take a technical problem-solving approach. Architects are often interested in form. Clients might be interested in the usability and experience of the building. Senior managers might be searching for help in setting strategic directions, whereas project managers might be interested in lessons learned about specific materials or equipment. Part of the challenge in creating any database or report is translating between these different professional cultures, and evaluators have not always been successful at doing this.
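One way to ease the translation between professional cultures in a shared lessons-learned database is to tag each finding with its intended audience, so engineers, architects, and managers can each filter to findings framed for them. The sketch below illustrates this with a minimal SQLite table; every table name, column, and sample finding is hypothetical, not drawn from any agency's actual system.

```python
import sqlite3

# A minimal sketch of a POE lessons-learned database. Table and column
# names are hypothetical illustrations only.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE lesson (
        id INTEGER PRIMARY KEY,
        project TEXT NOT NULL,   -- which building the finding came from
        topic TEXT NOT NULL,     -- e.g., 'wayfinding', 'HVAC', 'lobby layout'
        finding TEXT NOT NULL,   -- the lesson itself, in plain language
        audience TEXT NOT NULL   -- 'engineer', 'architect', 'manager', ...
    )
""")
conn.executemany(
    "INSERT INTO lesson (project, topic, finding, audience) VALUES (?, ?, ?, ?)",
    [
        ("Regional Center A", "daylighting",
         "Glare on monitors near south glazing", "architect"),
        ("Regional Center A", "HVAC",
         "Zone controls too coarse for shared offices", "engineer"),
        ("Courthouse B", "wayfinding",
         "Public confused entry with staff entrance", "manager"),
    ],
)

# Each professional audience filters to the findings framed for it.
rows = conn.execute(
    "SELECT project, finding FROM lesson WHERE audience = ?", ("engineer",)
).fetchall()
for project, finding in rows:
    print(f"{project}: {finding}")
```

The point of the audience column is organizational rather than technical: the same evaluation yields different "lessons" depending on who is asking.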

Reduce Uncertainty Through Upper Management’s Commitment

Participants in POE programs report that uncertainty about senior management’s commitment to the program is a key disincentive to participation. Sometimes the lack of commitment shows up as a lack of resources, but it is often manifested as a lack of visible endorsement for the program and a lack of commitment to the two- to five-year time span needed to see results in terms of organizational learning. Savvy staff have learned not to commit genuinely to management’s infatuation du jour, knowing that it will change quickly.

CREATING A KNOWLEDGE BASE FOR BUILDING DELIVERY AND MANAGEMENT

Most fundamentally, organizational learning for a building delivery organization means producing better buildings more effectively. Given the large number of POEs that have been completed (longtime researcher and Environment and Behavior editor Robert Bechtel estimates more than 50,000), one would expect many books and guides that synthesize the results of POEs and tie them into a coherent guide for key programming and design decisions. However, such guides are relatively rare. In part, this is because POE researchers and consultants have focused on case studies. Much knowledge about buildings has been built up incrementally through negotiation on individual projects and programs, but organizations seldom take the time to identify the key strategic decisions that most affect their clients. Efforts such as the American Society for Testing and Materials (ASTM) Building Serviceability Tools and Methods have begun to do this for programming and portfolio management, but it has seldom been done for POE.

In this section we examine several strategies that have proven successful for beginning to create this kind of knowledge base. In several cases, organizations have built on POEs initiated for other purposes, or could do so with little additional effort.

POE can be particularly successful in supporting organizational learning if it links strategic facilities decisions to the “key business drivers” of the client organization. In the 1970s, the U.S. Army was shifting to an all-volunteer force. Potential recruits said that the aging facilities were a significant impediment to recruiting and retention, and the Army sought to renovate or rebuild many of its buildings. To help guide the multibillion-dollar investment, the Army Corps of Engineers created a large program of participatory programming and evaluation, resulting in some 19 design guides (Shibley, 1985).

In the 1980s, the newly reorganized U.S. Postal Service (USPS) was losing customers to private competitors such as FedEx and UPS (Kantrowitz and Farbstein, 1996). Focusing initially on the customer experience in lobbies, the USPS contracted with Min Kantrowitz and Jay Farbstein and Associates to conduct focus group evaluations. This led to a large and continuing program of evaluations and design guide development. New concepts of post office design, such as the retail-focused “postal store,” are developed; innovative projects are designed; the projects are evaluated; and the ideas are refined and incorporated into design guides. This program has sustained an ongoing process of testing and refining the design guides through evaluation and experience. More recently, the USPS has de-emphasized on-site evaluations; most POEs now involve facility managers filling out relatively brief mail-out surveys. The POE manager has found that the open-ended responses to the questionnaire are often the most valuable in refining the USPS design guidelines because they are more specific than the scaled satisfaction responses.
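The split the POE manager describes, between compact scaled ratings and the more specific open-ended comments, can be sketched as a simple tally. The field names and responses below are invented for illustration and do not reproduce any actual USPS survey.

```python
# Hypothetical mail-out POE responses: scaled satisfaction ratings plus
# an open-ended comment field.
responses = [
    {"lobby_rating": 4, "lighting_rating": 3,
     "comment": "Queue spills past the door at noon."},
    {"lobby_rating": 5, "lighting_rating": 2, "comment": ""},
    {"lobby_rating": 3, "lighting_rating": 3,
     "comment": "Task lights needed at sorting cases."},
]

# Scaled responses compress easily into averages...
means = {
    q: sum(r[q] for r in responses) / len(responses)
    for q in ("lobby_rating", "lighting_rating")
}

# ...but the open-ended comments carry the specifics that actually
# refine a design guide.
comments = [r["comment"] for r in responses if r["comment"]]

print(means)
print(comments)
```

The averages summarize satisfaction at a glance, but only the free-text comments name an actionable condition (queue length, task lighting), which is the specificity the design guide process depends on.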

BUILDING ON EXISTING EVALUATIONS

An organization can begin to rationalize its knowledge base by building on evaluations that occur for other reasons. As the experienced evaluator Bob Shibley has said, evaluations are easiest to justify if they bring project support, analysis benefits, and lessons-learned benefits (Shibley, 1985).

Building on Diagnoses of Troubled Settings

Sometimes a building is the subject of complaints or controversy, and a POE can help diagnose the source of problems and prioritize solutions. For example, the new San Francisco central library was a landmark when it opened in 1996, but it faced immediate controversy, and some of the initial programmatic assumptions continued to be debated, such as the wisdom of moving books to closed stacks to create room for computers. The mayor appointed an audit commission that recommended a POE, led by architect Cynthia Ripley and including the director of the Los Angeles library system. After interviewing staff and users, observing use, and analyzing records, the POE team highlighted problems with wayfinding, flexibility, and public access to books, and recommended detailed renovations to reorganize the stacks and collection (Flagg, 1999; Ripley Architects, 2000). The POE is quite thorough in suggesting specific changes, and the basis of these recommendations could potentially be generalized into planning principles. Most significantly, the case raises issues of programming process: the former library director went against the recommendations of his planning committee in reducing access to books in favor of closed stacks. It can also support broader reflection about the role of libraries and physical structures in providing information in the age of computers.

Although the focus on understanding problems and failure gives these POEs a clear direction and has a history in case studies of blast, earthquake, and other building failures, such POEs carry a special risk of becoming ways to focus (or deflect) blame.

Capitalizing on Evaluations of Innovations

Evaluation can help decide whether innovative buildings or building components should be considered for additional capital investment. For example, as mentioned above, the State of Minnesota Department of Natural Resources (DNR) has recently changed the way in which it manages the environment. Rather than organizing its staff by discipline, DNR now uses a matrix management system in which decisions are made by a multidisciplinary group organized by ecosystem. The DNR is creating new regional centers that include wildlife biologists, air and water specialists, and others concerned with a given area. The centers are intended to encourage multidisciplinary collaboration and to be very “green.” The DNR contracted with a university team led by Julia Robinson to evaluate two of the initial projects, and the team made numerous recommendations. The POE was included as an appendix to the funding request for the third center. This was the first time in DNR’s history that a capital request was fully funded on the first attempt, and the DNR was told that the POE was a major reason: it showed a high level of understanding of the project. This result provided an important incentive for DNR as an organization. However, the project also raised some issues about sustainability, and the internal staff did not feel that they had been fully consulted in the POE process. An additional team was hired to create design guidelines in close consultation with staff (M. Wallace, personal communication, 2000). Issues such as sustainability, which are undergoing rapid change, are particular candidates for “double-loop” learning, in which both goals and the methods for achieving them are developed, provided that appropriate conditions are established for discussion, reflection, and action.

Focus on “Learning Moments”

The Administrative Office of the U.S. Courts (AO) conducts a POE program that informs guidelines in the U.S. Courts design guide. However, the AO has achieved organizational learning by linking the design guide to a strategic learning moment in the development of courthouses: the negotiation between judges and the building agent (the General Services Administration) about the scope and quality level for new courthouses. In the early 1990s, the U.S. government initiated the largest civilian construction program since the Second World War, projecting to spend more than $10 billion on 160 new courthouses. (The creation of new judgeships in the 1980s, concerns for increased security, and new technologies all necessitated new courthouses or major renovations.) However, both the judiciary and the GSA were being criticized by Congress for creating marble-clad “Taj Mahals.” The AO initiated the POE program to identify necessary changes to the standards in the first edition of the design guide, to defend the judiciary against attack by documenting the efficacy of the design standards, and to inform negotiations about issues such as the dimensions and materials of courtrooms and chambers. Information from POEs was also used in training workshops for judges and staff becoming involved in new courthouse design and construction. The program is run by the AO, but the design guide is actually created and vetted by a committee of the Judicial Conference, the group that sets broad policy within the federal judiciary. This program is quite unusual: it is the only case we are aware of in which a POE and design guide are developed by a client organization that does not build its own buildings.

The Minnesota Department of Natural Resources project also focuses on strategic moments, especially the approval of the funding package by the key legislative committees.

The focus on strategic learning moments is similar to Shibley’s reminder that information is most likely to be used when it is asked for (Shibley, 1985). A strategic learning moment is a critical time when information or a POE can help resolve a problem or issue that is of considerable importance to the participants. The focus on learning moments can also be used in developing policy documents or targeting POEs toward decisions.

Creating Organizational Memory for Precedents

A key part of organizational memory is simply knowing what the organization has done, but few POE programs have been linked to recording and analysis of precedent. There is a real opportunity to link evaluation to a record of past projects. This record can include simple plans and photos and some analyses of cost, size, and materials. These descriptions can be linked to evaluations.
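Such a precedent record can be sketched as a small data structure linking basic project facts (size, cost, materials) to later evaluations. All class names, field names, and figures below are hypothetical illustrations, not an actual agency record format.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Evaluation:
    """A summary of one POE of the project."""
    year: int
    summary: str

@dataclass
class PrecedentRecord:
    """Basic facts about a past project, linked to its evaluations."""
    name: str
    gross_area_sqft: int
    cost_usd: int
    materials: List[str]
    evaluations: List[Evaluation] = field(default_factory=list)

    def cost_per_sqft(self) -> float:
        return self.cost_usd / self.gross_area_sqft

# Hypothetical project: figures chosen for illustration only.
record = PrecedentRecord(
    name="Regional Center A",
    gross_area_sqft=42_000,
    cost_usd=8_400_000,
    materials=["brick veneer", "steel frame"],
)
record.evaluations.append(
    Evaluation(2000, "High satisfaction with shared labs; glare at south offices.")
)
print(f"${record.cost_per_sqft():.0f}/sq ft, {len(record.evaluations)} evaluation(s)")
```

Even this minimal linkage lets a programmer ask "what did this cost, and what did occupants say afterward?" in one lookup, which is the essence of organizational memory for precedents.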

LESSONS FROM POE PROGRAMS: ENHANCING ORGANIZATIONAL LEARNING

We have suggested that POE has a large potential for lessons learned as well as for project support and analysis. Because of the historic focus of much POE research, the difficulty of finding resources for organizational learning, and sensitivities in exposing problems, relatively few organizations have created effective POE-enabled organizational learning systems that include:

  1. monitoring changes in the internal and external business environment,

  2. establishing performance goals based on internal and external influences,

  3. assessing performance,

  4. interpreting and discussing the implications of results,

  5. consolidating results into an organizational memory,

  6. widely distributing findings and conclusions,

  7. creating a culture that allows the organization to take action on the results,

  8. taking action based on organizational learning.

However, based on a number of successful examples, we suggest the following strategies for creating the conditions for learning:

  • Create opportunities for participation

  • Add incentives

  • Remove disincentives

  • Provide access to information

  • Provide upper-level management support

Successful organizations have also used several strategies for creating knowledge:

  • Clarify key strategic choices

  • Build on existing evaluations

  • Focus on strategic moments

  • Record precedents

The lessons from these 18 POE programs, which influence billions of dollars of construction, suggest that the solution to creating a lessons-learned program is partly technical, such as using information technology to reduce the costs of gathering and distributing information. At its core, however, the problem is organizational: creating a setting in which decisions can be evaluated, discussed, and learned from.

ABOUT THE AUTHORS

Craig Zimring is a professor of architecture and of psychology at the Georgia Institute of Technology. In his teaching, writing, consulting, and research he has developed methods, procedures, and concepts for the evaluation of buildings, including comprehensive studies of building types such as healthcare facilities, jails and prisons, and courthouses, and specialized studies of wayfinding, security, stress, and other issues. Dr. Zimring has focused on how social, organizational, and behavioral information can be incorporated into design and decision making at a variety of scales, from a freshman design studio to the $4.5 billion California prison development program, the $6 billion French Universities 2000 program, and the $1 billion annual construction budget of the California Department of General Services. He has worked in the design studio; lectured to facility managers; written in the popular and professional press; served as a consultant and directed research projects for AT&T, the U.S. Department of State, the U.S. General Services Administration, the Administrative Office of the U.S. Courts, the U.S. Department of Transportation, the Ministry of Education of France, and many others; and served on the boards of several professional organizations, including the Environmental Design Research Association and the Justice Facilities Research Program. Dr. Zimring was a distinguished senior visiting fellow at the Centre Scientifique et Technique du Bâtiment, Paris; he has received awards from the American Society of Interior Designers and the National Endowment for the Arts Design Research Recognition Program. He holds a bachelor of science from the University of Michigan and a master of science and a Ph.D. from the University of Massachusetts at Amherst.

Thierry Rosenheck is a project manager at the U.S. Department of State, Office of Overseas Buildings Operations (OBO), and has been working on embassy rehabilitation projects in New Delhi, Beirut, Tel Aviv, and Jerusalem since 1999. Working with the International Centre for Facilities, he has developed a serviceability profile of generic user requirements for new chancery office buildings using the ASTM Standard on Whole Building Functionality and Serviceability. He has coordinated POE and serviceability input with other ongoing projects at OBO. Prior to joining the Department of State, Mr. Rosenheck was in private practice; he has worked for architectural firms and a construction firm and has taught at the School of Architecture and Urban Planning at Howard University. He holds a bachelor of architecture degree from Howard University and a master’s degree in architecture and environment-behavior studies from the University of Wisconsin-Milwaukee, and he is a licensed architect in the District of Columbia.

REFERENCES

Argyris, C. (1992a). On Organizational Learning. Cambridge, Mass: Blackwell.

Argyris, C. (1992b). Teaching smart people how to learn. In: C. Argyris (Ed.) On Organizational Learning (pp. 84-100). Cambridge, Mass: Blackwell Business.

Argyris, C., and Schon, D. (1978). Organizational Learning. Reading, Mass: Addison-Wesley.


Baird, G., Gray, J., Isaacs, N., Kernohan, D., and McIndoe, G. (Eds.). (1996). Building Evaluation Techniques. New York: McGraw-Hill.

Bechtel, R. (2000). Personal Communication.

Bordass, W., and Leaman, A. (1997). Future buildings and their services: strategic considerations for designers and clients. Building Research and Information 25(4): 190-195.


Campbell, D.T. (1999). Social Experimentation. Thousand Oaks, California: Sage Publications Inc.

Cohen, R., Bordass, W., and Leaman, A. (1996). Probe: A Method of Investigation. Harrogate, United Kingdom: CIBSE/ASHRAE Joint National Conference.


Davis, G., and Szigeti, F. (1996). Serviceability tools and methods (STM): Matching occupant requirements and facilities. In: G. Baird, J. Gray, N. Isaacs, D. Kernohan, G. McIndoe (Eds.) Building Evaluation Techniques. New York: McGraw-Hill.


Flagg, G. (1999). Study finds major flaws in San Francisco main library. American Libraries 30(9):16.

Friedmann, A., Zimring, C., and Zube, E. (1978). Environmental Design Evaluation. New York: Plenum Press.


Gregerson, J. (1997). Fee not-so-simple. Building Design and Construction (August): 30-32.


Horgen, T.H., Joroff, M.L., Porter, W.L., and Schon, D.A. (1999). Excellence by Design: Transforming Workplace and Work Practice. New York: Wiley.

Huber, G.P. (1991). Organizational learning: The contributing processes and the literature. Organization Science 2: 88-115.


Kantrowitz, M., and Farbstein, J. (1996). POE delivers for the Post Office. In G. Baird, J. Gray, N. Isaacs, D. Kernohan, G. McIndoe (Eds.) Building Evaluation Techniques. New York: McGraw-Hill.

Kernohan, D., Gray, J., and Daish, J. (1992). User Participation in Building Design and Management: A Generic Approach to Building Evaluation. Oxford: Butterworth Architecture.


Leaman, A., Cohen, R. and Jackman, P. (1995). Ventilation of office buildings: Deciding the most appropriate system. Heating and Air Conditioning (7/8): 16-18, 20, 22-24, 26-28.


McLaughlin, H. (1997). Post-occupancy evaluations: “They show us what works and what doesn’t.” Architectural Record 14.


Preiser, W.F.E. (1994). Built environment evaluation: Conceptual basis, benefits and uses. Journal of Architectural and Planning Research 11(2): 92-107.


Preiser, W.F.E., Rabinowitz, H.Z., and White, E.T. (1988). Post-Occupancy Evaluation. New York: Van Nostrand Reinhold.

Preiser, W.F.E., and Schramm, U. (1997). Building performance evaluation. In D. Watson et al. (Eds.) Time-Saver Standards (7th ed., pp. 233-238). New York: McGraw-Hill.

Raw, G. (1995). A Questionnaire for Studies of Sick Building Syndrome. BRE Report. London: Construction Research Communications.

Raw, G. (2001). Assessing occupant reaction to indoor air quality. In: J. Spengler, J. Samet, and J. McCarthey (Eds.) Indoor Air Quality Handbook. New York: McGraw-Hill.

Ripley Architects. (2000). San Francisco Public Library Post Occupancy Evaluation Final Report. San Francisco: Ripley Architects.


Schein, E.H. (1995). Learning Consortia: How to Create Parallel Learning Systems for Organization Sets (working paper). Cambridge, Mass: Society for Organizational Learning.

Schneekloth, L.H., and Shibley, R.G. (1995). Placemaking: The Art and Practice of Building Communities. New York: Wiley.

Shibley, R. (1982). Building evaluations services. Progressive Architecture 63(12): 64-67.

Shibley, R. (1985). Building evaluation in the main stream. Environment and Behavior 17(1): 7-24.


Vischer, J. (1996). Workspace Strategies: Environment as a Tool for Work. New York: Chapman and Hall.


Watson, C. (1996). Evolving design for changing values and ways of life. Paper presented at the IAPS14, Stockholm.

Watson, C. (1997). Post occupancy evaluation of buildings and equipment for use in education. Journal of the Programme On Educational Building (October).


Zeisel, J. (1975). Sociology and Architectural Design. New York: Russell Sage Foundation.

Zimring, C.M., and Reizenstein, J.E. (1981). A primer on post-occupancy evaluation. Architecture (AIA Journal) 70(13): 52-59.

Zimring, C.M., and Welch, P. (1988). Learning from 20-20 Hindsight. Progressive Architecture (July), 55-62.

Next Chapter: 6 The Role of Technology for Building Performance Assessments