
14
Thinking about Cause and Effect

How can public executives identify the causal impact of their leadership efforts when they cannot observe how these activities affect the behavior of people who directly create results?

Nothing in the quantum-mechanical theory of radioactivity explains how a subatomic particle performs its magic and makes its way through impenetrable walls.
George Greenstein, Amherst College1

To explain is to provide a mechanism, to open up the black box and show the nuts and bolts, the cogs and wheels of the internal machinery.
Jon Elster, Columbia University2

In 2006, the Scottish Executive (now called the Scottish Government) released a report on six experiments with a CitiStat strategy: “What Do We Measure and Why? An Evaluation of the CitiStat Model of Performance Management and Its Applicability to the Scottish Public Sector.”3 As I read the report, I was surprised. Public executives in Scotland seemed to understand the complex, causal behaviors and appreciate the subtle nuances of Baltimore’s CitiStat leadership strategy significantly better than many in the United States.

U.S. public executives who worked so much closer to Baltimore did copy the visible features of CitiStat: data, projectors, meetings. Yet, they often failed to figure out the underlying cause-and-effect relationships that can contribute to the potential effectiveness of this leadership strategy. Consequently, many never adopted—let alone adapted—many of the core leadership principles and key operational components that contribute to the PerformanceStat potential.

Why did these neighbors miss many of the principles and components? Why did the visitors from across the Pond get it? When I asked the people from Scotland, their answer was obvious—and explained a lot. Scotland had not merely flown a single observer to Baltimore who, during a morning visit, watched a meeting or two and got a brief briefing. Instead, Scotland had sent an entire team to spend an entire week in Baltimore. The members of the team observed a variety of CitiStat sessions. They talked with people in the mayor’s office and in city agencies. They spent five days poking around, trying to soak up what people were doing and grasp what was really going on. One member of the team even took a side trip to Somerville, Massachusetts, to learn about SomerStat.4

Most public officials who seek to create their own CitiStat do make a visit to Baltimore—a quick one. They watch a session, maybe two, get the standard briefing from a CitiStat staffer, and ask a few questions (mostly about cost—the cost to their budget, not the cost to their time). Their questions answered, they leave thinking that they fully understand this leadership strategy. They may have seen quite a lot. Yet, they may comprehend very little.

Explicit Knowledge about “What?”

Like Baltimore, most public agencies and governmental jurisdictions that have created their own PerformanceStat welcome visitors.5 But what, exactly, do these visitors learn?

What these visitors see is the room, the people, the questions and discussion, the data, the technology used to project the data. Maybe even some analysis. What, however, do they learn? Mostly, I think, these visitors learn that PerformanceStat is pretty simple. What they observe does not appear to be particularly complex. They have seen it all before: meetings, data, questions. They may have used data in their own meetings. Nothing much new here.
What may well be different, however, is the technology. This is new. The projectors used to display data, maps, and photographs are designed to get everyone’s attention—to ensure that everyone in the room is concentrating on the same information. The technology also gets the attention of visitors. This is cool. How much more technologically avant-garde can public management get?

The visibility of the technology helps explain why visitors focus their initial questions on it: “What kind of software did you buy?” “What did it cost you for all this technology?” After these easy questions about technology and cost are answered, the visitors shift their attention to operational details: “What kind of data do you collect?” “What software do you use?” “What is the frequency of your meetings?”

These “What?” questions are explicit. They demand an explicit response. Thus, most ...
