I am almost finished with The Gladstone Barrier, my second “transportation planning adventure story.” I continue to explore the field of transportation planning for story ideas. This post draws on my experience with transportation models.
Most of my career in transportation planning was spent on models and forecasts. Much of the rest of my efforts went to surveys and other data collection. Over the years I developed a view of their proper use, one that many people disagree with.
There is a conflict between what transportation planners want from models and what they can actually deliver. In the movie Coal Miner’s Daughter, or maybe it was the movie Sweet Dreams, a character said “People in Hell want ice water, that don’t mean they get it.” This is the way it is with transportation models.
It became clear to me over the years that models could not predict the future the way we hoped they could. Henry Mintzberg’s book The Rise and Fall of Strategic Planning helped me understand the complexity of this problem. In the book he focuses on business planning, but many of his suggestions apply to transportation planning as well. The article From Strategic Planning to Strategic Thinking by James L. Morrison (no relation) has a good short description of Mintzberg’s ideas.
We don’t have the luxury of waiting and seeing. We must make decisions now. But we must accept that we will often be wrong. The goal, then, is to stay aware of that possibility, recognize when we are wrong, and act in a timely manner to correct course.
The approach I strongly favour is to link forecasts to a robust monitoring program. Periodic “reality checks” will catch forecast errors sooner and allow plans to be modified. This is not a new idea. The technical manuals from the 1950s and 1960s that I have looked at all stress the importance of monitoring. They explicitly assume that models and their forecasts are fallible.
I think the main barrier to successful monitoring is that people don’t like to be exposed as wrong. They fear the personal consequences of the perception of failure. This fear is not unfounded. Many people use examples of failure as excuses to discount other people’s opinions.
Admitting that a forecast is wrong can become a political issue. Politicians work out compromises between different interest groups to reach their decisions. When a transportation forecast turns out to be wrong, some interest groups won’t get what they bargained for. This undermines the compromise and may affect the resolution of other issues. Often it is easier to deny the problem than to correct it.
I don’t like the idea that interpersonal and political issues can conflict with technical issues, but this is the reality we face.
This post is a part of a series. The other posts are:
I have several other posts that are closely related: