
I recently listened to a talk by Jamer Hunt on "The Anxious Space between Design and Ethnography." In his talk, he creates a two-dimensional space (four quadrants) to reflect the famous quote from Donald Rumsfeld about "Unknown-Unknowns." He uses this space to illustrate the overlapping territory explored by anthropology and design.

The talk stimulated me to think more generally about relations between basic laboratory research, field research, and design. I was trained in human experimental psychology in a way that emphasized experimental design and controlled laboratory research. This was motivated by a clear bias that doing controlled experiments was the way to do "real" science. However, I was also involved in the development of the aviation psychology laboratory at Ohio State University and was exposed to applied Engineering Psychology in the tradition of Paul Fitts. And much of my career has been focused on applying cognitive psychology in applied domains such as aviation, driving, and healthcare.

In light of these experiences, I have come to see most laboratory research as situated in the domain of Known-Knowns. Much of the published experimental literature consists of demonstrations of what we already know. For example, consider all the replications of Fitts' Law, all the variations on visual and memory search, or more recently the replications of 'blind sight' experiments. Laboratory research does sometimes open up insights into and fill gaps in the Known-Unknown territory, but surprise is extremely rare!

In my experience, field research pushes us further into the region of Unknown-Unknowns - increasing the potential for surprise and the potential to learn something new.

And design pushes us still further into the region of Unknown-Unknowns, increasing the potential for surprise - increasing the potential for learning and discovery. Also note that as we move deeper into the region of Unknown-Unknowns, we also move deeper into the region of Unknown-Knowns. This is the region for reflection, meta-analysis, and metaphysics. As we move into the Unknown it becomes more important to reconsider and reflect on foundational assumptions and to build theory to connect the dots and integrate across empirical experiments.

I have come to the conclusion that a mature science depends on a healthy coupling between laboratory research, field research, and design.  I believe that the ultimate test of any hypothesis or theory is its ability to motivate solutions to practical problems. I believe that paradigm shifts emerge from the coupling of research and design.

Certainly, experimental research serves a valuable function. However, if you are serious about learning and discovery, then it is important to explore beyond the laboratory, to get out of the territory of the known - to test your hypotheses and theories in practice, and to increase the potential for surprise.

Satisfying, Specifying, Affording

There is a prevailing assumption within Western cultures of a dichotomy, where mind and matter refer to fundamentally different (or ontologically distinct) kinds of phenomena that work according to different principles or laws. On the one hand there is the world of Matter, which constitutes the realm of physical phenomena and behaves according to the Laws of Physics (e.g., Laws of Motion, Laws of Thermodynamics, etc.). On the other hand there is the world of Mind, which constitutes the realm of mental phenomena and behaves according to a completely different set of laws or principles (e.g., psychological or information processing principles).

This is sometimes referred to as the Mind/Body problem - suggesting that our bodies are subject to the Laws of Physics, but that our Minds are subject to different laws related to psychology or information. In essence, we have a world of hardware and a world of software, and these two worlds are fundamentally different, operating according to different laws/principles and requiring separate sciences.

However, this belief is not universal, and it has been rejected by some notable philosophers/scientists (e.g., William James). James conceived of experience as a unified whole that included both physical (objective) and mental (subjective) constraints.

Just so, I maintain, does a given undivided portion of experience, taken in one context of associates, play the part of a knower, of a state of mind, of 'consciousness'; while in a different context the same undivided bit of experience plays the part of a thing known, of an objective 'content.' In a word, in one group it figures as a thought, in another group as a thing. And, since it can figure in both groups simultaneously we have every right to speak of it as subjective and objective, both at once. (James 1912, Essay I)

Further, James argued for a single, unified science focused on the relations that constitute the totality of experience resulting from the combined impact of physical and mental constraints. In our book, "A Meaning Processing Approach to Cognition," Fred Voorhorst and I make the case that there are three duals that are fundamental to a unified science of experience. Each dual reflects specific relations between mental (subjective) and physical (objective) constraints.

The first dual is a concept introduced by James Gibson that has gained increased acceptance from the design community. This is the construct of affordance or affording. An affordance is a potential for action that reflects properties of an agent in relation to properties of objects. The prototypical example is the affordance of graspability, which reflects the relation between an agent's effectors (e.g., hands) and the size, orientation, and shape of objects. Another example of an affordance that reflects the agency of a human-technology system is land-ability. This reflects the properties of a surface relative to the capabilities of an aircraft. Different aircraft (fixed wing versus rotary wing) are capable of landing successfully on different types of surfaces. In control theoretic terms, affordance is closely related to the concept of controllability - reflecting the joint constraints of the agent and the situation on acting.

The second dual is a concept that is related to James Gibson's concept of information (e.g., the optical array) and that was the focus of Eleanor Gibson's research on perceptual learning and attunement.  We refer to this dual as specificity or specifying. This refers to the relation between the sensory/perceptual capabilities or perspicacity of an agent and the structure available in the physical medium (e.g., the light, acoustics, tactile properties, or the properties of a graphical interface). For example, due to the angular projection of light, the motions of an observer relative to the surfaces in the environment are well specified in the optical array (e.g., the imminence of collision, or the angle of approach to a runway). In control theoretic terms, specificity is closely related to the construct of observability - reflecting the quality of the feedback with respect to specifying the states of a process being controlled. E.J. Gibson's work on perceptual learning reflects the insight that it typically takes experience for organisms to discover and attend to those structures in the medium that are discriminating or diagnostic with respect to different functions or intentions. For example, pilots must learn to pick up the optical features that specify a safe approach to landing.
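
A classic concrete example of this kind of specification is Lee's 'tau': under a roughly constant closing velocity, the time to contact with a surface is specified by the ratio of the optical angle the surface subtends to its rate of expansion - no independent knowledge of the surface's size, its distance, or the closing speed is required. The toy Python sketch below is purely illustrative (the scenario and numbers are arbitrary, not drawn from any of the studies mentioned here):

```python
# Illustrative sketch: time to contact specified directly by optical expansion.
# For an object of size S at distance Z approaching at constant speed v, the optical
# angle is roughly theta = S / Z (small-angle approximation), and
# tau = theta / (d theta / dt) = Z / v, i.e., time to contact -- available from the
# optical array without knowing S, Z, or v individually.

import numpy as np

S = 2.0          # object size (m) -- unknown to the observer
v = 10.0         # closing speed (m/s) -- unknown to the observer
Z0 = 50.0        # initial distance (m) -- unknown to the observer
dt = 0.01        # sampling interval (s)

t = np.arange(0.0, 3.0, dt)
Z = Z0 - v * t                   # distance over time
theta = S / Z                    # optical angle (small-angle approximation)
theta_dot = np.gradient(theta, dt)

tau = theta / theta_dot          # time-to-contact estimate from optics alone
true_ttc = Z / v                 # ground truth, computed from the 'world' variables

print(f"tau estimate at t=1s: {tau[100]:.2f} s, true time to contact: {true_ttc[100]:.2f} s")
```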

The third dual is a concept that James Gibson included in his definition of affordance, but one that we feel is better isolated as a separate dual. We refer to this dual as satisfaction or satisfying. This refers to the relation between the consequences of an action and the intentions/desires or health of an actor. If the consequences are consistent with the desires of an actor or if they are healthy, then the relation is satisfying. If the consequences are counter to the desires or unhealthy, then they are unsatisfying. For example, a food that is healthy or a safe aircraft landing would be satisfying. But a food that is poisonous or a crash landing would be unsatisfying. In control theoretic terms, satisfaction is closely related to the cost function that might be used to determine the quality or optimality of a control solution. In essence, the satisfying dimension reflects the quality of outcomes.

We see these three duals as essential properties of any control system or sensemaking organization. That is, the quality of control, skill, situation awareness, or of sensemaking will depend on whether the affordances (action capabilities) and potential consequences are well specified by the available feedback, which in turn should enable the development of quality (if not optimal) control/decision making that reduces surprises and risks, and increases the potential for satisfying outcomes.
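
To make the control-theoretic analogues a bit more concrete, here is a toy sketch in Python. The system matrices and cost weights are arbitrary illustrations (a generic linear system, not a model of any particular task); it simply checks the standard rank conditions for controllability and observability and evaluates a quadratic cost - loosely mirroring affording, specifying, and satisfying:

```python
# Toy linear system x' = Ax + Bu, y = Cx (illustrative numbers only).
import numpy as np

A = np.array([[0.0, 1.0],
              [0.0, -0.5]])        # plant dynamics
B = np.array([[0.0],
              [1.0]])              # how actions enter the dynamics (affording)
C = np.array([[1.0, 0.0]])         # what the sensors reveal about the state (specifying)

n = A.shape[0]

# Controllability matrix [B, AB, ..., A^(n-1)B]: full rank <=> every state is reachable.
ctrb = np.hstack([np.linalg.matrix_power(A, k) @ B for k in range(n)])
# Observability matrix [C; CA; ...; CA^(n-1)]: full rank <=> the state can be inferred from outputs.
obsv = np.vstack([C @ np.linalg.matrix_power(A, k) for k in range(n)])

print("controllable:", np.linalg.matrix_rank(ctrb) == n)   # affording: can we act on every state?
print("observable: ", np.linalg.matrix_rank(obsv) == n)    # specifying: does feedback reveal the state?

# A quadratic cost on state error and effort plays the role of the satisfying dimension.
Q, R = np.eye(n), np.array([[0.1]])
x_err, u = np.array([1.0, 0.0]), np.array([0.5])
cost = x_err @ Q @ x_err + u @ R @ u
print("cost of this (state error, action) pair:", float(cost))
```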

Lindblom (1979) wrote:

Perhaps at this stage in the study and practice of policy making the most common view... is that indeed no more than small or incremental steps - no more than muddling - is ordinarily possible. But most people, including many policy makers, want to separate the 'ought' from the 'is.' They think we should try to do better. So do I. What remains at issue, then? It can be clearly put. Many critics of incrementalism believe that doing better usually means turning away from incrementalism. Incrementalists believe that for complex problem solving it usually means practicing incrementalism more skillfully and turning away from it only rarely.

I am not sure that today is all that different. I think many people still see muddling (incrementalism) as a flawed approach to problem solving and decision making. It still seems to have a negative connotation and there is an implication that there is a better way to manage organizations or a better way to make decisions.

I think people equate muddling with a confused, aimless flailing. They have no concept of skilled muddling. However, for me skilled muddling is an apt description of how experts deal with complexity and uncertainty. It is not a confused, aimless process, but a carefully controlled combination of probing the world and utilizing the resulting feedback to make tentative steps in potentially satisfying directions.

The image I have is of skilled rock climbers - who are well tuned to the opportunities that different holds offer and who have a good understanding of the risks. They have developed strength and agility, so that they have a wide range of affordances in terms of what is reachable and graspable. They are careful about ensuring that they have appropriate protection to guard against catastrophic falls. They are constantly establishing a firm base of support before reaching for the next handhold or foothold. And they are well-tuned to the information that allows them to identify the best supports and the viable paths toward their goal.

I see many parallels between muddling and Peirce's concept of abduction - in which experts use their experience to make tentative hypotheses about the consequences of choices and actions, test their hypotheses by acting on them, and then utilize the feedback to adjust their assumptions and framing to reduce surprise (error). Further, I see this in terms of a closed-loop triadic semiotic process with three sources of constraint. Each of these sources reflects relations between agents and ecologies.

Satisfying, Specifying, Affording

The first of these relations is the now familiar construct of affordance, which represents the potential for action. It reflects the relation between the effectivities of an agent (or organization) and the properties of objects in the ecology. In control systems terms this reflects the constraints of the plant dynamics in the forward loop.

The other two constructs are less familiar. The second construct, specifying, reflects the potential for perception or information pick-up. It reflects the relation between the sensor systems (e.g., eyes) of agents and structure in the related ecological mediums (e.g., optical array, or graphical interface). In control systems terms this reflects the constraints on the sensors in the feedback loops.

The third construct, satisfying, reflects the functional constraints on performance. This reflects the relation between the intentions, values, or preferences of agents and the actual consequences of particular choices or actions. In control terms this reflects the comparator where intentions are compared to feedback - resulting in a surprise or error signal.
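
The sketch below is a deliberately minimal caricature of this mapping - a toy first-order tracking loop of my own construction, not a model from the book - in which the three constructs are attached to the forward path, the feedback path, and the comparator:

```python
# Minimal closed-loop sketch (illustrative only): a first-order plant tracking a goal.
# Forward path  -> affording   (what actions can do to the world)
# Feedback path -> specifying  (what the sensed signal reveals about the world)
# Comparator    -> satisfying  (how consequences compare to intentions)

def plant(state, action, dt=0.1):
    """Affording: the action changes the state, subject to the plant dynamics."""
    return state + dt * action

def sensor(state, noise=0.0):
    """Specifying: feedback about the state (here perfect; noise would degrade specificity)."""
    return state + noise

def comparator(intention, observation):
    """Satisfying: surprise/error is the mismatch between intention and observed consequence."""
    return intention - observation

intention, state, gain = 1.0, 0.0, 0.8
for step in range(50):
    observation = sensor(state)
    error = comparator(intention, observation)   # surprise signal
    action = gain * error                        # simple proportional control
    state = plant(state, action)

print(f"final state: {state:.3f} (intention was {intention})")
```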

These three components are ontologically independent. That is, an affordance exists independently of the information to specify it and of the intentions of agents. Thus, a cup is graspable even though it is out of sight and you don't have a current use for it.

However, these three constructs are tightly coupled with respect to epistemology. Thus, for example, our experiences of affordances depend on how well they are specified, and on whether they help us to satisfy our intentions. You cannot actually grasp the cup unless you can sense it (e.g., it is in sight), and you are unlikely to grasp it if you don't have a use for it (e.g., to hold a liquid).

Thus, skilled muddling involves tuning the coupling of agents with their ecologies or situations - increasing the range of affordances, tuning the perceptual system, and aligning intentions with the activities that lead to satisfying consequences. The opening figure illustrates how these dimensions map onto Rasmussen's intuitions about the design of cognitive systems that are capable of skilled muddling.

These ideas are developed more extensively in our book "A Meaning Processing Approach to Cognition" (Flach & Voorhorst, 2020).


Most organisms and essentially all organizations have multiple layers of interconnected perception-action loops. And it will generally be true that the loops operate at different time constants, have differential access to information, and have differential levels of authority. One way to think about the coupling across authority levels is that higher levels set the bounds (or the degrees of freedom) for activity at lower levels. In this context, a key attribute of an organization is the tightness of the couplings between different levels.

For example, the classic Scientific Management approach involves a very tight coupling across levels. With this approach, higher levels (e.g., management) are responsible for determining the 'right way' to work and for implementing training and reward systems to ensure that the lower levels (operators - workers) strictly adhere to the prescribed methods. This is an extremely tight coupling, leaving workers very few degrees of freedom, as they are typically punished for deviating from the "one best way" prescribed by management. In essence, the organization functions as a clockwork mechanism. This approach can be successful in static, predictable environments, where the assumptions and models of managers are valid. However, in a changing, complex world (e.g., VUCA environments) this approach will be extremely brittle - because the cycle times for the upper levels of the system will always be too slow to keep pace with the changing environment. This approach will surely fail in any highly competitive environment, with intelligent, adaptive competitors.

One way to loosen the coupling between levels is to replace the "one best way" with a playbook, or a collection of smart mechanisms that are designed to address different situations that the organization might face. Typically, higher levels develop the playbook and are responsible for training the plays and for rewarding/punishing workers as a function of how well they select the right play for the situation and implement that play.

The coupling can be relatively tight if the playbook is treated as a prescription such that any deviations or variations from the plays are punished, especially when negative results ensue. Or the coupling can be looser if the plays are treated as suggestions, deviations (e.g., situated adaptations) are expected or even encouraged, and the lower levels are not punished for unsuccessful adaptations.

A still looser coupling is an approach of Mission Command. With Mission Command higher levels in the organization set objectives and values, but they leave it to lower levels of the organization to determine how best to achieve those objectives given changing situations. This approach requires high levels of trust across levels in an organization. Higher levels have to trust in the competence and motivations of the lower levels, and lower levels have to trust that higher levels will have their backs so they will not be blamed or punished when well-motivated adaptations are not successful.

Also, Elinor Ostrom's construct of polycentric governance describes how loosely coupled social systems self-organize to manage shared resources in a way that avoids the tragedy of the commons. Note that Ostrom's work contradicts the common notion that top-down control is essential to prevent competition between local interests from resulting in complete exhaustion of a resource. Her research discovered many situations where coordination and cooperation emerged bottom-up without being imposed by a centralized, external authority.

These are simply some examples of different levels of tightness in the couplings between levels, intended as landmarks within a continuum of possibilities. On one end of the continuum are tightly coupled organizations where the couplings across levels are like the meshing of gears in a clock. As couplings become looser the organization becomes less machine-like and more organismic. Metaphors that are often used for looser couplings include coaching and gardening. In organizations with looser couplings higher levels introduce constraints (e.g., guidance, suggestions, resources), but don't determine activities. The higher levels tend the garden, leaving the plants (workers) free to grow on their own.

In an increasingly complex world - organizations with tight couplings across levels will tend to be brittle and vulnerable to catastrophic failures due to an inability to keep pace with changing situations. When couplings across levels are looser, the potential for bottom-up innovation and self-organization increases. However, if the coupling is too loose - there is an increasing danger of inefficiencies, conflict, and chaos. Thus, organizations must continuously balance the tightness of the couplings, trading off top-down authority and control to increase the capacity for exploration and innovation (adaptive self-organization) at the operational level. In a complex world, loosening the couplings can allow the organization to muddle more skillfully.

At the heart of any consideration of questions of controllability and resilience is the question of degrees of freedom.  This was a central question of concern for Nikolai Bernstein in his explorations of perceptual-motor skill. In the context of motor skills, degrees of freedom refer to the constraints/possibilities for moving our bodies. But more generally - one might think of the degrees of freedom in a system as setting bounds on the potential for action.  When degrees of freedom are high - then there will be many different ways to accomplish the same task or goal.

Generally, the more degrees of freedom a system has - the more resilient it can be. This is because when one route to a goal is blocked, there will be other paths for reaching that goal.  However, degrees of freedom pose a challenge for control - because the more degrees of freedom a system has - the more variables have to be taken into account in the control logic.  In essence, increasing the degrees of freedom increases the potential for disturbances that have to be managed by the controller.

So, the degrees of freedom problem refers to a trade-off between flexibility (resilience potential) and control complexity. More degrees of freedom means there are more ways to be successful in accomplishing a goal, but also more ways to fail. In the context of perceptual motor control, the constructs of coordinative structure and smart mechanism were introduced as hypotheses of how to effectively manage this trade-off.

A coordinative structure or smart mechanism is created by constraining or locking out a subset of the degrees of freedom to create a simpler 'mechanism' (less complex control problem) that is specialized for a particular task or situation. This is somewhat analogous to the smart heuristics described by Gerd Gigerenzer in relation to ecological rationality.

The skill of golf provides a good example of how skilled athletes manage the degrees of freedom problem. Trying to drive a golf ball is a difficult control problem because of the many degrees of freedom in the body. For example, there are many potential disturbances to the path of the club head during a swing - a bend at the elbow of the leading arm, a twist of the head or shoulder, the positions of the legs, etc. It is simply impossible for a person to control all of the different degrees of freedom in real time. So, learning to drive a golf ball consistently involves learning to lock out many of the degrees of freedom (e.g., locking the elbow of the leading arm, holding the head in a fixed position with eyes on the ball, etc.). Thus, pro golfers learn to become a simple 'driving machine' that involves real-time control of only a few degrees of freedom that are specifically chosen to generate power to hit the ball great distances in a specific direction.

So, what good is having a lot of degrees of freedom if we are going to lock them out? The high degrees of freedom allow golfers to become many different kinds of simple mechanisms. They can lock out a different set of degrees of freedom to become a chipping mechanism, designed to hit a ball with greater control over distance and direction. Or they can lock out another set of degrees of freedom to become a putting mechanism. Thus, a pro golfer will learn to become many different simple control mechanisms that are specialized for different situations they will face on a golf course. Successful golfers not only have different types of golf clubs - but also have specialized control mechanisms that have been 'designed' through extensive practice for specific situations.
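
As a toy illustration of this point - a sketch of my own, with an arbitrary seven-joint planar chain standing in for the body - 'locking out' different subsets of joints leaves different low-dimensional mechanisms, each with only a few variables to control in real time:

```python
# Toy planar arm with many joints (illustrative only). Freezing a subset of joints
# turns a high-dimensional control problem into a simpler, task-specific 'mechanism'.
import numpy as np

link_lengths = np.ones(7)   # a seven-degree-of-freedom chain (arbitrary choice)

def end_effector(joint_angles):
    """Forward kinematics: position of the chain's tip given all joint angles."""
    angles = np.cumsum(joint_angles)
    return np.array([np.sum(link_lengths * np.cos(angles)),
                     np.sum(link_lengths * np.sin(angles))])

def make_mechanism(posture, free_indices):
    """'Lock out' all joints except free_indices, returning a low-dimensional mechanism."""
    def mechanism(free_angles):
        angles = posture.copy()
        angles[free_indices] = free_angles   # only a few variables remain to be controlled
        return end_effector(angles)
    return mechanism

posture = np.zeros(7)   # the frozen configuration for the locked-out joints

# A 'driving' mechanism: only two proximal joints left free (gross power and direction).
drive = make_mechanism(posture, free_indices=[0, 1])
# A 'putting' mechanism: a different pair of joints left free (fine adjustment at the tip).
putt = make_mechanism(posture, free_indices=[5, 6])

print("drive tip:", drive(np.array([0.3, -0.1])))
print("putt tip: ", putt(np.array([0.05, 0.02])))
```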

Thus, to be resilient it is desirable for organizations to maximize degrees of freedom to increase the space of potential possibilities or to increase the possible routes to satisfying ends. However, organizations also need to develop (train) smart mechanisms tuned to the demands of different situations - so that they will not be overwhelmed by the need to control too many variables in real time. And they have to learn to choose the right mechanism for the right situation. Just as smart heuristics reduce the computational burden while maintaining high levels of effectiveness with regard to decision making, smart mechanisms reduce the computational demands on control while maintaining high levels of effectiveness with respect to coordinated action.

Have you ever been in a situation where you are looking for one person, and another person, whom you didn't expect, approaches and remarks, "Aren't you going to say hi?"

To which you reply, "I didn't even see you."

And they respond, "But you were looking right at me!"

Many years ago, Ulric Neisser illustrated this phenomenon experimentally, and it has since been replicated in numerous experiments in which salient stimuli (e.g., people carrying umbrellas, gorillas, etc.) in the field of view are not 'seen.'

For me, this illustrates the intimate coupling between sensation, perception, cognition, and action - or, using the OODA Loop framework, between Observing, Orienting, Deciding, and Acting.

I wonder if the use of representations that illustrate the different functions as if they are separate, isolated stages of processing might lead to misconceptions about the nature of human experience. Frankly, I believe that these are not independent stages of processing. Each stage blends (harmonizes) with the other stages such that the dynamics within each function are in part shaped by what is going on in the other stages. So - what we observe is shaped by our orientation (or framing of the problem), by the options we are considering, and by our capacity for action, while simultaneously what we are observing is shaping the other functions.
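
As a caricature of just one of these couplings - a toy sketch of my own, not a model of the OODA loop - consider how observation filtered through a current orientation can leave salient things in view unseen, echoing the anecdote above:

```python
# Toy sketch (illustrative only): observation is not a neutral pickup of everything in view;
# what gets 'seen' is filtered by the current orientation (expectations, goals).

def observe(world, orientation):
    """Pick up only those items that fit the current framing."""
    return {item for item in world if item in orientation}

world = {"friend_A", "friend_B", "gorilla"}   # what is actually in view
orientation = {"friend_A"}                    # looking for one particular person

seen = observe(world, orientation)
print("in view:", sorted(world))
print("seen:   ", sorted(seen))               # friend_B and the gorilla go unnoticed
```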

A simple example is research by Dennis Proffitt that shows that our perception of the steepness of a slope is different depending on the weight we are carrying in our backpack.

The bottom line is that when we break human information processing into a sequence of discrete stages, we are likely to lose sight of some of the relations from which human experience emerges. The functions are not like a sequence of dominoes, but more like a barbershop quartet, in which each voice (i.e., function) blends with the others to create a sound that has unique emergent properties.

I wonder whether, if we redrew the OODA loop as overlapping circles that blend together to produce our experiences, it would lead to different intuitions about the nature of cognition and expertise. What do you think? Does it strike a chord with you?

It seems to me that much of the hype about Artificial Intelligence (AI) reflects the potential power of AI to control things that had traditionally been controlled by humans.  If not replacing humans, the expectation is that humans will be displaced from lower levels of control to higher supervisory levels. Thus, the AI systems are framed to be autonomous control systems (e.g., autopilots).

I think this vision might be reasonable for simple and maybe even complicated domains of operation. However, I think this is the wrong vision when thinking about applying AI to complex or chaotic domains of operation. An alternative that I suggest for applications in domains that are more than complicated is to think about AI as being analogous to a telescope or microscope. That is, the function of AI is to enhance observability.  The function is to use the power of advanced computations and statistics to pull patterns out of data that humans could not otherwise perceive.

In this context, the control problem is framed as a joint cognitive system (rather than as an autonomous system). The role of AI in this joint cognitive system is to enhance observability to shape how humans frame decisions. In terms of Boyd's OODA loop framework, the value of AI is to enrich Observations in ways that constructively shape how humans Orient to a problem or frame the Decision processes.  Thus, humans are engaged and empowered (not displaced), and the ultimate quality of control is enhanced.


I was listening to the No Way Out podcast, hosted by Brian Rivera and Mark McGrath, talking to Johan Ivari and Annette Nolan about teamwork. They briefly mentioned the distinction between Synchronization and Harmonization, and a light bulb went off.

I have long been interested in coordination, both in relation to perceptual motor skill and in relation to teamwork and polycentric control. However, in thinking about coordination I tended to think primarily about the need to synchronize activity. In terms of motor skills, this was about synchronizing muscle assemblies, and in organizations, it was about synchronizing actions across and within teams. But I had not thought to frame coordination in terms of harmonization, though it was obvious that high-functioning teams needed to 'blend' their skills together in ways not fully captured by the term synchronization.

Synchronization emphasizes the timing of activities. For example, in making a pass to a teammate (e.g., in basketball, soccer, or American football) the release and pace of the ball has to be in synch with the motions of the receiver.

Harmonization, however, is not just about timing, but it is about blending diverse individuals to create a satisfying result. For example, this might reflect the ability of teammates to move together to create the opportunity for a successful pass. In group problem solving - success can often depend on the ability of people to share diverse experiences in order to discover/create an innovative solution. Harmonization seems to be consistent with other descriptions of team functioning such as 'building common ground' or 'organizational sensemaking.'

Now in thinking about polycentric control I suggest that synchronization is a necessary, but not sufficient requirement for successful coordination. In addition, effective polycentric control requires harmonization. That is, it requires that individuals blend their skills and their diverse perspectives in ways that complement the skills and perspectives of their teammates AND that satisfy the demands of their domains of operation.

To be clear, harmonization is not an answer. Its practical meaning can only be defined relative to the demands of situations or domains of operation. Rather, it is a subtle, but perhaps important, shift in how to frame questions about coordination. Maybe this is just word play - but I find comfort in having a word that fills in some of the gaps with respect to understanding how teams work.

 

What must be admitted is that the definite images of traditional psychology form but the very smallest part of our minds as they actually live. The traditional psychology talks like one who should say a river consists of nothing but pailsful, spoonsful, quartpotsful, barrelsful and other moulded forms of water. Even were the pails and the pots all actually standing in the stream, still between them the free water would continue to flow. It is just this free water of consciousness that psychologists resolutely overlook. Every definite image in the mind is steeped and dyed in the free water that flows around it. With it goes the sense of its relations, near and remote, the dying echo of whence it came to us, the dawning sense of whither it is to lead. The significance, the value, of the image is all in this halo or penumbra that surrounds and escorts it, or rather that is fused into one with it and has become bone of its bone and flesh of its flesh; leaving it, it is true, an image of the same thing it was before, but making it an image of that thing newly taken and freshly understood.  William James (1890, p. 255)

When talking about cognitive experiences people often refer to a sense of 'flow.' This concept does not fit easily into conventional cause-effect narratives based on billiard-ball collision (or string-of-dominoes) metaphors. It suggests the need for a new narrative. I wonder whether it would be possible to follow the lead of physics and consider the possibility of a new narrative where instead of talking about causes - we talk about constraints; and instead of talking about effects - we talk about possibilities.

The physicist John Wheeler describes the motivation that led physicists to adopt a field narrative:

It is to Aristotle, working in the fourth century B.C., that we owe the popular maxim that ‘nature abhors a vacuum’. It is more accurate to say that people abhor a vacuum. Newton called it an absurdity. Scientists ever since have developed our picture of nature in terms of what I may call ‘local action’, to distinguish it from ‘action at a distance’. The idea of local action rests on the existence of ‘fields’ that transmit action from one place to another. The Sun, for instance, can be said to create a gravitational field, which spreads outward through space, its intensity diminishing as the inverse square of the distance from the Sun. Earth ‘feels’ this gravitational field locally – right where Earth is – and reacts to it by accelerating toward the Sun. The Sun, according to this description, sends its attractive message to Earth via a field rather than reaching out to influence Earth at a distance through empty space. Earth doesn’t have to ‘know’ that there is a sun out there, 93 million miles distant. It only ‘knows’ that there is a gravitational field at its own location. The field, although nearly as ethereal as the ether itself, can be said to have physical reality. It occupies space. It contains energy. Its presence eliminates a true vacuum. We must then be content to define the vacuum of everyday discourse as a region free of matter, but not free of field.

Also, Richard Feynman explains the power of the field construct for tracking dynamic relations that extend over space and time:

It [the field construct] would be trivial, just another way of writing the same thing, if the laws of force were simple, but the laws of force are so complicated that it turns out that fields have a reality that is almost independent of the objects which create them. One can do something like shake a charge and produce an effect, a field, at a distance; if one then stops moving the charge, the field keeps track of all the past, because the interaction between two particles is not instantaneous. It is desirable to have some way to remember what happened previously. If the force upon some charge depends upon where another charge was yesterday, which it does, then we need machinery to keep track of what went on yesterday, and that is the character of a field. So when the forces get more complicated, the field becomes more and more real, and this technique becomes less and less of an artificial separation.

Would it be useful to frame the coupling of perception and action in terms of fields of constraint - and to consider how the constraints on information (e.g., Gibson's optical flow fields) specify the possibilities for action (e.g., the safe field of travel) [see Gibson and Crooks, 1938]?

Would it be useful to think about event trajectories in terms of fields of possibilities analogous to Minkowski's light cones? However, in thinking of the possibilities for a cognitive system one must consider both the constraints going forward from the present AND the constraints extending backward from a goal or end - in order to show the possible paths (or means) to that end. Thus, an event might look something like this, where the constraints on perception and action limit the paths from where you are (now) to where you are striving to reach (intention).
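
One toy way to picture this intersection of forward and backward constraints - a one-dimensional caricature of my own, with arbitrary numbers - is to compute the set of states reachable forward from 'now', the set from which the goal can still be reached in the time remaining, and take their intersection as the field of possibilities at each moment:

```python
# Toy sketch (illustrative only): forward and backward 'cones' of possibility on a line.
# An agent at position 0 can move at most one unit per step; the goal is at position 4,
# with a deadline of 6 steps.

start, goal, horizon, max_step = 0, 4, 6, 1

def forward_cone(t):
    """Positions reachable from the starting point after t steps."""
    return set(range(start - max_step * t, start + max_step * t + 1))

def backward_cone(t):
    """Positions from which the goal can still be reached in the steps remaining."""
    remaining = horizon - t
    return set(range(goal - max_step * remaining, goal + max_step * remaining + 1))

for t in range(horizon + 1):
    field = sorted(forward_cone(t) & backward_cone(t))
    print(f"t={t}: viable positions {field}")
```

The viable region narrows as time passes, which is the sense in which the constraints from the present and from the intended end jointly bound the possible paths.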

What do you think?  What would a field theory of cognition look like? Is it possible for cognitive science to escape from classical Newtonian narratives and to follow physics into a dynamic world that flows? Would it be a step forward?

For a deeper dive into this see: Flach, Dekker, & Stappers (2007). Playing twenty questions with nature (the surprise version): reflection on the dynamics of experience. Theoretical Issues in Ergonomics Science, 9(2), 125-154.


As the classical story about the blind men and the elephant illustrates - the complexity of natural systems generally exceeds our grasp.  That is, from any one perspective we can only grasp a part of the elephant. Thus, to get a sense of the whole elephant it is necessary to walk around to explore each of the parts and then to mentally stitch the parts together to get a sense of the whole elephant. The different possible positions of the blind men are analogous to the different disciplines in science. Each discipline owns a part of the elephant and the challenge is to combine the various perspectives into a coherent understanding of the elephant.

However, for those who can see, there is another approach for getting a sense of the whole elephant.  We can move away, increasing our distance from the elephant. As we move farther away, we can see less detail, but we can now see relations among the parts that were not visible from up close.  Still, even when the whole elephant is visible, we can only see one side at a time. So, there is no single perspective that allows us to see the whole elephant. There remains the need for some cognitive work to stitch the different perspectives together into a complete understanding of the whole.

Thus, there are two ways that we can change perspectives in exploring the elephant. One way, illustrated by the blind men is through aggregation of the parts. The second way, available to those with eyes, is to change perspective through increasing distance from the elephant. This is a metaphor for abstraction.

One of the key insights of Jens Rasmussen is that to make sense of complex organizations or complex work domains it is necessary to explore through both decomposition/aggregation and through abstraction. Further, he suggests, based on observations of experts troubleshooting faults in complex technologies, that some regions in the abstraction/aggregation space are privileged for thinking productively about complex systems. In particular, he suggests that at high levels of abstraction the details become less important. For example, there is limited return from reducing a functional purpose into goals, then further reducing them into sub-goals, which can be further divided into sub-sub-goals. On the other hand, at low levels of abstraction details become more and more important. For example, the shapes of the different components have to fit together in the space available. The threads of one part have to mesh with the threads of another part. The diagrams below are intended to illustrate a space for exploring complex domains jointly through abstraction and decomposition/aggregation.

The key point is that to better understand the whole elephant we have to use multiple senses and take multiple perspectives.  John Boyd uses the analogy of building a snowmobile to illustrate the importance of combining an analysis mindset (e.g., exploring the components of different vehicles - motors, steering mechanisms, traction), and a synthesis mindset (e.g., recombining those components to satisfy a different purpose - driving over snow). Decomposition/aggregation reflects an analysis mindset, and abstraction reflects a synthesis mindset. The hypothesis is that to think productively about complex problems one needs to be able to move along the diagonal in this abstraction/decomposition space.