Martin Wyler | Wyler Safety Consulting GmbH https://www.martin-wyler.ch failure is no option Sat, 23 Nov 2024 02:18:39 +0100
news-91 Sun, 04 Feb 2024 16:30:00 +0100 The runway extensions - a beacon of light for safety https://www.martin-wyler.ch/en/blog/detail/die-pistenverlaengerungen-lichtblick-fuer-die-sicherheit/ On March 3rd, the people of the Canton of Zurich will vote on the runway extensions. They are being asked to take responsibility for the safety of the airport. As Swiss citizens, we are used to setting the course with our ballots in our direct democracy. We also know that every vote has consequences that we are prepared to bear. Before we write a yes or no on the ballot paper this time, we would be well advised to ask ourselves whether we can live with it if an air accident occurs at Zurich Airport in the future and the authorities' accident report cites the complexity of Zurich Airport as a contributory cause.

The main reason and trigger for the runway extensions are the eight major risks outlined in a safety review in 2012. These are the risks that air traffic controllers, pilots and all operating personnel at Zurich Airport have to contend with on a daily basis. They have to operate in a system that is far away from a state of systemically anchored safety. The 'traps' are set in eight places every day and have to be actively avoided by fallible human beings. The unpleasant thing about working in aviation is that the people involved can find themselves, within seconds, in a situation that overwhelms them. If you want to get an idea of this, I recommend reading the SUST reports on incidents at Zurich Airport in recent years. Some were so serious that even the state got involved and air traffic controllers were convicted. As if this would make the airport system safer. It takes courage to rely on the infallibility of the players in such a complex system. If you follow the discussions about self-driving cars, you realize that completely different demands are placed on systemic safety there. If we had the opportunity to use our vote to ensure that only a system that safely limits the fallibility of drivers is approved, the choice between yes and no would be quickly made.

The danger of complexity

But Zurich Airport is far away from this. The first of the eight top hazards in the aforementioned safety review reads: "Reduced margin of error due to high operational complexity". This means nothing other than that the margin of error for those involved is small and does not correspond to the norm. Does anyone today really still believe that such a narrow buffer against an accident will never be used up by the thousands of people in the cockpit and at the radar console, 365 days a year, in all weather conditions, for decades? The first top hazard alone is reason enough to take action. Anyone who still has the courage to tackle the other seven risks needs strong nerves.

That said, it never ceases to amaze me how fervently politicians proclaim that Zurich Airport is safe. How do they manage, without any shame, to blank out the two serious accidents with many fatalities in the 2000s? I am almost certain that they would feel uncomfortable boarding an airplane if they had read the eight top hazards at Zurich Airport beforehand.

What does 'safety' mean?

The absence of accidents in the immediate past does not mean safety. Safety is not a state. If it were, it would be comparable to the state of two balls balanced one on top of the other. Without us continuously stabilizing them with supporting influences, the upper one would fall down. Safety can be defined as an ongoing process of risk mitigation. We can never say that something is safe. We can only say whether what we do has a positive or negative impact on safety.

Taking responsibility for safety

With this vote, we will probably have only one opportunity in our lifetime to take responsibility for safety at Zurich Airport. After all, infrastructure expansions are a costly business, and the approval process takes many years. We are dealing with an enormously sluggish system that lacks the agility needed to maintain the safety of such a complex structure. Working on the system has never been easy. But it is becoming increasingly important, because we humans have managed to build socio-technical systems that we can barely understand, let alone really control. In this respect, the runway extensions are both a ray of hope and an opportunity.

Of course, you can trust in providence and reject the runway extensions on March 3rd. But that will not exempt you from bearing part of the responsibility for future accidents at Zurich Airport.

News
news-89 Sat, 24 Dec 2022 14:00:00 +0100 Patrouille Suisse: A guilty verdict out of time https://www.martin-wyler.ch/en/blog/detail/patrouille-suisse-ein-gerichtsurteil-aus-einer-laengst-vergangenen-zeit/ On December 22, 2022, a Patrouille Suisse pilot was fined for clipping a colleague's aircraft during a joining maneuver in training in 2016. His own aircraft was damaged to such an extent that he had to eject. His colleague was able to land his aircraft, which was also damaged, safely. The prosecutor charged the pilot with negligent misuse and squandering of equipment, negligent interference with public traffic, and multiple counts of negligent failure to comply with official regulations. The verdict belongs to a long-gone era and is detrimental to the cause. Why?

The pilot was convicted of negligent misuse and squandering of material. The court dropped the other charges. However, for aviation and ultimately for all other high-risk organizations in Switzerland, the fact that the court found culpable conduct in this incident is significant. This is true even though the present guilty verdict comes from a military court, because it joins the unpleasant decisions of various civilian courts (up to the Federal Supreme Court) that punished air traffic controllers for misconduct that was neither intentional nor grossly negligent. In the present case, too, there is no trace of gross negligence or intent. This judgment thus also reveals that Swiss courts have difficulty even remotely understanding the meaning of human action in highly complex systems and in extremely specific situations, and that they (have to) rely on laws that were written when the world still looked very different.

Whoever makes mistakes will be punished. A paradigm that causes collateral damage.

The consequences of these guilty verdicts are significant for the organizations involved. On the one hand, they cause employees and leaders to avoid any exposure, to stop taking responsibility, and to do only exactly what they are told. Such verdicts thus promote the unfortunate compliance orientation and cover-my-ass strategy that drives personal responsibility out of the company and leaves a team in underdog mode. Germany sends its regards. Such a culture is the beginning of the end for an air force. An air force must be able to count on people who, under extreme stress, are willing to make purposeful decisions. People who do not ask their superiors for advice at the slightest uncertainty. A superior who, nota bene, sits in the operations center and has no situationally relevant information.

The ambiguous approach to risks

And on the other hand, guilty verdicts such as the present one leave a stale aftertaste. After all, display flying is a dangerous business per se. But one that the state not only tolerates, but supports for understandable reasons. If an incident then occurs, an individual is punished, and the whole context, and with it the desire for formation displays, suddenly no longer matters. This is an indecent oversimplification. Or do you punish your children if you have encouraged them to bake a cake themselves and they then burn their fingers on the stove? The guilty verdict just handed down exemplifies society's ambiguity in dealing with risks. The system has found a victim, but the games are to continue. Is this still honest?

Responsibility in high-risk organizations

As for context: in the present case, the accused pilot was the most junior member of the team. He had been retrained from the F/A-18 to the F-5 to be able to fly in the Patrouille Suisse; his experience on the F-5 'Tiger' was correspondingly limited. In 2016, the demonstration program included new joining maneuvers for him (number three) and for number two in the formation, maneuvers that in the past had not been expected of the newcomers. The joining maneuvers into a formation are by far the most demanding from a flying point of view. In such a context, it is not surprising that even a hardened Hornet pilot can be momentarily overwhelmed. And anyone who is not moved by this thought has no idea what it feels like when, in a descending turn under acceleration, you turn your head all the way around to maintain visual contact while trying to fit into the formation. Anyone who then insists that you must keep to the regulations, which demand a minimum separation of three meters at all times, reveals himself to be an administrator and compliance zealot who is out of touch with reality. This inability of the court to put itself in the situation was palpable in the courtroom during the questioning of the witnesses. It is not that the judges did not make a good-faith effort to understand the situation. But how do you explain to a layman in ten minutes a discipline that took experts several years to learn? It was downright grotesque. And one should not blame the judges for being unable to distinguish between a flight path that deviates from the horizontal plane and one that deviates upward or downward from the path of the aircraft in front. But the system allows, indeed requires, them to judge anyway. And they have to play the man. Criminal law wants it that way. It still understands the actions of a person as unaffected by the system that surrounds that person.
This is an inexcusable omission at a time when we embed human beings deeply in highly complex systems. Or does anyone still believe that a human being alone could bring a 500-ton commercial airliner to its destination on flights of over 4,000 miles, in all conceivable weather and environmental conditions, without massive support from technical systems? And does anyone seriously still believe that these systems interact with the pilot in a way that leaves his control uncompromised? Is it fair to always and consistently hold such a person fully responsible for the outcome, regardless of the systems supporting him? No, it is not. It is untenable from a moral perspective.

Backlog in criminal law

Criminal law is old, ancient. This is reflected not only in its categorical exclusion of the systemic components that influence the outcome, but also in its terminology. In the present case, the pilot was charged with 'negligent misuse and squandering of material'. I don't know how these terms strike you. For me, they have no real connection with what happened. I do not see in what way the pilot used the aircraft for anything other than demonstration purposes and would thus be guilty of misuse. And in my understanding, squandering material requires a malicious intent to get rid of it recklessly. Not even the prosecutor imputed such intent to him. I understand that my understanding of language need not coincide with legal terms. But I was by far not the only one in the courtroom who had trouble with these completely outdated terms.

There is much to be done in criminal law if it wants to remain an accepted part of our society. And there is much to be done in training prosecutors and judges; not only for the sake of justice, but also for the sake of the safety and combat readiness of our Air Force.

news-87 Tue, 18 Oct 2022 11:00:00 +0200 Mastering crises and emergencies https://www.martin-wyler.ch/en/blog/detail/krisen-und-notfaelle-meistern/ A company must be able to rely on having a crew that provides everything it needs to cope professionally with emergencies or crises. Daniel Schlup does that at Swiss Federal Railways SBB with a motivated team. When he asked himself how he could efficiently train the many emergency and crisis team leaders as well as all the employees in these teams, he came across GemaSim.

GemaSim is a computer simulation in which four trainees pilot a spaceship and complete challenging missions in space. However, the focus of the learning is not on technical skills. The focus is entirely on the goal-oriented behavior of staff leaders and team players in the staffs. It was a great pleasure for me to show the SBB managers, in a three-day course, the training possibilities that GemaSim has to offer. Daniel Schlup's team came reinforced with an additional enabler: the whole crew was trained in PCM (Process Communication Model) and thus well equipped to keep communication successful even under heavy stress. The combination of these two training approaches, PCM and GemaSim, turned the training of his crew into a real power training.

Training concepts for emergency and crisis organizations are a special challenge. After the financial crisis, and at the latest during the pandemic, many companies had to deal with the question of how best to lead and cooperate in such extraordinary situations. So it is not surprising that SBB looked to the aviation industry. Leadership concepts, as well as the aspects of cooperation under difficult conditions, have been consistently developed in aviation since the 1980s. In this context, the cockpit serves as both a focal point and a metaphor. Indeed, much of what has proven effective in this intimate leadership environment can be transferred to the staff work of emergency and crisis organizations. When it comes to the challenging organizational and personnel elements of leadership support for large companies, aviation can only help symbolically. Bridges can, however, be readily built in the domains of interpersonal skills, the inner attitude and mindset of superiors, and the stringent application of the principles of a structured approach.

As the former head of the crisis organization of Swissair and as a captain, I came across GemaSim while developing training concepts for crisis and emergency organizations. The tool has proven itself for many years in emergency-service ('blue light') organizations. It spares trainees inefficient lecture-style teaching. The simulation, which engages them fully from the first second, gives them the opportunity to draw lessons from personal experience. A proven learning approach that not only leaves a mark, but is also fun.

news-85 Mon, 29 Aug 2022 10:00:00 +0200 Just Culture in Medicine https://www.martin-wyler.ch/en/blog/detail/just-culture-in-der-medizin/ I am pleased to speak on the topic of Just Culture at the 5th National Radiation Safety Day. The event, organized by the Federal Office of Public Health, is dedicated to medical radiation events and patient safety. With Just Culture, medicine is taking up a cultural approach that has become widely established in aviation in recent years, including in regulation. No wonder: Just Culture, colloquially known as 'error culture', has become a mainstay and powerful foundation of safety efforts in aviation. With the experience I was privileged to gain in aviation, I now support executives in high-reliability organizations as an organizational developer and change agent in introducing and embedding this particular safety culture. I am looking forward to the interdisciplinary exchange on September 12 in Liebefeld.

The conference is aimed at specialists in radiology, nuclear medicine and radiation oncology, medical physicists, and radiology professionals with a management function or a role as medical/technical experts for radiation protection. The target audience also includes quality and risk managers.

The day is organized in the spirit of a "call for action". At this event, the FOPH wants to network stakeholders from all language regions of Switzerland and encourage them to initiate and implement projects to promote patient safety.

For more information on the 5th National Radiation Safety Day in Medicine, visit https://lnkd.in/e4YNJ3KU

More information on Just Culture: https://en.justculture.ch/was-ist-just-culture

news-83 Mon, 15 Aug 2022 11:25:00 +0200 Occupational safety and health management takes a close look at its communications https://www.martin-wyler.ch/en/blog/detail/das-betriebliche-gesundheitswesen-bgm-nimmt-seine-kommunikation-unter-die-lupe/ I am pleased to contribute to the 18th National Conference for Occupational Safety and Health Management as a speaker on the topic of communication. The goal of the conference is to provide participants with approaches, ideas and tools they can use to optimize occupational safety and health management (OSH) communication in their companies, so that OSH not only receives more appreciation but also sees its offerings used more often. It helps if those responsible for OSH in the company can prevent miscommunication. After all, "failure is no option" also applies to communication.

Successfully establishing OSH in a company means cultural change. As an organizational developer and change agent, I know the importance of communication in such demanding change processes. After all, if the turbulent developments of the last decades have taught me anything as a person with responsibility in aviation, it is that in this high-risk environment the only constant is change.

In my remarks, I will focus on the importance of shaping the relationship of those responsible for OSH with the stakeholders in their environment. After all, for communication to succeed, it pays to start there. This is good news, because it shows that everyone can make a valuable contribution to better understanding in the company. How can we do that? Find out more on August 31 at the Kursaal in Bern.

I am looking forward to a lively exchange with the OSH specialists.

Further information on the OSH conference: www.bgm-tagung.ch/de/

news-81 Mon, 25 Jul 2022 17:00:00 +0200 The one-eyed state https://www.martin-wyler.ch/en/blog/detail/der-einaeugige-staat/ At the end of June 2022, the Federal Supreme Court acquitted a Skyguide air traffic controller. The public prosecutor's office could not substantiate its claim that he had endangered public safety. This is good news, not only for the controller, but also for all air travelers. However, it is doubtful whether the ruling will have a lasting effect on ensuring air traffic safety.

In its ruling of June 29, 2022, the Federal Supreme Court acquitted an air traffic controller charged by the public prosecutor's office. On August 20, 2012, he had caused an allegedly dangerous proximity of two aircraft at Zurich's Kloten Airport. The Federal Supreme Court upheld the ruling of the cantonal lower court and explained in its reasoning that a sufficiently concrete danger never existed. This would be a prerequisite for ordering a penalty under the applicable law. The fact that the public prosecutor's office did not accept the verdict of the lower court and took the case to the Federal Supreme Court makes one sit up and take notice. Was there really a need for this state fervor?

The criminal offense was about endangering public safety. There is nothing wrong with the fact that the state actively seeks to ensure this. On the contrary. However, when it comes to the question of how it does this, doubts arise that we in society should think about. It is no coincidence that the aviation industry is raising its voice critically in this matter. After all, when it comes to ensuring public safety, it is particularly called upon. So it is obvious that knowledge has accumulated in this high-risk area and concepts have been developed that show how safety can be ensured and further improved in highly complex socio-technical systems.

Reading the federal judgment, it becomes clear that under current law a threat to public safety is seen only in relation to the actions of a fallible human being. The state acts as if public safety could be endangered by human misconduct alone. In doing so, it negates all the hotspots dormant in the organization and the overall system that have a demonstrably negative impact on safety. So, once again, the assessment of the systemic aspects is missing, as if they were non-existent. What an incredible oversimplification. It is easy to live in such a trivialized world view. But just judgments, ones that have something to do with the world as people experience it, can be reached by such a practice only by chance or by detour. As in this case, where a danger brought into play by the public prosecutor's office was not seen by the Federal Supreme Court.

Understanding the systemic perspective

The actions of people working in complex systems are subject to manifold contextual influences. To understand this, we do not even need to consult the research of great behavioral economists such as Nobel Prize winner Daniel Kahneman. It is quite sufficient to look at the most obvious linkages.

The first and most important aspect is the opening of criminal proceedings, with its negative influence on the reporting behavior of frontline employees. Under current criminal law, which punishes not only intentional but also slightly negligent acts, reporting safety-relevant events is therefore always associated with fear on the part of the reporters. The resulting reluctance to report undermines the safety culture in the company and in the entire community. This is serious because it diminishes the safety management database and prevents effective safety barriers from being put in place. To the chagrin of the public, who like to feel safe on planes.

From the publicly available documents of the case mentioned, it is not clear whether the controller disregarded or violated applicable rules with his actions. We must or may assume that he acted in accordance with the rules. Otherwise, this circumstance would certainly have been dealt with prominently. Does he now bear sole responsibility for what happened? The case is a prime example of incidents in highly complex systems in which undesirable events can occur even though everyone follows the rules. 

It is also not apparent whether his actions failed to comply with 'best practice' in air traffic control. Nor is it addressed whether another controller would have acted in the same way in the same situation. The second aspect in particular is of major importance if one wants to do any justice to the accused. This shows that the practice, common in the administration of justice here as in other countries, of involving experts without certification in the field concerned is not expedient. What is needed is expertise in the specific working environment of the person concerned, which also includes the culture of the company. This is where one would find the answer to the question of whether he or she was following a culturally anchored value system that places an inappropriate weight on the efficiency of task completion. Front-line decision-makers are confronted dozens of times a day with a conflict of objectives that asks them to choose between thoroughness (safety) and efficiency. Since the company leaves them alone with this decision, they understandably follow standard practice. They decide the way things are usually done in their company. They act according to rules that are written in big letters on the wall but cannot be found in any book. Is it fair when, in the event of an incident, they are accused of not having paid enough attention to the thoroughness of the execution of the task, and thus to safety? This, although everyone knows that if they always opted for thoroughness, the whole operation would come to a standstill?

If, in the aftermath of an undesirable event, judgment is passed on a front-line decision-maker without these questions being compellingly clarified, that is not honest. It is undifferentiated and cheap, because it fails to hold accountable all the other stakeholders who participated in the design of the system. All of this shows the one-eyed attitude of the rule of law and how ignorantly it deals with the modern, complex world in which aviation stakeholders operate on a daily basis.

The legislature is challenged

This is not a criticism of the actors in the legal field. They are required to enforce the law. It is a criticism of the legislature, which continues to tacitly tolerate such conditions, leaving us citizens wondering, shaking our heads, even perplexed. We have a legitimate demand that public safety be ensured with the knowledge available today. The approaches developed in the distant past, when man was not yet active in complex socio-technical systems and his actions were always directly and causally linked to their results, clearly fall short today.

It is not a matter of freeing man from his responsibility. It is a matter of appreciating his actions with a holistic view and taking into account the manifold systemic influences. Anything else is dishonest.

We can no longer afford not to look in the mirror. At some point, we must face the fact that we have built systems that we can no longer fully control and in which we have people working who are fully exposed to systemic influences. Treating them like cannon fodder in the event of an incident regardless of these circumstances is not okay. Sometimes they 'survive' and are acquitted as in this case, sometimes they are punished. The only thing that is certain about these seemingly archaic procedures is that all other stakeholders who have actively participated in the construction of the complex systems and therefore share responsibility always get off the hook. This simple reduction of consideration to the acting individual is not worthy of a civilized society.

What we need

We have learned in aviation that we need to approach safety with a systemic approach, in which humans take on the gatekeeper role thanks to their tremendous adaptability. We have learned that errors are important symptoms of a system that is not functioning optimally. We therefore understand the error as a learning opportunity and do everything possible to draw the right conclusions from it. Just Culture helps us to do this. In each individual case, a balance check determines whether the event is suitable for learning or whether there has been a grossly negligent or even deliberate violation of the applicable rules. Neither gross negligence nor deliberate violation is accepted, of course; both are punished. How we as a society can deal with this issue has been outlined by AEROSUISSE, the umbrella organization of the Swiss aerospace industry, in its white paper "Anchoring the Just Culture Principles in Swiss Law", using the example and context of aviation.

AEROSUISSE: Verankerung der Just Culture Prinzipien im Schweizer Recht

Translation of the document is pending

news-79 Mon, 21 Mar 2022 20:00:00 +0100 Cooperation under pressure and stress https://www.martin-wyler.ch/en/blog/detail/zusammenarbeit-unter-druck-und-stress/ "Failure is no option" - this also applies to teams that have to work together under high pressure and stress. At Swiss Federal Railways / SBB, this is commonplace in the Traffic Control Centers (TCC). For this reason, SBB has been training its TCC managers and employees since 2006 with an approach that I had the privilege of developing for them around the computer simulation "GemaSim". Now new SBB trainers have been trained again. After two courses in Lausanne, Hélène Magnenant and Yannick Abel-Coindoz are ready to lead the courses internally themselves. Congratulations to both of them for their fine way of supporting the participants in their development as stress-resistant team players.

Whenever we act under pressure or feel stress, our behavior changes. This is old wisdom with significant implications for our ability to cooperate and lead. Among other things, stress makes it hard for us to think clearly. More seriously, it causes us to exhibit dysfunctional behavior. Together, the two have the potential to make us pretty poor leaders or team players. So it makes a lot of sense for supervisors and employees who face recurring job-related stress to address their stress-related behavior patterns and crisis-resistant leadership concepts.

In the Traffic Control Centers (TCC) of the Swiss Federal Railways, time pressure and stress are an almost everyday phenomenon. There, teams composed of people from a wide variety of disciplines search for and develop solutions for traveling customers in the event of disruptions on the route network. Every day, they are all challenged to keep the effects of stress in check. This is why SBB has been training its managers and employees in the TCCs with a specific training approach from the very beginning.

Among the long-standing customers of this training approach are the Swiss Air Force, which uses it to train the young pilots of its professional pilot corps, and, of course, countless crisis management teams, for whom controlled and predictable leadership in crises is a serious concern.

At the center of the training is the computer simulation "GemaSim". It provides attractive, dynamic situation developments and challenges the participants to lead under stress and to build a collaboration that can withstand the pressure. The actual training takes place in the trainer-led reflection. Following jointly mastered exploratory flights in space, the licensed trainers moderate the individual learning process in a participant-centered debriefing. Because the participants' insights are linked to emotions experienced during the flown missions, and because they receive rich feedback on their behavior, the learning progress is correspondingly high. At the end of the last training at SBB, one manager said: "This is the best training I have ever had the opportunity to do at SBB. The experiences I was able to gather in this training are of great importance to me and go deep."

Training with GemaSim


news-77 Fri, 11 Mar 2022 22:00:00 +0100 Aviation calls for Just Culture principles in Swiss law https://www.martin-wyler.ch/en/blog/detail/aviatik-fordert-just-culture-prinzipien-im-schweizer-recht/ AEROSUISSE, the umbrella federation of the Swiss aerospace industry, has published a white paper outlining its concerns about the legal developments underway. It deals with legal framework conditions that are significant for the safety of passengers, patients and the environment. The paper was developed in close collaboration with the affected stakeholders in aviation. As co-author, I led the process of creating the white paper.

Before the end of this year, the Federal Council will submit a proposal to the Swiss parliament on how the principles of just culture can be incorporated into Swiss law in the future. It is the Federal Office of Justice that has been working on the postulate "Redlichkeitskultur im Schweizer Recht" (20.3463) and has set out to find possible solutions. It has commissioned a study from the Foundation for Aviation Competence, which includes a legal comparison with European countries and a stakeholder analysis in Swiss high-risk industries.

The response to the postulate is the beginning of a legal development that is of particular importance for Switzerland. The aim is to create legal framework conditions that serve to improve safety. These should make it easier in the future to learn from reported near misses, from undesired events and from safety-relevant observations. Today, criminal law hangs threateningly over those who report, make statements or provide information. Silence pays, and thus 'not learning' is inherent in the law. In today's world of complex socio-technical systems, which is also rushing from crisis to crisis, this is an anachronism. Who still pretends to know how everything is connected and how everything works? This question is increasingly being asked in companies that need to bring major risks under organizational control: companies in the medical, aviation, energy and rail industries. They all depend on their employees to tell them where things are going wrong in the system and where risks are embedded in the organization. As long as the state reserves the right to intervene in the safety culture of these companies in an accusatory and punitive manner, regardless of the collateral damage, these risks will remain like buried mines and will not be addressed. Is that what passengers on planes and trains want? Is that what patients waiting in the operating room for surgery want? Is that what people want when they buy electricity from a nuclear power plant? Is that what our parliamentarians want?

In aviation, certain professional groups have been subject to a reporting obligation for years. It is regulated by law and goes hand in hand with the protection of the person making the report, provided the error was neither intentional nor grossly negligent. This regulation has led to the establishment of a learning error culture known as 'Just Culture'. However, this protection applies only within the company and still operates under the sword of Damocles of criminal prosecution. No one, not even aviation stakeholders, is calling for general impunity. Willful violations, intent and gross negligence must be punished. But for the good of safety, the authority of the state, and thus its power, should be tempered with transparent procedures and guidelines, to the effect that the state is required by law to seek a balance between the demands of criminal law and the need to learn and improve safety.

Swiss aviation has now made its voice heard with a white paper. It explains the context, outlines the significance for safety and makes concrete proposals for the legal developments underway. A look beyond national borders shows that progressive countries have found suitable solutions for the socially crucial balancing of interests between law enforcement and safety improvement through learning. There are approaches that do not pit one against the other. This is also the view of the International Civil Aviation Organization (ICAO) and the European Aviation Safety Agency (EASA), with their numerous recommendations to individual nations. All this is encouraging and gives hope that the Swiss legislator's view is widening and that it will no longer close its eyes to the enormous developments in our socio-technical systems. After all, these developments require a rethink and call for new, adequate solutions.

I wish the readers a stimulating read.

AEROSUISSE: Verankerung der Just Culture Prinzipien im Schweizer Recht

Translation of the document is pending.

News
news-75 Sun, 06 Feb 2022 23:03:00 +0100 Miscommunication is no option https://www.martin-wyler.ch/en/blog/detail/misskommunikation-ist-keine-option/ It is not only in times of a pandemic that expectations of executives' communication are high. In safety-relevant environments, miscommunication is always highly problematic. It was great that four managers from high-risk environments came together again last week and spent three days strengthening their communication skills. I wouldn't have minded if it had lasted longer, because the atmosphere and the learning environment were simply top-notch. Many thanks to the participants. This is what makes advanced training really fun!

In the High Reliability Organization, failure is no option. The same applies to communication. It is crucial that the players understand each other. Even small misunderstandings can have serious consequences. Not to mention interpersonal conflicts, which, as we all know, always start with miscommunication.

With the Process Communication Model®, the participants strengthened their communication skills on the basis of a personality profile created for them. They are now able to respond to the individual preferences of the colleagues in their environment and thus keep communication free of noise and interference. They now know which people will be challenging for them to communicate with. They can adjust to them and adapt their communication precisely. They know when and why they themselves become a communication challenge for others. They have acquired the tools to prevent communication from slipping into miscommunication. Thus strengthened, they are able to recognize conflicts as they arise and use their newly acquired skills to exert a de-escalating influence.

Communication is the means of shaping relationships with other people. After this training, the participants will be able to make a valuable contribution to successful cooperation in their organizations. Cooperation that leads to more efficiency and is more fun at the same time.

News
news-73 Thu, 27 Jan 2022 22:22:00 +0100 Failure is no option. In communication, too. https://www.martin-wyler.ch/en/blog/detail/failure-is-no-option-auch-in-der-kommunikation/ This week, more pilots and managers from Rega / Swiss Air Rescue Service Jet Operations Center strengthened their communication skills. Well done, because in their time-critical and high-risk working environment, miscommunication and interpersonal conflicts have serious consequences. Many thanks to the participants! Truly, a cool bunch! Read on…

When asked how to ensure safety and reliability, many people first think of rigid processes, compliance, discipline, and highly qualified managers and employees. What is often overlooked are the serious negative effects of failed communication. For our complex systems to be kept under control and to deliver their full performance safely and reliably, it is essential that the people involved work together as smoothly as possible. This stands and falls with the quality of communication. If, as at Rega / Swiss Air Rescue Service, an enormously high degree of flexibility is also required, then successful communication becomes the ultimate success factor.

With the Process Communication Model®, the participants strengthened their communication skills on the basis of a personality profile. They now know which people will be challenging for them to communicate with. They can adjust to these people and adapt their communication precisely. They know when and why they themselves become a communication challenge for others. They have acquired the tools to prevent communication from sliding into miscommunication. Strengthened in this way, they are able to recognize conflicts as they arise and use their newly acquired skills to exert a de-escalating influence.

The three days of training were both a challenge and a pleasure for me. I know that together we have made a further contribution to Rega’s safe and reliable flight operations. And I know that the participants, with their newly acquired skills, will also ensure successful cooperation in day-to-day operations. Win-win. What more could you want?

News
news-71 Thu, 23 Dec 2021 21:00:00 +0100 Humility https://www.martin-wyler.ch/en/blog/detail/demut/ Sometimes stories teach us more than long theories. When it comes to leadership, the theories are especially long... Here is a story that can provide food for thought for leaders embedded in today's challenging settings.

Time and again in history, we find events that usher in a new era and serve as landmarks for meaningful developments. The tragic crash landing of a United DC-10 in Sioux City in 1989 is one such event. This incident marked the transformation of an outdated understanding of leadership in aviation into a contemporary form of leadership.

United Flight 232

What happened? The aircraft was cruising between Denver and Philadelphia at an altitude of 30,000 feet when the middle of the three engines explosively disintegrated and flying debris damaged all three hydraulic systems. From that point on, the aircraft was virtually uncontrollable. In the cockpit were Captain Haynes, the first officer, and the flight engineer. The two pilots managed to keep the aircraft in a descending turn by pulling the control column back with maximum physical force and bracing against the turn with full aileron deflection. That was all they could do. Curving toward the ground like this, a crash was only a matter of time. In his distress, Captain Haynes remembered the DC-10 flight instructor on board as a passenger and invited him to help. From the middle seat in the cockpit, the instructor managed to steer the aircraft using the thrust levers of the two still-functioning wing engines, although this amounted to rough directional control and very imprecise control of the descent rate.

In a feat that could not be reproduced later, the four crew members in the cockpit managed to bring the DC-10 to the ground at Sioux Gateway Airport in Iowa. However, it could hardly be called a landing. On impact, the plane flipped onto its back, caught fire and broke into four pieces. Of the 296 people on board, 111 died; 185 survived the accident. A few favorable circumstances minimized the death toll. The most significant aspect was explained by Captain Haynes after the incident as follows: "Until the 1980s, the captain was the authority on board. What he said was valid. That's how we lost some aircraft. Sometimes the captain wasn't as good as we all thought he was. In Sioux City, none of us knew what to do. Why should I, of all crew members, have known how to proceed? But the four of us had 103 years of combined pilot experience. By each of us contributing fully, we managed to get the DC-10 on the ground in Sioux City."

A new understanding of leadership

Here is a leader who does not pretend to himself and all his employees that he is the master of every situation. With this statement, Captain Haynes openly broke with the hero myth of the "Gods in Blue". He not only refuses to highlight his personal achievements in retrospect. Rather, he meets his challenge with a humble inner attitude. It is this attitude that made the necessary and successful teamwork possible in the first place. An attitude that does not allow him, just because he is the boss, to do anything that would not be a viable response to the problem. He uses the resources at his disposal in the best possible way and does not allow his ego to prefer to see him in a particularly prestigious role. His uniform jacket with the four stripes and his captain's hat are in good hands at the back of the wardrobe. There, they can't cause any collateral damage.

Why we no longer need heroes

What is presented here in an extreme form in an emergency, and comes to the fore so starkly, plays out thousands of times in highly complex work and leadership environments in normal operations. In Sioux City, the emergency confronts a leader and demands that he solve a problem he does not know. Elsewhere, relentless competition demands adaptability and innovation from companies and their executives, calling for solutions they don't yet know. Ultimately, this is the same context, one in which 'leadership' is called for in a special way. Demanded not only by crisis but also by the pressure to adapt and innovate, diversity, inclusion, a collaborative understanding of leadership, agile management and the associated social competencies of the leader are pushing so prominently to the fore. They are all important elements of resource-based leadership. As a boss or expert, if I don't know the answer to the problem, the hero's pedestal I stand on is of precious little help. If I don't manage to climb down in such situations or environments, it becomes dangerous. Then the cumulative power of the heroes becomes a risk for the company. For there is a danger that things will be done that serve not problem-solving but the maintenance of power or the saving of face. In my various engagements as a coach and trainer of crisis teams, I have witnessed, and continue to witness, such misguided actions by people in positions of responsibility. Since these goings-on are always visible to everyone involved, they also trigger acute embarrassment or even frustration in the team.

Humility as a basis

Humility protects managers and specialists from such dangerous escapades. It helps them see their performance expectations free of formal or organizational demands and opens their eyes to what needs to be done. No politics and no cover-your-ass strategies. Humility frees leaders from such corrosive self-protection mechanisms and allows them an authentic presence that makes them approachable and vulnerable. Yes, it is true: those who are approachable and vulnerable are trusted. Humility turns heroes into team players.

What many managers and experts still interpret today as 'not being up to the job', or even as a sign of failure, proved to be a success in Sioux City. The incident teaches something different to all those who like to bathe in their expertise and use it to legitimize their power.

Humility lays the groundwork for an attitude that does not presume to have a solution to every problem or to always be able to maintain control over everything. It is based on the honest admission of one's own imperfection and personal overstrain as an individual in complex systems or challenging situations, in environments with threatening pressure to innovate or in states of crisis.

A humble attitude also helps those leaders who are in charge in high-risk organizations. Here, a humble and reverent view of a complex system, which is usually highly regulated but can no longer be understood holistically in all its facets and overlapping, mutually influencing functions, helps. This inner attitude allows leaders to create a fear-free atmosphere, a culture of trust. By establishing an environment of psychological safety, they ensure that information flows smoothly throughout the organization. Horizontally as well as vertically. In highly complex environments, this leads to reporting the weak points of the system. This is an indispensable prerequisite for the continuous learning process and thus for the safety and reliability of the organization.

What should be done when managers are overwhelmed?

Well, they are only overwhelmed if they cling to an understanding of leadership that makes them exclusively responsible for the result. Such an output-oriented understanding can make sense in simple, linear environments and systems. But in complex and networked settings, characterized by a high degree of interdependence, it becomes a complete overload. Where we can steer and control in simple systems, we are limited to influence in complex systems and in crises. The diversity and interconnectedness of influences take away the power of steering and control. A humble attitude can help leaders move away from a historically entrenched focus on results. The role they have to play in complex, networked environments shaped by pressure to adapt is dedicated to enabling results. Leaders are responsible for all the processes, structures and cultural framework conditions that facilitate good solutions. In fact, they are there to organize collaboration, in such a way that all the resources of the company are brought to bear in the best possible way. When assessing the performance of managers, we are therefore well advised not to focus solely on the result, but rather to evaluate the abilities that were decisive in achieving it.

They can't do this without trust. Nor without a sense of purpose. And without humility, it will be difficult.

Christmas at last!

Fortunately, the holidays are just around the corner. They offer us the opportunity to reflect on things that are understandably neglected in everyday life. Christmas, as the feast of love, provides an ideal setting to let our thoughts revolve around the virtue of humility. In German, the word for humility, 'Demut', derives from the Old High German 'diomuoti', which meant 'attitude of a servant'. This meaning offers much when it comes to clarifying one's own understanding of leadership. I wish all readers of this blog a Merry Christmas and a successful, because relaxed, start to the New Year.

News
news-69 Fri, 26 Nov 2021 21:00:00 +0100 Performance assessment of people in complex systems https://www.martin-wyler.ch/en/blog/detail/leistungsbeurteilung-von-menschen-in-komplexen-systemen/ Is it useful if honest mistakes end up in the personnel file of the unhappy actors? This still widespread practice is one of the concrete obstacles to a living culture of trust. Mental barriers in the minds of managers are the blockers. Can we talk about it?

In recent blog articles, there has been repeated talk of the complex work environment. We can widely observe how the costs of progress are presented to us in the form of increasing complexity. Be it new products, services, contracts or laws: they all emerge from an environment that is becoming more complex at a breathtaking pace. Sometimes they themselves turn the complexity screw. 'Mastering complexity' is becoming an aspiration, and not only for executives. Can we handle it? Well, if this is supposed to mean a desire to regain control, the expectation is bound to end in disappointment. The answer is more likely to lie in clever and intelligent influence. That may take some getting used to. I suspect an adjustment to reality is in the air here.

Complex systems have the pleasant side of making new things possible. And they have the unpleasant side that they can lead to system failures without any individual function exceeding its defined performance limits. Through the phenomenon of functional resonance, an undesirable event can occur when tolerable performance deviations of individual functions happen to overlap. Randomly.

How we try to protect humans from their fallibility

Most systems are built around the human being. As a rule, we still have an important function in them. Our fallibility is reduced to an acceptable level by a variety of technical and organizational means. For example, pilots in modern cockpits are prevented from leaving the aerodynamic envelope of the aircraft by computer-guaranteed envelope protection. An aerodynamic stall would not only be a frightening maneuver for the passengers but could also cause a crash. This means nothing less than that we still let the pilots steer, but only to a predefined degree. Incidentally, this feels like being prevented from crossing the street without having looked left and right first: only a marginally good experience. Organizational measures that set limits on the free actions of fallible human beings include rules, process specifications and restrictions of competence. With increasing complexity, it is not surprising that we increasingly threaten to drown in a flood of laws and regulations. They are all expressions of the attempt to limit human fallibility to a tolerable level so as not to lose control. All this works reasonably well. We should be satisfied with what we have achieved so far, even if it comes with the acceptance of certain risks. After all, we know today that more regulation does not mean more safety or reliability. We have reached the end of that road.
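The principle of envelope protection described above can be sketched in a few lines of code. This is a deliberately simplified toy, not any real flight-control law; the function name and the pitch limits are invented for illustration:

```python
# Toy sketch of 'envelope protection': the pilot may command any pitch,
# but the system clamps the command to a predefined safe envelope.
# The limits below are invented for illustration, not real aircraft values.

MAX_PITCH_DEG = 25.0   # hypothetical nose-up limit before stall risk
MIN_PITCH_DEG = -15.0  # hypothetical nose-down limit

def protected_pitch_command(pilot_input_deg: float) -> float:
    """Return the pilot's pitch command, limited to the safe envelope."""
    return max(MIN_PITCH_DEG, min(MAX_PITCH_DEG, pilot_input_deg))

print(protected_pitch_command(10.0))   # within the envelope: passed through
print(protected_pitch_command(40.0))   # excessive nose-up: clamped to the limit
```

The pilot still steers, but only within predefined bounds: exactly the 'fallibility limited to a tolerable level' that the paragraph describes.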

Dealing honestly with error

Let us remind ourselves that the occurrence of certain risks in complex systems is due to randomness. They can only occur if several (individually tolerable) functional deviations happen to be superimposed. Yet as soon as a human being was involved in such an undesired event through action or omission, we tend to turn all eyes on them. Is that honest, if it turns out that they did not leave the limits of their 'given envelope'? That they did not act intentionally or with gross negligence? That they were drawn, with an intact inner attitude, into a situation that led to an unintended incident? They contributed, we can assume in most cases, to it happening. Perhaps with carelessness, which alone, however, would not have led to the undesired result. Would it be fair if this incident, which was due to chance, found its way into their personnel file? How fair is it when such spontaneous fallibilities are used to qualify an employee or manager? How fair are the side-eyed looks from the colleagues who usually work with them? We are often merciless. And we are often indecently undifferentiated. Who makes the effort to learn about all the other influences that contributed to the event? Who even starts trying to do justice to the complexity? Who manages not to give a single act of fallibility the significance of a qualifying dimension?

About good leadership

For some readers, this may sound like a plea for a general amnesty. Like an attempt to let employees and managers in complex work environments off the hook altogether. Like an argument against taking responsibility for one's own actions. Like a sign of weakness. But this image can only arise for someone who implies that the fallible person involved in the event intends not to take responsibility. That is a harsh accusation. I don't know any doctors, pilots, air traffic controllers or nuclear power plant operators who deal with responsibility in this way. How could trust ever arise under such a mindset? By indulging my expectations of infallibility as a manager or colleague, and meeting anyone who fails to live up to them with skepticism and accusations, regardless of how the circumstances might have contributed? That these are corrosive thoughts need not be elaborated here. It is obvious that they speak the language of a culture of mistrust. A culture that would have a hard time ensuring reliability and safety in a high reliability organization. In the hospital, in the nuclear power plant, in the airline or in rail operations, to name just a few.

Whoever, as a manager, is tempted, or even finds it right, to use incidents of the kind described for the qualification of those involved should not be surprised if he or she lacks the qualification to be a manager in a high reliability organization.

The ‘honest mistake’ has no place in a personnel dossier.

If you as a manager manage to anchor this basic principle of Just Culture in your area of responsibility, you will be surprised what happens. Do you have doubts? Then contact me. I can show you the effects with concrete examples.

News
news-67 Fri, 12 Nov 2021 20:00:00 +0100 Wrong eye operated https://www.martin-wyler.ch/en/blog/detail/falsches-auge-operiert/ The health care system struggles with malpractice stories that come out in the open. They are unflattering stories, not only because they involve patient suffering, but because they are too often not used for learning. What's at the heart of it?

Yesterday, journalist Alexandra Bröhm from the Tages Anzeiger addressed the readership with an article that gives us insight into a dark and unattractive side of our society. A society that prefers to punish mistakes rather than learn from them.

She tells of operations on the wrong eye, of the amputation of both breasts of a patient for whom only a minor operation was actually planned, and of the administration of the wrong medication because of a mix-up. On the surface, the story is about the failure of doctors and nurses. In view of the incredible suffering of the patients affected, these stories have the potential to trigger anger and indignation among the readership. At whom they are directed is clear: sloppy work by the medical profession and/or nursing. But that falls far short of the mark. It seems rather adventurous to accuse our healthcare professionals of a generally lax attitude to work. Error prevention is much more about learning from undesirable events. Alexandra Bröhm is therefore right to criticize the failure of the healthcare system in Switzerland to set up a central register for reporting work errors. Such a registry would be the basis for learning systems that could bring about improvements and reduce or completely prevent the occurrence of never events. But even that is not enough. More on this later.

Never Events

In the article, 'never events' are events that should never happen. It hardly needs mentioning that this term implies an unrealistic hope. We humans are fallible. Much more unfortunate is the fact that the term addresses the result of an event and not the reasons that led to it. But only those reasons are of interest when it comes to making healthcare safer. Dear physicians, dear nurses, dear patient safety experts, or whoever may have brought the term 'never event' into play: please let go of it. It leads in the wrong direction. It is not about never events! It is about the 'never causes'! Certain causes must never be allowed to lead to an event. That, and only that, is decisive.

Humans are always integrated into a system and exposed to influences.

Only by focusing on the causes can we see all the contextual influences that promote or trigger an event: similar packaging for drugs with different effects, poorly designed work processes, time pressure, lack of resources and much more. In their work, physicians and nurses operate in environments that influence them relentlessly. When they perform actions that lead to unintended results, such obstructive influences have always helped shape the situation. But because they are part of the story as actors, when they report they must expect to be sanctioned. Because they will be judged and condemned by people who are fixated not on causes ('never causes') but on 'never events'.
It is this unfortunate, hard-wired mental process, with its focus on the outcome, that clouds our view of the causes. It is one of the reasons that reporting systems are not built and maintained. Reporting systems that protect those who report and that are dedicated to finding the causes. The unreflective and usually unjust personification of cause has shaped our criminal law and our sense of justice. We should rethink this and ask ourselves what is more important to us: the punishment of the actors, or the good feeling of being able to entrust ourselves as patients to the healthcare system. To a medicine that is capable of learning and that continuously reduces the risk of incorrect treatment. I don't know about you, but I definitely prefer the second option.

Punishment prevents learning

I am not surprised to read, in a comment on Alexandra Bröhm's article, that the punishment for the guilty doctor who wrongly and unnecessarily amputated both breasts was shamefully low. The commentator does not spend a single word asking about the reasons for what happened. Like a mouse hypnotized by the snake's gaze, she sees only the result. It is terrible, no question. But the doctor's punishment has done much greater damage. It has prevented learning and left in place the danger that the same mistake can happen again and again, multiplying the suffering instead of reducing it. The only rational lesson the unfortunate physician can take away from the story is: silence is indeed golden.

If we cling to the mistaken belief that things will change for the better in the healthcare system with an emphatic appeal to all doctors and nurses to please make more of an effort from now on, then we are on the wrong track. We have to understand that we are beating the sack while meaning the donkey, as the German proverb goes: blaming the wrong party. Today, a substantial share of Swiss healthcare organizations still do not have a management system that records and analyzes events and reduces risk with adequate measures. In aviation, these management systems are not only required by regulation but have become a culturally anchored matter of course. Dear decision-makers in the healthcare system, isn't it time to get rid of this incomprehensible deficiency? After all, it is not only about improving patient safety, but also about reputation.

But in all this, we must not overlook one thing. Criticism of the players in the healthcare system falls short of what is required of us as citizens. We are, in fact, tolerating a legal framework that does not allow healthcare institutions in Switzerland to protect those who report work errors from criminal consequences. We should therefore not be surprised if the actors remain silent and the healthcare system makes no progress in this area.

Politics has a responsibility

According to an estimate by the Federal Office of Public Health, our direct-democratic inactivity in urging politicians to create the legal framework mentioned above results in 2,000 deaths every year. People who died because of avoidable errors and complications. Outrage and appeals are of little use here. We are called upon to hold our politicians to account. It must not be the case that next year, when Parliament deals with the postulate 'Just Culture in Swiss Law', it once again remains inactive, or once again approaches the issue from the traditional perspective characterized by the outdated stimulus-response model: where there is error, there must be punishment. I, at least, assume that we can expect our elected representatives to think in somewhat more sophisticated contexts. I also believe that not only the decision-makers in the healthcare sector but also our parliament are capable of learning.

News
news-65 Mon, 01 Nov 2021 10:24:00 +0100 Our idea of how undesirable events occur, in which humans are involved, is in urgent need of an update! https://www.martin-wyler.ch/en/blog/detail/unsere-vorstellung-davon-wie-es-zu-unerwuenschten-ereignissen-kommt-bei-denen-der-mensch-die-hand-im-spiel-hat-braucht-dringend-ein-update/ In complex systems, we tend to attach too much importance to people and their actions. In doing so, we diminish the influence that all other functions in a system have on the outcome. At the latest when it comes to the question of their responsibility in the event of an incident, this view becomes dishonest, because it does not correspond to the real situation.

Our brains were formatted in a time in which there was always a direct causal connection between cause and effect. A person's actions led to a result that could only come about because of those actions. With emphasis on 'only', in the sense of exclusively. It was the time of simple systems. Causality everywhere you looked. Everything was related to everything else in a comprehensible way. Good old times. Not that we have completely abandoned these cause-effect chains. No, they still exist (fortunately), although we really have to look for them in the modern working environment. If, on the other hand, we look around a bit and take a closer look at the world of people at work, a completely different picture emerges. They give inputs into systems that present them with results without their being able to understand how they came about. They work with the outputs of black boxes without knowing their inner logic. The fact that something has been installed between the acting human being and the output is one of the characteristics of our contemporary working world. It relativizes the person's causality and should, to the same extent, relieve them of full responsibility. However, we find this very difficult.

But that is only one aspect of the new world of work. The other is the increase in complexity. Many of us, I dare say the majority, are embedded in large systems in which we are a small node in a huge network. The image of the tiny cog in the great machinery belonged, as a metaphor, to industrialization, which we have long since left behind. It vividly showed the mechanical cause-and-effect chain and its causality. That was once upon a time.

Integrated into a network

In the network, however, I am only exerting influence, and I cannot say exactly what output my inputs will produce. We have largely lost the controllability of our systems and are reduced to influencing them. Is there anyone who can explain the cell phone to us in detail? I doubt it. Do we need to? No, because we learn to understand and operate it through various inputs (influence). Or do you still read instruction manuals? The fact that we can only influence, but no longer genuinely steer and control, also means that we are only indirectly causal and can at best bear partial responsibility if something undesirable happens.

In today's working world, we are caught in a web of countless functions that hold the system together and make it perform. We influence with our actions, and we are influenced by the other functions. Each function in the system serves a purpose and is usually well described and regulated. A few examples may be helpful: selection of employees and managers, training, rules and regulations, contracts, management systems, technical systems, communication and, last but not least, the performance (actions) of the people in the system. For each function, the others form its environment, because they all interact with each other in the system. All of these functions have a normal and accepted variability. That is, they are allowed to vary to some degree, and they do. For example, technical systems have a 'mean time between failures' (MTBF), which describes the expected time between failures of a repairable system. The procedures for selecting managers have come a long way, but they give us no hundred-percent guarantee of finding the right ones. Communication in the company usually serves its purpose, and yet miscommunication or misunderstandings occur time and again. And last but not least, despite successful training and adequate experience, a mishap can happen to us humans in the system that contributes to an undesirable event. Even with an intact inner attitude, our performance varies. We are fallible, like all other functions in the system.

Functional resonance

If we look at a system in this way, we can understand how random overlaps of the fluctuations of different functions can occur and produce an incident, even though no single function exceeded its tolerated threshold. Everything worked as it was supposed to work. Only the system as a whole failed. We are dealing with the phenomenon of 'functional resonance'.

In such a world, it is questionable to maintain the notion that the actor on site is always directly and fully responsible for the result the system produces. It reduces the focus of consideration to one function, the human being, and ignores all other influences. It is a reality distortion of the first order, and dishonest on top of it. When managers uphold this view, it expresses an attitude of having nothing to do with the whole thing, possibly in order to shirk their own share of responsibility. We have arrived in the world of cooperation and complex systems.

Conclusion

Functional resonance confronts us with the difficult-to-accept fact that incidents or accidents in complex systems can occur with unpredictable randomness. What this means for the organizations and companies that must prove themselves in high-risk environments is something I will discuss in the next blog posts. This much up front: all those who still believe that, in the modern world of work, human action on the front lines follows a simple cause-and-effect chain that can be treated as the sole explanation for an outcome must accept the charge of indulging in a romantic, industrial-age reverie. They urgently need an update if they want to play a part in complex systems.

News
news-63 Sun, 17 Oct 2021 12:55:00 +0200 Why do we want to blame individuals for the failures of our systems? https://www.martin-wyler.ch/en/blog/detail/warum-wollen-wir-individuen-fuer-das-versagen-unserer-systeme-beschuldigen/ Facebook was hit at the beginning of the month. The total failure showed us how fragile and ultimately unreliable this monstrous network is. But what do we hear? It was human error. In this case, too, are we being told that the system is perfect and only humans are at fault?

Why do we want to blame individuals for the failures of our systems?

Once again, we read about the failure of a large system. For over six hours at the beginning of October, Facebook was hit by a total outage. WhatsApp and Instagram were also cut off from the Internet.

According to Internet experts, the crash was caused by human error. It is said to have been a faulty update to the Border Gateway Protocol (BGP). This protocol apparently works like a navigation system, scanning the Internet for addresses and forwarding this information to the desired locations. Please don't ask me for more detail; I am merely relaying network know-how from experts here, rehashed for dummies. If BGP doesn't work properly, a single company, an entire country or an entire continent can be cut off from the Internet. According to the experts, only a few employees have access to the update mechanism of BGP. Only a very few are authorized to make changes to the navigation system.
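For illustration only, here is a toy model of the 'navigation system' idea. It assumes nothing about the real protocol; the prefixes and names are invented (documentation address blocks). A routing table maps address blocks to paths, and a single faulty update that withdraws every prefix makes the whole company unreachable.

```python
# Toy 'navigation system': which address block is reachable via which path.
routing_table = {
    "198.51.100.0/24": ["ISP-A", "Company-Backbone"],
    "203.0.113.0/24":  ["ISP-B", "Company-Backbone"],
}

def apply_update(table: dict, withdrawn: list) -> dict:
    """Apply a routing update that withdraws the given prefixes."""
    return {prefix: path for prefix, path in table.items()
            if prefix not in withdrawn}

# A correct update withdraws one stale route ...
after_ok = apply_update(routing_table, ["203.0.113.0/24"])

# ... a faulty update withdraws everything: the company vanishes from the
# Internet's map, although every router keeps working exactly as designed.
after_faulty = apply_update(routing_table, list(routing_table))
print(after_faulty)  # {} -> no address is reachable any more
```

The sketch makes the asymmetry visible: a one-line change to the map, not to any machine, is enough to take everything offline.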

How would you react as the person in charge?

Let's assume it was a single network engineer or programmer who disconnected the backbone system of what is now a trillion-dollar company from the network. And imagine you were the disciplinary superior of this employee. What would go through your mind? In view of the enormous damage, would you be tempted to suspect that your team member had not worked with the required diligence? To accuse him or her of not having done a good job? You would probably feel the same as the numerous experts who have come to the conclusion that this was once again a case of human error.

Why is it, I ask you (and all the experts), that your perception is reduced to the human factor and completely ignores the highly complex, but misguided system? A system that only a few still understand. A system that experts are not able to convey to us in page-long explanations. Not even when they try to speak in simple language and use metaphors and analogies that clearly signal to us that we have lost the plot and should better concern ourselves with other things.

But that is exactly what we should not do! We should stand up and fight back against such vulnerable and fragile systems. Just because a system is not a human being does not mean that we should leave it out of our evaluation and see only the human being with his or her fallibility. A system, too, deserves scrutiny, and thus an indictment. It must not get off scot-free.

It is not, on the face of it, that a human being has failed in this case. It is the poor performance of an ill-conceived system in which gross design flaws and technical omissions allow unacceptable failures. It is a system in which humans are assigned a totally inadequate role. Or have we forgotten that humans are fallible?

So why do we want to blame individuals for the failures of our systems?

In answering this question, I would like to address two hypotheses, knowing that they alone cannot answer it conclusively. The mechanisms are too complex for there to be a single, 'correct' explanatory story.

Power

The obvious way to explain this inappropriate blaming of individuals is that it involves the raw exertion of power. In any failure of a highly complex system, there are people, organizations or institutions that seek to defend their goods, their services or their reputations. They do so by sacrificing individual employees or leaders in the event of an incident. This suggests that it is the protection of such interests that imposes guilt on, or in certain cases criminalizes, the actions of individuals. The idea is to divert attention from poorly set-up systems for which executives would otherwise have to bear responsibility. Obviously, in any specific case this can be a gross insinuation. But we all know that it is practiced.

Organizations in which such practices are found suffer internally from a culture of mistrust. If such practices become public, there is inevitably a loss of trust among customers and all other stakeholders. If such companies want to bring their organizational and systemic risks under control, they have a long way to go. It begins with abandoning the tunnel view of human fallibility. It requires opening one's perception to a systemic perspective. This is an unpleasant process, for which it is worthwhile to have a change agent, a coach, at your side. Whenever it comes to overcoming intuitively triggered reactions (blaming individuals), things get tough, and it takes perseverance and patience. This is the path that leads to the reliability, safety and resilience of the organization and its goods and services.

Anxiety

Today, when we read an air accident investigation report written by a professional authority, we realize that there is never just one reason for a disaster. This is also true for disasters that occur in other highly complex sociotechnical systems. Rather, dozens of factors have an impact and share responsibility for the failure. Only through their simultaneous occurrence can the accident happen. This simultaneous interaction is, painful as it may be to admit, pure coincidence. The complexity of today's aviation system, for example, is so immense that the fluctuations of its individual functions can lead to catastrophes even though everyone has done everything right. This is a frightening insight.

This fear is rooted in the fact that we no longer have total control over the complex systems we design and operate. We fear the possibility that they may fail because of the intertwined, everyday interactions that take place within them. It would be much easier for us, and we would very much welcome it, if such system failures had a single, traceable and controllable cause. So, when it comes to finding the cause, we're not picky; not everything necessarily has to be put on the table, because that would be scary. We would be made aware that we are dealing with a loss of control. How reassuring it is to tell ourselves that it was once again human error. Assigning blame to individuals, punishing them and thus criminalizing them is therefore also a protective mechanism that saves us from having to face reality with its terrible grimace.

Why do we assume that our systems are okay?

As we have seen, our IT experts come to the (self-)reassuring conclusion in their analysis of the Facebook total failure that it must have been human error. How unbearable it would be for them to have to admit to themselves that they are specialists in an industry that produces such ailing and unreliable systems. What would be called for is modesty and reverence for what they have helped to build.

Pay attention the next time you hear about an accident or a serious incident in your environment. What do the press, the company involved, or your own intuition present to you as the cause? If it is once again a scapegoat, pay attention to the calming aura of this assumption. At the same time, observe how the system in which the incident occurred receives no attention, and how it can continue its flawed existence unmolested in the shadow of the accused individual.

News
news-61 Fri, 01 Oct 2021 20:30:00 +0200 Why do we blame https://www.martin-wyler.ch/en/blog/detail/warum-wir-anschuldigen/ When we set out to deal with error differently, we need to address the concept of psychological safety. As a leader, this involves creating an anxiety-free environment. The biggest hurdle in this endeavor is how to deal with the accusation.

Blame is arguably the most corrosive element in a safety culture. It touches the core of the error culture because it undermines trust. It is therefore worth giving some thought to. In this and the next few blog articles, I will address individual aspects of it without claiming to tackle the topic holistically. Rather, the discussion here is intended to provide food for thought and impetus for all those who have set out to put the noble goal of a fear-free culture into practice.

Psychological safety

Psychological safety is the instrument for achieving this goal. It is one of the currently hotly debated leadership topics. The advantages associated with it are being made palatable to us almost obtrusively, with many well-intentioned appeals. The call is unmistakable: "Move forward!" But why is implementation lacking? Appeals do not help us do justice to the matter. It is much more helpful to know what happens to us when we are confronted with a mistake, and to have an idea of what our mind presents to us as ideas, images or instructions for action when others do not fulfill the expectations we have of them.

Leaders who are strongly committed to their values are particularly vulnerable

Let me begin this post with a reference to the last blog article. There, I addressed the fact that we don't all react the same way when we're under stress. As a leader, being confronted with a mistake that has a negative impact in one's area of responsibility is associated with stress. Depending on the tendency inherent in our personality architecture, the reaction varies.

The error is a special challenge for all those who feel particularly committed to their personal values. These are managers who have high expectations of the quality of their work, who are committed, and to whom responsible action is very important. It is not surprising that these virtues are common among managers. A mistake has no place in this mental predisposition, because it does not satisfy the implicit expectation these superiors have of others. Implicitly, it requires others to be perfect.

A claim that is mentally anchored in this way leads superiors, unnoticed and unintentionally, to abandon the appreciative stance toward the fallible person. They elevate themselves above that person by making him or her only conditionally okay, or not okay at all. The accusation is then the expression of this devaluing attitude. The damage to the relationship is done before a factual assessment of the circumstances has had even the slightest chance.

Those who are committed to their own values are particularly susceptible to unreflected accusations. Such leaders do not personally find it easy to assume the leadership role in an organization that values psychological safety and sees error as a learning opportunity, indeed as the driver of profitable further development. For leaders with such a tendency, every mistake becomes a tough training session in self-control.

A broader view helps

What has been completely left out of this process so far is the question of whether the accusation is factually justified at all. The matter itself was never an issue; only the person was. A relationship was damaged without knowing whether there was any reason for it. Worse, much trust was destroyed along with it. Possibly for nothing.

The whole thing happened so fast that there was no time to investigate what the reasons for the unintended result (i.e. the mistake) could be. (We always assume that neither willful intent nor gross negligence is involved. Errors of that category have to be handled with different gloves.) Open, and thus unanswered, are the questions about the systemic influences, the contributory factors.

Since we know that in complex work situations unintended outcomes can occur even though everyone has done everything right, the reaction of blaming the other is completely incomprehensible. It does not do justice to the matter and is dishonest, because it ignores the entire context in which the fallible action occurred. And yet it happens again and again, only because becoming aware of a mistake triggers a mental process in executives with the aforementioned tendency. A process that takes place within the executive him- or herself and that has nothing at all to do with the matter.

Do you know your exposure?

Certainly, some readers have already become acquainted with this problem as leaders. If you would like to know more about it, and especially about your personal exposure in this matter, it can be explored with a personality profile from the Process Communication Model (PCM). Contact me if you have the time and interest to do so.

News
news-59 Fri, 17 Sep 2021 17:00:00 +0200 Managing errors - Part 4: Energy-Management https://www.martin-wyler.ch/en/blog/detail/fehler-managen-teil-4-energie-management/ A safety culture in which the learning potential of error is widely recognized cannot be lived if managers fail to lead themselves. The personal reaction to errors that become known is the key to success. But how can this be achieved if you yourself are under stress?

If we as leaders have decided to live a culture in the company that uses the mistake as a learning opportunity, this commitment first of all places a demand on ourselves that should not be underestimated. We have to address the question of how we respond to mistakes when our batteries are dead.

Perhaps you've noticed the following about yourself: whenever you are irritable or a bit thin-skinned, you show a specific behavior pattern. The trigger could be a mistake that someone else has made and that makes you uncomfortable. Of course, far be it from me to impute a general irritability to you as a reader of this blog article. On the other hand, anyone who reflects honestly on him- or herself knows that life, and especially work as a leader, holds enough situations and constellations that can challenge our mental defense system. Such situations can also be mistakes that happen in one's own area of responsibility. They are classic stressors that need to be managed.

One person who has dealt with such challenging moments is Taibi Kahler, an American psychologist who studied the behavior of people in an out-of-balance state, i.e. under stress. He translated his scientific findings into a model, the Process Communication Model, that allows us to acquire the skills to cope with such situations. Kahler demonstrates that unbalanced states, especially when they last for a prolonged period, are rooted in a lack of satisfaction of psychological needs. These include the recognition of an achievement, the acknowledgement of our opinion and thus the approval of our personal values, and the unconditional recognition as a human being, just as we are. They also include the longing for action and the desire for quiet, with the need to be alone. And there are more. It goes without saying that we need many of these to some degree in order to live a balanced life. Kahler's work, however, points to a clear hierarchy of needs in each and every one of us. According to this, not all needs are equally important to us. Even more exciting is his scientific evidence that our reactions in an unbalanced mental state always occur in the same way, and thus predictably, and that they correlate with the failure to satisfy a psychological need that is important to us. He thus showed why we always display the same behavior pattern under stress.

What happens to us when we don't get our psychological needs met?

Well, we become dysfunctional, experience negative emotions, and find it difficult to see things rationally. We lose the ability to cooperate. Our reactions in such situations are always connected with the devaluation of a person, either the other or ourselves. The devaluation of the other person is of particular importance in a lived error culture, so it is worth taking a closer look at it. One thing is certain: it undermines mutual trust, because we express the expectations we have of others emotionally and thus inadequately. In these demands on the other person, blame is always interwoven in various forms and shapes. Pay attention to what happens to you in such situations! You will tend to react in one of the ways described below. These are the options in their most blatant expression:

  • Your reaction is from above in the form of an attack. This is wrapped up in your idea that the other person is too stupid to do things right. "How stupid do you have to be to make a mistake like that!"
  • Your pejorative response is from above, also in the form of an attack. This time, however, you argue differently. You experience yourself as opinionated and insist that serious, earnest completion of the job is important in the company. "Would you please make the necessary effort at work! I can still expect that!"
  • Your response is a cool and tendentious distancing of blame that is intended to hurt the other person. "Great mess you've made. Well, making mistakes always has unpleasant consequences. You'll have to deal with that yourself!"
  • Your response is a detached blame game about being able to hold yourself harmless. "If you make mistakes, that is your problem alone. It has nothing to do with me."

However, you may also turn expectations on yourself in situations where you, as a manager, are confronted with mistakes made by your employees; namely, when the mistakes of others make you feel insecure and push you out of your comfort zone and into the role of victim. "I feel so sorry for him! What did I do wrong to put my coworker in this difficult situation?" This response does not undermine the relationship with the other in the same way as the four described above, so we can leave it aside in this consideration. Nevertheless, it too is a challenge for the manager who tends to react this way, and it makes sense to deal with it.

What this means for dealing with errors

Any form of devaluing the other person torpedoes a living safety culture based on trust. This is especially true when managers do it. Why would I go to my boss to hear that I'm not okay, that he or she is cutting me off, or that they won't promise me support? Would I go there to talk about the difficulties I've had at work or about my own mishaps? What good are the noble resolutions in the company's mission statement about using errors positively if, in everyday life, managers with empty batteries are unable to maintain the necessary relationship of trust? Safety culture is not made with appeals and nice intentions. It is carried by managers who not only know why they have to de-taboo errors and use them, but who are also capable of leading themselves, in such a way that they have the energy not to fall out of an appreciative position toward the other at the slightest gust of wind.

Self-control and energy management

Make sure your batteries are always charged. Actively pursue the question of which psychological needs are important to you personally and, as a leader, take care to get them satisfied. The Process Communication Model provides an individual profile that shows you which charging stations to go to in order to satisfy your psychological needs. In this way, you build your personal bulwark against unwelcome falls from the I'm-OK-you're-OK position. If you can no longer maintain eye level with the other person and catch yourself holding others in low esteem or covering them with accusations, these are alarm signals. They do not bode well for the safety culture in the company. Ask yourself what kind of environment you need to learn from your mistakes. Fear? Disrespect? Guilt? An emotional or a factual debate?

The Process Communication Model (PCM) can tell you precisely what is important for you to remain on the appreciative eye level with others for as long as possible. And it presents you with your personal early warning signs that stress is on the way.

If you are interested, get in touch with me.

News
news-57 Fri, 03 Sep 2021 20:00:00 +0200 Managing errors - Part 3: The Relationship with the Fallible Person https://www.martin-wyler.ch/en/blog/detail/fehler-managen-teil-3-die-beziehung-zur-fehlbaren-person/ Have you noticed that you react differently to mistakes made by others? It is about mistakes whose consequences you feel. There are countless exogenous factors that influence our reaction. It's good to know that there are also influences that we can proactively manage.

Imagine you are the mother or father in a stereotypical middle-class family, with a home of your own and a newly acquired middle-class car. You are the first member of the family to come from the bedroom to the kitchen on Saturday morning, and you find a note from your eighteen-year-old son on the kitchen table. What goes through your mind when you see it? It goes without saying that very different things pop up. It need not be the philosophical question, one you first need a cup of coffee to answer, of what is more important to you: the son or the newly acquired car.

Let's play out the same situation in a slightly different context. This time you are the first person in the family to want to use the car for shopping on Saturday morning. You find a Post-it message on the windshield, written by your unpleasant neighbor. Unpleasant because you've been at odds with her for years over the unkempt trees that encroach on your side of the garden, dropping leaves and casting shade. And because she has the annoying habit of throwing big parties in her garden on weekends, with an open, smoky fire and loud music. She has scratched your new car while parking her battered vehicle, just as your son did in the first example. What is going through your mind now?

Take a little time before you read on. Try to put yourself in the situation and play briefly with the thoughts that come now.

Why is our reaction so different?

The answer is certainly obvious for the first intra-family case. Because we usually have a close, benevolent relationship with our children. It helps us to be more empathetic. It is not difficult for us to put ourselves in the son's situation. This gives us the ability to forgive. (A skill that good error management always requires of leaders). The relationship is characterized by trust, because we do not want to hurt our children and we always pursue good intentions. We have proven this in countless situations and our juniors know this and therefore trust us.

In addition, we better understand the context in which the incident occurred. The son certainly did not act intentionally, and he has only had his driver's license for a few months. It was nighttime when it happened. And last but not least, we were young once, too. This information is crucial to our reaction to what happened. Once we are in a good relationship with the other person, it is much easier for us to take the context into account and consider it in our assessment of the incident, because thanks to the closer relationship we are usually more aware of the circumstances in which the event occurred. In this case, we have especially plenty of contextual information because we can draw on our own experience. We know what it's like to park a car at night with little driving experience. This helps us not to make unreflective accusations, but to see things factually and less emotionally.

What does this mean for our everyday life as managers?

If we transfer this insight to our everyday working life, an unambiguous mission emerges. Of course, it is not about creating and maintaining a father-daughter or mother-son relationship with all employees, interfaces and partners on the job. Rather, it makes sense to take an interest in the work environment of our collaborators and in the problems they face in providing their services. And it makes sense to have an idea of how they are doing as individuals. Ultimately, it's about being inwardly attuned to what unites rather than what divides. Taking an interest in others thus becomes a noble leadership task. This has nothing to do with an empathetic over-adaptation that aims to prevent conflicts from arising. Rather, it creates the prerequisite for a holistic view of events.

So a lot is achieved when we notice that we are once again falling prey to old stereotypes, catching ourselves invoking, for example, the proverbial incompetence of the IT department and denigrating the usual suspects in the organization without having a clue how the problems actually present themselves to these subject-matter experts. This is where self-discipline is especially needed from managers. If we already know that human beings are fallible, why should this apply only to us and not to others as well? This mindset is the best prerequisite for trust to develop.

Seeing things as they actually happened

Such an approach clears our view for what really happened. It even allows the disciplined self-manager to see what he himself shares responsibility for. Thus the father mentioned at the beginning manages to remember, while reading the message on the kitchen table, that he should long ago have had the car repaired because of its defective rear-view camera ...

News
news-55 Fri, 20 Aug 2021 22:44:00 +0200 Managing errors - Part 2: The corporate lone wolf https://www.martin-wyler.ch/en/blog/detail/fehler-managen-teil-2-der-einzelkaempfer-in-der-organisation/ It is in line with the management zeitgeist to see mistakes as learning opportunities. Why is it, one wonders, that so many people find it so difficult to put this noble approach into practice? There are two aspects to this that deserve special attention. In this blog, I address two main suspects.

By now, many have understood that it pays to see mistakes as an opportunity to learn. By "mistakes" I mean those actions that lead to unintended results. The Anglo-Saxons have created an apt term for this: "Honest Mistake". Many have also understood that as soon as blaming is involved in connection with such an error, the learning opportunity is lost. And last but not least, more and more managers are becoming aware that whatever 'mistake' they are confronted with, their own or those of others, there are two learning processes. The personal reflection of the person who made the mishap and the organizational learning process that deals with the systemic contributing factors. For all of these insights, understood does not mean implemented. There is a big gap between understanding and application. This is at least my experience from various cultural projects in companies and from many years of self-observation. On the way to applying these noble principles, there are considerable obstacles to overcome. One of these obstacles is rooted in self-interest. Dealing with it illustrates to us the importance of the personal inner moral compass and the framework conditions in the company that are needed to establish a culture that makes the aforementioned learning processes possible. A culture that serves the reliability, safety, resilience and agility of the organization.

The lone wolf in the organization

As soon as we hear about a mistake that has happened to a certain person in the company, a subtle temptation emanates from the news. It offers us the chance to make ourselves look better in relation to the other person. All it takes is pointing out the person's mishap in any number of conversations with people in the organization. The subtlest of insinuations will do. Laying it on thick is not necessary; it might even reveal the intention. That would be highly undesirable, as it goes hand in hand with a moral deficit and reveals unseemly self-interest.

That said, I do not impute to any reader that they would be so calculating. Further reading is therefore not necessarily worthwhile; it is - like everything in life - a free decision.

But how can it be that the mistake of another person gives us reason to behave in a morally dubious way? I would like to address two aspects that are partly responsible for this:

1. "It's the name of the game, stupid!"

Following the famous phrase of Bill Clinton's political advisor James Carville: "It's the economy, stupid". With this slogan, Carville emphasized the importance of the economy for his own campaign team during Clinton's election campaign. Here, the analogous variation of the saying points us to the relevance of the prevailing business environment. People always behave rationally in the context of a given framework. If this framework rewards individual performance in the company, no one should be surprised if people actually live by it.

2. The dishonest handling of human fallibility.

The temptation arises when the company has failed to establish a culture in which an employee's action leading to an undesirable result is not attributed exclusively, and above all not primarily, to the failure of that person. It arises only where there is still a belief that people can work without error. There, not only the trust in employees is on shaky ground, but also the very idea of how unwanted events occur. To see the causes exclusively in people is evidence of a blatant lack of knowledge and of a superficiality that is taking on almost shameful dimensions these days.

Appeals are of no use

If we consider these two main causes of the "problem of the lone wolf in the organization," it makes little sense to try to eliminate it with appeals to morally intact self-management. An appeal not to behave selfishly in a system designed for self-interest is at best ambiguous cynicism. I advocate a decent way of addressing it: if it is politely pointed out to a person that he or she is about to advertise his or her own qualities as a reliable person by pointing out the faults of others, such a remark does not fail to have an effect. The only thing that has to be endured is a brief embarrassment, which, in my experience, subsides very soon.

Commitment

What appeals cannot achieve, a personal commitment can: a commitment to a self-image that is not whitewashed with the shortcomings of others. I can, if I want to, consistently refrain from such mask make-up. Personal successes achieved without this dubious support feel all the better for it. Incidentally, the commitment is much easier to make if you are aware of your own fallibility without blinkers. It is enough to look in the mirror every evening and list all the things you failed at that day. Anyone who, through consistent application of this self-disciplining exercise, suddenly finds him- or herself confronted with the question of his or her own trustworthiness has won an important personal victory. Such purification strengthens us and enables us to actively shape everyday life in a culture of trust. It is an old truth that showing one's own vulnerability makes us approachable and creates trust.

Error management, a function of self-leadership

If we manage not to use the mistakes of others for our own self-optimization, we make a decisive contribution to professional error management in general. This is active error management. We do not need favourable framework conditions for it, such as a culture of trust can provide. Solid self-management is fully sufficient, even if it only yields an isolated solution in one's own environment.

Framework conditions that eliminate the lone wolf

The problem of the lone wolf can also be tackled at the cultural level in the company. There are two promising approaches here, both of which should be considered in parallel in organizational development:

1. Leadership approaches that have emerged for other reasons can provide relief: "unboss the company", team orientation, agile organization and many others. They all purify the organization and cleanse it of the corrosion damage caused by strong power structures, hubris and silo thinking.

2. Education and knowledge transfer. This is about showing everyone, in an appropriate way, how unintentional events (popularly branded as errors) come about. It is high time, especially in work environments strongly characterized by interdependencies, that we break away from the completely outdated practice of assigning full and sole responsibility to the person at the sharp end. In highly complex organizations, this view inexcusably obscures all the contributory, systemic aspects that lead to an occurrence. It trivializes inappropriately and, above all, it does not help us to get better, to make progress, and to learn to understand the systems that we have built but can no longer really control.

We can set out on this path in the development of the company by refraining from simply pinning undesirable events, the honest mistakes, on a person like a badge of shame. We can get rid of the bashing of human fallibility without the company degenerating into a pony farm where everything is allowed and nothing is sanctioned. As a sparring partner, coach and organizational developer, I would be happy to talk with you about this exciting topic.

News
news-53 Fri, 06 Aug 2021 20:00:00 +0200 Managing error https://www.martin-wyler.ch/en/blog/detail/attribution-bias/ Everyone is talking about dealing constructively with mistakes. Learning from mistakes is the order of the day. In order to succeed, we have to find a different way of dealing with it. This starts with ourselves.

The vacation season invites contemplative thinking about things that often get short shrift in everyday life. In this blog post, I would like to offer some food for thought to all those who don't like to pack books in their suitcases. Thoughts with the potential to strengthen self-management in a deck chair or at a dreamy beach bar overlooking the ocean. For the managers among the readership, they will also build fitness for a corporate culture that leads to more relaxed and efficient cooperation. For managers in agile companies or in high reliability organizations, these thoughts already have the character of a duty. Whether you want to do this to yourself during the vacations is, of course, up to you.

Attribution Bias

In the context of managing errors, the attribution bias is of particular importance. It states that we have a tendency to attribute others' mistakes to their incompetence and that we justify our own mistakes by the circumstances in which they occurred.

Looking at the picture can help us get to the root of the matter. You may have doubts about the abilities of the road markers who did the work. Do not let yourself be distracted by any empathic feelings that may arise. These are probably due to the fact that the visible damage is very small and that you are neither responsible for the mistake nor for the qualification of the performers. Now, nevertheless, imagine that you are the manager of these very workers. This helps. Imagine further that you had to take them to task. How would you argue?

Think for a moment before you read on.

What do you think you would hear from the workers in this conversation? Let me give you a possible answer on their behalf: "We are sorry that this mistake happened; it was never our intention. It is the first time that something so stupid has happened to us. But today we were exposed to particular stress. The dispatcher failed to assign us the third man who is normally designated for this kind of job. Then we found out only on site that the police had not been notified for the necessary traffic signalization, and we had to make up for it. On top of that, we had to refuel our vehicle on the way to the site. This was necessary because the team that had used this truck yesterday had not done so, contrary to the customs in our company."

Self-Management

Take this answer as a tool for your personal reflection on the case. Do you succeed in taking it at face value? By that I mean: do you hear exactly what the employees said? Does it ring true to you, or do you see in it more of an attempt to make excuses? If so, why would the employees tell you a story that does not correspond to reality? They only do that if they have to fear being blamed for their unintentional actions. And they will only fear that if they have previously experienced blame as the normal reaction to errors in your company. Since you consistently react differently to work errors and honest mistakes, because you can trust your employees, the marking crew's response matches reality quite well.

So, you've managed to clear the first hurdle.

And now comes the second. It is no less challenging. Do you see that the causes of the unintended result can also be found in the context in which the marking work took place? External time pressure. Lack of resources. Can you imagine that this makes you, as a manager, part of the story? After all, you are responsible for the systemic, contributing factors. If you see this responsibility, then you have taken the second hurdle. Congratulations! You have just successfully defended yourself against the attribution bias. You have succeeded in not simply explaining mistakes away as the incompetence of others.

The third hurdle

If you now succeed in making it clear to the workers that you are responsible for the systemic causes and that you will do your part to eliminate them, you will have cleared the third hurdle. You have succeeded in not justifying your own mistakes with the circumstances in which they arose. You have refrained from making excuses to try to cover up your failures as a manager with outside influences.

Of course, your exemplary reflection does not absolve employees from taking responsibility for their actions. It makes a huge difference, however, how they take it. Whether they feel guilty and accept negative effects on their qualifications, or whether they develop suggestions that will help them avoid the same mistake in similar situations in the future. If you as a manager succeed in valuing such contributions from your team, that alone would be a good reason to want to work for you.

A whiff of change is in the air

The relentless dealing with attribution bias is an excellent way to stay fit, especially as a leader on a day-to-day basis. It's a recipe for building trust, and it has the potential to turn your company into a learning organization with a Just Culture. Thus, well prepared, you can resume your work with confidence after the vacations, because with these intentions in mind, things will change for the better in your environment. You could almost say that the vacation was worth it.

News
news-51 Fri, 23 Jul 2021 20:00:00 +0200 In the maze of complexity https://www.martin-wyler.ch/en/blog/detail/im-labyrinth-der-komplexitaet/ In the aftermath of the unfortunate emergency number outage in Switzerland, a journalist calls on Swisscom's CEO to resign. The reporter is guided by the time-honored error-blame paradigm. This should give us pause for thought. Not because we want to absolve executives of their responsibility. But because antiquated tools are being used here to try to straighten out something that can no longer be straightened out in this way.

Two weeks ago, the landline telecom network in Switzerland collapsed for about eight hours. Emergency numbers were also affected. Swisscom thus made it back into the headlines. Similar system failures, in which the national telecom service provider did not cut the best figure, are not long past. Since the emergency numbers were down on a night of severe storms, media coverage was correspondingly heavy. In the aftermath of the events, I made an interesting observation in an interview with the CEO of Swisscom: a journalist asked him whether, in view of the seriousness of the incident and the accumulation of similar incidents, it might not be time to resign.

Resigning would be equivalent to a severe punishment for the CEO. He would take responsibility for what happened. But what would be achieved? The public would have a scapegoat and would return to the 'courant normal' with the illusion that the problem had been solved. Case closed. Interestingly, it would do so even though, beyond having a questionable sense of justice confirmed, it knows full well that the problem has not been solved.

The journalist's leading question is understandable. It comes from the widespread and well-entrenched error-blame paradigm. Those who make mistakes must take responsibility for them, bear the consequences, accept the blame. This is done in the misguided belief that justice will be done and that the obviously incompetent person in charge will be replaced by someone who can do it better.

Let's get to the bottom of the matter and examine the error-blame paradigm for its suitability.

What happened?

Swisscom was performing maintenance work on a network element of a telephone platform for business customers. A software update caused a malfunction that triggered a domino effect. As a result, large parts of the network were affected by the malfunction. The reason why it took so long to fix the glitch was that the supplier of the network component had to be called in for this purpose. For a better understanding, it is worth mentioning that there is no system leadership for emergency calls in Switzerland. The CEO explains in the interview that numerous partners are involved in the emergency call system. It is like a machine with 1,000 cogs: Swisscom controls perhaps 700 of them, and the company has only limited or no influence on the remaining 300. But all these cogs have to mesh. Otherwise, the emergency call would no longer work.

Without knowing the details of what actually happened, we can imagine the chain of circumstances. It is extremely rare that maintenance staff go to work with the intention of producing a breakdown. It is equally unlikely that update software was deliberately programmed incorrectly. We can safely assume that the programmers were not aware of the undesired domino effect that their software would trigger. The fact that in this case the quality check of the software was not up to the requirements may be of systemic nature or may be due to the circumstances in the wider context in which this check was performed. The domino effect triggered by the software update could most likely not have been anticipated by the maintenance crew on site. It was probably more of an unknown phenomenon. To what extent the aforementioned 300 Non-Swisscom cogwheels in the emergency call system had an influence on what happened is beyond our knowledge. The only certainty is that they played a systemic contributing role.

That's how complexity comes along.

What may we conclude from this?

In this case, we are dealing with a system that is not under the control of a single organization. The system boundaries run far outside the Swisscom company. They extend into politics and the federal structure of Switzerland. Replacing the CEO would bring a person to the top of the company who only potentially knows or can do more, but who would be confronted with the same system. A system that cannot be changed with the push of a button within the company. Whether such a person could be found at all, of course, remains to be seen.

The characteristic of complex systems, apart from the thousands of mutual relationships within them, is the fact that they can no longer be controlled, only influenced. And that they are able to produce undesired events even though everyone worked correctly according to instructions. To reach here for the medieval punishment club is like turning up to an ice hockey match in jogging shoes. In a complex environment, the sanction has degenerated into a blunt tool. As we know, it even hinders the required learning process.

It makes much more sense to realize that, not least thanks to digitalization, we have built systems that are way out of step with our understanding of the law and our ideas of control and accountability. They relativize the importance of individual people, including bosses. We can replace people at any level of the hierarchy and will find that this does not solve the problems. Complex systems challenge our desire for control and ask us to find a different way of dealing with them.

On closer inspection, they ask only one thing of us: that we learn to understand them. We have developed them in many iterations, often over a long period of time, and there was never anyone around to keep the user manual up to date. So we have no choice but to take the painstaking reverse route and learn by observation how they really work. To do this, we need information and data, tips and reports from employees and supervisors that reveal systemic weaknesses. We need a culture in which reporting one's own mistakes is valued. Where people who report mistakes do not have to fear disadvantages or even sanctions. Where punishment remains in the management's toolbox only for the truly gross and deliberate.

Letting go

We must not resent the journalist's question about resignation. It comes from a long-gone era of simple systems. An era that has left a lasting mark on us, on our understanding of the law and on our legal system. So it is not surprising that in the whole interview the reporter never asked the CEO of Swisscom what should be done to protect the company from such undesirable events in the future. Let's hope the next generation of journalists will.

The widespread application of artificial intelligence is just around the corner. The accompanying massive increase in complexity demands a social and cultural development push from us in dealing with complex systems. This starts with the first step: Let's throw the error-blame paradigm overboard. It has done its job - may it rest in peace.

News
news-49 Fri, 09 Jul 2021 20:00:00 +0200 Safety-Management is Relationship-Management https://www.martin-wyler.ch/en/blog/detail/sicherheitsmanagement-ist-beziehungsmanagement/ When it comes to keeping the company's risks under organizational control, the first thing that sticks out in one's mind is safety management. What is often not considered is the standing of the safety experts in the company and thus their relationship with the risk owners. This blog article illustrates how significant this can be.

Safety only results from concretely implemented measures. Good intentions alone are not enough. But every safety measure has its price. Either in the form of a direct cash-out or in the form of slowed-down performance processes in the company. They always affect the performance of the company in some way. When approving safety measures, the responsible executives, the risk owners, must therefore perform a difficult balancing act between an abstract 'advantage', lower damage potential and/or lower probability of occurrence, and the concrete, visible 'disadvantage' that such measures cause. This is a weighing of two very differently constituted and thus hard-to-compare goods. The risk owners know that the measures will do their job, but they have to form a personal picture of their concrete effect. And they will not let themselves be deprived of that, despite all the safety experts' assurances about the effectiveness of these measures.
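The risk owner's balancing act can be made tangible with a little arithmetic. The following Python sketch is illustrative only and not from the original text; all figures and function names are hypothetical assumptions:

```python
# Illustrative sketch only: the risk owner's trade-off expressed as
# expected annual loss (probability x damage) versus measure cost.
# All figures are hypothetical.

def expected_annual_loss(probability: float, damage: float) -> float:
    """Abstract 'advantage' side: expected loss per year before/after a measure."""
    return probability * damage

def net_benefit(p_before: float, p_after: float,
                damage: float, measure_cost: float) -> float:
    """Risk reduction a measure achieves, minus its concrete visible cost."""
    reduction = (expected_annual_loss(p_before, damage)
                 - expected_annual_loss(p_after, damage))
    return reduction - measure_cost

# A measure that halves a 2% yearly risk of a 1,000,000 loss
# and costs 8,000 per year:
print(net_benefit(0.02, 0.01, 1_000_000, 8_000))  # 2000.0
```

Even when the arithmetic favors a measure, the 'advantage' remains statistical while the cost is immediately visible; that asymmetry is exactly what the risk owner has to bridge.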

Who guarantees the best advice?

The responsible managers face the risk that the safety measures they approve could prove unnecessary in retrospect. Consider the following example. A father wonders whether it is still worthwhile to keep fully comprehensive insurance for a car that is getting on in years. In asking this question, he is probably guided by whether a total loss of the vehicle would be financially bearable for the family or not. But now the crucial question: with whom does he discuss this consideration? With the insurance agent? With work colleagues, or with his partner of many years and mother of their two children?

The insurance agent is a bad advisor: he personally benefits from the sale of the policy. The work colleagues have the shortcoming that they cannot really judge whether he could financially cope with the total loss of the car; in any case, the outcome has no impact on their circumstances. That leaves his wife. She has the same view of the problem, as she 'manages' the family together with him. She knows what a total loss of the car would mean for the family. If she is willing to take the risk of partial coverage, she will share the burden in the event of a crash. And she also understands that renewing the comprehensive insurance will not make the family's finances grow to the sky. Whatever the decision, in a good relationship a sorrow shared is a sorrow halved.
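The father's dilemma can be framed in the same terms. This sketch is a hypothetical illustration (the numbers and names are mine, not the author's): expected value alone rarely favors insurance; what matters first is whether the family could absorb the loss at all.

```python
# Hypothetical illustration of the comprehensive-insurance question:
# insure when a total loss would exceed what the family can absorb,
# otherwise compare the premium against the expected loss.

def expected_loss(p_total_loss: float, car_value: float) -> float:
    """Expected cost per year of going without comprehensive cover."""
    return p_total_loss * car_value

def keep_comprehensive(premium: float, p_total_loss: float,
                       car_value: float, reserves: float) -> bool:
    if car_value > reserves:   # loss not bearable: insure regardless
        return True
    # Loss bearable: pure expected-value comparison
    return premium < expected_loss(p_total_loss, car_value)

# Ageing car worth 8,000, premium 600/year, 1% chance of total loss,
# 20,000 in family reserves:
print(keep_comprehensive(600, 0.01, 8_000, 20_000))  # False
```

The design point mirrors the blog's argument: the decision hinges on risk-bearing capacity, which only someone who shares the consequences, like the partner, can judge.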

Respect and recognition for the other party

And this is also how it is in the company when safety experts communicate with risk owners. As always in communication, a purposeful exchange only succeeds when the two meet at eye level. When they devalue each other neither as human beings nor by claiming superior argumentation in the matter. Only when the relationship between the two is characterized by respect for the other as a person and as a manager does a dialog emerge that can carry out the difficult weighing of two such different goods for the good of the company. Good advisors understand the risk owner's problem. They engage in the difficult assessment of goods that are so hard to compare. They acknowledge the adverse effects of safety measures on the business to the same extent as they present their request for such measures. They leave room for the responsible line manager to weigh the options and do not insist, as if haggling in a bazaar of views, that their perspective is the right one. And they are neither disappointed nor offended when their submissions are not acted upon. They understand that in such a case the risk owner has done nothing but take responsibility for the risk. In other words, they take care to build trust in every situation. Effective communication only takes place where there is mutual appreciation and recognition. The only people you listen to are those you trust.

A bold hypothesis

I put forward the hypothesis that a trusting relationship between the risk owner and the safety expert is better able to reduce the risk exposure of the company and is therefore more significant for the safety of the company than any safety department, no matter how highly endowed and professionally set up, whose superiors are not heard because there is a lack of trust on the part of the risk owners.

What needs to be done?

If you, as a safety, security, risk or compliance manager, are wondering what you can do to build trust with the responsible managers, the following questions may give you food for thought.

  • Do you and the line manager(s) have a 'shared mental model' of how the company's success is achieved?
  • Do you understand the risk owner's problem?
  • Do you tend to present the need for safety measures as an urgent problem, making the risk owner feel pressured?
  • Do you link your personal success to the number of safety measures approved and implemented, to the size of your budget, to the number of employees assigned to you? Or do you link your success to the success of the company?
  • Do you sometimes find it difficult to present the arguments for your proposed safety measures in a purely factual manner?
  • As a safety expert in the company, can you live well with a decision by the risk owner to accept a risk that you wanted to mitigate with measures? Or do you get frustrated when safety measures you have proposed are not approved? Do you find it difficult to deal with it in a way that does not affect your relationship with the risk owner? What is the reason for a possible frustration?

    • Because you were not proven right?
    • Because you see the rejection as a defeat?
    • Because it makes you feel uncomfortable to deliver the message to your team?
    • Because you don't understand it and the work is no fun that way?
    • Or because in such situations you are tempted to think: the risk owners will soon see how the risks occur and do their damage, so let them deal with it themselves!

If you want to learn more about your (incidentally quite predictable) reaction, there is a proven explanatory model for this:

www.martin-wyler.ch/fileadmin/user_upload/dokumente/MW-Factsheets-eng.pdf

This model is very helpful in avoiding miscommunication and restabilizing relationships that are in danger of going out of balance.

Addressing each of these issues can give you clues as to where and how you can work on your standing in the company so that you become a trusted advisor to line decision makers; so that you hold a position where you are respected and listened to. If my hypothesis is correct, this would be an important key to effectively reducing the company's risk exposure.

News
news-47 Thu, 24 Jun 2021 17:33:00 +0200 Thoroughness https://www.martin-wyler.ch/en/blog/detail/ernsthaftigkeit/ Whenever nothing should go wrong, the work must be approached with great thoroughness. This is a culturally deeply anchored expectation, especially in the high-risk environment. If things do not go well, it is once again a human error. Is that all a safety culture has to offer?

Recently, a good friend who still flies the Boeing 777 as a captain told me a story in a casual way. I couldn't get it out of my head. As a joke, he counted the pages of all the documents in which his airline and the authorities give him regulations for doing his job. He came up with the astonishing figure of 12,000. Now we know that not all guidelines are the same. But still. The number leaves its mark.

Let us now imagine that this captain, with his inner attitude characterized by seriousness and reliability, would bend over the 12,000 pages in all thoroughness. And he would read them in such a way that he could claim to know where what is written and to know the content and relevance of all the aspects presented. What would suffer or could be damaged by such an approach?

Quite obviously: efficiency. Because there are pages on which things are described that are of little importance for the execution of his job. Conversely, there are pages where there are things that require his full attention. Things that he needs to understand completely in order, as captain, to get his passengers to their destination safely, comfortably and economically.

And what would also be damaged?

Control mania at work

The story is a beautiful metaphor for the approach we take conceptually when we want to be sure nothing goes wrong. In the case of a flight captain's job, this is understandable because, after all, flying involves serious risks. We want to be sure that they are kept neatly under control. So it's tempting to explain in detail to the executing professional how he has to work. It's all about control. He has to work according to the instructions of those who, for example, designed the aircraft. The manufacturers guarantee the airline that flights in their aircraft are safe if the captain follows their guidelines. Also obvious is the serious face of the captain's boss, who looks him deeply in the eye before his flight and makes him understand that he has the full responsibility to follow the guidelines and rules with thoroughness. Because an action not carried out with the greatest possible conscientiousness would have to be judged as a mistake. Regardless of what the reasons might be. A work performance that disregards the rules would also be unacceptable. Because as far as the safe operation of the Boeing 777 is concerned, there is no quarter. The aircraft manufacturer, the aviation authority and his airline have written everything down to the smallest detail. It's all about correct, serious application of the regulations. A captain who does not take his job with the utmost thoroughness is a risk factor.

Really?

The traces of 12,000 pages printed with requirements

We see that this story places us in the middle of an environment where earnestness just drips from the ceiling. In addition to the overflowing rules and the bosses stressing responsibility, an expectation is painted on the wall of this culture in big letters: "Make every effort to do the given work with the utmost care and fidelity to the rules". In itself, there would be nothing wrong with that. At first glance, everything seems to be going right at this airline.

On his way to the flight planning office, the words of his boss run through the captain's mind again; they have made an impression on him. But they probably had an even deeper effect. As a human being, he knows that he is fallible. He is aware that he will get into situations in which he will not succeed in sticking to everything written in the 12,000 pages. That worries him. It scares some of his colleagues. It goes without saying that the credo he adopts in this culture is: "I make a genuine effort to complete the tasks assigned to me in all thoroughness and I will adhere strictly to the rules."

When thoroughness becomes an obstacle

He has half an hour to plan the flight. Together with his first officer, he studies the flight documents. There are many, many of them. His credo gets in the way for the first time that day. If he were to work through all the documentation with the thoroughness he has set his mind to, the flight would be delayed by more than an hour. There would be no objection to that for safety reasons. On the other hand, he and his first officer would then be under a lot of stress during the later preparation of the aircraft for the flight. In trying to make up for lost time (punctuality is an important requirement), mistakes would creep in. These would involve much greater risks than not studying the flight documents in all thoroughness.

By successfully standing up to his own credo and putting a stop to thoroughness, he has done something crucial for safety. He has shown that in rule-heavy, complex and dynamic systems characterized by many interdependencies, the human being is not a risk factor. On the contrary: he is a first-class safety factor.

He is the only entity capable of walking the difficult tightrope between thoroughness and efficiency in such systems. Safety is not created by making adherence to rules and thoroughness the maxim. Safety in complex systems can only be created if there is someone on the ground in the current situation who, with his experience and knowledge of the overriding interrelationships, decides for more or less thoroughness depending on the situation. This applies not only to airline captains.

People are gatekeepers and not risk-factors

The safety culture is therefore of decisive importance. It must be designed in such a way that it values the decision-makers in the organization. It must allow freedom and take care not to turn employees and managers into trained monkeys through excessive expectations of compliance. Those who culturally tie safety too strongly to compliance and the thoroughness it demands fall victim to a devastating basic assumption: the assumption that the system in which people operate in the company is perfectly built.

If we didn't have people, our systems would create an unacceptable number of headlines. Humans, with their fallibility, are not primarily a risk factor. They are the gatekeepers par excellence. Don't tie them to the goalpost with a misguided safety culture. Because it makes a difference which image they carry in their heads as they go to work: risk factor...? Safety expert...?

News
news-45 Thu, 03 Jun 2021 22:37:00 +0200 Speaker at the HSG Alumni Open Innovation Days 2021 https://www.martin-wyler.ch/en/blog/detail/speaker-an-den-hsg-alumni-open-innovation-days-2021/ I am pleased to be able to contribute to the HSG Alumni Open Innovation Days 2021. The hybrid event will take place from 1 - 9.6.21 and is an alumni 'homage' to the new HSG Learning Center being built at the University of St. Gallen. The learning safety culture will be the theme of my workshop, with a focus on the associated challenges for leaders. During the nine days, students, student associations, professors and HSG alumnae & alumni will offer various forms of learning, teaching or working together for everyone based in the HSG ecosystem.

At the Open Innovation Days, new ideas and formats will be implemented and tested in a kind of pre-lab. Students, student associations, professors as well as HSG alumnae and alumni will have various forms of joint learning, teaching and working ready to be presented during these nine days. This makes it possible to experience and feel the HSG Learning Center even before it opens its doors. The Open Innovation Days will encourage exchange between students, faculty, staff and people from the field. A fault-tolerant environment is created, which will also shape collaboration and co-creation at the HSG Learning Center.

The "Learn-Shop" I will moderate on June 7th is aimed at leaders in complex and risky environments and anyone who wants to learn more about what organizational safety and reliability have to do with them personally as decision makers. Highly complex systems require an interdisciplinary approach to bring risks under organizational control. In addition to safety management systems and human factor aspects, safety culture plays a critical role. It is shaped top-down and places special demands on managers.

News
news-43 Fri, 21 May 2021 21:00:00 +0200 Toxic company cultures https://www.martin-wyler.ch/en/blog/detail/von-toxischen-betriebskulturen/ And once again we hear about corporate failures in the press. This time in the financial sector, where questionable dealings are costing a Swiss bank a lot of money. There is talk of a lack of risk management and a toxic culture that puts profit ahead of safety. This case also shows that risk management that is not underpinned by a safety culture is completely useless.

As an interested reader of the daily press, I am currently being informed in detail about the unpleasant excesses of a toxic business culture. The latest case revolves around the systemic failure of a renowned Swiss banking institution, which recently had to accept staggering write-offs in its dealings with Greensill and Archegos. Needless to say, this matter has the potential to negatively impact my preconception of the financial industry and its players. At the same time, I realize that such a judgment would not do justice to many dedicated individuals, and even entire companies, in this industry. In this context, I would describe as toxic a culture that devalues risk in order to earn a lot from it in the event of good fortune. A culture oriented in this way is well protected. Because in the event of bad luck, there are managers in lower ranks who can be blamed for the damage that has occurred. And as we know, it is much more difficult to hold those accountable who fail to put an end to such a culture. This is despite the fact that leadership also bears responsibility for inaction and for systemic inadequacies. However, there are powerful protective mechanisms at play, which are widespread in hierarchically managed organizations with strong power imbalances. The motto: "Always have somebody between you and the problem". This is how one's own untouchability can be organized. The damage is socialized within the company. The second protective wall, especially in stock corporations, is the diversified ownership structure, which reliably paralyzes any rebellion of the owners. In principle, the business conduct now brought to light by the incidents relies on luck and thus on chance. It is a casino mentality in the true sense of the word.

Without the right culture, risk management is ineffective

A culture designed in this way has nothing to do with risk management. The discipline is practiced, but in the end it serves only as a fig leaf. Its purpose, namely to bring risks under organizational control, must be understood in a casino culture as a means of minimizing opportunities. This admittedly rather pointed description makes it clear that risks cannot be brought under control with tools such as risk management or safety management systems alone. A safety culture is needed. Risk managers and their systems can be as good as they like; if the culture sets other incentives, they become a toothless bunch in the back office.

What if human lives were at stake instead of money?

A look at high-risk industries shows that an airline, for example, which offers high-risk transportation to millions upon millions of passengers, has to operate differently. Like other high-risk industries, aviation is highly regulated, a circumstance that is due not least to its questionable safety record in the early years of its existence. Regulation, as even the financial industry knows today, follows hard on the heels of misconduct. My daredevil predecessors in the aviation business all too often relied on chance, and thus on luck, in the construction of flying machines and in their operation. Today, the industry has proven that it has emancipated itself in terms of safety. In 2019, only 283 people died in plane crashes while the industry transported just over 4 billion passengers.
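To put these 2019 figures into perspective, here is a quick back-of-the-envelope calculation. It assumes a round 4 billion passengers as a simplification of the "just over 4 billion" cited above:

```python
# Rough check of the 2019 aviation safety figures cited in the text.
# The round 4 billion passenger count is an assumed simplification.
fatalities = 283
passengers = 4_000_000_000

rate_per_million = fatalities / (passengers / 1_000_000)
passengers_per_fatality_m = passengers / fatalities / 1_000_000

print(f"~{rate_per_million:.3f} fatalities per million passengers")
print(f"~1 fatality per {passengers_per_fatality_m:.0f} million passengers carried")
```

Roughly one fatality per 14 million passengers carried, which underlines the claim that the industry has emancipated itself in terms of safety.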

So the question is whether bankers could learn from aviation when it comes to managing risk. There may be something presumptuous about the approach. After all, the way the financial industry presents itself today, we have to assume that it is still in a developmental phase in which it is common to indulge in heroism. This could make it difficult for individuals to descend into the lowlands of operational, risky business. After all, the recognition that the hero enjoys is based on his willingness to take high risks, which unpleasantly illustrates how hollow the pedestal is on which he stands. Perhaps the argument can be won by pointing out that the heroes of aviation have also climbed down from their high horses. Today, the former kings of the skies are the cool-headed risk and systems managers of captain's rank, embedded in polished industrial processes. Every day, they ensure the protection of their assets, the passengers, and on top of that strive for entrepreneurial (also monetary) success with all the means at their disposal.

Culture change as a compelling element for progress

Captains would not be such reliable leaders today if the transition from former hero to systems and risk manager had not been accompanied by a cultural change in the organization. The descent from the pedestal took a good twenty years. The starting point for this development was a series of accidents that people were no longer prepared to accept. Analysis of the events often showed that the heroes of the skies were not as good as they claimed to be. Confrontation with the well-founded accident analyses, which consistently revealed the human fallibility of the cockpit chiefs as a contributing factor, led to a change in thinking. Thus, in the 1980s, the ground was laid for the urgently needed cultural change.

A new understanding of leadership took hold and simply changed everything that had gone before: the understanding of the role of the leader, the idea of cooperation, the handling of risks and the meaning of hierarchy. The latter in particular was completely redesigned. The power differential was taken away and with it the platform for cultivating airs and graces. After all, it had become apparent that in risky environments a culture of hierarchical subordination is not very helpful: one that is associated with fear among employees and that interprets any reliance by the boss on resources in the team as an admission of personal inadequacy. Moreover, the temptation to abuse power for one's own purposes, if only to maintain one's own status, was too great. Hierarchy with power imbalances is a dangerous framework. The medical profession is equally aware of this. Unfortunately, an inhibiting factor remains: those affected ask themselves why they should embrace something that is to their disadvantage.

Countless completely unnecessary accidents, associated with an infinite amount of suffering, taught us in aviation that we had to regulate ourselves. Employees and managers became embedded in a different culture. There are studies suggesting that if we had not done this from the 1980s onwards, we would today have to put up with two to three crashes of commercial aircraft every day.

Culture as an effective protection against trouble

In the aforementioned transformation, we in aviation dealt not only with the further development of safety management systems, but also with the framework conditions for a safety-oriented culture. It is characterized by fear-free speak-up, both in the cockpit and in the company. By a team orientation that is linked to appreciation of the contributions of all employees. By a willingness to report organizational shortcomings and self-inflicted undesirable events, because mistakes are seen as learning opportunities. By the humble recognition of human fallibility. It is a culture in which managers do not simply shift responsibility to the person involved in unintended incidents. They act in the spirit of 'shared responsibility', attend to the contributory systemic factors, which may include deficiencies in the culture, and advocate for system improvement.

Such a culture helps to identify emerging trouble earlier and prevent risks from materializing. It does not leave the success of the organization to chance but ensures that dangerous solo runs by executives are considered a culture violation. It is a culture that saves the organization from damage and scandal because it promotes internal transparency. It is the necessary complement to risk management: only through a safety culture can risk management bring its strengths into play. A company that cultivates it can be trusted with your money.

News
news-41 Thu, 06 May 2021 20:30:00 +0200 Safety cultures need leaders, not managers https://www.martin-wyler.ch/en/blog/detail/sicherheitskulturen-brauchen-leader-nicht-manager/ Unwanted events such as incidents or accidents give managers the opportunity to further develop the safety culture in the company. How they deal with them is crucial to building and maintaining trust. Depending on the image of 'leadership' they carry within themselves, they will succeed or fail in this.

We can twist and turn it however we want. If we as leaders are confronted with an undesirable event and then react spontaneously without thinking, we have most likely already put our foot in our mouth. From there, we find ourselves in an uphill battle when it comes to building trust. In the last two blogs, we've gotten to the bottom of this. We are subject to various fallacies, all of which ensure that we struggle to get a factual picture of what actually happened in an incident, for example. We tend to rush to judgment, believing from hindsight and knowledge of the damage at hand that we know exactly what took place, and that minimal information is enough for us to have a coherent story of what happened. All these illusions lead us to exaggerate the importance of the involved individual as the causal person. They also prove that the contributory systemic aspects have a wallflower existence in our minds. If we do not cognitively defend ourselves against these misconceptions, the question of guilt will be forced upon us. It is not an expression of goodwill and, depending on the situation, it can even be hurtful for the accused. How is trust supposed to grow in this way?

Not stimulus-response, but systematic preparation

A safety culture can only develop in an environment of mutual trust. Trust exists when I can report mistakes I have made at any time and can be sure that I will not suffer any disadvantages as a result. Rather, I can make a constructive contribution to safety because others can learn from my experience. And because I know that by doing so, I am initiating a learning process that also addresses the question of what the system could learn. Whether I report a mistake I have made depends directly on how I assess the reaction of my immediate superior. I am very cautious about this and shy away from any risk. Who likes to incriminate themselves? Knowing this, it makes sense for all managers to consider how they will react in such situations and what attitude they will adopt. What they will say and which faux pas they will proactively avoid. In other words, preparation is the name of the game. Two questions are central: How can I build trust and what role do I take on as a leader?

Trust building

I don't know how you deal with it. I myself build trust in other people when they are benevolent toward me. When I don't run the risk of being hurt by them and when I know that they ultimately have good intentions. So it is precisely these three points that we can use in personal dialog and in preparing for a conversation. Am I succeeding in being benevolent to the person? What are my good intentions? What would hurt the other person?

Now, trust is not a one-way street, but develops reciprocally. Therefore, in a second part of the preparation, it is also worth asking the following question in advance: What would the person have to do to hurt me? And last but not least, in the face-to-face conversation I will pay attention to whether the other person is benevolent towards me and whether I can recognize his or her good intentions. This second part of the preparation is important for controlling the conversation. If I am hurt by the other person and/or met with distrust, I will make the relationship the issue. Discussing what happened then becomes secondary.

The role of the leader: judge or coach?

The role of a leader is not only defined by the mission, but just as much by the countless expectations of HOW to lead. Since we also have our own ideas about how we lead, this opens up a space that we need to take advantage of. By clarifying the HOW of our leadership, we interpret our role as a leader to a large extent ourselves, within bandwidths given by corporate culture. This interpretation is what it's all about. What image of 'leadership' do I carry within me? Am I the thinking and ordering authority who gives instructions and checks whether everything has been done correctly? If so, I have decided in favor of steering and control, live in the paradigm of the organization and function in a top-down manner. If you interpret leadership in this way, you will be forced into the role of judge in the event of an undesirable occurrence. The sense of justice will ensure that responsibility for what occurred is assigned to a person. With this comes the accusation, and thus the fear of the person or persons involved; what is sacrificed is trust. Such a view of things goes hand in hand with the repeated unwillingness of managers to accept joint responsibility for deficiencies in the system.

There is another way. If the image of leadership is oriented toward the whole; if employees are understood not as tools of an organizational machinery, but as valuable resources who do not go to work in the morning with the intention of making a mistake, who have special abilities and are at the same time fallible like every human being, then leaders slip into the role of coach. They provide support, make learning possible and, working together, not only seek to solve problems but also strive to improve the system. They assume shared responsibility. They create trust.

Shared responsibility for the system

Regardless of how a leader understands leadership, the call to take shared responsibility for identified system deficiencies is a hefty one. It is ultimately a matter of inner attitude and mindset. The path to this is not a Sunday stroll. When I set out, I have to deal with a strenuous question. Am I able to stand up and take shared responsibility for the shortcomings of the system, despite the fact that I normally have a very limited ability to influence it? This is not easy, because the systemic aspects that contributed to the event often have to be addressed at upper and highest management levels. The argument lands in my lap that, after all, I am only a small light and that my competencies, resources and possibilities will never, ever suffice to fix the problem. But what if every manager thought that way? Basically, this argument comes down to whether I see myself as a leader, as part of a management team, or as an externally determined subject with a job profile I have to obey.

Incidentally, it is worth asking oneself in this context whether it would be honest to demand that employees take responsibility for their actions without sharing responsibility for system deficiencies. Shared responsibility is a matter of the inner attitude of managers. Those who do not succeed in this are lacking in greatness. For many, this would still be bearable. But not for the company. Because an attitude that rejects shared responsibility severely torpedoes the trust-based safety culture. It leaves the problem to the employee, reduces and trivializes what happened, and abuses the power owed to the hierarchy for one's own purposes.

A long road that demands a lot from executives

The above-mentioned preparation for a discussion with those affected thus includes a specific examination of the systemic aspects of the incident. My experience shows that this is no easy task. It is an unfamiliar and therefore poorly mastered way of looking at things. It needs to be practiced like any leadership activity. Not only is it challenging to recognize the contributory elements; matters are further complicated by the delusion, implanted in us by Mother Nature, that the causes must have been the actions of the people involved after all. It is therefore not surprising that the importance of systemic aspects in the occurrence of undesirable events is still unduly belittled by executives. This circumstance is an expression of a poorly developed safety culture. We must not forget that the road to a trust-based culture is long. It begins where managers rethink and realign their interpretation of leadership, where they succeed in building trust and are able to provide psychological safety, and where they are able to recognize the systemic aspects of events and their significance. This path is a cultural change that cannot be accomplished without hiccups. For the first steps, it is therefore worth having a coach by your side who not only knows the way, but also ensures that the development of the organization stays on the chosen path.

News
news-38 Thu, 22 Apr 2021 21:00:00 +0200 Hindsight judgment comes from the high horse and undermines safety culture https://www.martin-wyler.ch/en/blog/detail/das-urteil-aus-der-rueckschau-kommt-vom-hohen-ross-und-unterminiert-die-sicherheitskultur/ Trust is the basic building block of any safety culture. Building and maintaining it in the organization is a matter of leadership. What sounds so casual turns out to be a special challenge for the management team. It requires consistent self-leadership, especially when it comes to overcoming the irrationality given to us by nature. The hindsight and outcome bias is one of these challenges.

In the last blog, we explored a perfidious irrationality that makes it difficult for us as leaders to ensure psychological safety in the organization. Unfortunately, it's not the only one we should pay closer attention to. Another handicap that nature has imposed on us is giving us a hard time in our efforts to embed a safety culture in the company that is worthy of the name. One that is not characterized by superiors perfecting their expectations of employee behavior and setting them out in ever more insistent appeals and detailed mission statements, directives and regulations. But one that is based on mutual trust and in which managers succeed in providing psychological safety as a cultural element. It is the phenomenon of the hindsight and results bias that we need to take a closer look at. This thinking error is also a distortion of perception. It is related to the problem of assigning blame and thus has a direct impact on the trust relationship. Trust, however, is an indispensable prerequisite for a promising safety culture.

The effects of the hindsight and outcome bias

First, this thinking error leads us to believe that if the outcome of an action was bad, the actions of the actors involved were also bad. In such cases, we assume that it was wrong decisions, poor situation analysis, or missed opportunities that were the reasons for the bad outcome. The hindsight bias is therefore quickly accusatory and undermines trust. The assumption that the actors, the people, were fallible is quite often the product of the thinking error we explored in the last blog. "What you see is all there is" (WYSIATI Rule). That perceptual bias that presents us unasked with a coherent story even based on minimal available information.

We can further observe that, in the light of the damage, actions are retrospectively assessed as mistakes, although in the situation itself the actors judged them as normal, reasonable and appropriate. Often, in retrospect, an action is therefore judged to be irresponsibly risky. Take, for example, a standard low-risk surgical procedure in which complications arise and the patient dies. In retrospect, the survivors, lawyers or judges will tend to believe that the surgery was risky from the beginning. They are convinced that the doctor should have known better.

Furthermore, this thinking error leads to a general overestimation of the probability of occurrence of the incident. And it makes us believe that the ability of the involved persons to correctly assess this probability is insufficient. This means nothing other than that at the moment we become aware of the damage situation, we believe that we can make a factually correct judgment about the probability of occurrence. This is an arrogance that belittles those who were involved in the events. If we could not attribute this error of reasoning to an irrationality inherent in our nature, we would correctly have to apologize for it.

These three effects of the hindsight bias undermine trust to a particular extent, because the cause of the damage in all of them implicitly always lies with the person and the blame is thus placed on him or her. Any unreflected reaction by a leader to an undesired event therefore undermines the building of a culture of trust. I will explore this issue in a forthcoming blog.

Of systemic importance, on the other hand, is the fact that with this perception bias, the reasons that led to the incident are viewed uncritically and root cause identification is handled far too superficially. This is because it makes us believe that we knew it all along. Thus, we already know the causes, and any further consideration of the case seems unnecessary. This observation shows that under such an interpretation of an incident, it becomes very difficult for the safety experts to obtain the resources in the company that a professional investigation requires. Yet only such an investigation enables the company to actually learn.

What is to be done?

Some of these consequences of the hindsight and outcome bias are difficult to bear for the persons involved in incidents. On the one hand, in combination with the WYSIATI rule, they cause judgmental superiors to fail to do justice to the people involved and the situation they were experiencing. This corrosively undermines the relationship and mutual trust. On the other hand, they prevent managers from taking responsibility for the organization and working to improve the system, because they believe they already know the root causes. Much is gained if leaders are aware of these effects. This helps them to overcome their negative impact and, for example, to see and understand the situation as it presented itself to those involved before the events took their course.

In my experience, it is therefore important in the context of safety culture development projects to give managers the opportunity to take a closer look at this robust and perfidious cognitive illusion. After all, we are reluctant to give up a notion that makes us believe we can grasp the unpredictability of the world. When we need to break away from the comforting "I've-always-known," we as leaders come face-to-face with loss of control. It's uncomfortable. It pays to have a coach by your side during such confrontations.

The highest and most important commandment in connection with the hindsight and outcome bias is never to judge the quality of a decision by its outcome, but always by the quality of the process that led to the decision or action. Seen in this light, it would be correct for managers who have made high profits by taking excessive risks, and who owe these profits to nothing but luck, to be sanctioned by their superiors. But because we are subject to the hindsight and outcome bias, we tend to attribute such managers' success to competence. All those who doubt these luck-favored momentary luminaries are labeled mediocre, timid and weak in light of the success they claim for themselves. This reflection illustrates the persistence of the hindsight and outcome bias. As leaders, we are challenged to counter it with consistent self-leadership. Those who set out to anchor a safety culture in the company are challenged to start with themselves.

Note

This call is addressed not only to executives, but in particular to prosecutors and judges. They are always in the position of hindsight. The hindsight bias helps the prosecuting party to present a coherent story, which may or may not coincide with reality. The effect of "it was obvious (given the harm) that the defendant took too great a risk" works well with the judging public as well as with many judges. It is good to know that there are increasingly judges who successfully resist this perfidious distortion of perception. What remains difficult for judges, however, is the fact that criminal law focuses not primarily on the courses of action and decision-making processes that led to the damage, but on the magnitude of the damage. It is time for our legislators to balance the two orientations.

News
news-36 Thu, 08 Apr 2021 20:32:00 +0200 Premature judgment kills safety culture https://www.martin-wyler.ch/en/blog/detail/das-vorschnelle-urteil-killer-der-sicherheitskultur/ For a safety culture worthy of its name, managers must overcome the error paradigm. They are challenged to ensure psychological safety within their sphere of influence, and the only way to do that is to build trust. To achieve this, they must confront irrationality, which is buried in the very nature of human beings.

In the last blog, I looked at the external conditions that are important to an internal safety culture. But they are only half the battle. If you don't clean your own house at the same time, even ideal external conditions will prove ineffective. Therefore, in the next few blogs, I would like to focus on specific leadership practices and leadership skills that are particularly challenging for the leadership team to build and maintain a safety culture.

Culture evolves from the top down in an organization. Therefore, the visible behavior of leaders is critical. Provided they have a good standing, they can offer themselves as role models and are thus in a position to change even harmful paradigms.

The nasty error paradigm

When it comes to organizational safety, reliability and resilience, one paradigm stands out as particularly obstructive: the error paradigm. Where there is fault, there is blame. As long as it is at work in the organization, people, be they managers or employees, will not commit to improving safety, efficiency and/or quality to the extent necessary. They are afraid and will withhold their support for improvement. This attitude is rational and understandable.

The relevant question, therefore, is when to do what to reduce fear, and what the role of leaders is in doing so. There are moments that lend themselves particularly well to creating impact as a leader: whenever it comes to dealing with an event that has led to an undesirable outcome. That is where everything that matters to the culture plays out. Case handling is THE window of opportunity for supervisors to influence the culture in the company in a direction-setting way. Incidents roll out the red carpet for them, on which their behavior can be communicated to the workforce in a particularly lasting way. Fortunately, such opportunities do not come along very often. It is therefore important to make good use of the few that do arise. It is always about the same thing: ensuring psychological safety. In the following, I would like to focus on one aspect that supports managers who set out to establish psychological safety. The most important element here is building and maintaining trust. This is easier said than done. On the way there, we need to understand powerful perceptual distortions (irrationalities) with which nature has endowed us. If we succeed in recognizing and resisting them, we have a good basis for building and maintaining mutual trust. Thus we can lay the foundation for psychological safety in our own sphere of activity.

WYSIATI Rule

Daniel Kahneman coined the term "What you see is all there is" (WYSIATI): only what you know at the moment counts. Psychological research shows that we form our opinions based on the information currently available. The success criterion of our brain is the coherence of the story it puts together from it. The quality and quantity of the underlying data is largely irrelevant. This is startling, but it is so. Our brain always makes a coherent story out of even vanishingly little available information, and that story then becomes our opinion of the matter at hand. Information that is not known, because it is not perceived, can per se have no influence on the construction of our reality. But it is there, and thus part of factual and rational reality.

When we as managers have to deal with events that have led to an undesired outcome, such as accidents with damages or losses of any kind, we are first dealing with the people. This is because they are involved in these events and we see their actions in the context of the events. WYSIATI works. Our brain, unasked, builds a story from the initial information that makes sense. This cognitive perceptual bias causally links the result of the event (outcome) to the persons acting, because they are 'visible' and we implicitly hold them responsible. Based on this information, the cause of the undesired outcome can only be the person who did not meet the demands made on him or her and thus made himself or herself guilty. To our brain, three inputs are enough for a coherent story: who (human), what (harm) and responsibility. This information makes it seem enormously plausible that the reason for what happened lies with the person involved, who was given responsibility for the task. Deeply anchored in us, the error paradigm 'where there is error, there is blame' unfolds its effect. WYSIATI has contributed significantly to the emergence of the error paradigm. This irrationality is heard over and over again in the shouting of those who learn of the undesirable event. They cry out: Who did it? If they were interested in the unknown information, they would shout: What were the reasons that led to it?

In complex systems, not only humans are at work

The error paradigm reveals that we still find it very difficult to recognize the causes of undesirable events in their diversity. We always see only the humans and desperately cling to their responsibility. This is despite the fact that we have created systems that massively limit people's ability to influence things. These systems have reached a complexity that obscures beyond recognition, or completely interrupts, the connection between a causal action and the effect derived from it. We bravely stick to this overwhelming responsibility of the human being, because we are afraid of the incalculable consequences if we were to give it up.

It is a strange feeling when you are responsible for 300 passengers in an airplane, and you can only tell the plane you are piloting where to go with a little side-stick. What the control computers do with your input you do not know, and you cannot verify it. The only thing you do know is that most of the time it turns out well and the computers deliver a useful result. But unfortunately, not always. That is what happened in the two tragic 737 MAX accidents, where the control computers interfered with the controls to such an extent that the pilots lost control of the aircraft. It is not everyone's cup of tea to take responsibility for jobs that are externally determined to such an extent.

Fighting back

We cannot prevent our brain from building a plausible story from very little information. But we can prevent it from forming an opinion about what happened. Because an opinion is connected with a judgment. Thus, every attribution of blame is also a judgment. Judgmental superiors must ask themselves what their purpose is in doing so. And to what extent this purpose is part of their job. Especially in companies that need to bring large risks under organizational control, aren't leaders there to make things safer and more reliable? Aren't they tasked with ensuring continuous improvement, efficiency and output quality?

Those who make this their mission are fighting tooth and nail against the irrationality of premature judgment. If this succeeds, it will not fail to have its effect. By refraining from judgment, the leader refrains from placing himself or herself above others. Something that may be difficult for certain leaders. But it sets the stage for a successful relationship and for trust. It clears the way for a look at events of a different kind. It opens up a view of all the influencing factors that contributed to the situation, and it allows us to ask why it made sense for those involved to act or decide the way they did in the situation they found themselves in. If managers and employees can work eye-to-eye in incident handling, they have a good chance of tracing a history of events that is close to what really happened. This allows adequate conclusions to be drawn and valuable learning processes to be set in motion.

Again, what is understood is not yet learned. My experience in culture development projects shows clearly that this discipline must be part of the leadership training tackled in the context of organizational development. It is not the easiest discipline, and for some leaders it is an enormous challenge, because it shakes their leadership philosophy.

News
news-34 Thu, 25 Mar 2021 12:11:00 +0100 It's time to rethink https://www.martin-wyler.ch/en/blog/detail/wendezeit/ Our well-balanced and perfected approaches to preventing misconduct and damage are increasingly proving to be fragile and ineffective. We meticulously create sets of rules and compliance structures, only to find that they do not prevent trouble. A critical examination of our approach is called for.

At regular intervals, we witness misconduct in companies and organizations. Last year, the federal companies SBB, Swisscom, Swiss Post and RUAG also made headlines with a series of unpleasant stories, so that politicians and the federal councils felt compelled to take up the matter. The tenor in Bern was that new laws were needed and controls had to be tightened. Leaving aside the fact that this activism was probably largely politically motivated, the parliamentarians displayed the reflexive pattern that can be observed time and again: demanding more guidelines and more monitoring in response to undesirable events or conditions. The same is regularly observed in the corporate world. If our response to lapses is to react with more rules and more controls, while at the same time we are surprised by them again and again anyway, we should take a closer look at this reflexive pattern of behavior. How successful and efficient is our concept when it comes to making things more reliable and safe?

Our 'compliance concept' rests on the assumption that rules and laws are the operating instructions of systems of all kinds, describing how those systems deliver predictable output. It also rests on the assumption that the people working in the system will adhere to these operating instructions, that they will be compliant. The third component of the concept is control, which ensures that the operating instructions are implemented correctly and that people comply with them. In a nutshell, the concept consists of rules, compliance and audit. But we find that this transactional, causal approach reveals its weaknesses to us over and over again.

Lack of competence

Investigations of undesired events in high-risk environments, where the analysis of what happened is carried out with particular thoroughness, show that in complex systems situations repeatedly arise for which the operating instructions provide no guidance. Further, we find that failures occur in our systems even though everyone has done everything 'right' in terms of following the operating instructions. It gets even worse: we also find that things have often turned out fine precisely because one or more actors in the system did not follow the operating instructions. All these situations show us, in an unpleasant way, that we are increasingly out of our depth when writing operating instructions for complex systems. We have to admit to ourselves that we lack the necessary competence.

Leadership is the order of the day

For the actors in complex socio-technical systems, the challenge of doing the right thing in the right situation is becoming ever more important. Simple compliance with the rules increasingly falls short of the mark. Conveniently committing only to compliance will have to give way to commitment to the matter itself, as unpleasant as that may sound. Responsibility for the matter at hand is coming to the fore. Leadership is the order of the day, for everyone: for management as well as for employees. If things are to be safe and reliable, everyone must take responsibility for the matter at hand. Compliance is a child of the organizational paradigm. As Frederick Winslow Taylor showed in "The Principles of Scientific Management" as early as 1911, it is part of the factory system of the industrial revolution. It belongs where man is understood as part of the machine, where everything can be controlled and monitored as in a machine, where the company becomes an organization. Compliance belongs where thinking work is separated from executing work. The fact that those who do the thinking must now learn that they are no longer up to their task puts the others, who were used to contributing by simply following the rules, in the spotlight of co-responsibility for the overriding whole.

Rethinking the importance of compliance

In the past, we have contributed a great deal to the rampant growth of compliance. As important as it may be in causal contexts, today it has a corrosive effect in complex environments, in times of change, crisis and transformation. It is time to reflect on the importance of compliance. We would do well to look closely at where it can still play its role, and to set about singling out those areas in the company where it is not up to the job, where its misaligned incentives cause collateral damage and where it has become blunt as a tool for reliability and safety. Inherent in compliance is the arrogance of assuming that the thinker is right. By now we know that no authority has ever succeeded in explaining the world to us, let alone in writing us an instruction manual for it.

The task of leadership

In view of this increasingly apparent overextension, humility would be called for, along with a willingness to embrace the new. If the task in our world is to navigate around cliffs, manage risks, avert damage and thus ensure safety, leadership will have to position itself differently. It will increasingly have to address the question of how a climate of trust can be built in the organization or company, and how to shape cooperation that withstands the rigors of complexity. It will have to turn to a credo that aims at understanding the system. It will make learners out of everyone, with itself as role model in front. It will build reporting systems and make every effort to ensure that those who report need not be afraid. All this helps to anchor organizational learning, to perceive events as opportunities and to understand human fallibility primarily in the light of system complexity. As a result, emerging problems can be raised early and with little inhibition. The probability decreases that decision-makers will find themselves in an escalation, or that they will learn from the press what is going wrong in their own company. If one traces the history of undesirable incidents, one often finds that there were people in the organization who knew about them. In most cases, there were also early indications of the developing event, but these were not systematically recorded and analyzed. The showcase on this topic is the unspeakable misconduct, and the intrigue associated with it, at Zurich University Hospital. There, the failure of the 'compliance approach' came to light in all sharpness, and under situational pressure the decision-makers were prompted to turn to safety culture. We are following the developments with interest. Hierarchies have to be broken down, silos have to be dismantled and key players have to become learners.

What we need

Against this backdrop, the reaction of politicians and the councils to the lapses in the federal companies turns out to be understandable, but neither efficient nor up to date. What the companies need is a legal framework that allows them to build sustainable safety cultures. That would be the job of parliament. There needs to be effective protection for reporting persons and for the safety data collected in safety management systems and in the 'Critical Incident Reporting Systems' of the health care sector. We need a strict separation of incident investigations dedicated to learning from law-enforcement investigations. We need criminal legislation that focuses not on the harm, but on the causes that led to the harm, and we need strong legislation that allows accountability not just of individuals, but of bodies and decision-makers who bear systemic responsibility. In our complex world, legal procedures are needed in which the important question can be answered of which is the greater good for society in the case at hand: to learn as much as possible from what happened, or to find and punish the guilty. And we need a more relaxed approach to whistle-blowing. In all of this, the councils would have a lot to do. That they have responded to the misconduct in the federal companies with scrutiny and suspicion has not advanced the much-needed changes to the legal framework one bit.

It's time to rethink.

News
news-32 Thu, 18 Mar 2021 21:06:00 +0100 Three new captains https://www.martin-wyler.ch/en/blog/detail/drei-neue-kapitaene/ Congratulations to Simon Billeter, Jürg Niemeyer and Raphael Jenni! The three experienced First Officers of the Swiss Air Force and Rega / Swiss Air-Rescue have successfully completed a three-day joint-leadership course with me. I wish you all the best, much success and satisfaction in your new, demanding role in the cockpit of the Bombardier Challenger!

In February, the three experienced first officers of the Swiss Air Force and Rega Swiss Air-Rescue, Simon Billeter, Jürg Niemeyer and Raphael Jenni, completed a three-day joint leadership course with me as aspiring captains on the Bombardier Challenger 650 / 604. The course covered important aspects of self-management, the tools and methods for handling extraordinary and crisis situations, and the leadership skills required in high-risk environments. The three days made them fit to continue and complete the practical captain training on the aircraft with confidence. The online format of the course was evaluated positively by the future captains. Thanks to Capt. Andy Siegenthaler for his support with moderation and content, and thanks to the dedicated crew of participants. I wish you all the best as you move into this challenging and wonderful job!

News