Report on the Air France Flight 447 Accident
Lalitya Dhavala
Decision: The pilot flying (PF) perceived a sudden change in the CAS indications amid heavy thunderstorms. He comprehended this as a loss of speed, and hence of altitude, and projected that they would face worse conditions if they continued at the same altitude.
Thus he decided to increase altitude and fly above the clouds. The immediate stall warning may have been ascribed to the fluctuating speeds rather than to an actual stall. Communication and teamwork between the pilots broke down, and the lack of discussion about the situation reflected poor CRM. In reality, only the indications had gone wrong; the aircraft was not losing speed.
The PNF identified the unreliable airspeeds and the reduced protections of alternate law, and vaguely suggested that they should control the thrust (BEA). However, his directions were not clear to the PF, who was already struggling to keep the aircraft stable. The flight director commanded a pitch-up based on erroneous calculations, and even though the pilots knew the systems were unreliable, the PF chose to trust his instruments. Humans detect fewer automation failures when their instruments have been constantly reliable, as seen in Fig. 3.
The pilot selectively matched the automated cue to this rule and decided to pitch the aircraft up continuously; this is an example of rule-based decision making. (Fig. 3: graph showing detection of automation failures under constant reliability; Parasuraman and Riley.) The pilots had already had one stall warning, which they assumed to be false, probably ascribing it to the erroneous airspeed.
The A330 Instructor Support document states that in the event of unreliable airspeed, stall warnings may be triggered by an erroneous Mach number, and acknowledges that this is very disturbing (A330 Instructor Support). The PF had a high commitment to his decision: even though pilots are trained to push the nose down in the event of a stall, he disregarded this in favour of his own decision to climb.
In fact, the stall also manifested itself in the form of reduced airspeed and Mach number and a change of vertical speed from a climb to a descent. All these cues were missed by the PF, as they contradicted his commitment to climbing. The PF also did not acknowledge that the aircraft was in alternate law. In normal law, the A330 cannot be stalled even with full side-stick input (A330 Instructor Support); relying on that fact, the PF may have believed the stall warning to be false.
He was perhaps startled by how the relatively stable situation had deteriorated into the chaos he was seeing. At extreme angles of attack the airspeed measurements were rejected as invalid and the stall warning fell silent, giving the crew a false illusion of safety under which the PF resumed the nose-up input. When FO 2 took over the controls and pushed the nose down to avoid the stall, the airspeed indications returned and the stall warning sounded again (BEA). They were doing the right thing but were misled by the warning logic.
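The misleading on/off behaviour of the warning follows from the stall-warning logic described by the BEA: angle-of-attack values are treated as invalid when the measured airspeed drops below roughly 60 kt, which inhibits the warning. A minimal sketch of that logic, assuming illustrative names and an illustrative 10-degree warning threshold (these are not Airbus values):

```python
# Sketch of the inhibition logic described in the BEA report: the stall
# warning is driven by angle of attack (AoA), but AoA data is rejected as
# invalid when measured airspeed falls below about 60 kt, silencing the
# warning. Threshold and names are illustrative assumptions.

STALL_AOA_DEG = 10.0      # hypothetical stall-warning AoA threshold
MIN_VALID_CAS_KT = 60.0   # below this, AoA is rejected and the warning is inhibited

def stall_warning(aoa_deg: float, cas_kt: float) -> bool:
    """Return True if the stall warning sounds."""
    if cas_kt < MIN_VALID_CAS_KT:
        return False      # AoA considered invalid: warning inhibited
    return aoa_deg >= STALL_AOA_DEG

# Deep stall with extreme AoA but measured airspeed under 60 kt: warning silent.
assert stall_warning(aoa_deg=40.0, cas_kt=50.0) is False
# Pushing the nose down raises measured airspeed past 60 kt, so the correct
# recovery action paradoxically re-triggers the warning.
assert stall_warning(aoa_deg=40.0, cas_kt=90.0) is True
```

Under this logic, the warning stops precisely while the stall deepens and restarts when recovery begins, inverting the feedback the crew expected.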
Panicked by this warning, FO 1 pulled back on the side-stick again. The pilots, desperate for direction from the Captain, received none, because the Captain was unaware of the situation and it would have been unwise of him to take the controls.
Already strained, the pilots did not communicate with each other, and control was passed from left to right without any announcements. The stall warning, going on and off, put the pilots under extreme confusion and pressure (BEA). Only when the aircraft had descended to a low altitude did FO 1 announce that he had been holding the stick back the whole time and did not understand how they could be falling (BEA). This critical piece of information told the other pilots that they were truly in a high-altitude stall. The side-stick design, however, had become a missing link in the mental models of the PNF and the Captain: neither knew that FO 1 had been continuously pulling back on the side-stick, they could not see his action, and the asynchronous design did not let the PNF feel the motion.
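The consequence of the non-coupled design can be sketched simply: the two side-sticks are not mechanically linked, and simultaneous inputs are algebraically summed, clipped to full deflection, so neither pilot can feel what the other is commanding. A simplified illustration (function names and the normalised limit are assumptions, not Airbus software):

```python
# Simplified sketch of non-coupled side-sticks: simultaneous pitch inputs
# are algebraically summed and clipped to maximum deflection. Neither stick
# moves in response to the other, so one pilot's input is invisible to the
# other. Names and scaling are illustrative assumptions.

MAX_DEFLECTION = 1.0  # normalised full stick travel

def combined_pitch_command(left_stick: float, right_stick: float) -> float:
    """Sum both side-stick pitch inputs and clip to the physical limit."""
    total = left_stick + right_stick
    return max(-MAX_DEFLECTION, min(MAX_DEFLECTION, total))

# One pilot pushes half nose-down while the other holds half nose-up:
# the inputs cancel, and neither can tell from his own stick why the
# aircraft is not responding.
assert combined_pitch_command(-0.5, +0.5) == 0.0
```

This is why, without a verbal announcement, FO 1's sustained back-stick could remain invisible to the PNF and the Captain.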
The Captain instructed him not to climb again and to reduce the pitch attitude. (Figure: A330 cockpit with side-sticks; Airliners.) The Ground Proximity Warning System alerted them to a high sink rate and terrain, urging the pilots to pull up. In the face of such a complex situation, they did not think of alerting the cabin crew or the passengers; they were preoccupied with an unfamiliar, fluctuating, and incomprehensible situation. The aircraft crashed, killing all aboard (BEA). Although all the pilots were experienced and had been trained in Unreliable Airspeed procedures (BEA), their stall-recovery training covered only the lower flight levels.
Reliance on automation and a lack of hand-flying practice were major passive problems that kept them from reverting to basic pitch-and-power flying. Their CRM training may also be called insufficient, as it did not help the pilots communicate and coordinate during the emergency.
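The "pitch and power" idea can be made concrete: when airspeed data is suspect, the crew holds a known pitch attitude and thrust setting for the current phase of flight rather than chasing the indicated speed. The sketch below illustrates the lookup; the table values are assumptions indicative of Airbus-style unreliable-speed memory items, not an operational reference:

```python
# Illustrative sketch of pitch-and-power flying during unreliable airspeed:
# hold a memorised pitch/thrust pair for the phase of flight instead of
# reacting to erroneous speed indications. Values and names are assumptions
# for illustration only, not an operational Airbus table.

UNRELIABLE_SPEED_TARGETS = {
    "climb_below_FL100":  {"pitch_deg": 10.0, "thrust": "CLB"},
    "cruise_above_FL100": {"pitch_deg": 5.0,  "thrust": "CLB"},
}

def pitch_and_power(phase: str) -> tuple[float, str]:
    """Return the (pitch, thrust) pair to hold while speed data is suspect."""
    t = UNRELIABLE_SPEED_TARGETS[phase]
    return t["pitch_deg"], t["thrust"]

# At cruise altitude, hold a small nose-up attitude with climb thrust.
pitch, thrust = pitch_and_power("cruise_above_FL100")
```

The point of the procedure is that attitude and thrust are trustworthy even when every speed indication is not, which is exactly the reversion the AF447 crew did not make.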
It is reasonable to believe that better communication and coordination could have saved the flight. The multiple warnings and cautions did not provide useful cues for diagnosing the problem, and the misleading stall warnings tricked the pilots into resuming the wrong action when they were on the right track. The indication that the aircraft is in alternate law is limited, which presumably led the pilots to think the aircraft could not be stalled.
Air France had been alerted to the problem of misbehaving pitot tubes by other crews, and procedures were in place to repair or replace the pitot tubes on all its aircraft (BEA). This flight, however, was allowed to depart without the modification, on the assumption that nothing would go wrong, given the delay and cost involved in grounding the entire fleet. The Asiana crash landing at SFO in July 2013 was likewise attributed to the crew relying too much on automation and failing to practice basic flying skills (Aviation Week). Decision-making and communication skills also have to be emphasised.
For example, the flight directors should be turned off permanently following such a failure, reducing the possibility of misleading commands. The stall warning should also provide guidance on how to recover, similar to the existing GPWS. A visual indication would also be beneficial, since the cockpit is largely a visual rather than an aural environment.
For example, when the aircraft enters alternate law, the system could present the pilots with a reference to the implications and recovery techniques, making the environment less stressful. I thank Nicklas Dahlstrom, my professor for Human Factors; the motivation and inspiration he provided were an added benefit to his invaluable guidance and support.
I thank all the authors of the books and journals I referred to, who supported me with a wealth of information and guidance, with a special mention to the developers of Google. I am also grateful to my friends for proofreading the report and providing valuable input, and to my parents, who were always there for me with encouraging words and generous wisdom.
Finally, I thank the Almighty for providing me with the strength and courage to complete this report.
Early loss of altitude and buffeting: what should one do? With the nose-down input, the speeds become valid again, the angle of attack decreases, and the stall alarm reactivates. "Now it's good... it's coming back to level [...] no, no, it won't."
Again, nose-up inputs are recorded and the stall alarm sounds once more. Contradictory orders are exchanged in the final dialogues inside the cockpit: "Put the wings level."
"I'm doing that as much as I can." "No, no, go down!" "We are at less than 4,000 feet." "It's not true!" In the A330, the ECAM proposes the actions to be performed in most cases of failure or emergency during flight. The plane begins to leave its flight envelope, and the stall alarm sounds 74 times in 54 seconds. There were, however, other signs in the cockpit besides this alarm that could indicate that the aircraft had stalled: the loss of altitude, an artificial horizon showing a nose-up attitude, and the buffeting.
According to the BEA, these signs do not seem to have been clear enough to the pilots, since no standard callout on the abnormal pitch attitude and vertical speed was made. Thus, according to the BEA, the pilots did not understand that the aircraft had stalled and, consequently, did not apply the possible recovery manoeuvres. Recognising the stall alarm, associated with the buffeting, presupposes that the crew assign the alarm a minimum of legitimacy.
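The two cues the BEA lists, a nose-up attitude together with rapid altitude loss, jointly imply a stall, because in steady, wings-level flight the angle of attack is approximately the pitch attitude minus the flight-path angle. A worked illustration with round numbers of the order of those in the BEA's reconstruction (the exact values here are illustrative):

```latex
% Angle of attack from attitude and flight path (steady, wings-level case):
\alpha \;\approx\; \theta - \gamma
% With a nose-up attitude of about +15^{\circ} and a steep descent giving a
% flight-path angle of roughly -25^{\circ}:
\alpha \;\approx\; 15^{\circ} - (-25^{\circ}) \;=\; 40^{\circ}
```

An angle of attack of this order is far beyond the stall angle of any airliner wing, which is why a nose-up attitude combined with rapid altitude loss is itself a stall indication even when the alarm is silent.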
This presupposes, however, sufficient prior experience, a minimum of cognitive availability and understanding of the context, and knowledge of the aircraft, its protection modes, and the physics of flight (BEA).
In addition to the non-identification of the stall alarm, Co2's action of pulling the stick instead of pushing it was also accounted for by the human factors (HF) group: "[...] the excessive nature of copilot Co2's actions can be explained by the surprise and emotional charge of the autopilot disconnection, amplified by the crew's lack of practical training in manual flight at high altitude [...]" (BEA).
The justification of the BEA and its HF group is therefore based on human sensory limits, the emotional charge of the situation, and the lack of proper training for identifying and managing the anomalies that occurred: the failure to recognise the stall alarm, the failure to identify the stall itself, the fact that Co2 pulled the stick rather than pushing it, and the deviation that went unrecognised.
These, according to the BEA, are the reasons that led the pilots to commit the piloting errors considered primary. But is it not strange that such an experienced crew committed them? Selective attention mechanisms even assign different weights to the different senses.
A recent electrophysiological study of a piloting task confirms that the appearance of such visual-auditory conflicts in high-workload situations is reflected by a selective attention mechanism that favours visual information and leads pilots to neglect critical audible alarms (Scannella). Similarly, training plays a key role in the recognition and control of risk or heavy-workload situations. Constructing a response from previous knowledge presupposes, in real time as the event occurs, the incorporation of the anomaly into the mental representation of the situation, which may entail a construction or reconstruction of the representations previously formed.
Thus, the correct perception of the situation by a crew, which can improve the reliability and speed of diagnosis and decision, is linked not only to the way the situation is presented to that crew but also to their training and previous experience (Jouanneaux). It remains necessary, however, to clarify, beyond these physiological mechanisms or gaps in training, how selective perception operates in the course of action. The conclusions of the BEA's HF group add very little to the explanations published in previous reports on the accident, because they do not take into account the course of action and the interaction between the set of parameters that make up the context.
After the accident, several publications (e.g. BEA; Otelli) considered Co2's action of pulling the stick unacceptable, stating that some basic safety rules had not been respected.
"The actions of copilot Co2 can be described as sudden and excessive [...] They are inappropriate and do not correspond to what is expected of a pilot in this phase of flight at high altitude [...]" (BEA).
Otelli points out that, according to the laws of aerodynamics, in a stall it is necessary to push the stick, not the reverse, pitching the aircraft down to regain lift. According to the author, these are basic rules of aerodynamics that apply to all types of aircraft. Explanations like these are clear, but only because they represent the view of an outsider, who occupies a privileged, asymmetrical position relative to the copilot inside the cockpit: the post-accident analyst knows for certain what position the plane was in at that moment, and therefore what rule should have been applied to keep it under control.
There is no point, however, in reaffirming this rule as an explanation, or in reinforcing it as a safety procedure, because it skips the most crucial step in the course of actions and events that led to the crash: why could the copilot not build an adequate representation of the aircraft's condition? Analyses based on the behaviours expected for anticipated anomalies are insufficient because they do not take into account the representation of the current situation, only the context known from the data, established, of course, after many hours of analysis and with the help of several experts.
As discussed above, the error should not be the conclusion of an investigation but its starting point (Dekker). Item 6 of this article develops these issues more deeply. The question to be answered, however, should be: why, in the situation they were in, did the pilots not understand it, and why did they commit the primary errors identified in the investigations?
The inconsistency between the information coming from the automatic systems and that perceived and interpreted by the pilots is the key to answering this question and understanding their behaviour; it requires considering the immediate crew-cockpit-aircraft situation in flight and the dynamics of their interrelationships. To understand the situation, it is not enough to compare the state of the world at a given moment (the aircraft's position) with the information available in the cockpit [...] Human operators perceive and act according to the mental representations and perceptions that mediate their relationship with the situation, not on the basis of an exhaustive copy of external reality.
Based on the approaches of situated action and distributed cognition, item 6 develops this comprehensive analysis and then addresses the following questions. Why were the signals the aircraft emitted for the stall, such as the alarm and the buffeting, not recognised? Why did Co2 pull the stick, worsening the stall condition? Why did the Commander, on returning to the cockpit, not react more forcefully and directly assume command? Answering these questions only from our descriptions of the context, or even from explanations of the actions based on situated action theory, can lead to errors of interpretation and an analysis that is unreliable from the pilots' own point of view.
Thus, to try to fill the absence of the main actors, the pilots, we confronted the possible explanations supported by situated action theory with the explanations of pilots and ex-pilots given in official interviews about the accident, in particular as to the meaning the crew may have assigned, in the heat of action, to the signals provided by the aircraft during the crash. The immediate sequence of events that led to the accident begins with the freezing of the Pitot probes.
From this fact, depending on the context in which the occurrence was embedded and on the subsequent actions of the pilots, a sequence of events generated various signals in the cockpit, not all of them perceived or properly interpreted by the pilots: the autopilot disconnects; the speed indicator shows a sharp drop; various alarms sound in the cabin, sometimes simultaneously and sometimes separately; the altitude indicator shows a rapid, progressive decrease while the artificial horizon shows a nose-up attitude; the buffeting begins and continues until the impact with the water.
The misunderstanding of the context was not reduced by the warning signals from the ECAM, which added to the unreliable indications that lasted from the start of the upset until the end of the flight. Much has been said about the pilots' lack of understanding of the aircraft's stall condition. There is, however, a prior level of misunderstanding that determined the sequence of events: the speed of the aircraft had not changed, despite the electronic indicator showing otherwise.
When the autopilot stops working, the cockpit information system presents the pilots with information at odds with the actual state of the plane. Only the immediate consequence of the anomaly, a sudden reduction in indicated speed, is displayed by the flight computer, not its cause.
Once the probes freeze, the pilots see an abnormal speed on the panel and act according to this sign, believing it to be true. This signal, however, is not real: the speed of the plane had not changed, only its indication. The initial malfunction is in the speed sensor, not in the speed itself. All operations from then on are carried out according to this interpreted situation, not the actual one. The pilots do not know the actual speed at which they are flying.
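The distinction between indicated and actual speed follows from how airspeed is measured: the indicator derives speed from the pressure difference between the pitot (total) and static ports. Schematically, in the standard low-speed pitot-static relation:

```latex
% Incompressible pitot-static relation: indicated airspeed is computed from
% the difference between total (pitot) pressure p_t and static pressure p_s,
% referenced to sea-level density \rho_0.
V_{\text{ind}} \;\approx\; \sqrt{\frac{2\,(p_t - p_s)}{\rho_0}}
```

Ice obstructing the probe corrupts the measured total pressure $p_t$, so the displayed speed collapses while the aircraft's aerodynamic speed is unchanged; the fault is in the pressure measurement, not in the flight condition.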
During a flight that had so far been without incident, the first abnormality signal available is the speed indicator.
However, the pilots are unaware of the anomaly in the Pitot probes. The shifting speed displayed on the panel did not necessarily mean an anomaly in the speed probes; the anomaly could have been in any other system component. The failure of the Pitot probes became obvious only after the accident. In this sense, one hypothesis formulated by Co2 could have been a real change in the speed of the aircraft, not just a change in the electronic indication.
With no information about the actual speed and without knowing of the technical incident, pulling the stick may have been Co2's attempt to respond to the speed, whose indications were presented as below normal.
The aircraft begins to fall and emits several signals: the buffeting begins, the ECAM indicates a speed of zero, the altitude decreases progressively, and the artificial horizon is outside the stable position, with the wings off the horizontal axis.
For the pilots, these signs were incomprehensible, and this situation lasted until the impact with the water. Thus, by itself, the sensation of turbulence was not an indicator of abnormality and could be taken as the result of external conditions. From the moment the plane stalls, the altitude indicator shows a progressive loss, and the pilots constantly refer to it. At the same time, the artificial horizon indicates that the wings are not horizontal, demonstrating the instability of the aircraft.