FADEC = complex
Neil Gould writes:
> One gets plenty of clues that something is going awry prior to this
> happening.
No, one does not. The whole purpose of automation is to mask
information that contains such clues, in order to reduce the workload
for the pilot. As an autopilot moves the ailerons of an aircraft to
maintain heading and attitude, this is completely transparent to the
pilot for the most part, unless he actually looks out at the ailerons
or keeps his hands on the controls (in some aircraft). If he were
constantly being reminded of the autopilot's actions, there wouldn't
be any advantage to having an autopilot.
> It appears that you are describing another form of pilot error. If one
> believes that one can set an autopilot and then take a nap, *that* is the
> problem, not the behavior of the autopilot.
A lot of commercial pilots do that. Long trips can get pretty boring.
> Given that so few accidents can be charged to the failure of these
> devices, it may be reaching to claim that some unreasonable level
> of danger is presented by their use.
A lot of accidents have occurred when automated systems allowed crews
to lose their situational awareness. Autopilots are particularly
implicated in this respect, perhaps because they've been around so
long and work so well.
--
Transpose mxsmanic and gmail to reach me by e-mail.