It isn't the examiner's job to verify everything has been taught.
Examiners sample the areas, but are not required, or even
expected, to cover everything.
My favorite examiner was a stickler for the instructor endorsements.
I asked him why he was so particular about making sure they were
all correct, and everything was there. He said:
"It's the instructor's responsibility to cover the aeronautical
skills and the knowledge and prepare the applicant for EVERYTHING
in the PTS. When the instructor endorses and signs this,
they are saying the pilot is trained. I just give the test.
I can't possibly test everything, and I'm not going to. But if
I uncover something missing, that reflects on the instructor,
not the student."
This examiner is also good at doing exactly what the PTS
says. Buried in the many words in there, one example says:
"Examiners shall test to the greatest extent practicable the
applicant's correlative abilities rather than mere rote enumeration
of facts throughout the practical test."
This examiner never got nitpicky, but would test correlation
only in the fundamental areas.
For example, the student might fly coordinated very well,
understand yaw and roll, and describe rudder and ailerons and
even parrot back adverse yaw. But in the air, the examiner
may ask for a slow roll rate into a steep bank, then try the
same thing with a fast roll rate. If the applicant can't
CORRELATE what he was asked on the oral exam, and apply
more rudder pressure during higher roll rates, then they
FAIL the standard.
So instructors are required to cover everything. And they are
required to teach to proficiency, not just at the rote,
understanding, or application levels. They are required, by the
PTS, to teach pilots to the highest level of learning.
Correlation.
When the instructor signs off saying the applicant is prepared for
the practical test, they are saying the applicant has correlation
for all of the skills to be tested.
Not obscure weather terms, not the manufacturer names of
yaw-indifferent static ports, not the number of pounds of
force exerted on a tiedown at different windspeeds, and
not how density altitude affects variometers.
Not this obscure rote garbage. Correlation. When two
windsock tails a mile apart point at each other, what does this
MEAN? What is happening? What are you going to do about
it?
The minimum standard, straight from the PTS, is correlation,
and I think it is quite a high standard indeed.
Yes, there are instructors who give ZERO ground instruction.
And there are some students who can learn it all on their own
or in the air. But I hear what Terry said, and the instructors
who sign off that they've covered wind shear and wake turbulence,
or assembly procedures, when they have NOT, are simply unethical
and unprofessional.
My CFIG FAA ASI examiner said the same. He said the CFI endorsement
carries a LOT of weight.
Two years ago a CFI signed off a student for an instrument test.
The student got to the "holds" portion of the flight test, and
when asked to do a hold, the student said "I've never done one
of those in flight before." It turns out the CFI had signed her
off as proficient, but had never taught a single hold in flight or
in a simulator. And there was no record of any such training anywhere
in the logbook.
Well, the student got some of her money back from the CFI, the FAA
issued the CFI a letter, and the CFI got a VERY bad rep out of this.
Yes, CFIs and even examiners go bad sometimes. Some are too easy,
some are too hard. I, for one, go through every single line
of the reg and endorse longhand for each item before I endorse for
a solo or practical test or privilege. Every single time I find
I've missed some part of it, and I take that opportunity to cover
wind shear or assembly or how to evaluate runway lengths at airports
of intended landings or ...
Any of you who think the bare minimum PTS standard, or the
bare minimum regulatory standard of part 61, is too lax,
well, I disagree...
If you're arguing that some CFIs or examiners are signing off stuff
they haven't done, I agree with that, and that is a whole
different subject of ethics.
In article .com,
wrote:
Terry wrote:
That said, examiners who do their own thing can make it very hard
on instructors.
Thanks for sharing your perspective.
UH
======================================================================
I hope I did not give the impression that I am making up my own
checkride for I am not. If an applicant meets the PTS during my time
with him, then he passes. As it should be. Any examiner that is
running his own checkride does not deserve nor should he continue to
hold his status.
By raising the bar, I meant that as an INSTRUCTOR, I should
always be looking for higher standards from my students. After all,
getting the student there is what instruction is all about.
Terry Claussen
Thanks Terry: I agree we should all be expecting more than barely good
enough. I have seen some examples of examiners making up their own
stuff, and it can make you crazy. The standards are a bit mushy, which
makes it more complicated, especially for someone who is new. I'm sure
all of us who have been doing this for a while have our own "hot spots",
that is, things we commonly see as weak points in the pilot population.
I'll share a few of mine and maybe some other folks can add to the
list.
#1 Poor energy management in the landing pattern - an over-application
of "speed is your friend". I'd estimate that 2 out of 3 pilots I check
for the first time would hit the fence at the far end of a small field.
#2 Failure to create a plan for developing events. The simple lack of
recognition of a need for this is far too common.
#3 Poor general airmanship - especially in slow flight. Most pilots do
not know how to fly in the stall range. I include in this flying the
glider in a stalled or partially stalled condition.
Anybody else want to jump in here?
UH
--
Mark J. Boyd