By Barry Santini

The Perfect Storm: “Normal Vision,” the Bias of the Exam Room, Higher-Order Aberrations and Doctor Discretion

When we approach refraction through just our knowledge of optics, we often miss seeing “the forest for the trees.” Serendipity dictated how the 20-foot testing distance for visual assessment became the benchmark: It was the longest dimension commonly found in rooms of the day. But this 20-foot testing distance does not possess a vergence equivalent to optical infinity, and it therefore introduces a real and important dioptric bias into all our testing. Let’s look at the numbers:

20 feet = 240 inches
240 inches x 2.54 (centimeters to the inch) = 609.6 centimeters = 6.096 meters
What is the dioptric value of a lens with a focal length of 6.096 meters?
1/6.096m ≈ 0.16 diopters

It should be clear from this example that all testing done at a distance of 20 feet is already removed from true optical infinity by about +0.16 diopters. Today, we often exacerbate this situation through the use of mirrors. Folding the already biased 20-foot reference distance into a more space-efficient 10 feet preserves the optical path length only if the patient fixates on the chart’s reflected image; a patient who instead fixates on the mirror or its frame mounting at 10 feet roughly doubles the vergence bias, divorcing the prescription even further from an accurate representation of distance focus. If you also combine examination protocols based on keeping the eye’s accommodation at rest (i.e., never over-minusing) with the serendipity of current fabrication tolerances as defined by ANSI, it’s really no surprise that so many progressive-lens wearers complain of unsatisfactory distance vision.
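
To make this arithmetic easy to check, here is a minimal sketch (in Python, added purely for illustration; it is not part of the original exam-lane math) of the vergence bias at the two distances discussed above:

```python
# Vergence (in diopters) of a chart at a finite distance. At true optical
# infinity the vergence is zero, so any finite test distance adds a bias.
FEET_TO_METERS = 0.3048

def vergence_bias_diopters(distance_feet: float) -> float:
    """Dioptric bias introduced by refracting at a finite test distance."""
    return 1.0 / (distance_feet * FEET_TO_METERS)

for feet in (20, 10):
    print(f"{feet} ft chart -> bias of {vergence_bias_diopters(feet):.2f} D")
# 20 ft chart -> bias of 0.16 D
# 10 ft chart -> bias of 0.33 D
```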

From the tales of woe in the early days of refractive surgery, we’ve learned about the impact that higher-order aberrations (HOAs) have on our subjective sense of sharpness. We have also learned that a stable, pre-corneal tear film is essential to proper evaluation of the eye’s HOAs. With this awareness, we now better understand how factors such as dry eye, allergies, the overly dehumidified environments common to most air-conditioned homes and offices, the reduced blink frequency of people who work at computers and even the time of day can impact not only perceived sharpness, but also our refractive findings.

The final ingredient in this exam-room recipe is a tricky one to uncover, as it originates from factors rarely revealed on any written prescription: namely, to what degree has the final Rx been modified, adjusted or “tweaked” from the true exam-room findings by the individual judgment and discretion of the refractionist? The examiner often acquires a “between the lines” intuition gleaned from careful listening to patients’ responses (mirror neurons at work again?) and will sometimes “adjust” final Rx values to reduce or soften the impact of abrupt changes in power, astigmatism or axis, or even a change in the accommodative balance between a pair of eyes. So we at the dispensing counter, trying to discover why a patient may be uncomfortable or unhappy with their new eyewear, are working with a distinct handicap—the absence of knowledge about how much examiner discretion may be hidden in the prescription numbers before us.

If we approach the subjective complexity and nuance hidden within every new prescription using only the cold light of objectivity, we will fail to see how these factors, the hidden, swirling ingredients of a “perfect storm,” act to undermine the goal of patient satisfaction.

As the majority of our training and education as eyecare professionals is founded in the “hard” science of optics, we tend to approach most vision problems with a biased perspective. When we answer questions or deliver explanations to new trainees or the lay public, we regularly use analogies that compare the eye’s cornea, lens and retina to camera lenses and film. Viewed within this context, it’s no wonder that our expectations, recommendations, standards of practice and the codification of the same into law often carry with them the imprimatur of scientific objectivity.

But the sensory experience of human vision is quite distinctly not an objective undertaking. It is parochial, shaped by evolution, natural selection and the instinct for survival. It’s not camera optics. Affected by age, genetics, environment, nutrition, general health and even time of day, vision is rarely stable. It’s fluid. Most importantly, our assessment of our own quality of vision is a subjective undertaking, which makes any approach to comprehending it purely through objective measurement significantly flawed.

Is it even possible to reconcile the conflict between remaining objective about the refractive process and acknowledging that our patients experience it in a fundamentally subjective, interpretive way? Let’s start by taking time to visit refraction’s great-grandfather, the process of visual assessment.

A Short History of Objective Visual Assessment
A few millennia ago, our ancestors discovered that the ability to discern, recognize and categorize star patterns helped them to predict the passage of time and improved their chances of finding food and detecting predators. Those individuals who possessed sharp vision and recognition skills became essential to the welfare of the tribal community.

Fast forward to the middle 1800s, when F.C. Donders, a professor of physiology in the Netherlands, became interested in vision. He created a simple formula to help in evaluating, quantifying and comparing human visual performance: visual acuity expressed as the ratio of the testing distance to the distance at which a “standard eye” can just recognize the same letter.

Defining the “Standard Eye”

Dr. Donders defined the baseline for a “standard eye” as one just able to recognize a letter subtending 5 minutes of arc from 20 feet away. Why did he choose 20 feet as his reference distance? It is speculated that 20 feet was chosen because it was the longest dimension regularly found in rooms of the era. (See sidebar: “The Perfect Storm.”)
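
That 5-arc-minute definition fixes the physical size of the optotypes themselves. As a quick illustration (a minimal Python sketch, not part of Donders’ original work), the height of a letter subtending 5 minutes of arc at the 6.096-meter test distance works out to just under 9 millimeters:

```python
import math

# Height of an optotype that subtends a given angle at the test distance.
def letter_height_mm(distance_m: float, arc_minutes: float = 5.0) -> float:
    angle_rad = math.radians(arc_minutes / 60.0)
    return distance_m * math.tan(angle_rad) * 1000.0  # meters -> millimeters

print(f"20/20 letter at 6.096 m: {letter_height_mm(6.096):.1f} mm")        # ~8.9 mm
print(f"20/40 letter at 6.096 m: {letter_height_mm(6.096, 10.0):.1f} mm")  # ~17.7 mm
```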

At Dr. Donders’ request, his co-worker Herman Snellen created a prototype of the standardized measurement tool still used today: the eye chart. Instead of simple letters, Snellen employed characters he termed optotypes, because he perceived that not all letters were equally resolvable to a common standard. He also wanted to ensure that comparative visual assessments done elsewhere would share a common benchmark for analyzing results. Nearly one hundred years later, Louise Sloan created a new set of optotypes because she felt not all of Snellen’s originals were equally recognizable. Thus, the evolution of the eye chart bears witness to the beginnings of the struggle to reconcile objective optical resolution with subjective perceptual recognition.

A tremendous number of significant discoveries, optical advancements and theories proposing how the eye works emerged from the middle 1800s to the early 1900s. This period is often referred to as the “golden age of ophthalmology.” Names such as Helmholtz, Abbe, Jaeger, Donders, Tillyer, Snellen, Green and Jackson were pioneers in understanding the human eye.

In the 1970s and 1980s, the latest revision of the Snellen eye chart resulted from work done in conjunction with the National Eye Institute’s study of diabetic retinopathy. The NEI showed how simple visual assessment can be used to help discover underlying vision disorders. But is visual assessment, or its more developed cousin, refraction, robust enough to fully explain the sense of how well we see?

Subjectivity in the Eye of the Beholder
Today we possess sophisticated instrumentation able to objectively determine the exact refractive state of a person’s eyes. Yet as complex and useful as these advanced scientific instruments are, delivering vision people find subjectively “satisfactory” often requires us to override and modify their findings, using feedback derived from subjective choices presented to those same individuals. ECPs, schooled primarily in science and “hard” technical skills, are sometimes puzzled when patients are “fickle or finicky” about what constitutes a subjective impression of sharpness. If you’re like me, you’ve wondered what mechanisms might underlie some of the seemingly capricious vision preferences patients demonstrate, and where those preferences originate.

To begin to understand these mechanisms, we first need to become acquainted with how evolution, and the goals of survival and procreation, influenced the development of sensory processing. Despite today’s enormous knowledge of the eye’s optics, eyecare practitioners can be frustrated by their inability to regularly and objectively attain a patient’s complete visual satisfaction.

Survival and Sensory Development

Simply stated, past generations passed on traits that increased their probability of survival. Amongst these was a way of handling the potentially overwhelming input of one’s senses, including vision, in a manner efficient enough to be processed by the primitive brain. Natural selection favored organisms that processed sensory input by assigning greater importance to changes in a stimulus over those that attempted to handle the full sensory input stream. In effect, individuals who paid more attention to changes in stimuli increased their chances of survival and their opportunities to procreate. For example, a potential predator exhibiting movement was a greater threat than one remaining stationary. Or take the development of instinctual migratory patterns: These journeys resulted from behavioral adaptations to changes in local weather patterns. Such traits not only increased the chances of survival, they also became “hard-wired” into sensory systems as unconsciously reflexive responses.

The most efficient mechanism for processing changes in sensory stimuli is based on making comparisons. For example, we are immediately aware of the unfamiliar odors in a friend’s home because our sense of smell quickly compares the odors of this new environment with familiar ones, and our brain responds to the change by increasing awareness. Yet if we spend enough time amongst these new odors, the olfactory sense fatigues from the continued input of the stimuli and our awareness dramatically decreases. This is referred to as neurological adaptation: We become “used to” the new stimulus.

Survival is improved by this type of efficient sensory processing. Our ancestors helped ensure their survival through the development and evolution of two very efficient traits: paying attention to changes in sensory stimuli, and adapting to the continued presence of a changed stimulus through neurological adaptation, which reduces sensory fatigue.

Survival and Its Influence on Our Visual Development

Evolution’s twin goals of survival and procreation have directly shaped some of the most basic, characteristic traits of our visual system. Below are a few examples, along with some speculations about their origin:

Why do we become increasingly alert when in the presence of the color red? Red is the color of blood; its presence signified that either food or danger was nearby.

Why do our eyes tend to adopt a close focus, rather than fixating at a greater distance, when we’re in a darkened environment?

This trait, called night myopia, is said to have ensured survival by readying us to more quickly recognize a predator or food when in dark conditions, or when suddenly being startled from a sleeping state. In these circumstances our eyes exhibit a focus bias of two to three feet.
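
For scale, the same vergence arithmetic used earlier converts this two-to-three-foot bias into diopters (again a minimal Python sketch, added for illustration):

```python
# Express the night-myopia focus bias (2 to 3 feet) as a myopic shift in diopters.
FEET_TO_METERS = 0.3048

for feet in (2, 3):
    diopters = 1.0 / (feet * FEET_TO_METERS)
    print(f"{feet} ft near focus -> roughly {diopters:.1f} D of myopic shift")
# 2 ft near focus -> roughly 1.6 D of myopic shift
# 3 ft near focus -> roughly 1.1 D of myopic shift
```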

Why do we naturally favor a lowered, depressed eye position for reading?
Our ancestors discovered that the availability of food or the threat of a predator was recognized more quickly when their gaze was directed at the ground immediately in front of them, rather than at a similar distance at eye level or above.

What is the origin of stereoscopic vision?
This trait improved our estimation of the distance to food or predators. It also accompanied three common reading traits, eye depression, convergence and accommodation, which the process of evolution has forever married together.

Why do progressive lenses with distance corrections that lie near “plano” often present severe adaptation issues for patients?
The initial impressions of “swim” heard from first-time progressive lens wearers, or the negative comments from patients who have experienced abrupt base curve changes, are best understood if we look at how nature evolved the light receptors known as rods in the periphery of our retinas. Quick recognition of the movement of even a single blade of grass in our peripheral vision could often spell the difference between finding food and going hungry, or between evading a predator’s stalk and falling victim to it. Unlike the cones, which are wired nearly one-to-one for fine detail, many rods converge onto each neuron in the visual pathway, a design that sacrifices resolution to maximize the sensitivity and speed of peripheral motion detection.

Mirror, Mirror, on the Wall...

In the last two decades of the 20th century, researchers in neuroscience rediscovered a perspective on sight and vision first observed by the ancient Greeks and revisited by Renaissance scholars: The process of human vision is not a passive one. Sight is more than just our eyes receiving photons bouncing off the world. Though light is refracted and focused on our retinas, the totality of vision is far more complex than modeling it as some type of camera.

Dr. Daniel Glaser, a neuro-vision researcher at University College London, says: “Seeing is a process of projecting what you expect out into the world and constantly matching your experience, your prejudice, your expectation with what's out there.” This subjective perspective supports the idea that our vision is a complex descendant of the sensory stimulus-processing, comparative paradigm discussed above.

The receipt of a visual stimulus and its conversion into a usable neurological signal first occurs in the retina. Further processing in the brain is then accomplished through the action of structures known as mirror neurons. The primary analytical process mirror neurons employ is founded on comparison. Their evolutionary importance is intimately tied to the way cooperation and deference to an organizational hierarchy, by facilitating the survival of the tribal group, also improved the survival of each individual member. Mirror neurons evolved to help us observe and process the actions of other members of a group. Socially sensitive traits, including empathy, sympathy, trust and doubt, all helped ensure the survival of early human groups. These emotions improved the chances of finding good mates, evaluating the threat potential of rivals and enjoying the security and benefits of living constructively together in groups.

The study of mirror neurons has been said to reveal the underlying social nature of our brain’s organizational structure. It should then be obvious why the sense of vision is better modeled through evaluation and comparison than through the refraction of light alone. Herein may lie the best understanding of the reasons behind so many common perceptual complaints we hear from patients when their prescriptions, base curves and lens designs are changed. Grasping how evolution has hard-wired humans’ subjective sense of vision to respond to these changes is key to comprehending why comparative comments such as “the floor looks tilted,” “the doorway is distorted” or “I feel weird with my new eyewear” are best addressed with the reassurance that “you’ll get used to it.”

The Psychology of Refraction
Further complicating our ability to approach refraction objectively is the inherent, psychologically nuanced nature of the subjective part of the eye exam. Who has not heard a patient say, “The exam choices are so confusing. I’m never sure I gave the right answer!”? When we prompt patients to participate in the examination by asking them to decide whether “one or two,” “this or that” or “A or B” is clearer, sharper or better, why is it that people cannot say “I don’t see any difference” with real confidence? Moreover, patients are seldom reassured, at the very point when they see “no difference,” that this part of the choice-tree is now complete. By leaving unaddressed the feeling that they may not have made the “right” choice, we invite the specters of doubt and intimidation to hide or color their expectations of a clear view through their new eyewear.

Can We Really Be Objective?

Pursuing a strictly objective approach to fulfilling the recipe contained in a new prescription thus leaves out important ingredients. Understanding how age, environment, medications, neurology, eye disease, evolution and even time of day influence your patient’s subjective sense of sharpness and perspective is essential to offering acceptable explanations for what they see and feel with their new eyewear. The cost of ignoring the effects that millions of years of evolution have had on the subjective sense of vision would be the loss of a most prized quality—nuance.


Barry Santini is a New York State licensed optician based in Seaford, N.Y.