By Barry Santini
As the majority of our training and education as eyecare professionals is founded in the “hard” science of optics, we tend to approach most vision problems with a biased perspective. When answering questions or delivering explanations to new trainees or the lay public, we regularly use analogies that compare the eye’s cornea, lens and retina to camera lenses and film. Viewed within this context, it’s no wonder that our expectations, recommendations and standards of practice, and the codification of the same into law, often carry the imprimatur of scientific objectivity.
But the sensory experience of human vision is quite distinctly not an objective undertaking. It is parochial, influenced by evolution, natural selection and the instinct for survival. It’s not camera optics. Affected by age, genetics, environment, nutrition, general health and even time of day, vision is rarely stable. It’s fluid. Most importantly, our assessment of our quality of vision is a subjective undertaking that makes an approach based on comprehending it through objective measurements significantly flawed.
Is it even possible to reconcile the conflict between remaining objective about the refractive process and acknowledging the fundamentally subjective, interpretative nature of that same process for our patients? Let’s start by taking time to visit refraction’s great-grandfather: the process of visual assessment.
A Short History of Objective Visual Assessment
A few millennia ago, our ancestors discovered that the ability to discern, recognize and categorize star patterns helped them to predict the passage of time and improved their chances of finding food and detecting predators. Those individuals who possessed sharp vision and recognition skills became essential to the welfare of the tribal community.
Fast forward to the mid-1800s, when Dr. F.C. Donders, a physiology professor in the Netherlands, became interested in vision. He devised a formula to help evaluate, quantify and compare human visual performance.
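The formula itself does not survive in this text; in its commonly cited form, the acuity fraction that grew out of Donders’ and Snellen’s work is (notation here is ours, not the original):

```latex
% Snellen acuity fraction (commonly cited form; notation is illustrative)
V = \frac{d}{D}
% V = visual acuity
% d = testing distance (e.g., 20 ft)
% D = distance at which the smallest correctly read optotype
%     subtends 5 minutes of arc for the standard eye
```

Thus a person who must stand at 20 feet to read what the standard eye can read at 40 feet has acuity 20/40.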
Defining the “Standard Eye”
Dr. Donders defined the baseline for a “standard eye” as being just able to see a letter 5 arc minutes high at 20 feet away. Why did he choose 20 feet as his reference distance? It is speculated that 20 feet was chosen because it was the longest dimension regularly found in rooms of this era. (See sidebar: “The Perfect Storm.”)
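The geometry behind that baseline is simple arithmetic. As a quick check, this sketch (the function name and defaults are illustrative, not from the original) computes the physical height of an optotype subtending 5 arc minutes at 20 feet:

```python
import math

def optotype_height_mm(distance_m=6.096, subtense_arcmin=5.0):
    """Physical height of a letter that subtends `subtense_arcmin`
    of visual angle at `distance_m` (20 ft is about 6.096 m)."""
    half_angle_rad = math.radians(subtense_arcmin / 2.0 / 60.0)
    # Height is twice the opposite side of the half-angle triangle.
    return 2.0 * distance_m * math.tan(half_angle_rad) * 1000.0

print(round(optotype_height_mm(), 2))  # roughly 8.87 mm
```

This agrees with the familiar figure that a 20/20 letter on a chart at 20 feet is a bit under 9 mm tall.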
At Dr. Donders’ request, his co-worker Herman Snellen created a prototype of the standardized measurement tool still used today: the eye chart. Instead of simple letters, Snellen employed characters he termed optotypes, because he perceived that not all letters were equally resolvable to a common standard. He also wanted to ensure that comparative visual assessments done elsewhere would share a common benchmark for analyzing results. Roughly a century later, Louise Sloan created a new set of optotypes because she felt not all of Snellen’s originals were equally recognizable. Thus, the evolution of the eye chart bears witness to the beginnings of the struggle to reconcile objective optical resolution and subjective perceptual recognition.
A tremendous number of significant discoveries, optical advancements and theories about how the eye works emerged from the mid-1800s to the early 1900s, a period often referred to as the “golden age of ophthalmology.” Figures such as Helmholtz, Abbe, Jaeger, Donders, Tillyer, Snellen, Green and Jackson were pioneers in understanding the human eye.
In the 1970s and 1980s, the latest revision of the eye chart resulted from work done in conjunction with the National Eye Institute’s study of diabetic retinopathy. The NEI showed how simple visual assessment can be used to help discover underlying vision disorders. But is visual assessment, or its more developed cousin, refraction, robust enough to fully explain the sense of how well we see?
Subjectivity in the Eye of the Beholder
Today we possess sophisticated instrumentation with the ability to objectively determine the exact refractive state of a person’s eyes. Yet, as complex and useful as these advanced scientific instruments are, delivering vision people find subjectively “satisfactory” often requires that we override and modify their findings, using feedback derived from subjective choices presented to these same individuals. ECPs, schooled primarily in science and “hard” technical skills, are sometimes puzzled when patients are fickle or finicky about what constitutes a subjective impression of sharpness. If you’re like me, you’ve wondered about the mechanisms that might underlie some of the seemingly capricious vision preferences patients demonstrate, and where those preferences originate.
To begin to understand these mechanisms, we first need to become acquainted with how evolution, and the goals of survival and procreation, influenced the development of sensory processing. Despite today’s enormous knowledge of the eye’s optics, eyecare practitioners can be frustrated by their inability to regularly and objectively attain a patient’s complete visual satisfaction.
Survival and Sensory Development
Simply stated, past generations successfully passed on traits that increased their probabilities of survival. Among these was a way of handling the potentially overwhelming input of one’s senses, including vision, efficiently enough to be processed by the primitive brain. Natural selection favored organisms that processed sensory inputs by assigning greater importance to changes in stimulus over those that attempted to handle the full sensory input stream. In effect, individuals who paid more attention to changes in stimuli increased their chances of survival and opportunities to procreate. For example, a potential predator exhibiting movement was a greater threat than one remaining stationary. Or take the development of instinctual migratory patterns: these journeys resulted from behavioral adaptations to changes in local weather patterns. Such traits not only increased the chances of survival, they also became “hard-wired” into sensory systems as unconsciously reflexive responses.
The most efficient mechanism for processing changes in sensory stimuli is based on making comparisons. For example, we are immediately aware of unfamiliar odors in a friend’s home because our sense of smell quickly compares the odors of this new environment with familiar ones, and our brains respond to the detected change by increasing awareness. Yet if we spend enough time amongst these new odors, the olfactory sense fatigues from the continued input of the stimuli, and awareness dramatically decreases. This is referred to as neurological adaptation: we become “used to” the new stimulus.
Survival is improved by this type of efficient sensory processing. Our ancestors helped ensure survival through the development and evolution of two very efficient traits: paying attention to changes in sensory stimuli, and adapting to the continued presence of a stimulus change through neurological adaptation, which reduces sensory fatigue.
Mirror, Mirror, on the Wall...
In the last two decades of the 20th century, researchers in neuroscience rediscovered a perspective on sight and vision that had been essentially lost since it was first observed by the ancient Greeks and later revisited during the Renaissance: the process of human vision is not a passive one. Sight is more than just our eyes receiving photons reflected off the world. Though light is refracted and focused on our retinas, the totality of vision is far more complex than modeling it as some type of camera.
Dr. Daniel Glaser, a neuro-vision researcher at University College London, says: “Seeing is a process of projecting what you expect out into the world and constantly matching your experience, your prejudice, your expectation with what's out there.” This perspective supports the idea that our vision is a complex descendant of the stimulus-processing, comparative paradigm discussed above.
The receipt of a visual stimulus and its conversion into a usable neurological signal first occurs in the retina. Further processing in the brain is then accomplished through the action of structures known as mirror neurons. The primary analysis process mirror neurons employ is founded on comparison. Their evolutionary importance is intimately tied to the way facilitating the survival of a tribal group, through cooperation and deference to an organizational hierarchy, also improved the survival of the individual member. Mirror neurons evolved to help us observe and process the actions of other members of a group. Socially sensitive traits, including empathy, sympathy, trust and doubt, all helped ensure the survival of early human groups. These emotions improved the chances of finding good mates, evaluating the threat potential of rivals and enjoying the security and benefits of living constructively together in groups.
The study of mirror neurons has been said to reveal the underlying social nature of our brain’s organizational structure. It should then be clear why the sense of vision is better modeled through evaluation and comparison than through the refraction of light alone. Herein may lie the best explanation for so many common perceptual complaints we hear from patients when their prescriptions, base curves and lens designs are changed. Grasping how evolution has hard-wired humans’ subjective sense of vision to respond to these changes is key to understanding why comparative comments such as “the floor looks tilted,” “the doorway is distorted” or “I feel weird with my new eyewear” are best addressed with the reassurance that “you’ll get used to it.”
The Psychology of Refraction
Further complicating our ability to approach refraction objectively is the inherent, psychologically nuanced nature of the subjective part of the eye exam. Who has not heard a patient say, “The exam choices are so confusing. I’m never sure I gave the right answer!”? When we prompt patients to participate in the examination by asking them to decide whether “one or two,” “this or that” or “A or B” is clearer, sharper or better, why can’t they say “I don’t see any difference” with real confidence? Moreover, patients are rarely advised at this very point that seeing “no difference” means this part of the choice-tree is now complete. By leaving unaddressed the feeling that they may not have made the “right” choice, we invite the specters of doubt and intimidation to color their expectations for good vision through their new eyewear.
Can We Really Be Objective?
Pursuing a strictly objective approach to filling the recipe contained in a new prescription thus leaves out important ingredients. Understanding how age, environment, medications, neurology, eye disease, evolution and even time of day influence your patient’s subjective sense of sharpness and perspective is essential to offering acceptable explanations for what they see and feel with their new eyewear. The cost of ignoring the effects that millions of years of evolution have had on the subjective sense of vision would be the loss of a most prized quality: nuance.
Barry Santini is a New York State licensed optician based in Seaford, N.Y.