
Treatments for Visual Field Loss in Glaucoma

Visual perception is considered to be the most significant and important sense for humans. The eyes gather approximately 80% of the total information transmitted to the brain from the various sense organs. Hence, adequate visual function is strongly associated with good quality of life. Glaucoma is one of the top three ocular disorders worldwide that can lead to severe visual impairment.1-3 The term glaucoma describes a group of optic neuropathies whose clinical characteristic is the progressive loss of retinal ganglion cells (RGCs). It is associated with changes in the structure of the optic nerve head (ONH) and the retinal nerve fibre layer (RNFL), and with specific functional defects in the visual field (VF).4 It is often, but not always, associated with high intraocular pressure (IOP).5 6 It is usually slowly progressive and asymptomatic in its early stages. Unfortunately, the structural and functional damage caused by glaucoma is irreversible. Once detected, it requires life-long management with medication and/or surgical intervention to decelerate, or even stop, further progression of the disease. If left untreated, it can lead to visual impairment and blindness.7-12

The prevalence of glaucoma increases with advancing age, affecting 1.5% of people over 40 years of age and 4% of those over 75.13 Important risk factors include race and family history of glaucoma; see Figure 1.1 for a detailed list of risk factors associated with open-angle glaucoma.14 After taking these risk factors into consideration, the National Health Service (NHS) in the UK established schemes that offer free annual eye tests for first-degree relatives of glaucomatous patients over the age of 40 and for all patients over 60 years of age.15

1.1.1 Pathophysiology of Glaucoma

The progressive loss of RGCs and gradual degeneration of the optic nerve (ON) are the main characteristics of glaucomatous optic disc neuropathies. Multiple elements are believed to have an important role in glaucoma’s pathophysiology. According to various theories, factors like elevated IOP and vascular dysregulation contribute to the glaucomatous atrophy of one of the RGCs’ basic compartments, the axons, at the lamina cribrosa.16 These two elements can result in the alteration of the ON microcirculation and cause changes in the laminar glial and the connective tissue at the level of the lamina.17 The “cupping of the optic disc” is a characteristic change in the ONH and an indication of where RGC axons have been lost (see figure 1.2). Death of RGCs in glaucomatous human eyes occurs by apoptosis, a programmed cell death process that takes place without any inflammation.18

High IOP seems to have a role in RGC apoptosis; however, it is still unclear how important that factor is. There is good clinical evidence that IOP reduction often helps to decelerate the progression of degenerative structural changes. However, not all glaucoma patients present with high IOP. Various studies have shown that only one-third to one-half of glaucomatous study populations presented with elevated IOP in the early stages of the disease.20 21 On average, 30-40% of patients with glaucomatous VF defects have normal IOP values when diagnosed.22 Thus, an elevated IOP is now considered a major risk factor for glaucoma, rather than the cause of the disease.

The findings that therapeutic control of IOP is in many cases insufficient, and that glaucomatous changes are observed in individuals with normal IOP, suggest a critical role for other factors in the induction and progression of degenerative changes. Circumstantial evidence points towards an association between vascular insufficiency and glaucoma. A positive association has been observed between glaucoma and dysregulation of the cerebral and peripheral vasculature, such as migraine and peripheral vascular abnormalities respectively.23-26 To understand this association properly, we need to understand how the autoregulatory mechanism differs between a healthy and a glaucomatous eye.

The high metabolic demands of the vital structures in a healthy eye are met by constant blood flow in the retina and the ONH. To maintain a constant rate of blood flow, an efficient autoregulatory mechanism operates over a wide range of day-to-day fluctuations in ocular perfusion pressure, which depends on both the systemic blood pressure and the IOP.27 These autoregulatory mechanisms are not as vigorous in ageing individuals as in the young. Deficient autoregulation leading to ischaemia may therefore contribute to the development of glaucomatous neuronal damage with increasing age. Glaucoma patients have been shown to have chronically reduced ONH and retinal blood flow, especially those diagnosed with low systemic blood pressure leading to reduced ocular perfusion pressure.28-30 Thus, reduced diastolic perfusion pressure is now recognised as another significant risk factor for glaucoma.31 The progressive decline in cerebral and ocular perfusion observed with increasing age supports the definition of glaucoma as an age-related disease.32 33

1.1.2 Classification and Types of Glaucoma

There is no single, mutually exclusive classification system for glaucoma; this partly reflects our incomplete understanding of the pathophysiologic processes. Types of glaucoma can be classified in many ways. For instance, classification can be based on aetiology (primary and secondary), type of occurrence (chronic and acute), the outcome of gonioscopy (open- and closed-angle) and IOP measurement (normo- and hypertensive).34 Primary glaucoma, either open-angle (POAG) or angle-closure (PACG), accounts for over 90% of glaucomatous cases observed worldwide.35

The time of onset may also be used to specify the type of a glaucomatous condition. Glaucoma cases of late onset are the most common, with an average age at detection of approximately 65 years.34 Those of early onset include congenital or developmental glaucoma, the most representative condition being juvenile open-angle glaucoma (JOAG); a rare, often inherited condition that affects 1 in 10,000 infants and develops after the third year of life.36 Lastly, glaucoma can be classified as genetic or acquired.34 Congenital or infantile glaucoma is evident either at birth or within the first few years of life.

There are three broad types of glaucoma: POAG, PACG and secondary glaucoma. The American Academy of Ophthalmology defines POAG as “a progressive, chronic optic neuropathy in adults in which IOP and other currently unknown factors contribute to damage and in which, in the absence of other identifiable causes, there is a characteristic acquired atrophy of the ON and loss of RGCs and their axons. This condition is associated with an anterior chamber angle that is open by gonioscopic appearance”, therefore allowing aqueous humour to access the trabecular meshwork. POAG is the most common type of glaucoma in Europe, accounting for more than 80% of primary glaucoma cases.37

As mentioned above, closure of the anterior chamber angle, caused by multiple mechanisms, is associated with PACG. Pupil block is an important factor in the pathogenesis of the majority of PACG cases. The pressure in the posterior chamber becomes higher than that in the anterior chamber, causing forward bowing of the iris, which blocks the trabecular meshwork and obstructs the outflow of the aqueous humour. As a result, IOP is elevated, which can potentially lead to damage to the RGC fibres.38-40

Secondary glaucoma includes conditions such as pigmentary glaucoma, pseudoexfoliative glaucoma and uveitic glaucoma. Pigmentary glaucoma is associated with pigment dispersion syndrome, a disorder of the iris and ciliary body. Mechanical liberation of pigment from the iris pigment epithelium clogs the angle and reduces the aqueous outflow. Glaucoma secondary to pseudoexfoliation syndrome (PXF) is called pseudoexfoliative glaucoma and is caused by the accumulation of fibrillogranular material that reduces the outflow of the aqueous humour.

Blunt trauma to the globe can raise IOP and lead to traumatic open-angle glaucoma. Various mechanisms underlie this type of secondary glaucoma, from angle scarring and physical damage to obstruction of the aqueous outflow by debris. Uveitic glaucoma is associated with various uveitic conditions of the anterior or intermediate part of the eye. As expected, there is a reduction in aqueous outflow, caused either by trabecular changes or by trabecular obstruction. Other secondary types of the pathology are neovascular glaucoma, which is associated with irregular vessel growth, and aphakic glaucoma, which develops in aphakic patients, mostly after cataract surgery.19

1.1.3 Epidemiology of Glaucoma

Glaucoma affects more than 60 million people worldwide, with an estimated 8.4 million people blind because of the disease (table 1.1). Women are affected more than men, representing 59% of all glaucomatous cases, while Asians are the largest affected group, comprising 47% of all glaucoma cases and 87% of PACG cases. The prevalence of glaucoma is projected to increase with population growth and longer life expectancy; it is estimated that, by 2020, 76 million people will be affected by the disease and 11.2 million will be severely visually impaired.35 41

Glaucoma is the second leading cause of blindness globally, after cataract. Estimates from 2010 reported that 4.5 million people were blind due to POAG and 3.9 million due to PACG, the risk of blindness being greater for PACG than for POAG.37

It is estimated that in the UK about 2% of people over 40 years of age have POAG and this number rises to approximately 10% in people over 75 years of age. There are approximately 480,000 people affected by POAG in England and around 10% of total UK blind registrations are due to glaucomatous optic neuropathy. The number of people affected by glaucoma is expected to rise with changes in UK population demographics.42



1.1.4 The Structure – Function Relationship in Glaucoma

It seems reasonable to assume a relationship between the amount of RGC loss and the degree of visual dysfunction. The classic teaching is that, when assessing a glaucoma patient, a clinician should look for agreement between structural changes at the ONH and functional changes to the VF. When this agreement is identified, glaucoma can be confidently diagnosed; if there is a mismatch, other diagnoses should be considered. Although the site of primary damage is still under debate, loss of a group of nerve fibres and death of the corresponding RGCs will typically produce a defined scotoma that matches the topography of the dysfunctional retinal nerve fibres. However, clinical experience shows that early cases do not always behave this way, and the match between structural appearance and functional loss is not always as good as might be expected.43

Early research reported that structural loss occurs before functional changes in vision can be detected.44-46 However, the frequently quoted notion that at least 25% of RGCs are lost before any functional loss is evident has been challenged by many later reports. Studies by Harwerth et al. in primates with experimentally induced glaucoma demonstrated a linear relationship between structure and function when both are plotted on a log scale47, a relationship that strengthened significantly when retinal eccentricity was taken into account.48 However, these studies used primates in which glaucomatous damage ranged from mild to very severe. A study of longitudinal VF change in glaucoma found a poor relationship between perimetry and optic disc change and concluded that function and structure provide largely independent measures of progression.49 Nonetheless, while there may be a significant association between the two parameters over a wide range of field loss, in patients with early damage the relationship between the two measures is clearly less obvious.50 The ideal diagnostic test would show a significant relationship between psychophysical threshold and RGC density in early glaucoma or, more impressively, in normal subjects.51

The different test strategies employed in perimetry and imaging pose an inherent problem when combining and comparing structure and function. As VFs are normally reported on a logarithmic scale and the nerve fibre layer on a linear scale, the relationship between the two is unlikely to be linear: a small change in sensitivity threshold represents a much greater change than the associated nerve fibre layer change measured in microns. Typically, in the early stages of glaucoma structural loss appears greater than functional loss, while in advanced cases functional loss seems to progress when further structural loss is no longer apparent. However, more sensitive techniques for the assessment of early functional loss and better measurement of individual RGCs might produce different results, with the two running in parallel. What is more, RGC dysfunction prior to actual death of the cell may play an important role in cases where functional loss appears to occur first.43

The sensitivity of diagnostic tests to early glaucomatous damage depends on the relative variation of results in healthy controls and on the initial structural and functional status. A study by Gonzalez-Hernandez et al. (2009) examined the structure-function relationship in 228 controls and 1007 glaucoma suspects and glaucoma patients of differing severity.52 They observed that when the analysis is performed independently for the initial and advanced stages of glaucoma, no curvilinear relationship is demonstrated. Furthermore, scatter plots of mean RNFL thickness against mean VF sensitivity showed considerable inter-individual morphological variability in the early stages of the disease, reducing the strength of the association between structural and functional loss (figures 1.3 and 1.4). They concluded that the degree of functional damage cannot be determined from structural data alone; patients with very mild or no functional damage demonstrate morphological values close to normal. It is therefore better to detect glaucoma by looking for change over time, assessing both the structure and function of a glaucomatous patient or glaucoma suspect.


1.1.5 Diagnosis and Monitoring of Glaucoma

The diagnosis of glaucoma is based upon the identification of typical structural changes at the ONH with corresponding functional evidence of damage to the VF. The assessment of more than one parameter is essential for the early diagnosis of the disease. Those with isolated early structural changes or early VF loss are classified as glaucoma suspects, and they are followed up at specific time intervals to monitor their status before being discharged (i.e. disease-free) or diagnosed (i.e. disease onset). The National Institute for Health and Clinical Excellence (NICE) in the UK has published guidelines for the screening and monitoring of glaucoma. In these guidelines, NICE recommends the assessment of the ON structure, the VF function and the IOP, along with central corneal thickness measurement and the appearance of the anterior chamber angle, for the correct diagnosis of glaucoma and the recognition of eyes at risk of developing the disease.42

Direct ophthalmoscopy offers a magnified view of the optic disc, but the view is not stereoscopic, limiting the ability to judge depth changes in the tissues at the ONH. The NICE recommendation for the assessment of the ONH is to use stereoscopic slit lamp biomicroscopy. The examination should include dilation of the patient's pupil to improve the accuracy of the assessment, as ocular co-pathology may otherwise be missed. NICE accepts that stereophotography accompanied by slit lamp biomicroscopic examination is not always practical. However, it recommends obtaining an optic disc image at diagnosis for baseline reference. The variability in inter-observer agreement of optic disc assessment has driven research and clinical practice towards more objective assessment techniques, such as confocal scanning laser ophthalmoscopy (Heidelberg Retina Tomograph; HRT) and Ocular Coherence Tomography (OCT).

The use of RNFL measurements for the diagnosis of glaucoma has increased considerably since the development of OCT imaging techniques. Originally called optical coherence interferometry, OCT was first introduced in 1991.53 A large number of studies reporting the diagnostic accuracy of TD-OCT have shown higher specificities, approximately 90%, than sensitivities, which typically range from 70% to 90%.54-58 A few further studies comparing time- and spectral-domain OCT (TD- and SD-OCT, respectively) have reported similar or slightly better diagnostic accuracy with the latter.59-63
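To make the sensitivity and specificity figures quoted above concrete, both quantities can be computed from a diagnostic test's confusion matrix. The sketch below is purely illustrative: the function names and patient counts are invented, not data from the cited studies.

```python
def sensitivity(true_pos: int, false_neg: int) -> float:
    """Proportion of diseased eyes that the test correctly flags."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg: int, false_pos: int) -> float:
    """Proportion of healthy eyes that the test correctly passes."""
    return true_neg / (true_neg + false_pos)

# Invented example: 100 glaucomatous and 100 healthy eyes.
print(sensitivity(true_pos=80, false_neg=20))  # 0.8 -> 80% sensitivity
print(specificity(true_neg=90, false_pos=10))  # 0.9 -> 90% specificity
```

A test can therefore trade one quantity against the other: lowering the criterion for calling an eye glaucomatous raises sensitivity at the cost of specificity.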

Confocal scanning laser ophthalmoscopy, developed by Heidelberg Engineering (Heidelberg, Germany), uses a diode laser beam that scans the ONH and provides measurements of ONH topography. It then generates a number of stereometric parameters, such as rim area, cup area and cup-to-disc ratio. The device has good glaucoma discriminatory ability, comparable to optic disc assessment by glaucoma experts.64 65 The latest version of this technology, the HRT III, offers a large normative database and advanced analytical tools, such as the Moorfields Regression Analysis66 and the Glaucoma Probability Score,67 which improve the diagnostic accuracy of the instrument.68 Nonetheless, the severity of VF loss has a significant influence on the diagnostic performance of all imaging instruments (both HRT III and OCT), with more severe stages being associated with higher sensitivity.69

The evaluation of the functional status of a suspected eye is essential for the diagnosis of glaucoma. The clinical method for assessing a patient's VF is called perimetry. NICE recommends the most widely used technique for VF testing, Standard Automated Perimetry (SAP), with a central thresholding test. Perimetry is invaluable to glaucoma management as it is the only method that reflects functional changes. Agreement between functional and structural changes gives more confidence to a glaucoma diagnosis, whereas a mismatch might indicate other ocular disorders. As functional changes are at the epicentre of this study, more details on the evaluation of VFs are given in section 1.1.7.

Goldmann applanation tonometry (slit lamp mounted) is considered the reference standard for IOP measurement. As mentioned previously, high IOP has been identified as an important risk factor for developing glaucoma, but it cannot be used to accurately discriminate between normal subjects and patients with glaucoma, nor to quantify disease severity. The normal upper limit of IOP is taken to be 21 mmHg.70 However, numerous studies have reported a positive relationship between age and IOP, as well as a higher prevalence of increased IOP in black populations than in white populations. There are also diurnal changes in IOP, which normally peaks early in the morning with a trough in the afternoon. These changes have been reported to be more evident in open-angle glaucoma (OAG) patients than in normal-tension glaucoma patients and normal subjects.71 Another shortcoming of IOP measurement alone for glaucoma detection is the influence of central corneal thickness and anterior chamber configuration on IOP values. Therefore, NICE recommends supplementary tests to measure the central corneal thickness and assess the configuration and depth of the peripheral anterior chamber.

Precise knowledge of the state of the anterior chamber angle is essential for the diagnosis of angle-closure glaucoma. Gonioscopy involves the use of a goniolens (or gonioscope) in conjunction with a slit lamp to gain a view of the anatomical angle formed between the cornea and the iris. This iridocorneal angle defines the type of the disease (open- or closed-angle glaucoma) and its management. Recent developments in OCT have also allowed this technique to be used for assessment of the anterior chamber angle. Central corneal thickness has been identified as a risk factor for conversion from ocular hypertension (OHT) to OAG.5 The measurement of corneal thickness, also called corneal pachymetry, has proven to be an indicator of glaucoma development when combined with standard measurements of IOP. Corneal pachymetry encompasses ultrasonic and optical methods, with contact and non-contact techniques. NICE guidelines recommend the measurement of central corneal thickness by both contact and non-contact methods, although they recognise that contact measurement techniques may be associated with potential corneal injury or transmission of infection.42

Glaucoma is a lifelong condition with variable clinical features. Thus, follow-up is required to evaluate rates of progression, long-term risk of impairment and the suitability of current management. The maintenance and availability of reliable records is necessary for coherent continuity of health care. NICE recommends the assessment of four parameters in a single visit: 1) the IOP level, 2) the structural appearance of the ONH, 3) the visual function and 4) the configuration of the anterior chamber depth.42 The process of examination is the same as that for diagnosis, except for the iridocorneal angle: given that accurate gonioscopy results were recorded at diagnosis, Van Herick's test is preferred for follow-up assessment because it is quicker to perform. If a change in the ONH status is observed, a new image should be obtained for the patient's records for future assessment and comparison. Central corneal thickness measurement is repeated only in cases where a change is suspected, e.g. following laser refractive surgery or at the onset or progression of corneal pathology.42

1.1.6 Management and Treatment in Glaucoma

Treatment for glaucoma seeks to control the disease so that there is either no evidence of progression, or progression at a rate slow enough to preserve adequate visual function for the rest of the patient's life. It is focused on the only factor that can currently be modified, the IOP. In some cases no treatment may be needed because the disease is static, while in others a more aggressive approach is required to confront a rapidly progressive condition. The main aim when treating glaucomatous patients is to lower the IOP to a clinically pre-determined ‘target pressure’. This target IOP is established on the basis of the current IOP level, the severity of disease at diagnosis and the rate of disease progression, and is subject to modification during follow-up. Other factors, such as age and life expectancy, are also taken into account.72

IOP can be lowered either by medication or by surgery. Medication is the first-line treatment in most cases. There are five main classes of drug available: prostaglandin analogues, beta-blockers, sympathomimetics, miotics and carbonic anhydrase inhibitors. They lower the IOP of the affected eye either by reducing the production of intraocular fluid (aqueous humour) or by increasing the rate of outflow. The positive effect of different IOP-lowering medications was reviewed in a meta-analysis of trials by Vass et al. (2007).73 Numerous studies and clinical trials have provided evidence of the benefits of decreased IOP on rates of progression and on delaying conversion from OHT to POAG.74-77 However, a significant proportion of cases continue to progress despite achieving target IOP levels. Conversely, there are patient subgroups that show no progression without any treatment. These findings indicate the presence of other factors that may contribute to the progression of the disease; the theories behind the potential mechanisms were discussed earlier in section 1.1.1. An on-going placebo-controlled randomised clinical trial being undertaken in the UK is further investigating the effect of medical treatment on glaucoma.78

When medical treatment is insufficient and the target IOP has not been achieved, surgery can be offered. Surgical treatments, all of which aim to lower IOP, can be classified as penetrating or non-penetrating. NICE recommends trabeculectomy as a penetrating surgical procedure, and deep sclerectomy and viscocanalostomy as non-penetrating procedures. Laser techniques are also available for treating glaucomatous eyes, such as argon and selective laser trabeculoplasty (ALT; SLT). These two techniques are quite similar and both target the trabecular meshwork: ALT and SLT are thought to activate trabecular cells, thus improving outflow through the trabecular meshwork.



1.1.7 Evaluation of Function

The assessment of the functional status of a glaucomatous eye is essential for the diagnosis and management of the disease. This section will provide an in-depth view on the clinical methods used to assess visual function.

The VF is defined as a three-dimensional space from which light can enter the eye and stimulate a visual response. The normal eye's VF extends approximately 60° nasally, 100° temporally, 60° superiorly and 70° inferiorly.79 According to the Imaging and Perimetry Society (IPS), “the measurement of visual functions of the eye at [various] locations in the VF area” is called perimetry. A perimeter is an instrument designed to measure the VF by examining the differential light sensitivity.80 The differential light sensitivity varies across the VF, with the peak sensitivity occurring at the fixation point in photopic conditions, decreasing rapidly within the central 10° around fixation and then more gradually towards the periphery.81 RGC fibres transmit the visual signal through the sclera at the ONH, typically 10-15° nasal to fixation. At this location there are no photoreceptors, creating a normal absolute scotoma, the “blind spot”. Any damage to the visual pathway, such as glaucoma or optic neuritis, will affect the VF. Currently, the standard method of VF evaluation is SAP, which can be undertaken with a range of perimeters including the Humphrey Field Analyzer (HFA), Octopus and Henson perimeters.

1.1.8 History of Perimetry

Albrecht von Graefe, in 1856, was probably the first person to report a quantitative VF analysis, examining his patients' VFs with the movement of a small stimulus along a flat surface; this examination procedure is termed campimetry.82 The first perimeter with a complete bowl and control of the background luminance was described in 1872.83 The first multiple-stimulus technique for VF examination was introduced by Harrington and Flocks.84 They designed an automated tangent screen on which several supra-threshold stimuli could be presented at different locations of the field of vision, while the patient reported how many stimuli had been detected. Landmarks in the development of static supra-threshold perimetry were the Friedmann Visual Field Analyzer and the computer-driven Henson Central Field Analyzer 2000.

The Swiss ophthalmologist Hans Goldmann introduced his bowl perimeter in 1945; this instrument set a new standard for perimetry by controlling many of the parameters known to affect the VF (figure 1.5). A decade later, Armaly and Drance created a form of quantitative static perimetry on the Goldmann perimeter. A below-threshold stimulus was presented in the VF and its intensity was increased in constant steps until it was reported as seen by the patient. It soon became obvious that this technique had advantages over kinetic examinations, although manual static perimetry was a demanding task for both examiner and patient. In the 1970s, Heijl and Krakau in Sweden, as well as Spahr and Fankhauser in Switzerland, contributed to the development of improved instrumentation.85-87 Spahr et al. introduced the Octopus Perimeter (Interzeag, Switzerland), the first computerised static perimeter, which became commercially available in 1978. Two years later, Humphrey Systems presented the HFA (Carl Zeiss Meditec Inc., Dublin, CA), which has, through its popularity, set a standard for SAP.

During the last three decades, SAP has gradually replaced kinetic techniques for the investigation of the field of vision. Research has concentrated on the development of threshold algorithms that produce reliable estimates of sensitivity while keeping the examination time as short as possible. The various algorithms/strategies currently available are explained in detail in section 1.1.12.

1.1.9 Classification of Perimetry

Perimetry can be broadly classified into two types: Kinetic and Static. In kinetic perimetry the intensity of the stimulus is kept constant while it is moved, usually from a non-seeing area to a seeing area, across the VF. The patient is expected to respond and report when the stimulus is first noticed; this location is then recorded on a VF chart. By moving the stimulus across the VF, areas of VF defect will be detected when the stimulus appears to vanish. The speed of the stimulus should be standardised, typically 2-4 degrees per second.

In static perimetry the stimuli are fixed at predetermined locations but their intensity is varied to give measures of sensitivity. Most modern perimeters incorporate a series of static tests that examine different regions of the VF. Static perimetry can be sub-classified into two techniques of investigation: threshold and supra-threshold. In threshold perimetry, an estimate of the patient's differential light sensitivity is obtained at each test location and compared with values from an age-matched normal population (age-corrected). The stimulus intensities, locations and timing of presentations are controlled by a computer according to a threshold algorithm. In supra-threshold perimetry the stimuli are presented at an intensity calculated to be above the threshold of a normal observer. Further details are given later in this section, the supra-threshold approach being a crucial part of the reported PhD project.
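The contrast between the two static approaches can be sketched in Python. The snippet below is purely illustrative: the 4-2 dB staircase is a crude stand-in for real threshold algorithms (commercial strategies such as Full Threshold or SITA are far more sophisticated), and all names, step sizes and values are hypothetical.

```python
# Illustrative contrast between supra-threshold and threshold static
# perimetry at a single test location (all names and values hypothetical).

def suprathreshold_result(responded: bool) -> str:
    """Supra-threshold screening: a single stimulus brighter than the
    age-corrected normal threshold; the location either passes or is
    flagged for further investigation."""
    return "seen" if responded else "defect suspected"

def simple_staircase(true_threshold_db: float, start_db: float = 25.0) -> float:
    """Threshold estimation with a crude 4-2 dB staircase. A stimulus is
    'seen' when its attenuation is at or below the simulated true
    threshold; the estimate is the level at the second response reversal."""
    level, step = start_db, 4.0
    seen_before = level <= true_threshold_db
    while True:
        # Dimmer (more attenuation) after a 'seen', brighter after a miss.
        level += step if seen_before else -step
        seen_now = level <= true_threshold_db
        if seen_now != seen_before:   # response reversal
            if step == 2.0:           # second reversal ends the staircase
                return level
            step = 2.0                # reduce the step after the first reversal
        seen_before = seen_now

print(suprathreshold_result(False))  # defect suspected
print(simple_staircase(30.0))        # 29.0 (within 2 dB of the true value)
```

The trade-off the section describes falls out directly: the screening function needs one presentation per location but yields only a pass/fail, while the staircase needs several presentations to return a sensitivity estimate in dB.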

1.1.10 Psychophysics of Perimetry

In order to understand the term threshold, we need to look further into the psychophysical background of perimetry. The idea of a “threshold” originated in the late 19th century, when Fechner worked on the relationship between stimulus intensity and the likelihood of perception.88 According to his “high-threshold” theory, there is a threshold below which a stimulus is not perceived and above which it is perceived. Psychophysical data, however, demonstrate a continuous zone of stimulus intensities between “definite perception” and “definite non-perception”. The high-threshold theory explained this transition as an outcome of random fluctuations of the “true” threshold.

In static perimetry, sensitivity is expressed on a logarithmic, instrument‑specific ratio scale of “stimulus attenuation”, in decibel (dB) units (equation 1).

The scale expresses the ratio of the luminance increment of the stimulus (∆LStim) to the maximum luminance increment (∆LMax) that the instrument is capable of producing:

Sensitivity (dB) = 10 × log10 (∆LMax / ∆LStim) (equation 1)

Thus, the most powerful stimulus is referred to as 0 dB, while a stimulus of 40 dB has been attenuated by 4 log units, which corresponds to 1/10000 of the maximum luminance increment. Because the scale depends on the maximum luminance increment of the instrument, sensitivity estimates from different instruments cannot readily be compared.
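The attenuation scale of equation 1 can be computed directly; a minimal Python sketch (the function name and luminance values are illustrative, not instrument specifications):

```python
import math

def attenuation_db(delta_l_stim: float, delta_l_max: float) -> float:
    """Stimulus attenuation in decibels, relative to the instrument's
    maximum luminance increment (an instrument-specific scale)."""
    return 10.0 * math.log10(delta_l_max / delta_l_stim)

# The most powerful stimulus the instrument can produce is, by definition, 0 dB:
print(attenuation_db(10000.0, 10000.0))  # 0.0
# A 40 dB stimulus has been attenuated by 4 log units,
# i.e. to 1/10000 of the maximum luminance increment:
print(attenuation_db(1.0, 10000.0))      # 40.0
```

Because the denominator is instrument-specific, the same physical stimulus luminance maps to different dB values on different perimeters, which is why such estimates cannot readily be compared across instruments.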

Sensitivity and response variability at a specific retinal location are well described by the frequency-of-seeing (FOS) curve. A FOS curve shows the probability of a positive response for a number of different stimulus intensities. These curves are S-shaped and their typical form is given by equation 2.

For yes-no tasks, such as static perimetry, the threshold is usually defined as the stimulus intensity at the 50th percentile of the FOS curve (figure 1.6). The slope of the curve defines the physiological response variability or, in other words, the width of the transition zone between “always perceived” and “never perceived”. False-positive and false-negative rates indicate how likely the observer is to respond when no stimulus was shown (false positive) and to fail to respond to intense supra-threshold stimuli (false negative).89
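The shape just described can be sketched as a cumulative-Gaussian FOS model whose floor and ceiling are set by the false-positive and false-negative rates. The slope and error-rate values below are illustrative assumptions, and intensity is expressed here so that larger values are more visible (on the dB attenuation scale the curve would run in the opposite direction).

```python
import math

def norm_cdf(z: float) -> float:
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def fos(intensity: float, threshold: float, slope_sd: float = 2.0,
        fp: float = 0.03, fn: float = 0.02) -> float:
    """Probability of a 'seen' response at a given stimulus intensity.

    fp lifts the lower asymptote (responses when no stimulus was shown);
    fn lowers the upper asymptote (missed intense supra-threshold stimuli).
    """
    return fp + (1.0 - fp - fn) * norm_cdf((intensity - threshold) / slope_sd)

# Near the threshold the curve passes close to the 50% point, and very
# intense stimuli approach (1 - fn) rather than 1.0.
```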

The influence of response behaviour on FOS curves is described by signal detection theory (SDT), formalised by Green and Swets in 1966.90 SDT proposes that the observer’s detection system is noisy; there is baseline neural activity even in the absence of a visual stimulus. The presence of a stimulus raises this level of activity.

FOS data have been generated from normal, suspect and OAG eyes by several research groups.91-93 They have established that variability increases as sensitivity decreases.94 In 2000, Henson et al. compared the relationship between sensitivity and response variability in the VFs of normal eyes and those with optic neuritis, glaucoma and OHT.95 FOS data showed that the relationship between these two parameters was similar across the four groups, with the authors concluding that the results provided further evidence to support the hypothesis that response variability is dependent on functional RGC density. According to this hypothesis, the relationship would be similar in glaucoma and optic neuritis despite the different patterns of VF defects and the different mechanisms of nerve fibre damage.96

1.1.11 Current Perimetric Specifications

Stimulus and Background

During VF examination individuals are asked to respond to a series of stimuli presented at different locations within their VF. The most commonly used form of VF testing in glaucoma is white‑on‑white perimetry, where achromatic light spots are displayed on a white background. The HFA perimeter uses a background level of 31.5 apostilbs [asb; equal to ~10 candelas per square metre (cd/m2)], thereby producing photopic conditions in which cones are primarily tested. Stimulus intensity in the HFA varies from 10,000 asb to 0.1 asb, allowing the machine to measure thresholds over a 50 dB range.
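These figures can be checked with a quick conversion; the apostilb-to-cd/m² factor (1 asb = 1/π cd/m²) is standard photometry.

```python
import math

def asb_to_cd_m2(asb: float) -> float:
    """Convert apostilbs to candelas per square metre (1 asb = 1/pi cd/m2)."""
    return asb / math.pi

background = asb_to_cd_m2(31.5)                    # ~10 cd/m2 -> photopic
dynamic_range_db = 10 * math.log10(10_000 / 0.1)   # 5 log units = 50 dB
```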

Other alternative stimuli/backgrounds have also been developed. Blue‑on‑yellow perimetry (also called short-wavelength automated perimetry; SWAP) targets specific visual pathways that are thought to be selectively damaged in early glaucoma. Studies have shown that the use of blue stimuli on a yellow background is superior to white-on-white perimetry for assessing functional damage in early glaucoma. However, SWAP has some limitations that prevent wider adoption of the technique. Media opacities are thought to influence threshold estimates, and SWAP generally demonstrates higher test-retest variability.

Frequency-doubling technology (FDT) is another variation of perimetry. FDT is based on the frequency-doubling effect, which occurs when a low-spatial-frequency grating is flickered at a high temporal rate, resulting in the grating appearing to have twice its original spatial frequency. The frequency-doubling stimulus is believed to target a small subset of RGCs (approximately 2% of the total population) that are, again, thought to be selectively damaged in early glaucoma. FDT perimetry uses frequency-doubling stimuli, and contrast thresholds are measured for detection of the FDT stimulus.97

Alternative forms and developments of modern perimetry, such as the ones mentioned above, aim to improve detection of glaucoma by selectively testing specific RGCs. For example, high-pass resolution perimetry (or ring perimetry) is presumed to selectively test the parvocellular system.98 The stimuli used in this variation of modern perimetry are rings of variable size with dark borders and bright centres. These rings are projected at different locations on the screen and create an average stimulus luminance equal to the background luminance. The results of this test are believed to correspond to the density of RGCs and, concerning glaucoma, ring perimetry is comparable to standard perimetry in terms of diagnostic performance.99

In the early 1980s, Prof Fitzke investigated motion displacement thresholds in glaucomatous and normal populations and developed the first Motion Displacement Test (MDT) at the Institute of Ophthalmology, London. He found evidence of elevated motion displacement thresholds in defective areas of the VF. The MDT’s most recent development, the Moorfields MDT (a research product of collaboration between Moorfields’ Glaucoma Research Unit and the Institute of Ophthalmology, UCL), incorporates 31 line stimuli whose sizes are scaled by an estimate of RGC density. The Moorfields MDT is Windows‑based software that runs on a 15-inch laptop screen at a test distance of 30 cm. The task is to fixate a central spot and press the computer mouse each time a line on the screen is seen to move. The aim of the Moorfields MDT development is to offer an affordable, portable and sensitive method of case-finding in the community.

Stimuli Distributions

There are numerous stimulus distribution patterns, each of which is selected according to the needs of the VF examination; for example, whether emphasis should be given to the inferior or superior field, or whether the test is for screening or monitoring purposes. The most common stimulus distributions are the central 30-2, 24-2 and 10-2 patterns (figures 1.7 and 1.8). The 30-2 pattern examines the central 30 degrees of the VF. It includes 76 stimulus locations arranged on a 6-degree square grid, displaced from the horizontal and vertical midlines by 3 degrees. The 24-2 distribution is a subset of the 30-2 pattern, with 54 locations falling within the central 24 degrees along with two points at 27 degrees in the nasal field. While the 30-2 pattern provides the most information about the central VF, the 24-2 test has a shorter test time and smaller variability. The appearance of lens-rim artefacts at the peripheral points of the 30-2 test, and hence their low discriminatory power, is another reason the 24-2 programme is routinely used in most ophthalmic clinics.100
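The 30-2 and 24-2 layouts can be reproduced with a short sketch. The HFA’s exact peripheral cut-offs are not given in this text, so the radii below are assumptions chosen to reproduce the 76- and 54-point counts quoted above, and the two extra nasal points are simply placed at (27, ±3) without handling eye laterality.

```python
import math

STEP, OFFSET = 6, 3  # 6-degree square grid, offset 3 degrees from the midlines

def grid_30_2(cutoff: float = 29.5):
    """30-2 pattern: 76 locations (cut-off radius assumed, not published here)."""
    coords = [s * (OFFSET + STEP * k) for s in (-1, 1) for k in range(5)]
    return [(x, y) for x in coords for y in coords if math.hypot(x, y) < cutoff]

def grid_24_2(cutoff: float = 24.0):
    """24-2 pattern: the 30-2 subset within ~24 degrees plus two nasal points."""
    central = [(x, y) for (x, y) in grid_30_2() if math.hypot(x, y) < cutoff]
    return central + [(27, -3), (27, 3)]  # nasal points; laterality ignored

assert len(grid_30_2()) == 76
assert len(grid_24_2()) == 54
```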


When higher spatial resolution is needed (e.g. in patients with small central fields), the 10-2 programme can be used to assess the residual visual function. The 10-2 pattern examines 68 locations within the central 10 degrees on a 2-degree square grid. Although the 10-2 test is routinely used for patients with advanced glaucoma, it is known that the macula (i.e. the central field) is affected even in early glaucoma. Based on this evidence, Traynis et al. hypothesised that some patients might fail a 10-2 test while presenting normal 24-2 results.101 They found that this was the case for 16% of the glaucomatous eyes they tested, emphasising the poor detection of central loss with the 6-degree grid and suggesting that the 24-2 test is not optimal for detecting early damage of the macula.

Early work has shown that the informational value of each test location in the VF is likely to vary. Henson et al. analysed data obtained with a Friedman Visual Field Analyzer and showed that stimuli at locations greater than 20 degrees of eccentricity along with those around the blind spot give the least information.102 103 This evidence stimulated further work from Wang and Henson to evaluate the diagnostic performance of VF testing for early glaucomatous loss with subsets of the 24-2 test pattern (figure 1.9).104 They found that 11 locations (including 2 in the blind spot) did not contribute anything to the performance of the 24-2 test. They also presented optimized distributions with 10, 20 and 30 locations that retained good diagnostic performance.

Stimulus Size and Duration

A human’s ability to visually detect targets on a uniform background has been described by several laws in the past. One such law is Ricco’s law, which describes the relationship between target area and the target contrast required for detection (equation 3): within the area of complete summation, ∆L × A is constant at threshold. Ricco’s law reflects the fact that the light energy required for the target to be detected is summed over its area and is thus proportional to that area. Ricco’s area is the area of complete summation; in other words, the largest stimulus size for which the product of area and intensity at threshold remains constant. This area varies with background luminance105 and retinal eccentricity106.
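A minimal numerical sketch of Ricco’s law follows, with an arbitrary summation constant: within Ricco’s area, halving the stimulus area doubles the contrast required at threshold.

```python
RICCO_CONSTANT = 100.0  # area x contrast at threshold; arbitrary illustrative units

def threshold_contrast(area: float) -> float:
    """Contrast required at threshold for a stimulus within Ricco's area."""
    return RICCO_CONSTANT / area

# Complete spatial summation: the area-contrast product is constant.
assert threshold_contrast(2.0) == 2 * threshold_contrast(4.0)
assert threshold_contrast(2.0) * 2.0 == threshold_contrast(8.0) * 8.0
```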

As a result, stimulus size has a significant role in perimetry. Stimulus sizes were standardised by Goldmann in 1945 (Table 1.2), who based them on an estimated relationship between size and intensity, so that each step gives an approximately 5 dB intensity change. In the HFA the standard stimulus size is Goldmann III, approximately 0.5 degrees. Taking Ricco’s law into account, however, it is evident that this conventional stimulus size is smaller than Ricco’s area for retinal eccentricities over approximately 15°. Thus, thresholds for Goldmann III stimuli in SAP are determined by complete spatial summation only in those retinal regions. Within 15° of the fovea, thresholds for SAP are determined by probability summation, as stimuli are larger than Ricco’s area. Previous research has shown no observable change in Ricco’s area as a function of age.107 However, there is a significant enlargement of the area in early glaucoma, suggesting that perimetric stimuli should be capable of adjusting their size as well as their contrast, thereby boosting the “glaucoma signal” relative to measurement noise.108

The effect of stimulus size in perimetry has been investigated in various studies. Wall et al., in 1997, studied the influence of stimulus size on the slope of the psychometric function in normal and glaucomatous eyes.109 They concluded that the larger Goldmann V stimulus produced significantly steeper FOS curves than sizes III and I. A more recent study, which tested a large number of patients with sizes III and V and with a method that varies stimulus size at a fixed contrast (size threshold perimetry), reported that the number of abnormal locations was similar across the different stimulus parameters.110 The study also highlighted the increased variability for size III and concluded that the adoption of a single stimulus size is not of great importance; new developments in perimetric stimuli should instead focus on other properties, such as lower variability and reduced illumination requirements.

The human visual system responds through the absorption of light photons over both space and time. In the temporal domain, summation relates the duration of a stimulus to the threshold contrast achieved (i.e. Bloch’s law). When summation is complete, stimulus duration and contrast are inversely related at threshold (equation 4): ∆L × t is constant.

Taking Bloch’s law into account, we can expect stimuli presented for longer durations to be more likely to be seen, as a result of temporal summation. However, Pennebaker et al. studied the effect of various stimulus durations and concluded that, within the range 0.1–0.5 seconds, presentation time had little effect on threshold fluctuation in healthy individuals.111 Most static perimeters take Bloch’s law into account: up to a certain presentation time, the detectability of a stimulus increases with increasing presentation time. More specifically, for photopic conditions the critical duration is below 100 ms. In an attempt to provide a common framework for VF measurement, the IPS standardised most of the perimetric parameters, including stimulus duration.80 The typical presentation time is 200 milliseconds; an interval longer than the critical duration of Bloch’s law, but shorter than the latency of a refixation saccade (~250 milliseconds), which would displace the retinal stimulus. Recent PhD work by Padraig Mulholland, investigating temporal summation, reported a significantly lower critical duration (~30 milliseconds) than previous studies.112 This difference lies in the use of different analysis techniques and the assumptions they make concerning the degree of partial summation exhibited. Mulholland also reported variations in summation in both the spatial and temporal domains in glaucoma, suggesting that stimuli modulating in area, duration and luminance may improve the sensitivity of SAP.
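Bloch’s law, with the ~100 ms photopic critical duration quoted above, can be sketched as a piecewise threshold-versus-duration function; the summation constant is illustrative.

```python
CRITICAL_DURATION_MS = 100.0  # approximate photopic critical duration
BLOCH_CONSTANT = 500.0        # contrast x duration at threshold; arbitrary units

def temporal_threshold(duration_ms: float) -> float:
    """Threshold contrast as a function of stimulus duration (Bloch sketch)."""
    if duration_ms <= CRITICAL_DURATION_MS:
        return BLOCH_CONSTANT / duration_ms        # complete temporal summation
    return BLOCH_CONSTANT / CRITICAL_DURATION_MS   # no further summation benefit

# Doubling duration below the critical duration halves the threshold;
# beyond it, longer presentations give no further sensitivity gain.
assert temporal_threshold(50) == 2 * temporal_threshold(100)
assert temporal_threshold(200) == temporal_threshold(400)
```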

1.1.12 Visual Field Testing Algorithms and Strategies

Threshold methods

There are numerous algorithms for deriving threshold estimates. Threshold algorithms may be adaptive or non-adaptive. In adaptive threshold algorithms, the stimulus intensity used on any trial depends on the observer’s responses to previous presentations, whereas in non-adaptive methods the intensities are pre-determined and independent of the subject’s responses; the most common example is the “method of constant stimuli”.113 Early in perimetry’s history, adaptive methods were considered more effective than non-adaptive ones, despite the uncertainty about the initial threshold value.114-116 The reason for this preference was the recognition that stimulus intensities close to the true threshold are generally more informative than those far from it, and that test times would be shorter.

One of the first adaptive threshold algorithms to be used in perimetry is known as the Full Threshold (FT) algorithm. In the FT algorithm the stimulus is first presented either at a value derived from neighbouring threshold estimates or, when these are not available, on the basis of normal values. The stimulus intensity is then decreased (or increased) according to the patient’s response (or non-response) in fixed increments (i.e. 4 dB) until the stimulus is not seen (or seen). The step size is then reduced from 4 to 2 dB and the direction reversed until the stimulus is seen (or not seen). The threshold estimate is taken as the attenuation (dB) of the last stimulus seen at that location. This process is called a “staircase algorithm”, after the step-like pattern of stimulus intensities. The FT algorithm was developed by Bebie et al. in 1976.117 Their algorithm used the mean sensitivity of the observer’s age group as the starting stimulus intensity and terminated once a response reversal occurred at a step size of 2 dB. The threshold value is taken as lying between the presentations that mark the second reversal; see the example in figure 1.10. The “4…2…1” method, as they called it, was derived from simulations that took into account the response variability of normal eyes and was quickly accepted as one of the optimal strategies. However, the FT algorithm requires approximately 5 presentations per test location and is hence exhausting for most patients when combined with the 24-2 test pattern (test times often exceeding 10 minutes per eye).
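The 4-2 staircase described above can be sketched as follows. The observer model used in the example is a deterministic simplification (a stimulus is always seen when its attenuation is at or below the true threshold); real responses follow the noisy FOS curve discussed in section 1.1.10.

```python
def staircase_4_2(respond, start_db: float, floor: float = 0, ceiling: float = 40):
    """Sketch of the 'Full Threshold' 4-2 staircase.

    `respond(level)` returns True if a stimulus of `level` dB attenuation is
    seen. Steps of 4 dB are used until the first response reversal, then
    2 dB steps until the second reversal; the estimate is the last-seen level.
    """
    level, step = start_db, 4
    last_response, last_seen, reversals = None, None, 0
    while True:
        seen = respond(level)
        if seen:
            last_seen = level
        if last_response is not None and seen != last_response:
            reversals += 1
            if reversals == 2:
                return last_seen
            step = 2
        last_response = seen
        level += step if seen else -step    # seen -> dimmer, missed -> brighter
        if not floor <= level <= ceiling:   # ran off the instrument's range
            return last_seen

# Deterministic observer with a true threshold of 17 dB, starting at 25 dB:
assert staircase_4_2(lambda db: db <= 17, 25) == 17
```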

The Fastpac algorithm was developed for the HFA to reduce the test times of the FT algorithm. The Fastpac algorithm uses steps of 3 dB with a single reversal. Although it is faster than the FT method in normal eyes and in patients with mild VF loss, it underestimates the severity of VF defects118 and has greater variability. Also, the speed advantage is reduced in cases with advanced field loss.119

In pursuit of shorter test times that would increase the accuracy of VF testing, perimetric experts and researchers explored the benefits of other adaptive procedures, which make use of both prior knowledge about the shape of the psychometric function and the observer’s previous responses to guide further testing. The advantages of such an approach were discussed in the 1960s by numerous authors, but it was Andrew Watson and Dennis Pelli in 1983 who introduced a Bayesian adaptive psychometric method.120 QUEST, as they named it, is an efficient method of measuring threshold based on three steps: 1) specification of prior knowledge of the threshold, in the form of an initial probability density function (pdf; discussed further below); 2) a method for choosing the stimulus intensity on any trial; and 3) a method for choosing the final threshold estimate. Watson and Pelli introduced a Bayesian framework to calculate a current pdf that takes into account prior knowledge of the psychometric function and data from previous presentations. This pdf is then applied in steps 2 and 3.

The determination of the maximum-likelihood threshold for each test location requires an initial pdf, which states, for each possible threshold, the probability that any patient will have that threshold at that location (see Figure 1.11). The first stimulus is then presented according to that initial pdf (e.g. at its median or mean). The observer’s response to that stimulus is used to modify the pdf for the next presentation. This process is repeated until the specified terminating criteria have been met; for example, when the standard deviation of the pdf falls below a fixed value. King-Smith et al. evaluated various modifications to the QUEST threshold method, particularly the technique for choosing the intensity of the next presentation, but also steps 1 and 3.121 They concluded that their Zippy Estimate by Sequential Testing algorithm (ZEST; a QUEST variant), which sets the intensity to the mean of the current pdf, provided greater precision than the original QUEST method or the other simulated variations.
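A minimal ZEST sketch follows the three steps above: a (here uniform) prior pdf over candidate thresholds, stimuli placed at the pdf mean, and a Bayesian update after each response. The FOS likelihood parameters, domain and termination values are illustrative assumptions, not those of any commercial implementation.

```python
import math

def norm_cdf(z: float) -> float:
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def p_seen(stim_db: float, threshold_db: float, slope: float = 2.0,
           fp: float = 0.03, fn: float = 0.02) -> float:
    """Assumed FOS likelihood: dimmer stimuli (higher dB) are seen less often."""
    return fp + (1 - fp - fn) * norm_cdf((threshold_db - stim_db) / slope)

def zest(respond, domain=range(0, 41), sd_stop: float = 1.0, max_trials: int = 30):
    """ZEST sketch: start from a uniform prior, test at the pdf mean, update."""
    pdf = {t: 1.0 / len(domain) for t in domain}
    for _ in range(max_trials):
        mean = sum(t * p for t, p in pdf.items())
        sd = math.sqrt(sum(p * (t - mean) ** 2 for t, p in pdf.items()))
        if sd < sd_stop:                      # terminating criterion
            break
        stim = round(mean)                    # next stimulus at the pdf mean
        seen = respond(stim)
        for t in pdf:                         # Bayes update with the likelihood
            pdf[t] *= p_seen(stim, t) if seen else 1 - p_seen(stim, t)
        total = sum(pdf.values())
        pdf = {t: p / total for t, p in pdf.items()}
    return sum(t * p for t, p in pdf.items())  # final estimate: pdf mean
```

With a deterministic observer whose true threshold is 25 dB (`zest(lambda db: db <= 25)`), the estimate converges to within a couple of dB of the true value.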

Visual Field Testing Strategies

Following on from King-Smith’s work, several perimeters introduced algorithms that adopted some of the maximum-likelihood principles. The Swedish Interactive Thresholding Algorithm (SITA) was the first to adopt the more efficient Bayesian methods to derive threshold estimates of similar precision to the FT and Fastpac algorithms but with substantially fewer presentations.122 The SITA strategy was initially developed for the HFA perimeter and estimates the threshold at each point from the observer’s responses to stimuli at that location, as well as responses at nearby points. An assumption is thus made about the underlying psychometric function, placing the next stimulus intensity as close as possible to the estimated threshold. The FT strategy is still followed for the first 4 points tested, one in each quadrant of the VF. At least one reversal from decreasing to increasing intensity is obtained at each location. Test times in normal eyes are roughly half those of FT tests, with similar or better reproducibility.123 124 SITA includes variable inter-stimulus intervals, new methods for detecting false positives, and the “4-2” staircase for stimulus intensities. There are two versions of the SITA strategy: SITA Standard and SITA Fast; the latter has looser terminating criteria, shorter test times and greater test-retest variability in areas of low sensitivity.

Another widely used strategy is the Zippy Adaptive Threshold Algorithm (ZATA), used in the Henson perimeters. The Henson ZATA test uses a modified ZEST algorithm.125 It differs from SITA in its use of pdfs to decide the test level, its use of prior VF thresholds, when available, to set starting intensities, and its terminating criteria, which change through the test according to the patient’s responses. More recently, the ZEST strategy was also adapted for FDT and implemented in the new Humphrey Matrix perimeter: Turpin et al., in 2002, developed a test procedure for FDT perimetry that adopts ZEST principles for threshold estimation.126 127

Multisampling Supra-Threshold Techniques

Many clinical applications call for quick, simple, yet reliable VF tests that can be performed by patients without the need for training. Conventional supra-threshold tests are easier to perform as they reduce the number of stimulus presentations and therefore the test duration. However, supra-threshold perimetry is thought to be less able to detect mild VF defects than threshold testing. The conventional criterion for classifying a VF location as defective is that the stimulus is missed on both of 2 presentations. This criterion may reduce false-positive errors, but it also reduces the ability of the test to correctly identify those who have the condition of interest (in other words, the sensitivity of the test) by a small, yet significant, amount.128

Paul Artes et al. attempted to tackle the issues of sensitivity and variability in supra-threshold perimetry by developing an optimised multisampling technique.128 The criterion for classifying a location as normal or defective was 3 seen or 3 missed presentations respectively (3/5), meaning that between 3 and 5 stimuli had to be presented at each location. They evaluated the new technique across a range of defects and against the conventional supra-threshold (1/2) and FT strategies. They demonstrated that multisampling could be a powerful alternative, showing sensitivity similar to that of the FT strategy, which is considered the gold standard, without sacrificing specificity (i.e. the ability of the test to correctly identify those without the disease).
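The 3/5 rule can be sketched as a sequential classifier that stops as soon as either three “seen” or three “missed” responses accumulate, so each location needs between 3 and 5 presentations; the function below is an illustrative reading of that criterion, not Artes et al.’s implementation.

```python
def classify_location(responses, needed: int = 3, max_presentations: int = 5):
    """Multisampling supra-threshold classification (3-of-5 sketch).

    `responses` is an iterator of booleans (True = stimulus seen). Returns
    'normal' or 'defective' as soon as `needed` matching responses accumulate.
    """
    seen = missed = 0
    for _ in range(max_presentations):
        if next(responses):
            seen += 1
        else:
            missed += 1
        if seen == needed:
            return "normal"
        if missed == needed:
            return "defective"

assert classify_location(iter([True, True, True])) == "normal"
assert classify_location(iter([True, False, False, True, False])) == "defective"
```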

Multiple-Stimulus Perimetry

Most of the current clinical VF tests use single-stimulus techniques to obtain threshold measurements. These techniques are demanding for patients; it is not unusual for individuals to report difficulties in maintaining their attention during testing. Wall et al. provided evidence that brief lapses of attention may be associated with reduced overall sensitivity and increased response variability.109

In the 1950s, Harrington and Flocks introduced multiple-stimulus perimetry as a screening test.84 During this type of testing, up to 4 stimuli are presented at each exposure. The patient verbally reports the number of stimuli seen, along with their locations if this number is smaller than the number actually presented. Verbal feedback has been shown to help maintain the patient’s attention and reduce variability. Recently, Miranda and Henson measured the perimetric sensitivity and response variability of both single- and multiple-stimulus perimetry in glaucoma and demonstrated that a multiple-stimulus technique could reduce variability by more than 1 dB on average while increasing threshold sensitivity by almost 2 dB.129 Their work showed that changes in both the way stimuli are presented and the way patients respond could improve routine clinical perimetry.

Parameters influencing Perimetry

Variability in VF testing can be subdivided into short- and long-term (intra- and inter-test, respectively). Variability during testing can be represented, as shown earlier, by the slope of the FOS curve and is higher in defective locations, with a standard deviation of approximately 7 dB.130 Inter-test variability has been examined in glaucoma patients tested with both conventional (30-2, HFA) and FDT perimetry.131 Both techniques showed larger re-test variability in areas of reduced sensitivity compared with normal locations.

Two factors that influence perimetric outcomes are the stimulus characteristics and the observer. Stimulus size has been associated with variability, with smaller stimulus sizes (Goldmann I and II) demonstrating larger variability than larger sizes (Goldmann III, IV, V).132 In a study by Henson et al., a statistical analysis of variability against numerous factors showed that significant fluctuations in threshold measurement are related to sensitivity (reduced sensitivity demonstrating larger variability), diagnosis and false-negative rate, whereas no association was established with factors such as age, eccentricity, fixation losses and false-positive rate.95

Observer variables that affect the slope of the FOS curve include perimetric experience (learning effects), fatigue and loss of attention. Patients tend to perform better in follow-up tests as they gain test experience.91 133 The learning effect is usually greatest between the first and second tests; a patient’s first VF result should therefore be interpreted with caution. Fatigue may result in decreased retinal sensitivity134 and has been the limiting factor in attempts to increase the accuracy of testing by extending the examination time. The fatigue effect on VF testing has been confirmed in both normal and glaucoma groups, with the latter demonstrating a larger increase in variability.

Visual Field Loss in Glaucoma

Glaucomatous VF defects can be diffuse, as in cases of cataract or corneal opacification, or localised.135 VF loss associated with glaucoma is also, especially in its early stages, usually asymmetric about the horizontal meridian and typically correlates with the arrangement of the RGC axons within the RNFL.

A typical glaucomatous defect is the nasal step, where an area in the nasal VF has reduced sensitivity on one side of the horizontal meridian and normal sensitivity on the other. Another characteristic feature of glaucomatous functional loss, and a sign of a moderate stage of the disease, is the classic arcuate scotoma; a comma-shaped defect arching over the central VF. Other typical types of VF defects in glaucoma are: a paracentral defect 10°-20° from the blind spot, generalised constriction (tunnel vision) and, of course, complete VF loss at the end stages of the disease (figure 1.12).136

