Since the first anticoagulant-resistant rodents were discovered in Scotland in 1958, researchers have sought reliable ways to distinguish between resistant and susceptible animals. Several different testing methods remain available and widely employed; each has its advantages and drawbacks.
Early anticoagulant resistance testing methods proposed by the World Health Organization relied on laboratory no-choice feeding tests in which bait, containing the normally used concentration of the active ingredient under investigation, was offered to groups of individually caged rodents for different numbers of days. Baseline tests were conducted for each rodent species using susceptible strains. The resulting dose/response lines were subjected to probit analysis to obtain lethal feeding period percentiles, expressed as the number of days of continuous feeding required to kill a given percentage of a susceptible population. Individuals that survived the lethal feeding period required to kill 99% of susceptible animals (i.e. the LFP99) were considered resistant. Although these tests were conducted in the laboratory, they could be readily interpreted in terms of the practical outcome of rodent control treatments, because resistance was defined in terms of the period of feeding, albeit no-choice, on commercially used baits required to kill a high percentage of a rodent population.
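The probit calculation behind the LFP percentiles can be sketched as follows. The feeding durations, mortality proportions and the simple least-squares probit fit are illustrative assumptions for a hypothetical susceptible strain, not values or methods from the WHO protocols:

```python
import numpy as np
from scipy.stats import norm

# Hypothetical no-choice feeding data for a susceptible strain:
# days of continuous feeding vs. proportion of each test group killed.
days = np.array([1, 2, 4, 8, 16])
mortality = np.array([0.05, 0.20, 0.55, 0.85, 0.98])

# Classic probit analysis: regress the probit transform of mortality
# on log10(feeding days), then invert the line for any percentile.
x = np.log10(days)
y = norm.ppf(mortality)                 # probit transform
slope, intercept = np.polyfit(x, y, 1)  # simple least-squares fit

def lfp(percentile):
    """Lethal feeding period (days) for a given mortality percentile."""
    return 10 ** ((norm.ppf(percentile) - intercept) / slope)

print(f"LFP50 = {lfp(0.50):.1f} days, LFP99 = {lfp(0.99):.1f} days")
```

An animal that survives `lfp(0.99)` days of continuous feeding on the bait would, on this definition, be scored as resistant.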
A drawback of lethal feeding period tests is that they are time-consuming to carry out and, because mortality is the required end-point, they are questionable on grounds of humaneness. Consequently, alternative tests based on the blood clotting response (BCR) were developed to overcome these difficulties. In BCR tests, the ability of the blood to clot in the presence of measured doses of an anticoagulant is first determined in susceptible animals. An animal is then said to be resistant when its blood continues to clot after administration of a dose of anticoagulant (the discriminating dose) that would prevent clotting in a given percentile, normally 99%, of susceptible rodents. BCR tests were conducted on Norway rats over a period of 20 years for a number of anticoagulant compounds of both the first and second generations. Using this method, the first routine screening for resistant Norway rats was initiated in the UK, permitting some resistance areas to be delineated. However, these BCR tests were in their turn found to have drawbacks, mostly because the researchers who developed them introduced variability by using different techniques, laboratory reagents and discriminating doses.
To overcome these difficulties, Norway rat and house mouse BCR baseline data have been developed for several first- and all second-generation anticoagulants by the industry's Rodenticide Resistance Action Committee (RRAC), using a novel and consistent BCR test methodology introduced by researchers at the University of Reading, UK. Another major difficulty of the early BCR test methods was that of relating the resistance they detected to practical treatment outcomes. The novel RRAC BCR test methodology overcomes this difficulty by permitting, for the first time, the calculation of resistance ratios from BCR test data.
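How a resistance ratio might be derived from BCR dose-response data can be sketched as follows. The doses, response proportions and the simple probit fit are illustrative assumptions, not the RRAC methodology itself:

```python
import numpy as np
from scipy.stats import norm

def ed50(doses_mg_kg, response):
    """ED50 from a probit fit of BCR response proportions vs. log dose."""
    slope, intercept = np.polyfit(np.log10(doses_mg_kg),
                                  norm.ppf(response), 1)
    return 10 ** (-intercept / slope)

# Illustrative BCR dose-response data (proportion of animals whose
# clotting response exceeds the test threshold at each dose) for a
# susceptible baseline strain and a putatively resistant strain.
susceptible = ed50([0.5, 1, 2, 4],  [0.10, 0.40, 0.75, 0.95])
resistant   = ed50([5, 10, 20, 40], [0.08, 0.35, 0.70, 0.93])

# Resistance ratio: the multiple of the susceptible ED50 needed to
# produce the same blood clotting response in the resistant strain.
print(f"Resistance ratio = {resistant / susceptible:.1f}")
```

A ratio near 1 would indicate normal susceptibility; the larger the ratio, the more the field dose must exceed the susceptible baseline to achieve the same effect.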
These conventional laboratory techniques for testing rodents for resistance were reviewed by the European and Mediterranean Plant Protection Organization, and anticoagulant resistance and resistance testing methods have recently been comprehensively reviewed in other documents (see Further Information, sub-section Further Reading).
New advances in our understanding of the genetics of anticoagulant resistance now offer the promise of cheap and rapid resistance tests that overcome these drawbacks (see Classification and history of rodent control). Work by researchers in Germany has identified mutations in the gene coding for vitamin K epoxide reductase (VKORC1) in both Norway rats and house mice that are responsible for anticoagulant resistance in a number of resistance foci in Europe. Our increasing understanding of these resistance SNPs (single nucleotide polymorphisms) has made it possible to develop molecular-biological techniques for the identification of mutant resistance genes in DNA extracted from small pieces of rodent tissue, and even from faecal pellets. Such quick, cheap and humane tests permit, for the first time, more detailed mapping of resistance foci which, in turn, will assist in the management of anticoagulant-resistant rodent infestations.
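The kind of lookup such a DNA screen enables can be sketched as follows. The Vkorc1 substitutions listed (e.g. Tyr139Cys) are among those reported from European resistance foci, but the helper function, its interface and the example calls are hypothetical illustrations, not a published genotyping protocol:

```python
# Known Vkorc1 amino-acid substitutions associated with anticoagulant
# resistance, keyed by (residue position, observed residue). The set
# shown here is a small illustrative subset, not an exhaustive list.
KNOWN_RESISTANCE_SNPS = {
    (139, "C"): "Tyr139Cys",
    (139, "F"): "Tyr139Phe",
    (120, "Q"): "Leu120Gln",
}

def classify_variant(position, residue, wild_type):
    """Classify a sequenced Vkorc1 residue against prior knowledge."""
    if residue == wild_type:
        return "wild type"
    name = KNOWN_RESISTANCE_SNPS.get((position, residue))
    if name:
        return f"known resistance SNP: {name}"
    # A substitution with no prior record: it may be silent, or it may
    # confer resistance -- only mechanistic tests can decide.
    return "novel variant: no prior information"

print(classify_variant(139, "C", "Y"))  # known resistance SNP
print(classify_variant(139, "Y", "Y"))  # wild type
print(classify_variant(61, "W", "L"))   # novel variant
```

The "novel variant" branch is the point at which DNA screening alone runs out: as discussed below, such variants still require follow-up with feeding or BCR tests.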
The severity of resistance conferred by the different SNPs, and therefore their importance in terms of practical rodent pest management, still requires interpretation using mechanistic studies, such as laboratory feeding tests and BCR tests.
However, care is required in the interpretation of the results of DNA screening surveys. Some genetic mutations are 'silent': they occur in regions of the genome that may also harbour significant resistance mutations but themselves have no observable effect on blood clotting, and therefore none on resistance. Other mutations may be found about which we have no prior information; these may be either silent or confer a significant degree of resistance. In yet other studies, resistant rodent strains have been discovered that possess no detectable DNA mutations at all.