Berlin Special Session—Integrating Different Tools for Assessing Soil Health Status
Kees van Gestel, Jörg Römbke, and Mónica João de Barros Amorim
The special session Integrating Different Tools for Assessing Soil Health Status was organized by the SETAC Global Soils Advisory Group (GSAG). The session first aimed at determining what level of knowledge is needed for a proper assessment of the quality or health status of a (contaminated) soil, to allow for a well-balanced decision regarding the management measures to be taken. Second, it aimed at reviewing the different tools available, ranging from single-species toxicity tests to new gene expression techniques. The possibility of integrating the information from different tools, and the added value of doing so, was discussed for two case studies.
The special session, which consisted of invited lectures only, received great attention with almost 200 participants. A summary of the lectures and case studies is provided below.
Joke van Wensem (Technical Committee Soil Protection, The Netherlands) set the scene by giving a historical overview of developments in the field of soil protection, both for predicting the toxicity of chemicals and for assessing the (ecological) risk and the priority of clean-up of contaminated land. She distinguished between policy-driven soil quality assessment and assessments not driven by policy. She observed that soil degradation is not on the list of environmental problems that people tend to mention when asked. One way of coping with this is to link soil quality to the bigger picture of ecosystem services.
Juliane Filser (Bremen University, Germany) gave an ecologist's perspective, mentioning land use, climate change and pollution as the major threats to soil quality. Referring to the concept of resilience, it may be hard to decide at what time after a disturbance an assessment of soil quality should be made, as an impact on the system will always be followed by succession. In other words, the system is dynamic, and single-time observations may not be able to provide a clear picture of the severity of the disturbance. What is needed are reliable data but also good reference data. She also considered it important that data can be produced quickly and easily. As a consequence, a pragmatic approach seems the best way forward. This approach requires a proper assessment of state and process variables, which will also be determined by the type of ecosystem to protect. Long-term observational data will be needed for the calibration of the proposed approach; where uncertainty remains, a second-tier approach with more advanced assessments can be included.
This proposal seemed to contradict the conclusions of Ryszard Laskowski (Jagiellonian University, Krakow, Poland) who argued that standard toxicity tests, in general, are performed under optimal conditions and are too short in duration. As a consequence, they are unable to predict responses under real-world conditions. He argued that unfavourable conditions, such as less optimal temperatures or salinities, should be included in testing. As an example, he showed how the effects of a combination of exposure to nickel and chlorpyrifos with different temperatures were almost impossible to predict. To support his arguments, he also referred to a recent study by Frouz et al. (2011; Appl. Soil Ecol.) who concluded that laboratory toxicity tests could not explain the abundance and diversity of soil invertebrate, algal and plant communities in a gradient of pollution.
Mike McLaughlin (CSIRO and Adelaide University, Australia) gave an overview of issues regarding exposure, focusing especially on metals. The free ion is important for membrane transport, but complexation may increase uptake, as was shown for cadmium uptake from CdCl2 solutions. The free ion therefore seems not to be the answer; it should be realized that the solid phase is also important because of its buffering capacity. To enable comparison of soils, two methods may be applicable. One is soil normalization; this is currently done by correcting for leaching and ageing and expressing concentrations on the basis of added metal. Another option is the use of selective extracts, but only a few accepted methods are available, and calibration is lacking. Mechanistic understanding is still lacking, which also holds for the biotic ligand model that is already applied in practice for the risk assessment of some metals.
Rick Scroggins (Environment Canada) concluded that standard test methods are basically needed to fulfill regulatory needs. Scroggins gave an overview of different toxicity test methods for soil, with OECD test methods mainly being used for predicting chemical effects, ISO tests mainly for retrospective testing of contaminated soils, and the tests developed by Environment Canada for both purposes. In addition to the standard single-species tests on plants and invertebrates, tests are available that determine both the functioning and the species diversity of the soil microbial community, for substance as well as contaminated land assessment. Semi-field tests using terrestrial model ecosystems, or focusing on ecosystem segments in the field, are also available but currently only used for the assessment of chemicals. New methods under development aim at more realistic testing, for example by focusing on boreal and subarctic soils, or on conditions reflective of tropical climates, such as soils from the tropics and higher test temperatures. In the discussion, it was suggested that the different tests available be connected to the different ecosystem services provided by the soil, to then identify what elements are still missing in the battery of currently available tests.
Dave Spurgeon (Centre for Ecology and Hydrology, NERC, UK) opened with the conclusion that post-genomic tools are now mature technologies in the environmental sciences; for example, microarrays have been used in the field almost since their inception in 1995, although they have only sparingly been applied in soil ecotoxicology. Comparative research has indicated that molecular markers, for example metallothionein levels, can be more sensitive than general ecological parameters. Frequently, the variation in gene expression between individuals is large. This can be seen as a problem for regulatory applications; however, it also provides an indication of the evolutionary response of populations to pollution, informing on long-term change. New efforts in genome sequencing are providing important information on the conservation of molecular targets between taxa, which has the potential to indicate the susceptibility of different species to different chemical stressors. There are many case studies where omics helped in understanding pollutant effects. An example was presented relating to the use of metabolic profiling and RNAseq to identify the role of phytochelatins and metallothionein in arsenic toxicology. Efforts to identify common responses to stress have, to date, yielded mixed results. For example, a study comparing metabolic profiles of earthworms from polluted and non-polluted sites identified site-specific differences, but no general metabolite indicative of pollution could be found. With the evolution of sequencing techniques and their decreasing costs, it is likely that transcriptomic information will become available for many organisms in the near future, so that mechanisms of action of stressors can be explored in a more phylogenetically structured way.
Michiel Rutgers (RIVM, The Netherlands) identified four cases where assessment of risk is needed and gave examples of methods that could be used. When a single stressor is present, with or without protection goals, the standard approach using laboratory toxicity data and applying a safety factor or species sensitivity distributions (SSD) is sufficient; the SSD is the only ecological theory that has made it into legislation. In case of multiple stressors, mixture toxicity is included in a Tier 2 approach, where toxic pressure is estimated from chemical concentrations, and a triad approach (chemistry, toxicology and ecological data interpretation) may be applied to assess actual risk. In Tier 3, a weight-of-evidence approach involving a complete triad may be applied. There is no general framework for multiple-stress assessment, nor is a generalized ecological theory available. In the last case, sustainable management without a specific orientation on stress, a proper reference is needed. The Dutch Biological Soil Quality Network (BISQ) could act as such a reference; in BISQ, data are available on soil food webs in sandy soils for 10 combinations of land use and soil type. Since none of these cases specifies what the protection goal is, Rutgers argued that the focus should be on ecosystem services, and he proposed a method for linking soil quality to ecosystem services.
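The SSD approach mentioned above can be illustrated with a minimal sketch. Assuming a log-normal distribution of species sensitivities, the HC5 (the concentration expected to be hazardous to 5% of species) follows directly from the mean and standard deviation of the log-transformed single-species endpoints. All endpoint values below are invented for illustration only.

```python
import math
import statistics

def hc5(toxicity_values, z05=-1.6449):
    """Estimate the HC5 from single-species toxicity endpoints
    (e.g. EC50s in mg/kg), assuming a log-normal species
    sensitivity distribution; z05 is the 5th percentile of the
    standard normal distribution."""
    logs = [math.log10(v) for v in toxicity_values]
    mu = statistics.mean(logs)
    sigma = statistics.stdev(logs)  # sample standard deviation
    return 10 ** (mu + z05 * sigma)

# Hypothetical EC50s for eight soil species (mg/kg):
ec50s = [12.0, 25.0, 40.0, 55.0, 80.0, 120.0, 200.0, 350.0]
print(f"HC5 = {hc5(ec50s):.1f} mg/kg")
```

In practice, dedicated tools additionally fit confidence limits around the HC5 and check the goodness of fit of the distribution; this sketch shows only the central estimate.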
The first case study, presented by Frank de Jong (RIVM, The Netherlands), focused on the predictive assessment of pesticide risks involving several tiers, with different approaches for non-persistent versus persistent active compounds. In a Dutch proposal, the higher tier also includes community recovery as an endpoint, with the probable effect concentration two years after application being used as an indication of the exposure level. The case study dealt with carbendazim, a fungicide for which many tests at different levels (single-species laboratory, model ecosystem, field) are available. De Jong also mentioned that the European Food Safety Authority (EFSA) is developing specific protection goals for ecosystem services, key drivers and key function-driver combinations, both in-field and off-field. An EFSA working group is drafting an opinion on soil risk assessment. He concluded that strikingly little use has been made in this area of the existing experience with contaminated soil assessment; great possibilities for integration exist here.
The second case study, dealing with the retrospective risk assessment of PAH-contaminated tar sites in Denmark, was presented by John Jensen (Aarhus University, Denmark). This case was elaborated by applying a triad approach, including chemical, toxicological and ecological observations. Jensen clearly showed that basing an assessment solely on chemical analysis (i.e., generic soil quality standards) may overestimate the risk of contamination. Bioassays and field observations will typically add further insight into the actual risk. He warned that decisions on how to scale and weigh the relevance of the different lines of evidence in the triad have to be made upfront; if such decisions are made afterward, they will always be biased by the outcome of the analysis. In the final assessment, not only the outcome of the weighing has to be considered, but also the variation (standard deviation) of the generated data, as well as the size and the current and future land use of the site in question. An advantage of the method is that it is transparent and easy to communicate.
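Jensen's point about fixing the scaling and weighting upfront can be sketched as follows. The per-sample risk scores (each already scaled to 0 = no effect, 1 = maximum effect), the three lines of evidence, and the weights are all hypothetical values chosen before inspecting the data, as the talk recommends; the sketch reports both the integrated risk and the spread between the lines of evidence.

```python
import statistics

# Hypothetical triad data: per-sample risk scores (0..1) for each
# line of evidence, with weights fixed in advance to avoid bias.
lines_of_evidence = {
    "chemistry":  {"values": [0.60, 0.70, 0.65], "weight": 1.0},
    "toxicology": {"values": [0.30, 0.45, 0.35], "weight": 2.0},
    "ecology":    {"values": [0.20, 0.25, 0.40], "weight": 2.0},
}

def integrated_risk(loe):
    """Weighted mean of the per-line risk scores, plus the standard
    deviation between the lines of evidence as a measure of
    (dis)agreement within the triad."""
    means = {name: statistics.mean(d["values"]) for name, d in loe.items()}
    total_weight = sum(d["weight"] for d in loe.values())
    risk = sum(means[n] * loe[n]["weight"] for n in loe) / total_weight
    spread = statistics.stdev(means.values())
    return risk, spread

risk, spread = integrated_risk(lines_of_evidence)
print(f"integrated risk = {risk:.2f}, spread between lines = {spread:.2f}")
```

Here the chemistry line scores highest, consistent with Jensen's observation that chemical analysis alone may overestimate the risk; a large spread between the lines signals that the integrated number should be interpreted with caution.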
The special session on soil quality gave a good overview of the available methods and interpretation approaches, with some of their current weaknesses highlighted. Unfortunately, time was lacking for an in-depth discussion of the pros and cons of the different approaches. But the excellent lectures and the responses of the audience indicated that good progress has been made in the last few decades and that there is great potential to reach scientific consensus on the integrated assessment of soil quality, for both predictive and retrospective approaches. The view of the members of the GSAG is that this discussion needs to continue. The GSAG steering committee will be taking measures to communicate and advance the proper use of these scientific data in soil protection frameworks and the new ecosystem services approach.
Author's contact information: email@example.com, firstname.lastname@example.org, email@example.com