1.1 Capability and suitability of the interim soil physical indicator (4.1e) to characterise changes created by forest harvesting

Indicator 4.1e is “the area and percent of forested land with significant compaction or change in soil physical properties resulting from human activities”. The interim indicator (MIG, 1998) is, “proportion of harvested forest area with significant change in bulk density of any horizon of the surface (0-30 cm) soil”. Raison et al. (1998) proposed that “significant” change be regarded as >20% change compared to the pre-harvesting value and/or the area with aeration porosity <10%. Rab (Appendix 1) further proposed that the area of significant change would be defined as areas:

- with greater than a 20% increase in bulk density in the surface 300 mm;
- with less than 10% absolute value of aeration porosity; and
- of S3 disturbance class.

Areas satisfying more than one of these conditions would not be counted twice. The data would first be interrogated to find all samples that meet the first condition; samples that did not meet it would then be examined to identify those that meet the second. Finally, any sampling points that met neither of the first two conditions but meet the third would be counted.
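The sequential, mutually exclusive counting rule described above can be sketched in code. The field names (`bd_change_pct`, `aeration_porosity_pct`, `disturbance_class`) are illustrative assumptions, not part of the interim protocol:

```python
# Sketch of the mutually exclusive counting rule: each sample is
# attributed to the FIRST condition it meets, so a sample satisfying
# several conditions is counted only once.

def classify_significant(sample):
    """Return the first condition a sampling point meets, or None."""
    if sample["bd_change_pct"] > 20:          # >20% bulk density increase (0-300 mm)
        return "bulk_density"
    if sample["aeration_porosity_pct"] < 10:  # <10% absolute aeration porosity
        return "aeration_porosity"
    if sample["disturbance_class"] == "S3":   # S3 disturbance class
        return "disturbance_class"
    return None

def percent_significant(samples):
    """Percent of sampling points showing 'significant' change."""
    flagged = sum(classify_significant(s) is not None for s in samples)
    return 100.0 * flagged / len(samples)
```

For example, a point with a 25% bulk density increase, 5% aeration porosity, and an S3 rating is counted once, under the bulk density condition, even though it meets all three.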

Both sets of studies showed that the interim indicator, and the proposed methods of assessing it, could be applied to evaluate the effects of a logging operation on the severity and spatial extent of changes in soil physical properties. We were able to show that the more intensive harvesting of the smaller mountain ash coupes was more likely to exceed the acceptable levels of the proposed indicator than the less intensive silvertop ash logging. This was partly a function of the higher timber yields per unit area from the mountain ash forest. (This was also reflected in the results of Rab and Dignan in Part II Study A, in which timber yield was significantly related to the distribution of soil disturbance categories.) Soil type, however, may also have been important: the soils from the Victorian studies are generally deeper, with a higher proportion of fine particle sizes and higher moisture retention, and would therefore be prone to greater change in soil properties for a given compactive effort. In the mountain ash forests there is thus a tendency both for a greater proportion of the area to be disturbed and for a larger proportion of that disturbed area to experience significant change. This was clearly demonstrated by applying the proposed indicator system to the two different logging systems of southeastern NSW and the Victorian central highlands.

A further test of the appropriateness of the indicator is whether the measured changes are meaningful with respect to sustainability, as measured by growth of the commercial species. In the NSW study, only a minor portion of the coupe, roughly equivalent to the area affected by S3 disturbance, could be shown to have lower productivity as a result of soil disturbance. In the mountain ash study, however, there was stronger evidence (though statistical proof was again difficult to achieve) that both S3 and S2 classes, or primary and secondary snig tracks, supported lower stem volumes than the S0 areas. Given the difficulty of obtaining definitive statistical support for these trends, the following interpretation must be regarded as tentative: the difference may again be at least partly a function of soil type. Because the soil of the mountain ash forest is of higher fertility, deterioration of soil physical properties is more likely to become the most limiting factor to growth, allowing a difference in growth between disturbance classes to be expressed. The fact that the interim indicator flagged potential problems, particularly in the degree of soil property change, for the logging of this forest type but not for the southern NSW forests again suggests that the indicator is reasonably well framed.

The sampling protocol used for the studies worked reasonably well; the only outstanding questions concern the optimum transect sampling method and whether the area surveyed should be the gross coupe or the net logged area. The proposed protocol, perhaps with some fine-tuning, works and could be adopted from a technical point of view.

The preceding discussion establishes that the interim indicator is capable of meaningfully characterising change due to harvesting, and that a scientifically rational method of measuring it exists. A remaining question is that of suitability. We take this to mean whether the indicator, as proposed, could be measured cost-effectively on enough coupes over time to provide data that could be aggregated from the sub-regional level for reporting at regional and national levels. Underlying this question is the high level of resources required for the detailed randomised surveys and soil sampling (and attendant laboratory processing) embodied in the proposed protocol. In analysing indicator 4.1e, MIG (1998) suggested that an important research priority was “examining the potential to use rut characteristics or remotely sensed data as surrogates for soil physical change.” Implicit in this statement is the fact that any indicator assessment based on actual soil sampling involves considerable time and labour, and hence cost.

The work required to measure soil physical properties to 30 cm depth is considerably greater than for the surface soil alone. Given the correspondence between change at depth and change at the surface, the first concession to operational practicality would be to restrict sampling to the surface 10 cm. This may require some adjustments to the interim indicator and protocols; for example, the degree of change in a soil property regarded as “significant” might be increased from 20% to 30%.

The next option for reducing the resources required, which, it must not be forgotten, will need to be borne by local forest administrations, is to switch “significant” change from a soil property basis to a disturbance class basis. This is the “potential to use rut characteristics or remotely sensed data as surrogates” referred to above. Both sets of studies showed a relatively strong association between disturbance class and the degree of change in soil physical properties. In NSW, access tracks, unrehabilitated log landings and, to a lesser extent, major snig tracks were associated with significant increases in bulk density and soil strength. Although not all samples in these classes exceeded a 20% increase in bulk density, a large proportion did; taken together with the results of the regeneration surveys, this suggests that these classes could be used as a surrogate for significant change in soil physical properties. This would greatly simplify and shorten the process, as one coupe could be assessed by a single operator equipped with GPS and a hand-held computer in one or two days of fieldwork and a further day or two of office time. The current methods based on the proposed protocol require at least two weeks of fieldwork by a two-member team, followed by several weeks of laboratory and data analysis time, amounting to a minimum of $20,000 per coupe.
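A surrogate assessment of this kind would reduce each coupe survey to an area tally by disturbance class. A minimal sketch, assuming hypothetical class names, with the set of surrogate classes drawn from the NSW results discussed above:

```python
# Sketch of a disturbance-class surrogate tally. The classes treated
# as surrogates for significant soil physical change are an assumption
# based on the NSW findings (access tracks, unrehabilitated log
# landings, major snig tracks); the class names are illustrative.

SURROGATE_CLASSES = {"access_track", "log_landing", "major_snig_track"}

def surrogate_significant_area(class_areas_ha):
    """Sum the mapped area (ha) of classes used as surrogates.

    class_areas_ha: dict mapping disturbance class -> area in hectares,
    e.g. derived from GPS-mapped polygons recorded for one coupe.
    """
    return sum(area for cls, area in class_areas_ha.items()
               if cls in SURROGATE_CLASSES)

coupe = {"access_track": 1.2, "log_landing": 0.8,
         "major_snig_track": 2.5, "general_harvest": 20.5}
gross_area = sum(coupe.values())                   # 25.0 ha
significant = surrogate_significant_area(coupe)    # 4.5 ha
```

For this hypothetical coupe the indicator would report 4.5 ha, or 18% of the gross coupe area, as significantly changed, with no soil sampling required.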

There is also an unresolved problem in using an indicator based on relative change. Consider a soil that has a bulk density of 0.5 Mg m⁻³, a penetration resistance of 0.4 MPa, and an aeration porosity of, say, 50%. The physical condition of such a soil for sustaining tree growth would probably be enhanced rather than reduced by compaction that causes a 25-50% increase (or decrease, in the case of aeration porosity) in these values. This is an extreme example, and may seem purely hypothetical, but similar results have in fact been found by researchers examining compaction of coarse sand in New Zealand (Zou et al. 2000). In coarse sandy soil, initial compaction increased the least limiting water range (LLWR), but further compaction decreased it. The LLWR defines a range of soil water content within which plant growth is least likely to be limited by the availability of water and air, or by resistance to root elongation (Zou et al. 2000). Parameters such as LLWR, however difficult they may be to measure accurately, are far more valuable for assessing detrimental soil change than simple indices such as absolute bulk density or percent change in individual properties.
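The relative-change pitfall can be illustrated numerically. In the sketch below, the >20% threshold is from the interim indicator and the loose-soil value is the hypothetical 0.5 Mg m⁻³ case above; the firm-soil values are additional illustrative assumptions:

```python
# Illustration of the relative-change pitfall discussed above.
# A 30% rise in bulk density on a very loose soil trips the >20%
# threshold even though the change may well be beneficial, while a
# smaller rise on an already firm soil, arguably closer to becoming
# growth-limiting, passes unflagged.

def significant_relative_change(before, after, threshold_pct=20):
    """True if the change relative to 'before' exceeds the threshold."""
    return abs(after - before) / before * 100 > threshold_pct

loose_before, loose_after = 0.5, 0.65   # Mg m^-3; a 30% increase
firm_before, firm_after = 1.3, 1.43     # Mg m^-3; a 10% increase (assumed values)

flags_loose = significant_relative_change(loose_before, loose_after)  # True
flags_firm = significant_relative_change(firm_before, firm_after)     # False
```

The purely relative rule thus flags the probably benign change and misses the potentially more consequential one, which is the asymmetry that parameters such as LLWR are intended to avoid.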

The conclusion, then, is that although the interim indicator appears quite capable of detecting significant change, there are considerable concerns over its suitability because of the resources required for its application.