The Gestion Intégrée des Bassins versants à l'aide d'un Système Informatisé (GIBSI), a semidistributed hydrological modeling system, was evaluated for its ability to simulate the impact of deforestation on the hydrological regime of the Famine River watershed (728 km2), a subwatershed of the Chaudière River, Québec, Canada. Annual, spring, summer, and low-water runoff, as well as peak flows, were estimated for both a base-case scenario and a deforestation scenario using 31 annual meteorological series. GIBSI simulated an average increase in annual runoff of 57% (268 mm) after clear-cutting, and the proportion of runoff to precipitation increased from 40% to 63%. The average increase in spring runoff was 25%, while in summer it was 138%. For summer low-flow periods, GIBSI simulated an average increase in runoff of 102%. Hydrographs generated by GIBSI showed that average spring peak flows increased after deforestation by 26%, while summer peak flows increased by 101%. Differences between spring and summer runoffs, as well as peak-flow rates, are due to changes in the degree of soil saturation and in actual evapotranspiration between the two scenarios. Hence, while land-use changes have a substantial impact on summer runoff and low flows, they have little impact on extreme peak-flow events, especially during spring (less than 10% or more than 90% nonexceeding probability). This suggests that land use has a limited role in controlling these extreme events. The simulation results obtained by GIBSI were consistent with those found in the literature. GIBSI therefore offers potential as a management tool for investigating measures to prevent and reduce deforestation effects on the hydrological regime of a watershed.
DeWalle (2003) recently proposed that research on forest hydrology should address new hydrological demands. Ecohydrology, which focuses on the interaction between the water cycle and the surrounding ecosystem, certainly meets this requirement.
During the last 5 years, Institut National de la Recherche Scientifique–Centre Eau, Terre et Environnement (INRS–ETE), in association with Environnement Québec, defined and developed the Gestion Intégrée des Bassins versants à l'aide d'un Système Informatisé (GIBSI), an integrated modeling and management system (Rousseau et al., 2000a; Rousseau et al., 2000b; Villeneuve et al., 1998). By means of simulation models and management modules, GIBSI allows for the simulation of the impact of detailed management scenarios of dams, land use, agricultural diffuse pollution, and point source discharges on water quantity and quality of a watershed river network. A data pre- and postprocessing system manages relations between the simulation models and a spatial and attribute database.
The objective of this study is to characterize the ability of GIBSI to simulate the impact of deforestation on the hydrological regime of the Famine River watershed (728 km2), a subwatershed of the Chaudière River (6682 km2), by analyzing annual runoff, seasonal runoff, low-water runoff, and peak flows.
2.1. Annual runoff
Studies on the impact of deforestation on annual runoff are numerous. Plamondon (1993) observed a direct link between the proportion of deforested area and the variation in annual runoff. Stednick (1996) estimated that the hydrological regime of a watershed is affected only when 20% or more of the total area has been deforested. More specifically, Bosch and Hewlett (1982) established that a land-use modification affecting softwood species over 10% of the total area could increase annual runoff by 40 mm, whereas the same modification would yield an increase of 25 mm for hardwood forest and 10 mm for ground covered with bushes and herbaceous plants. Hornbeck et al. (1970), based on a review of studies of 11 watersheds in the eastern United States, reported that an increase in annual runoff of 347 mm followed the deforestation of a watershed in the Hubbard Brook Experimental Forest (New Hampshire), where residual vegetation was controlled by herbicides. On the other hand, industrial deforestation, without control of vegetation regrowth, generated a smaller increase: 110–250 mm. Plamondon (1993) estimated that, for the hardwood and mixed woods found in Québec, Canada, runoff could increase by 50% after deforestation, while in the boreal zone runoff would increase by 15% when annual precipitation exceeded 1400 mm and by 50% when annual precipitation was lower than 900 mm.
2.2. Seasonal runoff
Deforestation on a watershed of Hubbard Brook (16 ha), treated with herbicides, generated an increase of runoff of about 5% in spring and 429% in summer (Hornbeck et al., 1970). Nicolson et al. (1982) measured an increase of 76% in spring and 225% in summer following deforestation on small watersheds (35–170 ha) covered mainly with softwood species in the Kenora region (Ontario, Canada).
2.3. Low-water runoffs
The effect of deforestation on low-water runoff is similar to its effect on annual runoff (Plamondon, 1993), but it is in summer that deforestation has the most substantial impact, because of the reduction in evapotranspiration. Data obtained from experimental watersheds in Ontario and New Brunswick, Canada, showed increases varying from 100% to 200% in summer low-water runoff following deforestation (Ordre des Ingénieurs Forestiers du Québec, 1996).
2.4. Peak flows
Most of the literature agrees that a main consequence of deforestation is an increase in peak flows during low-water periods (Storck et al., 1998; Beschta et al., 2000; Caissie et al., 2002). Deforestation has little effect on extreme peak flows because these events occur mainly after intense precipitation events, which fill the soil water storage capacity.
2.5. Investigation methods
The experimental method most commonly used to study the impact of deforestation on the hydrological regime is to pair watersheds. To approximate similar meteorological conditions, two watersheds located in the same area are evaluated. One of the watersheds serves as a control; the second watershed undergoes partial or total cutting. Flow rates of each of the watersheds are then gauged at their outlet. Interpretation of results is done by comparing data obtained for each watershed with respect to meteorological events. It is then possible to analyze the impact of deforestation on annual runoff, low-water runoff, and peak flows. Results obtained by this method are difficult to extrapolate to other sites because they depend on intrinsic watershed characteristics. Indeed, observations can vary according to topography, climate, soil type, leaf area, tree health, surface residues, and remaining vegetation (Plamondon, 1993).
Tools have been developed to help land managers predict the effects of deforestation or reforestation on the hydrological regime, taking into account land biophysical characteristics. Among these tools, simulation models can be used to compare the relative effects of various scenarios. A summary of some of these models is presented in Table 1. Some were conceived for research purposes, others for management purposes. Among the management models, one finds the following: (i) Meuser's model (Meuser, 1990), developed to estimate the effect of cuttings and plantations on the hydrological regime of small watersheds in the highlands of western Germany; (ii) the Institute of Hydrology (IH) model (Eeles and Blackie, 1993), developed to estimate the effect of reforestation on the hydrological regime of a watershed in the United Kingdom; (iii) Leaf and Alexander's model (Leaf and Alexander, 1975), developed to estimate the effect of group cuttings on the hydrological regime of watersheds in the Rockies (Colorado); (iv) the SVEN model (Ryan, 1979), developed to estimate the effect of cuttings on the hydrological regime of watersheds in the Pacific Northwest; (v) Betters' model (Betters, 1975), developed to estimate the long-term effect of strip cutting in lodgepole pine (Pinus contorta Dougl.) stands on the hydrological regime of watersheds in the Rockies (Colorado); and (vi) the PROSPER model (Swift et al., 1975), developed to produce the curves needed for the Water Resources Evaluation of Non-point Silvicultural Sources (WRENSS) procedure of the U.S. Department of Agriculture (USDA). Table 1 compares these simulation models according to model type, simulation time step, simulated processes, and modeling options.
3. Integrated modeling system description
GIBSI is composed of a database (including spatial and attribute data), distributed models (rainfall–runoff, soil erosion, agricultural–chemical transport, and water quality), management modules (land use, point sources, agricultural production systems, and reservoir management), a relational database management system, and a geographic information system. The semidistributed models can predict the impact of various management scenarios, in terms of water quantity and/or quality, all along the river network of a watershed. A detailed description of GIBSI can be found in Villeneuve et al. (1998) and Rousseau et al. (2000a,b). A brief summary of the rainfall–runoff simulation processes of GIBSI is presented in the following paragraphs.
Spatial discretization of a watershed is done with two types of computational elements: (i) river segments and (ii) relatively homogeneous hydrological units (RHHUs). River segments are one-dimensional elements that support watercourse-flow simulation processes, whereas RHHUs are basic elements for all rainfall–runoff simulation processes. RHHUs represent elementary subwatersheds taking into account the spatial variability of topography, land use, soil types, and meteorological variables. RHHUs are determined with PHYSITEL, a software tool that divides a watershed into RHHUs using a digital elevation model (DEM) and a digitized drainage network (Turcotte et al., 2001).
Rainfall–runoff processes are simulated using HYDROTEL (Turcotte et al., 2003; Fortin et al., 1995; Fortin et al., 2001a; Fortin et al., 2001b). HYDROTEL, a semidistributed, physically based model, integrates six computational modules that are run in a cascade (i.e., in a decoupled manner): weather data interpolation, snow cover dynamics, potential evapotranspiration, soil moisture balance, surface runoff, and streamflow. Each module offers more than one computational algorithm, selected according to the availability of data for the studied watershed. Some algorithms, developed from physically based principles, retain some empirical aspects, while others are fully empirical. Rainfall–runoff processes can be modeled on a 1–24-h time step.
Precipitation data can be obtained either from meteorological stations or from a weather radar. A mixed degree-day/energy-budget approach is used to simulate daily variations of mean snowpack characteristics: thickness, water equivalent, mean density, thermal deficit, liquid water content, and temperature. An empirical equation developed by Hydro-Québec, which requires only air temperatures (Fortin, 2001), may be used to estimate potential evapotranspiration (PET). Actual evapotranspiration (AET) is evaluated using PET, soil moisture, soil type, and land-use characteristics such as leaf area index, root depth, and relative soil water content. Note that when meteorological data such as solar radiation, wind velocity, and/or relative humidity are available, other methods may be used to compute PET, namely Thornthwaite (1948), Linacre (1977), Penman–Monteith (Penman, 1948; Monteith, 1965), and Priestley–Taylor (Priestley and Taylor, 1972).
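The degree-day component of such a snow model can be illustrated with a minimal sketch. The melt factor and base temperature below are illustrative values only, not HYDROTEL's calibrated parameters:

```python
def degree_day_melt(temps_c, melt_factor=3.0, base_temp=0.0):
    """Daily snowmelt (mm of water equivalent) from a classic
    degree-day relation: melt is proportional to the excess of air
    temperature over a base temperature, and zero otherwise.

    melt_factor is in mm per degree Celsius per day; both parameter
    values here are generic textbook choices (assumptions).
    """
    return [max(0.0, melt_factor * (t - base_temp)) for t in temps_c]
```

A full mixed approach would supplement this with an energy-budget term (thermal deficit, liquid water routing), which is omitted here for brevity.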
Vertical water balance for each RHHU is determined with an algorithm that divides the soil into three layers of variable depths. The uppermost layer controls infiltration through the soil and soil water content decreases essentially as AET increases. The second layer is a transient layer; soil water content can vary with a change in the AET rate. The last layer is saturated, but soil water content can be modified by transpiration when plant roots reach deep enough. This method is a good compromise between theoretical and empirical approaches. This algorithm accounts for spatial variability of soil type over the watershed. It can also consider physical soil characteristics like porosity, hydraulic conductivity, and capillary capacity for the RHHU. These characteristics can be measured in situ or taken from the literature or soil database.
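The three-layer vertical budget described above can be illustrated with a simple bucket sketch. This is a didactic approximation under assumed parameters, not HYDROTEL's actual algorithm:

```python
def three_layer_step(theta, depth_mm, porosity, infil_mm, aet_mm):
    """One daily step of an illustrative three-layer bucket model.

    theta: volumetric water content of each layer (top to bottom);
    depth_mm: layer thicknesses in mm; porosity: saturation limit of
    each layer. Infiltration enters the top layer, AET is withdrawn
    from it, and any water above saturation cascades to the layer
    below; the remainder leaves as deep drainage.
    """
    storage = [t * d for t, d in zip(theta, depth_mm)]
    storage[0] = max(storage[0] + infil_mm - aet_mm, 0.0)
    drainage = 0.0
    for i in range(len(storage)):
        storage[i] += drainage
        cap = porosity[i] * depth_mm[i]
        drainage = max(0.0, storage[i] - cap)  # excess moves downward
        storage[i] = min(storage[i], cap)
    new_theta = [s / d for s, d in zip(storage, depth_mm)]
    return new_theta, drainage
```

In HYDROTEL the fluxes between layers are driven by soil hydraulic properties (conductivity, capillarity) rather than a simple overflow rule; this sketch only conveys the layered-storage idea.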
Surface runoff is estimated for each time step using a geomorphologic unit hydrograph after calculating the vertical water balance. The shape of this unit hydrograph is determined by routing a reference depth of water over all DEM cells of an RHHU according to a kinematic wave model. This method takes into account the flow structure within an RHHU and pathway roughness to the river segment. Two algorithms allow for simulation of streamflow: kinematic wave and diffusing wave. These consider slope, width, and roughness of each river segment of the drainage network, thus allowing for simulating flow in a natural (rivers and lakes) or controlled way (dams).
4. HYDROTEL calibration
A detailed description of the calibration and validation of HYDROTEL can be found in Fortin et al. (2001b). HYDROTEL was calibrated on the Chaudière River watershed (6680 km2), located on the south shore of the Saint Lawrence River, south of Québec City, Canada (Figure 1). Soils vary from marine deposits and peat to till and rocky outcrops. Land use was initially divided into 10 classes: urban areas and roads, pasture and alfalfa, cereals, corn, water, wetlands, bare soils, bushes, hardwood forest, and softwood forest. Model parameters were calibrated using simulated and measured streamflows for the 1989–90 and 1993–94 hydrological years (October–September). The model efficiencies (Nash and Sutcliffe, 1970) obtained from this calibration were satisfactory: 0.88 and 0.83 for 1989–90 and 1993–94, respectively.
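The efficiency criterion cited above is the standard Nash–Sutcliffe measure, one minus the ratio of the squared simulation error to the variance of the observations; a value of 1 indicates a perfect fit, and 0 indicates skill no better than the observed mean. A minimal reference implementation:

```python
def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe model efficiency (Nash and Sutcliffe, 1970):
    NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2).
    """
    mean_obs = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    sst = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - sse / sst
```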
Temporal validation of the calibration was performed for the hydrological years (October–September) of 1987–88 and 1990–91. To verify long-term behavior, the model was evaluated on a 10-yr period from October 1984 to September 1993. A spatial and temporal validation of the calibration was also performed for the Famine and the Beaurivage subwatersheds for both the calibration and validation periods. For the calibration exercise, model efficiency was 0.83 for the two temporal periods; as for spatial validation, model efficiency varied between 0.78 and 0.88. The mean 10-yr validation gave a model efficiency of 0.89.
To complete the calibration and verification exercises, snow survey data were compared to simulated water-equivalent depths of the snowpack. These comparisons helped explain some discrepancies between measured and simulated streamflows and verify their causes. This verification was a point check at eight snow survey stations rather than a spatial corroboration over the entire watershed. The results showed that, on average, snow accumulation and snowmelt were well simulated at most stations; differences at specific stations were mostly due to the representativeness of the available meteorological data. Even though this verification is only a point check of the water-equivalent depth of the snowpack during a simulation run, it helps in understanding what is happening in various parts of the watershed and provides an additional way of checking the accuracy of the simulations. More details on the simulation of the snowpack can be found in Fortin et al. (2001b).
From the calibration and validation exercises, the final spatial division of the Chaudière River watershed gave 1880 RHHUs, with an average area of 350 ha. The soil was divided vertically as follows, starting from the soil surface: first layer (6.25 cm deep), second layer (15.75 cm deep), and the third layer (175.00 cm deep).
5. Materials and methods
5.1. Site description
Simulations were performed for a subwatershed of the Chaudière River watershed: the Famine River (728 km2). Figure 1 shows the location of the Famine River watershed along with its land use. It has a mean elevation of 370 m, varying from 180 to 550 m; 30% of the total area of the watershed is occupied by hardwood forests (i.e., deciduous) and 41% by softwood forests (i.e., coniferous). Mean annual rainfall depth is 1177 mm, varying from 923 to 1528 mm, recorded over a 31-yr period at 30 meteorological stations located on or near the watershed.
5.2. Scenario description

Scenarios were created to evaluate large-scale deforestation effects on the hydrological regime. They were built using the Land Use Scenario Editor of GIBSI. The base-case scenario (BS) consisted of the current land use shown in Figure 1 (upper-left panel). The deforestation scenario (DS) involved a change in deciduous and coniferous forests to bare soils (Figure 1, lower-right panel). This alteration increased the proportion of the bare-soil class from 1% to 72% of the total area of the watershed. With the change in land-use classes, the Manning coefficient shifted from 0.3 to 0.1 and the root depth was set to zero because no vegetation was simulated after deforestation.
5.3. Hydrological variables estimated
Five hydrological variables were estimated in this study to evaluate the impact of deforestation on the hydrological regime using frequency analysis: (i) annual runoff, (ii) spring runoff, (iii) summer runoff, (iv) peak-flow rates, and (v) low-water runoff. These variables were computed for hydrological years running from 1 November to 31 October. Only one year was simulated at a time since GIBSI does not simulate forest growth. For this reason, the long-term effects of deforestation, or the number of years required for deforestation effects on water yield and peak-flow rates to subside, were beyond the scope of this study.
To simulate the impact of DS on the hydrological regime, a composite sample was established using 31 annual meteorological series, with data taken between November 1964 and November 1995. This composite sample was built for each of the five hydrological variables estimated. For each of the 31 meteorological series, GIBSI was then run twice: the first run used BS to establish the hydrological regime of the watershed before deforestation, while the second run used DS to simulate the behavior of the watershed after deforestation.
The hydrological variables were estimated as follows. Annual runoff was obtained by summing all daily runoff simulated at the outlet of the Famine River subwatershed for each of the 31 yr for the two scenarios. Low-water runoff was obtained as the mean runoff over the seven consecutive days between 21 June and 20 September with the smallest flows, for each of the two scenarios. The low-water runoffs for the two scenarios did not necessarily correspond to the same seven consecutive days.
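The seven-day low-water computation described above can be sketched as a sliding-window minimum:

```python
def seven_day_low_flow(daily_runoff):
    """Mean runoff over the seven consecutive days with the smallest
    total, i.e. the 7-day low-water runoff.

    daily_runoff: daily values already restricted to the summer
    window (21 June-20 September), as in the study.
    """
    n = 7
    window_sums = [sum(daily_runoff[i:i + n])
                   for i in range(len(daily_runoff) - n + 1)]
    return min(window_sums) / n
```

Applied independently to the BS and DS series, the selected seven-day windows need not coincide, which matches the remark above.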
Spring and summer runoffs and peak-flow rates were obtained by taking the sum of daily runoffs and the maximum daily peak-flow rates for spring (21 March–20 June) and summer (21 June–20 September) of each simulated year for the two scenarios. To account for peak-flow lag time, the dates of the two most significant peaks were compared between seasons and years for the two scenarios.
The degree of saturation of the soil profile was estimated in relation to each of the five hydrological variables simulated because it conditions surface runoff and subsurface flow (Chow et al., 1988). The degree of saturation was calculated as the daily mean degree of saturation of the first two soil layers, weighted by layer depth. The third soil layer was not taken into account because it is mostly saturated; including it would have concealed the differences between the two other layers.
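The depth-weighted degree of saturation can be computed as follows; the default layer depths are the thicknesses reported for the Chaudière discretization in section 4 (6.25 and 15.75 cm), with the saturated third layer deliberately excluded:

```python
def weighted_saturation(s1, s2, d1=6.25, d2=15.75):
    """Depth-weighted degree of saturation of the two upper soil
    layers: (s1*d1 + s2*d2) / (d1 + d2).

    s1, s2: degree of saturation (0-1) of the first and second
    layers; d1, d2: layer thicknesses (cm).
    """
    return (s1 * d1 + s2 * d2) / (d1 + d2)
```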
6. Results and discussion
6.1. Annual runoff
Figure 2a shows the results of simulated annual runoff for BS and DS on normal probability paper. For BS, the median annual runoff is 463 mm, with values varying between 283 and 770 mm; the mean is 472 mm. For DS, the median equals 743 mm and the range varies between 538 and 1056 mm; the mean is equal to 740 mm. An average increase of 268 mm (57%) in annual runoff is calculated between the scenarios before and after deforestation. As shown in Figure 2a, in 75% of the cases, annual runoff does not exceed 554 mm before deforestation, while after deforestation this value rises to 875 mm, an increase of 271 mm.
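Placing the 31 simulated values on normal probability paper requires assigning each sorted value an empirical nonexceedance probability. The paper does not state which plotting-position formula was used; the Weibull formula i/(n+1) sketched below is one common choice:

```python
def weibull_plotting_positions(values):
    """Pair each sorted value with its Weibull plotting position
    i/(n+1), an empirical nonexceedance probability suitable for
    drawing a sample on probability paper.
    """
    ordered = sorted(values)
    n = len(ordered)
    return [((i + 1) / (n + 1), v) for i, v in enumerate(ordered)]
```

With n = 31 annual values, the positions run from 1/32 to 31/32, covering the "less than 10% or more than 90%" tails discussed for extreme events.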
Results obtained from paired-watershed data showed that, when regrowth was controlled by herbicides, annual runoff increased by 249 mm (Hornbeck et al., 1993). When vegetation is not controlled by herbicides, the increase can vary between 110 and 250 mm when 100% of the watershed is clear-cut. By scaling the results obtained by Bosch and Hewlett (1982) to the same deforested area (71%), an approximate increase of 239 mm in annual runoff after deforestation was calculated. Plamondon (1993) estimated an annual runoff increase of 230 mm for a watershed located 120 km east of the Famine watershed. Stednick (1996) presents a summary of paired-watershed studies used to assess water yield after forest removal; the compiled results showed that annual water yield increased by 0 to 400 mm after harvesting of an entire watershed. This large variation among results is due to differences in watershed physical characteristics (land use, slope, size, area deforested, hydrological region, annual precipitation). These results suggest that the annual runoff response to deforestation is nonlinear.
The proportion of annual runoff to annual precipitation for BS is 40%. After deforestation, 63% of annual precipitation depth is lost by runoff when 71% of the watershed has been deforested. This response from the hydrological model was expected because removal of the vegetation canopy would reduce AET losses. More water is then available for runoff.
Results obtained by GIBSI for annual runoff are consistent with what has been reported in the literature. Annual runoff variations from one study to another are due to the physical characteristics of the study site and to the location of the clear-cuttings within the watershed. Annual runoff increases are nonlinear and studies focusing on long-term deforestation effects should be done to estimate these variations.
6.2. Spring runoff
Spring runoff results using normal probability paper are shown in Figures 2b and 2c. For BS, the median is equal to 244 mm and the range varies between 29 and 473 mm. The mean is equal to 254 mm. After deforestation, those results are equal to 344, 67–515, and 317 mm for the median, the range, and the mean, respectively. A 25% increase (63 mm) between BS and DS is observed for the 31 yr. Quartile analysis shows that spring runoff does not exceed, in 75% of cases, 296 mm for BS and 361 mm for DS, an increase of 65 mm.
Spring peak-flow results (Figure 2c) gave, for BS, a mean of 200, a median of 201, and a range of 10–375 m3 s−1. For DS, the mean is 252, the median 259, and the range 30–433 m3 s−1. The increase in spring peak flow after deforestation (difference between BS and DS peak flows) is equal to 52 m3 s−1, or 26%. Storck et al. (1998) observed a 30% increase in spring peak flow for the Little Naches watershed (Washington). The results obtained by GIBSI are thus consistent with these observations.
The spring peak flow after deforestation (DS) occurs 0.70 day earlier than that of BS. Martin et al. (2000) reported a peak occurring 17 days earlier. The difference in timing is probably due to the fact that watershed slopes in the Martin et al. (2000) study vary between 12.1% and 15.8%, while the average slope of the Famine River watershed is 2.6%. The difference can also be attributed to watershed response to snowmelt.
From Figure 2c, looking at extreme events (less than 10% or more than 90% nonexceeding probability), land-use changes seem to have little effect on spring peak flows: the BS and DS curves nearly coincide. For a 10% nonexceeding probability, only a 5 m3 s−1 difference is calculated between BS and DS, and for a 90% nonexceeding probability the difference in peak flow is 16 m3 s−1. This suggests that land use has a limited role in controlling extreme events.
6.3. Summer runoff
Figure 2d presents summer runoff on normal probability paper for BS and DS. Summer runoff for BS has a median of 65 mm and a range between 17 and 216 mm; the mean is equal to 81 mm. For DS, the median is equal to 185 mm, the range varies between 53 and 346 mm, and the mean equals 193 mm. The increase in summer runoff between BS and DS is, on average, equal to 112 mm, or 138% of BS. The increase in runoff is thus much larger for summer than for spring (138% vs. 25%). Hornbeck et al. (1970) and Nicolson et al. (1982) also observed that the runoff increase was greater during summer than during spring. Even when a smaller portion of a watershed was harvested (23.4%), Caissie et al. (2002) observed that the only detectable effect on the hydrological regime was on summer peak flows.
Low-water runoff results are shown in Figure 2e. For BS, the median is equal to 0.85 mm, while the range varies between 0.37 and 2.65 mm; the mean is equal to 1.09 mm. For DS, these results are 1.60 (median), 0.64–6.10 (range), and 2.20 mm (mean). The average increase between the two scenarios is 1.11 mm, or 102%. This increase is similar to observations made in Ontario and New Brunswick, where paired-watershed results showed increases between 100% and 200% following deforestation of the entire watershed (Ordre des Ingénieurs Forestiers du Québec, 1996). From Figure 2e, for extreme events of 90% or more nonexceeding probability, the increase in summer low-water runoff is 121%; for a 95% nonexceeding probability, the increase is 132%. This observation suggests that land-use changes do have an effect on low-water runoff. More water is then available in stream reaches, which could be beneficial to aquatic ecosystems.
Simulation results for summer peak flows are shown in Figure 2f. Before deforestation, the average peak-flow rate is 73 m3 s−1, with values ranging between 5 and 328 m3 s−1. After deforestation, the average peak flow is equal to 147 m3 s−1, with values ranging between 24 and 393 m3 s−1. The difference between the two scenarios shows an increase of 74 m3 s−1 (101%). Other authors (Keppeler and Ziemer, 1990; Wynn et al., 2000; Martin et al., 2000; Caissie et al., 2002) similarly observed greater summer peak flows after deforestation compared to undisturbed watersheds. From Figure 2f, for extreme events (less than 10% or more than 90% nonexceeding probability), land use seems to have only a small effect on peak flows.
As an example, Figure 3 shows the increase in 1995 summer runoff as a function of precipitation events. As shown in the figure, between 11 July and 30 August the runoff response to precipitation events is larger for DS than for BS. For the 22 July precipitation event (65 mm), the runoff response simulated by GIBSI occurred 2 days after the event, with a runoff increase of 135 mm between the two scenarios. These results confirm that deforestation has a strong effect on summer runoff.
Figure 4 shows the relationship between the increase in runoff and the change in the degree of soil saturation between the two scenarios for spring and summer. As presented, the data fall into two quadrants. The lower-left quadrant contains most of the variations in the degree of saturation related to the spring period; by extrapolation, the average increase in runoff related to the variation in the degree of soil saturation during spring is 30%. The upper-right quadrant encloses most of the variations in the degree of saturation related to the summer period, with an average runoff increase of 200%. This suggests that greater variations in the degree of saturation, and greater runoff increases, occur during summer than during spring. These results follow the same trend as those presented in Figure 2: a spring increase of 25% and a summer increase of 138%. A logical explanation for the larger variations in the degree of soil saturation during summer is that less AET occurs after deforestation, so that the degree of saturation increases during summer with precipitation events. To corroborate these observations, simulation results for the degree of soil saturation are presented in section 6.4.
As a result, summer runoff simulated by GIBSI is consistent with what has been observed from experimental paired watersheds and anticipated from theoretical hydrological processes.
6.4. Degree of saturation of the soil
During spring, the degree of saturation of the soil profile before deforestation varied between 0.74 and 0.90, with a mean of 0.82. After deforestation, the degree of saturation increased, on average, to 0.88, varying from 0.83 to 0.94. The average difference between the degree of saturation before and after deforestation is 0.06. For the summer period, the degree of saturation for BS varied between 0.64 and 0.83, with a mean of 0.73. For DS, the degree of saturation for the summer period varied between 0.78 and 0.90, with a mean of 0.85. The average difference between BS and DS is equal to 0.12. The difference in the degree of saturation is smaller in spring than in summer. This is due to less AET in spring when air temperature is low and to the absence of a canopy for half of the season. In summer, variability in the degree of saturation is more significant between the two scenarios because no vegetation transpiration occurs after deforestation. A large change occurs in summer because it is the time of the year in which deforestation leads to the greatest physical changes to the system.
7. Conclusion

The objective of this study was to characterize the predictive power of GIBSI, an integrated hydrological watershed modeling system, in simulating the impact of large-scale deforestation on the hydrological regime of a watershed. The results showed that GIBSI can be used on large watersheds for land management planning with respect to water resources. Annual, spring, and summer runoff simulation results were consistent with observations from experimental paired watersheds. Further simulation studies could be conducted to validate the predictive power of GIBSI for smaller deforested areas, as well as for the long-term effects of vegetation regrowth; this could be coupled with an application on paired watersheds. Finally, the integration of a canopy model simulating vegetation growth would allow for the estimation of cumulative deforestation impacts over several consecutive years.
*Corresponding author address: Alain N. Rousseau, INRS-ETE, 2800 Einstein, C.P. 7500, Sainte-Foy, Québec G1V 4C7, Canada. email@example.com
This article is included in the Land Use and Ecosystems special collection.