This study was conducted to quantify soil respiration (SR) in an apple orchard and to determine its relationship with key environmental factors such as air temperature, soil temperature, and soil moisture content. The experiment was carried out from 23 April 2007 to 31 March 2008 in a 'Fuji' apple orchard of the National Institute of Horticultural and Herbal Science in Suwon, Gyeonggi-do, Korea. SR was measured with an automatic opening/closing chamber system based on a closed-chamber method. Diurnal variations in SR showed an increase from around 0700 hours with rising soil temperature, a peak between 1400 and 1500 hours, and a gradual decrease thereafter. Daily variations in SR depended largely on soil and air temperatures over the year, ranging from 0.8 to 13.7 g. During the summer rainy spell (July to August), with higher temperatures and more precipitation, SR was lower than in spring (May to June), when temperatures were moderate. SR showed a significant exponential relationship with soil temperature ($r^2
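The exponential SR-temperature relationship described above is commonly fitted by linear regression on log-transformed SR, from which a temperature sensitivity coefficient (Q10) can also be derived. The sketch below is illustrative only: the function name and the synthetic data are assumptions, not the study's measurements or fitting procedure.

```python
import numpy as np

def fit_exponential_sr(soil_temp, sr):
    """Fit SR = a * exp(b * T) by linear regression on log(SR).

    Returns (a, b, r_squared) where r_squared is computed on the
    log-transformed values. Hypothetical helper; the study's actual
    fitting procedure is not specified in the abstract.
    """
    soil_temp = np.asarray(soil_temp, dtype=float)
    log_sr = np.log(np.asarray(sr, dtype=float))
    b, log_a = np.polyfit(soil_temp, log_sr, 1)
    pred = log_a + b * soil_temp
    ss_res = np.sum((log_sr - pred) ** 2)
    ss_tot = np.sum((log_sr - log_sr.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    return np.exp(log_a), b, r2

# Synthetic (not measured) data following an exact exponential:
t = np.array([5.0, 10.0, 15.0, 20.0, 25.0])
sr = 0.9 * np.exp(0.11 * t)
a, b, r2 = fit_exponential_sr(t, sr)
q10 = np.exp(10.0 * b)  # sensitivity of SR to a 10 degree C warming
```

Because the synthetic data lie exactly on an exponential curve, the fit recovers a = 0.9 and b = 0.11 with r² = 1; with field data the scatter would of course be larger.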

Most deciduous trees in the temperate zone are dormant during winter to survive cold and dry conditions. Dormancy of deciduous fruit trees is usually divided into a period of rest, governed by physiological conditions, and a period of quiescence, governed by unfavorable environmental conditions. Inconsistent and reduced budburst in pear orchards has recently been reported in South Korea and Japan, and insufficient chilling due to warmer winters is suspected to play a role. Accurate prediction of flowering time under climate change scenarios may be critical to planning adaptation strategies for the pear industry. However, existing methods for predicting budburst depend on spring temperature, neglecting the potential effects of warmer winters on rest release and subsequent budburst. We adapted a dormancy clock model, which uses daily temperature data to calculate thermal time, for simulating the winter phenology of deciduous trees, and tested its feasibility in predicting budburst and flowering of 'Niitaka' pear, one of the favorite cultivars in Korea. To derive model parameter values suitable for 'Niitaka', the mean time for rest release was estimated by observing budburst of field-collected twigs in a controlled environment. The thermal time (in chill-days) was calculated and accumulated over a predefined temperature range from fall harvest until the chilling requirement (the maximum accumulated chill-days, a negative number) was met. The chilling requirement is then offset by anti-chill days (positive numbers) until the accumulated chill-days reach zero, which is assumed to mark the budburst date. Calculations were repeated with arbitrary threshold temperatures from to (at an interval of 0.1), and the set of threshold temperature and chilling requirement was selected for which the estimated budburst date coincided with the field observation.
A heating requirement (the accumulation of anti-chill days since budburst) for flowering was also determined from an experiment based on historical observations. The dormancy clock model, optimized with the selected parameter values, was used to predict flowering of 'Niitaka' pear grown in Suwon for the most recent 9 years. The predicted full-bloom dates fell within the range of the observed dates, with a root mean square error of 1.9 days.
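The chill-day bookkeeping described above can be sketched as follows. This is a simplified illustration using daily mean temperatures and a single threshold; the actual model's daily formulas (which use maximum and minimum temperatures) and parameter values are not reproduced here, and all names are assumptions.

```python
def predict_phenology(daily_tmean, t_threshold, chill_req, heat_req):
    """Simplified chill-days dormancy clock (sketch, not the paper's model).

    daily_tmean: daily mean temperatures starting at fall harvest.
    t_threshold: threshold temperature separating chilling from forcing.
    chill_req:   chilling requirement, a negative number of chill-days.
    heat_req:    anti-chill days needed after budburst for full bloom.

    Returns (budburst_day, bloom_day) as day offsets from the start,
    with None for any stage whose requirement is never met.
    """
    cd = 0.0          # accumulated chill-days (negative during rest)
    in_rest = True    # rest phase until the chilling requirement is met
    budburst = None
    heat = 0.0
    for day, t in enumerate(daily_tmean):
        if budburst is None:
            if in_rest:
                if t < t_threshold:
                    cd -= (t_threshold - t)   # chill-days accumulate negatively
                if cd <= chill_req:
                    in_rest = False           # rest release: quiescence begins
            else:
                if t > t_threshold:
                    cd += (t - t_threshold)   # anti-chill days offset the chill
                if cd >= 0.0:
                    budburst = day            # accumulated chill-days reach zero
        else:
            if t > t_threshold:
                heat += (t - t_threshold)     # forcing toward full bloom
            if heat >= heat_req:
                return budburst, day
    return budburst, None
```

For example, with 10 days at 0 degrees followed by 40 days at 12 degrees, a 7-degree threshold, a chilling requirement of -50 chill-days, and a heating requirement of 100, the sketch predicts budburst on day 21 and full bloom on day 41.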

The demand for rainfall data in gridded digital formats has increased in recent years due to the close linkage between hydrological models and decision support systems based on geographic information systems. One of the most widely used tools for digital rainfall mapping is PRISM (parameter-elevation regressions on independent slopes model), which uses point data (rain gauge stations), a digital elevation model (DEM), and other spatial datasets to generate repeatable estimates of monthly and annual precipitation. In PRISM, rain gauge stations are assigned weights that account for climatically important factors besides elevation, and aspect and topographic exposure are simulated by dividing the terrain into topographic facets. The size of a facet or grid cell is determined by the density of rain gauge stations, and a grid cell is considered the lowest limit under the conditions in Korea. The PRISM algorithms were implemented with a 270 m DEM for South Korea in a scripting environment (Python), and the weights for each 270 m grid cell were derived from monthly data from 432 official rain gauge stations. Weighted monthly precipitation data from at least 5 nearby stations for each grid cell were regressed against elevation, and the selected linear regression equations, combined with the 270 m DEM, were used to generate a digital precipitation map of South Korea at 270 m resolution. Among the 1.25 million grid cells, precipitation estimates were extracted at the 166 cells where measurements were made by the Korea Water Corporation rain gauge network, and the monthly estimation errors were evaluated. An average 10% reduction in root mean square error (RMSE) was found for months with more than 100 mm of monthly precipitation, compared with the RMSE of the original 5 km PRISM estimates.
This modified PRISM may be used for rainfall mapping in the rainy season (May to September) at a much higher spatial resolution than the original PRISM without loss of accuracy.
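The per-cell regression step described above can be sketched as a weighted linear fit of station precipitation against station elevation, evaluated at the cell's DEM elevation. In the real PRISM the weights combine distance, facet membership, and other climatological factors; here they are supplied directly, and the function name and numbers are illustrative assumptions.

```python
import numpy as np

def cell_precip_estimate(station_elev, station_precip, weights, cell_elev):
    """PRISM-style precipitation estimate for one grid cell (sketch).

    Fits a weighted linear regression of station precipitation on
    station elevation, then evaluates the fitted line at the grid
    cell's DEM elevation. The weighting scheme of the actual PRISM
    (distance, topographic facet, etc.) is abstracted away here.
    """
    slope, intercept = np.polyfit(station_elev, station_precip, 1, w=weights)
    return slope * cell_elev + intercept

# Illustrative stations: precipitation increasing 0.1 mm per metre.
elev = np.array([100.0, 200.0, 300.0, 400.0, 500.0])
precip = np.array([110.0, 120.0, 130.0, 140.0, 150.0])
estimate = cell_precip_estimate(elev, precip, np.ones(5), 350.0)
```

With these perfectly linear stations the estimate at 350 m elevation is 135 mm; real station data would scatter around the fitted line.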

Net radiation is a driving force of the biological and physical processes between the surface and the atmosphere, and knowledge of it is critical to weather forecasting and water resource management. Measurements of net radiation are, however, scarce, and it is typically estimated from empirical relationships. This study presents two methods of estimating net radiation over three major plant functional types in Korea: a deciduous forest, a coniferous forest, and a farmland. One is a linear regression of net radiation on solar radiation; the other is a radiation balance method. The two methods were examined using data collected in 2008 at the three sites. Based on the linear regression method over a year, net radiation was 70% of the incoming shortwave radiation for the deciduous forest, 79% for the coniferous forest, and 64% for the farmland, indicating that the relationship is plant functional type-specific. For the radiation balance method, inclusion of the longwave radiation components slightly improved the estimates. Overall, there was good agreement between the observed and estimated net radiation from both methods, indicating that both are reliably applicable.
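The two estimation approaches can be sketched side by side. The function names are assumptions; the regression slopes shown are the site-specific fractions reported in the abstract, and in practice both the slope and the albedo would be fitted or measured per site.

```python
def rn_from_regression(rs, slope, intercept=0.0):
    """Linear-regression estimate: Rn ~= slope * Rs (+ intercept).

    The study reports slopes of about 0.70 (deciduous forest),
    0.79 (coniferous forest), and 0.64 (farmland); site-specific
    fitting is assumed.
    """
    return slope * rs + intercept

def rn_from_balance(rs, albedo, lw_in, lw_out):
    """Radiation-balance estimate:
    Rn = (1 - albedo) * Rs + Lin - Lout,
    where Lin and Lout are the incoming and outgoing longwave
    radiation components whose inclusion improved the estimates.
    """
    return (1.0 - albedo) * rs + lw_in - lw_out

# Illustrative fluxes in W m-2 (not the study's measurements):
rn_reg = rn_from_regression(500.0, 0.70)
rn_bal = rn_from_balance(500.0, 0.15, 350.0, 420.0)
```

With these illustrative numbers the regression method gives 350 W m-2 and the balance method 355 W m-2; agreement of this kind between the two estimates is what the study evaluates against observations.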