Comparison of dew point temperature estimation methods in Southwestern Georgia
Recent upward trends in irrigated acreage have been linked to increasing near-surface moisture. Unfortunately, stations with the dew point data needed to monitor near-surface moisture are sparse. Thus, models that estimate dew points from more readily observed data sources are useful. Daily average dew point temperatures were estimated and evaluated at 14 stations in Southwest Georgia using linear regression models and artificial neural networks (ANNs). Because the estimation methods were restricted to simple and readily available meteorological observations, only temperature and precipitation were considered as input variables. In total, three linear regression models and 27 ANNs were analyzed. The two approaches were evaluated using root mean square error (RMSE), mean absolute error (MAE), and other model evaluation techniques to assess their skill. Both methods produced adequate estimates of daily average dew point temperatures, with the ANNs displaying the best overall skill, and both performed best during the warm season. Both methods showed higher error for colder dew points, potentially due to the scarcity of observations in those ranges. On average, the ANN reduced RMSE by 6.86% and MAE by 8.30% compared with the best-performing linear regression model.
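The comparison workflow described above can be sketched in a few lines. The snippet below is a minimal, illustrative example only: the paper's actual regression forms, ANN architectures, station data, and train/test protocol are not given here, so the synthetic data, the scikit-learn model choices (LinearRegression, MLPRegressor), and the feature set are all assumptions made for demonstration.

```python
# Illustrative sketch of comparing a linear regression and an ANN for dew
# point estimation, scored with RMSE and MAE. All data below are synthetic;
# the study's real inputs were station observations of temperature and
# precipitation, which are not reproduced here.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_absolute_error, mean_squared_error

rng = np.random.default_rng(0)

# Hypothetical daily predictors: max/min air temperature (deg C), precip (mm).
n = 1000
tmax = rng.uniform(5, 35, n)
tmin = tmax - rng.uniform(2, 15, n)
precip = rng.exponential(2.0, n)
X = np.column_stack([tmax, tmin, precip])

# Hypothetical target: daily average dew point (deg C), loosely tied to Tmin.
y = tmin - rng.uniform(0, 4, n)

split = int(0.8 * n)
X_train, X_test = X[:split], X[split:]
y_train, y_test = y[:split], y[split:]

models = {
    "linear regression": LinearRegression(),
    "ANN (MLP)": MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000,
                              random_state=0),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    pred = model.predict(X_test)
    rmse = np.sqrt(mean_squared_error(y_test, pred))  # penalizes large errors
    mae = mean_absolute_error(y_test, pred)           # average absolute error
    print(f"{name}: RMSE = {rmse:.2f} C, MAE = {mae:.2f} C")
```

Because RMSE squares errors before averaging, it penalizes large misses more heavily than MAE; reporting both, as the study does, helps distinguish a method that is usually accurate but occasionally far off from one with uniformly moderate errors.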