The Standardized Runoff Index (SRI), one of the best-known hydrological drought indices, may contain uncertainty arising from the choice of distribution function, time scale, and record length of the statistical data. In this study, the uncertainty in SRI estimates computed from monthly discharge data of 30- and 49-year length from the Minab dam watershed, southern Iran, was investigated. Four probability distribution functions (gamma, Weibull, lognormal, and normal) were fitted to the cumulative discharge data at 3-, 6-, 9-, 12-, 24-, and 48-month time scales, and their goodness of fit and normality were evaluated with the Kolmogorov-Smirnov (K-S) test and a normality test, respectively. Using Monte Carlo sampling, 50,000 synthetic samples were generated for each event and each time scale, and a 95% confidence interval was constructed. The width of the confidence interval was taken as the measure of uncertainty, and the sources of uncertainty were examined with respect to these factors. The maximum uncertainty was associated with the normal and lognormal distributions, and the minimum with the gamma and Weibull distributions. Moreover, increasing either the time scale or the record length decreased the uncertainty.
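The Monte Carlo procedure described above can be sketched as follows. This is a minimal illustration, not the authors' code: the discharge record is synthetic, only the gamma distribution and the 3-month time scale are shown, and the replicate count is reduced from 50,000 for speed. SRI is computed in the standard way, by fitting a distribution to the aggregated discharge and mapping its CDF through the inverse standard normal.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical 49-year monthly discharge record (the study used Minab dam data).
record = rng.gamma(shape=2.0, scale=15.0, size=49 * 12)

def sri(aggregated, value):
    """SRI of `value`: fit a gamma distribution to the aggregated discharge,
    then map its CDF through the standard normal inverse."""
    a, loc, scale = stats.gamma.fit(aggregated, floc=0)
    p = stats.gamma.cdf(value, a, loc=loc, scale=scale)
    return stats.norm.ppf(p)

# 3-month time scale: moving sums of the monthly series.
scale_months = 3
agg = np.convolve(record, np.ones(scale_months), mode="valid")

event = agg[-1]        # event whose SRI uncertainty is assessed
n_rep = 2000           # the paper used 50,000; reduced here for speed

# Parametric Monte Carlo: resample records of the same length from the
# fitted distribution, refit each replicate, and recompute the event's SRI.
a, loc, sc = stats.gamma.fit(agg, floc=0)
sri_samples = [
    sri(stats.gamma.rvs(a, loc=loc, scale=sc, size=agg.size, random_state=rng),
        event)
    for _ in range(n_rep)
]

# Width of the 95% confidence interval serves as the uncertainty measure.
lo, hi = np.percentile(sri_samples, [2.5, 97.5])
print(f"SRI point estimate: {sri(agg, event):.2f}")
print(f"95% CI width (uncertainty): {hi - lo:.2f}")
```

Repeating this loop for each candidate distribution, time scale, and record length yields the comparison of CI widths on which the study's conclusions rest.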