In this study, a new empirical model is described for determining the required retention time (RRT) in hindered settling (sedimentation). The proposed model was developed computationally from 715 different settling scenarios. The predictions of the proposed approach were compared with the outputs of a MATLAB® algorithm based on the Talmadge and Fitch (1955) method. The relationship between the predicted values and the MATLAB® outputs was established using DataFit® software, and the RRT equation was rewritten as a function of interface height (H), interface time (T), initial sludge concentration (C_o), desired sludge concentration (C_u), height corresponding to the desired sludge concentration (H_u), and a new time parameter (φ). The proposed RRT model was then tested against 275 new settling scenarios as well as 89 sets of experimental data. The results showed that the proposed empirical model agreed closely with the MATLAB® outputs; the predictions were satisfactory, with correlation coefficients of about 0.928 for the new settling scenarios and 0.972 for the experimental data, respectively.
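The Talmadge and Fitch (1955) construction referenced above rests on a mass balance, H_u = C_o H_o / C_u, followed by intersecting the tangent at the compression (critical) point of the batch settling curve with the horizontal line H = H_u; the intersection gives the required retention time t_u. The Python sketch below illustrates that construction numerically under assumed settling data and an assumed critical-point index; it is not the authors' MATLAB® algorithm, and the function name, data, and parameter values are hypothetical.

```python
# Minimal numerical sketch of the Talmadge-Fitch (1955) construction.
# The settling data and the choice of critical point are illustrative
# assumptions; in practice the critical point is located graphically
# (e.g., by the bisection-of-tangents construction).
import numpy as np

def talmadge_fitch_rrt(t, H, C_o, C_u, i_crit):
    """Required retention time t_u from a batch settling curve.

    t, H     : interface time and interface height data (settling curve)
    C_o, C_u : initial and desired (underflow) sludge concentrations
    i_crit   : index of the assumed critical/compression point
    """
    H_o = H[0]
    # Mass balance: interface height at which the sludge reaches
    # the desired concentration C_u.
    H_u = C_o * H_o / C_u

    # Slope of the tangent at the critical point (central difference).
    slope = (H[i_crit + 1] - H[i_crit - 1]) / (t[i_crit + 1] - t[i_crit - 1])

    # Intersect the tangent H = H[i_crit] + slope * (t - t[i_crit])
    # with the horizontal line H = H_u; the intersection time is t_u.
    t_u = t[i_crit] + (H_u - H[i_crit]) / slope
    return t_u, H_u

# Illustrative settling data (time in min, interface height in cm).
t = np.array([0, 5, 10, 15, 20, 30, 45, 60, 90], dtype=float)
H = np.array([40, 33, 26, 20, 15, 10, 7, 6, 5.5], dtype=float)

t_u, H_u = talmadge_fitch_rrt(t, H, C_o=3.0, C_u=10.0, i_crit=5)
print(f"H_u = {H_u:.1f} cm, required retention time t_u = {t_u:.1f} min")
```

With the assumed data above, the mass balance gives H_u = 3.0 × 40 / 10.0 = 12 cm, and the tangent at the assumed critical point crosses that height at roughly t_u ≈ 23.8 min; the empirical model proposed in the study replaces this graphical procedure with a closed-form function of H, T, C_o, C_u, H_u, and φ.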