In a recent paper, Wang and Leblanc (Ann Inst Stat Math 60:883–900, 2008) showed that the second-order least squares estimator (SLSE) is more efficient than the ordinary least squares estimator (OLSE) when the errors are independent and identically distributed with nonzero third moments. In this paper, we generalize the theory of the SLSE to regression models with autocorrelated errors. Under certain regularity conditions, we establish the consistency and asymptotic normality of the proposed estimator and present a simulation study comparing its performance with that of the corresponding OLSE and generalized least squares estimator (GLSE). The simulations show that the SLSE performs well, yielding relatively small standard errors, bias, and mean squared errors when estimating the parameters of such regression models with autocorrelated errors. Based on our study, we conjecture that for weakly correlated data the standard errors of the SLSE lie between those of the OLSE and the GLSE, which suggests that incorporating second-moment information can improve an estimator's performance.
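To make the comparison concrete, the following is a minimal sketch (not the paper's actual simulation design) of an OLSE versus a simplified SLSE on a linear model with AR(1) errors. The SLSE here uses an identity weight matrix and jointly estimates the regression coefficients and the error variance from the first- and second-moment conditions; the sample size, AR(1) coefficient, and identity weighting are illustrative assumptions, since the efficient SLSE weight involves higher-order moments of the errors.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Simulate y = b0 + b1*x + e, where e follows a stationary AR(1) process
# (sample size, coefficients, and AR parameter are illustrative assumptions)
n, b_true, sigma2, phi = 500, np.array([1.0, 2.0]), 1.0, 0.3
x = rng.uniform(-1.0, 1.0, n)
e = np.empty(n)
e[0] = rng.normal(scale=np.sqrt(sigma2 / (1.0 - phi**2)))  # stationary start
for t in range(1, n):
    e[t] = phi * e[t - 1] + rng.normal(scale=np.sqrt(sigma2))
X = np.column_stack([np.ones(n), x])
y = X @ b_true + e

# Ordinary least squares estimator
b_ols = np.linalg.lstsq(X, y, rcond=None)[0]

# Simplified SLSE with identity weight: minimize the sum of squared
# moment residuals rho_i = (y_i - x_i'b, y_i^2 - (x_i'b)^2 - s2)
def slse_obj(theta):
    b, s2 = theta[:2], theta[2]
    m = X @ b
    r1 = y - m                 # first-moment residual
    r2 = y**2 - m**2 - s2      # second-moment residual
    return np.sum(r1**2 + r2**2)

res = minimize(slse_obj, x0=np.array([0.0, 0.0, 1.0]),
               method="Nelder-Mead",
               options={"maxiter": 5000, "xatol": 1e-8, "fatol": 1e-8})
b_slse, s2_slse = res.x[:2], res.x[2]
```

Both estimators recover the true coefficients on simulated data; repeating the simulation over many replications and tabulating the empirical standard errors and biases of the two estimators would mirror, in simplified form, the kind of comparison reported in the paper.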