Generalized least squares estimation of a system of seemingly unrelated regressions is usually a two-stage method: (1) estimation of the cross-equation covariance matrix from ordinary least squares residuals, which is used to transform the data, and (2) application of least squares to the transformed data. In the presence of multicollinearity, ridge regression is conventionally applied at stage 2. We investigate the use of ridge residuals at stage 1, and show analytically that the covariance matrix based on the least squares residuals does not always yield a more efficient estimator. A simulation study and an application to a system of firms' gross investment support our finding.
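The two-stage procedure described above can be sketched as follows. This is a minimal illustration on synthetic data, not the paper's implementation: the design matrices, the ridge penalty `k`, and all variable names are assumptions made for the example. Stage 1 is shown twice, once with ordinary least squares residuals and once with ridge residuals, which is the variant the abstract investigates.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Synthetic two-equation system with correlated cross-equation errors.
X1 = np.column_stack([np.ones(n), rng.normal(size=n)])
X2 = np.column_stack([np.ones(n), rng.normal(size=n)])
beta1, beta2 = np.array([1.0, 2.0]), np.array([-0.5, 1.5])
Sigma_true = np.array([[1.0, 0.6], [0.6, 1.0]])
E = rng.multivariate_normal([0.0, 0.0], Sigma_true, size=n)
y1 = X1 @ beta1 + E[:, 0]
y2 = X2 @ beta2 + E[:, 1]

# Stage 1 (conventional): equation-by-equation OLS residuals
# give the cross-equation covariance estimate.
b1_ols = np.linalg.lstsq(X1, y1, rcond=None)[0]
b2_ols = np.linalg.lstsq(X2, y2, rcond=None)[0]
R_ols = np.column_stack([y1 - X1 @ b1_ols, y2 - X2 @ b2_ols])
Sigma_ols = R_ols.T @ R_ols / n

# Stage 1 (ridge variant): residuals from ridge estimates,
# with an illustrative penalty k (its choice is not addressed here).
k = 0.1
b1_r = np.linalg.solve(X1.T @ X1 + k * np.eye(2), X1.T @ y1)
b2_r = np.linalg.solve(X2.T @ X2 + k * np.eye(2), X2.T @ y2)
R_ridge = np.column_stack([y1 - X1 @ b1_r, y2 - X2 @ b2_r])
Sigma_ridge = R_ridge.T @ R_ridge / n

# Stage 2: feasible GLS on the stacked system,
# using Omega = Sigma_hat (Kronecker) I_n to transform the data.
X = np.block([[X1, np.zeros_like(X2)], [np.zeros_like(X1), X2]])
y = np.concatenate([y1, y2])
Omega_inv = np.kron(np.linalg.inv(Sigma_ols), np.eye(n))
b_sur = np.linalg.solve(X.T @ Omega_inv @ X, X.T @ Omega_inv @ y)
```

Replacing `Sigma_ols` with `Sigma_ridge` in the stage-2 transformation gives the stage-1 ridge variant whose relative efficiency the abstract discusses.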