Statistical analysis frequently involves methods for reducing high-dimensional data to new variates of lower dimension, for example in order to assess distributional properties, identify hidden patterns, or perform discriminant analysis. In classical multivariate analysis such problems are usually addressed using either principal components (PC) or the Mahalanobis distance (MD). While the distributional properties of PCs are fairly well established in high-dimensional settings, no explicit results appear to be available for the MD in this regime. The purpose of this chapter is to bridge that gap by deriving weak limits for the MD in cases where the dimension of the random vector of interest is proportional to the sample size ((n, p)-asymptotics). The limiting distributions allow for normality-based inference in cases where the traditional low-dimensional approximations do not apply.
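For concreteness, the quantities referred to above may be taken in their standard forms; the display below is a sketch of these definitions (the chapter's own normalisation and assumptions on the limiting ratio may differ).
% Sketch of the standard sample Mahalanobis distance and the (n,p)-asymptotic
% regime assumed here; the exact formulation used in the chapter may differ.
\[
  D_i^2 = (x_i - \bar{x})^{\top} S^{-1} (x_i - \bar{x}),
  \qquad
  \bar{x} = \frac{1}{n}\sum_{j=1}^{n} x_j,
  \qquad
  S = \frac{1}{n-1}\sum_{j=1}^{n} (x_j - \bar{x})(x_j - \bar{x})^{\top},
\]
\[
  n, p \to \infty, \qquad \frac{p}{n} \to c,
\]
where, for instance, $c \in (0,1)$ so that the sample covariance matrix $S$ remains invertible.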