This paper presents a new method for solving rank-one multivariate regression problems, providing a solution that maximizes the sum of squared correlations between the one-dimensional fitted pattern and the target variates. The suitability of the method and the consistency of the estimator are formally proved and experimentally tested. In particular, it is shown that the estimate converges not only as a function of the number of items, but also as a function of the number of target variates. An equivalent conventional reduced-rank regression case is identified, which inherits the convergence properties of the new approach. Matlab/Octave application programs are provided, and numerical examples using both artificial and real data are presented.
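As a minimal numerical sketch of the criterion stated above, the following NumPy code finds a direction b such that the one-dimensional pattern Xb maximizes the sum of squared correlations with the columns of Y. This is an assumed formulation of the stated objective, not the paper's own algorithm (whose reference implementation is in Matlab/Octave); the function name `rank_one_corr_fit` is illustrative.

```python
import numpy as np

def rank_one_corr_fit(X, Y):
    """Find b maximizing sum_j corr(X b, Y[:, j])^2.

    Sketch under assumptions: centered-data formulation, reduced to an
    ordinary symmetric eigenproblem via a QR factorization of X.
    """
    Xc = X - X.mean(axis=0)                 # center predictors
    Yc = Y - Y.mean(axis=0)                 # center targets
    Yn = Yc / np.linalg.norm(Yc, axis=0)    # unit-norm centered targets
    Q, R = np.linalg.qr(Xc)                 # orthonormal basis of col(Xc)
    G = Q.T @ Yn                            # cross-products in that basis
    w, V = np.linalg.eigh(G @ G.T)          # eigenvalues in ascending order
    a = V[:, -1]                            # top eigenvector
    b = np.linalg.solve(R, a)               # map back to predictor space
    return b, w[-1]                         # direction and attained objective
```

The returned eigenvalue equals the attained sum of squared correlations, which is bounded above by the number of target variates.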