Optimization over the set of matrices X that satisfy X^T B X = I_p, known as the generalized Stiefel manifold, appears in many applications involving sampled covariance matrices, such as canonical correlation analysis (CCA), independent component analysis (ICA), and the generalized eigenvalue problem (GEVP). Solving these problems is typically done by iterative methods that require a fully formed B. We propose a cheap stochastic iterative method that solves the optimization problem while having access only to random estimates of B. Our method does not enforce the constraint in every iteration; instead, it produces iterates that converge to critical points on the generalized Stiefel manifold defined in expectation. The method has lower per-iteration cost, requires only matrix multiplications, and has the same convergence rates as its Riemannian optimization counterparts that require the full matrix B. Experiments demonstrate its effectiveness in various machine learning applications involving generalized orthogonality constraints, including CCA, ICA, and the GEVP.
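
To make the idea concrete, here is a minimal NumPy sketch of one such stochastic update. This is an illustrative assumption, not the paper's exact algorithm: the update combines a gradient step with the gradient of the penalty (1/4)||X^T B_k X - I_p||_F^2, which pulls the iterate toward the constraint set using only a sampled estimate B_k of B and plain matrix multiplications. The function name `stochastic_step` and the parameters `eta` and `lam` are hypothetical.

```python
import numpy as np

def stochastic_step(X, B_k, grad_f, eta=1e-2, lam=1.0):
    """One illustrative update of X from a sampled estimate B_k of B.

    X      : (n, p) current iterate
    B_k    : (n, n) symmetric random estimate of B (e.g., a minibatch covariance)
    grad_f : (n, p) stochastic Euclidean gradient of the objective at X
    eta    : step size (hypothetical value)
    lam    : weight of the constraint-attracting term (hypothetical value)
    """
    p = X.shape[1]
    # Constraint residual X^T B_k X - I_p, computed with matrix products only.
    R = X.T @ (B_k @ X) - np.eye(p)
    # Gradient of the penalty (1/4)||X^T B_k X - I_p||_F^2 for symmetric B_k;
    # this nudges X toward the constraint set instead of enforcing it exactly.
    N = B_k @ X @ R
    return X - eta * (grad_f + lam * N)
```

For instance, in a GEVP-style problem with objective f(X) = -tr(X^T A X), one could pass grad_f = -2 * A_k @ X for a sampled estimate A_k of A, so the whole iteration consists of matrix multiplications without ever forming or factorizing the full B.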