I. Random subspace with trees is an ensemble method for feature selection under memory constraints, in which each tree is grown from a random subset of features of size q.
II. Sequential random subspace improves upon random subspace by filling part of each random subset with previously identified relevant features, speeding up convergence while maintaining asymptotic guarantees.
III. Experiments on benchmark datasets show that sequential random subspace ranks features more accurately and achieves better prediction performance than random subspace, especially in the presence of many irrelevant features.
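The sequential idea in highlight II can be illustrated with a minimal sketch. This is not the authors' exact algorithm (which builds on tree ensembles with their own importance scoring and convergence analysis); it is an illustrative loop, assuming scikit-learn's `DecisionTreeClassifier` and its impurity-based `feature_importances_` as a stand-in relevance measure, with the subset size `q`, the keep fraction, and the score accumulation all chosen for demonstration only:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Toy data: only the first 5 of 50 features carry signal; the rest are noise.
n, p, q = 300, 50, 10
X = rng.normal(size=(n, p))
y = (X[:, :5].sum(axis=1) > 0).astype(int)

scores = np.zeros(p)  # accumulated importance per feature
for t in range(200):
    # Sequential twist: part of the size-q subset is filled with the
    # currently top-ranked features; the rest is drawn uniformly at random.
    keep = [j for j in np.argsort(scores)[::-1][:q // 2] if scores[j] > 0]
    pool = [j for j in range(p) if j not in keep]
    subset = keep + list(rng.choice(pool, size=q - len(keep), replace=False))

    # Grow one tree on the selected feature subset and credit each feature
    # with the importance the tree assigns to it.
    tree = DecisionTreeClassifier(random_state=t)
    tree.fit(X[:, subset], y)
    scores[subset] += tree.feature_importances_

# The truly relevant features should dominate the final ranking.
top5 = set(int(j) for j in np.argsort(scores)[::-1][:5])
print("top-ranked features:", sorted(top5))
```

Because relevant features are re-injected into later subsets once discovered, they accumulate importance at every iteration, while irrelevant features are only sampled occasionally; this is the mechanism behind the faster convergence claimed in highlight II.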