(1) The document proposes Conure, a method for learning continual user representations across a sequence of tasks without forgetting earlier ones. (2) Conure mitigates catastrophic forgetting by exploiting over-parameterization: after each task it prunes unimportant parameters, freeing them for later tasks, while preserving the important parameters that encode the knowledge already learned. (3) Experiments on recommendation datasets show that Conure outperforms lifelong-learning and multi-task-learning baselines by transferring knowledge across tasks.
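
The pruning-based parameter isolation described above can be illustrated with a minimal sketch. This is not the paper's implementation; the function names, the magnitude-based importance criterion, and the 40% prune ratio are illustrative assumptions:

```python
# Minimal sketch of magnitude-based pruning for parameter isolation
# (illustrative only; names and ratios are not from the Conure paper).

def prune_mask(weights, prune_ratio):
    """Return a keep-mask: True for weights important enough to keep
    for the current task, False for weights freed for future tasks.
    Importance here is approximated by weight magnitude (an assumption)."""
    ranked = sorted(abs(w) for w in weights)
    cutoff = ranked[int(len(ranked) * prune_ratio)]
    return [abs(w) >= cutoff for w in weights]

def apply_gradients(weights, grads, keep_mask, lr=0.1):
    """Update only the freed (pruned) slots; kept weights stay fixed,
    so knowledge from earlier tasks is not overwritten."""
    return [w if kept else w - lr * g
            for w, g, kept in zip(weights, grads, keep_mask)]

weights = [0.9, -0.05, 0.4, 0.01, -0.7]
mask = prune_mask(weights, prune_ratio=0.4)   # free the smallest 40%
# Gradients from a later task touch only the freed positions.
new_w = apply_gradients(weights, [1.0] * len(weights), mask)
```

The key design point is that the keep-mask makes knowledge retention explicit: positions marked as kept are never updated by later tasks, so earlier-task behavior is preserved exactly rather than approximately.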