Nguyen Thanh Sang presented a paper proposing GraphMixer, a deliberately simple model for temporal link prediction. Despite its simplicity, GraphMixer outperforms more complex baselines built on RNNs and self-attention networks. It combines a fixed (non-trainable) time-encoding function with an MLP-Mixer that summarizes the features of a node's recent temporal links, avoiding recurrent and attention-based architectures entirely. Experiments on five datasets show that GraphMixer converges faster and generalizes better than these baselines. Its success suggests that complex neural architectures and heavy data processing may not always be needed for temporal network tasks.
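As a minimal sketch of the fixed time-encoding idea: GraphMixer maps a timestamp t to a feature vector z(t) = cos(t·ω) with fixed (non-trainable) frequencies ω. The specific choice ω_i = α^(−i/β) with α = β = √d below follows the configuration reported in the paper; the dimension d = 100 here is an illustrative assumption, not a prescribed value.

```python
import numpy as np

def fixed_time_encoding(t, d=100):
    """Fixed (non-trainable) time encoding z(t) = cos(t * omega).

    omega_i = alpha^(-i / beta), with alpha = beta = sqrt(d) as in the
    GraphMixer paper; d is an illustrative choice of feature dimension.
    """
    alpha = beta = np.sqrt(d)
    omega = alpha ** (-np.arange(d) / beta)  # geometrically decaying frequencies
    return np.cos(t * omega)

# Encodings of nearby timestamps stay similar, while distant timestamps
# decorrelate, giving the downstream MLP-Mixer a stable notion of recency.
z0 = fixed_time_encoding(0.0)   # cos(0) = 1 in every coordinate
z1 = fixed_time_encoding(1.0)
```

Because the frequencies are fixed rather than learned, this component adds no parameters to optimize, which the paper links to GraphMixer's fast, stable convergence.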