This technical report describes research into solving a two-layer linear diffusion equation on a GPU. It first reviews GPU hardware and programming models. It then explains how to solve a one-layer diffusion equation using LU factorization, and how to parallelize that solve on a GPU using recursive doubling. Finally, it describes how to model diffusion across the interface between the two layers, and presents results from implementing and comparing the CPU and GPU solvers.
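As context for the LU-based approach mentioned above, the sketch below is a minimal serial illustration (not the report's implementation): an implicit backward-Euler step of a one-layer 1D diffusion equation reduces to a tridiagonal linear system, which the Thomas algorithm solves via an LU factorization sweep. The function names, the uniform grid, and the zero-Dirichlet boundaries are assumptions for illustration; the GPU version would replace the inherently sequential sweeps with recursive doubling.

```python
import numpy as np

def thomas_solve(a, b, c, d):
    """Solve a tridiagonal system by LU factorization (Thomas algorithm).

    a: sub-diagonal (length n-1), b: diagonal (length n),
    c: super-diagonal (length n-1), d: right-hand side (length n).
    """
    n = len(b)
    cp = np.empty(n - 1)  # modified super-diagonal from the forward (L) sweep
    dp = np.empty(n)      # modified right-hand side
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    # Forward elimination: factor A = LU and apply L^{-1} to d.
    for i in range(1, n):
        denom = b[i] - a[i - 1] * cp[i - 1]
        if i < n - 1:
            cp[i] = c[i] / denom
        dp[i] = (d[i] - a[i - 1] * dp[i - 1]) / denom
    # Back substitution: solve U x = L^{-1} d.
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def diffusion_step(u, r):
    """One backward-Euler step of u_t = D u_xx on a uniform grid with
    zero-Dirichlet boundaries, where r = D*dt/dx**2 (hypothetical setup).

    The implicit step solves (I - r*L) u_new = u_old, a tridiagonal system
    with diagonal 1 + 2r and off-diagonals -r.
    """
    n = len(u)
    a = np.full(n - 1, -r)
    b = np.full(n, 1.0 + 2.0 * r)
    c = np.full(n - 1, -r)
    return thomas_solve(a, b, c, u)
```

Each of the two sweeps in `thomas_solve` is a first-order linear recurrence, which is exactly the structure that recursive doubling parallelizes on a GPU.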