We propose a method for simulating non-isothermal flows that is well suited to the GPU architecture. The algorithm is derived by coupling a lattice Boltzmann formulation for the flow field with a finite-difference scheme for the temperature field. We apply it to the well-known buoyancy-driven cavity problem and validate both the algorithm and its GPU implementation by benchmarking the resulting thermal flow patterns against known results. We discuss the GPU implementation in detail to bring out the inherent advantage of GPU hardware for such data-parallel applications.
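To make the coupling concrete, the following is a minimal sketch of one time step of a hybrid solver of the kind described above: a D2Q9 BGK lattice Boltzmann update for the flow, an explicit central finite-difference (FTCS) update for the temperature, and a Boussinesq-type buoyancy force feeding the temperature back into the flow. All parameters (relaxation time `tau_f`, thermal diffusivity `alpha`, expansion coefficient `beta`, gravity `g`) and the shifted-velocity forcing are illustrative assumptions, not the authors' scheme; periodic boundaries are used for brevity where a cavity would use bounce-back walls. NumPy's whole-array operations mirror the data-parallel structure that maps naturally onto GPU kernels.

```python
import numpy as np

# D2Q9 lattice: discrete velocities and weights
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9] + [1/9]*4 + [1/36]*4)

def equilibrium(rho, ux, uy):
    """Second-order Maxwellian equilibrium for D2Q9."""
    cu = 3.0 * (c[:, 0, None, None]*ux + c[:, 1, None, None]*uy)
    usq = 1.5 * (ux**2 + uy**2)
    return rho * w[:, None, None] * (1.0 + cu + 0.5*cu**2 - usq)

def step(f, T, tau_f, alpha, beta, g):
    """One coupled LBM-flow / FD-temperature time step (sketch)."""
    # Macroscopic moments from the distributions
    rho = f.sum(axis=0)
    ux = (f * c[:, 0, None, None]).sum(axis=0) / rho
    uy = (f * c[:, 1, None, None]).sum(axis=0) / rho
    # Boussinesq buoyancy along y (hypothetical parameters)
    Fy = beta * g * (T - T.mean())
    uy_forced = uy + tau_f * Fy / rho      # shifted-velocity forcing
    # BGK collision toward the forced equilibrium
    f = f - (f - equilibrium(rho, ux, uy_forced)) / tau_f
    # Streaming (periodic here; a cavity would apply bounce-back)
    for i in range(9):
        f[i] = np.roll(np.roll(f[i], c[i, 0], axis=0), c[i, 1], axis=1)
    # Explicit FTCS finite-difference update for the temperature:
    # dT/dt + u.grad(T) = alpha * laplacian(T)
    Tx = (np.roll(T, -1, 0) - np.roll(T, 1, 0)) / 2.0
    Ty = (np.roll(T, -1, 1) - np.roll(T, 1, 1)) / 2.0
    lap = (np.roll(T, -1, 0) + np.roll(T, 1, 0)
           + np.roll(T, -1, 1) + np.roll(T, 1, 1) - 4.0*T)
    T = T + alpha*lap - ux*Tx - uy*Ty
    return f, T
```

Because each lattice site is updated independently from its neighbors' previous values, every line above translates directly into a per-site GPU kernel, which is the data-parallel structure the abstract alludes to.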