Causal structure learning algorithms have focused on learning in "batch-mode": i.e., when the full dataset is available from the start. In many domains, however, it is important to learn in an online fashion from sequential or ordered data, whether because of memory storage constraints or because the underlying causal structure may change over the course of learning. In this paper, we present TDSL, a novel causal structure learning algorithm that processes data sequentially. This algorithm can track changes in the generating causal structure or parameters, and requires significantly less memory in realistic settings. We show by simulation that the algorithm performs comparably to batch-mode learning when the causal structure is stationary, and significantly better in non-stationary environments.
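To make the contrast with batch-mode learning concrete, the following is a minimal, hypothetical sketch (not the TDSL algorithm itself) of sequential sufficient-statistic updating: an exponentially weighted running mean and covariance. Memory stays O(d^2) in the number of variables regardless of stream length, and the forgetting factor `alpha` (an assumed illustrative parameter) down-weights old samples so the estimates can track a non-stationary generating process.

```python
import numpy as np

class OnlineCovariance:
    """Exponentially weighted running mean/covariance of a data stream.

    Illustrative sketch only -- not the TDSL algorithm. Memory use is
    fixed at O(d^2) for d variables, independent of how many samples
    arrive, and the forgetting factor lets the estimates adapt when
    the generating process changes.
    """

    def __init__(self, n_vars, alpha=0.02):
        self.alpha = alpha                     # weight on each new sample
        self.mean = np.zeros(n_vars)
        self.cov = np.zeros((n_vars, n_vars))

    def update(self, x):
        x = np.asarray(x, dtype=float)
        delta = x - self.mean
        incr = self.alpha * delta
        self.mean += incr
        # Standard exponentially weighted covariance recursion:
        # cov <- (1 - alpha) * (cov + alpha * delta delta^T)
        self.cov = (1.0 - self.alpha) * (self.cov + np.outer(delta, incr))
```

A constraint- or score-based structure learner could then be re-run over these statistics whenever they drift, rather than storing and reprocessing the full dataset.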