The discipline of seismology is built on observations of ground motion that are inherently undersampled in space and time. Our basic understanding of earthquake processes and our ability to resolve 4D Earth structure are fundamentally limited by data volume. Today, Big Data Seismology is an emergent revolution in which large, data-dense modes of inquiry are providing new opportunities for fundamental advances in these areas. This article reviews recent scientific advances enabled by Big Data Seismology in the context of three major drivers: the development of new data-dense sensor systems, improvements in computing, and the development of new techniques and algorithms. Each driver is explored in the context of both global and exploration seismology, alongside collaborative opportunities that combine the long-duration data collections common to global seismology with the dense sensor networks common to exploration seismology. The review explores some of the unique challenges and opportunities that Big Data Seismology presents, drawing on parallels from other fields facing similar issues. Finally, recent scientific findings enabled by dense seismic data sets are discussed, and we assess the opportunities for significant advances made possible by Big Data Seismology. This review is designed as a primer for seismologists who want to get up to speed on how the Big Data revolution is advancing the field.