Continuous electrocardiogram (ECG) recordings, which measure the electrical activity of the heart, are increasingly used to predict clinical outcomes via machine learning and artificial intelligence [1-3]. Raw ECG data can contain a variety of interference artifacts that, if left unaddressed, can obscure or invalidate the results of machine learning algorithms. Current ECG de-noising routines focus primarily on purifying ECG signals rather than on classifying and excising regions of unusable voltage data. Here we demonstrate a new method to quantitatively assess the quality of an ECG. We found that fitting a Fourier-series approximation to the QRS complex and cross-correlating this kernel sequentially with two-second intervals (“epochs”) of the lead II ECG yields a metric that distinguishes low-quality from high-quality sections of the recording. The algorithm was developed on independently annotated two-second epochs from ten patients admitted to the Mount Sinai Hospital. On 1,000 test epochs, it was 99% sensitive and 98.9% specific in classifying each epoch as either noise or signal. This algorithm can accelerate machine learning modeling that uses ECG data and lower the burden of pre-processing in research. Given its memory and run-time efficiency (demonstrably O(n)), this routine can be an effective starting point for real-time, continuous evaluation of ECG data using machine learning.
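
To make the described pipeline concrete, the sketch below shows one plausible Python implementation of its three stages: building a QRS-like kernel from a truncated Fourier series, cross-correlating it with successive two-second epochs, and thresholding the resulting metric into signal versus noise. The sampling rate, kernel width, harmonic count, and decision threshold are all illustrative assumptions; the paper's fitted coefficients and cutoff are not given in this abstract.

```python
import numpy as np

FS = 250      # sampling rate in Hz -- assumed for illustration, not stated above
EPOCH_S = 2   # epoch length in seconds, per the abstract


def qrs_kernel(fs=FS, width_s=0.1, n_harmonics=5):
    """Build a QRS-like template from a truncated Fourier (cosine) series.

    The width and harmonic count are placeholders, not the paper's
    fitted coefficients.
    """
    t = np.linspace(-0.5, 0.5, int(width_s * fs))
    kernel = sum(np.cos(2 * np.pi * k * t) for k in range(1, n_harmonics + 1))
    kernel *= 0.5 * (1.0 + np.cos(2 * np.pi * t))  # taper to zero at the edges
    return kernel / np.linalg.norm(kernel)


def epoch_quality(epoch, kernel):
    """Peak magnitude of an approximately normalized cross-correlation."""
    epoch = (epoch - epoch.mean()) / (epoch.std() + 1e-12)
    xcorr = np.correlate(epoch, kernel, mode="valid") / np.sqrt(len(kernel))
    return np.abs(xcorr).max()


def classify_epochs(ecg, fs=FS, threshold=0.5):
    """Label each two-second epoch as signal (True) or noise (False).

    The threshold is a placeholder; the paper derives its cutoff from
    independently annotated training data.
    """
    kernel = qrs_kernel(fs)
    n = int(EPOCH_S * fs)
    return [epoch_quality(ecg[i:i + n], kernel) >= threshold
            for i in range(0, len(ecg) - n + 1, n)]
```

Because the kernel length is fixed, each epoch costs a constant amount of work and the routine makes a single pass over the recording, consistent with the O(n) memory and run-time behavior claimed above.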