In this paper, a machine learning algorithm is used to investigate the importance of particular setpoints and parameters in the filtration processes of a large-scale water treatment facility. A model of the filtration process based on Run-to-Run Control was previously proposed and tested against sample data from the treatment plant, but it proved unsuitable for computing operating setpoints that minimize the energy cost of running the filtration systems. The machine learning model described herein is an attempt to characterize the available data on the filtration systems and to identify the variables that most strongly influence the filtration run time.