Gravitational-wave detection has opened new avenues for exploring fundamental principles of the universe. The optimal method for detecting modelled gravitational-wave events is template-based matched filtering, followed by a multi-detector search over the resulting signal-to-noise ratio (SNR) time series. In recent years, advances in machine learning and deep learning have prompted a flurry of research into using these techniques to replace matched-filtering searches and to perform efficient and robust parameter estimation. This paper presents a novel approach that applies deep learning to the SNR time series produced by matched filtering in order to detect gravitational waves. We investigate whether an efficient deep-learning model could replace the computationally expensive post-processing stage of current search pipelines. We present a feasibility study in which we detect gravitational waves from binary black hole mergers in simulated stationary Gaussian noise for the LIGO detector in Hanford, Washington. We show that our model matches the performance of a single-detector matched-filtering search, and that the ranking statistic derived from its output is robust on unseen noise, a promising result for practical online implementation in the future. We discuss the implications of this work and its future applications to gravitational-wave detection.
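As a minimal illustration of the matched-filter SNR time series described above (not the paper's actual pipeline), the sketch below correlates simulated stationary Gaussian white noise against a unit-norm template and recovers an injected signal. The sine-Gaussian template, sample rate, and injection amplitude are all illustrative stand-ins; a real search would use physically modelled binary-black-hole waveforms and coloured detector noise.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 4096            # sample rate in Hz (illustrative)
n = 8 * fs           # 8 s of simulated strain data

# Hypothetical "template": a 1 s sine-Gaussian standing in for a BBH waveform.
t = np.arange(fs) / fs
template = np.sin(2 * np.pi * 100 * t) * np.exp(-((t - 0.5) ** 2) / 0.01)
template /= np.sqrt(np.sum(template ** 2))    # normalize to unit norm

# Stationary Gaussian (white, unit-variance) noise with an injected signal.
data = rng.normal(size=n)
inj_idx = 3 * fs
data[inj_idx:inj_idx + fs] += 8.0 * template  # injection at nominal SNR ~8

# Matched-filter SNR time series: correlation of data with the unit-norm
# template. For unit-variance white noise this correlation is the SNR.
snr = np.abs(np.correlate(data, template, mode="valid"))

peak = int(np.argmax(snr))
print(f"peak at sample {peak}, SNR {snr[peak]:.2f}")
```

A search pipeline would then threshold and cluster this time series (the post-processing stage the paper proposes to replace with a deep-learning model), rather than simply taking the single loudest sample as done here.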