Sign language is used primarily by deaf and hard-of-hearing individuals to communicate within and beyond their communities. Because effective communication is essential in today's interconnected world, most countries have developed their own variant of sign language, which serves as a bridge between hearing and deaf individuals. Computer-based automated recognition of sign language is important because it can support learning different sign languages and enable automated translation. Several sign language detection systems already exist, but they are difficult to adopt in daily life because they are not designed for challenging scenarios such as nighttime or low-light conditions. In this research, we propose an optimized pipeline for sign language detection under low-light conditions using thermal images of sign language gestures. Thermal imaging captures the heat patterns emitted by hand gestures, which makes feature extraction straightforward even in the absence of visible light. For optimization, we employ Larq, an open-source Python library for training binarized neural networks (BNNs), which are computationally efficient and inexpensive to run; these properties are crucial for real-time recognition and accessibility. For comparison, we also evaluate traditional full-precision models under the same conditions. The experiments are conducted on a dataset of low-resolution thermal images of sign language digits. The results reveal that the binarized model is competitive with traditional models on standard performance metrics while maintaining low computational requirements. These findings lay the groundwork for efficient and accessible sign language recognition solutions and highlight the research needed to advance BNNs.