In the medical domain, the detection and measurement of pain are crucial for designing effective pain management interventions that alleviate suffering and prevent functional decline. Traditional methods of pain identification rely on verbal patient reports or observable signs, which can be imprecise or subjective owing to variability in patient self-assessments. Hence, there is a pressing need for automated methods to standardize pain measurement procedures. This paper proposes a novel automated pain assessment system that uses facial expressions as a behavioral indicator of pain. Our system integrates a fusion structure comprising Convolutional Neural Networks (CNNs) and Tensor-based extended Quantitative Expression Descriptors Analysis (TXQEDA). This approach enables joint learning of robust pain-related facial expression features from raw facial images by combining RGB appearance with a shape-based latent representation. We extensively evaluated the proposed model on the UNBC-McMaster dataset for pain classification and intensity estimation. The results demonstrate the efficacy of the proposed technique, which achieves an accuracy of 99.10% for pain level classification. Furthermore, the system performs strongly on pain intensity estimation, showing its potential to provide precise and reliable assessments of pain severity. Overall, this research underscores the importance of automated pain assessment systems in improving patient care and clinical outcomes. By leveraging advanced deep learning techniques and fusion strategies, the proposed model offers a promising approach for standardizing pain measurement procedures and enhancing the quality of pain management interventions.
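The core fusion idea described above, combining an appearance stream with a shape-based stream before classification, can be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' actual CNN/TXQEDA pipeline: the feature extractors, landmark count, feature dimensions, and the linear softmax head are all hypothetical placeholders standing in for the learned components.

```python
import numpy as np

rng = np.random.default_rng(0)

def appearance_features(rgb_image):
    # Placeholder for the CNN appearance stream: per-channel mean/std
    # statistics stand in for learned RGB appearance features.
    return np.concatenate([rgb_image.mean(axis=(0, 1)),
                           rgb_image.std(axis=(0, 1))])   # shape (6,)

def shape_features(landmarks):
    # Placeholder for a shape-based latent representation: pairwise
    # distances between facial landmarks as a crude geometric descriptor.
    diffs = landmarks[:, None, :] - landmarks[None, :, :]
    dists = np.sqrt((diffs ** 2).sum(-1))
    return dists[np.triu_indices(len(landmarks), k=1)]

def fuse_and_classify(rgb_image, landmarks, W, b):
    # Feature-level fusion: concatenate both streams, then apply a
    # linear classifier head with softmax over pain levels.
    fused = np.concatenate([appearance_features(rgb_image),
                            shape_features(landmarks)])
    logits = fused @ W + b
    exp = np.exp(logits - logits.max())
    return exp / exp.sum()

# Toy inputs: a random 8x8 RGB image and 5 facial landmarks.
image = rng.random((8, 8, 3))
landmarks = rng.random((5, 2))
n_features = 6 + 10          # 6 appearance + C(5,2)=10 shape features
n_classes = 4                # hypothetical number of pain levels
W = rng.standard_normal((n_features, n_classes)) * 0.1
b = np.zeros(n_classes)

probs = fuse_and_classify(image, landmarks, W, b)
print(probs.shape, probs.sum())
```

The design choice illustrated here is feature-level (early) fusion: both modalities contribute to a single joint representation before the classifier, which is what allows pain-related appearance and shape cues to be learned jointly rather than combined only at the decision stage.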