Identification and measurement of pain are critical for designing effective pain management interventions that alleviate suffering and prevent functional decline in clinical settings. Traditional methods of pain identification rely on patients' verbal reports or observable signs; however, these approaches can be imprecise and subjective because of variability in patient self-assessment. There is therefore a pressing need for automated methods that standardize pain measurement. In this paper, we propose a novel facial-expression-based automated
pain assessment system that uses facial expression as a behavioral indicator of pain. Our system integrates a fusion architecture comprising Convolutional Neural Networks (CNNs) and Tensor-based extended Quantitative Expression Descriptors Analysis (TXQEDA). This approach enables the joint learning of robust pain-related facial expression features from raw facial images by combining RGB appearance with shape-based latent representations. We extensively evaluated the proposed model on the UNBC-McMaster dataset for both pain classification and pain intensity estimation. The results demonstrate the efficacy of the proposed technique, which achieves an accuracy of 98.80% for pain level classification. The system also performs strongly on pain intensity estimation, indicating its potential to provide precise and reliable assessments of pain severity. Overall, our research underscores the value of automated pain assessment systems for improving patient care and clinical outcomes. By leveraging advanced deep learning techniques and fusion strategies, the proposed model offers a promising approach to standardizing pain measurement and enhancing the quality of pain management interventions.
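The fusion idea described above can be illustrated schematically. The sketch below is a minimal, hypothetical stand-in (not the paper's actual CNN or TXQEDA implementation): two randomly initialized feature extractors play the roles of the appearance and shape branches, their outputs are concatenated, and a linear classifier produces pain-level predictions. All dimensions, function names, and the landmark-based shape input are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes -- the paper does not specify these.
N_APPEARANCE = 128  # size of CNN appearance features (assumed)
N_SHAPE = 64        # size of shape-descriptor features (assumed)
N_CLASSES = 4       # number of pain-level classes (assumed)

def extract_appearance_features(images):
    """Stand-in for the CNN appearance branch: RGB images -> feature vectors."""
    flat = images.reshape(images.shape[0], -1)
    W = rng.standard_normal((flat.shape[1], N_APPEARANCE)) * 0.01
    return np.maximum(flat @ W, 0.0)  # ReLU-like nonlinearity

def extract_shape_features(landmarks):
    """Stand-in for a shape-based (TXQEDA-like) branch over facial landmarks."""
    flat = landmarks.reshape(landmarks.shape[0], -1)
    W = rng.standard_normal((flat.shape[1], N_SHAPE)) * 0.01
    return np.maximum(flat @ W, 0.0)

def fuse_and_classify(images, landmarks):
    """Late fusion: concatenate both feature streams, then classify linearly."""
    fused = np.concatenate(
        [extract_appearance_features(images), extract_shape_features(landmarks)],
        axis=1,
    )  # shape: (batch, N_APPEARANCE + N_SHAPE)
    W_cls = rng.standard_normal((fused.shape[1], N_CLASSES)) * 0.01
    logits = fused @ W_cls
    return logits.argmax(axis=1)

# Toy batch: 8 RGB face crops (48x48) and 68 two-dimensional landmarks each.
images = rng.random((8, 48, 48, 3))
landmarks = rng.random((8, 68, 2))
preds = fuse_and_classify(images, landmarks)
print(preds.shape)
```

The key design point this sketch captures is that fusion happens at the feature level: each modality is encoded independently, and only the concatenated representation is passed to the classifier.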