The Semantic Dependency Tree-LSTM (SDT-LSTM) network marks a significant advance over traditional Tree-structured Long Short-Term Memory (LSTM) networks: by incorporating typed grammatical dependencies, it captures aspects of sentence semantics that standard models miss. Conventional architectures often overlook the semantic shift caused by a change in a word's or phrase's grammatical role; this paper addresses that gap by explicitly modeling the types of grammatical connections, or typed dependencies, within a sentence. We first propose the Semantic Relationship-Guided LSTM (SRG-LSTM), which uses a gating (control) mechanism to model the interaction between sequence elements. Building on it, we present the SDT-LSTM, a Tree-LSTM variant that combines dependency parse structure with dependency types to produce more robust sentence embeddings. The SDT-LSTM outperforms its predecessors on semantic relatedness scoring and sentiment analysis. Qualitatively, it is robust to changes in sentence voice and more sensitive to nominal alterations, in line with human intuition. These results underline the pivotal role of grammatical relationships in sentence understanding and pave the way for further exploration in this domain.
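The abstract gives no equations, but the core idea can be illustrated with a minimal sketch of a child-sum Tree-LSTM cell whose forget gates are conditioned on the dependency relation linking each child to its head. All names, dimensions, and the exact gating scheme here are assumptions for illustration, not the paper's actual formulation:

```python
import numpy as np

rng = np.random.default_rng(0)

D = 4  # word-embedding size (hypothetical)
H = 5  # hidden-state size (hypothetical)
DEP_TYPES = ["nsubj", "dobj", "det"]  # a few example dependency relations

def init(shape):
    return rng.normal(0.0, 0.1, size=shape)

# Shared input/output/update gate weights, plus one forget-gate recurrence
# matrix *per dependency type* -- the typed-dependency idea: the gate that
# filters a child's memory depends on the grammatical relation to its head.
W = {g: init((H, D)) for g in "iou"}
U = {g: init((H, H)) for g in "iou"}
b = {g: np.zeros(H) for g in "iou"}
Uf = {rel: init((H, H)) for rel in DEP_TYPES}
Wf, bf = init((H, D)), np.zeros(H)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def typed_treelstm_node(x, children):
    """x: head-word embedding; children: list of (relation, h, c) tuples."""
    h_sum = sum((h for _, h, _ in children), np.zeros(H))
    i = sigmoid(W["i"] @ x + U["i"] @ h_sum + b["i"])
    o = sigmoid(W["o"] @ x + U["o"] @ h_sum + b["o"])
    u = np.tanh(W["u"] @ x + U["u"] @ h_sum + b["u"])
    c = i * u
    # Typed forget gate: each child's memory is gated by weights selected
    # according to its dependency relation.
    for rel, h_k, c_k in children:
        f_k = sigmoid(Wf @ x + Uf[rel] @ h_k + bf)
        c = c + f_k * c_k
    h = o * np.tanh(c)
    return h, c

# A toy tree for "dogs chase cats": head "chase" with nsubj/dobj children.
h_subj, c_subj = typed_treelstm_node(rng.normal(size=D), [])
h_obj, c_obj = typed_treelstm_node(rng.normal(size=D), [])
x_root = rng.normal(size=D)
h_root, c_root = typed_treelstm_node(
    x_root,
    [("nsubj", h_subj, c_subj), ("dobj", h_obj, c_obj)],
)
```

Because the forget-gate weights differ per relation, swapping which child is the subject and which is the object changes the root embedding, which is one way such a model could register the role-driven semantic shifts the abstract describes.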