Preprint Article, Version 1 (this version is not peer-reviewed)

Leveraging Large Language Models to Enhance an Intelligent Agent with Multifaceted Capabilities

Version 1 : Received: 18 September 2024 / Approved: 18 September 2024 / Online: 19 September 2024 (10:01:58 CEST)

How to cite: Thottempudi, S. G.; Borra, S. Leveraging Large Language Models to Enhance an Intelligent Agent with Multifaceted Capabilities. Preprints 2024, 2024091446. https://doi.org/10.20944/preprints202409.1446.v1

Abstract

This project aims to create an AI-integrated virtual assistant that improves Siemens Energy's internal processes. Building on cloud technologies, a microservice architecture, and large language models (LLMs), it seeks to deliver a reliable, effective, and user-friendly assistant tailored to Siemens Energy's requirements. The principal business challenge identified by the study was the time engineers spent searching for information across large volumes of company documents. The proposed virtual assistant responds with precision and context awareness to optimize productivity. It is built on a microservice architecture to ensure scalability, flexibility, and integration across a variety of use cases, so that tasks such as document retrieval, translation, summarization, and comparison can be handled effectively. The backend is deployed on Amazon Web Services (AWS) for cost-effectiveness and scalability, and is paired with a frontend designed for natural user interaction. To increase precision and relevance, the system employs state-of-the-art techniques such as vector databases and Retrieval-Augmented Generation (RAG). The assistant speeds up document management procedures, improves data accessibility, and reduces search time. The results highlight how it can enhance workflow efficiency for Siemens Energy engineers and how readily it can be adapted to future AI-driven applications.
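
The following is a minimal sketch of the Retrieval-Augmented Generation pattern the abstract refers to, not the authors' implementation: document chunks are embedded into vectors, kept in a simple in-memory index (standing in for a vector database), and the chunks most similar to a query are retrieved to ground the LLM prompt. The embed function and the sample documents are illustrative placeholders.

import numpy as np

def embed(text: str, dim: int = 64) -> np.ndarray:
    """Toy deterministic embedding; a stand-in for a real embedding model."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    vec = rng.standard_normal(dim)
    return vec / np.linalg.norm(vec)

# "Vector database": document chunks and their embeddings held in memory.
documents = [
    "Turbine maintenance schedule for unit A, revised 2023.",
    "Safety procedure for high-voltage switchgear inspection.",
    "Summary of grid integration requirements for offshore wind.",
]
index = np.stack([embed(d) for d in documents])

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k chunks most similar to the query (cosine similarity)."""
    scores = index @ embed(query)
    top = np.argsort(scores)[::-1][:k]
    return [documents[i] for i in top]

# The retrieved chunks would be inserted into the LLM prompt as grounding context.
context = retrieve("When is the next turbine maintenance?")
prompt = "Answer using only this context:\n" + "\n".join(context)
print(prompt)

In the full system described in the abstract, the toy embedding would be replaced by a production embedding model, the in-memory index by a managed vector database, and the final prompt would be sent to an LLM behind one of the assistant's microservices.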

Keywords

large language models; retrieval augmented generation

Subject

Computer Science and Mathematics, Artificial Intelligence and Machine Learning
