Adaptive Cloud Infrastructure Optimization Using Large Language Models (LLMs) for Cost-Efficient Resource Management
Abstract
The increasing complexity of cloud environments demands intelligent automation for cost-effective resource allocation. This paper explores the application of Large Language Models (LLMs) to adaptive cloud infrastructure management, focusing on real-time workload prediction, dynamic scaling, and automated cost governance. Using a reinforcement learning-enhanced LLM framework, the study demonstrates significant cost reductions and improved resource utilization across multi-cloud ecosystems. Experimental results show savings of up to 30% in cloud expenditure, making LLM-driven optimization a scalable and efficient approach for modern enterprises.
Published: 2022-07-12