
Is the Cloud Losing its Appeal? Generative AI May Bring IT Workloads Back On-Premises

Generative artificial intelligence (AI) could be the game-changer that brings IT workloads back to on-premises infrastructure, according to experts. The release of OpenAI LP's ChatGPT, a chatbot built on a large language model (LLM), has sparked renewed interest in natural language processing. A survey by Arthur D. Little Inc.'s Cutter Consortium found that 33% of businesses plan to integrate LLMs into their applications, with only 5% ruling out the possibility. A separate survey by GBK Collective revealed that 37% of business leaders already use generative AI weekly.

The debate centers on the training and fine-tuning of LLMs for specific use cases. While cloud providers initially captured much of this market, the nature of model training and tuning can make local infrastructure more attractive: training machine learning models requires large amounts of data, and transferring petabyte-scale datasets between local data centers and cloud platforms is complex and costly.

The AI lifecycle consists of two distinct phases: training and inferencing. Training involves teaching the model to recognize patterns and make predictions, while inferencing applies the trained model to new data. Training requires powerful hardware and can last for days or even weeks, while inferencing is less computationally intensive.
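The contrast between the two phases can be illustrated with a deliberately tiny sketch. The one-parameter model, data, and hyperparameters below are hypothetical stand-ins, not how LLMs are actually built; the point is only that training is an expensive, iterative loop while inference is a single cheap pass.

```python
# Toy illustration of the two AI lifecycle phases:
# an iterative, compute-heavy training loop followed by cheap inference.
# The one-parameter linear model here is a hypothetical example.

def train(data, epochs=1000, lr=0.01):
    """Training: repeatedly adjust the weight to fit (x, y) pairs."""
    w = 0.0
    for _ in range(epochs):                 # many passes over the data
        for x, y in data:
            grad = 2 * (w * x - y) * x      # gradient of squared error
            w -= lr * grad                  # gradient-descent update
    return w

def infer(w, x):
    """Inference: a single forward pass using the trained weight."""
    return w * x

data = [(1, 2), (2, 4), (3, 6)]             # toy dataset following y = 2x
w = train(data)                             # slow part: thousands of updates
prediction = infer(w, 10)                   # fast part: one multiplication
```

Even in this toy, the training call performs thousands of updates while inference is one arithmetic operation, mirroring why training demands powerful hardware for days or weeks while serving a trained model is far lighter.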

Bandwidth limitations and concerns about data leakage also challenge the idea of moving LLM training entirely to the public cloud. Because it isn't always clear how training data surfaces in a model's outputs, a model can inadvertently expose sensitive information it was trained on. Privacy and data protection therefore motivate companies to keep models on-premises, particularly in regulated industries.

While on-premises training offers control and privacy advantages, it can be expensive. The cost of AI training projects has increased significantly in recent years, with some estimates for training the GPT-4 model exceeding $100 million. Nevertheless, some experts believe the control provided by on-premises infrastructure will lead to a migration of training workloads away from the cloud, while others argue that the scalability and tooling advantages of the public cloud make it a natural choice for AI training.

In conclusion, generative AI has ignited a debate about the future of IT workloads. While the cloud has been the preferred destination for AI training, the limitations of transferring large datasets and concerns about data privacy are making on-premises infrastructure more appealing. However, the scalability and cost advantages of the cloud cannot be overlooked. Only time will tell which approach prevails.

What are your thoughts on this? Do you think generative AI will bring IT workloads back on-premises, or will the cloud continue to dominate? Leave a comment and let us know your opinion!

