
How building an AI strategy can protect your company from costly GenAI mistakes

A woman interacts with an AI chatbot on a touch-screen monitor. Laurence Dutton/Getty Images
  • GenAI can improve productivity by up to 66%. 
  • A majority of workers bring their own GenAI tools to work, whether their managers know about it or not. 
  • As AI becomes more advanced, organizations need deliberate strategies to manage security risks and keep costs from spiraling.

Workers everywhere are tapping the benefits of AI to improve workflows and drive innovation. But as widespread use of generative AI skyrockets across teams and functions, organizations must ensure it's implemented securely and cost-effectively. 

Already, 75% of global knowledge workers are using GenAI applications, according to recent research from May 2024 — up 46% from six months prior. And employees aren't waiting for guidance from management to get started: More than 78% are bringing their own GenAI tools to work.

It's easy to understand why employees are so eager to use GenAI; it can improve productivity by up to 66%. But increased GenAI use also brings increased risk, particularly for organizations without a formal AI strategy. A recent study found that 71% of organizations do not provide employees with guidance on when, where, or how to use AI.

"If everyone is allowed to bring in GenAI applications and use them however they want, it creates serious challenges with security and governance," said Fuzz Hussain, senior manager of AI portfolio marketing at Dell Technologies. "Organizations should recall the hard lessons they learned when cloud computing was introduced and lax controls led to shadow IT. If they're not careful now, they'll end up with shadow AI."

As with shadow IT, shadow AI will lead to spiraling costs and data silos, as well as security risks. 

The good news is organizations can avoid mistakes of the past by building an AI strategy for the future. By making the right strategic decisions about development and implementation, they can maximize control while optimizing cost efficiencies.

Solutions for GenAI implementation

Companies have a number of choices for building and deploying GenAI, and they should carefully weigh the security and cost factors involved. Below are a few of their options:

  • Building a large language model (LLM) from scratch

Building an LLM from the ground up requires immense resources and specialized expertise, making it cost-prohibitive for most organizations.   

  • Accessing an LLM through an application programming interface (API)

API-based LLMs often require third-party data processing, introducing security risks and placing them off-limits for teams handling personally identifiable information (PII) or other data subject to privacy or compliance regulations.  

Fees are based partly on the number of queries users submit, which can make costs difficult to predict as applications scale. And that's not the only expense.

"You're also paying for the solution's hardware, software, and maintenance — those costs are baked into the price," Hussain said.

  • Building applications on public cloud infrastructure 

Building in the public cloud allows developers to fine-tune models more than they can with APIs. They can also manage costs — but only up to a point.

"Many companies think they're pushing the 'easy' button by subscribing to the amount of compute they think they will need. But costs can compound quickly," Hussain said. 

Estimating compute resources, especially for a new capability like GenAI, can be difficult. Companies must also factor in charges for data storage and data movement.

"Fine tuning a model and moving data — especially to different regions — entails costs," Hussain added. "And as with API services, you're paying for the use of the provider's hardware and software."

  • Downloading and customizing an open-source LLM 

After downloading an open-source model such as Mistral or Meta Llama 3, you can fine-tune it to your needs or ground its answers in your company's own data using a process called retrieval-augmented generation (RAG).

RAG directs the LLM to retrieve authoritative information relevant to your use cases. Your data stays on-premises, with no need to send it outside the company. 

For organizations looking to embark on their GenAI implementation journey, Hussain says RAG is a great place to start. 

"You avoid having to train a model from scratch and you can leverage your data in-house, keeping it protected wherever it goes," he said. 

Once a solution has been developed, employees can use it anywhere, running inferencing in on-premises data centers, in the cloud, or even on their own computers. Data is always protected by the company's own security and governance protocols. And because applications are built and deployed in-house, there are no service fees.
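As one example of what local inferencing can look like, the snippet below loads a downloaded open-weight model with the Hugging Face transformers library. The model identifier and generation settings are illustrative, and the sketch assumes hardware with enough memory to host the model; any locally deployed open-source model could be substituted.

```python
# Local inferencing sketch using the Hugging Face transformers library.
# Assumes an open-weight model (e.g., Mistral 7B Instruct) is available locally
# and the machine has enough memory/GPU to run it; the model ID is illustrative.

from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="mistralai/Mistral-7B-Instruct-v0.2",  # runs on local hardware, no external API
)

prompt = "Summarize our policy on submitting expense reports in one sentence."
result = generator(prompt, max_new_tokens=80, do_sample=False)

print(result[0]["generated_text"])
```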

Proven cost-effectiveness

By keeping company data in-house, a customized open-source LLM provides a distinct security advantage. It also offers better cost control. But how much of a difference does that make?

TechTarget's Enterprise Strategy Group conducted an economic analysis in collaboration with Dell to find out. Researchers found that running GenAI inferencing on company infrastructure is up to 75% more cost-effective than using the public cloud and up to 88% more cost-effective than using an API service. 

A similar economic analysis conducted by Principled Technologies in collaboration with Dell found that running GenAI fine-tuning and inferencing solutions with Dell Technologies on-premises can be up to 74% more cost-effective than the public cloud. 

The benefits compound for organizations with thousands of users and larger LLMs, which can take advantage of economies of scale. 

Moving forward with GenAI

Regardless of their size, companies shouldn't wait to get started with GenAI, Hussain said.

"Employees are already using it, so if you don't have a plan in place, you're essentially succumbing to shadow AI," he said. "There are a variety of architectures you can use to integrate GenAI into your ecosystem securely and cost-effectively — and start unleashing its power for your business."

Click here to learn more about how Dell's solutions can help you deploy GenAI safely and cost-effectively. 

This post was created by Insider Studios with Dell.

 
