How to Build Composable Generative AI Solutions

Learn how to adapt the composable strategy to Generative AI use cases

May 22, 2024


Phil Marsalona, vice president and consulting partner of technology strategy at Merkle, notes that GenAI has developed rapidly due to factors like the commoditization of state-of-the-art models, enterprise-ready cloud-based model access, and a diverse range of use cases. He also offers strategies that marketers and technologists can use to build flexible generative AI solutions.

Building, iterating, and optimizing generative AI solutions in this dizzying environment of rapid innovation is only possible with a composable approach. Outlined below is the importance of focusing on your data, abstraction, and performance to drive composability. 

The Composable Strategy

In the space of customer experience, data, and marketing technology, "composable" customer data platforms (CDPs) have taken off, and for good reason: their benefits include reduced data duplication, technical debt, and security exposure. By composing your CDP solution from multiple interoperable platforms rather than a single SaaS product, you reduce the need to manage and manually sync copies of your data across multiple environments. This concept took around five years to materialize for CDPs, yet a similar strategy is emerging with generative AI after just 18 months.

 How is this possible for such a new technology? 

 It boils down to three driving factors:

1. Commoditization of state-of-the-art GenAI models

The explosive release of OpenAI’s ChatGPT in November 2022 was the shot heard around the world for generative AI innovation, investment, research, and integration. It transformed what once seemed like a distant innovation into something easily accessible. Since then, we’ve seen a proliferation of generative AI companies, models, apps, and services come to market in record time. Though OpenAI dominated state-of-the-art capabilities and performance in 2023, the landscape looks different now. 

With the release of Google's Gemini 1.5, Anthropic's Claude 3, and Mistral's variety of high-performing and open-source models, OpenAI finally has some healthy competition. There will certainly be continual jockeying for first place in model performance, but the current level of model quality and utility across multiple players means we finally have a decent menu of options for composability. Hooray for competition!

2. Enterprise-ready gen AI model access via cloud services

This brings me to my next point: it's easier than ever for the enterprise to safely build with state-of-the-art generative AI models, regardless of your cloud service provider strategy. Each leading AI lab is paired with a major enterprise cloud provider: Microsoft Azure with OpenAI's GPT models, Google Cloud with Google's Gemini, and AWS with Anthropic's Claude.

Furthermore, each cloud service provider has been working feverishly to build enterprise gen AI model access and management platforms directly into its cloud console. This enables straightforward integration with existing enterprise cloud-hosted services, apps, and data, making it more efficient to build with any of them.


3. The massive surface area of gen AI use cases

The sheer variety of use cases that generative AI models can power necessitates this composable paradigm. Applications range from creative outputs like imagery, video, and 3D models to more concrete outputs like code, nuanced classification, and workflow automation. Because gen AI use cases can touch so many tools, processes, people, and parts of an organization, they demand a composable approach rather than a monolithic one. The example gen AI use case framework below is a starting point used to seed ideation with clients. As you can see, the breadth of application is staggering!

[Figure: Framework example to map gen AI use cases. Source: Merkle]

So, what does this mean for you?  

For better or worse, this means that the gen AI models we use to build apps, use cases, or solutions today may quickly become outdated as new models arrive that are more performant, cost-efficient, or accurate. To keep up, we must build our gen AI solutions for composability.

 How do we do it? 

1. Quality data is the cornerstone of gen AI success

Did I mention data? Regardless of the application, generative AI solutions live and die by the quality of your data. Whether you're using data to provide examples in a prompt for in-context learning, to create a fine-tuning dataset, or to build a vector datastore that powers retrieval-augmented generation, the quality of the underlying data is paramount. If we first ensure the data behind our gen AI use case is high quality, we drive composability by default, because that data becomes much easier to reuse in the long run!
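To make the in-context-learning case concrete, here is a minimal sketch of assembling a few-shot prompt from a curated example set. The examples, labels, and task are entirely hypothetical; the point is that the high-quality data lives apart from any vendor-specific API call, so it can be reused as models change.

```python
# A small, curated set of labeled examples. In practice these would come
# from a governed dataset rather than being hard-coded.
EXAMPLES = [
    {"review": "The checkout flow kept timing out.", "label": "complaint"},
    {"review": "Love the new dashboard layout!", "label": "praise"},
]

def build_few_shot_prompt(examples, new_input):
    """Assemble an in-context-learning prompt from curated examples."""
    lines = ["Classify each customer review as 'complaint' or 'praise'.", ""]
    for ex in examples:
        lines.append(f"Review: {ex['review']}")
        lines.append(f"Label: {ex['label']}")
        lines.append("")
    # The model is asked to complete the label for the new input.
    lines.append(f"Review: {new_input}")
    lines.append("Label:")
    return "\n".join(lines)

prompt = build_few_shot_prompt(EXAMPLES, "Shipping took three weeks.")
print(prompt)
```

Because the example set is plain data, the same records can later seed a fine-tuning dataset or a retrieval index without rework.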

2. Abstracting gen AI use cases for flexibility and future-proofing

With new models released on a near-monthly basis, it's important not to build our gen AI use cases monolithically, which could lock us into any one vendor's solution. Instead, invest effort upfront to systematically break down the components of your use cases. Ensure each part is abstracted and encapsulated so it can be easily understood, updated, and swapped, independent of any individual gen AI model vendor's specifics.

Have a fine-tuning dataset? Ensure it's managed and governed in a place and structure that's agnostic to any specific vendor's formats and processes. Have a library of prompts and example data? Keep them encapsulated and managed outside the API call and application code. Abstracting your gen AI solution data and components from specific vendors makes it much easier to maintain composability and take advantage of a rapidly evolving ecosystem of gen AI models and tools.
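One common way to achieve this abstraction is a thin adapter layer: the application depends on a small interface, and each vendor gets its own adapter behind it. The sketch below illustrates the pattern with stubbed adapters; the class and method names are illustrative, and a real implementation would call each vendor's SDK inside its adapter.

```python
from typing import Protocol

class TextGenerator(Protocol):
    """The only surface the application is allowed to depend on."""
    def generate(self, prompt: str) -> str: ...

class OpenAIAdapter:
    def generate(self, prompt: str) -> str:
        # A real adapter would call the vendor SDK here; stubbed for the sketch.
        return f"[openai] {prompt}"

class ClaudeAdapter:
    def generate(self, prompt: str) -> str:
        return f"[claude] {prompt}"

def summarize(model: TextGenerator, text: str) -> str:
    # Application logic never references a specific vendor.
    return model.generate(f"Summarize: {text}")

# Swapping vendors is a one-line change at the call site:
print(summarize(OpenAIAdapter(), "Q3 results"))
print(summarize(ClaudeAdapter(), "Q3 results"))
```

Because `summarize` only knows about the `TextGenerator` interface, replacing an outdated model means writing one new adapter, not rewriting the use case.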

To clarify and abstract the capabilities in the gen AI tech stack, the framework below visualizes the current state.

[Figure: Gen AI capability tech stack example. Source: Merkle]

3. Focus on performance first, worry about cost and scale later

Finally, with quality data and abstracted components in place, compose your gen AI solutions from the models and tools that drive the highest success rate for your use case out of the gate. Generative AI use cases are highly visible and quickly judged on whether they work reliably, so trying to optimize the tradeoff between performance and cost from day one is a losing bet. Pay extra for performance at the start; once the solution proves its efficacy, experiment with less powerful but more cost-efficient models and tools. Besides, new ones are bound to hit the market within a few months anyway!
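"Prove efficacy first" implies measuring each candidate against the same evaluation set before trading performance for cost. The sketch below shows the shape of that comparison with stand-in models; the metric, eval set, and model functions are all hypothetical placeholders for real gen AI calls and a real scoring rubric.

```python
def success_rate(model_fn, eval_set):
    """Fraction of evaluation cases the model gets exactly right."""
    hits = sum(1 for prompt, expected in eval_set if model_fn(prompt) == expected)
    return hits / len(eval_set)

def premium_model(prompt):
    # Stand-in for a strong but expensive model: always correct here.
    return prompt.upper()

def budget_model(prompt):
    # Stand-in for a cheaper model that misses this task.
    return prompt

eval_set = [("hello", "HELLO"), ("world", "WORLD")]
print(success_rate(premium_model, eval_set))  # the premium model scores 1.0
print(success_rate(budget_model, eval_set))   # the budget model scores 0.0
```

With an abstracted solution, rerunning this harness against a newly released model is cheap, which is what makes the later cost-optimization experiments safe.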

Building, iterating, and optimizing gen AI solutions in this dizzying environment of rapid innovation is only possible with a composable approach. Focus on your data, abstraction, and performance to drive composability. 

And yes, if you’re wondering, this article was written by a real human. 

Image Source: Shutterstock


Phil Marsalona

VP Consulting Partner of Technology Strategy, Merkle

Phil Marsalona supports Merkle's marketing technology strategy practice, focusing on data-driven marketing technology solution strategy and omnichannel experience enablement. Phil has developed marketing technology roadmaps and led vendor selection and RFP facilitation engagements for multiple Fortune 500 organizations across verticals including insurance, finance, retail, pharma, and high tech.