What does an AI platform strategy look like in the LLM era?

Sid Bhatia, Regional VP & General Manager for the Middle East, Turkey & Africa at Dataiku, explores key considerations for crafting an effective AI platform strategy for organisations in the era of Large Language Models (LLMs).


Artificial intelligence has been a hot topic more than once in recent years as various aspects of it — natural-language processing, cognitive processing, predictive analytics — passed muster and entered into the mainstream. Today, we talk about generative AI, the awe-inspiring technology that can create content much as humans do.

Generative AI is already being seen as having enormous potential for growth in the GCC region, where PwC recently predicted US$23.5 billion in “economic benefits” from the technology by 2030. Any CIO watching this space will be pondering how to integrate the ChatGPTs of the world (large language models, or LLMs) into an overall strategy for Everyday AI. The region’s tech leaders will doubtless revisit the build-vs-buy dilemma and update their transformation roadmaps to accommodate the new kid on the block.

 

To build or buy…that is the question

Ever since Google identified “hidden technical debt” in the building of machine-learning systems from scratch, few CIOs would be brave enough to pursue a pure build path. Instead, they would buy where it made sense and build where it became necessary. Generative AI has given rise to AI-powered email drafting for sales teams, AI-powered contract reviews for procurement teams, and a selection of other point solutions that integrate the technology. They are specific and, as such, offer rapid time to value. The downsides of specificity are that the organisation gains little in the way of scalability or upskilling. And as point solutions take over, technical debt mounts along with increased reliance on third-party vendors. Additionally, off-the-shelf products can be adopted by competitors and therefore offer no opportunities to differentiate a brand, even if they are used to fulfil core business functions.

Of course, there is always the option of buying several point solutions and combining them into something unique, but this tends to be a choice for organisations that have built most of the solution they need and only need one more tool to “complete the set”. Or it is used as the answer to the consensus problem — allow each department to pick their own best-of-breed tool and stitch the solutions together down the line.

Not only will these enterprises remain bound to multiple vendors, but by giving in to the temptation to let each department pick its own solution, they make it almost impossible to forge a coherent whole out of the point-solution fog. Data lineage will be difficult to track, impacting visibility, transparency, and trust. Handover of results from team to team can slow the pipeline and delay the delivery of actionable insights. Tool silos also impact governance, as they introduce complexity in guaranteeing model-bias elimination, regulatory compliance, and more. Additionally, options for auditing and automation are missed.

 

The end-to-end platform…that is the answer

What is needed, then, is an end-to-end AI platform that allows for generative AI’s integration and covers the entire lifecycle, from data gathering and cleaning to model-building and operationalisation. The platform must be set up for easy monitoring, maintenance, and governance. By developing your own artifacts, you introduce the cost benefits of reuse. For example, data cleaning and preparation are time-consuming. If prepared datasets can be accessed for future models, an entire step can be skipped with no loss of quality, which makes AI more scalable. Focusing on a single centralised platform also simplifies procurement, strategy, and the integration of new tech like generative AI. It further allows organisations to unite data governance with AI governance, greatly easing issues such as compliance and democratisation.
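The dataset-reuse idea above can be sketched in a few lines: cache the output of an expensive cleaning step once, so that every later model draws on the same prepared data instead of repeating the work. This is a minimal illustration, not any particular platform’s API; the function and dataset names are hypothetical.

```python
# A minimal sketch of dataset reuse: run the expensive cleaning step once,
# then serve the prepared result to every later model from a cache.
# All names here are illustrative, not a real platform's API.

_clean_cache = {}

def clean(raw_rows):
    """Stand-in for time-consuming prep: drop blanks, trim whitespace."""
    return [row.strip() for row in raw_rows if row.strip()]

def get_clean_dataset(name, raw_rows):
    """Return the prepared dataset, cleaning only on the first request."""
    if name not in _clean_cache:
        _clean_cache[name] = clean(raw_rows)
    return _clean_cache[name]

raw = ["  alice ", "", "bob", "   "]
first = get_clean_dataset("customers", raw)   # runs the cleaning step
second = get_clean_dataset("customers", raw)  # reuses the cached result
print(first, first is second)  # → ['alice', 'bob'] True
```

The second call skips the prep step entirely and returns the identical object, which is the cost benefit of reuse the paragraph describes, applied at dataset rather than model level.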

Platforms make it easier to assimilate MLOps into DevOps, allowing the journey to Everyday AI to be accelerated by bringing CI/CD (continuous integration and continuous delivery) practices to machine learning. These practices include the automation of testing, which allows models to be pressed into live service more quickly.
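One concrete form of the automated testing mentioned above is a validation gate: before a model is promoted to live service, the pipeline checks it against a held-out dataset and fails the build if accuracy falls below a threshold. The sketch below is illustrative only, with a toy model and made-up thresholds, assuming nothing about any specific CI/CD tool.

```python
# A minimal sketch of a CI/CD gate for a machine-learning model: the
# pipeline runs this check automatically and blocks promotion on failure.
# The model, data, and threshold are illustrative placeholders.

def predict(x, cutoff=0.5):
    """Toy 'model': classify by comparing a single feature to a cutoff."""
    return 1 if x >= cutoff else 0

def accuracy(samples):
    """Fraction of (feature, label) pairs the model gets right."""
    correct = sum(1 for x, label in samples if predict(x) == label)
    return correct / len(samples)

def validate_model(samples, min_accuracy=0.9):
    """Gate step: report whether the model clears the accuracy bar."""
    acc = accuracy(samples)
    return acc >= min_accuracy, acc

# Held-out validation set of (feature value, true label) -- illustrative.
validation_set = [(0.9, 1), (0.8, 1), (0.2, 0), (0.1, 0), (0.7, 1)]

passed, acc = validate_model(validation_set)
print(f"accuracy={acc:.2f} gate={'pass' if passed else 'fail'}")
# → accuracy=1.00 gate=pass
```

In a real pipeline this script would run on every commit, so a model that regresses never reaches production, which is what allows models to be pressed into live service more quickly and more safely.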

The platform path is the optimum approach for almost every organisation in the region. Service providers are specialists with talent on hand to augment or upskill the customer’s skill set. Point solutions can be integrated if the business goals warrant it and because of the collaborative provisions of a platform, solutions architects can work more closely with their end-users. There is, naturally, the fear of being tied to a single vendor. But if the vendor’s solution is open and extensible, its customers can leverage the strengths of their legacy architecture while investing in the latest technologies — storage, compute, algorithms, languages, frameworks, and more. Today, that list will include generative AI. If the right integration options are present, the AI platform will include the building blocks for viable LLM solutions — those that are scalable, accurate, responsible, and compliant.

 

Check and tick

So, when you go looking for your next AI platform — the one that will carry you through the bands of maturity towards Everyday AI status — make sure it can be integrated with all the coding languages, model libraries, and other modern technologies data scientists like to use. Make sure it integrates with the required data storage systems and that everything it provides will propel you further along your AI roadmap. And revisit this roadmap. Discover if the platform you are evaluating will allow you to execute every single step and that open standards are observed so that your company has no barriers to adopting new technologies as they emerge.

Everyday AI is a term used to describe an environment in which every employee at every level understands AI, trusts AI, and thinks about AI when solving business problems. To get there, the organisation must build a culture that is enabled by a single platform, available to all, so that when new things like generative AI sidle onto the stage, they can be snapped up instantly.
