AI is diversifying. The spread of artificial intelligence functions is being driven by models that deliver text-based services (large language models), image-centric analysis (large vision models) and business process or domain-specific intelligence (such as that seen with retrieval augmented generation), creating many new ways of providing predictions and so-called insights. At the same time, AI is being executed at the compute processing level in public cloud services, in on-premises private clouds, across the “edge” space on remote Internet of Things devices and everywhere in between, even in air-gapped environments.

That’s a lot of AI models working on different jobs, powered by different computing engines. If it weren’t obvious, the next term that we must logically use in this story is “orchestration” i.e. the need to assess & audit, corral & coalesce and monitor & manage these elements is of paramount importance if these systems are not to break down through misconfiguration and mismanagement.

Any AI, On Any Compute



Aiming to provide an acute element of AI orchestration is Clarifai (clarify for clear AI performance, but spelled with a cute AI ending, get it?), a Washington DC-based company that works to help software application developers build, deploy and operationalize AI. The company’s compute orchestration capabilities function for AI workloads across any AI model, on any compute [service or platform], at any scale.

"Clarifai is opening up capabilities that we built internally to optimize our compute costs as we scaled to serve millions of models simultaneously. Our users can now have the same tools available for their compute, wherever it may be," said Matt Zeiler, founder and CEO of Clarifai. "As generative AI grows, our platform enables customers to reduce complexity and seamlessly build and deploy AI in minutes, at a lower cost, with room to scale and flex to meet future business needs easily."

With the compute orchestration capabilities on offer here, organizations are promised the ability to maximize their compute service and cloud investments while also getting the most out of their capital expenditure commitments in hardware for AI. The company says that users can also tap into Clarifai’s SaaS compute service when appropriate and balance (sorry, orchestrate) all these factors while centrally managing and monitoring costs, performance and governance.

Clarifai claims to be the only platform capable of building and orchestrating AI workloads across any hardware provider, cloud provider, on-premises or air-gapped environment, eliminating vendor lock-in.

As a production-grade deep learning platform for developers, data scientists and MLOps/ML engineers to build on, Clarifai’s compute orchestration layer provides the ability to use compute and abstract away complexity, claim Zeiler and team. The technology is provided with a “control plane” (a core network management decision-making layer) so that users can manage and monitor costs and performance while Clarifai handles software application “dependencies” and optimizations. The platform can automatically optimize resource use through model packing, simple dependency management and customizable autoscaling options, including “scale to zero” (shutting down entirely when idle) for both model replicas and compute nodes.
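Clarifai has not published the internals of its autoscaler, but the “scale to zero” idea described above can be illustrated with a minimal sketch: a function that picks a replica count from observed load, with an idle grace period after which everything (replicas and the nodes under them) is released. All parameter names and values here are hypothetical, not Clarifai’s actual API.

```python
import math

def desired_replicas(requests_per_sec: float,
                     capacity_per_replica: float,
                     idle_seconds: float,
                     scale_to_zero_after: float = 300.0,
                     max_replicas: int = 8) -> int:
    """Decide how many replicas of one model to run.

    Scales up with load, caps at max_replicas, and shuts everything
    down ("scale to zero") once the model has been idle past the
    grace period -- freeing both model replicas and compute nodes.
    """
    if requests_per_sec <= 0:
        # No traffic: keep one warm replica until the grace period expires.
        return 0 if idle_seconds >= scale_to_zero_after else 1
    needed = math.ceil(requests_per_sec / capacity_per_replica)
    return max(1, min(needed, max_replicas))
```

The trade-off this sketch encodes is the usual one: a longer grace period avoids cold starts for bursty traffic, a shorter one returns idle hardware sooner.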

"With over a decade building relationships with large enterprise and government customers, we are seeing clear pains deploying AI into production. If they want to deploy models on any cloud and in their datacenters, they need to implement a reference architecture for deploying AI repeatably. This means that organizations need a full stack of AI tools to create custom AI workloads and they need all of this while lowering their AI costs," said Zeiler.

Deployment Diversity



The ability to switch cloud computing services (often as a result of costs, changing contract limitations, service requirement nuances, or business reasons such as acquisitions and mergers) means that deployment diversity is very much an on-trend part of the way organizations use software as a service these days. Reflecting these needs, Clarifai says it enables users to deploy AI inference functions on any hardware vendor’s equipment, in any cloud or on-premises computing environment.

The company says it will next offer additional support for training, workflows and other workloads. Most vendors in this space focus on inference in managed cloud or customer virtual private cloud deployments; somewhat fewer offer on-premises options and fewer still support air-gapped environments to the standards demanded by the most exacting military situations.

“Many competitors are point solutions focused on inference or compute management specifically. Clarifai offers a full-stack platform for the whole AI lifecycle: data labelling, training, evaluation, workflows and feedback, which is easy enough for any team to collaborate on,” said Zeiler. “We enable users to deploy the compute plane into a customer’s cloud or on-premises Kubernetes cluster without opening up ports into the customer environment. This allows users to manage, monitor and efficiently utilize multiple compute planes from a single control plane without sacrificing security.”
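Zeiler’s point about not opening ports reflects a common security pattern: the customer-side compute plane initiates every connection outbound (pulling work from the control plane), so the central service never needs inbound access into the customer’s network. The following sketch shows that pull model in the abstract; class and field names are hypothetical and stand in for what would be outbound HTTPS calls in practice.

```python
import queue
from typing import Optional

class ControlPlane:
    """Central management side: queues work and tracks results.

    It never connects *into* a customer cluster; it only answers
    requests that a compute plane initiates.
    """
    def __init__(self) -> None:
        self._pending: "queue.Queue[dict]" = queue.Queue()

    def submit(self, job: dict) -> None:
        self._pending.put(job)

    def poll(self) -> Optional[dict]:
        # Invoked in response to an outbound request from a compute plane.
        try:
            return self._pending.get_nowait()
        except queue.Empty:
            return None

class ComputePlane:
    """Customer-side cluster: initiates every connection outbound,
    so no inbound ports need to be opened in its environment."""
    def __init__(self, control: ControlPlane) -> None:
        self.control = control
        self.completed: list = []

    def run_once(self) -> bool:
        job = self.control.poll()  # outbound pull, not inbound push
        if job is None:
            return False
        self.completed.append(job["model"])  # stand-in for real inference
        return True
```

Because the compute plane only ever dials out, the same central control plane can manage many customer clusters without any of them relaxing their firewall rules.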

What Next For Orchestration?



We were talking about orchestration before AI, before cloud and perhaps before the “golden age of networking” in the eighties and nineties, so what next for orchestration?

The next wave of orchestration will probably be defined by AI agent orchestration, as we start to bring together the new tier of intelligence functions that can perform self-healing actions inside business workflows with little or no human intervention. As we dovetail data orchestration, AI orchestration and human worker orchestration functions into one gargantuan cacophony of noise, we will inevitably hear the tech trade start to talk about “solutions” that work as an orchestrator of orchestrators… and all of that is really just business process automation, so we’ll probably end up right back where we started.

In the meantime, orchestration will play on.
