Oracle today announced the general availability of the Oracle Cloud Infrastructure (OCI) Generative AI service, along with new innovations that make it easier for enterprises to take advantage of the latest advancements in generative AI. The OCI Generative AI service is a fully managed service that seamlessly integrates large language models (LLMs) from Cohere and Meta Llama 2 to address a wide range of business use cases. The service now includes multilingual capabilities that support over 100 languages, an improved GPU cluster management experience, and flexible fine-tuning options. Customers can use the OCI Generative AI service in Oracle Cloud and on-premises via OCI Dedicated Region.
“Oracle’s AI focus is on solving real-world business use cases to enable widespread adoption in the enterprise. To do this, we are embedding AI across all layers of the technology stack by integrating generative AI into our applications and converged database, and offering new LLMs and managed services—all supported by a fast and cost-effective AI infrastructure,” said Greg Pavlik, senior vice president, AI and Data Management, Oracle Cloud Infrastructure. “Instead of providing a tool kit that requires assembling, we are offering a powerful suite of pre-built generative AI services and features that work together to help customers solve business problems smarter and faster.”
Simplifying the customisation of generative AI models
To help customers address business use cases involving text generation, summarisation, and semantic similarity tasks, the latest models from Cohere and Meta Llama 2 will be available in a managed service that can be consumed via API calls. In addition, customers will be able to embed generative AI easily and securely into their technology stack, with tight data security and governance.
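As a rough illustration of what consuming such a managed service via API calls might look like, the sketch below assembles a JSON request body for a hosted text-generation model. The field names (`servingMode`, `inferenceRequest`, and so on), the model identifier, and the compartment OCID are illustrative assumptions, not confirmed OCI API details; the official OCI Generative AI API reference defines the actual schema.

```python
# Hypothetical sketch of preparing a text-generation request for a managed
# generative AI endpoint. All field names and identifiers below are
# illustrative assumptions, not verified OCI API details.
import json


def build_generate_text_request(compartment_id: str, model_id: str,
                                prompt: str, max_tokens: int = 256) -> dict:
    """Assemble a JSON-serialisable request body for a text-generation call."""
    return {
        "compartmentId": compartment_id,   # tenancy compartment the call is billed to
        "servingMode": {
            "servingType": "ON_DEMAND",    # shared, pay-per-call capacity
            "modelId": model_id,           # e.g. a Cohere or Llama 2 model name
        },
        "inferenceRequest": {
            "prompt": prompt,
            "maxTokens": max_tokens,
        },
    }


body = build_generate_text_request(
    compartment_id="ocid1.compartment.oc1..example",
    model_id="cohere.command",
    prompt="Summarise the quarterly sales report in three bullet points.",
)
payload = json.dumps(body)  # this JSON is what an HTTPS POST to the service would carry
```

In practice, customers would send a payload like this through the service's SDK or REST endpoint using their cloud credentials, keeping prompts and outputs within the data security and governance boundaries described above.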