eCommerceNews Australia - Technology news for digital commerce decision-makers

Confluent launches tools in cloud for simpler AI apps

Mon, 24th Mar 2025

Confluent has introduced new capabilities in Confluent Cloud for Apache Flink designed to simplify the development and scaling of real-time AI applications.

The enhancements include Flink Native Inference, which allows open source AI models to run directly on Confluent Cloud, avoiding the complexity typically associated with AI model deployment. Teams can use this capability to streamline workflows and maintain enterprise-level security while achieving cost efficiencies.
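As a rough sketch of what model inference in Flink SQL looks like, the fragment below registers a model and applies it to a stream. The general `CREATE MODEL` / `ML_PREDICT` pattern follows Confluent's published Flink SQL interface, but the model name, columns, and options here are illustrative assumptions, not an exact recipe.

```sql
-- Register an AI model as a Flink SQL resource
-- (model name, provider, and options are illustrative).
CREATE MODEL sentiment_model
INPUT (text STRING)
OUTPUT (label STRING)
WITH (
  'provider' = 'openai',   -- could equally point at an open source model endpoint
  'task' = 'classification'
);

-- Apply the model to each incoming event on the stream.
SELECT review_id, label
FROM customer_reviews,
     LATERAL TABLE(ML_PREDICT('sentiment_model', text));
```

The point of the feature is that no separate model-serving infrastructure appears anywhere in this workflow; inference runs inside the managed Flink service.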

Flink search is another addition, providing a unified interface for querying vector databases. This feature aims to simplify data enrichment, integrating access to vector stores including MongoDB, Elasticsearch, and Pinecone. This approach eliminates the need for complicated ETL processes and manual data consolidation, ensuring data is always relevant and up to date.
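Conceptually, the unified interface means a vector index can be treated like any other Flink table and joined against a stream. The sketch below is a hypothetical illustration only: the connector options and the `VECTOR_SEARCH` call are assumed names, not verified Confluent syntax.

```sql
-- Hypothetical sketch: expose an external vector index (e.g. a MongoDB
-- collection) as a Flink table. Connector options are illustrative.
CREATE TABLE product_docs (
  doc_id    STRING,
  content   STRING,
  embedding ARRAY<FLOAT>
) WITH (
  'connector' = 'mongodb'   -- assumed connector name for the vector store
);

-- Enrich a stream of query embeddings with the top 3 nearest documents.
-- VECTOR_SEARCH is an illustrative function name for the search interface.
SELECT q.query_id, d.doc_id, d.content
FROM queries AS q,
     LATERAL TABLE(VECTOR_SEARCH(TABLE product_docs, 3,
                                 DESCRIPTOR(embedding),
                                 q.query_embedding)) AS d;
```

The same query shape would apply whichever backing store is configured, which is what removes the per-database ETL and consolidation work the article describes.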

Shaun Clowes, Chief Product Officer at Confluent, commented, "Building real-time AI applications has been too complex for too long, requiring a maze of tools and deep expertise just to get started. With the latest advancements in Confluent Cloud for Apache Flink, we're breaking down those barriers—bringing AI-powered streaming intelligence within reach of any team. What once required a patchwork of technologies can now be done seamlessly within our platform, with enterprise-level security and cost efficiencies baked in."

Additionally, built-in machine learning (ML) functions have been introduced to Confluent Cloud. These functions allow AI-driven use cases, such as forecasting and anomaly detection, to be realised directly in Flink SQL. This feature is designed to make advanced data science tasks accessible to teams without specialist expertise.
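In practice this means a forecast or anomaly flag can be computed in the same SQL statement that reads the stream. The fragment below is a sketch: `ML_FORECAST` and `ML_DETECT_ANOMALIES` match the function names Confluent has announced, but the signatures, window clause, and table are illustrative assumptions.

```sql
-- Sketch of built-in ML functions applied to a metrics stream
-- (signatures and windowing here are illustrative, not exact syntax).
SELECT
  ts,
  orders_per_min,
  ML_FORECAST(orders_per_min, ts) OVER w          AS forecast,    -- projected value
  ML_DETECT_ANOMALIES(orders_per_min, ts) OVER w  AS is_anomaly   -- outlier flag
FROM order_metrics
WINDOW w AS (ORDER BY ts ROWS BETWEEN 100 PRECEDING AND CURRENT ROW);
```

No model training, feature pipeline, or external data science tooling is involved, which is the accessibility claim the article makes.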

Steffen Hoellinger, Co-founder and CEO at Airy, supports these developments, stating, "Confluent helps us accelerate copilot adoption for our customers, giving teams access to valuable real-time, organisational knowledge. Confluent's data streaming platform with Flink AI Model Inference simplified our tech stack by enabling us to work directly with large language models (LLMs) and vector databases for retrieval-augmented generation (RAG) and schema intelligence, providing real-time context for smarter AI agents. As a result, our customers have achieved greater productivity and improved workflows across their enterprise operations."

Research from McKinsey indicates that 92% of companies are planning to increase their AI investments over the coming three years. However, the development of real-time AI applications currently faces significant hurdles, including the complexity of integrating multiple tools and interfaces. Confluent's new capabilities aim to tackle these inefficiencies and enhance operational productivity.

Stewart Bond, Vice President, Data Intelligence and Integration Software at IDC, highlights the strategic advantage of using Confluent's integrated platform: "The ability to integrate real-time, contextualised, and trustworthy data into AI and ML models will give companies a competitive edge with AI. Organisations need to unify data processing and AI workflows for accurate predictions and LLM responses. Flink provides a single interface to orchestrate inference and vector search for RAG, and having it available in a cloud-native and fully managed implementation will make real-time analytics and AI more accessible and applicable to the future of generative AI and agentic AI."

Confluent's serverless stream processing solution unifies real-time and batch processing, reducing the complexity of managing separate processing systems. This integrated platform promises to unlock greater efficiencies for businesses by enabling more streamlined workflows.
