Confluent Private Cloud (CPC) is a new software package that extends Confluent’s cloud-native innovations to your private infrastructure. CPC offers an enhanced broker with up to 10x higher throughput and a new Gateway that provides network isolation and central policy enforcement without client...
Confluent announces the General Availability of Queues for Kafka on Confluent Cloud and Confluent Platform with Apache Kafka 4.2. This production-ready feature brings native queue semantics to Kafka through KIP-932, enabling organizations to consolidate streaming and queuing infrastructure while...
Explore new Confluent Intelligence features: A2A integration, multivariate anomaly detection, vector search for Cosmos DB and S3 Vectors, Private Link, and MCP support.
Why do our customers choose Confluent as their trusted data streaming platform? In this blog, we will explore our platform’s reliability, durability, scalability, and security by presenting some remarkable statistics and providing insights into our engineering capabilities.
Over the last decade, financial services companies have doubled down on using real-time capabilities to differentiate themselves from the competition and become more efficient. This trend has had a huge impact on customer experience in banking especially, and home mortgage company Mr. Cooper
Confluent has successfully achieved Google Cloud Ready - AlloyDB designation for AlloyDB for PostgreSQL, Google Cloud’s newest fully managed PostgreSQL-compatible database service for the most demanding enterprise database workloads.
Breaking encapsulation has led to a decade of problems for data teams. But is the solution just to tell data teams to use APIs instead of extracting data from databases? The answer is no. Breaking encapsulation was never the goal, only a symptom of data and software teams not working together.
Our modern society has moved to a culture of immediacy. The most successful organizations embrace this new reality and are utilizing data in new ways to inform decisions and communications. Join the Data in Motion Tour in Washington, DC, on March 30, to learn more about data streaming.
As the Senior Marketing Manager for the Central European region, Evi Schneider has been responsible for the entire marketing mix from events to online campaigns to partner marketing, as well as localised assets for the German-speaking market.
Stream processing has long forced an uncomfortable trade-off: choose a framework for its power, or choose one written in your preferred programming language. GraalVM may offer a way to avoid having to choose at all.
Boston is a city of many firsts. The first public park, the first public school, the first UFO sighting in America. And, we just added one more to the list: The first stop in North America for our Data in Motion Tour this year.
The ML and data streaming markets have socio-technical blockers between them, but they are finally coming together. Apache Kafka and stream processing solutions are a perfect match for data-hungry models. Our community’s solutions can form a critical part of a machine learning platform, enabling machine learning engineers to deliver real-time MLOps strategies.
The big data revolution of the early 2000s saw rapid growth in data creation, storage, and processing. A new set of architectures, tools, and technologies emerged to meet the demand. But what of big data today? You seldom hear of it anymore. Where has it gone?
When Jade Bowen joined Confluent as an account executive for the enterprise market and 11th employee in the ANZ region, there was only one client for her to work with. Three years later, she’s heading up the entire APAC Customer Success team.
Use the Confluent CLI and API to create Stream Designer pipelines from SQL source code.