Confluent, founded by the original creators of Apache Kafka, presents a variety of talks with renowned technologists and industry experts.
Ready to turn data mess into a data mesh? Join us to learn how to use Confluent connectors, stream processing, and Stream Governance to successfully implement the four principles of data mesh. Build high-quality data products and make them easily discoverable and accessible across your organization.
In this webinar, we will walk you through two product demos to ensure you’re ready for ZooKeeper-less Kafka: one showing how to get started with KRaft, and the other showing how to migrate an existing deployment to KRaft.
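If you want a quick reference before the session, here is a minimal sketch of standing up a single-node KRaft cluster, modeled on the Apache Kafka quickstart; the node ID, ports, and file paths shown are illustrative assumptions, not values from the webinar itself.

    # config/kraft/server.properties -- one node acting as both broker and controller
    process.roles=broker,controller
    node.id=1
    controller.quorum.voters=1@localhost:9093
    listeners=PLAINTEXT://:9092,CONTROLLER://:9093
    controller.listener.names=CONTROLLER
    inter.broker.listener.name=PLAINTEXT
    log.dirs=/tmp/kraft-combined-logs

    # Generate a cluster ID, format the storage directory, and start the server
    KAFKA_CLUSTER_ID="$(bin/kafka-storage.sh random-uuid)"
    bin/kafka-storage.sh format -t "$KAFKA_CLUSTER_ID" -c config/kraft/server.properties
    bin/kafka-server-start.sh config/kraft/server.properties

The demo on migrating an existing ZooKeeper-based deployment covers additional steps beyond this single-node setup.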
Subscribe to content in the categories you care about and get automatically registered for upcoming sessions.