Confluent Developer Skills for Building Apache Kafka

ABOUT THIS COURSE

The Confluent Platform, built on Apache Kafka, is a data streaming environment that enables you to organize and manage the large amounts of data that arrive every second at the entry points of modern organizations, in industries ranging from retail, logistics, manufacturing and financial services to online social media. With Confluent, this growing flow of data, often unstructured yet incredibly valuable, is organized into a publish/subscribe model and becomes an easily accessible, unified streaming data platform that is always available for many uses across the organization. These uses range from batch Big Data analysis with Hadoop and powering real-time monitoring systems to more traditional high-volume data integration tasks that require a high-performance extract, transform and load (ETL) backbone. Confluent offers clients a range of training classes, both for administrators (deployment and operations) and for developers (publish/subscribe client development), as well as the latest method of querying data with KSQL.

In this three-day Apache Kafka developer course, you will learn how to create an application that can publish data to, and subscribe to data from, a Kafka cluster. You will learn the role of Kafka in the modern data distribution pipeline, analyze the core concepts and components of the Kafka architecture, and review the Kafka developer APIs. The course also covers other components of the broader Confluent Platform, such as Kafka Connect and Kafka Streams.

WHO IS IT FOR?

This program is designed for application developers, ETL (extract, transform and load) developers and data scientists who need to interact with Kafka clusters as a data source or destination.

It is recommended that students be familiar with developing in Java, .NET, C#, or Python. A working knowledge of Apache Kafka architecture is required, gained either through hands-on experience with the platform or by taking the Confluent Fundamentals for Apache Kafka course. You can check your knowledge of Apache Kafka with this quiz: https://cnfl.io/fundamentals-quiz

COURSE OBJECTIVES

By the end of the training, students will be able to:

  • Create an application that can publish data to and subscribe to data from an Apache Kafka® cluster.
  • Describe Kafka’s role in the modern data distribution pipeline and its main architectural concepts and components.
  • Use the Kafka developer APIs.
  • Work with other components of the broader Confluent Platform, such as Schema Registry, REST Proxy and KSQL.

CONTENTS

Apache Kafka Basics

  • The Streaming Platform
  • The Commit Log and the Log Structured Data Flow
  • Records, Topics, Segments and Partitions
  • Log Replication and Compaction
  • Kafka Clients – Producers, Consumers and Kafka Connect
  • Producer design, serialization and partitioning
  • Consumer Groups

Kafka architecture

  • Kafka Commit Log, High Concurrency, and Storage
  • Replicas for reliability
  • Partitions and Consumer Groups for scalability
  • Security overview in Kafka

Development with Kafka

  • Programmatic access to Kafka
  • Write a Producer in Java (see the sketch after this list)
  • Using the REST API to write a Producer
  • The Kafka read path
  • Write a Consumer in Java
  • Using the REST API to write a Consumer
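
To give a feel for the producer modules, here is a minimal sketch of a Java producer using the standard Kafka clients API. The broker address, topic name (demo-topic) and record contents are placeholder assumptions for illustration, not course material.

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class SimpleProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Placeholder broker address; replace with your cluster's bootstrap servers.
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Send a single record to an assumed topic named "demo-topic".
            producer.send(new ProducerRecord<>("demo-topic", "key-1", "hello, kafka"));
            producer.flush();
        }
    }
}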

Advanced development with Kafka

  • Message size and durability
  • Enable Exactly Once Semantics (EOS)
  • Specify Offsets
  • Consumer liveness and rebalancing
  • Manually Commit Offsets (see the sketch after this list)
  • Data partitioning
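
As an illustration of the manual offset commit topic, the sketch below disables auto-commit and calls commitSync() only after a batch of records has been processed. The broker address, group id and topic name are placeholder assumptions.

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class ManualCommitConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");   // placeholder broker
        props.put("group.id", "demo-group");                 // placeholder consumer group
        props.put("enable.auto.commit", "false");            // commit offsets manually
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("demo-topic"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("offset=%d key=%s value=%s%n",
                            record.offset(), record.key(), record.value());
                }
                // Commit only after the batch has been processed successfully.
                consumer.commitSync();
            }
        }
    }
}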

Schema Management in Kafka

  • Introduction to Avro and data serialization
  • Avro schemas and schema evolution
  • Using the Schema Registry (see the sketch after this list)
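
As a rough illustration of producing Avro data through the Schema Registry, the sketch below configures Confluent's KafkaAvroSerializer and sends a GenericRecord. The toy schema, topic name, broker address and Schema Registry URL are placeholder assumptions.

import java.util.Properties;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class AvroProducerSketch {
    public static void main(String[] args) {
        // A toy Avro schema; real schemas would normally live in .avsc files.
        String schemaJson = "{\"type\":\"record\",\"name\":\"Payment\","
                + "\"fields\":[{\"name\":\"id\",\"type\":\"string\"},"
                + "{\"name\":\"amount\",\"type\":\"double\"}]}";
        Schema schema = new Schema.Parser().parse(schemaJson);

        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");          // placeholder broker
        props.put("schema.registry.url", "http://localhost:8081"); // placeholder registry
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer");

        GenericRecord payment = new GenericData.Record(schema);
        payment.put("id", "p-001");
        payment.put("amount", 42.0);

        try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
            // Assumed topic name "payments"; the serializer registers the schema automatically.
            producer.send(new ProducerRecord<>("payments", "p-001", payment));
        }
    }
}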

Data Pipelines with Kafka Connect

  • Reasons to use Kafka Connect
  • Connector Types
  • Kafka Connect implementation
  • Standalone mode and distributed mode
  • Connector configuration

Real-time processing with Kafka Streams

  • Introduction to the Kafka Streams API
  • Kafka Streams concepts
  • Create a Kafka Streams application (see the sketch after this list)
  • Kafka Streams by example
  • Kafka Streams Processing Management
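
To give a flavor of the Streams API, here is a minimal sketch of a topology that uppercases values from one topic and writes them to another. The application id, broker address and topic names are placeholder assumptions.

import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Produced;

public class UppercaseStreamSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "uppercase-demo");    // placeholder app id
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> input =
                builder.stream("input-topic", Consumed.with(Serdes.String(), Serdes.String()));
        // Transform each value and write the result to an assumed output topic.
        input.mapValues(value -> value.toUpperCase())
             .to("output-topic", Produced.with(Serdes.String(), Serdes.String()));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}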

Real-time processing with Confluent KSQL

  • KSQL for Apache Kafka
  • Write KSQL queries

Event-driven architecture

  • The event-driven platform
  • From CQRS to event sourcing
  • Microservices

Confluent Cloud

  • Confluent Cloud overview
  • Using the Confluent Cloud CLI and Web UI
  • Configure Kafka Clients
