This course is structured to give you a theoretical and coding experience of building Kafka applications using AVRO and Schema Registry. If you are looking forward to learning the things listed below, then this is the right course for you:

- Techniques that are available to evolve the data between applications that use Kafka as a streaming platform
- Using a compact data format like AVRO to exchange data between applications
- Using Schema Registry and its benefits
- Enforcing data contracts between applications that use Kafka as a streaming platform
- Handling data evolution gracefully using Schema Registry

This is a purely hands-on course where you will learn the concepts through code. By the end of this course, you will have a complete understanding of these concepts:

- Using AVRO as a data serialization format
- Evolving data using Schema Registry

Getting Started with Kafka
In this section, I will give you an introduction to the course and what to expect from it.

Data Contract & Serialization in Kafka
Learn how serialization is connected to Kafka and how it benefits the overall Kafka architecture. We will look into different serialization formats and the support for schemas in AVRO, Protobuf and Thrift.

Introduction to AVRO - A Data Serialization System
An introduction to AVRO and why AVRO is popular for working with Kafka and Schema Registry. Learn to build a simple AVRO schema (a schema sketch appears at the end of this outline).

Kafka Setup & Demo in Local Using Docker
In this section, we will set up Kafka locally, then produce and consume messages using the Kafka console producer and consumer.

Greeting App - Base AVRO Project Setup - Gradle
We will set up the base project for the greeting app, which we can use to generate the Java classes from the Greetings schema using the Gradle build tool.

Greeting App - Base AVRO Project Setup - Maven
We will set up the base project for the greeting app, which we can use to generate the Java classes from the Greetings schema using the Maven build tool.

Build AVRO Producer and Consumer in Java
We will learn to build a Kafka producer to publish AVRO records into a Kafka topic, and a Kafka consumer to consume AVRO records from the topic (a producer sketch appears at the end of this outline).

CoffeeShop Order Service Using AVRO - A Real-Time Use Case
We will build an AVRO schema for a real-time use case, and build Kafka producers and consumers for it.

Logical Types in AVRO
I will cover the different logical types in AVRO and how to use them (see the example at the end of this outline):
- Timestamp
- Decimal
- UUID
- Date

AVRO Record - Under the Hood
Anatomy of an AVRO record when the data is published and consumed as an AVRO record.

Schema Changes in AVRO
Demonstration of how the consumer breaks with changing business requirements.

Data Evolution Using Schema Registry
Cover the different techniques of evolving a schema with the changing business requirements. I will cover the different compatibility techniques to share data between the producer and consumer applications (a backward-compatible change is sketched at the end of this outline):
- Backward compatibility
- Forward compatibility
- Full compatibility
- None compatibility

Schema Naming Strategies
I will cover the different naming strategies for schemas and how they impact the application events (a configuration sketch appears at the end of this outline):
- TopicName strategy
- RecordName strategy
- TopicRecordName strategy

Build a Coffee Order Service Using Spring Boot & Schema Registry
In this section, we will code and build a Spring Boot Kafka application that exchanges data in AVRO format and interacts with Schema Registry for data evolution.
We will build a RESTful service through which we receive events over the REST interface and then publish them to Kafka (a controller sketch appears at the end of this outline).
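
To make the outline concrete, here are a few sketches of the techniques covered above. First, a minimal Greetings-style schema parsed with Avro's Java API; the record name, namespace, and field name are illustrative assumptions, not the course's exact schema.

```java
import org.apache.avro.Schema;

public class GreetingSchemaSketch {

    // Hypothetical Greetings schema; the course's actual fields may differ.
    static final String GREETING_SCHEMA_JSON = """
            {
              "type": "record",
              "name": "Greeting",
              "namespace": "com.learnavro",
              "fields": [
                {"name": "greeting", "type": "string"}
              ]
            }""";

    public static void main(String[] args) {
        // Parse the JSON definition into an Avro Schema object.
        Schema schema = new Schema.Parser().parse(GREETING_SCHEMA_JSON);
        System.out.println(schema.getFullName()); // com.learnavro.Greeting
    }
}
```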
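For the producer and consumer section, a minimal producer using Confluent's KafkaAvroSerializer could look like the sketch below. The topic name, broker address, and Schema Registry URL are assumptions based on a typical local Docker setup.

```java
import java.util.Properties;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class GreetingProducerSketch {

    public static void main(String[] args) {
        Properties props = new Properties();
        // Broker and Schema Registry addresses assume the local Docker setup.
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                io.confluent.kafka.serializers.KafkaAvroSerializer.class);
        props.put("schema.registry.url", "http://localhost:8081");

        // Illustrative schema; in the course the schema lives in a .avsc file
        // and the Java class is generated by the Gradle or Maven plugin.
        Schema schema = new Schema.Parser().parse("""
                {"type":"record","name":"Greeting","namespace":"com.learnavro",
                 "fields":[{"name":"greeting","type":"string"}]}""");
        GenericRecord greeting = new GenericData.Record(schema);
        greeting.put("greeting", "Hello, Kafka!");

        // KafkaAvroSerializer registers the schema with Schema Registry if it
        // is new, and prefixes each payload with the registered schema id.
        try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("greeting-topic", greeting)); // topic name is an assumption
        }
    }
}
```

The consumer side mirrors this configuration with KafkaAvroDeserializer as the value deserializer.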
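For the logical types section, this sketch attaches the four logical types covered in the course (timestamp, decimal, UUID, date) to their underlying Avro primitive types; the record and field names are made up for illustration.

```java
import org.apache.avro.LogicalTypes;
import org.apache.avro.Schema;
import org.apache.avro.SchemaBuilder;

public class LogicalTypesSketch {

    public static void main(String[] args) {
        // Each logical type annotates an underlying primitive type.
        Schema timestamp = LogicalTypes.timestampMillis().addToSchema(Schema.create(Schema.Type.LONG));
        Schema date = LogicalTypes.date().addToSchema(Schema.create(Schema.Type.INT));
        Schema uuid = LogicalTypes.uuid().addToSchema(Schema.create(Schema.Type.STRING));
        // decimal(precision, scale) is backed by bytes (or fixed).
        Schema amount = LogicalTypes.decimal(10, 2).addToSchema(Schema.create(Schema.Type.BYTES));

        // An illustrative record using them; field names are assumptions.
        Schema order = SchemaBuilder.record("CoffeeOrder").namespace("com.learnavro")
                .fields()
                .name("orderedTime").type(timestamp).noDefault()
                .name("orderDate").type(date).noDefault()
                .name("orderId").type(uuid).noDefault()
                .name("total").type(amount).noDefault()
                .endRecord();
        System.out.println(order.toString(true));
    }
}
```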
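For the data evolution section, a classic backward-compatible change is adding a field with a default value: consumers on the new schema can still read records written with the old one. The sketch below checks this with Avro's own compatibility API; both schema versions are hypothetical.

```java
import org.apache.avro.Schema;
import org.apache.avro.SchemaCompatibility;

public class BackwardCompatibilitySketch {

    public static void main(String[] args) {
        // v1: the schema the producer has been writing with.
        Schema writerV1 = new Schema.Parser().parse("""
                {"type":"record","name":"Greeting","namespace":"com.learnavro",
                 "fields":[{"name":"greeting","type":"string"}]}""");

        // v2: adds a field WITH a default, so a v2 consumer can still read
        // records written with v1; this is a backward-compatible change.
        Schema readerV2 = new Schema.Parser().parse("""
                {"type":"record","name":"Greeting","namespace":"com.learnavro",
                 "fields":[{"name":"greeting","type":"string"},
                           {"name":"language","type":"string","default":"en"}]}""");

        SchemaCompatibility.SchemaPairCompatibility result =
                SchemaCompatibility.checkReaderWriterCompatibility(readerV2, writerV1);
        System.out.println(result.getType()); // COMPATIBLE
    }
}
```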
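For the naming strategies section, the subject name strategy is a serializer/deserializer setting. This snippet shows how a producer or consumer could switch from the default TopicNameStrategy to RecordNameStrategy using Confluent's config key.

```java
import java.util.Properties;

public class NamingStrategySketch {

    public static void main(String[] args) {
        Properties props = new Properties();
        // Default is TopicNameStrategy, which registers the schema under the
        // subject "<topic>-value" (e.g. "greeting-topic-value").
        props.put("value.subject.name.strategy",
                "io.confluent.kafka.serializers.subject.RecordNameStrategy");
        // The other strategies covered in this section:
        //   io.confluent.kafka.serializers.subject.TopicNameStrategy (default)
        //   io.confluent.kafka.serializers.subject.TopicRecordNameStrategy
    }
}
```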
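Finally, for the Spring Boot section, a controller along these lines could receive an order over REST and publish it to Kafka via KafkaTemplate. The CoffeeOrder record here is a stand-in for the class generated from the AVRO schema, and the endpoint path and topic name are assumptions.

```java
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class CoffeeOrderController {

    // Stand-in for the class generated from the CoffeeOrder AVRO schema.
    record CoffeeOrder(String id, String name) {}

    private final KafkaTemplate<String, CoffeeOrder> kafkaTemplate;

    public CoffeeOrderController(KafkaTemplate<String, CoffeeOrder> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    // Endpoint path and topic name are illustrative assumptions.
    @PostMapping("/v1/coffee_orders")
    public ResponseEntity<CoffeeOrder> placeOrder(@RequestBody CoffeeOrder order) {
        // Publish the incoming REST payload to Kafka as an AVRO record.
        kafkaTemplate.send("coffee-orders", order.id(), order);
        return ResponseEntity.status(HttpStatus.CREATED).body(order);
    }
}
```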