Apache Kafka on Heroku Add-on Migration


Last updated January 25, 2021

Table of Contents

  • When do I need to migrate between Kafka add-ons?
  • How do I handle the migration?
  • How do I migrate between add-ons while in a maintenance window?
  • How do I migrate between add-ons without entering a maintenance window?

Scaling up or down between plan levels of Apache Kafka on Heroku is normally seamless and performed in-place. However, there are a few circumstances when actual data migration is required. This document provides an overview of those conditions and the applicable processes.

When do I need to migrate between Kafka add-ons?

Migrating between Kafka add-ons is necessary in three cases:

  • You have a multi-tenant Kafka (Kafka Basic) add-on and you want to start using a dedicated Kafka add-on.
  • You have a dedicated Kafka add-on and you want to start using a multi-tenant Kafka (Kafka Basic) add-on.
  • You have a Beta multi-tenant Kafka (Kafka Basic) add-on and the cluster that hosts the add-on is reaching end-of-life.

How do I handle the migration?

In many scenarios, your application can enter a maintenance window and migrate to a new add-on without modifying its code. We recommend this approach whenever possible, because it drastically reduces the complexity of the migration.

If your application cannot enter a maintenance window, you need to migrate to a new add-on by double-writing to both sets of topics, and cutting over from the old add-on to the new one after the new add-on has received writes for a time period longer than your retention time.
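The double-write pattern can be sketched as a thin wrapper that fans each write out to every configured cluster. This is an illustrative sketch only; `FanOutProducer` is a hypothetical name, and `send` stands in for whatever publish call your Kafka client library exposes:

```python
# Minimal sketch of double-writing: wrap your Kafka clients so that every
# message is published to each attached add-on during the migration.
class FanOutProducer:
    """Publishes each message to every underlying producer."""

    def __init__(self, producers):
        # `producers` are your Kafka client instances, one per add-on
        self.producers = producers

    def send(self, topic, value):
        # Fans each write out to both the old and the new cluster
        for producer in self.producers:
            producer.send(topic, value)
```

Once the cut-over is complete, the wrapper is simply configured with a single producer again.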

How do I migrate between add-ons while in a maintenance window?

The high-level steps for migrating during a maintenance window are:

  1. Provision the new add-on with all relevant topics and consumer groups.
  2. Enter your maintenance window.
  3. Stop your Kafka producers.
  4. Ensure your Kafka consumers are fully caught up.
  5. Switch over to the new add-on.
  6. Start your Kafka producers and consumers.
  7. Exit your maintenance window.
For example, for an app named mackerel, an old add-on named kafka-symmetrical-26061, and a new add-on named kafka-parallel-2019:

$ heroku addons:create heroku-kafka:basic-0 --as NEW_KAFKA -a mackerel
$ heroku kafka:topics:create my-topic-name NEW_KAFKA -a mackerel
$ heroku kafka:consumer-groups:create my-group-name NEW_KAFKA -a mackerel
$ heroku maintenance:on -a mackerel
$ heroku ps:scale producer=0 -a mackerel
# confirm your consumers are fully caught up, then stop them
$ heroku ps:scale consumer=0 -a mackerel
# switch over to the new add-on
$ heroku addons:attach kafka-symmetrical-26061 --as OLD_KAFKA -a mackerel
$ heroku addons:attach kafka-parallel-2019 --as KAFKA -a mackerel
$ heroku ps:scale producer=1 consumer=1 -a mackerel
$ heroku maintenance:off -a mackerel
$ heroku addons:destroy kafka-symmetrical-26061 -a mackerel

How do I migrate between add-ons without entering a maintenance window?

The high-level steps for migrating without entering a maintenance window are:

  1. Prepare your app for double-write.
  2. Provision the new Kafka add-on with all relevant topics and consumer groups.
  3. Double-write to both the old and the new add-ons.
  4. Wait for the new add-on to contain the same historical data as the old add-on.
  5. Stop producing to the old add-on.
  6. Destroy the old add-on.

These steps are described in greater detail below.

Step 1: Prepare your app for double-write

Your app needs to support two sets of Kafka config vars (one for each add-on).

This example uses KAFKA_URL, KAFKA_CLIENT_CERT, KAFKA_CLIENT_CERT_KEY, and KAFKA_TRUSTED_CERT for the old Kafka add-on before double-writing begins, and it uses them for the new Kafka add-on after double-writing begins.

This example uses OLD_KAFKA_URL, OLD_KAFKA_CLIENT_CERT, OLD_KAFKA_CLIENT_CERT_KEY, and OLD_KAFKA_TRUSTED_CERT for the old Kafka add-on after double-writing begins. This set of config vars exists only while double-writing is taking place.

Two additional config vars are required, which tell producers and consumers where to write to and read from:

  • PRODUCER_ADDON_NAMES is used by producers to discover which add-on(s) to write to.
  • CONSUMER_ADDON_NAME is used by consumers to discover which add-on to read from.

You need to add support to your app for:

  • Producing to all add-ons specified in PRODUCER_ADDON_NAMES
  • Consuming from the add-on specified in CONSUMER_ADDON_NAME
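At boot, the app can resolve these config vars into per-add-on connection settings. A minimal sketch follows; the helper names are hypothetical, and only the config var naming convention (`KAFKA_URL`, `OLD_KAFKA_URL`, and so on) comes from this article:

```python
import os

# Each Kafka attachment (e.g. KAFKA, OLD_KAFKA) exposes these four config vars.
CONFIG_SUFFIXES = ("URL", "CLIENT_CERT", "CLIENT_CERT_KEY", "TRUSTED_CERT")

def addon_config(env, attachment):
    """Collect one attachment's config vars, e.g. KAFKA_URL for KAFKA."""
    return {s: env[f"{attachment}_{s}"] for s in CONFIG_SUFFIXES}

def producer_configs(env=os.environ):
    """One connection config per add-on in PRODUCER_ADDON_NAMES (comma-separated)."""
    names = env["PRODUCER_ADDON_NAMES"].split(",")
    return [addon_config(env, name.strip()) for name in names]

def consumer_config(env=os.environ):
    """The connection config for the single add-on named by CONSUMER_ADDON_NAME."""
    return addon_config(env, env["CONSUMER_ADDON_NAME"].strip())
```

With this in place, switching producers or consumers between add-ons is a config change (`heroku config:set ...`) rather than a code change.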

Consumers should handle duplicate messages idempotently. For more information on this, please see the article on robust usage of Apache Kafka on Heroku.
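One simple way to make consumption idempotent is to track a unique identifier carried in each message and skip any message that has already been applied. This is a hypothetical sketch; your deduplication key and storage (e.g. a database unique constraint) will differ:

```python
# Illustrative idempotent consumer: each message carries a unique event_id,
# so applying a duplicate (from redelivery or double-writing) is a no-op.
def apply_event(state, event):
    """Apply an event once; return False if it was a duplicate."""
    if event["event_id"] in state["applied"]:
        return False  # already processed; skip the duplicate
    state["applied"].add(event["event_id"])
    state["total"] += event["amount"]
    return True
```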

Step 2: Provision the new add-on

Before provisioning the new add-on, attach your existing Kafka add-on with a new name in preparation:

$ heroku addons:attach kafka-symmetrical-26061 --as OLD_KAFKA -a mackerel
$ heroku addons:create heroku-kafka:basic-0 --as KAFKA -a mackerel

Step 3: Create topics and consumer groups on the new add-on

Get a list of topics and consumer groups from your old add-on:

$ heroku kafka:topics OLD_KAFKA -a mackerel
$ heroku kafka:consumer-groups OLD_KAFKA -a mackerel

Now, you can create those topics and consumer groups on your new add-on:

$ heroku kafka:topics:create my-topic-name KAFKA -a mackerel
$ heroku kafka:consumer-groups:create my-group-name KAFKA -a mackerel

Step 4: Double-write to the old and new add-ons

Your app should produce to both sets of topics and consume from the old add-on’s topics while the new add-on’s topics fill with data:

$ heroku config:set PRODUCER_ADDON_NAMES=OLD_KAFKA,KAFKA -a mackerel
$ heroku config:set CONSUMER_ADDON_NAME=OLD_KAFKA -a mackerel

Step 5: Wait for the new add-on to contain enough historical data

After the new add-on has been receiving writes for longer than your retention time, both add-ons should represent the same data. This means you can switch your consumers from the old add-on to the new add-on:

$ heroku config:set CONSUMER_ADDON_NAME=KAFKA -a mackerel

Step 6: Stop producing to the old add-on

When you are comfortable consuming from the new add-on, you can stop producing to the old add-on:

$ heroku config:set PRODUCER_ADDON_NAMES=KAFKA -a mackerel

Step 7: Destroy the old add-on

Because your app is no longer consuming from the old add-on, it is safe to destroy it:

$ heroku addons:destroy kafka-symmetrical-26061 -a mackerel
