
Bump confluent-kafka from 1.9.2 to 2.0.2

Afonso Mukai requested to merge dependabot/pip/confluent-kafka-2.0.2 into main

Created by: dependabot[bot]

Bumps confluent-kafka from 1.9.2 to 2.0.2.

Release notes

Sourced from confluent-kafka's releases.

v2.0.2

confluent-kafka-python v2.0.2

v2.0.2 is a feature release with the following features, fixes and enhancements:

  • Added Python 3.11 wheels.
  • KIP-222 Add Consumer Group operations to Admin API.
  • KIP-518 Allow listing consumer groups per state.
  • KIP-396 Partially implemented: support for AlterConsumerGroupOffsets.
  • As a result of the above KIPs, the following were added (#1449); see the Admin API sketch after this list:
    • list_consumer_groups Admin operation. Supports listing by state.
    • describe_consumer_groups Admin operation. Supports multiple groups.
    • delete_consumer_groups Admin operation. Supports multiple groups.
    • list_consumer_group_offsets Admin operation. Currently supports only one group with multiple partitions; supports the require_stable option.
    • alter_consumer_group_offsets Admin operation. Currently supports only one group with multiple offsets.
  • Added normalize.schemas configuration property to the Schema Registry client (@​rayokota, #1406).
  • Added metadata to TopicPartition type and commit() (#1410).
  • Added consumer.memberid() for getting the member id assigned to the consumer in a consumer group (#1154).
  • Implemented the nb_bool method for the Producer, so that the default (which uses len) is not used. This avoids situations where a producer with no enqueued items would evaluate to False (@​vladz-sternum, #1445); see the Producer sketch at the end of this release notes section.
  • Deprecated AvroProducer and AvroConsumer. Use AvroSerializer and AvroDeserializer instead.
  • Deprecated list_groups. Use list_consumer_groups and describe_consumer_groups instead.
  • Improved Consumer Example to show at-least-once semantics.
  • Improved Serialization and Deserialization Examples.
  • Documentation Improvements.
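
The new consumer group operations follow the usual AdminClient futures pattern. Below is a minimal, hedged sketch (not part of this PR); the bootstrap address localhost:9092 and the group name example-group are illustrative assumptions.

```python
# Hedged sketch of the consumer group Admin operations added in 2.0.2.
# "localhost:9092" and "example-group" are placeholder assumptions.
from confluent_kafka.admin import AdminClient

admin = AdminClient({"bootstrap.servers": "localhost:9092"})

# list_consumer_groups() returns a future; its result carries the valid
# group listings (and any per-group errors).
result = admin.list_consumer_groups().result()
for listing in result.valid:
    print(listing.group_id, listing.state)

# describe_consumer_groups() / delete_consumer_groups() accept multiple
# group ids and return a dict mapping each group id to a future.
for group_id, fut in admin.describe_consumer_groups(["example-group"]).items():
    print(group_id, fut.result())
```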

Upgrade considerations

The OpenSSL 3.0.x upgrade in librdkafka requires a major version bump, as some legacy ciphers need to be explicitly configured to continue working, but it is highly recommended not to use them. The rest of the API remains backward compatible.

confluent-kafka-python is based on librdkafka v2.0.2; see the librdkafka v2.0.0 release notes and later ones for a complete list of changes, enhancements, fixes and upgrade considerations.

Note: There were no v2.0.0 and v2.0.1 releases.
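
As a small illustration of the nb_bool change referenced above (not part of this PR; the bootstrap address is a placeholder), an idle Producer now evaluates truthy while len() still reports the number of enqueued items:

```python
# Hedged illustration of the nb_bool change: bool() no longer falls back
# to len(), so an empty Producer is truthy. "localhost:9092" is a placeholder.
from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "localhost:9092"})
print(len(producer))   # 0: nothing enqueued yet
print(bool(producer))  # True in 2.0.2; previously False for an empty producer
```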

Changelog

Sourced from confluent-kafka's changelog.

v2.0.2

v2.0.2 is a feature release with the following features, fixes and enhancements:

  • Added Python 3.11 wheels.
  • KIP-222 Add Consumer Group operations to Admin API.
  • KIP-518 Allow listing consumer groups per state.
  • KIP-396 Partially implemented: support for AlterConsumerGroupOffsets.
  • As a result of the above KIPs, the following were added (#1449):
    • list_consumer_groups Admin operation. Supports listing by state.
    • describe_consumer_groups Admin operation. Supports multiple groups.
    • delete_consumer_groups Admin operation. Supports multiple groups.
    • list_consumer_group_offsets Admin operation. Currently supports only one group with multiple partitions; supports the require_stable option.
    • alter_consumer_group_offsets Admin operation. Currently supports only one group with multiple offsets.
  • Added normalize.schemas configuration property to the Schema Registry client (@​rayokota, #1406).
  • Added metadata to TopicPartition type and commit() (#1410).
  • Added consumer.memberid() for getting the member id assigned to the consumer in a consumer group (#1154); see the consumer sketch after this list.
  • Implemented the nb_bool method for the Producer, so that the default (which uses len) is not used. This avoids situations where a producer with no enqueued items would evaluate to False (@​vladz-sternum, #1445).
  • Deprecated AvroProducer and AvroConsumer. Use AvroSerializer and AvroDeserializer instead; see the serializer sketch at the end of this section.
  • Deprecated list_groups. Use list_consumer_groups and describe_consumer_groups instead.
  • Improved Consumer Example to show at-least-once semantics.
  • Improved Serialization and Deserialization Examples.
  • Documentation Improvements.
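
A hedged consumer-side sketch of consumer.memberid() and committing offsets with metadata follows (not part of this PR). The broker address, topic, group id and metadata string are placeholders, and the metadata keyword on TopicPartition is assumed per the #1410 addition.

```python
# Hedged sketch of consumer.memberid() and committing an offset with
# metadata. Broker, topic, group id and metadata value are placeholders;
# the TopicPartition metadata keyword is assumed per the #1410 addition.
from confluent_kafka import Consumer, TopicPartition

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "example-group",
    "auto.offset.reset": "earliest",
    "enable.auto.commit": False,
})
consumer.subscribe(["example-topic"])

msg = consumer.poll(timeout=10.0)
if msg is not None and msg.error() is None:
    # memberid() returns the broker-assigned member id (None before the
    # consumer has joined the group).
    print(consumer.memberid())

    # Commit the next offset for this partition, attaching application metadata.
    tp = TopicPartition(msg.topic(), msg.partition(), msg.offset() + 1,
                        metadata="processed-by=worker-1")
    consumer.commit(offsets=[tp], asynchronous=False)

consumer.close()
```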

Upgrade considerations

The OpenSSL 3.0.x upgrade in librdkafka requires a major version bump, as some legacy ciphers need to be explicitly configured to continue working, but it is highly recommended not to use them. The rest of the API remains backward compatible.

confluent-kafka-python is based on librdkafka v2.0.2; see the librdkafka v2.0.0 release notes and later ones for a complete list of changes, enhancements, fixes and upgrade considerations.

Note: There were no v2.0.0 and v2.0.1 releases.
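
For the AvroProducer deprecation noted above, a hedged sketch of the suggested replacement (AvroSerializer with a plain Producer) follows; the schema, topic, registry URL and broker address are all placeholder assumptions, not part of this PR.

```python
# Hedged sketch of replacing the deprecated AvroProducer with AvroSerializer
# and a plain Producer. Schema, topic, registry URL and broker are placeholders.
from confluent_kafka import Producer
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.avro import AvroSerializer
from confluent_kafka.serialization import SerializationContext, MessageField

schema_str = """
{"type": "record", "name": "User",
 "fields": [{"name": "name", "type": "string"}]}
"""

registry = SchemaRegistryClient({"url": "http://localhost:8081"})
avro_serializer = AvroSerializer(registry, schema_str)

producer = Producer({"bootstrap.servers": "localhost:9092"})
value = avro_serializer({"name": "alice"},
                        SerializationContext("example-topic", MessageField.VALUE))
producer.produce("example-topic", value=value)
producer.flush()
```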

Commits
  • bbe1225 Add upgrade consideration for 2.0.2 (#1505)
  • fa1aa03 Changes related to release (#1504)
  • 1db3054 Implemented KIP-88, KIP-222, KIP-229, KIP-518 and partially KIP-396 (#1449)
  • 38d0f03 Soak test helpers (#1484)
  • 16b7747 Implement nb_bool for Producer which makes bool(Producer(...)) return True ir...
  • 1828db3 Remove reference to vault_sem2_approle secret (#1483)
  • e4374e3 chore: update repo semaphore project
  • 81e97dc DP-9632: remediate duplicate Semaphore workflows (#1473)
  • e98fa65 Wheels generation python 3.11 (#1467)
  • d2abbf2 Fixed a typo with transaction.timeout.ms in init_transactions method document...
  • Additional commits viewable in compare view

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
