Bump confluent-kafka from 1.7.0 to 1.8.2

Afonso Mukai requested to merge dependabot/pip/confluent-kafka-1.8.2 into main

Created by: dependabot[bot]

Bumps confluent-kafka from 1.7.0 to 1.8.2.

Release notes

Sourced from confluent-kafka's releases.

v1.8.2

confluent-kafka-python v1.8.2

v1.8.2 is a maintenance release with the following fixes and enhancements:

  • IMPORTANT: Added mandatory use.deprecated.format to ProtobufSerializer and ProtobufDeserializer. See Upgrade considerations below for more information.
  • Python 2.7 binary wheels are no longer provided. Users still on Python 2.7 will need to build confluent-kafka from source and install librdkafka separately, see README.md for build instructions.
  • Added use.latest.version and skip.known.types (Protobuf) to the Serializer classes. (Robert Yokota, #1133).
  • list_topics() and list_groups() added to AdminClient.
  • Added support for headers in the SerializationContext (Laurent Domenech-Cabaud)
  • Fix crash in header parsing (Armin Ronacher, #1165)
  • Added long package description in setuptools (Bowrna, #1172).
  • Documentation fixes by Aviram Hassan and Ryan Slominski.
  • Don't raise AttributeError exception when CachedSchemaRegistryClient constructor raises a valid exception.

confluent-kafka-python is based on librdkafka v1.8.2, see the librdkafka release notes for a complete list of changes, enhancements, fixes and upgrade considerations.

Note: There were no v1.8.0 and v1.8.1 releases.

Upgrade considerations

Protobuf serialization format changes

Prior to this version, the confluent-kafka-python client had a bug where message indexes for nested protobuf schemas were incorrectly serialized, causing incompatibility with other Schema Registry protobuf consumers and producers.

This has now been fixed, but since the old defective serialization and the new correct serialization are mutually incompatible, users of confluent-kafka-python will need to make an explicit choice of which serialization format to use during a transitory phase while old producers and consumers are upgraded.

The ProtobufSerializer and ProtobufDeserializer constructors now both take a configuration dictionary that (for the time being) requires the use.deprecated.format configuration property to be explicitly set.

Producers should be upgraded first, and as long as there are old (<=v1.7.0) Python consumers reading from topics being produced to, the new (>=v1.8.2) Python producer must be configured with use.deprecated.format set to True.

When all existing messages in the topic have been consumed by older consumers, the consumers should be upgraded, and both the new producers and the new consumers must set use.deprecated.format to False.

The requirement to explicitly set use.deprecated.format will be removed in a future version and the setting will then default to False (new format).
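The phased rollout above comes down to a single boolean in the configuration dictionary passed to ProtobufSerializer and ProtobufDeserializer. As a minimal sketch (the helper name and the decision it encodes are illustrative, not part of the library), the choice could be centralized like this:

```python
# Sketch of the transitory-phase setting described above. The helper
# protobuf_serde_conf is hypothetical; the dict it returns is what would
# be passed as the conf argument to ProtobufSerializer and
# ProtobufDeserializer in confluent-kafka >= 1.8.2.

def protobuf_serde_conf(old_consumers_present: bool) -> dict:
    """Choose the Protobuf wire format for the current upgrade phase.

    Phase 1: old (<=v1.7.0) consumers still read the topic, so new
             producers must keep emitting the deprecated format.
    Phase 2: every producer and consumer is >=v1.8.2, so both sides
             switch to the corrected format.
    """
    return {"use.deprecated.format": old_consumers_present}

# Phase 1: old consumers remain attached to the topic.
assert protobuf_serde_conf(True) == {"use.deprecated.format": True}
# Phase 2: the fleet is fully upgraded.
assert protobuf_serde_conf(False) == {"use.deprecated.format": False}
```

With a real client, such a dict would be the configuration argument to the serde constructors (e.g. `ProtobufSerializer(MsgClass, registry_client, protobuf_serde_conf(True))`, where `MsgClass` and `registry_client` are placeholders); leaving the property unset is rejected, which is the "mandatory" part noted above.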

Changelog

Sourced from confluent-kafka's changelog.

v1.8.2

v1.8.2 is a maintenance release with the following fixes and enhancements:

  • IMPORTANT: Added mandatory use.deprecated.format to ProtobufSerializer and ProtobufDeserializer. See Upgrade considerations below for more information.
  • Python 2.7 binary wheels are no longer provided. Users still on Python 2.7 will need to build confluent-kafka from source and install librdkafka separately, see README.md for build instructions.
  • Added use.latest.version and skip.known.types (Protobuf) to the Serializer classes. (Robert Yokota, #1133).
  • list_topics() and list_groups() added to AdminClient.
  • Added support for headers in the SerializationContext (Laurent Domenech-Cabaud)
  • Fix crash in header parsing (Armin Ronacher, #1165)
  • Added long package description in setuptools (Bowrna, #1172).
  • Documentation fixes by Aviram Hassan and Ryan Slominski.
  • Don't raise AttributeError exception when CachedSchemaRegistryClient constructor raises a valid exception.

confluent-kafka-python is based on librdkafka v1.8.2, see the librdkafka release notes for a complete list of changes, enhancements, fixes and upgrade considerations.

Note: There were no v1.8.0 and v1.8.1 releases.

Upgrade considerations

Protobuf serialization format changes

Prior to this version, the confluent-kafka-python client had a bug where message indexes for nested protobuf schemas were incorrectly serialized, causing incompatibility with other Schema Registry protobuf consumers and producers.

This has now been fixed, but since the old defective serialization and the new correct serialization are mutually incompatible, users of confluent-kafka-python will need to make an explicit choice of which serialization format to use during a transitory phase while old producers and consumers are upgraded.

The ProtobufSerializer and ProtobufDeserializer constructors now both take a configuration dictionary that (for the time being) requires the use.deprecated.format configuration property to be explicitly set.

Producers should be upgraded first, and as long as there are old (<=v1.7.0) Python consumers reading from topics being produced to, the new (>=v1.8.2) Python producer must be configured with use.deprecated.format set to True.

... (truncated)

Commits
  • 2ac0d72 Fix Protobuf msgidx serialization and added use.deprecated.format
  • f44d6ce Fix AttributeError exception when CachedSchemaRegistryClient constructor rais...
  • 27cfea6 Update S3 credentials
  • 3010e93 Bump version to v1.8.2 to align with embedded librdkafka version
  • beccd37 Avro 1.11.0 moves SchemaParseException to avro.errors.
  • 1349af0 Reference librdkafka v1.8.2
  • 47a25ea Add note on Python 2.7 no longer being built
  • b67f23c librdkafka is now built with msvcr140 (previously 120)
  • a305802 Don't build Py27 wheels
  • 60ad0d3 Bump to version 1.8.0
  • Additional commits viewable in compare view

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)