Kafka Adapter for SAP PO
Apache Kafka is a popular open-source project for event stream processing, based on a distributed commit log.
Kafka clusters are often used in big data projects, but also as an alternative to traditional queue-based messaging layers. Customers value features such as fault tolerance, high scalability, message durability, and low latency for processing real-time event streams.
The KaTe Kafka adapter takes care of authentication, stream checkpoint (consumer offset) management, and data transformation to easily connect Apache Kafka clusters with your SAP integration landscape.
Our current version of the adapter supports:
- publishing SAP PO messages as Kafka records
- consuming Kafka records from 1..n topics into single or bulk SAP PO messages
- reading from the start/end of a Kafka stream or from a directed (specific) position
- payload transformation between XML and JSON, XML and AVRO, and AVRO and JSON (in both directions)
- use of freestyle AVRO schemas and Kafka schema registries (e.g. on confluent.io or self-hosted)
- popular security options such as PLAINTEXT, SASL_PLAINTEXT, SASL_SSL, or client certificates (SSL)
- dynamic Kafka record headers (read from and written to PO messages)
- dynamic topic, partition, or key properties of Kafka records (read from and written to PO messages)
- deduplication of Kafka records that have already been processed
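To illustrate the XML/JSON payload transformation listed above, here is a minimal, Kafka-independent sketch. It is not the adapter's implementation; the element names and the flat XML structure are assumptions chosen for the example:

```python
import json
import xml.etree.ElementTree as ET

def xml_to_json(xml_payload: str) -> str:
    """Convert a flat XML payload to a JSON string (leaf elements become keys).

    Hypothetical helper for illustration only; real transformations also
    handle nesting, attributes, and type mapping.
    """
    root = ET.fromstring(xml_payload)
    record = {child.tag: child.text for child in root}
    return json.dumps(record)

xml_msg = "<order><id>42</id><status>NEW</status></order>"
print(xml_to_json(xml_msg))  # {"id": "42", "status": "NEW"}
```

In the adapter this mapping happens transparently between the SAP PO message payload and the Kafka record value, so no separate mapping step is needed in the integration flow.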
The adapter works entirely with SAP PO standard configuration & monitoring and needs no external configuration for certificates or AVRO schemas.
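The deduplication feature mentioned in the list above can be sketched as tracking which records were already seen, e.g. by their (topic, partition, offset) coordinates, which uniquely identify a Kafka record. This is an illustrative sketch of the general technique, not the adapter's actual mechanism; the dict-based record shape is an assumption for the example:

```python
def dedupe(records, seen=None):
    """Yield only records whose (topic, partition, offset) was not seen before.

    `seen` can be passed in to persist state across calls; in a real system
    this state would be stored durably, not in memory.
    """
    seen = set() if seen is None else seen
    for rec in records:
        key = (rec["topic"], rec["partition"], rec["offset"])
        if key not in seen:
            seen.add(key)
            yield rec

records = [
    {"topic": "orders", "partition": 0, "offset": 7, "value": "a"},
    {"topic": "orders", "partition": 0, "offset": 7, "value": "a"},  # redelivery
    {"topic": "orders", "partition": 0, "offset": 8, "value": "b"},
]
print([r["offset"] for r in dedupe(records)])  # [7, 8]
```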
For more information or a 30-day trial version of the adapter, contact us at firstname.lastname@example.org,
or request our trial through the KaTe Kafka Adapter listing in the SAP Appcenter.