I had the opportunity to work with Spring Cloud Data Flow streams and batch jobs. The streams run in production and perform well. The main streams either use Debezium to capture database deltas and forward them to SOAP endpoints, or expose SOAP endpoints that write into the database. The events were sent via Kafka. Spring Cloud Data Flow also provides an application to manage the streams and jobs.
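In Spring Cloud Data Flow, such a stream is registered with the SCDF shell's pipe-style DSL. A minimal sketch of what the definitions could look like, assuming hypothetical app names (`debezium-source`, `soap-sink`) rather than the project's actual ones:

```shell
# Stream 1: Debezium change events -> Kafka -> SOAP calls to the application
stream create --name db-to-soap --definition "debezium-source | soap-sink" --deploy

# Stream 2: SOAP requests from the application -> Kafka -> database writes
stream create --name soap-to-db --definition "soap-source | jdbc-sink" --deploy
```

The pipe (`|`) between source and sink is what SCDF maps onto a Kafka topic, which is exactly the decoupling point described below.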
The streams are built from a source and a sink that are separate applications, decoupled by the events sent via Kafka. Stream 1 has a Debezium source that sends the database deltas via Kafka to a sink, which transforms each event into a SOAP request to the application. Stream 2 receives a SOAP request from the application and publishes an event to Kafka; its sink consumes the event and creates the corresponding database entries.
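The sink-side transformation in Stream 1 can be sketched in plain Java. This is only an illustration of the mapping step (Debezium's `after` state of a row, simplified to a map, rendered as a SOAP envelope); the class, element, and table names are hypothetical, and the actual Spring Cloud Stream wiring and SOAP client are omitted.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch: turn the "after" state of a Debezium change event into a
// SOAP request body. Names are illustrative, not from the real project.
public class SoapEnvelopeBuilder {

    public static String toSoapRequest(String table, Map<String, String> after) {
        StringBuilder body = new StringBuilder();
        body.append("<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\">")
            .append("<soapenv:Body>")
            .append("<upsert table=\"").append(table).append("\">");
        // One element per changed column, in insertion order.
        for (Map.Entry<String, String> e : after.entrySet()) {
            body.append("<").append(e.getKey()).append(">")
                .append(e.getValue())
                .append("</").append(e.getKey()).append(">");
        }
        body.append("</upsert></soapenv:Body></soapenv:Envelope>");
        return body.toString();
    }

    public static void main(String[] args) {
        Map<String, String> after = new LinkedHashMap<>();
        after.put("id", "42");
        after.put("name", "Jane");
        System.out.println(toSoapRequest("customer", after));
    }
}
```

In the real stream this method body would live inside a Spring Cloud Stream consumer, and the resulting envelope would be posted to the SOAP endpoint by a web-service client.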