Kafka Developer (Hybrid, Montréal, QC)
Role
- Design, develop, and deploy applications and services that leverage Kafka for real-time data processing and streaming.
- Integrate Kafka into existing systems and develop custom Kafka connectors as needed.
- Collaborate with software developers and DevOps engineers to design and implement scalable and fault-tolerant Kafka clusters.
- Optimize Kafka configurations, tune performance, and troubleshoot issues to ensure reliable and efficient message processing.
Requirements
- Experience with the Kafka Streams and KSQL architecture and the associated clustering model
- Hands-on experience with how to scale Kafka, KStreams, and Connector infrastructures
- Knowledge of best practices for optimizing the Kafka ecosystem based on use case and workload, e.g. how to use topics, partitions, and consumer groups effectively
- Strong understanding of MongoDB and similar NoSQL databases
- Hands-on experience as a developer who has used the Kafka API to build producer and consumer applications, along with expertise in implementing KStreams components.
- Experience developing KStreams pipelines and deploying KStreams clusters
- Experience developing KSQL queries and knowledge of best practices for choosing between KSQL and KStreams
- Strong knowledge of the Kafka Connect framework, with experience using several connector types (REST proxy, JMS, File, SFTP, JDBC, Splunk, Salesforce) and supporting wire-format translations
- Knowledge of connectors available from Confluent and the community
- Hands-on experience designing, writing, and operationalizing new Kafka connectors using the framework
- Experience with cloud platforms (Azure, AWS, GCP)
- Experience working with CI/CD pipelines
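To illustrate the topic/partition/consumer-group mechanics the requirements above refer to, here is a minimal Python sketch. It is a simplified model, not the real client: Kafka's default partitioner hashes keys with murmur2, and we substitute md5 here for a self-contained, deterministic stand-in; the round-robin assignor and all function names are likewise illustrative.

```python
import hashlib

def partition_for(key: str, num_partitions: int) -> int:
    """Map a message key to a partition.

    Kafka's default partitioner applies murmur2 to the key bytes;
    md5 is used here purely as a deterministic stand-in.
    """
    digest = hashlib.md5(key.encode("utf-8")).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions

def assign_partitions(partitions: list[int], consumers: list[str]) -> dict[str, list[int]]:
    """Simplified round-robin group assignment.

    Each partition is owned by exactly one consumer in the group,
    which is how a consumer group shares a topic's workload.
    """
    assignment: dict[str, list[int]] = {c: [] for c in consumers}
    for i, p in enumerate(sorted(partitions)):
        assignment[consumers[i % len(consumers)]].append(p)
    return assignment

if __name__ == "__main__":
    num_partitions = 6
    keys = ["order-1", "order-2", "order-3", "order-1"]
    # The same key always hashes to the same partition,
    # which is what preserves per-key ordering in Kafka.
    print([partition_for(k, num_partitions) for k in keys])
    print(assign_partitions(list(range(num_partitions)), ["consumer-a", "consumer-b"]))
```

The sketch shows the two levers the posting asks candidates to reason about: keyed partitioning (ordering per key) and group assignment (parallelism capped by partition count).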
Job Types: Full-time, Fixed term contract
Schedule:
- 8-hour shift
- Monday to Friday
Ability to commute/relocate:
- Montréal, QC: reliably commute or plan to relocate before starting work (preferred)
Work Location: Hybrid remote in Montréal, QC