Bring automation to logistics processes for the biggest retailers in the world!
Warsaw based opportunity with a possibility to work remotely
As a Kafka Principal Engineer, you will be working for our Client, a multinational IT company that operates in 150 locations across 46 countries. A successful candidate will be involved in a long-term initiative to help retailers achieve their goals of improving customer experience by guaranteeing on-time deliveries, operational transparency, and omnichannel distribution. We are seeking candidates with a passion for Kafka to drive the modernization of the client’s IT department. It’s a game-changing revolution for the retail market and its customers!
Your main responsibilities:
- Providing expertise in and working with Kafka and Schema Registry in a very high-volume environment (900 million messages)
- Messaging and streaming processes
- Developing topics, establishing a redundancy cluster, deploying monitoring and alerting tools, and applying industry best practices
- Creating stubs for producers, consumers, and consumer groups to help onboard applications from different languages and platforms
- Setting up security on Kafka
- Providing naming conventions, backup and recovery, and problem-determination strategies for the projects
- Monitoring, preventing, and troubleshooting security-related issues
- Providing strategic vision for engineering solutions that touch on the messaging aspect of the infrastructure
You’re ideal for this role if you have:
- 5+ years of experience in a similar role
- Experience with Kafka connectors such as MQ, Elasticsearch, JDBC, FileStream, and JMS
- Hands-on experience building custom connectors using Kafka core concepts and APIs
- Working knowledge of Kafka REST Proxy
- Experience working with AvroConverter, JsonConverter, and StringConverter, and with different data formats such as JSON, Avro, and Protobuf
- Knowledge of monitoring and alerting tools, best practices, and industry standards
- Operational and deployment knowledge of different Kafka objects
- Expertise in Kafka brokers, ZooKeeper, KSQL, KStream, KTable, and Confluent Control Center
- Good knowledge of writing KStream and KSQL queries involving complex schemas
- Solid experience with error tracing and handling
- Experience with provisioning tools such as Jenkins and Azure DevOps
- Fluent English
It is a strong plus if you have:
- Ability to perform data-related benchmarking, performance analysis, and tuning
- Strong skills in in-memory applications, database design, and data integration
- Experience migrating from relational to NoSQL databases
- Well-versed in Java, Spring Boot, REST APIs, microservices, and Docker
- Ability to build libraries and utilities for Kafka
- Any knowledge of Kafka testing
- Retail domain knowledge
- Experience reimagining a legacy middleware stack and decomposing e-commerce systems with event streaming
#GETREADY to meet with us!
We would like to meet you. If you are interested, please apply and attach your CV in English or Polish, including a statement that you agree to our processing and storing of your personal data. You can also apply by sending us an email at firstname.lastname@example.org.
Internal number #3295