Join us, and build data solutions that drive global innovation!
Kraków-based opportunity with hybrid work model (2 days/week in the office)
As a Big Data Developer, you will be working for our client, a leading global financial institution, contributing to the design and development of cutting-edge data solutions for risk management and analytics. The client is undergoing a strategic digital transformation, focusing on scalable, cloud-based big data platforms that support advanced analytics and regulatory compliance. You will be part of a high-performing Agile team, collaborating closely with business stakeholders and technical teams to build and maintain robust distributed systems that process large volumes of data efficiently.
Your main responsibilities:
- Designing and developing distributed big data solutions using Spark
- Implementing microservices and APIs for data ingestion and analytics
- Managing cloud-native deployments primarily on GCP
- Writing and maintaining test automation frameworks using tools like JUnit, Cucumber, or Karate
- Collaborating with cross-functional teams to translate business requirements into technical specifications
- Developing and scheduling data workflows using Apache Airflow
- Maintaining and optimizing existing big data pipelines
- Utilizing DevOps tools such as Jenkins and Ansible for CI/CD automation
- Participating in Agile ceremonies and contributing to sprint planning and retrospectives
- Monitoring, troubleshooting, and improving data systems and services
You’re ideal for this role if you have:
- A degree in Computer Science, IT, or a related discipline
- Proven experience in designing and developing big data systems
- Hands-on experience with Spark and distributed computing
- Solid Java, Python, and Groovy development skills
- Strong knowledge of the Spring ecosystem (Boot, Batch, Cloud)
- Familiarity with REST APIs, Web Services, and API Gateway technologies
- Practical experience in DevOps tooling like Jenkins and Ansible
- Proficiency in using RDBMS, especially PostgreSQL
- Hands-on experience with public cloud platforms, particularly GCP
- Excellent communication skills in English
It is a strong plus if you have:
- Experience with streaming technologies like Apache Beam or Flink
- Knowledge of OLAP solutions and data modeling
- Background in financial risk management or the banking industry
- Exposure to container technologies such as Docker and Kubernetes
- Familiarity with Traded Risk domain concepts
- Experience with RPC frameworks like gRPC
- Knowledge of data lakehouse tools like Dremio or Trino
- Hands-on experience with BI or UI development
- Scrum Master or PMP certification
#GETREADY to meet with us!
We would like to meet you. If you are interested, please apply and attach your CV in English or Polish, including a statement that you agree to our processing and storing of your personal data. You can also apply by sending us an email at recruitment@itds.pl.
Internal number #7225
Address:
SKYLIGHT BUILDING | ZŁOTA 59 | 00-120 WARSZAWA
BUSINESS LINK GREEN2DAY BUILDING | SZCZYTNICKA 11 | 50-382 WROCŁAW
Contact:
INFO@ITDS.PL
+48 883 373 832