Join us, and shape the future with smarter data solutions!
Kraków-based opportunity with a hybrid work model (6 days/month in the office)
As a Data Engineer you will be working for our client, a global leader in the financial sector that is investing heavily in digital transformation and data-driven solutions. You’ll be contributing to the evolution of a cutting-edge data ecosystem, supporting the world’s largest financial data lake, and helping transition from on-premises platforms to Google Cloud. This role is integral to delivering actionable insights from structured and unstructured data sources, empowering stakeholders across multiple business lines with reliable, scalable, and secure data services.
Your main responsibilities:
- Designing, building and maintaining scalable data pipelines using Python and GCP tools
- Collaborating with data scientists, analysts and architects to deliver robust solutions
- Developing and optimizing complex queries in BigQuery and other data platforms
- Contributing to Agile ceremonies and sprint planning sessions
- Creating and reviewing design documentation and technical specifications
- Participating in peer code reviews to ensure code quality and consistency
- Implementing CI/CD practices for data pipeline deployment and maintenance
- Troubleshooting and resolving data issues and performance bottlenecks
- Monitoring data flows, system health, and performance metrics
- Ensuring compliance with data governance and security requirements
You’re ideal for this role if you have:
- Proven experience designing and developing scalable data pipelines
- Strong programming skills in Python and SQL
- Experience with Google Cloud Platform (GCP), especially BigQuery and Dataproc
- Good understanding of data modeling principles and practices
- Familiarity with Agile and DevOps methodologies
- Solid grasp of metadata management and data lineage concepts
- Ability to work independently and manage multiple priorities
- Strong problem-solving and debugging skills in complex data environments
- Excellent communication and collaboration skills across global teams
- Experience working with large-scale structured and unstructured datasets
It’s a strong plus if you have:
- Hands-on experience with Airflow or other orchestration tools
- Knowledge of PySpark, Scala, Hive, or Java
- Understanding of data governance frameworks, preferably within financial institutions
- Exposure to Hadoop ecosystems and legacy data infrastructure migration
- CI/CD automation experience in a data engineering context
- Experience supporting multi-tenant data platforms
- Familiarity with security and compliance in data engineering
- Knowledge of Software Development Life Cycle (SDLC) best practices
- Background in financial services or regulated environments
- Experience mentoring junior engineers or contributing to knowledge-sharing initiatives
#GETREADY to meet with us!
We would like to meet you. If you are interested, please apply and attach your CV in English or Polish, including a statement that you agree to our processing and storing of your personal data. You can also apply by emailing us at recruitment@itds.pl.
Internal number #6941
Address:
SKYLIGHT BUILDING | ZŁOTA 59 | 00-120 WARSZAWA
BUSINESS LINK GREEN2DAY BUILDING | SZCZYTNICKA 11 | 50-382 WROCŁAW
Contact:
INFO@ITDS.PL
+48 883 373 832