Join us and shape the future of cloud-based data platforms!
Kraków-based opportunity with a hybrid work model (2 days/week in the office)
As a Python Engineer, you will be working for our client, a major global financial institution driving a large-scale data transformation initiative. You will join a team focused on building and maintaining a robust data platform using modern cloud technologies. The project involves processing and delivering data at scale using GCP services, supporting analytics and reporting needs while ensuring alignment with architecture, security, and performance standards. You will contribute to solution design, development, automation, and code quality while collaborating with architects, analysts, and agile teams in a highly technical, data-driven environment.
Your main responsibilities:
- Developing and maintaining data pipelines using GCP services and Python
- Performing code reviews to ensure quality, scalability, and compliance
- Collaborating with architects and analysts to translate requirements into solutions
- Ensuring all technical solutions follow security, performance, and architectural standards
- Resolving technical challenges alongside the development team
- Automating infrastructure and deployments using Terraform, Jenkins, and Ansible
- Implementing and monitoring data workflows with Apache Airflow (Composer)
- Writing and optimizing complex SQL queries for large-scale data processing
- Contributing to team knowledge through technical upskilling and documentation
- Participating in Agile ceremonies and daily development activities
You’re ideal for this role if you have:
- Hands-on experience with GCP services such as BigQuery, Dataflow, and Cloud Storage
- Strong programming skills in Python for data engineering tasks
- Solid understanding of SQL and experience with distributed data processing
- Familiarity with data orchestration using Apache Airflow or Composer
- Experience using DevSecOps tools like Terraform, Jenkins, and Ansible
- Knowledge of cloud infrastructure and security best practices
- Ability to collaborate in cross-functional Agile teams
- Proven track record of solving technical issues in complex systems
- Experience performing code reviews and enforcing quality standards
- Strong communication skills and ability to translate business needs into technical solutions
It is a strong plus if you have:
- Experience with GCP IAM policies and security configurations
- Familiarity with Google Cloud SDK (gcloud) and API calls in Python
- Background in financial services or regulated environments
- Understanding of performance tuning in large-scale data platforms
- Knowledge of CI/CD pipelines for data workflows
- Exposure to Apache Spark for distributed data processing
- Experience supporting data governance and compliance initiatives
- Ability to create infrastructure as code with reusable templates
- Experience contributing to technical documentation and decision-making processes
- Passion for automation, scalability, and clean architecture
#GETREADY to meet with us!
We would like to meet you. If you are interested, please apply and attach your CV in English or Polish, including a statement that you agree to our processing and storing of your personal data. You can also apply by sending us an email at recruitment@itds.pl.
Internal number #7465
Address:
SKYLIGHT BUILDING | ZŁOTA 59 | 00-120 WARSZAWA
BUSINESS LINK GREEN2DAY BUILDING | SZCZYTNICKA 11 | 50-382 WROCŁAW
Contact:
INFO@ITDS.PL
+48 883 373 832