Join us and code the backbone of financial intelligence!
Kraków-based opportunity with a hybrid work model (2 days/week in the office).
As a GCP Data Engineer, you will be working for our client, a global financial institution developing a cloud-based risk management platform used to generate and deliver risk factor definitions, historical market data, and scenarios for advanced financial modeling. The project involves building and optimizing scalable data pipelines, microservices, and integration layers that process high volumes of real-time and historical market data. You will be joining an international team of engineers focused on innovation, automation, and delivering measurable business value in a highly regulated environment.
Your main responsibilities:
- Translating business requirements into secure, scalable, and performant data solutions
- Integrating internal systems with an emphasis on fast data processing and cost optimization
- Developing and documenting data ingestion blueprints for market data pipelines
- Reviewing data solutions created by other team members
- Assessing and modernizing existing data pipelines and microservices
- Collaborating with engineers, analysts, and stakeholders to align technical solutions with business needs
- Implementing consistent logging, monitoring, error handling, and automated recovery
- Promoting automated unit and regression testing through test-centric development
- Designing and implementing performant REST APIs
- Applying industry-standard integration frameworks and patterns
You’re ideal for this role if you have:
- Strong knowledge of Java
- Solid understanding of software design principles such as KISS, SOLID, and DRY
- Proficiency with Spring Boot and its ecosystem
- Experience building performant data processing pipelines
- Familiarity with Apache Beam or similar technologies
- Experience working with relational and NoSQL databases, such as PostgreSQL and Bigtable
- Basic understanding of DevOps practices and CI/CD tooling such as Jenkins, including Groovy-based pipeline scripts
- Ability to design and implement RESTful APIs
- Excellent problem-solving and analytical skills
- Strong communication and team collaboration abilities
It is a strong plus if you have:
- Experience with GCP services such as GKE, Cloud SQL, Dataflow, and Bigtable
- Knowledge of monitoring tools such as OpenTelemetry, Prometheus, and Grafana
- Familiarity with Kubernetes and Docker
- Exposure to Terraform for infrastructure-as-code
- Experience with messaging and streaming platforms like Kafka
#GETREADY to meet with us!
We would like to meet you. If you are interested, please apply and attach your CV in English or Polish, including a statement that you agree to our processing and storing of your personal data. You can also apply by sending us an email at recruitment@itds.pl.
Internal number #6923
Address:
SKYLIGHT BUILDING | ZŁOTA 59 | 00-120 WARSZAWA
BUSINESS LINK GREEN2DAY BUILDING | SZCZYTNICKA 11| 50-382 WROCŁAW
Contact:
INFO@ITDS.PL
+48 883 373 832