Join us and engineer solutions that keep businesses safe and secure!
Kraków-based opportunity with a hybrid work model (6 office days per month).
As a Data Engineer, you will be working for our client, a global financial institution committed to cybersecurity excellence. You will join a team focused on network security data analytics and engineering. Your primary role will be to support the development of a robust Network Security Data Lakehouse within a Cyber Azure Data Lake. You will process billions of records daily, ensuring data is clean, structured, and optimized for security monitoring and automation. This role requires expertise in data pipeline design, database integration, and big data processing, contributing to the security and reliability of global business operations.
Your main responsibilities:
- Designing, building, and testing multi-layered data architectures and efficient data pipelines
- Performing in-depth analysis of large and complex datasets to extract insights and generate statistical metrics
- Developing software and systems to acquire, aggregate, and refine data
- Using data tools to integrate data across different databases, systems, and platforms
- Optimizing data query performance to enhance efficiency and accessibility
- Contributing to data ingestion, sourcing, aggregation, API integration, and feature engineering
- Collaborating with stakeholders to understand data and align deliverables with business requirements
- Applying software development best practices to write efficient, reusable, and maintainable code
- Working with Agile development methodologies, including test-driven development
- Continuously developing and enhancing your data engineering and cybersecurity skills
You’re ideal for this role if you have:
- Significant experience in Python and SQL programming for data engineering
- Expertise in designing, building, and maintaining data pipelines and ELT workflows
- Strong knowledge of data profiling, analysis, and pipeline design
- Experience with streaming pipelines and big data processing
- Familiarity with Azure DevOps, CI/CD systems, and software version control (e.g., Git)
- Understanding of applying data engineering methods to cybersecurity
- Ability to work with structured and unstructured data across various sources
- Strong problem-solving skills and attention to detail
- Experience working with cloud-based data platforms, particularly Azure Databricks
- Excellent communication and collaboration skills
It is a strong plus if you have:
- Experience in machine learning model development for cybersecurity applications
- Knowledge of Power BI or other data visualization tools
- Understanding of medallion architecture in data lake environments
- Exposure to DevSecOps practices in cybersecurity and data engineering
#GETREADY to meet with us!
We would like to meet you. If you are interested, please apply and attach your CV in English or Polish, including a statement that you agree to our processing and storing of your personal data. You can also apply by sending us an email at recruitment@itds.pl.
Internal number #6745
Address:
SKYLIGHT BUILDING | ZŁOTA 59 | 00-120 WARSZAWA
BUSINESS LINK GREEN2DAY BUILDING | SZCZYTNICKA 11 | 50-382 WROCŁAW
Contact:
INFO@ITDS.PL
+48 883 373 832