Data Engineer MOT (MSS IT)

Industry

Banking

Experience

Senior

Methodology

Agile
Scrum

Location

Krakow

Work Mode

Hybrid
On-site

Language

English

Technologies

GCP
SQL

  • Krakow
16 800-22 050 zł B2B

Use GCP technologies to develop our processes, and offer our customers the best solutions!

Krakow-based opportunity with the possibility to work 70% remotely!

As a Data Engineer, you will join the Global Business Insights team in the Market and Security Services department. You will design, develop, test, and deploy a series of new data pipelines, data models, and optimized reporting views aggregated into various time intervals. You will work closely with Qlik and UI developers and the core data team in Google BigQuery to prepare data for time-series calculation, aggregation, and visualization, and to integrate a read-write data store with data-interaction functionality such as annotations, attestations, personalized thresholds, and alerts. You will work for one of the world’s largest banking and financial services organizations.

Your main responsibilities:

  • Designing, building, testing, and deploying Google Cloud data models and transformations in the BigQuery and Data Fusion environments (e.g. SQL, stored procedures, indexes, clusters, partitions, triggers)
  • Optimizing data views for specific visualization use cases, using schema design, partitions, indexes, down-sampling, and archiving to manage trade-offs such as performance and flexibility
  • Reviewing, refining, interpreting, and implementing business and technical requirements
  • Contributing to ongoing productivity and priorities by refining User Stories, Epics, and Backlogs in Jira
  • Onboarding new data sources; designing, building, testing, and deploying Cloud data ingest, pipelines, warehouses, and data models/products (GCP Data Fusion, Spark, etc.)
  • Managing code artifacts and CI/CD using tools like Git, Jenkins, and Google Secret Manager
  • Estimating, committing, and delivering requirements to scope, quality, and time expectations
  • Protecting the solution with appropriate authorization and authentication models, data encryption, and other security components
  • Delivering non-functional requirements, IT standards, and developer and support tools to ensure our applications are secure, compliant, scalable, reliable, and cost-effective
  • Ensuring a consistent approach to logging, monitoring, error handling, and automated recovery as per Bank standards
  • Writing automated unit and regression tests as part of a test-centric development approach
  • Delivering a data warehouse and pipelines that follow API, abstraction, and database refactoring best practices
  • Developing procedures and scripts for data migration, back-population, and initialization
  • Fixing defects, providing enhancements during the development period, and handing over knowledge
  • Protecting the solution with relevant Data Governance, Security, Sovereignty, Masking, and Lineage capabilities
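As an illustration of the schema-design trade-offs mentioned above, a time-series table in BigQuery might be partitioned and clustered like the sketch below, with a down-sampled view layered on top for reporting. All table, view, and column names here are hypothetical, not part of the actual solution:

```sql
-- Hypothetical BigQuery DDL: a time-series table partitioned by day
-- and clustered by instrument, so dashboard queries filtering on
-- date and instrument scan less data (a cost/performance trade-off).
CREATE TABLE IF NOT EXISTS insights.trade_metrics (
  trade_ts     TIMESTAMP NOT NULL,
  instrument   STRING,
  metric_name  STRING,
  metric_value FLOAT64
)
PARTITION BY DATE(trade_ts)
CLUSTER BY instrument, metric_name;

-- A down-sampled reporting view aggregated into hourly intervals:
CREATE OR REPLACE VIEW insights.trade_metrics_hourly AS
SELECT
  TIMESTAMP_TRUNC(trade_ts, HOUR) AS hour_ts,
  instrument,
  metric_name,
  AVG(metric_value) AS avg_value
FROM insights.trade_metrics
GROUP BY hour_ts, instrument, metric_name;
```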

You’re ideal for the role if you have:

  • 4+ years of experience in the design, development, and administration of traditional/Cloud databases and data warehouses/procedures/products
  • 1+ years of experience in developing, refactoring, and optimizing SQL/T-SQL procedures in BigQuery or equivalent Cloud databases
  • Good understanding of GCP Core and Data Products, architecture, and designs/patterns
  • Data preparation, wrangling, and refactoring skills, for example as part of a Data Science pipeline
  • Knowledge of:
    • Preparation, usage, visualization, and editing of data in web, dashboard, or other user interfaces (3-tier architecture, CRUD procedures, etc.)
    • IT methodology/practices, and solid experience in Agile/Scrum
  • Experience with collaboration tools such as JIRA/Confluence and various board types
  • Fluent English

It is a strong plus if you have:

  • Experience in:
    • Developing BI/MI reports and dashboards in a popular tool like Qlik (VizLib Library, VizLib Collaboration, Mashups, etc.), Tableau, Looker, etc.
    • GCP-based big data/ETL solutions in a DevOps/DataOps model
    • Deploying and operating Data Fusion/CDAP-based solutions
    • Building and operating CI/CD life-cycle management (Git, Jenkins, Groovy, Checkmarx, Nexus, Sonar IQ, etc.)
    • IT development and collaboration tools
    • Working with infrastructure teams to deliver the best architecture for applications
  • Knowledge of Java, Python, and Dataflow
  • Understanding of IT Security and Application Development best practices
  • Understanding of, and interest in, various investment products, life cycles, and the nature of the investment banking business

#GETREADY to meet with us!

We would like to meet you. If you are interested, please apply and attach your CV in English or Polish, including a statement that you agree to our processing and storing of your personal data. You can also apply by sending us an email at recruitment@itds.pl.

Internal number #3772

Benefits

Access to +100 projects
Access to Healthcare
Access to Multisport
Access to Pluralsight
B2B or Permanent Contract
Flexible hours and remote work

Apply for this position




    We need your consent for recruitment processes for selected positions. Please include a statement of consent to the processing of your personal data in your CV, or send such a consent statement to privacy@itds.pl. You can also consent to future recruitment processes for similar positions.