
Data Engineer

Lviv/Kyiv/Remote, Ukraine
Our customer is continuing its digital transformation initiative, and its infrastructure footprint is growing beyond its data centres into the Google Cloud Platform. We are working towards an exciting strategy driven by business value. Join us and help solve complex challenges such as handling low-latency, high-traffic market data, event streams, and messaging in a hybrid environment, within an industry that still has so much room for disruption.

The Data Engineer will focus primarily on delivering our data platform footprint as an expansion of our presence, enabling key initiatives such as Digital Banking. The role involves working collaboratively with several teams across the organization, helping deliver data-related components that enable productivity across QTG by leveraging automation to provide self-service components wherever possible. This key individual will be part of a team that drives the implementation of data components such as pipelines, applications, and data sanitation, supporting our data scientists and working with the rest of the product teams while collaborating with the architecture team, influencing the design and delivery of our data footprint, and striving for greater functionality in our data systems.

Responsibilities:
  • Create and maintain optimal data pipeline architecture.
  • Assemble large, complex data sets that meet functional / non-functional business requirements.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Optimally extract, transform, and load data from a wide variety of data sources using Google Cloud data technologies.
  • Collaborate with the team to decide which tools and strategies to use within specific data integration scenarios.
  • Work with stakeholders to assist with data-related technical issues and support their data infrastructure needs.
  • Develop and maintain code and documentation for data integration projects and procedures.
  • Monitor and anticipate trends in data engineering, and propose changes in alignment with organizational goals and needs.
  • Share knowledge with other teams on various data engineering or project related topics.
Requirements:
  • 3+ years of experience working in the data engineering field.
  • Proficiency in SQL.
  • Experience with Python or other languages such as Java or Go.
  • Experience in the optimization of high-volume ETL processes.
  • Experience with any of the popular Clouds (GCP, AWS, Azure).
  • Good knowledge of Message Broker systems (e.g., Kafka, PubSub).
  • Data modeling skills.
  • Good knowledge of popular data standards and formats (e.g., JSON, XML, Proto, Parquet, Avro, ORC).
  • Experience in the financial industry is an asset.
  • Knowledge of and experience with the GCP data platform (Dataflow, Dataprep, Cloud Composer, BigQuery, Cloud SQL) is an asset, as is knowledge of the equivalent open-source tools behind those products.
We offer:
  • Flexible working hours
  • A competitive salary and good compensation package
  • The best hardware
  • A masseur and a corporate doctor
  • Healthcare & sport benefits
  • An inspiring, comfy, clean, and safe office
Professional growth:
  • Challenging tasks and innovative projects
  • Meet-ups and events for professional development
  • An individual development plan
  • Mentorship program
Fun:
  • Corporate events and outstanding parties
  • Exciting team-building activities
  • Memorable anniversary presents
  • A fun zone where you can play video games, foosball, ping pong, and more
