Job Description
Design, implement, and support ETL/ELT data pipelines that deliver structured, timely data to internal teams and external customers
Build data-quality controls to ensure that the data reaching consumers is in good shape
Collaborate with cross-functional teams to implement scalable, automated infrastructure for data ingestion, processing, aggregation, and delivery
Leverage technology and orient engineering solutions toward optimal performance, in order to serve a high volume of data requests through APIs
Develop distributed data processing pipelines in the client's ecosystem using Airflow, Python, Snowflake, S3, ElastiCache, Kafka, etc.
Requirements
Strong background in data engineering with some experience in BI or data analytics
Experience developing databases, cloud-based solutions, and data pipelines with technologies such as AWS, Azure, Snowflake, Redshift, or Airflow
Solid experience with and knowledge of SQL and Python
Experience with the modern data stack, e.g. Airflow, Kafka, Spark, Redis, or ElastiCache
Benefits
A company focused on digital products, with a variety of challenging engagements with upper-mid-market and Fortune organizations
A chance to work with top talent and award-winning teams across different countries
Competitive salary and performance-based bonuses
Private health insurance
Personal accident insurance
2 additional vacation days
Remote work opportunities
Sponsored participation in professional improvement events
Great team building events
Sports activities including gym benefit
Employee referral program
The salary range for this position is from 2700 EUR to 6125 EUR gross before taxes.*
*The salary offer is determined based on the predefined salary ranges for the position and depends on the candidate's level of competence and experience.