We are looking for a mid-level Data Engineer to join a fast-moving, data-driven project. The role involves working on data pipelines, transformations, and analytical workflows that support advanced reporting and AI-powered solutions. You will collaborate closely with engineers, data experts, and stakeholders to build scalable, production-ready systems in a cloud environment.
The ideal candidate is comfortable working with data at different stages, enjoys experimenting with new tools and approaches, and takes ownership of their work. Curiosity, adaptability, and a problem-solving mindset are key to succeeding in this role.
Key Responsibilities
- Build robust ETL pipelines that move and transform data reliably between systems
- Design data warehouses that power analytics, dashboards, and AI insights
- Develop and deploy time-series forecasting models (e.g. Prophet, ARIMA) to predict trends, optimize planning, and enhance strategic insights across data products
- Work hands-on with AWS and Azure to connect data, AI, and scalable infrastructure in creative ways
- Collaborate with a talented team of engineers and data experts to turn ideas into production-ready tools
- Stay curious — explore new AI frameworks, cloud services, and data engineering techniques to keep us ahead of the curve
Requirements
- Strong experience working on data-centric projects (data processing, transformation, analytics, or data science workflows)
- Proficiency in Python and data libraries such as Pandas, NumPy, and similar tools; experience with columnar data formats (e.g. Parquet)
- Hands-on experience building or maintaining ETL/data pipelines, including partially automated or script-driven workflows
- Solid understanding of software design principles, data modeling, and database concepts
- Comfort working in Linux-based environments and with cloud infrastructure (AWS and/or Azure); experience with CLI tools is a strong plus
- Experience collaborating on production-grade systems, whether data platforms, analytics solutions, or AI-powered tools
- Ability to work both independently and within cross-functional teams, taking ownership of deliverables
- Strong communication skills and experience working with stakeholders or clients on iterative validation and feedback
- Curiosity and willingness to learn new technologies, including AI frameworks, cloud services, and modern data engineering practices
What We Offer
- Hybrid/remote work policy
- Flexible working hours
- Support for your personal and professional growth
- Private health insurance
- Private pension insurance
- Lots of team activities and perks