Analytics Engineer

Job summary
Permanent contract
Berlin
Salary: Not specified
A few days at home
Skills & expertise
Git
Looker
Tableau
Dbt
Snowflake
+3

Vestiaire Collective

Job description

About the role 🖥️

This role is central to our data strategy and requires a balance of technical expertise and business acumen. As an Analytics Engineer, you will be at the heart of our data-driven initiatives, working closely with cross-functional teams to transform raw data into a single source of truth data mart. Your work will directly influence key decisions in finance, payment systems and business performance.

What you will do 👜

  • Design, implement, and maintain efficient and reliable data pipelines using a Modern Data Stack: Airflow, Snowflake, DBT, Elementary 
  • Develop advanced data models to support complex analytics, including financial reconciliations, cost-effectiveness and profitability models
  • Collaborate with Operations (Transport, Authentication & QC, Warehouse, CS, Fraud), Payments, Sustainability and Product & Tech teams to understand their data requirements and translate them into sophisticated technical solutions
  • Ensure scalability and performance of our data infrastructure to handle large-scale, multi-faceted data sets from diverse sources
  • Implement and maintain data quality checks and monitoring systems to ensure accuracy and consistency
  • Innovate and integrate new technologies and methodologies to enhance data capabilities across Ops & Sustainability domains
  • Assist the Ops analysts & Ops business team in building key dashboards in Tableau to enable data-driven decision making

Who you are ⭐

Required Qualifications:
  • Bachelor’s/Master’s in Computer Science, Engineering, Statistics, Business Administration, or related fields
  • Previous experience in analytics engineering, with strong ETL/ELT and data modeling skills and an awareness of data warehousing and DataOps practices
  • Proficiency in SQL and programming languages such as Python or R
  • Experience with cloud data technologies and big data tools

Desirable Skills:
  • Previous experience with DBT for data modeling
  • Apache Airflow: an understanding of workflow management
  • Git: solid knowledge of version control and CI/CD integration
  • Cloud services: experience with AWS, Snowflake or similar
  • Data visualization tools: proficiency in tools like Tableau, Looker, SnowSight