Data Modeler - ETL Expert - Geneva
About this role
Randstad Digital Switzerland is seeking an accomplished Senior Data Engineer to support our client in building the next generation of data platforms for the complex and dynamic aviation industry.
If you thrive on transforming intricate datasets into reliable, actionable insights and possess deep expertise in Python, SQL, dbt, and Snowflake, this is your opportunity to set the standard for data engineering excellence!
The Challenge: Building Next-Generation Data Pipelines
You will be responsible for designing, building, and maintaining robust, automated data pipeline workflows that ingest, process, and transform complex aviation datasets. This role is crucial for establishing the long-term data framework and ensuring data quality across all stages.
Key Responsibilities:
Pipeline Automation: Build end-to-end automated pipeline workflows for complex aviation data ingestion, processing, and transformation.
Reusable Frameworks: Develop reusable scripts and models for ingestion, orchestration, and transformations using Python, SQL, dbt, and R (where applicable), laying the foundation for scalable data workflows.
Data Mapping & Documentation: Create and maintain detailed Source-to-Target (S2T) mapping diagrams, clearly documenting data flow, transformation logic, business rules, and data quality checks to guide ETL/ELT development.
Quality & Governance: Develop a Data Quality Module (monitoring ingestion, transformation, and publication) and a Data Stewardship Module (supporting human-in-the-loop validation/correction) using dbt and Streamlit.
Operational Excellence: Adhere to modern CI/CD practices and ensure thorough documentation and knowledge transfer to internal staff for sustainable operations.
Your Expertise & Profile:
Experience: 7+ years in Data Engineering, Architecture, or Analytics Consulting, preferably within complex or regulated industries such as aviation.
Technical Stack Mastery:
Proficiency in Python, SQL, and dbt for sophisticated ETL/ELT development.
Strong experience with modern cloud data warehousing, specifically Snowflake.
Experience with AWS services and data orchestration tools.
Familiarity with R for statistical transformations is a plus.
Data Architecture: Strong understanding of data modeling principles and hands-on experience designing Source-to-Target mappings.
Data Governance: Proven experience designing and implementing robust data quality frameworks and validation checks.
Soft Skills: Excellent communication and documentation skills are essential for knowledge transfer and stakeholder collaboration.