
Senior Data Engineer
- Italy
- Permanent contract
- Full time
- What we offer:
- A challenging career path in a rapidly growing company with a modern vision and talented teams.
- A competitive salary (and benefits) that values people's skills and experience.
- A young and inspiring work environment that encourages diversity and cultural exchange.
- Individual growth objectives with a dedicated budget for learning/training.
- Flexible working hours and locations; we value work-life balance!
- A work opportunity in a mission-driven company committed to empowering people around the world.
- Ping pong and foosball tournaments (a sport or gym benefit is also included for everyone!).
- Seasonal celebrations, happy hours, and everyday drinks and snacks at the office.
- Sunny rooftop lunch breaks and hammocks for relaxation and concentration.
- What you will do:
- Build and maintain Python ETL pipelines and serverless data processing jobs (a minimal sketch follows this list).
- Integrate external data sources via REST APIs, cloud exports, and internal systems.
- Model and transform data to support analysis, reporting, and monitoring.
- Create dashboards and reports using Looker Studio.
- Collaborate with stakeholders across product, marketing, cloud, and leadership.
- Ensure data quality, consistency, and reliability through testing and monitoring.
- Document and manage code through GitHub and version-controlled workflows.
- Contribute to architectural decisions across GCP and AWS environments.
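To give a flavour of this work, here is a minimal sketch of the kind of pipeline described above: pull records from a REST endpoint, apply a light transformation, and load them into BigQuery. The endpoint URL, table ID, and field names are illustrative assumptions, not references to our actual systems.

```python
"""Minimal ETL sketch: REST API -> transform -> BigQuery load.

The endpoint, table ID, and field names are illustrative placeholders,
not references to real internal systems.
"""
import requests
from google.cloud import bigquery

API_URL = "https://api.example.com/v1/events"    # hypothetical source endpoint
TABLE_ID = "my-project.analytics.events_raw"     # hypothetical target table


def extract() -> list[dict]:
    """Pull one page of records from the external REST API."""
    response = requests.get(API_URL, params={"limit": 1000}, timeout=30)
    response.raise_for_status()
    return response.json()["data"]


def transform(records: list[dict]) -> list[dict]:
    """Keep only the fields the warehouse model expects."""
    return [
        {
            "event_id": r["id"],
            "event_type": r["type"],
            "occurred_at": r["timestamp"],
        }
        for r in records
    ]


def load(rows: list[dict]) -> None:
    """Stream the transformed rows into BigQuery and surface row-level errors."""
    client = bigquery.Client()
    errors = client.insert_rows_json(TABLE_ID, rows)
    if errors:
        raise RuntimeError(f"BigQuery insert failed: {errors}")


if __name__ == "__main__":
    load(transform(extract()))
```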
- Our Stack:
- Warehouse: BigQuery
- Orchestration: Apache Airflow (see the DAG sketch after this list)
- Visualization: Looker Studio (Power BI is a plus)
- Tracking: Segment
- Integrations: REST APIs, internal systems
- Version Control: GitHub
- Cloud: GCP & AWS
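To show how these pieces fit together, here is a minimal sketch of an Airflow DAG that refreshes a BigQuery reporting table on a daily schedule. The DAG id, dataset, and SQL are assumptions for illustration; the operator comes from the apache-airflow-providers-google package, and the `schedule` argument assumes Airflow 2.4+.

```python
"""Minimal orchestration sketch: a daily Airflow DAG that rebuilds a
BigQuery reporting table. DAG id, dataset, and SQL are illustrative."""
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="events_daily_refresh",    # hypothetical DAG name
    schedule="@daily",                # Airflow 2.4+ keyword
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    # Aggregate the raw events loaded by the ETL job into a daily reporting table.
    refresh_events_daily = BigQueryInsertJobOperator(
        task_id="refresh_events_daily",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE analytics.events_daily AS
                    SELECT DATE(occurred_at) AS day,
                           event_type,
                           COUNT(*) AS events
                    FROM analytics.events_raw
                    GROUP BY day, event_type
                """,
                "useLegacySql": False,
            }
        },
    )
```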
- What we're looking for:
- 5+ years of experience in data engineering, analytics engineering, or a similar role.
- BS/MS in Computer Science or another related technical field.
- Programming skills in Python, Go, or Scala for data processing.
- Fluency in SQL and a strong foundation in data modeling.
- Experience ingesting and processing data from third-party APIs.
- Comfort working with cloud infrastructure (especially GCP or AWS).
- Experience using Git for code collaboration and version control.
- An appreciation for simplicity, maintainability, and clarity in your work.
- The ability to communicate effectively with stakeholders through well-structured dashboards.
- Fluency in written and spoken English.
- Strong teamwork skills and a positive attitude.
- Nice to have:
- Experience with Apache Airflow.
- Knowledge of data privacy, security, or GDPR-compliant data practices.
- Experience with AWS Glue and Spark for large-scale ETL (a minimal Spark sketch follows this list).
- Familiarity with Power BI or other enterprise BI platforms.
- Exposure to customer data platforms like Segment or event-based tracking.
- Understanding of experimentation, funnel analysis, or retention metrics.
- Use of CI/CD workflows in data engineering or analytics contexts.
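For the Glue/Spark nice-to-have, here is a minimal PySpark sketch of a large-scale batch aggregation. The S3 paths and column names are placeholders for illustration only; the same job could run on AWS Glue or any other Spark runtime.

```python
"""Minimal large-scale ETL sketch in PySpark; paths and columns are placeholders."""
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events_daily_batch").getOrCreate()

# Read raw event exports (hypothetical S3 location).
events = spark.read.json("s3://example-bucket/raw/events/")

# Aggregate to one row per day and event type.
daily = (
    events
    .withColumn("day", F.to_date("occurred_at"))
    .groupBy("day", "event_type")
    .agg(F.count("*").alias("events"))
)

# Write the curated table back, partitioned by day (hypothetical location).
daily.write.mode("overwrite").partitionBy("day").parquet(
    "s3://example-bucket/curated/events_daily/"
)
```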