Deel

Data Engineer

Location

Spain

Workplace

Remote

Type

Full Time

Salary

EUR 50,000 – 80,000

Level

Mid

Role

Data Engineer

Posted

Feb 2, 2026


The role

Summary

Deel is seeking a Data Engineer to join their Data Platform team in Spain, focusing on building secure, high-performance data infrastructure and implementing unified permission systems. The role involves architecting ETL pipelines, managing Snowflake and Looker integrations, and ensuring data compliance across their global HR platform serving 150+ countries.

What you'll do

ETL Pipeline Development: Design, build, and maintain efficient data pipelines to integrate data from various source systems into the data warehouse
Permission System Architecture: Architect and implement scalable Role-Based Access Control (RBAC) and Attribute-Based Access Control (ABAC) models within Snowflake and Looker
Data Stack Integration: Create seamless bridges between data warehouse roles and BI tool permissions, ensuring security policies are accurately reflected across platforms
Compliance and Auditing: Support internal and external audits by building automated reporting on data access, usage, and compliance status for GDPR, SOC2, and other standards
Data Warehouse Optimization: Develop and optimize data warehouse schemas and tables to support analytics and reporting needs
SQL Development: Write and refine complex SQL queries and use scripting to transform and aggregate large datasets
Data Quality Implementation: Implement data quality measures including validation checks and cleansing routines to ensure data integrity and reliability
Cross-functional Collaboration: Collaborate with data analysts, data scientists, and other engineers to understand data requirements and deliver appropriate solutions
Documentation Management: Document pipeline designs, data flows, and data definitions for transparency and future reference, adhering to team standards
Project Management: Handle multiple tasks or projects simultaneously, prioritizing work and communicating progress to stakeholders to meet deadlines
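As an illustration of the data quality work described above, here is a minimal sketch in Python of validation and cleansing routines for incoming records. All column names and rules are hypothetical, not Deel's actual schema:

```python
# Minimal sketch of row-level data quality validation, of the kind that
# might run as an ETL pipeline step. Column names and rules are hypothetical.

def validate_row(row: dict) -> list[str]:
    """Return a list of validation errors for a single record."""
    errors = []
    if not row.get("employee_id"):
        errors.append("missing employee_id")
    country = row.get("country")
    if country is not None and len(country) != 2:
        errors.append("country must be an ISO-3166 alpha-2 code")
    salary = row.get("annual_salary")
    if salary is not None and salary < 0:
        errors.append("annual_salary must be non-negative")
    return errors

def cleanse(rows: list[dict]) -> tuple[list[dict], list[tuple[dict, list[str]]]]:
    """Split records into clean rows and rejected rows paired with their errors."""
    clean, rejected = [], []
    for row in rows:
        errs = validate_row(row)
        if errs:
            rejected.append((row, errs))
        else:
            clean.append(row)
    return clean, rejected
```

In a real pipeline, rejected rows would typically land in a quarantine table for review rather than being silently dropped.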

What we look for

Technical

SQL Expertise: Strong SQL skills with experience in data modeling and building data warehouse solutions
Programming Proficiency: Strong Python skills for data processing and pipeline automation
ETL Experience: Familiarity with ETL tools and workflow orchestration frameworks like Apache Airflow
Data Quality: Experience implementing data quality checks and working with large-scale datasets
Problem Solving: Strong problem-solving abilities and analytical thinking for complex data challenges

Education

Degree: Bachelor's or Master's degree in Computer Science, Mathematics, Physics, or a related technical field

Experience

Data Engineering Experience: At least 3 years of experience in data engineering or a similar backend data development role
Communication Skills: Strong communication and teamwork skills for working with cross-functional stakeholders
Large-scale Data: Experience working with large-scale datasets and enterprise-level data infrastructure

Skills

Required skills

SQL: Advanced SQL for complex queries, data modeling, and warehouse optimization
Python: Proficiency in Python for data processing, pipeline automation, and ETL development
ETL Tools: Experience with ETL tools and workflow orchestration frameworks like Apache Airflow
Data Warehousing: Strong experience with data warehouse solutions and schema design
Data Quality: Implementation of data quality checks and validation routines for large datasets

Nice to have

Advanced LookML: Experience building complex Looker models that integrate with Snowflake security layers
Data Cataloging Tools: Experience with data discovery or cataloging tools like Select.dev for metadata management
dbt Governance: Experience using dbt tests and docs to enforce data quality and metadata standards
Snowflake: Advanced knowledge of Snowflake features including Row-Level Security and RBAC
Compliance: Understanding of GDPR, SOC2, and other data compliance frameworks
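For context on the RBAC/ABAC work this role centers on, here is a toy sketch in Python of how role-based and attribute-based checks combine. In practice this logic lives in Snowflake roles and row access policies and in Looker access filters; every name below is hypothetical:

```python
# Toy RBAC + ABAC access check. Real implementations would use Snowflake
# grants / row access policies and Looker permission sets; all role and
# attribute names here are illustrative assumptions.

ROLE_GRANTS = {
    "analyst": {"read"},
    "data_engineer": {"read", "write"},
}

def can_access(user: dict, action: str, resource: dict) -> bool:
    """RBAC gate first, then an ABAC attribute match (region-scoped data)."""
    # RBAC: the user's role must grant the requested action.
    if action not in ROLE_GRANTS.get(user.get("role"), set()):
        return False
    # ABAC: the resource's region must be one the user is scoped to.
    return resource.get("region") in (user.get("regions") or [])
```

The design point the role description hints at is keeping one source of truth for these rules so the warehouse and the BI layer never drift apart.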

Compensation & benefits

Salary

EUR 50,000 – 80,000 (annual)

Stock options

Available

Benefits

Stock Options

Stock grant opportunities dependent on role, employment status and location

Remote Work Flexibility

Fully remote work arrangement with optional WeWork access for collaboration

International Perks

Additional perks and benefits based on employment status and country of residence

Healthcare Benefits

Healthcare coverage as part of Deel's global benefits program

Professional Development

Career acceleration opportunities in a fast-growing SaaS company environment


Interview process

  1. Initial Screening: Phone or video screening with the talent acquisition team to discuss background and role fit
  2. Technical Assessment: SQL and Python coding assessment focusing on data engineering scenarios and problem-solving
  3. System Design Interview: Technical discussion on data pipeline architecture, ETL design, and scalability considerations
  4. Behavioral Interview: Competency-based interview focusing on collaboration, communication, and cultural fit with cross-functional teams
  5. Final Interview: Senior leadership interview discussing career goals, team dynamics, and role expectations



Deel


Deel is a global payroll and HR platform that helps companies manage their global workforce.

San Francisco, California, United States · Founded 2018 · deel.com

Tech Stack

Languages: SQL, Python, LookML
Frameworks: Apache Airflow, dbt
Databases: Snowflake, Data Warehouses
Tools: Looker, ETL Tools, Select.dev
Other: RBAC/ABAC, Data Quality Tools, Compliance Frameworks