Job Title: Data Lakehouse Engineer
Work Type: Hybrid | Contract
Locations: Nenagh / Dublin (Ireland) or Warsaw (Poland)
Role Overview
We are seeking a highly skilled Data Lakehouse Engineer to join our team on a hybrid contract basis. The role focuses on designing, building, and optimizing modern data lakehouse architectures using AWS-native services and open table formats such as Apache Iceberg. You will play a key role in delivering scalable, secure, high-performance data platforms.
Key Responsibilities
Design and implement scalable AWS Data Lakehouse solutions
Develop and maintain data pipelines using Python and PySpark
Work with AWS services including S3, EMR, Lambda, IAM, EKS, and MWAA (Managed Workflows for Apache Airflow)
Build and manage Apache Iceberg tables for efficient data storage and querying
Implement CI/CD pipelines for data platform deployments
Use Terraform for infrastructure as code (IaC)
Ensure data quality, governance, and security best practices
Orchestrate workflows (e.g., with MWAA/Apache Airflow) and automate recurring data processes
Collaborate with cross-functional teams including data analysts, scientists, and DevOps engineers
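To give candidates a feel for the data-quality responsibility above, here is a minimal illustrative sketch of the kind of validation gate a pipeline might run before committing records to a lakehouse table. This is not from the posting; all function and field names here are hypothetical, and a real pipeline would typically express this in PySpark against the actual table schema:

```python
# Hypothetical data-quality gate: split a batch into valid and rejected
# records before loading, based on required non-null fields.

def validate_records(records, required_fields):
    """Return (valid, rejected) lists; a record is valid only if every
    required field is present and non-null."""
    valid, rejected = [], []
    for rec in records:
        if all(rec.get(field) is not None for field in required_fields):
            valid.append(rec)
        else:
            rejected.append(rec)
    return valid, rejected

# Example batch with two records failing the null check.
batch = [
    {"order_id": 1, "amount": 25.0},
    {"order_id": 2, "amount": None},    # missing amount -> rejected
    {"order_id": None, "amount": 9.5},  # missing order_id -> rejected
]

good, bad = validate_records(batch, required_fields=["order_id", "amount"])
```

In a production lakehouse pipeline the same pattern would usually run as a PySpark filter or a dedicated data-quality framework step, with rejected records routed to a quarantine location rather than silently dropped.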
Required Skills & Experience
Strong experience in AWS Data Engineering
Hands-on expertise with:
AWS services: IAM, Lambda, EKS, S3, EMR, MWAA
Apache Iceberg
Python & PySpark
Solid understanding of data lakehouse architecture principles
Experience with CI/CD pipelines and Terraform
Knowledge of data quality frameworks and data security standards
Familiarity with workflow orchestration tools
Preferred Qualifications
Experience working in hybrid cloud or multi-region environments
Strong problem-solving and communication skills
Ability to work independently in a contract setting