Role: AWS Data Engineer
Experience: 7 years
Location: London
Work mode: Hybrid
Job Description:
We are looking for an expert in Python, PySpark, and AWS cloud services and components.
• Designing and developing scalable, testable data pipelines using Python and Apache Spark
• Orchestrating data workflows with AWS services such as Glue, EMR Serverless, Lambda, and S3
• Applying modern software engineering practices: version control, CI/CD, modular design, and automated testing
• Contributing to the development of a lakehouse architecture using Apache Iceberg
• Collaborating with business teams to translate requirements into data-driven solutions
• Building observability into data flows and implementing basic quality checks
• Participating in code reviews, pair programming, and architecture discussions
• Continuously learning about the financial indices domain and sharing insights with the team
WHAT YOU'LL BRING:
Nice-to-haves: