Location: London/Edinburgh
Work model: Hybrid, 2 days per week in the office
Contract Duration: 6 months
Minimum years of experience required: 8 years
Primary skills: Scala, Spark, AWS
Secondary skills: Python, NoSQL, Kafka, Athena, Airflow
JD:
Data Engineering:
Advanced knowledge of, and working experience with:
• Hive
• Spark
• EMR
• DynamoDB
• S3
• AWS Database Migration Service (DMS)
• Redshift
• AWS Data Pipeline
• MySQL, PostgreSQL, MS SQL Server 2016
• Glue
• Kinesis, Kafka
• Scala
• Expertise in Spark coding and Functional Programming
• Expertise in Python coding
• Advanced knowledge of Linux tools
• Good understanding of AWS components, e.g. SageMaker, Secrets Manager, DynamoDB, Hive, Git, Elastic Beanstalk, ECR/ECS, EC2, SQS, Lambda, CloudWatch, ELB, ElastiCache, Step Functions
• Advanced SQL coding