Job Description:
Soft skills
· Curiosity and analytical mindset
· Ability to learn new concepts and domains quickly
· Versatility to work in a partially greenfield environment
· Comfortable working across a broad scope within a geographically distributed team
· Able to work independently on a wide range of data engineering and analytical tasks
Hard skills
· Very strong Python skills with good code hygiene
· Experience developing and testing data processes, e.g. ETL, scraping, and transformation
· Experience working with new datasets, assessing data quality, and exploring them for valuable insights - preferably financial or time-series datasets
· Familiarity with orchestration tools e.g. Apache Airflow
· Good understanding of cloud solutions - we are AWS-based
· Experience working with distributed compute frameworks to handle large data volumes, e.g. Spark
· Experience building visuals for expert and non-expert audiences - preferably using Plotly/Dash
· Good source control practices and some experience with CI/CD
· Comfortable building and deploying containerised solutions
Nice to have
· Experience developing and deploying machine learning models
· Experience working in financial markets
· Knowledge of best practices when backtesting models
Day-to-Day Duties
· Implement data onboarding (ETL, web scraping, etc.) and database management
· Implement best practices for data transformation, manipulation, and storage solutions
· Build and maintain fundamental market models for supply and demand
· Build and maintain fundamental and price-based analytics (e.g. visualizations and dashboards)