Contract: 6 months
Work mode: Hybrid (2–3 days WFO)
Location: London
Vacancies: 2
Experience: 10+ years
We are seeking a senior Data Architect with strong data product architecture experience to design and deliver scalable, event-driven data platforms within a payments domain. The role requires deep expertise in streaming, AWS-native data services, and ISO 20022 message models, combined with strong data modeling fundamentals and production-grade delivery skills.
Own data product architecture end-to-end, from domain modeling to production deployment.
Design and implement event-driven architectures using Kafka (Confluent) and AWS streaming services.
Define and govern data contracts, schemas, and evolution strategies.
Architect AWS lakehouse and streaming pipelines (real-time and batch).
Model payment lifecycles and reconciliation flows aligned with ISO 20022.
Apply CQRS, event sourcing, and domain-driven design where appropriate.
Establish data governance standards aligned to a data-mesh mindset.
Ensure observability, reliability, and cost efficiency of data platforms.
Produce production-quality code, conduct reviews, and drive engineering best practices.
Kafka (Confluent) and Amazon MSK / Kinesis Data Streams / Kinesis Data Firehose
Event ordering, replay, idempotency, dead-letter queues (DLQs); see the consumer sketch below
Exactly-once vs at-least-once semantics
EventBridge for routing and filtering
Saga patterns and eventual consistency
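A minimal sketch of the consumer-side patterns above (manual commits for at-least-once delivery, an idempotency check, and DLQ routing), assuming the Confluent Python client; the topic names, group ID, handler, and in-memory key store are hypothetical placeholders:

```python
from confluent_kafka import Consumer, Producer

def handle_payment_event(payload: bytes) -> None:
    """Hypothetical business handler; replace with real processing logic."""

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "payments-processor",
    "enable.auto.commit": False,       # commit only after successful handling
    "auto.offset.reset": "earliest",
})
producer = Producer({"bootstrap.servers": "localhost:9092"})
consumer.subscribe(["payments.events"])

processed_keys = set()                 # stand-in for a durable idempotency store

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        if msg.key() in processed_keys:
            # Duplicate or replayed delivery: skip the side effect, advance the offset
            consumer.commit(message=msg, asynchronous=False)
            continue
        try:
            handle_payment_event(msg.value())
            processed_keys.add(msg.key())
        except Exception:
            # Poison message: park it on a DLQ topic instead of blocking the partition
            producer.produce("payments.events.dlq", key=msg.key(), value=msg.value())
            producer.flush()
        consumer.commit(message=msg, asynchronous=False)
finally:
    consumer.close()
```

Committing only after the handler succeeds gives at-least-once delivery; the idempotency check is what makes replay and redelivery safe downstream.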
Avro / Protobuf
Schema Registry (compatibility modes, subject naming strategy, evolution); see the registration sketch below
Domain events mapped to Kafka topics and persistence stores
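An illustrative sketch of schema governance under a topic-name subject strategy, assuming a Confluent Schema Registry reachable over its REST API; the registry URL, subject, and event schema are hypothetical examples:

```python
import json
import requests

REGISTRY = "http://localhost:8081"
subject = "payments.events-value"        # TopicNameStrategy: <topic>-value

payment_initiated_v1 = {
    "type": "record",
    "name": "PaymentInitiated",
    "namespace": "com.example.payments",
    "fields": [
        {"name": "paymentId", "type": "string"},
        {"name": "amountMinor", "type": "long"},
        {"name": "currency", "type": "string"},
        # Evolution: an optional field with a default keeps the change backward compatible
        {"name": "channel", "type": ["null", "string"], "default": None},
    ],
}

# Pin BACKWARD compatibility for this subject, then register the schema version
requests.put(f"{REGISTRY}/config/{subject}",
             json={"compatibility": "BACKWARD"}).raise_for_status()

resp = requests.post(f"{REGISTRY}/subjects/{subject}/versions",
                     json={"schema": json.dumps(payment_initiated_v1)})
resp.raise_for_status()
print("registered schema id:", resp.json()["id"])
```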
S3, Glue (batch & streaming), Athena
Redshift (analytics)
Lambda, Step Functions
Iceberg-ready lakehouse patterns (see the Athena DDL sketch below)
Streaming ingestion: Kinesis → S3 → Glue
Error handling and DLQ patterns
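A minimal sketch of an Iceberg-ready landing table over an S3 zone (for example, data delivered by Kinesis Data Firehose), created through Athena with boto3; the bucket, database, and column names are made-up examples:

```python
import boto3

athena = boto3.client("athena", region_name="eu-west-2")

ddl = """
CREATE TABLE IF NOT EXISTS payments_lake.payment_events (
  payment_id   string,
  amount_minor bigint,
  currency     string,
  event_ts     timestamp
)
PARTITIONED BY (day(event_ts))
LOCATION 's3://example-payments-lake/payment_events/'
TBLPROPERTIES ('table_type' = 'ICEBERG')
"""

athena.start_query_execution(
    QueryString=ddl,
    QueryExecutionContext={"Database": "payments_lake"},
    ResultConfiguration={"OutputLocation": "s3://example-payments-lake/athena-results/"},
)
```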
PAIN / PACS / CAMT message types (e.g. pain.001, pacs.008, camt.053); see the parsing sketch below
Payment lifecycle modeling
Reconciliation and advice flows
API, file-based, and SWIFT channel knowledge
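A small sketch of pulling reconciliation keys (end-to-end ID, settlement amount and currency) out of an ISO 20022 FI-to-FI credit transfer (pacs.008); the embedded sample message and schema version are hypothetical:

```python
import xml.etree.ElementTree as ET

NS = {"p": "urn:iso:std:iso:20022:tech:xsd:pacs.008.001.08"}

sample = """<Document xmlns="urn:iso:std:iso:20022:tech:xsd:pacs.008.001.08">
  <FIToFICstmrCdtTrf>
    <GrpHdr><MsgId>MSG-001</MsgId></GrpHdr>
    <CdtTrfTxInf>
      <PmtId><EndToEndId>E2E-42</EndToEndId></PmtId>
      <IntrBkSttlmAmt Ccy="GBP">125.00</IntrBkSttlmAmt>
    </CdtTrfTxInf>
  </FIToFICstmrCdtTrf>
</Document>"""

root = ET.fromstring(sample)
for tx in root.iterfind(".//p:CdtTrfTxInf", NS):
    e2e = tx.findtext("p:PmtId/p:EndToEndId", namespaces=NS)
    amt = tx.find("p:IntrBkSttlmAmt", NS)
    # Reconciliation key plus settlement amount and currency
    print(e2e, amt.text, amt.get("Ccy"))
```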
Logical data modeling (ERDs, normalization 1NF → BCNF)
Physical data modeling (partitioning, indexing, slowly changing dimensions); see the SCD Type 2 sketch below
OLTP vs analytical storage patterns
3NF vs Data Vault vs Star Schema trade-offs
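A compact, in-memory sketch of a Type 2 slowly changing dimension update (real pipelines would typically do this with a warehouse MERGE); the dimension layout and field names are illustrative assumptions:

```python
from dataclasses import dataclass, replace
from datetime import date
from typing import Optional

@dataclass
class CustomerDimRow:
    customer_id: str            # business key
    segment: str                # tracked attribute
    valid_from: date
    valid_to: Optional[date]    # None = version still open
    is_current: bool

def apply_scd2(rows, customer_id, segment, as_of):
    """Expire the current version and append a new one when the tracked
    attribute changes (brand-new keys are out of scope for this sketch)."""
    out = []
    for row in rows:
        if row.customer_id == customer_id and row.is_current and row.segment != segment:
            out.append(replace(row, valid_to=as_of, is_current=False))   # close old row
            out.append(CustomerDimRow(customer_id, segment, as_of, None, True))
        else:
            out.append(row)
    return out

dim = [CustomerDimRow("C-1", "retail", date(2024, 1, 1), None, True)]
dim = apply_scd2(dim, "C-1", "premium", date(2025, 6, 1))   # now holds two versions of C-1
```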
CQRS design and justification
Bounded contexts and domain modeling
Entities, value objects, repositories (see the sketch below)
Temporal and versioned data models
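An illustrative sketch of these DDD building blocks for a payments bounded context: an immutable value object, an entity with identity and behaviour, and a repository port; all class names and statuses are hypothetical:

```python
from dataclasses import dataclass, field
from typing import Optional, Protocol
from uuid import UUID, uuid4

@dataclass(frozen=True)
class Money:                      # value object: immutable, compared by value
    amount_minor: int
    currency: str

@dataclass
class Payment:                    # entity: identity survives attribute changes
    amount: Money
    status: str = "INITIATED"
    payment_id: UUID = field(default_factory=uuid4)

    def settle(self) -> None:
        self.status = "SETTLED"   # domain behaviour lives on the entity

class PaymentRepository(Protocol):   # port; Kafka/Aurora adapters would implement it
    def get(self, payment_id: UUID) -> Optional[Payment]: ...
    def save(self, payment: Payment) -> None: ...
```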
Data mesh principles (ownership, SLAs, access, retention, lineage)
Observability (consumer lag, throughput, failures); see the lag sketch below
FinOps KPIs and cost optimization
Secure-by-design data platforms
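A minimal observability sketch, assuming the Confluent Python client: compute per-partition consumer lag for one group and topic (broker address, group, and topic are placeholders):

```python
from confluent_kafka import Consumer, TopicPartition

TOPIC, GROUP = "payments.events", "payments-processor"
consumer = Consumer({"bootstrap.servers": "localhost:9092", "group.id": GROUP})

metadata = consumer.list_topics(TOPIC, timeout=10)
partitions = [TopicPartition(TOPIC, p) for p in metadata.topics[TOPIC].partitions]

for committed in consumer.committed(partitions, timeout=10):
    # Lag = latest broker offset minus the group's committed offset
    _low, high = consumer.get_watermark_offsets(committed, timeout=10)
    lag = high - committed.offset if committed.offset >= 0 else high
    print(f"partition={committed.partition} lag={lag}")

consumer.close()
```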
QuickSight or Tableau
Redshift performance tuning
ksqlDB or Flink
Aurora PostgreSQL internals
API Gateway / Apigee, mTLS, webhook patterns