JOB DETAILS
Role Title: Integration Product Owner and Architect
Possibility of remote work: Hybrid | 3 days in the office
Contract duration: 3 months
Location: Slough Business Park
Required Core Skills:
• Bachelor’s degree in Computer Science, Engineering, or a related field.
• Strong proficiency in Java and Spring Boot.
• Experience with Apache Kafka and stream processing.
• Familiarity with Big Data technologies (Hadoop, Spark, etc.).
• Knowledge of NoSQL databases (e.g., Druid, Cassandra, MongoDB).
• Understanding of distributed systems and scalability.
Key Responsibilities:
• Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
• Design, develop, and implement Kafka-based microservices using Spring Boot.
• Build data pipelines for ingesting, processing, and analyzing large-scale data sets.
• Optimize Kafka configurations for performance and reliability.
• Work with Big Data technologies such as Hadoop, Spark, and NoSQL databases.
• Ensure data security, integrity, and compliance with industry standards.
• Troubleshoot and resolve issues related to Kafka topics, consumers, and producers.
• Monitor system performance and proactively address bottlenecks.
• Participate in code reviews and mentor junior developers.
Nice to have skills:
• Certification in Kafka or related technologies.
• Experience with cloud platforms (AWS, Azure, GCP).
• Knowledge of containerization (Docker, Kubernetes).
Minimum years of experience: 12 years
Detailed Job Description:
• Excellent problem-solving skills and attention to detail.
• Effective communication and teamwork abilities.