Location: Wrocław, Poland
Experience: 5 to 12 years
Requirement:
• Create and implement highly scalable and reliable data distribution solutions using VQL, Python, Spark and open-source technologies to deliver data to business components.
• Work with Denodo, ADLS, Databricks, Kafka, data modelling, data replication, clustering, SQL query patterns and indexing for handling large data sets.
• Demonstrate experience with Python and data access libraries (NumPy, SciPy, pandas, etc.), machine learning frameworks (TensorFlow, etc.) and AI tools (ChatGPT, etc.)
• 4-5 years of hands-on experience developing large-scale applications using data virtualization and/or data streaming technologies.
• Software engineer/developer focused on cloud-based data virtualization and data delivery technologies.
• Denodo platform familiarity and SQL experience highly desirable.
• Know-how to apply standards, methods, techniques and templates as defined by our SDLC, including code control, code inspection and code deployment.
Responsibility:
• Design, plan and deliver solutions in a large-scale enterprise environment.
• Working with solution architects and business analysts to define the implementation design and coding of the assigned modules/responsibilities with the highest quality (bug-free).
• Determining the technical approaches to be used and defining the appropriate methodologies.
• Must be capable of working in a collaborative, multi-site environment to support rapid development and delivery of results and capabilities (i.e. an Agile SDLC).
• Effectively communicating technical analyses, recommendations, status and results to the project management team.
• Produce secure, clean code that is stable, operational, consistent and well-performing.