Hi All,
We are looking for an Azure Data Engineer for the Bhubaneswar location.
Required Details:
Total Experience:
Relevant Experience:
Current Company:
Current Designation:
Current CTC:
Expected CTC:
Notice Period:
Current Location:
Expected Location:
Offer in Hand:
PAN Number (needed to upload profiles to the portal):
DOB (needed to upload profiles to the portal):
Reason for Job Change:
Degree:
CGPA:
Year of Passing:
University:
Skillset Ratings (rate each out of 5):
Azure Data Factory (ADF)
Azure Databricks
Azure Synapse
PySpark
SQL
CI/CD
Data pipelines / data ingestion
Streaming / event-based tech
GitHub
Data profiling / data analytics
Job Title: Engineer
Work Location: Bhubaneswar, Odisha
Skills Required: Microsoft Azure, Databricks, Azure Data Factory
Experience Range in Required Skills: 6-8 years
Job Description:
- Proficiency in PySpark, SQL, and Azure services (ADF, Databricks, Synapse).
- Designing and implementing data ingestion pipelines from multiple sources using Azure Databricks.
- Developing scalable, reusable frameworks for ingesting data sets.
- Integrating the end-to-end data pipeline to take data from source systems to target data repositories, ensuring the quality and consistency of data is always maintained.
- Working with event-based / streaming technologies to ingest and process data.
- Working with other members of the project team to support delivery of additional project components (API interfaces, search).
- Evaluating the performance and applicability of multiple tools against customer requirements.
- Knowledge of deployment frameworks such as CI/CD and the GitHub check-in process.
- Ability to perform data analytics, data analysis, and data profiling.
- Good communication skills.
Essential Skills: PySpark, SQL, Azure Data Factory, Azure Databricks, Azure Synapse, data ingestion pipelines, event-based / streaming technologies, CI/CD, GitHub, data profiling and analytics.