Role: Snowflake Developer
Location: Poland, Remote
Employment Type: Contract
Work Mode: Remote
Primary Skillset:
• Data Modeling & ETL: Design, develop, and optimize data models and ETL processes using Snowflake for efficient data storage and analytics.
• Design and implement end-to-end ETL pipelines for loading data from various sources into Snowflake.
• Utilize Snowflake’s built-in features such as tasks, streams, and Snowpipe to automate the ETL process for both continuous and batch data loads (see the stream-and-task sketch after this list).
• Implement data transformation logic using SQL, Snowflake stored procedures (SQL and Python), and ETL tools to ensure the integrity, accuracy, and consistency of data.
• Optimize data loads and transformations for scalability and performance using Snowflake’s micro-partitioning and clustering features.
• Optimize and tune Snowflake queries, data storage, and virtual warehouses for performance and efficiency.
• Migration & Integration: Experience with data migration, particularly from SQL Server to Snowflake, and integrating Snowflake with other data sources.
• Collaborate with data architects and business stakeholders to understand data requirements.
• Develop and maintain documentation for data processes and solutions.
• Ensure data quality, governance, and security within the Snowflake environment.
• Troubleshoot and resolve issues related to the Snowflake data warehouse.
• Stay updated with the latest Snowflake features and best practices.
• API Integrations: Integrate Snowflake with external APIs for reliable data extraction and transfer (a sketch follows this list).
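For illustration, a minimal sketch of the stream-and-task pattern referenced above, issued from Python with Snowpark. The connection parameters and all object names (raw_orders, orders_clean, raw_orders_stream, load_orders_task, etl_wh) are hypothetical placeholders, not details of this role's environment.

```python
# Minimal sketch: automate an incremental load with a stream and a task.
# All object names below are hypothetical placeholders.
from snowflake.snowpark import Session

# Connection parameters are assumed to come from a config or secrets store.
session = Session.builder.configs({
    "account": "<account>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "etl_wh",
    "database": "analytics",
    "schema": "staging",
}).create()

# A stream records row-level changes (CDC) on the raw table.
session.sql("CREATE OR REPLACE STREAM raw_orders_stream ON TABLE raw_orders").collect()

# A scheduled task consumes the stream; SYSTEM$STREAM_HAS_DATA skips
# runs when there is nothing new to load.
session.sql("""
    CREATE OR REPLACE TASK load_orders_task
      WAREHOUSE = etl_wh
      SCHEDULE = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('raw_orders_stream')
    AS
      INSERT INTO orders_clean
      SELECT order_id, customer_id, amount
      FROM raw_orders_stream
      WHERE METADATA$ACTION = 'INSERT'
""").collect()

# Tasks are created suspended; resume to start the schedule.
session.sql("ALTER TASK load_orders_task RESUME").collect()
```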
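Similarly, a sketch of the API-integration bullet: fetching JSON records from a hypothetical REST endpoint and appending them to a hypothetical landing table (api_events). The record field names are assumptions for illustration.

```python
# Sketch: extract records from an external REST API and land them in Snowflake.
# The endpoint, field names, and target table (api_events) are hypothetical.
import requests
from snowflake.snowpark import Session

def load_api_events(session: Session, url: str) -> int:
    """Fetch JSON records from `url` and append them to api_events."""
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    records = response.json()  # assumed: a list of flat JSON objects

    rows = [(r["id"], r["event_type"], r["created_at"]) for r in records]
    df = session.create_dataframe(rows, schema=["id", "event_type", "created_at"])

    # Append to the landing table; a downstream stream/task picks it up.
    df.write.mode("append").save_as_table("api_events")
    return len(rows)
```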
DevOps Integration (Secondary):
• Azure DevOps Experience: Proficiency with Azure DevOps for:
    • Source control management
    • Automating deployments
    • Deploying Snowflake scripts across different environments
    • Managing ETL pipeline deployments
• Proficiency in DevOps tools and best practices, with experience deploying Snowflake scripts and ETL tool services.
Qualifications:
• Proven experience as a Snowflake Developer, with hands-on delivery of Snowflake data warehousing solutions.
• Data Architecture: Strong understanding of Snowflake platform features, including micro-partitioning, file processing from AWS S3, and data quality practices.
• Expertise in writing and optimizing SQL queries, including complex queries, CTEs, and stored procedures (using JavaScript within Snowflake; see the sketch after this list).
• Solid experience optimizing Snowflake data models and data loads.
• Strong knowledge of Python for data engineering tasks and automation.
• Background in data migration, specifically from SQL Server to Snowflake.
• Familiarity with designing and managing data pipelines and ETL processes.
• Excellent communication skills and the ability to collaborate effectively with cross-functional teams.
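As an illustration of the stored-procedure qualification above, a minimal sketch that creates and calls a JavaScript stored procedure from Python. The procedure name (purge_old_rows) and target table (orders_clean) are hypothetical.

```python
# Sketch: a JavaScript stored procedure created and invoked from Python.
# Object names (purge_old_rows, orders_clean) are hypothetical.
from snowflake.snowpark import Session

def create_purge_procedure(session: Session) -> None:
    # The procedure body is JavaScript executed inside Snowflake; the
    # FLOAT argument is exposed to JavaScript as the uppercase name DAYS.
    session.sql("""
        CREATE OR REPLACE PROCEDURE purge_old_rows(days FLOAT)
        RETURNS STRING
        LANGUAGE JAVASCRIPT
        AS
        $$
          var stmt = snowflake.createStatement({
            sqlText: "DELETE FROM orders_clean " +
                     "WHERE created_at < DATEADD('day', ?, CURRENT_DATE())",
            binds: [-DAYS]
          });
          stmt.execute();
          return "purged rows older than " + DAYS + " days";
        $$
    """).collect()

# Usage: session.sql("CALL purge_old_rows(90)").collect()
```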
Preferred Skills:
• Experience using ETL tools such as DBT, SnapLogic, Talend, or Informatica to build ETL pipelines and to test and deploy transformation models; DBT or SnapLogic is preferred.
• Snowpark for Python/Scala: Strong knowledge of Snowpark for building scalable data pipelines and implementing advanced analytics workloads (see the sketch after this list).
• Hands-on experience with AWS services and DevOps tools (e.g., CloudFormation, Terraform) for deploying AWS resources such as Lambda, SQS, and SNS.
• Familiarity with legacy Microsoft SQL platforms and integrating these with modern cloud-based data solutions.
• Strong problem-solving skills and the ability to thrive in a collaborative and fast-paced environment.
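Finally, a minimal sketch of a Snowpark-for-Python pipeline: a lazy DataFrame transformation whose execution is pushed down to Snowflake. Table names (raw_orders, daily_revenue) are hypothetical.

```python
# Sketch: a Snowpark DataFrame pipeline; nothing runs until save_as_table.
# Table names (raw_orders, daily_revenue) are hypothetical.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_, to_date

def build_daily_revenue(session: Session) -> None:
    orders = session.table("raw_orders")  # lazy reference, no data pulled

    daily = (
        orders
        .filter(col("status") == "COMPLETE")
        .with_column("order_date", to_date(col("created_at")))
        .group_by("order_date")
        .agg(sum_(col("amount")).alias("revenue"))
    )

    # The whole plan compiles to SQL and executes inside Snowflake.
    daily.write.mode("overwrite").save_as_table("daily_revenue")
```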