Data Engineer II (Snowflake, Python, SQL) | Codersbrain
Full-time
Posted on September 13, 2025
Job Description
Snowflake Data Engineer
Company Overview
(Company details not specified)
Job Summary
We are seeking a highly skilled Data Engineer with strong expertise in Snowflake, Python, and SQL to design, build, and optimize scalable data pipelines and analytics solutions. The role involves developing robust ETL/ELT workflows, implementing performance optimization strategies, and ensuring high data quality and governance standards in a cloud data warehouse environment.
Responsibilities
- Design, develop, and maintain scalable data pipelines in Snowflake to support enterprise data needs.
- Implement ETL/ELT workflows using Python and SQL for structured and semi-structured data ingestion (a minimal sketch follows this list).
- Write optimized SQL queries, stored procedures, and user-defined functions within Snowflake.
- Perform query tuning, clustering, caching, and warehouse optimization in Snowflake for performance improvement.
- Integrate Snowflake with cloud storage services (AWS S3, Azure Blob Storage, Google Cloud Storage) for data ingestion and transformation.
- Build data quality and validation frameworks using Python and SQL to ensure accuracy and completeness.
- Automate workflows and job scheduling with tools like Airflow, DBT, or custom Python scripts.
- Collaborate with BI teams, analysts, and stakeholders to design data models supporting reporting and analytics.
- Implement role-based access control, masking policies, and governance best practices in Snowflake.
- Troubleshoot production issues, monitor pipelines, and optimize costs in the Snowflake environment.
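
To illustrate the kind of pipeline work listed above, here is a minimal sketch (not part of the original posting) of a single ELT step built with the snowflake-connector-python library: it copies staged JSON files into a raw table and runs a basic data-quality check. The stage, table, column, and connection parameters are hypothetical placeholders.

    # Minimal ELT sketch using snowflake-connector-python.
    # All object names, credentials, and the external stage are hypothetical placeholders.
    # raw.orders is assumed to be a single-column table: (payload VARIANT).
    import os
    import snowflake.connector

    COPY_SQL = """
        COPY INTO raw.orders
        FROM @raw.s3_orders_stage            -- external stage over an S3 bucket (assumed)
        FILE_FORMAT = (TYPE = 'JSON')
        ON_ERROR = 'ABORT_STATEMENT'
    """

    VALIDATE_SQL = "SELECT COUNT(*) FROM raw.orders WHERE payload:order_id IS NULL"

    def run_load() -> None:
        conn = snowflake.connector.connect(
            account=os.environ["SNOWFLAKE_ACCOUNT"],
            user=os.environ["SNOWFLAKE_USER"],
            password=os.environ["SNOWFLAKE_PASSWORD"],
            warehouse="LOAD_WH",
            database="ANALYTICS",
            schema="RAW",
        )
        try:
            cur = conn.cursor()
            cur.execute(COPY_SQL)            # ingest staged files into the raw table
            cur.execute(VALIDATE_SQL)        # basic data-quality gate on the key field
            null_keys = cur.fetchone()[0]
            if null_keys:
                raise ValueError(f"{null_keys} rows loaded without an order_id")
        finally:
            conn.close()

    if __name__ == "__main__":
        run_load()

In practice a script like this would typically run as an Airflow or dbt job rather than standalone, with credentials supplied by the orchestrator's secrets backend.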
Qualifications
- Proven hands-on experience with Snowflake Data Warehouse (2+ years).
- Strong Python programming experience for data engineering and automation.
- Advanced proficiency in SQL for complex queries, transformations, and performance tuning.
- Experience with ETL/ELT tools, orchestration frameworks (Airflow/DBT), and version control (Git).
- Knowledge of data modeling concepts, dimensional modeling, and schema design in Snowflake.
- Familiarity with semi-structured data formats (JSON, Parquet, Avro) and Snowflake features (Streams, Tasks, Snowpipe); a short illustrative sketch follows this list.
- Exposure to cloud platforms (AWS, Azure, or GCP) and storage integration with Snowflake.
- Strong problem-solving and analytical skills with the ability to optimize data pipelines.
- Education: Bachelor’s or Master’s degree in Computer Science, Information Technology, Data Engineering, or a related field.
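
As a second illustrative sketch (again an assumption-laden example, not part of the posting), the snippet below uses the same Python connector to set up a Snowflake Stream and a scheduled Task that incrementally flatten the JSON loaded above into an analytics table. All object names, including the target table analytics.order_lines, are hypothetical.

    # Sketch: incremental processing with a Snowflake Stream + Task, issued from Python.
    # Object names are hypothetical; connection setup is the same as in the earlier sketch.
    STREAM_DDL = "CREATE STREAM IF NOT EXISTS raw.orders_stream ON TABLE raw.orders"

    TASK_DDL = """
        CREATE TASK IF NOT EXISTS raw.flatten_orders_task
            WAREHOUSE = LOAD_WH
            SCHEDULE = '5 MINUTE'
        WHEN SYSTEM$STREAM_HAS_DATA('raw.orders_stream')
        AS
            INSERT INTO analytics.order_lines (order_id, sku, quantity)
            SELECT
                s.payload:order_id::STRING,
                line.value:sku::STRING,
                line.value:qty::NUMBER
            FROM raw.orders_stream s,
                 LATERAL FLATTEN(INPUT => s.payload:line_items) line
            WHERE s.METADATA$ACTION = 'INSERT'
    """

    def deploy_incremental_job(conn) -> None:
        cur = conn.cursor()
        cur.execute(STREAM_DDL)   # change-tracking stream on the raw table
        cur.execute(TASK_DDL)     # scheduled task that consumes the stream
        cur.execute("ALTER TASK raw.flatten_orders_task RESUME")  # tasks are created suspended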
Preferred Skills
- Experience migrating legacy data warehouses (Teradata, Oracle, SQL Server) to Snowflake.
- Familiarity with CI/CD pipelines for data engineering solutions.
- Exposure to BI tools (Power BI, Tableau, Looker) connected to Snowflake.
- Knowledge of data governance, security, and compliance practices.
Experience
- Experience Required: 5+ years in data engineering roles with a focus on Snowflake and related technologies.
Environment
- Location: Chennai or Bangalore; hybrid work arrangement.
Salary
(Salary details not specified)
Growth Opportunities
(Growth opportunities not specified)
Benefits
(Benefits not specified)