ETL Developer | Codersbrain
Job Description
ETL Developer (Hybrid)
Job Summary
We are seeking an experienced ETL Developer with 5-7 years of hands-on experience in Azure-based Data Engineering. The ideal candidate specializes in developing scalable storage solutions, robust schema layers, and reliable data pipelines. This role is critical in integrating data from core platforms into centralized warehouses or data lakes while ensuring high code quality, adherence to security standards, and exceptional user experiences. Candidates should be available to join immediately or within a notice period of up to 15 days, and will work from one of our centers in Bangalore, Hyderabad, Chennai, or Gurgaon.
Responsibilities
- Design, Develop & Troubleshoot: Architect, build, and maintain scalable ETL/streaming pipelines using Azure Synapse, Azure Data Factory, and Apache Spark (see the illustrative sketch after this list).
- Data Integration: Integrate data from core platforms to centralized warehouses or data lakes ensuring efficient extraction, transformation, and loading processes.
- Database & Data Warehouse Management: Leverage expertise in relational databases, SQL, T-SQL, and data warehousing to design and optimize data storage solutions.
- Programming & Automation: Utilize programming languages such as Python or Java along with relevant data processing libraries to create robust data solutions.
- Best Practices & Code Quality: Adhere to rigorous coding standards, implement automated testing, and enforce engineering best practices.
- Security & Access Models: Develop and maintain secure systems and access models for managing highly sensitive data while ensuring compliance with organizational policies.
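For illustration only, the sketch below shows the general shape of a simple batch ETL step of the kind described in this list, written with PySpark (the Apache Spark API referenced above). The storage account, container names, columns, and file formats are hypothetical assumptions, not project specifics.

```python
# Minimal PySpark sketch of a batch extract-transform-load step.
# All paths, column names, and types below are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: read raw CSV landed from a core platform (hypothetical ADLS path).
raw = (
    spark.read.option("header", True)
    .csv("abfss://raw@examplelake.dfs.core.windows.net/orders/")
)

# Transform: deduplicate, enforce types, and drop unusable rows.
clean = (
    raw.dropDuplicates(["order_id"])
    .withColumn("order_date", F.to_date("order_date"))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .filter(F.col("amount").isNotNull())
)

# Load: write partitioned Parquet into the curated zone of the data lake.
(
    clean.write.mode("overwrite")
    .partitionBy("order_date")
    .parquet("abfss://curated@examplelake.dfs.core.windows.net/orders/")
)
```

In practice, a step like this would typically run as a Spark notebook or job orchestrated by an Azure Data Factory or Synapse pipeline.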
Qualifications
- Experience: 5-7 years of hands-on experience in Azure-based Data Engineering and ETL development.
- Technical Skills:
  - Proficient in Azure Synapse and Azure Data Factory.
  - Solid experience with data warehousing concepts, SQL, and T-SQL.
  - Expertise in distributed data frameworks such as Apache Spark.
- Programming: Proficiency in Python or Java, including familiarity with their respective data processing libraries.
- ETL Development: Proven experience in designing and troubleshooting ETL/streaming pipelines.
- Database Management: Strong background in working with relational databases such as Azure SQL Database and Amazon RDS.
- Security Focus: Well-versed in establishing secure systems and access models for sensitive data.
- Communication Skills: Strong cross-functional communication skills for requirements gathering and collaboration.
Preferred Skills
- Experience in architecting shared datasets and designing scalable storage solutions.
- Ability to innovate and create user-centric tools with a passion for delivering exceptional user experiences.
- Additional certifications or advanced training in Azure Data Engineering or related fields.
Experience
- Required: 5-7 years of hands-on experience in the field, with a proven track record in building and deploying ETL/streaming pipelines, data engineering, and secure system design.
Environment
- Work Setting: Hybrid work model.
- Location Options: Bangalore, Hyderabad, Chennai, or Gurgaon.
- Notice Period: Immediate to 15 days' notice required.