Data Engineer | Codersbrain
Contractual
Posted on April 25, 2025
Job Description
Job Summary
The Data Engineer will play a key role in designing, building, and managing scalable data pipelines and architectures to support the organization’s data-driven initiatives. The role focuses on developing and optimizing data flows, ensuring data integrity, and enabling business intelligence and analytics capabilities. This is a contractual position with an immediate start; the Data Engineer will contribute to ongoing projects in a fast-paced, collaborative environment.
Responsibilities
- Design, implement, and maintain robust data pipelines for data ingestion, transformation, and loading.
- Develop and optimize SQL queries and database structures to ensure efficient data storage and retrieval.
- Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and deliver solutions.
- Manage and monitor data workflows, ensuring data quality, consistency, and accuracy.
- Troubleshoot and resolve pipeline and database issues to minimize downtime.
- Document data processes, workflows, and best practices for knowledge sharing and compliance.
Qualifications
- Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related field.
- Minimum of 5 years of professional experience in data engineering or related roles.
- Advanced proficiency in Python programming for data processing and automation.
- Strong expertise in SQL and relational database management (e.g., MySQL, PostgreSQL, MS SQL Server, or similar).
- Experience with ETL (Extract, Transform, Load) tools and techniques.
- Knowledge of data modeling, warehousing concepts, and data architecture.
- Excellent problem-solving skills and attention to detail.
- Strong communication and teamwork abilities.
Preferred Skills
- Experience with cloud data platforms (e.g., AWS, Google Cloud Platform, Azure).
- Familiarity with big data technologies (e.g., Apache Spark, Hadoop).
- Experience with data orchestration tools (e.g., Apache Airflow, Luigi).
- Knowledge of data security and privacy best practices.
- Exposure to DevOps principles and CI/CD pipelines for data solutions.
Experience
- Minimum 5 years of relevant experience in data engineering, database development, or data warehousing.
- Prior experience working with large data sets and complex data integration projects is highly desirable.
Environment
- Work location: Mumbai, Bangalore, Pune, Hyderabad, or Noida (as per candidate’s preference and project needs).
- Engagement type: Contractual.
- Typical setting: Modern office environment; some projects may allow remote or hybrid work options depending on client requirements.
- Collaborative and fast-paced atmosphere, working with cross-functional teams.