Senior Data Engineer (8–10 Yrs) | Codersbrain
Job Description
Required Skills: AWS, PySpark
Experience: 8 - 10 Yrs
Notice Period: Immediate
Location: Chennai, Coimbatore, Bangalore, Pune (Hybrid)
About the Role
We are seeking a highly skilled Senior Data Engineer with strong expertise in AWS and
PySpark. The ideal candidate will have 8–10 years of experience building and optimizing
data pipelines, ensuring data quality, and enabling data-driven decision-making across the
organization. This is a senior-level role requiring both hands-on technical skills and the
ability to mentor and guide junior team members.
Key Responsibilities
• Design, develop, and maintain scalable, reliable, and high-performance data
pipelines and ETL processes.
• Work extensively with AWS cloud services to build secure and optimized data
solutions.
• Leverage PySpark for large-scale data processing and transformation.
• Collaborate with data architects, analysts, and business stakeholders to define
data requirements and implement effective solutions.
• Ensure data quality, governance, and compliance across all data systems.
• Optimize data workflows for performance, scalability, and cost efficiency.
• Troubleshoot and resolve data pipeline and system issues.
• Mentor junior engineers and contribute to best practices, code reviews, and
knowledge sharing within the team.
Primary Skills
• Strong expertise in Data Engineering concepts, architecture, and frameworks.
• Hands-on experience with AWS services (S3, Redshift, Glue, EMR, Lambda, etc.).
• Proficiency in PySpark and distributed data processing.
• Solid understanding of data modeling, warehousing, and ETL design.
• Experience working with structured and unstructured data.
• Strong problem-solving skills with a focus on performance and scalability.
• Excellent communication and collaboration skills.
Preferred Qualifications
• Experience with Databricks or similar cloud-based big data platforms.
• Knowledge of SQL and database optimization techniques.
• Exposure to CI/CD pipelines and DevOps practices for data engineering.
• Familiarity with Medallion Architecture and Data Lakehouse concepts.