
Senior Data Engineer | Codersbrain

Full-time
Posted on June 13, 2025

Job Description

Job Title: Senior Data Engineer
Location: Hyderabad
Experience: 6+ Years
Employment Type: Full-Time

Job Summary:

We are looking for a highly skilled Senior Data Engineer to join our Data Engineering team. You will play a key role in designing, implementing, and optimizing robust, scalable data solutions that drive business decisions for our clients. The role involves hands-on development of data pipelines, cloud data platforms, and analytics tooling using modern technologies.


Key Responsibilities:

  • Design and build reliable, scalable, and high-performance data pipelines to ingest, transform, and store data from various sources.

  • Develop cloud-based data infrastructure using platforms such as AWS, Azure, or Google Cloud Platform (GCP).

  • Optimize data processing and storage frameworks for cost efficiency and performance.

  • Ensure high standards for data quality, integrity, and governance across all systems.

  • Collaborate with cross-functional teams including data scientists, analysts, and product managers to translate requirements into technical solutions.

  • Troubleshoot and resolve issues with data pipelines and workflows, ensuring system reliability and availability.

  • Stay current with emerging trends and technologies in the big data and cloud ecosystems, and recommend improvements accordingly.


Required Qualifications:

  • Bachelor’s degree in Computer Science, Software Engineering, or a related field.

  • Minimum 6 years of professional experience in data engineering or a related discipline.

  • Proficiency in Python, Java, or Scala for data engineering tasks.

  • Strong expertise in SQL and hands-on experience with modern data warehouses (e.g., Snowflake, Redshift, BigQuery).

  • In-depth knowledge of big data technologies such as Hadoop, Spark, or Hive.

  • Practical experience with cloud-based data platforms such as AWS (e.g., Glue, EMR), Azure (e.g., Data Factory, Synapse), or GCP (e.g., Dataflow, BigQuery).

  • Excellent analytical, problem-solving, and communication skills.


Nice to Have:

  • Experience with containerization and orchestration tools such as Docker and Kubernetes.

  • Familiarity with CI/CD pipelines for data workflows.

  • Knowledge of data governance, security, and compliance best practices.
