Pyspark+Scala | Codersbrain
Full-time
Posted on August 14, 2025
Job Description
Scala / Apache Spark Developer
Company Overview
Company details not provided.
Job Summary
The Scala / Apache Spark Developer will be responsible for designing, building, and maintaining Scala-based applications. The role demands strong proficiency in Apache Spark and a solid understanding of its architecture, contributing to the company's data processing and analytics initiatives.
Responsibilities
- Design, create, and maintain Scala-based applications, ensuring they meet organizational needs.
- Develop efficient and robust applications utilizing Apache Spark.
- Collaborate with data teams to optimize performance and scalability of applications.
- Implement and manage data workflows and ETL processes within the Hadoop ecosystem.
- Utilize SQL for querying and managing data effectively within the applications.
- Craft and execute shell scripts to automate processes and enhance productivity.
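To give candidates a feel for the automation work described above, here is a minimal shell sketch of a daily Spark job wrapper. The job name, class, jar path, and `spark-submit` options are illustrative assumptions, not details from this posting.

```shell
#!/usr/bin/env bash
# Illustrative sketch only: the entry-point class, jar path, and Spark
# options below are hypothetical, not requirements from this posting.
set -euo pipefail

# Process yesterday's partition by default; accept an override for backfills.
RUN_DATE="${1:-$(date -d 'yesterday' +%F)}"

SPARK_ARGS=(
  --master yarn
  --deploy-mode cluster
  --conf spark.sql.shuffle.partitions=200
  --class com.example.DailyLoad   # hypothetical entry point
  /opt/jobs/daily-load.jar        # hypothetical artifact path
  --run-date "$RUN_DATE"
)

# Dry run: print the command a scheduler (e.g. Control-M) would execute.
echo "spark-submit ${SPARK_ARGS[*]}"
```

A scheduler such as Control-M would typically invoke a wrapper like this per partition date, which is why the date override parameter matters for backfills.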
Qualifications
- Education: Bachelor's degree in Computer Science, Information Technology, or a related field.
- Experience: Minimum of 6 years of relevant experience in Scala and Spark development.
- Technical Skills:
  - In-depth knowledge of Apache Spark and its architecture.
  - Strong experience developing applications in Scala.
  - Proficiency in SQL and the Hadoop ecosystem, specifically Hive.
  - Familiarity with shell scripting and automation techniques.
Preferred Skills
- Knowledge of Control-M for job scheduling and Tableau for data visualization is advantageous.
Experience
- At least 6 years of hands-on Scala and Apache Spark development, with a strong grasp of data processing architectures.
Environment
Work location and arrangements are not specified.
Salary
Salary information not provided.
Growth Opportunities
Opportunities for advancement within the company are not specified.
Benefits
Benefits information not provided.