Data Architect | Codersbrain
Job Description
- Location: Remote
- Contract Duration: 3 Months
- Experience Required: 10+ years
Job Title: Data Architect – Data Migration & Compliance Platform Modernization
Role Summary:
We are seeking a highly experienced (10+ years), hands-on Data Architect to lead and deliver the end-to-end
data migration from Databricks and Teradata to BigQuery, including complete ETL/ELT pipeline design,
dataset migration, and platform modernization. The candidate will also be responsible for understanding the
existing audit compliance monitoring application and building a new data pipeline into the next-generation
audit platform, ensuring high data quality, governance, and traceability.
Key Responsibilities:
Data Migration & Platform Delivery
- Lead and execute the data migration strategy from Databricks and Teradata to Google BigQuery
- Analyze and migrate existing data models, views, datasets, and ETL pipelines
- Redesign schemas for optimized performance and compliance in BigQuery
- Identify and address data quality, format, lineage, and validation issues (a minimal reconciliation sketch follows this list)
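As one concrete illustration of the validation work above, here is a minimal sketch of a row-count reconciliation between a Teradata source table and its migrated BigQuery target, using the teradatasql driver and the google-cloud-bigquery client. All table names, hosts, and credentials are hypothetical placeholders, not details from this role.

```python
# Hedged sketch: row-count reconciliation between a Teradata source table
# and its migrated BigQuery target. Table names and connection parameters
# below are illustrative placeholders only.
import teradatasql                  # Teradata SQL driver for Python
from google.cloud import bigquery   # official BigQuery client library

SOURCE_TABLE = "finance_db.transactions"         # hypothetical Teradata table
TARGET_TABLE = "analytics.finance.transactions"  # hypothetical BigQuery table


def teradata_count() -> int:
    """Count rows in the source table on Teradata."""
    with teradatasql.connect(host="td-host", user="user", password="***") as con:
        with con.cursor() as cur:
            cur.execute(f"SELECT COUNT(*) FROM {SOURCE_TABLE}")
            return cur.fetchone()[0]


def bigquery_count() -> int:
    """Count rows in the migrated table on BigQuery."""
    client = bigquery.Client()  # uses default project credentials
    rows = client.query(f"SELECT COUNT(*) AS n FROM `{TARGET_TABLE}`").result()
    return next(iter(rows)).n


if __name__ == "__main__":
    src, tgt = teradata_count(), bigquery_count()
    status = "OK" if src == tgt else "MISMATCH"
    print(f"source={src} target={tgt} -> {status}")
```

In practice this would extend beyond counts to checksums, column-level profiling, and sampled record comparisons.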
Audit Compliance Application Transition
- Understand the current audit compliance monitoring application
- Identify data flows, lineage, metrics, alerts, and critical datasets used
- Redesign and build pipelines to feed data into the new audit compliance platform
- Ensure compliance with audit trails, data retention, and security policies
ETL/ELT Architecture
- Design and implement scalable and efficient ETL/ELT workflows using tools like Apache Airflow, dbt, Dataflow, or custom frameworks (see the DAG sketch after this list)
- Ensure reusability, observability, and resilience of data pipelines
- Work closely with DevOps for CI/CD of data pipelines
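To illustrate the kind of orchestration involved, below is a minimal sketch of an Airflow DAG (Airflow 2.x assumed) that runs a single daily ELT step in BigQuery via the BigQueryInsertJobOperator. The DAG id, schedule, and SQL transform are hypothetical examples, not prescribed by this role.

```python
# Hedged sketch of an Airflow DAG running a daily ELT step in BigQuery.
# DAG id, schedule, and the SQL transform are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_audit_elt",        # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # Airflow 2.4+ argument name
    catchup=False,
) as dag:
    load_audit_events = BigQueryInsertJobOperator(
        task_id="load_audit_events",
        configuration={
            "query": {
                # Hypothetical transform: stage raw events into a curated table.
                "query": """
                    INSERT INTO `analytics.audit.events_curated`
                    SELECT * FROM `analytics.audit.events_raw`
                    WHERE DATE(event_ts) = CURRENT_DATE()
                """,
                "useLegacySql": False,
            }
        },
    )
```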
Data Design & Governance
- Build logical and physical data models; define data contracts and standards
- Enable data quality rules, validation frameworks, and governance mechanisms
- Define partitioning, clustering, and cost optimization strategies for BigQuery (see the DDL sketch below)
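As a sketch of the partitioning and clustering work described above, the following hypothetical Python snippet issues BigQuery DDL that creates a date-partitioned, clustered table with a partition-expiration option for cost control. Dataset, table, and column names are placeholders.

```python
# Hedged sketch: date-partitioned, clustered BigQuery table created from
# Python. Dataset, table, and column names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()  # uses default project credentials

ddl = """
CREATE TABLE IF NOT EXISTS `analytics.audit.events_curated` (
  event_ts  TIMESTAMP,
  user_id   STRING,
  action    STRING,
  payload   JSON
)
PARTITION BY DATE(event_ts)                 -- prune scans to the queried days
CLUSTER BY user_id, action                  -- co-locate frequent filter columns
OPTIONS (partition_expiration_days = 365)   -- retention policy for cost control
"""

client.query(ddl).result()  # run the DDL statement and wait for completion
```

Partitioning on the primary date filter limits bytes scanned per query, and clustering on frequently filtered columns reduces cost further without changing the schema contract.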
Stakeholder Management & Leadership
- Collaborate with engineering, QA, product, data analysts, compliance, and audit teams
- Capture requirements, translate them into technical designs, and estimate workloads
- Lead and mentor a team of data engineers and assign responsibilities effectively
- Communicate status, risks, dependencies, and results to senior leadership
Required Skills & Experience:
Data Platforms:
- Deep hands-on expertise in BigQuery, Teradata, and Databricks (Delta Lake, Spark)
- Good understanding of Google Cloud Platform (GCP) services: GCS, Pub/Sub, Dataflow, Composer
Tools & Frameworks:
- Proficiency in SQL, Python, and orchestration tools (Airflow, dbt, etc.)
- Experience with data pipeline frameworks and batch and streaming architectures
- Familiarity with audit logging, compliance monitoring, and data security
Strategic & Functional:
- Strong experience in data architecture, data modelling (3NF, star, snowflake), and schema design
- Ability to reverse-engineer data flows and audit systems
- Proven experience in data migration and modernization programs
- Experience working with audit, compliance, or risk applications is a strong plus
Communication & Leadership:
- Experience working with cross-functional teams in Agile/DevOps environments
- Excellent skills in requirement analysis, effort estimation, and task allocation
- Ability to lead technically while remaining hands-on in design and delivery
Preferred Qualifications:
- 10+ years in data engineering, data architecture, or BI
- GCP certifications (Professional Data Engineer or Professional Cloud Architect)
- Background in audit, compliance, banking, or regulated environments
Deliverables Expected:
- Migration plan and roadmap for all datasets
- Target data models, data dictionaries, and lineage documentation
- ETL/ELT pipeline code and documentation
- Data validation test scripts
- Migration success criteria and sign-off reports
Soft Skills:
- Detail-oriented, organized, and analytical thinker
- Strong communication and presentation skills
- Proactive, adaptable, and able to thrive in a fast-moving environment