Lead Data Architect - Qualified Pipeline

Remote
Full Time
Experienced

Location: Remote (US-based)
Position type: TBD


Job Summary

Data Meaning is a front-runner in Business Intelligence and Data Analytics consulting, renowned for high-quality consulting services throughout the US and beyond. Our expertise lies in delivering tailored solutions in Business Intelligence, Data Warehousing, and Project Management. Our diverse, global team of consultants works fully remotely in a collaborative, inclusive, and innovation-driven culture. As part of our growth and upcoming initiatives, we are proactively searching for Lead Data Architects to support future data platform modernization initiatives across our client portfolio.

Position Summary

The Lead Data Architect will lead the architecture assessment, transition planning, and modernization of the client’s data platform. This role will be responsible for designing the future-state architecture, defining best practices for ELT pipelines, and guiding the migration of existing SQL-based transformations into a scalable dbt framework within a Snowflake environment.

The position will work closely with the client’s engineering teams and internal Data Meaning consultants, leading knowledge transfer sessions, architecture walkthroughs, and platform transition initiatives to ensure long-term scalability, reliability, and governance of the data platform.


Key Responsibilities:

  • Lead architecture assessment and documentation of the current data platform (Snowflake, Airflow, AWS)
  • Design the future-state data platform architecture following modern ELT and dbt best practices
  • Define standards for data modeling, pipeline architecture, transformation frameworks, governance, and data quality
  • Lead the migration of transformation logic into dbt models
  • Define architectural patterns for staging layers, intermediate layers, and data marts
  • Implement automated testing frameworks within dbt
  • Lead discovery sessions and technical workshops with the client’s data engineering teams
  • Document architecture, pipelines, and operational processes
  • Conduct architecture walkthrough sessions during knowledge transfer phases
  • Define standards for CI/CD, monitoring frameworks, and data quality validation
  • Establish schema management and governance best practices
  • Optimize Snowflake compute usage and platform performance
  • Design scalable orchestration strategies for Airflow pipelines
  • Improve platform reliability, monitoring, and operational observability

Required Skills & Qualifications:

  • Advanced to fluent English
  • 10+ years of experience in Data Engineering or Data Architecture
  • Strong experience designing and implementing modern data platforms
  • Expert-level experience with Snowflake
  • Strong experience implementing dbt transformation frameworks
  • Advanced SQL modeling and performance optimization
  • Strong experience with Apache Airflow orchestration
  • Strong Python programming skills
  • Experience designing ELT pipelines, data warehouse architectures, transformation frameworks, and large-scale data pipelines
  • Experience working in cloud environments (AWS preferred)

Nice to Have:

  • Snowflake certification
  • Experience leading data platform migrations or modernization initiatives
  • Experience implementing Medallion architecture patterns
  • Experience defining data governance frameworks
  • Experience working with distributed or global engineering teams
