Information/Data Architecture

Spectraforce Technologies Inc.

Seattle, WA

JOB DETAILS
JOB TYPE
Full-time, Employee
SKILLS
Analysis Skills, Aviation Industry, Best Practices, Business Intelligence, Change Management, Cisco Unity, Code Reviews, Communication Skills, Computer Science, Continuous Deployment/Delivery, Continuous Integration, Data Management, Data Modeling, Data Processing, Data Quality, DataArchitect Data Modeling Tool, Design Document, DevOps, Docker, Documentation, Error Handling, Hubs, Information Architecture, Information Technology & Information Systems, Maintain Compliance, Mathematics, Mentoring, Metadata, Microsoft Windows Azure, Network Routing, Performance Analysis, Process Modeling, Product Engineering, Quality Metrics, Quality Monitoring, SQL (Structured Query Language), Software Engineering, Systems Engineering, Technical Leadership, Traceability, Writing Skills
LOCATION
Seattle, WA
POSTED
Today

Title: Information/Data Architecture
Location: Seattle, WA
Duration: 12 months



Job Description:


Seeking a Senior Data Architect to join the Performance Training Analytics (PTA) application team, supporting the Boeing Global Services Training Solutions organization. This position is based out of Seattle, WA. Be part of a high-performing software engineering organization focused on transforming the aviation training industry through Competency Based Training and Assessment (CBTA) digital solutions.


In this role, you will leverage Unified Data Modeling (UDM) to design and govern data models, process layers, transformations, routing, and schema evolution for the data lake built on Azure Databricks and Microsoft Azure. You will map incoming and changing source data to the UDM, define and manage data transformation rules, and collaborate closely with distributed data engineering teams (including team members in India) to implement robust, scalable data pipelines and governance.



Position Responsibilities:

  • Design, document, and maintain the Unified Data Model (UDM) artifacts and mappings.
  • Model process layers in the data lake, defining transformation responsibilities and lineage across layers.
  • Define and enforce rules for data processing, including routing tables, validation rules, enrichment logic, and error-handling policies.
  • Author and maintain schema definitions, attribute dictionaries, and change management processes for schema evolution.
  • Translate business and source system requirements into data transformation specifications to be implemented in Azure Databricks and downstream systems.
  • Collaborate with data engineers to design performant transformations, partitioning, and storage strategies in the data lake.
  • Review and approve data pipeline designs, ensuring adherence to UDM, governance, and security policies.
  • Work with DevOps and engineering teams to operationalize CI/CD for Databricks notebooks, jobs, and infrastructure-as-code.
  • Provide technical leadership and mentorship to data engineering teams, including remote collaboration with engineers located in India; coordinate design, implementation, and delivery across time zones (may require off-hour work).
  • Establish data quality metrics, monitoring, and remediation guidance; ensure traceability and lineage from source to consumption.
  • Participate in architecture and design reviews, code reviews, and agile ceremonies; drive best practices for data modeling and transformation.
  • Communicate architecture decisions and trade-offs to stakeholders, product owners, and engineering teams.

Project/Day-to-Day Activities:

  • Designing, improving, communicating, and managing data and data models for the training analytics application.

Top 3-5 Technical/Software Skills needed to perform this role/job:

  • Unified Data Model (UDM)
  • Data lakehouse concepts
  • Business intelligence analytical understanding

Basic Qualifications (Required Skill/Experience):

  • 9+ years of experience in data architecture, data modeling, or related data engineering roles.
  • Proven experience designing and implementing Unified Data Models (UDM).
  • Strong expertise in Structured Query Language (SQL).
  • Strong expertise in data modeling across multiple process layers (raw/ingest, transformed, curated) and defining transformation logic.
  • Deep understanding of data governance, data lineage, metadata management, and data quality concepts.
  • Demonstrated experience with Databricks and building/architecting data lake solutions.
  • Experience defining schema evolution processes, new attribute definitions, and backward-compatible changes.
  • Experience authoring route tables, processing rules, and data routing/ingestion patterns.
  • Experience collaborating with geographically distributed engineering teams and willingness to work off hours to coordinate with teams in India.
  • Strong communication, documentation, and stakeholder engagement skills.

Education:


Technical bachelor's degree and typically 9 or more years' related work experience, or a master's degree with typically 7 or more years' related work experience, or a PhD with typically 4 or more years' related work experience, or an equivalent combination of education and experience. A technical degree is defined as any four-year degree, or greater, in a mathematics, scientific, or information technology field of study.



Preferred Qualifications (Desired Skills/Experience):

  • Prior experience in aviation, training analytics, or related operational data domains.
  • Experience with Azure Data Factory, Delta Lake, Unity Catalog, or equivalent data governance tools.
  • Familiarity with infrastructure-as-code and CI/CD practices for data pipelines (Terraform, Azure DevOps, GitOps).
  • Knowledge of streaming ingestion patterns, Kafka/Event Hubs, and near-real-time processing.
  • Experience with metadata platforms and data catalog tools (e.g., Purview, Alation).
  • Experience with containerization and orchestration (Docker, Kubernetes) is a plus.
  • Bachelor's or advanced degree in Computer Science, Information Systems, Engineering, or related field.

SPECTRAFORCE is an equal opportunity employer and does not discriminate against any employee or applicant for employment because of race, religion, color, sex, national origin, age, sexual orientation, gender identity, genetic information, disability or veteran status, or any other category protected by applicable federal, state, or local laws. Please contact Human Resources at nahr@spectraforce.com if you require reasonable accommodation.


About the Company

Spectraforce Technologies Inc.