Enterprise Data Services Manager

National Digital Trust Company (In Organization)

*, NY


Enterprise Data Services Manager

(Remote Candidates will be considered)  

 

Our Story and Our Purpose  

 

National Digital Trust Company (In Organization) has received conditional approval from the Office of the Comptroller of the Currency to open as a federally chartered trust bank to provide a broad range of digital asset services.

 

We are building a specialized financial institution addressing the growing demand for digital asset services. Our primary business will focus on digital asset custody, providing secure, efficient custodial and fiduciary services for a variety of digital assets.  You will work with foundational systems and processes to help shape our operating model and influence how a new category of financial infrastructure comes to market.

 

We are looking for builders who handle complexity with confidence and tackle ambitious opportunities while keeping pace with this rapidly evolving industry.  Let's build this together!

 

Our Principles 

 

Greatness is a mindset, not an accomplishment. Mediocrity is unacceptable. Excellence is contagious. We hire people because we believe in their greatness. Now is the time to prove us right.

 

Responsibility comes with the territory. Everyone is an owner, which means we share a common vision and mutual accountability. We act in line with our strategic objectives and the trust our customers place in us. We believe there is no such thing as "not my problem." Taking this level of ownership not only drives our collective success but also offers the potential for significant reward.

Innovation and adaptation are in our DNA. We are living through the most dramatic and rapid period of technological change in the history of humankind. Those who stay ahead will thrive; those who don't, won't. We innovate intelligently and thrive on overcoming challenges to get (at least) a little better every day and ensure our continued growth and success.

Team first. We are reliable teammates working together toward extraordinary success through honesty and accountability. We believe collaboration knows no hierarchy, and we focus on what matters.  We work toward consensus, but when necessary, we disagree and commit. We know that winners win.

 

Role Overview

We are seeking a highly seasoned Data Architect/Manager with a minimum of 15 years of experience in banking or related financial services to support the architecture, development, and optimization of our data infrastructure within a highly regulated financial environment. The ideal candidate is a "SQL Master" with extensive experience in the SQL and PostgreSQL ecosystem, capable of designing high-performance database schemas while ensuring strict adherence to global banking and federal regulatory standards.

In this role, you will act as the primary authority on data modeling, statistical data engineering, and data mastering. You will lead the implementation of data lineage and quality controls required for BCBS 239 compliance, ensure alignment with ISO 20022 messaging standards, and maintain process excellence following ISO 9000 and CMM/CMMI frameworks. Furthermore, you will spearhead the data operations and management strategy for AI/ML and Generative AI, specifically focusing on robust data testing and high-performance RAG (Retrieval-Augmented Generation) architectures.

 

Objectives 

  • Data Strategy for AI: Design a data layer for LLM-powered applications, ensuring that RAG systems have access to high-quality, governed, real-time context from PostgreSQL.
  • Advanced Data Testing & Observability: Design and implement a "Test-First" data culture. Develop automated frameworks to validate data integrity, freshness, and distribution at every stage of the ELT/ML pipeline.
  • Standardized Financial Modeling: Define database schemas that natively support ISO 20022 structures, ensuring seamless interoperability for AI-driven financial analysis.
  • Process Excellence (CMMI): Define and enforce data management workflows that meet CMM Level 3/4 standards, specifically applying them to the fast-moving AI/ML development lifecycle.
  • Regulatory Data Governance: Ensure AI and ML data pipelines adhere to BCBS 239 and FFIEC AIO standards, focusing on model interpretability and data provenance.
  • Data Mastering & Quality: Design MDM solutions using R, SQL, and other tools to build automated validation suites that identify anomalies before they reach downstream ML models or RAG systems.
  • AI-Assisted Capabilities: Develop internal tools to accelerate developer productivity while maintaining strict data governance and auditability.
  • Vector Search: Explore and leverage pgvector, optimizing high-dimensional vector similarity searches in Postgres (a minimal sketch follows this list).
  • Open Source: Contribute to PostgreSQL open-source projects or AI/ML data frameworks.
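
To make the pgvector objective concrete, below is a minimal sketch of semantic retrieval for a RAG context window in PostgreSQL. The rag_documents table, the 1536-dimension embedding, and the HNSW/cosine choices are illustrative assumptions rather than a prescribed design.

    -- Enable pgvector and define a hypothetical document store for RAG.
    CREATE EXTENSION IF NOT EXISTS vector;

    CREATE TABLE IF NOT EXISTS rag_documents (
        doc_id    bigint GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
        content   text NOT NULL,
        embedding vector(1536)   -- dimension depends on the embedding model
    );

    -- Approximate-nearest-neighbor index; HNSW is one of pgvector's options.
    CREATE INDEX IF NOT EXISTS rag_documents_embedding_idx
        ON rag_documents USING hnsw (embedding vector_cosine_ops);

    -- Retrieve the five closest chunks by cosine distance for a query
    -- embedding that the application binds as $1.
    SELECT doc_id, content, embedding <=> $1::vector AS cosine_distance
    FROM   rag_documents
    ORDER  BY embedding <=> $1::vector
    LIMIT  5;

Note that the operator class on the index (vector_cosine_ops) must match the distance operator used in the ORDER BY for the planner to use the index.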

 

What you bring to our company 

Experience & Core Expertise

  • 15+ Years in Data Management & Operations: A proven track record of building and maintaining enterprise-grade data pipelines and large-scale distributed systems, specifically within the Financial Services or Fintech sectors.
  • Regulatory & Financial Standards Expertise:

BCBS 239: Deep understanding of the Basel Committee’s principles for effective risk data aggregation and risk reporting.

ISO 20022: Expert knowledge of the ISO 20022 universal financial industry message scheme.

FFIEC AIO: Practical experience aligning data architecture and infrastructure operations with the FFIEC booklet on Architecture, Infrastructure, and Operations.

  • Quality & Process Maturity:

ISO 9000: Experience implementing and maintaining Data Quality Management Systems (DQMS).

CMM / CMMI: Experience operating within Level 3+ organizations where defined, standardized, and integrated processes are mandatory.

  • SQL Mastery: Expert-level SQL skills, including advanced window functions, recursive queries, CTEs, and complex joins. Ability to write highly efficient, readable, and maintainable code for high-concurrency environments.
  • PostgreSQL Specialization: At least 8-10 years of deep, hands-on experience specifically with PostgreSQL, including internal mechanisms (MVCC, WAL, VACUUM), partitioning, and advanced indexing (pgvector, GIN, GiST).
  • Data Integrity & State Validation: Able to design and run checks such as row verification (querying the database to confirm a new record exists with the exact values sent in the API payload), data truncation (checking whether long strings sent to the API were silently cut off by a database column limit), and default values (ensuring fields not sent in the API, such as created_at or is_active, were populated correctly by DB triggers or application logic); a sketch of these checks follows.
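
By way of illustration, those three checks might look like the following in PostgreSQL. The customers table, its columns, and the varchar(255) limit are hypothetical stand-ins for whatever the API actually writes; the $1/$2/$3 parameters are bound by the test harness.

    -- Row verification: the record created via the API exists with the exact
    -- values that were sent in the payload (hypothetical table and columns).
    SELECT count(*) = 1 AS row_verified
    FROM   customers
    WHERE  customer_ref = $1      -- identifier returned by the API
      AND  legal_name   = $2      -- values from the request payload
      AND  country_code = $3;

    -- Data truncation: flag values that exactly hit the column's assumed
    -- varchar(255) limit, which often indicates silent truncation.
    SELECT customer_ref, length(legal_name) AS name_len
    FROM   customers
    WHERE  length(legal_name) = 255;

    -- Default values: fields omitted from the API payload were populated by
    -- DB triggers or application logic rather than left NULL.
    SELECT customer_ref
    FROM   customers
    WHERE  customer_ref = $1
      AND  (created_at IS NULL OR is_active IS NULL);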

 

  • Data Structures: Proficiency in JSON and XML, including navigating nested objects and arrays to extract specific data (see the jsonb sketch after this item). Hands-on experience with OAuth 2.0, JWT (JSON Web Tokens), API keys, and bearer tokens. Comfortable using JavaScript to dynamically generate data (e.g., timestamps or hashed signatures) before a request is sent, and using natural language to generate test suites, debug failing requests, and create documentation. Familiarity with REST, JSON, the OpenAPI Specification, and CI/CD integration.
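
As one example of navigating nested structures, PostgreSQL's jsonb operators can walk an API response directly in SQL. The payload shape below (a loosely ISO 20022-flavored structure) is invented purely for illustration.

    -- Extract values from nested objects and unnest an array with jsonb.
    WITH api_response AS (
        SELECT '{
            "payment": {
                "amount":  {"value": "150.00", "currency": "USD"},
                "parties": [
                    {"role": "debtor",   "name": "Alice"},
                    {"role": "creditor", "name": "Bob"}
                ]
            }
        }'::jsonb AS body
    )
    SELECT body #>> '{payment,amount,value}'    AS amount,    -- path into nested object
           body #>> '{payment,amount,currency}' AS currency,
           party ->> 'name'                     AS creditor
    FROM   api_response,
           jsonb_array_elements(body #> '{payment,parties}') AS party  -- unnest the array
    WHERE  party ->> 'role' = 'creditor';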

 

  • API Test Automation: Experience leveraging visual, low-code builders to map out complex logic, conditional branching (if/else), and loops between multiple services; using tools to generate client code in languages like Python, Java, or Node.js to jumpstart implementation in the codebase or for testing; using CSV or JSON files to test hundreds of scenarios (e.g., valid users, expired accounts, unauthorized countries) in a single click; and designing automated tests that run hourly or daily to ensure a change in one microservice hasn't accidentally broken another.

 

  • Postman Library (External Modules): Able to use tools like Postman-to-SQL or custom Node.js bridges to run SQL queries directly from the Postman "Tests" tab, and to build basic comparison logic: (1) Postman hits the API and saves the response as a variable; (2) a script triggers a DB query; (3) a test script compares the JSON response to the SQL result set.
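
For the database side of that comparison logic, one option (a sketch, assuming the same hypothetical customers table as above) is to have the bridged query serialize the row as JSON, so the Postman test script can diff it directly against the API response.

    -- Serialize the row the API should have written as a single JSON value,
    -- making the diff against the API response a straight JSON comparison.
    SELECT row_to_json(t) AS db_state
    FROM (
        SELECT customer_ref, legal_name, country_code, is_active
        FROM   customers
        WHERE  customer_ref = $1   -- identifier captured from the API response
    ) AS t;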

 

Advanced Technical Skills

  • ML & LLM Data Preparation: Expert at designing data pipelines for machine learning (MLOps) and large language models (LLMOps). Experience with feature stores, data labeling workflows, and vector database integration.
  • RAG Architecture: Deep understanding of Retrieval-Augmented Generation (RAG) patterns. Proficiency in optimizing PostgreSQL (using pgvector) for semantic search, hybrid search (keyword + vector), and high-fidelity context retrieval (a hybrid-search sketch follows this list).
  • Automated Data Testing: Mastery of data testing frameworks (e.g., Great Expectations, dbt tests, or custom R-based suites). Experience implementing circuit breakers in pipelines, data contract testing, and regression testing for large-scale migrations.
  • Statistical Analysis (R): Proficiency in R or other modern languages for advanced data profiling, statistical validation of data migrations, and building automated data quality frameworks to meet regulatory audit requirements.
  • Generative AI & Prompt Preparation: Advanced ability to design and refine prompts for LLMs to automate SQL generation, translate natural language to complex Postgres queries, and perform automated schema documentation.
  • Data Modeling & Mastering: Expert knowledge of OLTP vs. OLAP modeling and Data Vault 2.0. Proven experience in Data Mastering, including entity resolution and "Golden Record" management.
  • Change Data Capture: Experience with CDC tools like Debezium and Kafka for real-time RAG updates.
  • Infrastructure as Code: Familiarity with Terraform for compliant, repeatable database and AI infrastructure provisioning.
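
For the RAG architecture point above, a hybrid keyword-plus-vector query in PostgreSQL might look like the sketch below. It assumes pgvector plus a hypothetical rag_documents table carrying both a tsvector column and an embedding column; the 0.3/0.7 score blend is purely illustrative.

    -- Hybrid retrieval: full-text filter plus a weighted blend of keyword
    -- rank and vector similarity ($1 = query text, $2 = query embedding).
    SELECT doc_id,
           content,
           ts_rank(content_tsv, plainto_tsquery('english', $1)) AS keyword_score,
           1 - (embedding <=> $2::vector)                       AS vector_score
    FROM   rag_documents
    WHERE  content_tsv @@ plainto_tsquery('english', $1)        -- keyword filter
    ORDER  BY 0.3 * ts_rank(content_tsv, plainto_tsquery('english', $1))
            + 0.7 * (1 - (embedding <=> $2::vector)) DESC       -- weighted blend
    LIMIT  5;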

 Certifications

  • Certified Data Management Professional (CDMP) - Master level preferred.
  • SQL certification: PostgreSQL Associate/Professional or EDB Certified Professional preferred.
  • Oracle Certified Professional: SQL Developer or equivalent high-level SQL mastery credential.
  • AWS/Google/Azure Professional Data Engineer or Machine Learning Engineer certifications.
  • Six Sigma or ISO 9001 Internal Auditor certification (Preferred).
  • CMMI Associate or similar process maturity credentials (Preferred).

We promote diversity of thought, culture, background, and experience. We are an equal opportunity employer, and employment at our company is based solely on one's merit and qualifications directly related to professional competence. We do not discriminate based on race, creed, color, ancestry, religion, gender, sexual orientation, gender identity, national origin, age, disability, genetic information, military or veteran status, or any other characteristics protected by law. 

Featured benefits 

Employer-provided: Medical, Dental, and Vision insurance, 401(k), life and disability insurance.
