Snowflake Data Cloud Architect

Talent Software Services

New York City, NY

JOB DETAILS
SALARY
$154,000–$180,000 Per Year
JOB TYPE
Full-time, Employee
SKILLS
Application Programming Interface (API), Architectural Services, Artificial Intelligence (AI), Automation, Benchmarking, Business Intelligence Software, Cloud Architecture, Cloud Computing, Communication Skills, Computer Science, Cost Control, Cryptography, Data Quality, Documentation, Flask, GitHub, Information/Data Security (InfoSec), Interoperability, Investment Management, Management of Information Systems/Technology (MIS), Microsoft Windows Azure, Performance Analysis, Performance Management, Power BI, Python Programming/Scripting Language, Regulatory Reports, Risk, SQL (Structured Query Language), Snowflake Schema, Software Development, Systems Reliability, Tableau, Team Player, Technical Leadership, Technical Strategy, Technical/Engineering Design, Warehousing, Web Programming, Workflow Analysis
LOCATION
New York City, NY
POSTED
4 days ago

Title: Snowflake Data Cloud Architect
Location: New York, NY

As a Snowflake Data Cloud Architect, you will design and implement scalable, secure, and high-performance data solutions on Snowflake to support our Investment Management business goals. You will lead the architecture for data platform integration, enabling seamless communication between upstream systems (custodians, market data providers) and downstream systems (risk, performance, reporting) using event-driven approaches. This role requires strategic thinking, technical leadership, and deep expertise in Snowflake and modern data engineering practices.

The Contributions You'll Make:

  • Architect and Implement Snowflake Solutions:
    • Design and optimize Snowflake-based data architectures for portfolio, security master, transactions, benchmarks, and performance data.
    • Implement advanced Snowflake features (Streams, Tasks, Snowpipe, Snowpark) for real-time and batch processing.
  • Enable Event-Driven Integration:
    • Build ingestion pipelines leveraging Kafka, Azure Event Hub, or similar technologies for market data and transactional feeds.
    • Design workflows for upstream and downstream system interoperability.
  • Data Governance and Compliance:
    • Implement RBAC, data masking, and encryption aligned with enterprise data policy.
    • Ensure lineage and observability for regulatory reporting and audit.
  • Technical Leadership:
    • Act as a trusted advisor for architectural decisions and future-state roadmaps.
    • Prepare technical specifications and design documentation.
  • Innovation and Best Practices:
    • Advocate for data quality through automation, validation frameworks, and rigorous testing.
    • Recommend new technologies to improve system performance and reliability.

Minimum Knowledge and Experience:

  • Bachelor's degree in Computer Science, MIS, Engineering, or related field.
  • 10 years of experience in data architecture/engineering, with 3 years focused on Snowflake.
  • Strong background in financial data domains (IBOR/ABOR, transactions, market data, reference data).
  • Hands-on experience with:
    • Snowflake architecture (warehouses, Streams, Tasks, Snowpark).
    • SQL and Python; experience with dbt.
    • Event-driven technologies (Kafka / Azure Event Hub).
    • Web API development (Flask / FastAPI).
  • Experience implementing data security measures and compliance frameworks.
  • Experience building data solutions with AI tools such as GitHub Copilot.
  • Experience with data quality tools such as dbt tests, Monte Carlo, or Soda.
  • Excellent communication skills with ability to collaborate across business and technology teams.

Nice to Have:

  • SnowPro Core / Advanced Architect certification.
  • Experience with Aladdin Data Cloud, Bloomberg DL, or similar investment data platforms.
  • Experience with Data Application development (Streamlit, Dash).
  • Familiarity with BI tools (Power BI, Tableau).
  • Knowledge of FinOps for cost optimization.
  • Exposure to Databricks, PySpark, and advanced analytics workflows.

About the Company

Talent Software Services