Data Engineer III – Streaming & Kafka – Remote

ConsultNet

Birmingham, AL (Remote)

JOB DETAILS
SALARY
$70–$77 per hour
SKILLS
Amazon Simple Storage Service (S3), Apache Kafka, Atlassian JIRA, Banking Regulations, Banking Services, Big Data, Business Support, Cloudera, Data Lake, Data Management, Data Migration, Ecosystems, GitHub, Mainframe Computer, Microsoft SQL Server, Microsoft SharePoint, Oracle Applications, Oracle Database, Python Programming/Scripting Language, Relational Databases (RDBMS), SQL (Structured Query Language), Snowflake Schema
LOCATION
Birmingham, AL
POSTED
3 days ago

Data Engineer III – Streaming & Kafka
Remote (working CST hours)
9-month contract
Pay Rate: $70.00–$77.00 per hour

This is a highly specialized, high-priority 9-month contract role supporting critical enterprise data initiatives within a large banking organization. The role focuses on real-time data ingestion, event streaming, and modernization of enterprise data pipelines into Snowflake. This position is mission-critical and supports some of the bank's most visible and regulated data programs.

Key Business Initiatives Supported

  • Open banking regulatory data ingestion (Kafka-based streaming required)
  • Enterprise R2 deposit data initiative (top bank priority)
  • Snowflake migration and data lake modernization efforts

Core Responsibilities

  • Design, build, and support real-time and batch ingestion pipelines into Snowflake
  • Work extensively with Kafka-based event streaming systems
  • Support enterprise ingestion from multiple source types:
    • Event streams (Kafka)
    • RDBMS systems (Oracle, SQL Server)
    • Flat files and mainframe extracts
  • Build and maintain ingestion pipelines using Python and SQL
  • Support complex data transformation and movement across systems
  • Work with tools such as Precisely and Click for orchestration
  • Load data into Snowflake landing zones and downstream structures
  • Ensure reliability and scalability of streaming ingestion pipelines

Required Skills

  • 5+ years of relevant experience
  • Strong, hands-on Apache Kafka experience (mandatory)
  • Advanced SQL (complex joins, optimization, tuning)
  • Python development experience
  • Snowflake experience
  • Experience with big data ecosystems (Cloudera or similar)
  • Strong understanding of event-driven architecture and streaming ingestion

Bonus Skills

  • Experience with ingestion tools (Precisely, Click)
  • AWS S3 exposure
  • Experience with enterprise-scale data pipelines
  • Familiarity with GitHub-based development workflows
  • Understanding of SharePoint and Jira data ingestion (nice to have)
  • Experience working in highly regulated or banking environments

About the Company

ConsultNet