Senior Data Engineer (DevOps Automation) - Alameda, CA (Hybrid)

Georgia Tek Systems

Alameda, CA

JOB DETAILS
SKILLS
Amazon Simple Storage Service (S3), Amazon Web Services (AWS), Automation, Communication Skills, Computer Science, Continuous Deployment/Delivery, Continuous Integration, Corporate Policies, Data Management, Data Pipelines, DevOps, Emerging Technology, GitHub, Industry Standards, Information/Data Security (InfoSec), Needs Assessment, Performance Tuning/Optimization, Problem Solving Skills, Python Programming/Scripting Language, Regulatory Compliance, Scripting (Scripting Languages), Software Engineering, Source Code/Configuration Management (SCM), Team Player, Terraform, Time Management
LOCATION
Alameda, CA
POSTED
30+ days ago
Senior Data Engineer (DevOps Automation)
Location: Alameda, CA (Hybrid)

Duration: Contract
Rate: DOE

US Citizens, Green Card holders, EAD (H4, L2), E3, and TN visa holders preferred. No third-party corp-to-corp arrangements accepted for this position.


Key Responsibilities:
  • Design, implement, and maintain robust, scalable, and automated data engineering solutions.
  • Develop and manage data pipelines using AWS services like Glue, S3, and Redshift.
  • Use Terraform for infrastructure as code (IaC) to ensure consistent and reliable environment provisioning.
  • Maintain and enhance CI/CD pipelines for data engineering applications, ensuring seamless integration and deployment.
  • Collaborate with data engineers to understand data needs and implement solutions that meet those requirements.
  • Develop scripts and tools in Python to automate various data engineering tasks.
  • Manage source control using GitHub, including branching, merging, and version control.
  • Ensure data security and compliance with industry standards and company policies.
  • Monitor, troubleshoot, and optimize data systems for performance and efficiency.
  • Stay current with the latest trends and technologies in DevOps and data engineering.

Qualifications:
  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
  • 9+ years of experience in a DevOps role with a focus on data engineering infrastructure.
  • Strong experience with AWS services, especially Glue, S3, and Redshift.
  • Proficient in Terraform for infrastructure automation.
  • Extensive experience with Python, specifically for automating data engineering tasks.
  • Solid understanding of data pipeline and workflow management tools.
  • Expertise in source control management using GitHub.
  • Excellent problem-solving skills and ability to work under tight deadlines.
  • Strong communication and teamwork skills.

About the Company

Georgia Tek Systems