Overview: We have several openings at multiple levels (Junior through Senior). In this position you will design and develop high-performance, secure, and automated pipelines for deploying Esri technology in the cloud and on-premises. You will partner closely with software developers and cloud engineers to accomplish this.
Responsibilities:
- Design, implement, and maintain high-speed CI/CD pipelines using GitHub Actions/Azure DevOps for cloud-native workflows and Jenkins for complex, cross-platform on-premises build automation
- Architect and optimize Docker environments to ensure consistent, isolated, and reproducible build/test runners across the entire development lifecycle
- Work with the team to help lead the design of the next-generation Python package creation process; manage complex Conda recipes and environments to ensure seamless distribution of the ArcGIS API and its dependencies
- Build and manage the underlying build infrastructure (runners, nodes, and registries) using automated provisioning to ensure high availability and scalability of the release system
- Implement automated security gatekeeping, including the generation of SBOMs (Software Bills of Materials), vulnerability scanning, and license compliance for all Python and Rust artifacts
- Work with the team to help design and implement monitoring frameworks to track build performance; assist in automated load and performance testing of the Python stack to identify regressions before release
- Identify and eliminate manual bottlenecks by transitioning legacy workflows into modern, 'everything-as-code' processes
- Oversee the lifecycle of third-party dependencies, ensuring that the supply chain is secure, from source code to the final distributed package
Requirements:
- US citizenship with an active or current TS/SCI clearance
- 1+ years of relevant experience (including internships, open-source work, or academic projects) in DevOps, CI/CD, or platform automation
- Hands-on experience designing and maintaining CI/CD pipelines (GitHub Actions, GitLab CI, Azure Pipelines, or similar)
- Experience working with YAML-based pipeline configurations and infrastructure definitions
- Exposure to automation and scripting (Python and/or JavaScript preferred)
- Experience with infrastructure automation tools such as Ansible, AWS CDK, Terraform, or similar IaC frameworks
- Familiarity with Linux environments and containerized workflows
- Ability to write and optimize Dockerfiles and support container-based deployments
- Understanding of artifact management, dependency management, and secure build practices
- Familiarity with SBOM tools and artifact signing concepts
- Exposure to cloud platforms (AWS and/or Azure), especially compute services and storage services (e.g., S3, Blob Storage)
- Bachelor’s degree in Computer Science, Engineering, DevOps, or related technical field (or equivalent practical experience)
Preferred Qualifications:
- Master's degree in Computer Science, Mathematics, Geographic Information Systems (GIS), or a STEM-related field
- Experience automating backend systems and operational workflows
- Familiarity with release engineering and deployment strategies (blue/green, canary, etc.)
- Experience integrating security scanning into pipelines (SAST/DAST, container scanning)
- Exposure to Kubernetes-based delivery environments
- Exposure to DevSecOps practices and secure development pipelines
- CompTIA Security+ certification
- Experience working in Agile or Scrum development environments