Consulting FHIR API Architect and Engineer - GCP

CereCore

Nashville, TN

JOB DETAILS
LOCATION
Nashville, TN
POSTED
18 days ago

Classification: Contract-to-hire

Contract Length: 12 Months


Position Summary

The Senior Staff API Engineer will be part of the team in Nashville, TN, and serves as a primary development resource for designing, coding, testing, implementing, documenting, and maintaining NextGen solutions for GCP Cloud enterprise data initiatives. The role requires working closely with data teams, frequently in a matrixed environment as part of a broader project team, and calls for self-starters who are proficient problem solvers capable of bringing clarity to complex situations. The culture of the organization places an emphasis on teamwork, so social and interpersonal skills are as important as technical capability. Because GCP technology and practice are emerging and fast-evolving, the position requires staying well-informed of technological advancements and being proficient at putting new innovations into effective practice.

 

In addition, this candidate will have a history of increasing responsibility on a small, multi-role team. This position requires a candidate who can analyze business requirements and design, build, test, and implement solutions with minimal supervision. The candidate will have a track record of participation in successful projects in a fast-paced, mixed (consultant and employee) team environment, and must be willing to mentor other developers to prepare them to assume these responsibilities.

 

As a Senior Staff API Engineer, you will collaborate closely with all team members to create a modular, scalable solution that addresses current needs while also serving as a foundation for future success. The position is critical to building the team’s API engineering practices in test-driven development, continuous integration, and automated deployment, and is a hands-on role that actively coaches the team to solve complex problems.


Responsibilities

  • Work with API engineers, data architects, data scientists, and other internal stakeholders to understand product requirements and then design, build, and monitor data platforms and pipelines that meet today's requirements but can gracefully scale.
  • Implement automated workflows that lower manual/operational costs, define and uphold SLAs for timely delivery of data, and move the company closer to democratizing data.
  • Enable a self-service data architecture supporting query exploration, dashboards, data catalog, and rich data discovery.
  • Promote a collaborative team environment that prioritizes effective communication, team member growth, and success of the team over success of the individual.
  • Design and create APIs that accelerate the time from idea to insight.
  • Adhere to and support API best practices, processes, and standards.
  • Produce high quality, modular, reusable code that incorporates best practices and serves as an example for less experienced engineers.
  • Help promote and support data security best practices that align with industry standards and regulatory and legal requirements.
  • Help mentor team members on complex data projects and on following the Agile process.
  • Help lead data analysis efforts and propose solutions to data and data-architecture problems.
  • Help lead implementation of unit and integration tests, and promote and conduct performance testing where appropriate.
  • Be a leader in the HCA data community. Evangelize API engineering best practices and standards, participate in or present at community events, and encourage the continual growth and development of others.
  • Be curious. Be growth minded. Encourage and enable this in others.
  • Demonstrate professional and personal maturity through self-leadership.
  • Build productive and healthy relationships within the department and other teams to foster growth of our culture, our people, and our platforms.
  • Practice and adhere to the “Code of Conduct” philosophy and “Mission and Value Statement.”
  • Perform other duties as assigned.
  • Responsible for building and supporting a GCP based ecosystem designed for enterprise-wide analysis of structured, semi-structured, and unstructured data.
  • Work independently, and complete tasks on-schedule by exercising strong judgment and problem-solving skills.
  • Analyze requirements, design AI/ML based solutions, and integrate those solutions for customer environments.
  • Effectively prioritize workload to meet deadlines and work objectives.
  • Work in an environment with rapidly changing business requirements and priorities.
  • Share knowledge and experience to contribute to the growth of overall team capabilities.
  • Actively participate in technical group discussions and adopt modern technologies that improve development and operations.

 

Requirements

  • Strong understanding of best practices and standards for GCP Data process design and implementation.
  • 2+ years of hands-on experience with the GCP platform, including experience with many of the following components:
    • Postman, Bruno, Dynatrace
    • API versioning, caching, and pagination
    • Cloud Run, GKE
    • Bigtable, Cloud SQL, Cloud Spanner
    • BigQuery, Cloud Function
    • CI/CD, Cloud Logging
    • GitHub
    • FHIR Format
  • 4+ years of hands-on experience with many of the following components:
    • API Development
    • Apigee
    • Python FastAPI Framework
    • Streaming, Kafka
    • SQL, JSON, Avro, Parquet
    • Java, Python, or Scala
  • Certifications (a plus, but not required): GCP Cloud Professional Data Engineer
  • Bachelor's degree in computer science, related technical field, or equivalent experience
  • Master's degree in computer science or related field
  • 7+ years of experience as an API Engineer
  • 3+ years of experience in Healthcare
  • 10+ years of experience in Information Technology
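To illustrate the kind of work the FHIR and pagination requirements above refer to (this sketch is not part of the posting, and the endpoint URL and resource data are hypothetical), a FHIR R4 "searchset" Bundle with paging links can be assembled in plain Python:

```python
import json


def fhir_search_bundle(patients, base_url, page, page_size):
    """Build a FHIR R4 'searchset' Bundle with pagination links.

    `patients` is a list of FHIR Patient resources (dicts) and
    `base_url` is a hypothetical search endpoint; illustrative only.
    """
    start = (page - 1) * page_size
    chunk = patients[start:start + page_size]
    # 'self' link always present; 'next' only when more results remain.
    links = [{"relation": "self",
              "url": f"{base_url}?_count={page_size}&page={page}"}]
    if start + page_size < len(patients):
        links.append({"relation": "next",
                      "url": f"{base_url}?_count={page_size}&page={page + 1}"})
    return {
        "resourceType": "Bundle",
        "type": "searchset",
        "total": len(patients),
        "link": links,
        "entry": [{"resource": r} for r in chunk],
    }


patients = [{"resourceType": "Patient", "id": str(i)} for i in range(5)]
bundle = fhir_search_bundle(
    patients, "https://example.org/fhir/Patient", page=1, page_size=2)
print(json.dumps(bundle, indent=2))
```

In a production stack the same Bundle-building logic would typically sit behind a framework such as the listed Python FastAPI framework, with `_count` and page parameters taken from the query string.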

About the Company

CereCore