Senior DevOps Engineer

Cape Analytics

Software Engineering
Remote · Palo Alto, CA, USA
Posted on Thursday, January 27, 2022
CAPE Analytics is the leading provider of geospatial property intelligence. CAPE provides instant property insights for millions of residential and commercial buildings by analyzing high-resolution imagery, property records, and novel data sources using computer vision and machine learning. With a mission to better understand and protect the built environment, CAPE provides property stakeholders with risk-predictive property attributes that are more timely, accurate, and objective than on-site inspections. Composed of insurance, real estate, and data experts, CAPE is backed by leading venture capital firms and insurance carriers.
Since our founding in 2014, CAPE Analytics has used machine learning and computer vision to pioneer a new form of property information, built specifically for the organizations that finance, protect, and invest in our homes and businesses. Our 50+ (and rapidly growing!) clients across insurance and real estate are leading a digital transformation to secure properties and livelihoods in the face of complex trends in housing and climate.
As a Senior DevOps Engineer on CAPE’s platform team, you’ll be a core contributor to our push to build the next generation of CAPE’s products. Working with machine learning, platform, and data engineers, you’ll integrate novel machine-learning models into our APIs, build out dynamically scalable systems, and deliver features and products that drive impact for our clients.
CAPE’s insurance solutions have been adopted by leading carriers across the U.S., Canada, and Australia, but we are just getting started. Over the past six years, we’ve constructed an analytics platform purpose-built for deep learning. On the heels of our recent $44 million Series C financing, we’re growing rapidly. In CAPE’s next phase, we’re setting out to solve a larger share of the problem, leveraging a radically expanded array of input data sources and advanced machine learning technologies.
CAPE leverages all available tools and technologies to build a best-in-class tech stack, which gives us the flexibility of fast deployments along with the stability to support aggressive SLAs for critical-path client APIs and applications. We build our models using PyTorch and TensorFlow, and we use Python, Spark, and Postgres across our cloud infrastructure.
Our DevOps stack includes Kubernetes, Argo, Helm, Jenkins, Terraform, and Python scripting to deploy compute, storage, and SQL resources to the cloud.


  • Onboard with CAPE’s engineering team to learn our tech stack and software development process, and start contributing to our codebase.
  • Get to know the engineers, data scientists, and machine learning researchers you’ll work with through 1:1s and by sitting in on team and project meetings.
  • Start learning our current software and data architecture, including how we manage, update, and deploy systems while maintaining stability and speed.
  • Prepare deployments for microservices and tune for efficiency and reliability.
  • Understand the security mechanisms employed to protect customer data.


  • Design and build automation to deliver microservices to the cloud; these microservices support our insurance and real-estate products.
  • Participate in technical architecture reviews, and gain a deeper understanding of our backend systems.
  • Learn CAPE’s near-term and long-term product strategy and understand our technology roadmap.
  • Develop monitoring and alerting solutions for potential capacity issues.
  • Provide security assessments for CAPE’s production systems and propose mitigations.
  • Shadow DevOps and software engineers during on-call rotations and learn how to maintain system stability and respond to high-priority issues.


  • Understand the subtleties of high availability cloud deployments of microservices and learn to tune deployments for speed and cost.
  • Lead development and execution of automation on critical technical projects, working with a cross-functional team to plan and execute against product goals.
  • Develop cost-cutting measures for our cloud architecture: analyze costs using console tools and Athena queries, and build dashboards to inform management of usage and spend.
  • Participate in prioritization of security and development automation tasks.


  • 5-10 years of relevant experience
  • Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent work experience)
  • Demonstrated success in designing cloud-based infrastructure to support microservices connecting to databases and file systems
  • Experience with cloud platforms (e.g., AWS, Azure, GCP)
  • Demonstrated success in automating deployment to Kubernetes clusters with tools like Terraform and Helm
  • Proficiency in scripting and automation is highly desirable
  • Ability to work independently and as part of a cross-functional team
  • Ability to analyze architecture and make performance improvements
  • Excellent communication skills: able to communicate clearly and concisely
  • Ability to travel 1-2 times annually for company/team events
You will join a growing team of software and DevOps engineers with years of experience building and shipping product-focused software across many industries. At CAPE, our software engineers work daily with machine learning engineers and data scientists to build the databases, APIs, and applications that are the backbone of CAPE's infrastructure. We tackle difficult engineering challenges and focus on delivering impact for our coworkers and clients each day.
We believe:
  • Talent is critical, but best when tempered with humility
  • Self-motivation leads to the best outcomes
  • Open, direct communication is a sign of respect
  • Teamwork drives success
  • Having fun together is an important part of the job
View our CCPA policy here
***CAPE Analytics is an E-Verify participant.***