Infrastructure Engineer (Python, Docker, Airflow) opening @ Sofar Ocean (San Francisco, CA)

Sofar is growing across all departments and we’ve got a great opportunity inside our Optimizer Team for someone who is passionate about all things platform, reliability, and infrastructure!

More details can be found below, and a direct link to apply is here: Infrastructure Engineer

The Position

We are looking for an experienced platform engineer who can help us build the next generation of weather forecasting infrastructure. Our weather forecast pipeline combines realtime sensor readings from our worldwide fleet of ocean buoys, publicly available weather data sources, and our own proprietary forecast model into a variety of data products suitable for visualization and scientific analysis. In this role you will help us grow the scope and scale of our weather models and pipelines as well as design, build, debug, and maintain highly available distributed systems for massive compute tasks.

In this role, you would work alongside our software engineering team building out APIs and frontend applications that use our data, as well as domain experts in ocean science and other data scientists performing analysis on weather data. You will quickly learn that “the forecast” is a lot more complicated than it seems at first glance; our team works with multiple forecast models that each produce gigabytes of data every hour, at every location on the globe. Most of the industry-standard tools and data formats are optimized for an era of supercomputers, not cloud computing. We are looking for the right person who is up to the challenge of designing and building systems that let us push the boundaries of what is possible with weather forecasting on cloud compute platforms, as well as maturing our existing platforms.

Currently our stack is built with Apache Airflow, Docker & Kubernetes, Postgres/PostGIS, and AWS S3.

About You

Requirements:

  • Passionate about building a product with a positive impact on the world
  • 3+ years professional experience as a software engineer or infrastructure/data engineer
  • Strong working knowledge of Python and cloud computing concepts
  • Experience with container systems such as Docker and data workflow systems such as Airflow
  • Willing to learn new tools, languages, and patterns as needed to build a great product
  • A solid communicator who enjoys collaborating with other engineers, designers, PMs, and scientists
  • Excited to be a part of a small but growing startup team

Bonus points:

  • Experience building applications on large-scale distributed computing infrastructure in a cloud environment
  • Experience with large-scale application deployment and management systems such as Kubernetes
  • Experience working with modeling and data science applications and teams

Apply Here


Thanks for sharing… looks like a great opportunity!

This probably resonates strongly with the community here. Curious whether you have investigated the Pangeo cloud-native stack at all (Zarr, Xarray, Dask, etc.).


Thanks for pointing that out. I’ll pass the details along to our engineering team and see if they’ve taken a look.
