Senior Golang Engineer

90PoE
€70,000 - €90,000
Remote - Europe 🇪🇺

Maritime Excellence through Digital Intelligence:

90PoE’s goal is to revolutionise shipping by creating a comprehensive suite of software for the maritime industry. Our journey begins now. Over the next couple of years, our teams will build more than 30 products from the ground up. This includes everything from global vessel tracking to vessel performance analysis, crew optimisation and so much more.

The Role

90PoE’s mission is big. It won’t be easy, but with the support and determination of our stakeholders and the brilliant people we have hired, we can achieve anything. Right now, we're seeking an experienced senior data engineer to strengthen the Vessel Performance Stream. This is a crucial role in one of the most innovative streams in the company. You will work closely with the other data engineers and data scientists to make effective architectural and technical decisions for high-throughput data streaming systems. This is an incredible opportunity to help innovate the maritime industry and increase vessel and fleet performance with the help of AI and real-time metrics collected on board.

Technology

Our platform ingests data directly from our hardware running on vessels, as well as from various third-party sources, over technologies such as Kafka, MQTT, REST and gRPC, and normalises it for processing by our bespoke micro-batch engine.
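
For a flavour of what that ingestion step can look like, here is a minimal, illustrative Go sketch of a consumer that reads raw vessel telemetry from Kafka and normalises it into a common shape. It assumes the open-source segmentio/kafka-go client; the broker address, topic name and Reading struct are placeholders, not our actual schema.

```go
package main

import (
	"context"
	"encoding/json"
	"log"
	"time"

	"github.com/segmentio/kafka-go"
)

// Reading is a hypothetical normalised shape for vessel sensor data;
// the real schema is not shown here.
type Reading struct {
	VesselID  string    `json:"vessel_id"`
	Metric    string    `json:"metric"`
	Value     float64   `json:"value"`
	Timestamp time.Time `json:"timestamp"`
}

func main() {
	// Broker and topic are placeholders.
	r := kafka.NewReader(kafka.ReaderConfig{
		Brokers: []string{"localhost:9092"},
		GroupID: "vessel-normaliser",
		Topic:   "vessel.raw-telemetry",
	})
	defer r.Close()

	for {
		msg, err := r.ReadMessage(context.Background())
		if err != nil {
			log.Fatalf("read: %v", err)
		}

		var rd Reading
		if err := json.Unmarshal(msg.Value, &rd); err != nil {
			log.Printf("skipping malformed message at offset %d: %v", msg.Offset, err)
			continue
		}
		// Hand the normalised reading over to the micro-batch engine here.
		log.Printf("normalised reading: %+v", rd)
	}
}
```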

Platform overview

Our tech stack consists of multiple client applications that communicate over GraphQL with microservice containers orchestrated by Kubernetes. Most of our services are written in Golang, with stream processing in Golang and Java. They use gRPC for communication, achieve high scalability through an Apache Kafka-based event-driven architecture, and persist data to a mix of RDBMS and NoSQL stores including Postgres, MongoDB, Cassandra, Redis, S3 and Elasticsearch. We use CI/CD and GitOps to deploy to production multiple times per week.
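
As a rough illustration of the event-driven side of this stack, the sketch below shows a Go service publishing a domain event to Kafka, keyed by vessel so that events for one vessel stay ordered on a single partition. It again assumes the segmentio/kafka-go client; the event name, topic and field values are hypothetical and not taken from our codebase.

```go
package main

import (
	"context"
	"encoding/json"
	"log"
	"time"

	"github.com/segmentio/kafka-go"
)

// PerformanceCalculated is a hypothetical domain event emitted after a
// vessel performance calculation completes.
type PerformanceCalculated struct {
	VesselID     string    `json:"vessel_id"`
	FuelPerNM    float64   `json:"fuel_per_nm"`
	CalculatedAt time.Time `json:"calculated_at"`
}

func main() {
	// Broker address and topic are placeholders.
	w := &kafka.Writer{
		Addr:     kafka.TCP("localhost:9092"),
		Topic:    "vessel.performance-events",
		Balancer: &kafka.LeastBytes{},
	}
	defer w.Close()

	evt := PerformanceCalculated{
		VesselID:     "vessel-001", // placeholder identifier
		FuelPerNM:    0.42,
		CalculatedAt: time.Now().UTC(),
	}
	payload, err := json.Marshal(evt)
	if err != nil {
		log.Fatalf("marshal: %v", err)
	}

	// Keying by vessel ID keeps all events for one vessel on the same partition.
	if err := w.WriteMessages(context.Background(), kafka.Message{
		Key:   []byte(evt.VesselID),
		Value: payload,
	}); err != nil {
		log.Fatalf("publish: %v", err)
	}
	log.Println("event published")
}
```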

What’s in it for you

90PoE is a growing company, championing real change in the maritime industry. This is an exciting and challenging opportunity to apply cutting-edge technology to revolutionise an iconic industry.

You will demonstrate our values, strive for excellence, engage and motivate those around you and be accountable for your contribution to the team’s priorities.

  • Disrupt a century-old industry in a startup environment
  • Fix global-scale problems with cloud technologies and AI
  • Opportunity to grow and develop your core skills
  • Work with a diverse, multicultural team in an agile environment
  • Opportunity to work with the latest cutting-edge technologies
  • A variety of knowledge-sharing and self-development opportunities
  • Be paid for speaking at public events
  • Competitive salary and bonuses
  • Possibility to work remotely from the UK and Europe

Requirements

  • In-depth technical knowledge of backend languages and technologies: Golang
  • Proficient in designing and building event-driven systems
  • Practical experience with Postgres, Cassandra and Redis
  • Able to design clean software systems
  • Excellent communication skills
  • Able to identify ways to improve data reliability, efficiency and quality
  • Experience in basic data analysis: SQL/CQL
  • Proven experience in delivering complex systems to production
  • Deep understanding of database technologies and designing data-intensive applications
  • Familiar with Java programs and concepts
  • Experience in data analysis techniques
  • Comfortable with Bash scripting and Docker
  • Familiar with SOLID, DDD and hexagonal architecture
  • Able to optimise data pipelines, time-series and other systems to perform at scale
  • Able to deploy systems with zero downtime

With great power comes great responsibility:

  • Be accountable for data pipeline design, building and deployment
  • Support the team during product releases to production, ensuring smooth, successful deployments

It would be great if you have:

  • Practical experience with Kafka and Kafka Streams Java libraries
  • Engineering experience to understand the underpinning physics and mathematics of pipeline calculations
  • Experience with Spark/Flink
  • AWS stack experience
  • Ability to perform basic DevOps tasks
  • Ability to understand Python software: our data scientists use Jupyter Notebooks
  • Understanding of statistics
