|ABOUT THE COMPANY
The client is a space-to-cloud analytics company that owns and operates the largest multi-purpose constellation of satellites. Their proprietary data and algorithms provide the most advanced maritime, aviation, and weather tracking in the world. In addition to the constellation, their data infrastructure includes a global ground station network and 24/7 operations that provide real-time coverage of every point on Earth.
Their aviation software team is developing API-based SaaS products that enable their customers to get real-time and historical insights into aviation activity. This includes current and historical aircraft positions, flight information, and higher-level data derived using analytics pipelines.
The software team moves fast and is tightly integrated cross-functionally with the aviation business functions: product management, product marketing, sales, and customer experience. Together we essentially form a 10-person start-up within a larger organization. This gives us the best of both worlds: we are nimble yet have access to the resources of a larger organization, such as SREs, security engineers, and platform tooling.
The software team is involved across requirements definition, internal data ingestion, third-party data acquisition and ingestion, and building and operating the data processing and analytics pipelines.
We are looking for someone with:
· Past experience working as part of a small team to design and deploy web applications, services, and data streaming/processing systems
· Designing and implementing robust and scalable APIs
· Developing and deploying distributed applications on AWS or GCP
· Professional experience working in at least one interpreted language
· Professional experience working in at least one compiled language
· Enjoyment of working as part of a cross-functional, customer-centric team, combined with the ability to take on and complete tasks independently
· Experience writing code in Python, Java, and Go -- the three languages the team uses today
· Strong writing skills and the ability to communicate and present arguments in documents and on Slack
· Experience with stream processing systems like Flink or Beam
· Familiarity with tools and techniques in distributed systems and handling large volumes of streaming data on the order of billions of events per day: queues, RPCs, serialization, data versioning, and exactly-once/at-most-once/at-least-once message semantics
· Experience with infrastructure including AWS, Kubernetes, CI/CD pipelines, and debugging operational application issues
· 5+ years of practical programming experience
· In-depth knowledge of API design principles and best practices, caching strategies, and designing for resiliency and scalability
· Strong relational and non-relational database skills
If you are interested, please submit your CV.