What Temporal is looking for in applicants
We are expanding our team, and you can join us from anywhere in the United States.

At Temporal, we are on a mission to remove the complexity of developing reliable software for the cloud. If you want to solve hard distributed-systems problems and have a passion for open source software and building a strong developer community, come join us in our mission.

Temporal offers an entirely new way to build scalable and reliable applications. It enables developers to focus on writing important business logic rather than managing state or worrying about the underlying infrastructure. The Temporal platform is already trusted by top-tier companies as a core technology in their mission-critical systems. Our active open source community of developers, who are also our users, provides us with real-time feedback and contributions. We're backed by top VC firms and have a team of professionals from start-ups and larger companies such as Microsoft, Google, Amazon, Uber, Apple, and Cisco.

Senior Data Engineer

The Hosted Service Engineering group at Temporal is looking for an engineer with a demonstrated track record of developing horizontally scalable, resilient distributed systems that perform under load in a production environment. The primary focus is on building a Billing Platform with ETL workflows and dimensional data models.
This is a unique opportunity: this person gets to contribute to Temporal OSS and our cloud services while building this platform.

What You'll Do

* Build a reliable and scalable Billing Platform leveraging Big Data technologies.
* Design, implement, and operate services that will process hundreds of millions of records daily.
* Collaborate with product managers, finance/revenue teams, and other engineering teams to build new features and products for business needs.
* Provide mentorship to junior engineers.

What You Bring to Us

* 6+ years of experience developing globally distributed data services platforms.
* 5+ years of coding experience (Go, Java, Python, or other applicable languages).
* Experience developing and operating large-scale data pipelines, with knowledge of technologies such as Spark, Kafka, Elasticsearch, Cassandra, and SQL.
* Strong communication skills and a desire to make an impact and thrive in small, collaborative, energetic teams.
* Ability to operate services from conception through continuous 24x7 operation. Ideally, you have experience in a production DevOps/DataOps environment.

What Is Good to Have

* Hands-on experience with cloud technologies such as Amazon Web Services, Google Cloud, or Azure.
* Experience with Docker and Kubernetes.
* Knowledge of or experience building GDPR- and SOX-compliant systems.