What Temporal is looking for in applicants
We are expanding our team! All of our positions are open to candidates anywhere in the United States, and some roles are also open to candidates in select locations outside the U.S.
At Temporal, we are on a mission to remove the complexity of developing reliable software for the cloud. If you want to solve hard distributed-systems problems and have a passion for open source software and for building a strong developer community, come join us in our mission. Temporal enables developers to focus on writing important business logic rather than managing state or worrying about the underlying infrastructure. The Temporal platform is trusted by top-tier companies as a core technology in their mission-critical systems. Our active open source community of developers, who are also our users, provides us with real-time feedback and contributions. We're backed by top VC firms, have closed our Series B, and have a team of professionals from start-ups and larger companies such as Microsoft, Google, Amazon, Meta, Uber, Apple, and Cisco.
The Temporal Cloud Engineering group is looking for an engineer with a demonstrated track record of building distributed systems that are horizontally scalable, resilient, and performant under load in a production environment. The primary focus is on building a data platform with ETL/ELT workflows and dimensional data models. This is a unique opportunity: you will also get to program with Temporal OSS and cloud services while building this platform.
What You’ll Do
- Build a reliable and scalable data platform leveraging big data technologies such as Spark, Flink, Kafka, and Pravega
- Design, implement, and operate services that process hundreds of millions of records daily
- Collaborate with product managers and other engineering teams to build new features and products that meet business needs
- Mentor junior engineers
What You Bring to Us
- 6+ years of experience developing a globally distributed data services platform
- 5+ years of coding experience (Go, Java/Scala, Python, or other applicable languages)
- Experience developing and operating large-scale data pipelines, with knowledge of technologies such as Spark, Kafka, Elasticsearch, Cassandra, and SQL
- Strong communication skills and a desire to make an impact and thrive in small, collaborative, energetic teams
- Ability to own services from conception through continuous 24x7 operation; ideally, experience in a production DevOps/DataOps environment
What Is Good to Have
- Hands-on experience with cloud technologies such as Amazon Web Services, Google Cloud, or Azure
- Experience with Docker and Kubernetes
- Knowledge of or experience with building GDPR- and SOX-compliant systems