Senior Software Engineer (Data Platform)

About Okta

Okta is an enterprise-grade identity management service, built from the ground up in the cloud and delivered with an unwavering focus on customer success. With Okta you can manage access across any application, person, or device. Whether the people are employees, partners, or customers, or the applications are in the cloud, on-premises, or on a mobile device, Okta helps you become more secure, make people more productive, and maintain compliance.

The Okta service provides directory services, single sign-on, strong authentication, provisioning, workflow, and built-in reporting. It runs in the cloud on a secure, reliable, extensively audited platform and integrates deeply with on-premises applications, directories, and identity management systems.

About the Team

The Data Platform team is responsible for the foundational data services, systems, and data products at Okta that benefit our users. Today, the Data Platform team enables:

  • Streaming analytics 
  • Interactive end-user reporting 
  • Data and ML platform for Okta to scale
  • Telemetry of our products and data

Our team is fast, creative, and flexible. We encourage ownership. We expect great things from our engineers and reward them with stimulating new projects, new technologies, and the chance to have significant equity in the company. Okta is about to change the cloud computing landscape forever.

About the Position

This is an opportunity for experienced Software Engineers to join our fast-growing Data Platform organization, which is passionate about scaling high-volume, low-latency, distributed data platform services and data products. In this role, you will work with engineers throughout the organization to build foundational infrastructure that allows Okta to scale for years to come. As a member of the Data Platform team, you will be responsible for designing, building, and deploying the systems that power our data analytics and ML. Our analytics infrastructure stack sits on top of many modern technologies, including Kinesis, Flink, Elasticsearch, and Snowflake.

We are looking for experienced Software Engineers who can help design, build, deploy, and optimize our streaming infrastructure, and own it end to end. This project has a directive from engineering leadership to make Okta a leader in the use of data and machine learning to improve end-user security, and to expand that core competency across the rest of engineering. You will have a sizable impact on the direction, design, and implementation of the solutions to these problems.

Job Duties and Responsibilities:

  • Design, implement and own data-intensive, high-performance, scalable platform components
  • Work with engineering teams, architects, and cross-functional partners on the design, development, and implementation of projects
  • Conduct and participate in design reviews, code reviews, analysis, and performance tuning
  • Coach and mentor engineers to help scale up the engineering organization
  • Debug production issues across services and multiple levels of the stack 

Required Knowledge, Skills, and Abilities: 

  • 5+ years of experience in an object-oriented language, preferably Java
  • Hands-on experience using cloud-based distributed computing technologies, including:
    • Messaging systems such as Kinesis, Kafka 
    • Data processing systems like Flink, Spark, Beam
    • Storage & Compute systems such as Snowflake, Hadoop
    • Coordinators and schedulers like the ones in Kubernetes, Hadoop, Mesos
  • Experience in developing and tuning highly scalable distributed systems
  • Excellent grasp of software engineering principles
  • Solid understanding of multithreading, garbage collection, and memory management
  • Experience with reliability engineering, specifically in areas such as data quality, data observability, and incident management

Nice to have

  • Maintained security, encryption, identity management, or authentication infrastructure
  • Leveraged major public cloud providers to build mission-critical, high volume services
  • Hands-on experience developing data integration applications for large-scale (petabyte-scale) environments, with experience in both batch and online systems
  • Contributed to the development of distributed systems, or used one or more (such as Kafka or Hadoop) at high volume or criticality
  • Experience developing Kubernetes-based services on the AWS stack

#LI-Remote

#LI-AY1
