This is an opportunity to join our fast-growing Engineering Data Science team to develop cutting-edge data pipelines and platforms that augment our product offerings in security, authentication, applications, and customer experience. We are looking for senior data engineers who can architect and own the platform for deploying and optimizing the data lake and data streams used by our machine learning models to protect user authentication and security. You will also own the pipeline that processes hundreds of millions of events per day and feeds results back to the authentication system for real-time risk evaluation during user authentication. This project has a directive from engineering leadership to make Okta a leader in the use of data and machine learning to improve end-user security and to expand that core competency across the rest of engineering.
We hope you will share our passion and great pride in the work we do and will join an engineering team that strongly believes in automated testing and an iterative process to build high-quality next-generation cloud platforms.
Our elite team is fast, innovative, and flexible. We expect great things from our engineers and reward them with stimulating new projects and emerging technologies.
Job Duties and Responsibilities:
- Overall ownership of the data collection pipeline and data lake used for developing, deploying, and maintaining machine learning models in production
- Work with Data Scientists to help improve their productivity and implement their ideas
- Design and maintain new data processing pipelines to support new decision and scoring models
- Analyze performance metrics and logs to identify inefficiencies and opportunities to improve scalability and performance
- SQL query tuning: complex query plan analysis, optimization, and schema (re-)design
- Research production issues using tools such as Splunk, Wavefront, CloudWatch, etc.
- Maintain and enhance our performance monitoring and analysis telemetry, frameworks, and tools
- Test-driven development, design and code reviews
Minimum Required Knowledge, Skills, and Abilities:
- 6+ years of experience building enterprise-grade, highly reliable, mission-critical software or big data systems
- 3+ years of experience with production SaaS deployments
- Experience with Java, Python, C#, or C++
- Experience deploying data streams for use by ML models in production environments with low latency requirements.
- Experience with streaming systems: MQ, Kafka, Storm, Spark, etc.
- Expert-level understanding of relational databases (columnar and row-based) and NoSQL databases such as MongoDB, Cassandra, or similar
- Experience with data toolchains such as EMR, Kinesis, Redshift, and Glue
- Experience with Flink/KDA, Snowflake, Redis, and/or Elasticsearch
- Working knowledge of AWS SageMaker, Lambda, and API Gateway, including production deployment
- Experience with Docker, Terraform, Chef, Jenkins, or similar infrastructure and CI/CD tools
- Experience maintaining Jupyter notebook kernels
- Familiarity with IPython, TensorFlow, and PyTorch
Okta is an equal opportunity employer.
Okta is rethinking the traditional work environment, providing our employees with the flexibility to be their most creative and successful versions of themselves, no matter where they are located. We enable a flexible approach to work: for roles where it makes sense, you can work from the office or from home, regardless of where you live. Okta invests in the best technologies and provides flexible benefits and collaborative work environments and experiences, empowering employees to work productively in a setting that best suits their needs.