A Brief Overview of Testing at Okta

At Okta, we take pride in ensuring our product features are developed with testability in mind, thoroughly tested, and continuously monitored before they reach our customers. This blog post describes some of the quality signals we use to ensure features are ready for customer adoption.

Internal testing 

On every approved pull request, our internally built continuous integration system executes a comprehensive suite of unit, API, and integration tests. At the same time, static code analysis (SCA) enforces syntax rules, architectural decisions, and broad software engineering uniformity.
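As a sketch of how such a merge gate could be modeled (the names here are hypothetical; Okta's actual CI system is internal), a changeset is eligible to merge only when every test passes and the static analysis run is clean:

```python
from dataclasses import dataclass

@dataclass
class PipelineResult:
    """Outcome of one CI run for an approved pull request (illustrative only)."""
    tests_total: int
    tests_passed: int
    sca_violations: int  # static code analysis findings

def can_merge(result: PipelineResult) -> bool:
    """Merge only when the suite ran, every test passed, and SCA found nothing."""
    all_tests_pass = result.tests_total > 0 and result.tests_passed == result.tests_total
    return all_tests_pass and result.sca_violations == 0

# A run with a single failing test is blocked:
print(can_merge(PipelineResult(tests_total=120, tests_passed=119, sca_violations=0)))  # False
```

The point of the sketch is the "and" in the gate: a green test suite alone is not enough; the SCA signal is enforced with equal weight.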

Complete functional feature testing and non-functional (performance, cross-platform, and usability) testing are impossible to perform on every merge without severely hindering developer velocity. So, the tests that cannot run upon code merge are batched and scheduled for regular execution, with strict Service-Level Agreement (SLA) requirements for addressing failures.
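One way to think about the SLA side of this, as a minimal sketch (the 48-hour window below is an assumed example, not Okta's actual policy), is to track each unresolved failure from a scheduled run against its deadline:

```python
from datetime import datetime, timedelta

# Hypothetical SLA: failures from scheduled test runs must be addressed within 48 hours.
FAILURE_SLA = timedelta(hours=48)

def is_sla_breached(failure_detected_at: datetime, now: datetime) -> bool:
    """True when an unresolved scheduled-test failure has outlived its SLA window."""
    return now - failure_detected_at > FAILURE_SLA

detected = datetime(2024, 1, 1, 6, 0)
print(is_sla_breached(detected, datetime(2024, 1, 2, 6, 0)))   # False: 24h elapsed
print(is_sla_breached(detected, datetime(2024, 1, 3, 12, 0)))  # True: 54h elapsed
```

A check like this is what turns "regularly bunched" test runs into an enforceable commitment rather than a best-effort backlog.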

Quality gates

As Okta’s products and integrations evolve, we must ensure we meet our customers’ expectations and keep proposed changes safe. We aim to balance API contracts, fundamental user expectations, and external integrations with product feature upgrades and innovations. 

We may offer features that are thoroughly tested but not yet slated for a general availability (GA) release. Self-Service Early Access (EA) features are available to customers, with monthly releases deployed to the Preview Sandbox a month in advance for customer feedback. As part of these quality gates and customer feedback loops, we expect Self-Service EA features to be of GA quality.

Continuous verification

Synthetic monitoring of testing environments by test automation across global regions is just one part of our monitoring processes. With a worldwide user base, Okta takes a comprehensive approach to ensuring that the service is resilient and responsive. We rely on API status “health checks,” front-door page-load testing, and sophisticated synthetic user-flow emulation to alert engineering teams, who swarm on service degradation signals, striving to notify our customers of failures and resolve issues with deliberate speed.
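To make the alerting step concrete, here is a minimal sketch of how one synthetic probe result might be classified into a signal that pages a team. The thresholds and category names are assumptions for illustration, not Okta's real values:

```python
# Hypothetical threshold: responses slower than this suggest degradation.
LATENCY_DEGRADED_MS = 1500

def classify_check(status_code: int, latency_ms: float) -> str:
    """Map one synthetic health-check sample to an alerting signal."""
    if status_code >= 500:
        return "outage"      # server-side failure: page immediately
    if status_code >= 400 or latency_ms > LATENCY_DEGRADED_MS:
        return "degraded"    # errors or slowness: raise a degradation signal
    return "healthy"

print(classify_check(200, 320))   # healthy
print(classify_check(200, 2400))  # degraded
print(classify_check(503, 120))   # outage
```

Real synthetic flows chain many such samples across regions; the design choice worth noting is that latency alone can mark a service degraded even when every request succeeds.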

Okta’s feature development teams configure monitoring for in-development features across environments to observe how our customers’ usage changes and, if needed, to dive deep into low-level activity paths when errors occur during a phased rollout.
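A phased rollout decision can be sketched as a simple error-budget check per phase. Everything here is a hypothetical illustration (the 1% budget and the action names are assumptions), but it captures the idea of advancing only while observed errors stay within bounds:

```python
def next_rollout_action(errors: int, requests: int, max_error_rate: float = 0.01) -> str:
    """Decide whether a rollout phase advances, holds, or rolls back."""
    if requests == 0:
        return "hold"  # not enough traffic in this phase to judge it yet
    rate = errors / requests
    return "advance" if rate <= max_error_rate else "rollback"

print(next_rollout_action(errors=3, requests=1000))   # advance (0.3% <= 1%)
print(next_rollout_action(errors=50, requests=1000))  # rollback (5% > 1%)
```

Tying the gate to monitoring configured by the team that built the feature keeps the rollback decision close to the people who can also dive into the low-level activity paths when it fires.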

Takeaways

As we expand our product offerings, we continue to innovate heavily to improve our efficiency and test coverage. Moving our resources toward self-contained runtimes for distinct product areas will allow better-targeted testing, improving developer productivity and resource efficiency. The Okta Japan and Okta for Government initiatives are helping us broaden the globalization, localization, and accessibility of the admin console, allowing admins to use the console in languages other than English. We are also expanding our framework capabilities to catch “English leaks,” supporting continued international expansion and a global user base.

In summary, to foster an engineering culture where reliability and quality are top of mind across the organization, consider some of the values we balance: 

  • Include relevant tests, manual validation, and test updates as part of every changeset.
  • Require 100% passing continuous integration test execution before merging changesets.
  • Ensure the teams that build a feature also configure its monitoring and appropriate alerting.
  • Lean on automated testing to verify feature readiness, cover broad critical flows, and correct issues before merge.

To learn more about how Okta delivers a reliable service with mission-critical functionality, see the Oktane presentation by Jon Todd, our Chief Architect: Oktane19: Building and Running Infrastructure at Scale: How We Do It at Okta

Tags

testing