QA chronicles: Safeguarding the quality of Generally Available features

As software development companies race toward delivering Generally Available (GA) features, the emphasis on quality assurance (QA) becomes more critical than ever. Beyond traditional testing methodologies, diversifying testing types is paramount to ensuring the success of GA features.

At Okta, the product team is responsible for maintaining quality and ensuring a seamless rollout. In this blog, we’ll explore the importance of diversification in testing to safeguard the quality of GA features and delve into the necessity of continuous monitoring and post-deployment validation for sustained excellence.

The ever-evolving landscape of software testing

QA has evolved far beyond traditional testing methodologies, yet the complexities of modern applications and rising end-user expectations demand an even more comprehensive testing approach.

Methods to diversify testing approaches for GA features include:

1. Functional validation suite:

  • Unit testing is a software testing method in which individual units or components of a software application are tested in isolation to validate that each unit performs as designed.
    • Scope: Individual functions or methods.
    • Purpose: Verify the correctness of small, isolated units of code.
    • Benefits: Early detection of bugs, easier debugging, and improved code maintainability.
    • Tools: Frameworks like JUnit for Java, pytest for Python, NUnit for .NET, etc.
       
  • Integration testing ensures seamless interaction between different components, safeguarding against integration issues that may arise in a real-world environment.
    • Scope: Interaction between multiple units/modules.
    • Purpose: Verify that components work together as expected.
    • Benefits: Identify issues related to the integration of components.
    • Tools: Depending on the technology stack, integration tests can be written using frameworks such as TestNG, Jasmine, or tools like Postman.
       
  • End-to-end testing primarily verifies that all components and systems work together cohesively as intended, simulating real-world user scenarios and ensuring the smooth integration of various modules or layers within the application.
    • Scope: Entire application or a significant portion of it.
    • Purpose: Verify the system’s overall behavior from the user's perspective.
    • Benefits: Ensure all components work together to fulfill user requirements.
    • Tools: Selenium for web applications, Appium for mobile applications, Cypress for modern web applications, etc.
       
  • Cross-browser and device compatibility testing ensures consistent functionality across different web browsers and devices to provide a seamless experience for users, regardless of their browser and device preferences.
    • Scope: Ensure consistent performance across different web browsers and devices.
    • Purpose: Identify and address issues related to rendering, functionality, and user experience on various browsers and devices.
    • Examples: Testing on popular browsers (Chrome, Firefox, Safari, Edge, etc.) and different devices (desktops, laptops, tablets, and mobile phones)
    • Tools: Cross-browser testing tools (e.g., BrowserStack, CrossBrowserTesting, Sauce Labs) and device testing labs
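To make the first layer concrete, here is a minimal unit-testing sketch in the pytest style mentioned above: plain `assert` statements inside `test_*` functions that pytest discovers and runs. `apply_discount` is an invented function standing in for a real unit under test.

```python
# Hypothetical unit under test: a small, isolated piece of business logic.
def apply_discount(price: float, percent: float) -> float:
    """Return price reduced by percent, rounded to cents."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

# Unit tests: each verifies one isolated behavior of apply_discount.
def test_typical_discount():
    assert apply_discount(200.0, 25) == 150.0

def test_zero_discount_returns_original_price():
    assert apply_discount(99.99, 0) == 99.99

def test_invalid_percent_is_rejected():
    # pytest.raises would be idiomatic here; a plain try/except keeps
    # this sketch dependency-free.
    try:
        apply_discount(100.0, 150)
    except ValueError:
        return
    raise AssertionError("expected ValueError for percent > 100")
```

Because pytest collects any `test_*` function and treats a bare `assert` as a check, each test stays small, isolated, and easy to debug, which is exactly the benefit unit testing promises.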

2. Non-functional dimensions:

  • Performance testing involves evaluating the performance of GA features under various conditions, including stress testing and load testing. This ensures optimal performance, even during peak usage.
    • Scope: Evaluate system performance under various conditions.
    • Purpose: Assess the system's performance regarding speed, responsiveness, and stability.
    • Examples: Load testing, stress testing, and scalability testing
    • Tools: Apache JMeter, Apache Bench, and Gatling for load testing; Locust for stress testing; and tools specific to the technology stack
       
  • Security testing should be thorough to help identify and mitigate vulnerabilities, safeguard sensitive data, and ensure the feature's resilience against potential security threats.
    • Scope: Identify vulnerabilities and ensure the system is secure.
    • Purpose: Assess the security features of the application, including data protection and access controls.
    • Examples: Penetration testing, vulnerability scanning, and code analysis
    • Tools: OWASP ZAP, Burp Suite, Nessus, and various code analysis tools
       
  • Scalability and reliability testing assesses the feature's ability to accommodate a growing user base while remaining stable, which is crucial for maintaining performance as user numbers increase.
    • Scope: Ensure the system's reliability and stability under normal, peak, and extreme conditions.
    • Purpose: Assess how well the system scales with load, recovers from failures, and handles errors.
    • Examples: Fault tolerance testing, recovery testing, and reliability testing
    • Tools: Custom scripts, Chaos Monkey for distributed systems
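The core idea behind load testing can be sketched with nothing but the standard library: fire concurrent requests at a handler and report latency percentiles. `handle_request` below is a stand-in for a real endpoint; in practice you would point a tool like JMeter, Gatling, or Locust at the service instead.

```python
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

def handle_request(payload: int) -> int:
    """Stand-in for a real service call; sleeps ~1 ms to simulate work."""
    time.sleep(0.001)
    return payload * 2

def run_load(num_requests: int = 200, concurrency: int = 20) -> dict:
    """Issue num_requests concurrent calls and summarize latency."""
    latencies = []  # list.append is thread-safe in CPython

    def timed_call(i: int) -> None:
        start = time.perf_counter()
        handle_request(i)
        latencies.append(time.perf_counter() - start)

    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        list(pool.map(timed_call, range(num_requests)))

    return {
        "p50_ms": statistics.median(latencies) * 1000,
        # quantiles(n=20) yields 19 cut points; the last is ~p95.
        "p95_ms": statistics.quantiles(latencies, n=20)[-1] * 1000,
    }
```

Tracking percentiles rather than averages matters because a healthy mean can hide a slow tail, and it is the tail that users experience during peak load.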

3. User-centric validation:

  • Usability testing gauges the user-friendliness of GA features, incorporating real user feedback to refine the user experience.
    • Scope: Evaluate the user interface and overall user experience.
    • Purpose: Assess how user-friendly and intuitive the application is.
    • Examples: User interface testing, user experience testing, and accessibility testing.
    • Tools: Usability testing can involve automated and manual processes, including tools like UserTesting for remote usability testing
  • Accessibility testing ensures individuals with diverse abilities can easily use GA features.
    • Scope: Assess the application's accessibility for users with disabilities.
    • Purpose: Ensure the application complies with accessibility standards and guidelines (e.g., WCAG — Web Content Accessibility Guidelines).
    • Examples: Keyboard navigation testing, screen reader testing, color contrast testing
    • Tools: Automated accessibility testing tools (e.g., Axe, Google Lighthouse, WAVE) and manual testing with assistive technologies
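One accessibility check that automated tools like Axe and Lighthouse perform, color contrast, is simple enough to sketch directly from the WCAG definition of relative luminance. The function names below are our own; the formula and the 4.5:1 AA threshold for normal text come from the guidelines.

```python
def _linearize(channel: int) -> float:
    """Convert an 8-bit sRGB channel to its linear value (per WCAG 2.x)."""
    c = channel / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb: tuple) -> float:
    """WCAG relative luminance of an (r, g, b) color, each 0-255."""
    r, g, b = (_linearize(v) for v in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: tuple, bg: tuple) -> float:
    """Contrast ratio between two colors, from 1:1 up to 21:1."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

def passes_aa_normal_text(fg: tuple, bg: tuple) -> bool:
    """WCAG AA requires at least 4.5:1 for normal-size text."""
    return contrast_ratio(fg, bg) >= 4.5
```

Black on white scores the maximum 21:1, while a mid-gray like rgb(119, 119, 119) on white lands just under 4.5:1 and fails AA, a failure that is hard to catch by eye but trivial to catch in CI.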

4. Continuous and post-deployment monitoring 

  • Proactive surveillance using health checks: Implement real-time monitoring in test, pre-production, and production environments to detect issues as they arise. Proactive surveillance aids in promptly identifying and addressing potential problems before they impact end-users.
    • Periodic health checks:
      • Purpose: Regular assessments of the system's overall health and performance at predetermined intervals.
      • Timing: Conducted at scheduled times, often as part of routine maintenance or system management.
      • Scope: Broader in scope, aiming to identify potential issues, bottlenecks, or performance degradation over time.
    • Post-deployment monitoring:
      • Purpose: Observes the system's behavior and performance immediately after a new release or update is deployed.
      • Timing: Begins immediately after deployment and continues for a specific duration to capture real-time insight into the system's stability and performance.
      • Scope: Focuses on detecting issues arising from the recent deployment, such as bugs, errors, or unexpected behavior.
      • Use cases: Critical for ensuring the latest changes introduced in a new release do not negatively impact the system's performance or user experience.
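The health-check pattern above can be sketched as a small aggregator: each probe returns pass/fail, and the service reports healthy only if every probe passes. The probe names here are hypothetical stand-ins for real dependency checks such as a database ping or a cache round-trip.

```python
def check_health(probes: dict) -> dict:
    """Run every named probe; healthy only if all of them pass."""
    results = {name: bool(probe()) for name, probe in probes.items()}
    return {"healthy": all(results.values()), "checks": results}

# Usage: invoked on a schedule (cron job, Kubernetes liveness probe, etc.)
status = check_health({
    "database": lambda: True,  # e.g., a SELECT 1 succeeds
    "cache": lambda: True,     # e.g., a Redis PING returns PONG
})
```

Exposing the per-check breakdown alongside the overall flag makes the periodic report actionable: when the aggregate flips to unhealthy, the failing dependency is already named.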

5. Post-deployment verification

  • Purpose: Post-deployment verification ensures the deployment process is successful and the newly released version functions as expected.
  • Timing: This occurs immediately after deployment and involves a set of predefined tests to confirm the application is accessible, features work correctly, and no critical errors exist.
  • Scope: The scope is narrower, concentrating on confirming the basic functionality and stability of the newly deployed version.
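As a sketch, post-deployment verification is a tiny suite of predefined checks run right after release. `verify_deployment`, the stubbed `/status` payload, and the version string below are all invented for illustration; a real suite would hit the deployed service over the network.

```python
def verify_deployment(get_status, expected_version: str):
    """Run predefined smoke checks; return (passed, list_of_failures)."""
    failures = []
    status = get_status()
    if status.get("http_code") != 200:
        failures.append("service not reachable")
    if status.get("version") != expected_version:
        failures.append(
            f"expected version {expected_version}, "
            f"got {status.get('version')}"
        )
    return (not failures, failures)

# Usage: verify the release immediately after deployment completes.
ok, failures = verify_deployment(
    lambda: {"http_code": 200, "version": "2.4.1"},  # stubbed /status call
    expected_version="2.4.1",
)
```

Keeping the scope deliberately narrow, reachability, version, and a handful of critical paths, lets the verification gate run in seconds and block a bad rollout before broader monitoring even kicks in.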

To learn more about post-deployment verification, read Keys to high-quality service releases at Okta.

Beyond deployment

The process of delivering resilient GA features extends beyond deployment; it requires a continuous dedication to quality. Through the diversification of testing types in development and the integration of continuous monitoring and post-deployment validation, development teams can establish a robust QA ecosystem. This safeguards the initial quality of GA features and guarantees their enduring excellence in the constantly evolving realm of software usage.

Have questions about this blog post? Reach out to us at [email protected].

Explore more insightful Engineering Blogs from Okta to expand your knowledge.

Ready to join our passionate team of exceptional engineers? Visit our career page.

Unlock the potential of modern and sophisticated identity management for your organization. Contact Sales for more information.