Lesson 06 – Testing Types
Software quality relies on more than one style of test. Each level of testing uncovers different issues. This lesson introduces a complete testing workflow using a short scenario and then breaks down common test types.
1. Scenario: The RevOps Engineer's Odyssey
Alex, a RevOps Engineer at TechCorp, rolled out a new Salesforce module called Lead Optimizer. To verify that it worked and that other features stayed stable, Alex followed a structured series of tests from sanity checks to security reviews. This journey illustrates how the various testing levels build on one another.
2. Overview of Testing Types
The most widely used testing types are summarized below. Each serves a unique purpose in the software development lifecycle.
Sanity Testing
Focus on a single new feature to confirm it works before spending time on broader testing.
- Instruction: Trigger just the new workflow or component.
- Example: Create a test lead and check that "Lead Optimizer" assigns it automatically.
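A sanity check like this can be sketched in a few lines of Python. The `assign_lead` function and territory mapping below are illustrative stand-ins for the Lead Optimizer's assignment rule, not real Salesforce APIs:

```python
# Hypothetical stand-in for the Lead Optimizer assignment rule.
# In a real org this would go through the Salesforce API; here the
# rule is modeled locally so the sanity check is self-contained.
TERRITORY_OWNERS = {"EMEA": "alex", "AMER": "sam"}

def assign_lead(lead: dict) -> dict:
    """Assign an owner based on the lead's territory (illustrative rule)."""
    lead["owner"] = TERRITORY_OWNERS.get(lead["territory"], "unassigned")
    return lead

# Sanity check: exercise only the new workflow, with one test lead.
test_lead = assign_lead({"name": "Test Lead", "territory": "EMEA"})
assert test_lead["owner"] == "alex", "Lead Optimizer failed the sanity check"
```

The point is the narrow scope: one test record, one new behavior, nothing else.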
Smoke Testing
Validate that core functionality still operates after a new deployment.
- Instruction: Exercise essential actions such as sending a test email.
- Example: Send a message through the new module and verify delivery.
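A smoke suite is typically a short list of core checks that fails fast. The checks below are illustrative stubs; a real suite would hit the deployed system (for instance, actually sending and verifying a test email):

```python
# Minimal smoke suite: each entry exercises one core action.

def check_login() -> bool:
    return True  # stub: would authenticate against the real system

def check_send_email() -> bool:
    return True  # stub: would send a test message and confirm delivery

SMOKE_CHECKS = [("login", check_login), ("send email", check_send_email)]

def run_smoke() -> list:
    """Run every core check; return the names of any that failed."""
    return [name for name, check in SMOKE_CHECKS if not check()]

failures = run_smoke()
assert not failures, f"Smoke test failed: {failures}"
```

Keeping the suite small and fast is what makes it useful as a post-deployment gate.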
Unit Testing
Verify individual functions or objects in isolation.
- Instruction: Test one component, like a custom object, apart from the rest of the system.
- Example: Input values into a "Lead Score" object and ensure they are stored correctly.
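In code, a unit test targets one component with no dependencies on the rest of the system. The `lead_score` function and its weights are hypothetical, standing in for the "Lead Score" object:

```python
# Unit under test: a hypothetical lead-scoring function in isolation.

def lead_score(opens: int, clicks: int) -> int:
    """Score a lead from email engagement (illustrative weights)."""
    if opens < 0 or clicks < 0:
        raise ValueError("counts must be non-negative")
    return opens + clicks * 3

# Exercise the unit on its own: normal inputs and an invalid one.
assert lead_score(0, 0) == 0
assert lead_score(4, 2) == 10
try:
    lead_score(-1, 0)
except ValueError:
    pass  # invalid input correctly rejected
else:
    raise AssertionError("negative input should be rejected")
```

Because nothing else is involved, a failure here points directly at the component itself.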
Integration Testing
Ensure multiple modules cooperate as expected.
- Instruction: Trigger interactions between connected components.
- Example: Run a scoring workflow and check that the results appear on the "Sales Dashboard."
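The same idea can be sketched with two toy components, a scoring workflow feeding a dashboard. Both are illustrative stubs; the point is testing the hand-off between them, not either piece alone:

```python
# Component 1: scoring workflow (illustrative stub).
def score_leads(leads):
    """Attach a score to each lead."""
    return [{**lead, "score": lead["clicks"] * 3} for lead in leads]

# Component 2: dashboard builder (illustrative stub).
def build_dashboard(scored_leads):
    """Summarize scored leads for display."""
    return {"total": len(scored_leads),
            "top": max(scored_leads, key=lambda l: l["score"])["name"]}

# Trigger the interaction end-to-end and check the combined result.
leads = [{"name": "Acme", "clicks": 5}, {"name": "Globex", "clicks": 2}]
dashboard = build_dashboard(score_leads(leads))
assert dashboard == {"total": 2, "top": "Acme"}
```

A failure here could live in either component or in the data contract between them, which is exactly what integration tests exist to catch.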
Parallel Testing
Run identical tests in two environments—often an old system and a new one—to confirm consistent behavior.
- Instruction: Execute the same test cases in both environments and compare outcomes.
- Example: Create identical leads in old and new Salesforce setups and verify they are assigned the same way.
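The comparison step can be sketched by running the same cases through two implementations and diffing the results. Both "environments" below are illustrative stand-ins for the old and new assignment logic:

```python
# Old environment's assignment logic (illustrative stand-in).
def assign_old(lead):
    return "alex" if lead["territory"] == "EMEA" else "sam"

# New environment's assignment logic (illustrative stand-in).
def assign_new(lead):
    rules = {"EMEA": "alex"}
    return rules.get(lead["territory"], "sam")

# Run identical cases through both and report any disagreement.
cases = [{"territory": "EMEA"}, {"territory": "AMER"}]
mismatches = [c for c in cases if assign_old(c) != assign_new(c)]
assert not mismatches, f"Environments disagree on: {mismatches}"
```

Any mismatch flags a behavioral difference between the systems before users ever see it.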
Regression Testing
Confirm that previously working features still function after changes.
- Instruction: Re-run tests for existing functionality once new code is merged.
- Example: After adding the "Lead Optimizer," test the original lead management process.
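In practice this means keeping the old test cases around and re-running them after every change. The `manage_lead` function below is a hypothetical stand-in for the original lead management logic, which must still pass its pre-existing cases:

```python
# Pre-existing behavior (illustrative): normalize names, flag hot leads.
def manage_lead(lead):
    return {"name": lead["name"].strip().title(),
            "hot": lead.get("score", 0) >= 80}

# Regression suite: (input, expected output) pairs recorded when the
# feature first shipped, re-run unchanged after every new merge.
REGRESSION_CASES = [
    ({"name": "  acme corp ", "score": 90}, {"name": "Acme Corp", "hot": True}),
    ({"name": "globex", "score": 10}, {"name": "Globex", "hot": False}),
]

for given, expected in REGRESSION_CASES:
    assert manage_lead(given) == expected, f"regression on {given}"
```

The value comes from the cases never changing: if one suddenly fails, the new code broke something that used to work.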
User Acceptance Testing (UAT)
Real users validate that the system meets business requirements.
- Instruction: Have users interact with the new feature in a production-like environment and give feedback.
- Example: Invite sales staff to try the new module and confirm that assignments appear as expected.
Performance Testing
Check responsiveness and stability under anticipated load.
- Instruction: Simulate heavy usage and observe response times.
- Example: Stress the lead assignment feature with many simultaneous requests.
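A minimal load simulation can be sketched with a thread pool hammering the feature and a check on the worst-case latency. The assignment function, the request count, and the 50 ms budget are all illustrative assumptions:

```python
import time
from concurrent.futures import ThreadPoolExecutor

# Feature under load (illustrative stand-in for lead assignment).
def assign_lead(lead_id: int) -> str:
    return "alex" if lead_id % 2 == 0 else "sam"

def timed_call(lead_id: int) -> float:
    """Invoke the feature once and return how long it took, in seconds."""
    start = time.perf_counter()
    assign_lead(lead_id)
    return time.perf_counter() - start

# Simulate many simultaneous requests and collect per-call latencies.
with ThreadPoolExecutor(max_workers=20) as pool:
    latencies = list(pool.map(timed_call, range(500)))

worst = max(latencies)
assert worst < 0.05, f"slowest call took {worst:.3f}s, over budget"
```

Real performance tests would use a dedicated load tool against a deployed system, but the shape is the same: generate load, measure, compare against a budget.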
Security Testing
Identify vulnerabilities and confirm proper access controls.
- Instruction: Attempt to access restricted data with various user profiles.
- Example: Verify that financial details in lead records remain inaccessible to unauthorized users.
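An access-control check like this can be sketched by attempting the same read under different profiles. The profiles, fields, and permission table below are illustrative assumptions, not a real Salesforce permission model:

```python
# Hypothetical permission table: which fields each profile may read.
PERMISSIONS = {"admin": {"name", "email", "finance"},
               "sales": {"name", "email"}}

RECORD = {"name": "Acme", "email": "a@acme.test", "finance": "$1M"}

def read_field(profile: str, field: str) -> str:
    """Return the field value only if the profile is allowed to see it."""
    if field not in PERMISSIONS.get(profile, set()):
        raise PermissionError(f"{profile} may not read {field}")
    return RECORD[field]

# Authorized access works; unauthorized access must be blocked.
assert read_field("admin", "finance") == "$1M"
try:
    read_field("sales", "finance")
except PermissionError:
    pass  # restricted data correctly withheld
else:
    raise AssertionError("sales profile accessed restricted data")
```

Note that the test asserts both directions: legitimate users get through, and the failure path actually raises rather than silently returning data.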
Summary
A solid testing strategy layers these approaches. Starting with sanity and smoke tests gives quick confidence in basic functionality. Unit and integration tests catch lower-level issues, while parallel and regression tests guard against unexpected side effects. Finally, UAT, performance, and security testing confirm that the system is robust, scalable, and safe for real users.
With these foundations in place, you're ready to move on to defining a minimum viable product in the next module.
Next up: Lesson 01 – MVP Definition