Manual Testing Guidelines

This document defines the standards and best practices for writing and executing manual tests for the Perma initiative.

Test Case Format

All test cases must be written in GIVEN-WHEN-THEN Gherkin syntax.

Test Case Structure

Each test case must include the following fields:

- Description: [Brief description of what is being tested]
- Given [Initial context/state]
- When I [Action being performed]
- Then I [Expected result]
- Actual Result: [Leave empty when creating test case, fill during execution]
- Test Result: [Leave empty when creating test case, use emoji during execution]
- Priority: [High/Medium/Low]
- Risk Assessment (Regressions-Likelihood): [High/Medium/Low]
- Testable as Unit Test? [Yes/No with explanation]
- Notes: [Additional information, issue links]

Test Case Writing Rules

  • DO NOT delete empty bullet points
  • DO NOT fill out "Actual Result" or "Test Result" when creating test cases (only during execution)
  • Keep descriptions clear and concise
  • Focus on user-facing behavior, not implementation details
  • Each test case should test one specific scenario

Test Status Classification

When executing tests, always use exactly one of the following three emojis in the "Test Result" field:

✅ Passed

Test executed successfully with no issues found.

Use when:

  • All expected behavior works correctly
  • No bugs or deviations detected

Issue linking:

  • Not required
  • May add notes for informational purposes

⚠️ Problematic

Test has blocking dependencies or minor issues that don't affect core functionality.

Use when:

  • Cannot fully test feature due to blocking dependencies
    • Example: "Cannot test background image rotation because setting background image is broken (#1234)"
  • Feature has minor issues that don't block core functionality
  • Test execution was hindered but not completely blocked

Issue linking:

  • MUST link to blocking issue or dependency in Notes field
  • Explain why test could not be fully executed

❌ Failed

Test failed due to software defects requiring fixes.

Use when:

  • Feature being tested has a defect
  • Expected behavior does not match actual behavior
  • Critical functionality is broken
  • Example: "Rotation function causes image to disappear (#1235)"

Issue linking:

  • MUST link to GitLab issue in Notes field
  • Create new issue if none exists (after checking for duplicates)

Avoiding Duplicate Issues

Before creating new issues:

  1. Search previous test reports in doc/tests/manual/reports/
  2. Search GitLab issues for similar problems
  3. Check with team members if uncertain
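The report search in step 1 can be automated with a small script. This is a minimal sketch, not part of the official tooling; the reports directory path comes from this document, while the function name and search term are illustrative:

```python
import pathlib


def search_reports(keyword: str, reports_dir: str = "doc/tests/manual/reports") -> list[str]:
    """Return report lines mentioning the keyword (case-insensitive)."""
    hits = []
    for report in sorted(pathlib.Path(reports_dir).glob("*.md")):
        for line in report.read_text(encoding="utf-8").splitlines():
            if keyword.lower() in line.lower():
                hits.append(f"{report.name}: {line.strip()}")
    return hits
```

Running this with a suspected symptom (e.g. "rotation") before opening a new issue quickly surfaces earlier reports of the same defect.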

Report Naming Conventions

Test Reports

Test reports must be named using the date format: YYMMDD.md

Examples:

  • 250131.md - Test report from January 31, 2025
  • 241215.md - Test report from December 15, 2024
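The YYMMDD pattern maps directly onto strftime format codes. A minimal sketch (the helper name is illustrative):

```python
import datetime


def report_filename(day: datetime.date) -> str:
    """Build a report filename following the YYMMDD.md convention."""
    return day.strftime("%y%m%d") + ".md"

# e.g. report_filename(datetime.date(2025, 1, 31)) -> "250131.md"
```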

Report Location

All test reports must be saved in: doc/tests/manual/reports/

Test Environment Requirements

For Guided Tour Testing

  • MUST use local build
  • Cannot be tested on dev.permaplant.net
  • Requires database reset capability

For Other Tests

  • Local build OR dev.permaplant.net
  • Choose based on availability and testing needs

Test Report Structure

Each test report must include:

  1. General Section

    • Tester name
    • Date/Time
    • Duration
    • Commit/Tag being tested
    • Setup environment (specify: local build or dev.permaplant.net)
    • Test counters (Planned, Executed, Passed, Problematic, Failed)
  2. Test Cases

    • All executed test cases with results
    • Properly filled Actual Result and Test Result fields
    • Issue links in Notes for failed/problematic tests
  3. Closing Remarks

    • Assessment of current software state
    • Whether quality objectives were achieved
    • Lessons learned and process improvements
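The test counters in the General section can be derived mechanically from the executed test cases. A sketch, assuming each case carries exactly one result emoji as defined under Test Status Classification (the function name is illustrative):

```python
from collections import Counter

# Mapping from result emoji to counter label, per Test Status Classification.
RESULT_EMOJIS = {"✅": "Passed", "⚠️": "Problematic", "❌": "Failed"}


def tally_results(results: list[str]) -> dict[str, int]:
    """Count Passed/Problematic/Failed from a list of Test Result emojis."""
    counts = Counter(RESULT_EMOJIS.get(r, "Unknown") for r in results)
    return {
        "Executed": len(results),
        "Passed": counts["Passed"],
        "Problematic": counts["Problematic"],
        "Failed": counts["Failed"],
    }
```

The "Planned" counter still has to be filled in by hand, since skipped test cases leave no result emoji behind.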

Execution Order

  1. ALWAYS execute 00_guided_tour.md first

    • This test must be performed before any other tests
    • If you exit the tour mid-test, reset your database before continuing
  2. Execute remaining tests in alphabetical order

    • Test cases are numbered/named to maintain proper sequence
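Because the guided tour file is prefixed 00_, a plain alphabetical sort of the test case filenames already yields the intended order. A sketch (filenames other than 00_guided_tour.md are illustrative):

```python
def execution_order(test_files: list[str]) -> list[str]:
    """Sort test case filenames; the 00_ prefix keeps the guided tour first."""
    return sorted(test_files)

# e.g. execution_order(["10_map.md", "00_guided_tour.md", "05_layers.md"])
# -> ["00_guided_tour.md", "05_layers.md", "10_map.md"]
```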

Test Case Maintenance

When You Find Errors in Test Cases

  • Fix immediately - Update the test case file right away
  • Document the change - Note the correction in your test report
  • Continue testing - Use the corrected version for your test execution

Updating Test Cases

When updating test cases, keep the field structure defined in Test Case Structure above intact.

Browser Testing

When finding issues:

  • Try to reproduce in another browser to identify browser-specific problems
  • Document browser information in Notes field if issue is browser-specific
  • Test on supported browsers as defined in project requirements

Test Data and Resources

Test User Account

  • Always use designated test user account: testuser_t@permaplant.net
  • Credentials available in doc/tests/testusers.md
  • Test account is pre-configured with necessary materials

Test Images and Assets

  • Stored in Nextcloud under "Photos" directory of test user account
  • Access via "Anmelden mit Keycloak" (Sign in with Keycloak) using test user credentials
  • Individual test cases specify which images are needed

Examples

Example 1: Passed Test

## Map Creation

- Description: Test creating a new map with valid parameters
- Given I am logged in as testuser_t@permaplant.net
- When I click "Create New Map" and fill in name and location
- Then I should see the new map in my map list
- Actual Result: Map created successfully and appears in list
- Test Result: ✅
- Priority: High
- Risk Assessment (Regressions-Likelihood): Low
- Testable as Unit Test? No (integration test)
- Notes: None

Example 2: Failed Test

## Background Image Rotation

- Description: Test rotating a background image on the base layer
- Given I have uploaded a background image
- When I click the rotation handle and drag
- Then I should see the image rotate smoothly
- Actual Result: Image disappears when rotation angle exceeds 45 degrees
- Test Result: ❌
- Priority: High
- Risk Assessment (Regressions-Likelihood): Medium
- Testable as Unit Test? No
- Notes: Critical bug - image rendering fails at certain angles. See #1337

Example 3: Problematic Test

## Copy Plant with Background

- Description: Test copying a plant when background image is set
- Given I have a map with background image and plants
- When I copy a plant using Ctrl+C and Ctrl+V
- Then I should see the copied plant
- Actual Result: Cannot test - background image upload is broken
- Test Result: ⚠️
- Priority: High
- Risk Assessment (Regressions-Likelihood): Unknown
- Testable as Unit Test? No
- Notes: Blocked by #1336 (background upload failure). Will retest after fix.

Best Practices

  1. Be thorough but efficient

    • Follow test cases precisely
    • Don't skip steps
    • Document deviations
  2. Write clear notes

    • Explain unexpected behavior
    • Include reproduction steps
    • Reference related issues
  3. Maintain test quality

    • Fix errors in test cases immediately
    • Suggest improvements to test coverage
    • Report gaps in testing
  4. Communicate findings

    • Write clear closing remarks
    • Highlight critical issues
    • Suggest process improvements
  5. Follow the workflow