Manual Testing Guidelines

This document defines the standards and best practices for executing manual tests for the PermaplanT project.

Test Case Report Structure

Each test case must include the following fields:

- Description: [Brief description of what is being tested]
- Given [Initial context/state]
- When I [Action being performed]
- Then I [Expected result]
- Actual Result: [Fill during execution]
- Test Result: [Use emoji during execution]
- Priority: [High/Medium/Low]
- Risk Assessment (Regressions-Likelihood): [High/Medium/Low]
- Testable as Unit Test? [Yes/No with explanation]
- Notes: [Additional information, issue links]
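
Where it helps to think of these fields as structured data, the following is a minimal TypeScript sketch of the same record. The type and field names are illustrative only and do not exist in the codebase; the canonical format remains the Markdown template above.

```typescript
// Illustrative model of the test case fields above; all names are hypothetical.
type Level = "High" | "Medium" | "Low";

interface TestCase {
  description: string; // what is being tested
  given: string;       // initial context/state
  when: string;        // action being performed
  then: string;        // expected result
  actualResult?: string;           // fill during execution only
  testResult?: "✅" | "⚠️" | "❌"; // fill during execution only
  priority: Level;
  regressionLikelihood: Level;     // Risk Assessment (Regressions-Likelihood)
  unitTestable: string;            // Yes/No with explanation
  notes?: string;                  // additional information, issue links
}
```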

Test Case Execution Rules

  • DO NOT delete empty bullet points from test cases.
  • Fill "Actual Result" and "Test Result" fields during execution only.

Test Status Classification

When executing tests, always use exactly one of these three emojis in the "Test Result" field:

✅ Passed

Test executed successfully with no issues found.

Use when:

  • all expected behavior works correctly
  • no bugs or deviations detected

Issue linking:

  • not required
  • may add notes for informational purposes

⚠️ Problematic

Test has blocking dependencies or minor issues that don't affect core functionality.

Use when:

  • cannot fully test feature due to blocking dependencies
    • Example: "Cannot test background image rotation because setting background image is broken (#1234)"
  • feature has minor issues that don't block core functionality
  • test execution was hindered but not completely blocked

Issue linking:

  • MUST link to the blocking issue or dependency in the Notes field.
  • Explain why the test could not be fully executed.

❌ Failed

Test failed due to software defects requiring fixes.

Use when:

  • feature being tested has a defect
    • Example: "Rotation function causes image to disappear (#1235)"
  • expected behavior does not match actual behavior
  • critical functionality is broken

Issue linking:

  • MUST link to the GitLab issue in the Notes field.
  • Create a new issue if none exists (after checking for duplicates).
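
To make the classification rules concrete, here is a small TypeScript sketch of the status set and the issue-linking rule. All names are hypothetical, and the regex is a simplification of "links to an issue":

```typescript
// Hypothetical sketch: the three statuses and the linking rule they imply.
type Status = "passed" | "problematic" | "failed";

const statusEmoji: Record<Status, string> = {
  passed: "✅",
  problematic: "⚠️",
  failed: "❌",
};

// ✅ needs no issue link; ⚠️ and ❌ MUST reference an issue (e.g. "#1234")
// in the Notes field.
function notesAreValid(status: Status, notes: string): boolean {
  if (status === "passed") return true;
  return /#\d+/.test(notes);
}
```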

Avoiding Duplicate Issues

Before creating new issues:

  1. Search previous test reports in doc/tests/manual/reports/.
  2. Search GitLab issues for similar problems.
  3. Check with team members if uncertain.
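
As a sketch of step 1, a small Node/TypeScript helper could scan earlier reports for an issue number or keyword before you file anything new. The helper name is made up; only the reports path comes from this document.

```typescript
import { readdirSync, readFileSync } from "node:fs";
import { join } from "node:path";

// Returns the report files that mention the given keyword, e.g. "#1234".
function searchReports(keyword: string, dir = "doc/tests/manual/reports"): string[] {
  return readdirSync(dir)
    .filter((name) => name.endsWith(".md"))
    .filter((name) => readFileSync(join(dir, name), "utf8").includes(keyword));
}

// Example: searchReports("background image rotation")
```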

Report Naming Conventions

Test Reports

Test reports must be named using the date format: YYMMDD.md

Examples:

  • 250131.md - Test report from January 31, 2025
  • 241215.md - Test report from December 15, 2024
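
If you generate the file name rather than typing it, the YYMMDD rule fits in one function; a minimal TypeScript sketch (the function name is illustrative):

```typescript
// Builds the YYMMDD.md report name for a given date.
function reportName(date: Date): string {
  const yy = String(date.getFullYear() % 100).padStart(2, "0");
  const mm = String(date.getMonth() + 1).padStart(2, "0");
  const dd = String(date.getDate()).padStart(2, "0");
  return `${yy}${mm}${dd}.md`;
}

// reportName(new Date(2025, 0, 31)) === "250131.md"
```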

Report Location

All test reports must be saved in: doc/tests/manual/reports/

Test Environment Requirements

What Counts as a "Local Build"

A local build includes:

  • A native build running on your local machine
  • A devcontainer (Docker-based development environment)

A local build does NOT include:

  • dev.permaplant.net or other remote environments

For Guided Tour Testing

  • MUST use a local build (native or devcontainer).
  • Cannot be tested on dev.permaplant.net.
  • Requires database reset capability.

For Other Tests

  • A local build (native or devcontainer) OR dev.permaplant.net is acceptable.
  • Choose based on availability and testing needs.

Test Report Structure

Each test report must include:

  1. General Section

    • Tester name
    • Date/Time
    • Duration
    • Commit/Tag being tested
    • Setup environment (specify: local build or dev.permaplant.net)
    • Test counters (Planned, Executed, Passed, Problematic, Failed)
  2. Test Cases

    • all executed test cases with results
    • properly filled Actual Result and Test Result fields
    • issue links in Notes for failed/problematic tests
  3. Closing Remarks

    • assessment of current software state
    • whether quality objectives were achieved
    • lessons learned and process improvements
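
The five test counters in the General Section follow mechanically from the executed results; a minimal TypeScript sketch (names are illustrative):

```typescript
// Tallies the General Section counters from the Test Result emojis of all
// executed cases; `planned` is the total number of cases in the report.
function counters(planned: number, results: string[]) {
  const count = (emoji: string) => results.filter((r) => r === emoji).length;
  return {
    planned,
    executed: results.length,
    passed: count("✅"),
    problematic: count("⚠️"),
    failed: count("❌"),
  };
}

// counters(10, ["✅", "✅", "⚠️", "❌"])
// -> { planned: 10, executed: 4, passed: 2, problematic: 1, failed: 1 }
```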

Execution Order

  1. ALWAYS execute 00_guided_tour.md first

    • This test must be performed before any other tests.
    • If you exit the tour mid-test, reset your database before continuing.
  2. Execute remaining tests in alphabetical order

    • test cases are numbered/named to maintain proper sequence
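
Because the files are numbered, a plain lexicographic sort already yields the required order, with 00_guided_tour.md first. A sketch, assuming the test cases live alongside the reports directory (the path is an assumption):

```typescript
import { readdirSync } from "node:fs";

// Lists test case files in execution order; "00_guided_tour.md" sorts first.
function executionOrder(dir = "doc/tests/manual"): string[] {
  return readdirSync(dir)
    .filter((name) => name.endsWith(".md"))
    .sort();
}
```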

Test Case Maintenance

When You Find Errors in Test Cases

Only fix minor, non-logic errors immediately:

  • Typos and readability - Fix small spelling, grammar, or clarity issues.
  • Document the change - Note the correction in your test report.
  • Continue testing - Use the corrected version for your test execution.

DO NOT fix:

  • logic or expected behavior changes
  • test case structure issues
  • semantic problems that could affect test validity

Report logic errors as issues and test with the original case.

Browser Testing

When finding issues:

  • Try to reproduce in another browser to identify browser-specific problems.
  • Document browser information in Notes field if issue is browser-specific.
  • Test on the supported browsers as defined in the requirements (./README.md#client-requirements).

Test Data and Resources

Test User Account

  • Always use the designated test user account: testuser_t@permaplant.net.
  • credentials available in doc/tests/testusers.md
  • test account is pre-configured with necessary materials

Test Images and Assets

  • stored in Nextcloud under the "Photos" directory of the test user account
  • Access via "Anmelden mit Keycloak" ("Sign in with Keycloak") using the test user credentials.
  • individual test cases specify which images are needed

Examples

Example 1: Passed Test

## Map Creation

- Description: Test creating a new map with valid parameters
- Given I am logged in as testuser_t@permaplant.net
- When I click "Create New Map" and fill in name and location
- Then I should see the new map in my map list
- Actual Result: Map created successfully and appears in list
- Test Result: ✅
- Priority: High
- Risk Assessment (Regressions-Likelihood): Low
- Testable as Unit Test? No (integration test)
- Notes: None

Example 2: Failed Test

## Background Image Rotation

- Description: Test rotating a background image on the base layer
- Given I have uploaded a background image
- When I click the rotation handle and drag
- Then I should see the image rotate smoothly
- Actual Result: Image disappears when rotation angle exceeds 45 degrees
- Test Result: ❌
- Priority: High
- Risk Assessment (Regressions-Likelihood): Medium
- Testable as Unit Test? No
- Notes: Critical bug - image rendering fails at certain angles. See #1337

Example 3: Problematic Test

## Copy Plant with Background

- Description: Test copying a plant when background image is set
- Given I have a map with background image and plants
- When I copy a plant using Ctrl+C and Ctrl+V
- Then I should see the copied plant
- Actual Result: Cannot test - background image upload is broken
- Test Result: ⚠️
- Priority: High
- Risk Assessment (Regressions-Likelihood): Unknown
- Testable as Unit Test? No
- Notes: Blocked by #1336 (background upload failure). Will retest after fix.

Best Practices

  1. Be thorough but efficient

    • Follow test cases precisely.
    • Don't skip steps.
    • Document any deviations.
  2. Write clear notes

    • Explain unexpected behavior.
    • Include reproduction steps.
    • Reference related issues.
  3. Maintain test quality

    • Fix minor errors in test cases immediately (see Test Case Maintenance).
    • Suggest improvements to test coverage.
    • Report gaps in testing.
  4. Communicate findings

    • Write clear closing remarks.
    • Highlight critical issues.
    • Suggest process improvements.
  5. Follow the workflow