The test-pages system is a comprehensive testing framework for Content Scope Scripts that validates feature functionality across different platforms (Android, Apple, Windows, and browser extensions). These test pages are shared by clients and can be run both in browsers and in CI environments.
injected/integration-test/test-pages/
├── index.html # Main entry point
├── blank.html # Minimal page for extension testing
├── shared/ # Shared utilities and styles
│ ├── utils.js # Test framework utilities
│ ├── style.css # Common styling
│ └── ...
├── {feature-name}/ # Feature-specific test directories
│ ├── index.html # Feature test index
│ ├── pages/ # Individual test pages
│ │ └── {test-name}.html # Test page implementations
│ ├── config/ # Feature configurations
│ │ └── {config-name}.json # JSON configuration files
│ └── scripts/ # Additional test scripts (optional)
└── ...
Test pages (pages/*.html)
Individual HTML pages that implement specific test scenarios. Each page defines its test cases with the shared test() function.

Configuration files (config/*.json)
JSON files that define feature configurations for testing.

Shared utilities (shared/utils.js)
Provides the testing framework:
- test(name, testFunction) - Define test cases
- renderResults() - Execute tests and display results

Example test page:

<!DOCTYPE html>
<html>
    <head>
        <title>Conditional Matching Test</title>
        <link rel="stylesheet" href="../../shared/style.css" />
    </head>
    <body>
        <script src="../../shared/utils.js"></script>
        <script>
            test('Conditional matching', async () => {
                const results = [
                    {
                        name: 'APIs changing, expecting to always match',
                        result: navigator.hardwareConcurrency,
                        expected: 222,
                    },
                ];
                // Test logic here...
                return results;
            });
            renderResults();
        </script>
    </body>
</html>
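The matching configuration (for example, infra/config/conditional-matching.json, as used in the CI example below) enables the apiManipulation feature, overrides Navigator.prototype.hardwareConcurrency to 222, and includes a conditionalChanges patch that replaces that value with 333 for URLs matching /test/*: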
{
    "readme": "This config tests conditional matching of experiments",
    "version": 1,
    "features": {
        "apiManipulation": {
            "state": "enabled",
            "settings": {
                "apiChanges": {
                    "Navigator.prototype.hardwareConcurrency": {
                        "type": "descriptor",
                        "getterValue": {
                            "type": "number",
                            "value": 222
                        }
                    }
                },
                "conditionalChanges": [
                    {
                        "condition": {
                            "urlPattern": "/test/*"
                        },
                        "patchSettings": [
                            {
                                "op": "replace",
                                "path": "/apiChanges/Navigator.prototype.hardwareConcurrency/getterValue/value",
                                "value": 333
                            }
                        ]
                    }
                ]
            }
        }
    }
}
Tip: The apiManipulation feature is particularly useful for testing config conditions because it modifies browser APIs in predictable ways. You can use it to validate that conditional logic, URL patterns, and other config conditions are being applied correctly by checking if the expected API values are returned.
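For instance, a test page served from a path matching /test/* could assert that the conditional patch above took effect. A minimal sketch using the shared test() helper (the test name and page location are illustrative):

test('Conditional matching under /test/*', async () => {
    return [
        {
            name: 'hardwareConcurrency reflects the conditional patch',
            result: navigator.hardwareConcurrency,
            expected: 333,
        },
    ];
});
renderResults();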
The test pages are designed to work across multiple platforms. The test framework automatically handles platform differences through the ResultsCollector class, which applies the appropriate setup and polyfills for each platform during test execution.
Start the test server:
npm run serve
Access test pages: visit http://localhost:3220/ for the main index.
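Individual pages can also be opened directly; for example, the infra test from the CI example below is served at http://localhost:3220/infra/pages/conditional-matching.html (append ?automation=true to run its tests automatically on load).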
Tests are run in CI environments:
// Example CI test
test('Test infra', async ({ page }, testInfo) => {
    await testPage(
        page,
        testInfo,
        '/infra/pages/conditional-matching.html',
        './integration-test/test-pages/infra/config/conditional-matching.json',
    );
});
See pages.spec.js for complete CI test examples.
When writing integration tests, follow these important guidelines:
It's inadvisable to add custom state for tests directly in .spec.js files, as it makes validation difficult and reduces test reliability. If custom state is absolutely required, explain it clearly in the corresponding test HTML file with detailed comments about what state is being set and why it's necessary.
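If custom state is unavoidable, an explanatory comment in the test page keeps the dependency visible. A hypothetical sketch (the state and the reason are illustrative):

<!--
    NOTE: pages.spec.js pre-populates localStorage with "testFlag=enabled"
    before navigating here, because this scenario only applies on a repeat
    visit. Keep this comment in sync with the spec file.
-->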
The Platform parameter can be passed to test functions to simulate different platform environments. This is demonstrated in the min-supported-version tests in pages.spec.js:
- minSupportedVersion (string): uses { version: '1.5.0' }
- minSupportedVersion (int): uses { version: 99 }
This is needed when testing features that have platform-specific behavior or version requirements. The platform object allows testing how features behave under different version constraints without modifying the core test infrastructure.
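A hedged sketch of such a test, assuming testPage accepts the platform object as a trailing argument as in the min-supported-version tests (the page and config paths are illustrative):

test('Min supported version (string)', async ({ page }, testInfo) => {
    await testPage(
        page,
        testInfo,
        '/infra/pages/min-supported-version.html', // hypothetical page path
        './integration-test/test-pages/infra/config/min-supported-version.json', // hypothetical config path
        { version: '1.5.0' }, // platform object under test
    );
});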
Where possible, prefer purely config-driven testing to validate features; among other benefits, it avoids the spec-level custom state discouraged above.
For detailed testing guidelines and examples, see the IMPORTANT TESTING GUIDELINES section in the pages.spec.js file.
- Without ?automation=true in the URL, a "Run Tests" button appears at the top of the page.
- With ?automation=true in the URL, tests run automatically as soon as the Content Scope Scripts are initialized.
- Results are exposed on window.results as a standardized object.

Tests return results in a standardized format:
{
    "Test Name": [
        {
            "name": "Specific test case",
            "result": "actual value",
            "expected": "expected value"
        }
    ]
}
Results are displayed in HTML tables with pass/fail indicators and can be collected programmatically for CI validation.
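In automation, the standardized window.results object can also be read directly. A minimal Playwright sketch (the waiting strategy is an assumption; the shared CI helpers wrap this logic):

// Wait for the page to publish its results, then pull them into the test.
await page.waitForFunction(() => window.results !== undefined);
const results = await page.evaluate(() => window.results);
// results is keyed by test name; each entry holds { name, result, expected }.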
When creating a new feature directory in the test pages system, it's best practice to include an index.html file that serves as a navigation hub for that feature's tests.
<!DOCTYPE html>
<html>
    <head>
        <meta charset="utf-8" />
        <meta name="viewport" content="width=device-width" />
        <title>Feature Name</title>
    </head>
    <body>
        <p><a href="../index.html">[Home]</a></p>
        <p>Feature Name</p>
        <ul>
            <li><a href="./pages/test-page-1.html">Test Page 1</a></li>
            <li><a href="./pages/test-page-2.html">Test Page 2</a></li>
        </ul>
    </body>
</html>
When adding a new feature directory:
- Keep individual test pages in the pages/ subdirectory
- Describe the purpose of each configuration in its readme field

The test pages are hosted at https://privacy-test-pages.site/ and are used by DuckDuckGo clients, platform teams, CI systems, and external developers to ensure consistent functionality across all platforms.