Add support for creating test cases
hi-rai committed Dec 19, 2025
commit 993ce44e156eded67e1e47e74ee5b3018001070a
38 changes: 23 additions & 15 deletions README.md
@@ -60,12 +60,19 @@ QAS_URL=https://qas.eu1.qasphere.com

## Commands: `junit-upload`, `playwright-json-upload`

The `junit-upload` and `playwright-json-upload` commands upload test results from JUnit XML and Playwright JSON reports to QA Sphere respectively. Both commands can either create a new test run within a QA Sphere project or upload results to an existing run, and they share the same set of options.
The `junit-upload` and `playwright-json-upload` commands upload test results from JUnit XML and Playwright JSON reports to QA Sphere respectively.

There are two modes for uploading results using the commands:
1. Upload to an existing test run by specifying its URL via `--run-url` flag
2. Create a new test run and upload results to it (when `--run-url` flag is not specified)

### Options

- `-r, --run-url` - Optional URL of an existing run for uploading results (a new run is created if not specified)
- `--run-name` - Optional name template for creating new test run when run url is not specified (supports `{env:VAR}`, `{YYYY}`, `{YY}`, `{MM}`, `{MMM}`, `{DD}`, `{HH}`, `{hh}`, `{mm}`, `{ss}`, `{AMPM}` placeholders). If not specified, `Automated test run - {MMM} {DD}, {YYYY}, {hh}:{mm}:{ss} {AMPM}` is used as default
- `-r`/`--run-url` - Upload results to an existing test run
- `--project-code`, `--run-name`, `--create-tcases` - Create a new test run and upload results to it (used when `--run-url` is not specified)
- `--project-code` - Project code for creating a new test run. It can also be auto-detected from test case markers in the results, but this is not fully reliable, so specifying the project code explicitly is recommended
- `--run-name` - Optional name template for creating a new test run. It supports `{env:VAR}`, `{YYYY}`, `{YY}`, `{MM}`, `{MMM}`, `{DD}`, `{HH}`, `{hh}`, `{mm}`, `{ss}`, `{AMPM}` placeholders (default: `Automated test run - {MMM} {DD}, {YYYY}, {hh}:{mm}:{ss} {AMPM}`); a sketch of how these placeholders expand follows this list
- `--create-tcases` - Automatically create test cases in QA Sphere for results that don't have valid test case markers. A mapping file (`qasphere-automapping-YYYYMMDD-HHmmss.txt`) is generated, listing the sequence numbers assigned to each new test case (default: `false`)
- `--attachments` - Try to detect and upload any attachments with the test result
- `--force` - Ignore API request errors, invalid test cases, or attachments
- `--ignore-unmatched` - Suppress individual unmatched test messages, show summary only
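
The run-name placeholders are plain string substitutions. The following sketch is for illustration only: it mirrors the placeholder behaviour described above, not the CLI's actual implementation, and the `expandRunName` name is hypothetical.

```ts
// Illustration of the documented run-name placeholders (not the qas-cli source).
const MONTHS = ['Jan', 'Feb', 'Mar', 'Apr', 'May', 'Jun', 'Jul', 'Aug', 'Sep', 'Oct', 'Nov', 'Dec']
const pad = (n: number) => String(n).padStart(2, '0')

export const expandRunName = (template: string, now = new Date(), env = process.env): string => {
	const h24 = now.getHours()
	const h12 = h24 % 12 === 0 ? 12 : h24 % 12
	const values: Record<string, string> = {
		'{YYYY}': String(now.getFullYear()),
		'{YY}': String(now.getFullYear()).slice(-2),
		'{MM}': pad(now.getMonth() + 1),
		'{MMM}': MONTHS[now.getMonth()],
		'{DD}': pad(now.getDate()),
		'{HH}': pad(h24), // 24-hour clock
		'{hh}': pad(h12), // 12-hour clock
		'{mm}': pad(now.getMinutes()),
		'{ss}': pad(now.getSeconds()),
		'{AMPM}': h24 < 12 ? 'AM' : 'PM',
	}
	return template
		.replace(/\{env:([A-Za-z0-9_]+)\}/g, (_, name) => env[name] ?? '') // unset vars become '' (assumption)
		.replace(/\{[A-Za-z]+\}/g, (token) => values[token] ?? token)
}

// expandRunName('CI Build {env:BUILD_NUMBER} - {YYYY}-{MM}-{DD}')
// -> e.g. "CI Build v1.4.4-rc5 - 2025-01-01" when BUILD_NUMBER=v1.4.4-rc5
```
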
@@ -96,32 +103,33 @@ Ensure the required environment variables are defined before running these commands

**Note:** The following examples use `junit-upload`, but you can replace it with `playwright-json-upload` and adjust the file extension from `.xml` to `.json` to upload Playwright JSON reports instead.

1. Create a new test run with default name template (`Automated test run - {MMM} {DD}, {YYYY}, {hh}:{mm}:{ss} {AMPM}`) and upload results:
1. Upload to an existing test run:
```bash
qasphere junit-upload ./test-results.xml
qasphere junit-upload -r https://qas.eu1.qasphere.com/project/P1/run/23 ./test-results.xml
```

2. Upload to an existing test run:
2. Create a new test run with the default name template and upload results:
```bash
qasphere junit-upload -r https://qas.eu1.qasphere.com/project/P1/run/23 ./test-results.xml
qasphere junit-upload ./test-results.xml
```
Project code is detected from test case markers in the results.

3. Create a new test run with a name template without any placeholders and upload results:
```bash
qasphere junit-upload --run-name "v1.4.4-rc5" ./test-results.xml
qasphere junit-upload --project-code P1 --run-name "v1.4.4-rc5" ./test-results.xml
```

4. Create a new test run with a name template using environment variables and date placeholders, and upload results:
```bash
qasphere junit-upload --run-name "CI Build {env:BUILD_NUMBER} - {YYYY}-{MM}-{DD}" ./test-results.xml
qasphere junit-upload --project-code P1 --run-name "CI Build {env:BUILD_NUMBER} - {YYYY}-{MM}-{DD}" ./test-results.xml
```
If the `BUILD_NUMBER` environment variable is set to `v1.4.4-rc5` and today's date is January 1, 2025, the run would be named "CI Build v1.4.4-rc5 - 2025-01-01".

5. Create a new test run with name template using date/time placeholders and upload results:
5. Create a new test run with a name template using date/time placeholders, create test cases for results without valid markers, and upload results:
```bash
qasphere junit-upload --run-name "Nightly Tests {YYYY}/{MM}/{DD} {HH}:{mm}" ./test-results.xml
qasphere junit-upload --project-code P1 --run-name "Nightly Tests {YYYY}/{MM}/{DD} {HH}:{mm}" --create-tcases ./test-results.xml
```
If the current time is 10:34 PM on January 1, 2025, the run would be named "Nightly Tests 2025/01/01 22:34".
If the current time is 10:34 PM on January 1, 2025, the run would be named "Nightly Tests 2025/01/01 22:34". This also creates new test cases in QA Sphere for any results that don't have a valid test case marker. A mapping file (`qasphere-automapping-YYYYMMDD-HHmmss.txt`) is generated showing the sequence numbers assigned to each newly created test case. For future uploads, update your automated test names to include these markers.

6. Upload results with attachments:
```bash
@@ -139,21 +147,21 @@ Ensure the required environment variables are defined before running these commands
```
This will show only a summary like "Skipped 5 unmatched tests" instead of individual error messages for each unmatched test.

9. Skip stdout/stderr for passed tests to reduce result payload size:
9. Skip stdout for passed tests to reduce result payload size:
```bash
qasphere junit-upload --skip-report-stdout on-success ./test-results.xml
```
This will exclude stdout from passed tests while still including it for failed, blocked, or skipped tests.

Skip both stdout and stderr for passed tests:
10. Skip both stdout and stderr for passed tests:
```bash
qasphere junit-upload --skip-report-stdout on-success --skip-report-stderr on-success ./test-results.xml
```
This is useful when you have verbose logging in tests but only want to see output for failures.

## Test Report Requirements

The QAS CLI requires test cases in your reports (JUnit XML or Playwright JSON) to reference corresponding test cases in QA Sphere. These references are used to map test results from your automation to the appropriate test cases in QA Sphere. If a report lacks these references or the referenced test case doesn't exist in QA Sphere, the tool will display an error message.
The QAS CLI maps test results from your reports (JUnit XML or Playwright JSON) to corresponding test cases in QA Sphere using test case markers. If a test result lacks a valid marker, the CLI will display an error unless you use `--create-tcases` to automatically create test cases, or `--ignore-unmatched`/`--force` to skip unmatched results.
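
For illustration, the marker lookup described above can be pictured as a small matcher. This is a sketch of the documented PROJECT-SEQUENCE rule only, not the CLI's internal code; `extractMarker` and the exact project-code pattern are assumptions.

```ts
// Sketch only: extracts a QA Sphere marker such as "PRJ-312" from a test name.
// Assumes the documented format: a project code, a dash, and a sequence of at least three digits.
const MARKER_RE = /\b([A-Z][A-Z0-9]*)-(\d{3,})\b/

export const extractMarker = (testName: string): { project: string; seq: number } | null => {
	const match = MARKER_RE.exec(testName)
	return match ? { project: match[1], seq: Number(match[2]) } : null
}

// extractMarker('PRJ-312: Login with valid credentials') -> { project: 'PRJ', seq: 312 }
// extractMarker('The cart is still filled after refreshing the page') -> null
```
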

### JUnit XML

31 changes: 31 additions & 0 deletions src/api/schemas.ts
@@ -2,6 +2,37 @@ export type ResourceId = string | number

export type ResultStatus = 'open' | 'passed' | 'blocked' | 'failed' | 'skipped'

export interface PaginatedResponse<T> {
data: T[]
total: number
page: number
limit: number
}

export interface PaginatedRequest {
page?: number
limit?: number
}

export interface TCase {
id: string
legacyId?: string
seq: number
title: string
version: number
projectId: string
folderId: number
}

export interface CreateTCasesRequest {
folderPath: string[]
tcases: { title: string; tags: string[] }[]
}

export interface CreateTCasesResponse {
tcases: { id: string; seq: number }[]
}

export interface Folder {
id: number
title: string
53 changes: 22 additions & 31 deletions src/api/tcases.ts
@@ -1,34 +1,25 @@
import { ResourceId } from './schemas'
import { jsonResponse, withJson } from './utils'
export interface PaginatedResponse<T> {
data: T[]
total: number
page: number
limit: number
}

export interface TCaseBySeq {
id: string
legacyId?: string
seq: number
version: number
projectId: string
folderId: number
}

export interface GetTCasesBySeqRequest {
seqIds: string[]
page?: number
limit?: number
}
import {
CreateTCasesRequest,
CreateTCasesResponse,
PaginatedRequest,
PaginatedResponse,
ResourceId,
TCase,
} from './schemas'
import { appendSearchParams, jsonResponse, withJson } from './utils'

export const createTCaseApi = (fetcher: typeof fetch) => {
fetcher = withJson(fetcher)
return {
getTCasesBySeq: (projectCode: ResourceId, request: GetTCasesBySeqRequest) =>
fetcher(`/api/public/v0/project/${projectCode}/tcase/seq`, {
method: 'POST',
body: JSON.stringify(request),
}).then((r) => jsonResponse<PaginatedResponse<TCaseBySeq>>(r)),
}
fetcher = withJson(fetcher)
return {
getTCasesPaginated: (projectCode: ResourceId, request: PaginatedRequest) =>
fetcher(appendSearchParams(`/api/public/v0/project/${projectCode}/tcase`, request)).then(
(r) => jsonResponse<PaginatedResponse<TCase>>(r)
),

createTCases: (projectCode: ResourceId, request: CreateTCasesRequest) =>
fetcher(`/api/public/v0/project/${projectCode}/tcase/bulk`, {
method: 'POST',
body: JSON.stringify(request),
}).then((r) => jsonResponse<CreateTCasesResponse>(r)),
}
}
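
For context, a caller of the new client added above might look roughly like the following sketch. The project code, folder path, and use of the global `fetch` are assumptions for illustration; in the real CLI the fetcher is presumably preconfigured with the QA Sphere base URL and API token.

```ts
// Sketch only: exercises the new tcase API client.
// Assumes a fetch implementation already bound to the QA Sphere base URL and token.
import { createTCaseApi } from './tcases'

const api = createTCaseApi(fetch)

const example = async () => {
	// List existing test cases of project P1, one page at a time
	const page = await api.getTCasesPaginated('P1', { page: 1, limit: 100 })
	console.log(`Fetched ${page.data.length} of ${page.total} test cases`)

	// Bulk-create test cases for results that had no valid marker
	const created = await api.createTCases('P1', {
		folderPath: ['Automated'], // hypothetical target folder
		tcases: [{ title: 'The cart is still filled after refreshing the page', tags: [] }],
	})
	created.tcases.forEach((tc) => console.log(`Created test case seq ${tc.seq} (id ${tc.id})`))
}
```
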
36 changes: 36 additions & 0 deletions src/api/utils.ts
@@ -48,3 +48,39 @@ export const jsonResponse = async <T>(response: Response): Promise<T> => {
}
throw new Error(response.statusText)
}

const updateSearchParams = <T extends object>(searchParams: URLSearchParams, obj?: T) => {
const isValidValue = (value: unknown) => {
return value || value === false || value === ''
}

if (!obj) return

Object.entries(obj).forEach(([key, value]) => {
if (isValidValue(value)) {
if (Array.isArray(value)) {
value.forEach((param) => {
if (isValidValue(param)) {
searchParams.append(key, String(param))
}
})
} else if (value instanceof Date) {
searchParams.set(key, value.toISOString())
} else if (typeof value === 'object') {
updateSearchParams(searchParams, value)
} else {
searchParams.set(key, String(value))
}
}
})
}

export const appendSearchParams = <T extends object>(pathname: string, obj: T): string => {
const searchParams = new URLSearchParams()
updateSearchParams(searchParams, obj)

if (searchParams.size > 0) {
return `${pathname}?${searchParams.toString()}`
}
return pathname
}
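
As a quick reference for the query-string helper above, here is how it behaves as written. The paths are taken from the tcase client; the second call uses a hypothetical request shape purely for illustration.

```ts
import { appendSearchParams } from './utils'

// Primitive values become query parameters (order follows Object.entries)
appendSearchParams('/api/public/v0/project/P1/tcase', { page: 2, limit: 50 })
// -> '/api/public/v0/project/P1/tcase?page=2&limit=50'

// Arrays are appended per element; undefined values are skipped.
// Note: as written, isValidValue also drops numeric 0, since 0 is falsy and not false or ''.
appendSearchParams('/api/public/v0/project/P1/tcase', { tags: ['smoke', 'ui'], page: undefined })
// -> '/api/public/v0/project/P1/tcase?tags=smoke&tags=ui'
```
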
77 changes: 58 additions & 19 deletions src/commands/resultUpload.ts
@@ -1,10 +1,10 @@
import { Arguments, Argv, CommandModule } from 'yargs'
import chalk from 'chalk'
import { loadEnvs } from '../utils/env'
import { loadEnvs, qasEnvFile } from '../utils/env'
import {
ResultUploadCommandArgs,
ResultUploadCommandHandler,
UploadCommandType
ResultUploadCommandArgs,
ResultUploadCommandHandler,
UploadCommandType,
} from '../utils/result-upload/ResultUploadCommandHandler'

const commandTypeDisplayStrings: Record<UploadCommandType, string> = {
@@ -36,11 +36,22 @@ export class ResultUploadCommandModule implements CommandModule<unknown, ResultU
type: 'string',
requiresArg: true,
},
'project-code': {
describe:
'Existing project code for uploading results (when run url is not specified). It can also be auto-detected from test case markers in the results, but this is not fully reliable, so it is recommended to specify the project code explicitly',
type: 'string',
},
'run-name': {
describe:
'Optional name template for creating new test run when run url is not specified. If not specified, "Automated test run - {MMM} {DD}, {YYYY}, {hh}:{mm}:{ss} {AMPM}" is used as default',
'Optional name template for creating new test run (when run url is not specified). If not specified, "Automated test run - {MMM} {DD}, {YYYY}, {hh}:{mm}:{ss} {AMPM}" is used as default',
type: 'string',
},
'create-tcases': {
describe:
'Create new test cases for results without valid markers (when run url is not specified)',
type: 'boolean',
default: false,
},
attachments: {
describe: 'Try to detect any attachments and upload them with the test result',
type: 'boolean',
@@ -72,44 +83,72 @@ export class ResultUploadCommandModule implements CommandModule<unknown, ResultU
})

argv.example(
`$0 ${this.type} ./test-results.${commandTypeFileExtensions[this.type]}`,
'Create a new test run with default name template and upload results (project code detected from test names)'
`$0 ${this.type} -r https://qas.eu1.qasphere.com/project/P1/run/23 ./test-results.${
commandTypeFileExtensions[this.type]
}`,
'Upload results to existing run ID 23 of project P1'
)

argv.example(
`$0 ${this.type} -r https://qas.eu1.qasphere.com/project/P1/run/23 ./test-results.${commandTypeFileExtensions[this.type]}`,
'Upload results to existing run ID 23 of project P1'
`$0 ${this.type} ./test-results.${commandTypeFileExtensions[this.type]}`,
'Create a new test run with default name template and upload results. Project code is detected from test case markers in the results'
)

argv.example(
`$0 ${this.type} --run-name "v1.4.4-rc5" ./test-results.${commandTypeFileExtensions[this.type]}`,
`$0 ${this.type} --project-code P1 --run-name "v1.4.4-rc5" ./test-results.${
commandTypeFileExtensions[this.type]
}`,
'Create a new test run with a name template without any placeholders and upload results'
)

argv.example(
`$0 ${this.type} --run-name "CI Build {env:BUILD_NUMBER} - {YYYY}-{MM}-{DD}" ./test-results.${commandTypeFileExtensions[this.type]}`,
`$0 ${
this.type
} --project-code P1 --run-name "CI Build {env:BUILD_NUMBER} - {YYYY}-{MM}-{DD}" ./test-results.${
commandTypeFileExtensions[this.type]
}`,
'Create a new test run with a name template using environment variable and date placeholders, and upload results'
)

argv.example(
`$0 ${this.type} --run-name "Nightly Tests {YYYY}/{MM}/{DD} {HH}:{mm}" ./test-results.${commandTypeFileExtensions[this.type]}`,
'Create a new test run with name template using date and time placeholders and upload results'
`$0 ${
this.type
} --project-code P1 --run-name "Nightly Tests {YYYY}/{MM}/{DD} {HH}:{mm}" --create-tcases ./test-results.${
commandTypeFileExtensions[this.type]
}`,
'Create a new test run with a name template using date and time placeholders, create test cases for results without valid markers, and upload results'
)

argv.epilogue(`Requirements:
Test case names in the report should contain QA Sphere test case reference (PROJECT-SEQUENCE). This reference is used to match test cases in the report with test cases in QA Sphere.
argv.epilogue(`
${chalk.bold('Modes:')}
There are two modes for uploading results using the command:
- Upload to an existing test run by specifying its URL via ${chalk.bold(
'--run-url'
)} flag. The project code and run ID are extracted from the URL
- Create a new test run and upload results to it (when the --run-url flag is not specified). The following flags (all optional) are applicable in this mode: ${chalk.bold(
'--project-code'
)}, ${chalk.bold('--run-name')}, ${chalk.bold('--create-tcases')}
All other options apply to both modes.

${chalk.bold('Test Case Matching:')}
Test case names in the report should contain QA Sphere test case markers (PROJECT-SEQUENCE) to match the results.
- ${chalk.bold('PROJECT')} is your QASphere project code
- ${chalk.bold('SEQUENCE')} is the test case sequence number (at least three digits)

For example,
- ${chalk.bold('PRJ-312')}: Login with valid credentials
- Login with valid credentials: ${chalk.bold('PRJ-312')}

Required environment variables (in .qaspherecli or exported):
- QAS_TOKEN: Your QASphere API token
- QAS_URL: Your QASphere instance URL (e.g., https://qas.eu1.qasphere.com)
If markers are not present, use ${chalk.bold(
'--create-tcases'
)} to automatically create test cases in QA Sphere.

${chalk.bold('Required environment variables:')}
These should be either defined in a ${qasEnvFile} file or exported as environment variables:
- ${chalk.bold('QAS_TOKEN')}: Your QASphere API token
- ${chalk.bold('QAS_URL')}: Your QASphere instance URL (e.g., https://qas.eu1.qasphere.com)

Run name template placeholders:
${chalk.bold('Run name template placeholders:')}
- ${chalk.bold('{env:VAR_NAME}')}: Environment variables
- ${chalk.bold('{YYYY}')}: 4-digit year
- ${chalk.bold('{YY}')}: 2-digit year
16 changes: 16 additions & 0 deletions src/tests/fixtures/junit-xml/without-markers.xml
@@ -0,0 +1,16 @@
<testsuites id="" name="" tests="3" failures="0" skipped="0" errors="0" time="17.629">
<testsuite name="ui.cart.spec.ts" timestamp="2024-04-22T09:25:16.777Z" hostname="chromium"
tests="3" failures="0" skipped="0" time="17.629" errors="0">
<!-- Valid test case marker -->
<testcase name="Test cart TEST-002" classname="ui.cart.spec.ts" time="10.686">
</testcase>
<!-- No test case marker -->
<testcase name="The cart is still filled after refreshing the page"
classname="ui.cart.spec.ts" time="5.712">
</testcase>
<!-- Invalid test case marker -->
<testcase name="TEST-010: Cart should be cleared after making the checkout"
classname="ui.cart.spec.ts" time="1.231">
</testcase>
</testsuite>
</testsuites>