Task
Displays the tasks of all chips in the current project, organized into the workflow analysis and data analysis modules; click to switch between the two lists.
Workflow tasks
Displays all task lists of the current project (including SAW, advanced analysis, and Workflow tasks).
Task list
List search
Search tasks by project name, SN, task ID, status, or workflow name.

Task status
- Task status is divided into waiting, running, canceled, error, completed, and warning. When a task is in the error state, you can click the "Error" button to view the logs of the failed task.
Operations:
● Common operations include viewing task details, canceling, deleting, rerunning, and authorizing tasks.
● Standard analysis tasks also support "Visualization", "Enter Interactive Analysis", "Run Recut" (T tasks only), and viewing reports.
● Advanced analysis tasks also support "Visualization".
FAQ about tasks
How to resume a task from a breakpoint
Tips
The cloud platform uses Cromwell's call-caching policy to implement breakpoint resumption. When a task is detected to be identical to a previous execution record, the previous computation results are reused directly instead of being recalculated, which saves time and cost. After a computing job has completed successfully, if the platform later encounters exactly the same inputs and exactly the same execution command, actual execution is skipped and the existing results are returned immediately.
- Hit rule: the platform calculates a hash value from the input parameters (input count and input values), the runtime attribute values (such as failOnStderr and continueOnReturnCode), the command line, and the output parameters (output count and output declarations). If the hash of a computing job matches a job record in the platform's history, and that record is still within its retention period, call-caching is considered a hit. Therefore, modifying the inputs, outputs, or command causes a cache miss; the hash comparison sketch after this list illustrates the idea.
- Effective retention time: by default, the computing job records of a failed task are retained for 3 days. Once the data has been saved to Data Management, the intermediate results are deleted immediately.
- Hit result: if call-caching is hit for the computing jobs generated by a task, those jobs are skipped and the output results from the history record are reused. The platform does not create back-end computing jobs for skipped steps, so they consume no resources and incur no charges.
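To make the hit rule concrete, here is a minimal Python sketch of the idea: one hash computed over the inputs, runtime attributes, command line, and outputs, so that changing any of them changes the key and misses the cache. This is only an illustration of the principle, not Cromwell's actual hashing algorithm, and all field values below are made-up examples.

import hashlib
import json

def call_cache_key(inputs, runtime, command, outputs):
    # Hash the input parameters, runtime attributes (e.g. failOnStderr,
    # continueOnReturnCode), the command line, and the output declarations together.
    record = {"inputs": inputs, "runtime": runtime,
              "command": command, "outputs": outputs}
    blob = json.dumps(record, sort_keys=True).encode("utf-8")
    return hashlib.sha256(blob).hexdigest()

# Identical job records produce the same key, so the job can be skipped.
a = call_cache_key({"fastq": "S1.fq.gz"}, {"failOnStderr": False}, "align --in sample.fq", {"bam": "out.bam"})
b = call_cache_key({"fastq": "S1.fq.gz"}, {"failOnStderr": False}, "align --in sample.fq", {"bam": "out.bam"})
# Changing any field (here the input) changes the key and misses the cache.
c = call_cache_key({"fastq": "S2.fq.gz"}, {"failOnStderr": False}, "align --in sample.fq", {"bam": "out.bam"})
print(a == b)  # True  -> cache hit
print(a == c)  # False -> cache miss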
Why doesn't a task hit call-caching?
A cache hit requires consistent task inputs
If a task's runtime attributes are set from an input variable, modifying the runtime is treated as modifying the input, which results in a different hash for the task.
If a file parameter is declared as String instead of File, the call cache may miss. With the File type, two identical files stored in different locations have the same hash value; with String, even when the file contents are identical, the string values (the paths) differ between locations and therefore hash differently, as the sketch below illustrates.
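The following minimal sketch shows the difference in principle (it is not the platform's actual hashing code; the paths are placeholders): a File input is effectively compared by content, while a String input is compared as literal text.

import hashlib

def hash_as_file(path):
    # File-type input: hash the file content, so identical files stored
    # in different locations produce the same hash.
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def hash_as_string(value):
    # String-type input: hash the literal text (here, the path), so two
    # different paths hash differently even if the files are identical.
    return hashlib.sha256(value.encode("utf-8")).hexdigest()

# hash_as_file("/run1/sample.fastq") == hash_as_file("/run2/sample.fastq")     -> same content, same hash
# hash_as_string("/run1/sample.fastq") != hash_as_string("/run2/sample.fastq") -> different paths, different hash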
A cache hit requires consistent task outputs
- The number of outputs and the output parameters must be identical for a hit.
How to view task logs
After a task has been submitted, you can view its details, status, logs, and results in Task Management. The details page shows the analysis status of each subtask, the submission information, and the inputs and outputs of the workflow. The run log contains the logs produced while the task was executing in the container.

View Task Status
For workflow analysis tasks, the status is one of: to be analyzed, running, completed, failed, or canceled. The meaning of each status is as follows:
- To be analyzed: the task is queued and waiting to be scheduled into the resource pool for analysis.
- Running: after background scheduling, the task has entered the resource pool. Note that the underlying resource pool may be full; go to the task details page to check the running status of each subtask. If a subtask's status is "to be analyzed", it is queued in the resource pool; if its status is "running", it is being analyzed.
- Completed: task analysis has finished and the data has been saved to Data Management.
- Failed: an error was reported during task execution. Click "Failed" to open the log interface and view the specific reason for the failure.
- Canceled: the task was canceled midway and analysis stopped.
View the results of a task
After a task is completed, you can view the result files it generated. In the workflow analysis task list, click Details to view the task's basic information and log information. When View Result Directory is highlighted, the task's results have been exported to Data Management and you can view the result files.

- Click the file name under the "View Results Directory" column to jump to the corresponding directory under the display results folder in Data Management.

Click the "View" button in the output action bar to preview the file online.
View Run Log
When a task fails or behaves unexpectedly, you can open the Run Log page to view the workflow execution logs, which show the output from the container and the workflow engine's execution logs produced while the workflow was running.

The workflow run details record the log information generated by each step, including [stderr], [stdout], [script], and [intermediate result file].
- [stderr] is the standard error stream output by the container; most program execution errors can be viewed here. Note that the program must explicitly write its execution errors to the standard output or standard error streams, otherwise no useful information will be available after the task runs. Refer to the following definition:
import sys

def main():
    # Print to standard output
    print("This is the standard output message")
    # Print to standard error output
    print("This is the standard error output message", file=sys.stderr)

if __name__ == "__main__":
    main()
When the log output is large, you can also collect it into a log file and write that file out as an intermediate result file; see the sketch after this list.
- [stdout] is the standard output stream of the container, i.e. the program's normal output; its content can be redirected to a file or another process.
- [script] is the task execution script, which can only be viewed by the task creator.
- [intermediate result file] contains the files output during task execution.
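As an example of collecting verbose output into a log file, here is a minimal Python sketch (the directory and file name, log/run.log, are placeholders): the file is written into the step's working directory so that, per the rule above, it appears among the step's intermediate result files.

import logging
import os

# Create a directory for the log file in the step's working directory.
os.makedirs("log", exist_ok=True)

# Send verbose output to a log file instead of printing everything to stdout/stderr.
logging.basicConfig(
    filename="log/run.log",
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(message)s",
)

logging.info("step started")
logging.warning("filtered %d low-quality reads", 123)
logging.error("errors written here can be inspected via the intermediate result files")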
How to view intermediate result files
To view the files generated by a step of a running or failed task, click the step's "intermediate result file" button in the workflow run details to view them online.

Online viewing opens the file in a web page, and some file types may appear garbled. In that case, you can open a container in the personalized analysis module to view the file.
Enter the personalized analysis module. When creating a new personalized analysis, choose [Mount by Task], select a task in the "Running" or "Failed" state, and mount the task's intermediate result files into the container for access.
Tips
Intermediate result files are governed by the following rules; once an intermediate result file has been deleted, it can no longer be viewed.
- When a task completes, its final result files are saved to Data Management and its intermediate result files are deleted automatically.
- When a task fails or is canceled, its intermediate result files are kept for three days by default and then deleted.
Analysis
The data analysis module displays the online and offline Notebook task lists; click a tab to switch between them.
Online analysis tasks
After entering the project, you can view the online analysis tasks in the project's "Task Management" module, including the computing and image resources used by each task, the elapsed duration, the remaining duration, and so on. You can also perform the "Open", "Logs", "Close", and "Delay" operations in this module.

Offline analysis tasks
You can view offline tasks through the "Task Management - Data Analysis" module. For batch tasks, clicking "Open" displays the list of sub-tasks.

Clicking on the "Details" button in the operation column allows you to view task details, task logs, and resource consumption.

