Understanding Metric Data
The Metric endpoint returns the measured results for each of the Cloud Manager quality gates: Code Quality, Security Testing, Performance Testing, and Experience Audit. Each metric is represented as a JSON object containing the severity, the expected value, the actual value, the result, the comparator used to generate the result, and the metric name.
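To illustrate, a request for a step's metrics might look like the following minimal Python sketch. The endpoint path, host, and header names here are assumptions based on the Cloud Manager API's execution/step resources and should be confirmed against the API reference; all IDs and credentials are placeholders.

import requests

# Placeholder path; substitute real program/pipeline/execution/phase/step IDs.
# The exact route is an assumption; confirm it against the API reference.
url = ("https://cloudmanager.adobe.io/api/program/1234/pipeline/5678"
       "/execution/9012/phase/3456/step/7890/metrics")

headers = {
    "Authorization": "Bearer <access-token>",  # placeholder credentials
    "x-api-key": "<api-key>",
    "x-gw-ims-org-id": "<organization-id>",
}

response = requests.get(url, headers=headers)
response.raise_for_status()
metrics = response.json().get("metrics", [])  # assumed response envelope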
For example, if you look at the code quality metrics, you will see an object like this:
{
  "id": "7776",
  "severity": "important",
  "passed": true,
  "override": false,
  "actualValue": "50.4",
  "expectedValue": "50",
  "comparator": "GTE",
  "kpi": "coverage"
}
Here we can see that this is the coverage metric, that the expected value is 50 and the actual value is 50.4, that to pass the actual value must be greater than or equal to (GTE) the expected value, and that the metric passed.
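The comparator names the comparison applied between actualValue and expectedValue. As a rough sketch of how a client could re-derive the passed flag (only GTE is confirmed by the example above; the remaining comparator names are assumptions about the likely set):

# Comparison functions keyed by comparator name. Only GTE appears in the
# example above; GT/LT/LTE/EQ/NEQ are assumed variants, not confirmed.
COMPARATORS = {
    "GT": lambda actual, expected: actual > expected,
    "GTE": lambda actual, expected: actual >= expected,
    "LT": lambda actual, expected: actual < expected,
    "LTE": lambda actual, expected: actual <= expected,
    "EQ": lambda actual, expected: actual == expected,
    "NEQ": lambda actual, expected: actual != expected,
}

def metric_passed(metric: dict) -> bool:
    # Numeric values arrive as JSON strings (e.g. "50.4"), so convert first.
    actual = float(metric["actualValue"])
    expected = float(metric["expectedValue"])
    return COMPARATORS[metric["comparator"]](actual, expected)

metric = {"actualValue": "50.4", "expectedValue": "50", "comparator": "GTE"}
assert metric_passed(metric)  # 50.4 >= 50, so the metric passed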
The kpi value is a technical identifier which will generally need to be translated into a user-facing name. The reference lists below give the technical names that appear in the API response for each of the quality gates; the documentation for each gate defines the corresponding user-facing names and definitions. A small lookup sketch follows the lists.
Code Quality
security_rating
reliability_rating
sqale_rating
coverage
skipped_tests
open_issues
duplicated_lines_density

Security Testing

deserialization_firewall_attach_api_readiness
deserialization_firewall_functional
deserialization_firewall_loaded
authorizable_node_name_generation
default_login_accounts
sling_get_servlet
cq_dispatcher_configuration
cq_html_library_manager_config
sling_java_script_handler
sling_jsp_script_handler
sling_referrer_filter
ssl_configuration
user_profile_default_access
crxde_support
davex_health_check
example_content_packages
wcm_filter_configuration
webdav_health_check
web_server_configuration
replication_and_transport_users

Performance Testing

error_rate
cpu_utilization_rate
disk_io_wait_time
response_time
peak_resp_time
views_per_minute
disk_bandwidth_util
network_bandwidth_util
requests_per_minute

Experience Audit

performance
best-practices
accessibility
seo
pwa
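When presenting metrics, a client typically maps these identifiers to human-readable labels. Below is a minimal lookup sketch; the labels are illustrative stand-ins rather than the official documentation names, and unmapped identifiers fall back to the raw technical value.

# Illustrative labels only; consult the quality gate documentation for the
# official user-facing name of each kpi.
KPI_LABELS = {
    "coverage": "Unit Test Coverage",
    "security_rating": "Security Rating",
    "reliability_rating": "Reliability Rating",
    "best-practices": "Best Practices",
    "seo": "SEO",
    "pwa": "PWA",
}

def kpi_label(kpi: str) -> str:
    # Fall back to the technical name when no mapping is defined.
    return KPI_LABELS.get(kpi, kpi)

print(kpi_label("coverage"))    # Unit Test Coverage
print(kpi_label("error_rate"))  # error_rate (no mapping defined here)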