Greg_Cunningham Genesys Employee
February 21

I worked out a way to return the epoch if there is no value present. The following success template

"successTemplate": "{"activityDate": ${successTemplateUtils.firstFromArray(${activityDateValue}, "${esc.quote}1970-01-01T00:00:00.000Z${esc.quote}")} }"

will return

            "activityDate": "1970-01-01T00:00:00.000Z"

This solution uses the escape tools to wrap the epoch date in the quotes required to build a valid JSON body.

The returned epoch date can then be used in your Architect or flow logic to recognize that the query came back with empty results.
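
Putting this together with the translation map from the original post below, the full response configuration could look something like this sketch. It assumes the data action's output contract declares activityDate as a string; the translationMap, defaults, and template structure are taken from the posts in this thread.

{
  "translationMap": {
    "activityDateValue": "$.results[0].entities[*].activityDate"
  },
  "translationMapDefaults": {
    "activityDateValue": "[]"
  },
  "successTemplate": "{\"activityDate\": ${successTemplateUtils.firstFromArray(\"${activityDateValue}\", \"${esc.quote}1970-01-01T00:00:00.000Z${esc.quote}\")} }"
}

When a caller is waiting, the same template should return the first activityDate from the results rather than the epoch default.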




Previous Replies

Zockie
February 20

Update: I changed the contract output to be a string and now get the following error:

Validate output against schema: JSON failed output schema validation for the following reasons: Schema: # @/properties/activityDate. Error location: /activityDate. instance type (integer) does not match any allowed primitive type (allowed: ["string"])

So now I'm working on figuring out how to get it to read that 0 as a string instead of an integer. As it stands, the response works when there is a single caller or multiple callers, but obviously not when there are no callers.
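
The reason the 0 comes through as an integer appears to be that the default from firstFromArray is inserted into the body as-is. With the current template, the empty-queue case presumably renders as

{ "activityDate": 0 }

which is an integer literal in JSON. Wrapping the default value in quotes would instead render

{ "activityDate": "0" }

which validates as a string. This is inferred from the error messages in this thread rather than from tested output.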

Zockie
February 20

Hi,

I'm struggling to get this working properly. For context, the entire purpose of this is to get the longest waiting caller's current wait time, which is represented by "activityDate". I need activityDateValue to return 0 when null (no callers waiting in queue) and the ISO 8601 string when there is a caller waiting.

I have tried following many articles related to the subject, but I am met with an error whichever way I try. My current approach uses the structure defined in Help with translation map syntax in data-action - Data Actions - Genesys Cloud Developer Forum.

ENDPOINT: /api/v2/analytics/conversations/activity/query

Request Body:

{
  "metrics": [
    {
      "metric": "oWaiting",
      "details": true
    }
  ],
  "groupBy": [
    "queueId"
  ],
  "filter": {
    "type": "and",
    "clauses": [
      {
        "type": "and",
        "predicates": [
          {
            "type": "dimension",
            "dimension": "queueId",
            "operator": "matches",
            "value": "c256fb45-d7bc-44fd-bb76-7bd5bfc042cc"
          }
        ]
      }
    ]
  },
  "order": "asc"
}

ENDPOINT RESPONSE FOR CONTEXT

{
  "results": [
    {
      "group": {
        "queueId": "c256fb45-d7bc-44fd-bb76-7bd5bfc042cc"
      },
      "data": [
        {
          "metric": "oWaiting",
          "qualifier": "voice",
          "entityIds": [
            "2c752ca9-1f3b-300d-9a94-beebd1399589"
          ],
          "count": 1
        }
      ],
      "truncated": false,
      "entities": [
        {
          "activityDate": "2025-02-20T16:34:07.243Z",
          "metric": "oWaiting",
          "ani": "tel:+17066314756",
          "conversationId": "894f3ec7-10e4-4c26-afc4-ddb1f810da60",
          "direction": "inbound",
          "dnis": "tel:+16146595785",
          "mediaType": "voice",
          "participantName": "Augusta GA",
          "queueId": "c256fb45-d7bc-44fd-bb76-7bd5bfc042cc",
          "requestedRoutings": [
            "Standard"
          ],
          "routingPriority": 0,
          "sessionId": "2c752ca9-1f3b-300d-9a94-beebd1399589"
        }
      ]
    }
  ],
  "entityIdDimension": "sessionId"
}

RESPONSE MAPPING

{
  "translationMap": {
    "activityDateValue": "$.results[0].entities[*].activityDate"
  },
  "translationMapDefaults": {
    "activityDateValue": "[]"
  },
  "successTemplate": "{\"activityDate\":${successTemplateUtils.firstFromArray(\"${activityDateValue}\",\"0\")}}"
}

My current error is:

Validate output against schema: JSON failed output schema validation for the following reasons: Schema: # @/properties/activityDate. Error location: /activityDate. instance type (string) does not match any allowed primitive type (allowed: ["integer"])
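
For context, with the sample response above, the success template presumably renders as

{"activityDate":"2025-02-20T16:34:07.243Z"}

when a caller is waiting. That value is a string, hence the mismatch with an output contract that only allows an integer for activityDate.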

