Headless Integration

In addition to the standard integration, we offer endpoints in case you want to build your own UI for different flows:

  • community reports
  • statements of reason and appeals
  • moderation decisions
  • inquiries

To call these endpoints, follow the same principles as the standard integration.
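As an illustration, a request to one of these endpoints might be assembled as below. The base URL, endpoint path, and Bearer header scheme here are placeholders, not part of this document; use the values from your standard integration setup:

```python
import json
import urllib.request

# Hypothetical values: take the real base URL and API key
# from your standard integration configuration.
BASE_URL = "https://api.example.com"
API_KEY = "your-api-key"

def build_request(path: str, payload: dict) -> urllib.request.Request:
    """Build an authenticated JSON POST request for a headless endpoint."""
    return urllib.request.Request(
        url=f"{BASE_URL}{path}",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # The auth header scheme is an assumption; check your setup.
            "Authorization": f"Bearer {API_KEY}",
        },
        method="POST",
    )

req = build_request("/reports", {"reason": "spam"})
```

The request object can then be sent with `urllib.request.urlopen(req)` or any HTTP client of your choice.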

Community Reports

The Checkstep API can be used to receive reports from members of your community. These are a valuable source of information, as they often contain context that AI models may fail to detect, and a high volume of community reports can also signal an emergency.

OpenAPI Specification

An example payload:

{
  "timestamp": "2021-06-27T10:20:18.019Z",
  "objectType": "user-profile",
  "contentId": "3111b18c-bbdc-48aa-b2a2-b52c1879a36d",
  "category": "adult-content",
  "reason": "The user profile is explicit",
  "userId": "cd004eac-d32a-41ab-9751-8764bd0aa447"
}

The contentId supplied as part of the request should be the ID of the content on your platform. If a piece of content with the same objectType and contentId has already been sent to Checkstep, the community report will be associated with that piece of content; otherwise, the community report will still exist, but in isolation. Review queues need to be configured with reporting categories for community-report incidents to appear.
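A minimal sketch of assembling this payload client-side, using the field names from the example above. The helper name and the required-field check are our own assumptions, not part of the API contract:

```python
import json
from datetime import datetime, timezone

def build_community_report(object_type: str, content_id: str,
                           category: str, reason: str, user_id: str) -> str:
    """Assemble a community-report payload as a JSON string."""
    # Our own guard: catch empty identifiers before the request is sent.
    for name, value in [("objectType", object_type),
                        ("contentId", content_id),
                        ("category", category)]:
        if not value:
            raise ValueError(f"{name} is required")
    report = {
        "timestamp": datetime.now(timezone.utc)
                     .isoformat()
                     .replace("+00:00", "Z"),
        "objectType": object_type,
        "contentId": content_id,
        "category": category,
        "reason": reason,
        "userId": user_id,
    }
    return json.dumps(report)

body = build_community_report(
    "user-profile", "3111b18c-bbdc-48aa-b2a2-b52c1879a36d",
    "adult-content", "The user profile is explicit",
    "cd004eac-d32a-41ab-9751-8764bd0aa447",
)
```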

Categories

OpenAPI Specification

A set of reporting categories can be provisioned for each platform that has been configured. A reporting category must be sent as part of every incoming report.

The endpoint returns JSON containing a list of all report categories that have been configured on your platform.

Statement of reason

To provide transparency to your users, you can call the statement of reason endpoint. If a piece of content was removed, it provides information about the moderation decision associated with the removal.

You will also get the status of any appeal.

Statuses are, in order:

  • no-incident: no incident has been detected on this content case yet
  • enforced: a human or automated decision has been enforced on this content; the sor section is provided
  • appealed: an appeal has been submitted and is pending review; the appeal section is provided
  • appeal-ruled: the appeal has been reviewed; the appealRuling section is provided

OpenAPI Specification

An example of a JSON response:

{
  "id": "f04d6dbc40ec2839d5a7f38c7aad902f6ea092e8",
  "status": "appeal-ruled",
  "content": {
    "timestamp": "2023-12-19T16:27:39.806132Z",
    "fields": [
      {
        "id": "text",
        "type": "text",
        "src": "Offensive text"
      }
    ]
  },
  "sor": {
    "timestamp": "2023-12-19T16:28:57.568469Z",
    "violations": [
      {
        "policy": {
          "name": "Hate Speech",
          "code": "HTE",
          "description": "Our platform is not a place for ..."
        },
        "field": "text"
      }
    ]
  },
  "appeal": {
    "timestamp": "2023-12-20T13:04:51.209658Z",
    "reason": "I don't agree with the decision !"
  },
  "appealRuling": {
    "timestamp": "2023-12-20T13:07:04.890362Z",
    "accepted": false
  }
}
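A client can dispatch on the status field to decide which section of the response to read. The sketch below follows the status list above; the helper name is our own:

```python
def summarize_sor(response: dict) -> str:
    """Return a one-line summary of a statement-of-reason response."""
    status = response["status"]
    if status == "no-incident":
        return "no incident detected"
    if status == "enforced":
        # The sor section lists the policies the content violated.
        policies = [v["policy"]["code"] for v in response["sor"]["violations"]]
        return f"enforced for: {', '.join(policies)}"
    if status == "appealed":
        return f"appeal pending: {response['appeal']['reason']}"
    if status == "appeal-ruled":
        verdict = "accepted" if response["appealRuling"]["accepted"] else "rejected"
        return f"appeal {verdict}"
    raise ValueError(f"unknown status: {status}")

# Matches the example response above.
summary = summarize_sor({
    "status": "appeal-ruled",
    "appealRuling": {"accepted": False},
})
```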

Appeal

When your users do not agree with a moderation decision, they should have the right to appeal. Call the appeal endpoint to submit an appeal.

OpenAPI Specification

An example payload:

{
  "timestamp": "2023-11-16T13:42:46.123Z",
  "statement": "I don't agree with the decision, bring my content back !",
  "author": "a672bd2a-d103-46e7-a76a-77b431bc48b7"
}

A successful request returns a 202 Accepted response status code.
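A sketch of building the appeal payload shown above. The millisecond timestamp format mirrors the example; the helper name is our own assumption:

```python
import json
from datetime import datetime, timezone

def build_appeal(statement: str, author: str) -> str:
    """Assemble an appeal payload as a JSON string."""
    if not statement:
        raise ValueError("statement is required")
    return json.dumps({
        # Millisecond-precision UTC timestamp, as in the example payload.
        "timestamp": datetime.now(timezone.utc)
                     .isoformat(timespec="milliseconds")
                     .replace("+00:00", "Z"),
        "statement": statement,
        "author": author,
    })

body = build_appeal("I don't agree with the decision",
                    "a672bd2a-d103-46e7-a76a-77b431bc48b7")
```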

Decisions

If you've made a decision about the content or author outside Checkstep, send us the related information to ensure transparency.

About content

OpenAPI Specification

An example payload:

{
  "id": "f8cdba92-7b53-4b14-83ac-4a168aaca4c3",
  "type": "user-profile",
  "decision": "act",
  "moderator": "ae484ce1-1ad5-487e-a112-e5d73a925e45",
  "violations": [
    {
      "field": "firstName",
      "policy": "HTE"
    }
  ]
}

The below list contains all options for decision:

  • act: you took down the content
  • dismiss: you reviewed the content and found no violation
  • escalate: you decided to escalate the case to an upper level
  • overturn: you restored taken-down content
  • upheld: you rejected the user's appeal and kept the content down
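The allowed decision values can be checked client-side before the request is sent. A small sketch; the constant and function names are our own:

```python
# The five decision values listed above.
CONTENT_DECISIONS = {"act", "dismiss", "escalate", "overturn", "upheld"}

def build_content_decision(content_id: str, content_type: str, decision: str,
                           moderator: str, violations: list) -> dict:
    """Validate and assemble a content decision payload."""
    if decision not in CONTENT_DECISIONS:
        raise ValueError(f"unknown decision: {decision}")
    return {
        "id": content_id,
        "type": content_type,
        "decision": decision,
        "moderator": moderator,
        "violations": violations,
    }

payload = build_content_decision(
    "f8cdba92-7b53-4b14-83ac-4a168aaca4c3", "user-profile", "act",
    "ae484ce1-1ad5-487e-a112-e5d73a925e45",
    [{"field": "firstName", "policy": "HTE"}],
)
```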

About author

OpenAPI Specification

An example payload:

{
  "author": "f8cdba92-7b53-4b14-83ac-4a168aaca4c3",
  "decision": "act",
  "moderator": "ae484ce1-1ad5-487e-a112-e5d73a925e45",
  "violations": [
    "HTE",
    "SXC"
  ]
}

The below list contains all options for decision:

  • act: you banned the author
  • overturn: you canceled the ban for the author
  • escalate: you decided to escalate the case to an upper level
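The same client-side check applies to author decisions, which accept a smaller set of values. A sketch with names of our own:

```python
# The three decision values listed above.
AUTHOR_DECISIONS = {"act", "overturn", "escalate"}

def build_author_decision(author: str, decision: str,
                          moderator: str, violations: list) -> dict:
    """Validate and assemble an author decision payload."""
    if decision not in AUTHOR_DECISIONS:
        raise ValueError(f"unknown decision: {decision}")
    return {
        "author": author,
        "decision": decision,
        "moderator": moderator,
        "violations": violations,  # policy codes only, per the example above
    }

payload = build_author_decision(
    "f8cdba92-7b53-4b14-83ac-4a168aaca4c3", "act",
    "ae484ce1-1ad5-487e-a112-e5d73a925e45", ["HTE", "SXC"],
)
```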

Inquiries

You can open a new incident on a piece of content by creating an inquiry. This can be useful if you want your moderators to review content that was not identified as potentially violating by automation.

OpenAPI Specification

An example payload:

{
  "id": "f8cdba92-7b53-4b14-83ac-4a168aaca4c3",
  "type": "user-profile",
  "violations": [
    {
      "policy": "HTE"
    }
  ],
  "origin": "internal-report-1"
}
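A sketch of assembling the inquiry payload above. The helper name and the non-empty-violations guard are our own assumptions, not API rules:

```python
def build_inquiry(content_id: str, content_type: str,
                  policy_codes: list, origin: str) -> dict:
    """Assemble an inquiry payload flagging one or more policies."""
    # Our own guard: an inquiry without any policy is likely a mistake.
    if not policy_codes:
        raise ValueError("at least one policy code is required")
    return {
        "id": content_id,
        "type": content_type,
        "violations": [{"policy": code} for code in policy_codes],
        "origin": origin,
    }

payload = build_inquiry("f8cdba92-7b53-4b14-83ac-4a168aaca4c3",
                        "user-profile", ["HTE"], "internal-report-1")
```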