Content Moderation
Runway takes the safety of its platform seriously. This means we will moderate certain API requests.
If your account makes too many requests that are moderated, we will suspend your account. To avoid suspension, consider adding your own moderation checks before calling the Runway API.
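One way to stay under that threshold is to screen prompts on your side before they ever reach Runway. The sketch below is a minimal illustration of such a gate; the screenPrompt helper, the blocked-term list, and submitToRunway are hypothetical placeholders for your own moderation tooling and your existing Runway API call, not part of Runway's API.

```typescript
// Minimal pre-moderation gate, run before any call to the Runway API.
// BLOCKED_TERMS and submitToRunway are placeholders for your own
// moderation policy and your existing Runway client call.

const BLOCKED_TERMS = ["example-blocked-term"]; // replace with your own policy

function screenPrompt(prompt: string): { allowed: boolean; reason?: string } {
  const lowered = prompt.toLowerCase();
  for (const term of BLOCKED_TERMS) {
    if (lowered.includes(term)) {
      return { allowed: false, reason: `prompt contains blocked term: ${term}` };
    }
  }
  return { allowed: true };
}

async function generateIfAllowed(prompt: string): Promise<void> {
  const verdict = screenPrompt(prompt);
  if (!verdict.allowed) {
    // Reject locally instead of sending a request that Runway would moderate.
    console.warn(`Skipping generation: ${verdict.reason}`);
    return;
  }
  await submitToRunway(prompt);
}

// Stub standing in for your actual Runway request (see the API reference).
async function submitToRunway(prompt: string): Promise<void> {
  // call the Runway API here
}
```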
Approach to Trust & Safety
Refer to our help center article on our approach to trust & safety.
Moderated content categories
Refer to our help center FAQ guide on content moderation.
Moderated content types
Runway moderation evaluates all elements of your request. This means a request may be moderated for a violation in either its image input or its text prompt.
Moderation implications
Recommended moderation approach
We recommend reviewing the categories we block to determine what type of moderation, if any, you need in place.
Each call to Runway’s API defaults to the auto moderation level. If you wish to be less strict about preventing generations that include recognizable public figures, add the contentModeration object to your image or video generation API requests. See the API reference for details.
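A minimal sketch of such a request follows, using an image-to-video generation as an example. The endpoint, version header, model name, and publicFigureThreshold value shown here are assumptions based on the API reference; check the reference for the exact request shape.

```typescript
// Illustrative request adding the contentModeration object to a video
// generation call. Endpoint, headers, and field values are assumptions;
// the API reference is authoritative.

async function createVideoTask(): Promise<string> {
  const response = await fetch("https://api.dev.runwayml.com/v1/image_to_video", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.RUNWAYML_API_SECRET}`,
      "X-Runway-Version": "2024-11-06",
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "gen4_turbo",
      promptImage: "https://example.com/input.jpg",
      promptText: "A portrait slowly turning toward the camera",
      ratio: "1280:720",
      // "auto" is the default level; "low" is less strict about
      // recognizable public figures, per the section above.
      contentModeration: { publicFigureThreshold: "low" },
    }),
  });

  const task = await response.json();
  return task.id; // task ID to poll for status
}
```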
Cost of moderated generations
Moderated generations have the same credit cost as successful generations.
Account suspension
If your account makes too many moderated requests, we will suspend it. You can appeal an account suspension here. Be sure to email from an email address associated with your developer portal login.
Moderation API responses
If a request was moderated, the task status response will return with "status": "FAILED". Additional details describing the moderation appear in the "failure" and "failureCode" fields.
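The sketch below shows one way to surface these fields when polling a task. The tasks endpoint and headers are assumptions based on the API reference; only the status, failure, and failureCode fields come from this section.

```typescript
// Sketch of checking a task and detecting a moderated request.
// Endpoint and headers are assumptions; verify against the API reference.

interface TaskResponse {
  id: string;
  status: "PENDING" | "RUNNING" | "SUCCEEDED" | "FAILED" | string;
  failure?: string;     // human-readable description of the failure
  failureCode?: string; // machine-readable code identifying the failure
}

async function checkTask(taskId: string): Promise<void> {
  const response = await fetch(`https://api.dev.runwayml.com/v1/tasks/${taskId}`, {
    headers: {
      Authorization: `Bearer ${process.env.RUNWAYML_API_SECRET}`,
      "X-Runway-Version": "2024-11-06",
    },
  });
  const task = (await response.json()) as TaskResponse;

  if (task.status === "FAILED") {
    // Moderated requests come back as FAILED; log the details so you can
    // adjust prompts or your own pre-moderation rules.
    console.error(`Task ${task.id} failed: ${task.failure} (${task.failureCode})`);
    return;
  }
  console.log(`Task ${task.id} status: ${task.status}`);
}
```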