Illegal content depicting the sexual abuse of minors, commonly referred to as child sexual abuse material (CSAM). Many AI services implement controls to detect, block, and report CSAM, driven by legal obligations and platform policy; the specific duties, reporting pathways, and retention requirements vary by jurisdiction and service design.
See: Content filtering; Prohibited AI practices; Safety policy