Floating-point operations (FLOPs)

A measure of computational work, used to quantify the cost of training and running models. FLOP thresholds appear in regulatory definitions (e.g. the EU AI Act and US executive orders on AI), and FLOP counts are relevant to compute cost negotiations.
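As a rough illustration (not part of the original entry), training compute for dense transformer models is often estimated with the widely used approximation of roughly 6 × parameters × tokens. The sketch below uses hypothetical model and dataset sizes, and compares the result against the EU AI Act's 10^25 FLOP systemic-risk presumption threshold; all specific values here are illustrative assumptions.

```python
def estimate_training_flops(n_params: float, n_tokens: float) -> float:
    """Rough training-compute estimate using the common ~6 * N * D rule of thumb
    (forward + backward passes for a dense transformer)."""
    return 6.0 * n_params * n_tokens


# Hypothetical example: a 70B-parameter model trained on 2 trillion tokens.
flops = estimate_training_flops(70e9, 2e12)
print(f"Estimated training compute: {flops:.2e} FLOPs")  # ~8.4e+23 FLOPs

# The EU AI Act presumes systemic risk for general-purpose models trained
# with more than 1e25 FLOPs of cumulative compute.
EU_AI_ACT_THRESHOLD = 1e25
print(f"Above EU AI Act threshold: {flops > EU_AI_ACT_THRESHOLD}")
```

The same rule of thumb is a common starting point for compute cost negotiations: multiplying the estimated FLOPs by hardware throughput and price per GPU-hour gives a first-order training budget.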

See: Compute; Training