FLOPs Calculator
Estimate FLOPs (floating point operations) for PyTorch layers. Calculate computational cost for Conv2d, Linear, LSTM, and attention layers. Compare model efficiency.
Built by Michael Lip
Frequently Asked Questions
What are FLOPs?
FLOPs (Floating Point Operations) measure computational cost. One multiply-add counts as 2 FLOPs. A Linear(512, 256) layer performs 512 × 256 = 131,072 multiply-adds, i.e. 262,144 FLOPs.
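As a sketch (a hypothetical helper, not part of the tool itself), the Linear-layer count above can be computed directly, using the same convention of 2 FLOPs per multiply-add:

```python
def linear_flops(in_features: int, out_features: int) -> int:
    """FLOPs for one forward pass of Linear(in_features, out_features),
    ignoring the bias term and counting each multiply-add as 2 FLOPs."""
    macs = in_features * out_features  # one multiply-add per weight
    return 2 * macs

print(linear_flops(512, 256))  # 262144
```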
FLOPs vs FLOPS?
FLOPs (lowercase s) = total operations (a count). FLOPS (uppercase S) = operations per second (a rate). Inference time ≈ FLOPs / FLOPS.
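The rate-versus-count distinction gives a rough latency estimate. A minimal sketch (illustrative numbers, not a benchmark):

```python
def estimated_latency_s(flops: float, flops_per_second: float) -> float:
    """Rough inference time: total operations divided by hardware rate.
    Ignores memory bandwidth, kernel launch overhead, and utilization."""
    return flops / flops_per_second

# e.g. a 4 GFLOPs model on hardware sustaining 2 TFLOPS:
print(estimated_latency_s(4e9, 2e12))  # 0.002 seconds (2 ms)
```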
How to calculate Conv2d FLOPs?
Conv2d FLOPs = 2 × out_channels × out_h × out_w × in_channels × kernel_h × kernel_w. For Conv2d(3, 64, 3) with same padding on a 224×224 input (so the output is also 224×224): ≈ 173M FLOPs.
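The formula above can be sketched as a small helper (a hypothetical function for illustration, assuming stride 1 and same padding so the output spatial size matches the input):

```python
def conv2d_flops(in_ch: int, out_ch: int, k_h: int, k_w: int,
                 out_h: int, out_w: int) -> int:
    """FLOPs for one Conv2d forward pass: each of the out_ch*out_h*out_w
    output elements needs in_ch*k_h*k_w multiply-adds (2 FLOPs each)."""
    return 2 * out_ch * out_h * out_w * in_ch * k_h * k_w

# Conv2d(3, 64, 3) on a 224×224 input with same padding:
print(conv2d_flops(3, 64, 3, 3, 224, 224))  # 173408256 ≈ 173M
```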
FLOPs vs parameters?
Parameters measure model size (memory footprint). FLOPs measure computation per forward pass. The two don't always correlate. For latency-sensitive mobile deployment, FLOPs matter more; for memory-constrained devices, parameter count matters more.
Is this tool free?
Yes. All HeyTensor tools are free, run in your browser, and require no signup.
About This Tool
Part of HeyTensor. All calculations run in your browser. Source code on GitHub.
Contact
Built by Michael Lip. Email [email protected].