My library environment is as follows:

- torch: 2.6.0
- torchvision: 0.14.0+cu116
- calflops: 0.3.2
I executed the following code:

```python
import torch
from torchvision.models import resnet18, vit_b_16, swin_s
from calflops import calculate_flops

resnet = resnet18(weights=None).cuda()
vit = vit_b_16(weights=None).cuda()
swin = swin_s(weights=None).cuda()

batch_size = 1
input_shape = (batch_size, 3, 224, 224)

resnet_flops, resnet_macs, resnet_params = calculate_flops(model=resnet,
                                                           input_shape=input_shape,
                                                           output_as_string=False,
                                                           output_precision=2,
                                                           print_results=False)
vit_flops, vit_macs, vit_params = calculate_flops(model=vit,
                                                  input_shape=input_shape,
                                                  output_as_string=False,
                                                  output_precision=2,
                                                  print_results=False)
swin_flops, swin_macs, swin_params = calculate_flops(model=swin,
                                                     input_shape=input_shape,
                                                     output_as_string=False,
                                                     output_precision=2,
                                                     print_results=False)

print(f"ResNet18: {resnet_flops} GFLOPS, {resnet_macs} MACs, {resnet_params} params")
print(f"ViT-B/16: {vit_flops} GFLOPS, {vit_macs} MACs, {vit_params} params")
print(f"Swin-S: {swin_flops} GFLOPS, {swin_macs} MACs, {swin_params} params")
```
and I got the following results:

```
ResNet18: 3.64 GFLOPS, 1.81 GMACs, 11.69 M params
ViT-B/16: 33.72 GFLOPS, 16.85 GMACs, 86.57 M params
Swin-S: 17.52 GFLOPS, 8.74 GMACs, 49.61 M params
```
However, the official MMClassification and torchvision documentation report FLOPs figures that are almost equal to the MACs above, which means the FLOPs reported by calflops are about twice as large. Has anyone else had similar results?
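For what it's worth, a factor-of-two gap like this is usually just a counting convention: some tools count one multiply-accumulate (MAC) as one operation, while others count it as two FLOPs (one multiply plus one add). A minimal sketch of that arithmetic, using the shape of ResNet-18's stem convolution as an assumed example (7×7 kernel, 3→64 channels, 112×112 output):

```python
# One output element of a conv layer needs (c_in * k_h * k_w) multiply-
# accumulates. Tools that count a MAC as one op report half the number
# reported by tools that count the multiply and the add separately.
out_h = out_w = 112          # spatial size of the stem conv output
k_h = k_w = 7                # kernel size
c_in, c_out = 3, 64          # input / output channels

macs = out_h * out_w * c_out * (c_in * k_h * k_w)  # one MAC per mult-add
flops = 2 * macs                                   # mult and add counted separately

print(f"MACs:  {macs:,}")    # → MACs:  118,013,952
print(f"FLOPs: {flops:,}")   # → FLOPs: 236,027,904
```

So calflops' GFLOPS being roughly 2× the GFLOPs quoted by MMClassification/torchvision would be consistent with those tools reporting MACs under the name "FLOPs", not with a bug in either count.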