How Many Parameters Does Linear(768, 3072) Have?

Linear(768, 3072) has 2,362,368 trainable parameters. This includes 2,359,296 weights and 3072 bias terms.

Formula Breakdown

For a Linear layer, the parameter count is:

parameters = in_features * out_features + out_features (bias)
parameters = 768 * 3072 + 3072
parameters = 2,359,296 + 3072
parameters = 2,362,368

The weight matrix W has shape (out_features, in_features) = (3072, 768), giving 3072 * 768 = 2,359,296 values. The bias vector b has 3072 values. Together: 2,362,368 trainable parameters.
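As a sanity check on the shapes, the forward pass computes y = x @ W.T + b, which is why PyTorch stores the weight as (out_features, in_features). A minimal NumPy sketch, using a hypothetical batch of 2 input vectors:

```python
import numpy as np

# Weight stored as (out_features, in_features), matching PyTorch's convention.
x = np.random.randn(2, 768)     # (batch, in_features)
W = np.random.randn(3072, 768)  # (out_features, in_features)
b = np.random.randn(3072)       # (out_features,)

y = x @ W.T + b                 # forward pass of a linear layer
print(y.shape)                  # (2, 3072)
```

The parameter count is just the number of entries in W plus the number of entries in b.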

Memory Usage

In float32 (4 bytes per value), this layer's 2,362,368 parameters occupy about 9.01 MB. During training with the Adam optimizer, multiply by roughly 3 (weights + first-moment + second-moment state) for about 27.04 MB, and gradients add another copy on top of that.
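These figures can be reproduced with back-of-envelope arithmetic (using MiB, i.e. 1024^2 bytes, to match the numbers quoted above):

```python
# Back-of-envelope memory estimate for Linear(768, 3072) in float32.
params = 768 * 3072 + 3072          # weights + bias = 2,362,368
bytes_fp32 = params * 4             # 4 bytes per float32 value
mb = bytes_fp32 / (1024 ** 2)
print(f"Parameters: {mb:.2f} MB")            # Parameters: 9.01 MB
# Adam keeps two extra states (first and second moments) per parameter:
print(f"With Adam states: {3 * mb:.2f} MB")  # With Adam states: 27.04 MB
```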

Architecture Context

This layer configuration appears in BERT's feed-forward network (FFN), where the 768-dimensional hidden state is projected up to an intermediate size of 3072. Understanding parameter counts helps you estimate model size, memory requirements, and the risk of overfitting. Layers with more parameters need more training data and compute to train effectively.

Linear layers are often the most parameter-heavy part of a network. For example, VGG-16 has ~124M parameters in its three fully connected layers versus only ~14M in all its convolutional layers. Modern architectures minimize linear layers by using global average pooling.
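The VGG-16 figure can be verified from the standard classifier shapes (a flattened 7x7x512 feature map feeding 4096 -> 4096 -> 1000 units), using the same formula as above:

```python
# Parameter counts for VGG-16's three fully connected layers,
# assuming the standard ImageNet configuration.
fc1 = 7 * 7 * 512 * 4096 + 4096  # flattened conv output -> 4096
fc2 = 4096 * 4096 + 4096         # 4096 -> 4096
fc3 = 4096 * 1000 + 1000         # 4096 -> 1000 classes
total = fc1 + fc2 + fc3
print(f"{total:,}")              # 123,642,856 (~124M)
```

The first FC layer alone accounts for over 100M parameters, which is why replacing the flatten-then-FC pattern with global average pooling shrinks models so dramatically.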

PyTorch Code to Verify

import torch.nn as nn

layer = nn.Linear(768, 3072)

# Count parameters
total = sum(p.numel() for p in layer.parameters())
print(f"Total parameters: {total}")  # 2,362,368

# Break it down
print(f"Weight shape: {layer.weight.shape}")  # torch.Size([3072, 768])
print(f"Weight params: {layer.weight.numel()}")  # 2,359,296
print(f"Bias shape: {layer.bias.shape}")  # torch.Size([3072])
print(f"Bias params: {layer.bias.numel()}")  # 3072

# Without bias
layer_no_bias = nn.Linear(768, 3072, bias=False)
print(f"Without bias: {sum(p.numel() for p in layer_no_bias.parameters())}")  # 2,359,296

Comparison: With vs. Without Bias

Configuration                        Parameters
Linear(768, 3072) (with bias)        2,362,368
Linear(768, 3072, bias=False)        2,359,296

When a normalization layer immediately follows (BatchNorm after a convolution, or LayerNorm after a linear layer in a transformer block), the bias is redundant because the normalization applies its own learnable shift. Setting bias=False saves 3072 parameters for this layer.
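A sketch of this pattern, using LayerNorm as the transformer-style example (the same reasoning applies to BatchNorm after a convolution):

```python
import torch.nn as nn

# The LayerNorm's learnable shift (beta) absorbs the role of the linear
# bias, so bias=False loses no expressiveness here.
block = nn.Sequential(
    nn.Linear(768, 3072, bias=False),  # 2,359,296 params
    nn.LayerNorm(3072),                # gamma + beta: 2 * 3072 = 6,144 params
)
total = sum(p.numel() for p in block.parameters())
print(total)  # 2365440
```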
