How to Fix "View Size Is Not Compatible" in PyTorch

Quick answer: replace .view() with .reshape(), or call .contiguous() before .view(). The error occurs when a tensor is no longer stored contiguously in memory, typically after operations such as .transpose() or .permute().

The Error

RuntimeError: view size is not compatible with input tensor's size and stride
(at least one dimension spans across two contiguous subspaces). Use .reshape(...) instead.
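A minimal reproduction (shapes chosen for illustration): transposing makes the tensor non-contiguous, so .view() cannot reinterpret the memory without a copy and raises instead.

```python
import torch

x = torch.randn(2, 3, 4)
try:
    # transpose(1, 2) swaps strides; flattening the last two dims
    # with .view() is then impossible without copying
    x.transpose(1, 2).view(2, -1)
except RuntimeError as e:
    print(e)  # "view size is not compatible with input tensor's size and stride ..."
```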

Fix 1: Replace .view() with .reshape() (Recommended)

# Instead of this:
x = x.transpose(1, 2).view(batch, -1)  # ERROR

# Do this:
x = x.transpose(1, 2).reshape(batch, -1)  # Works ✓

.reshape() handles both contiguous and non-contiguous tensors. It returns a view when possible and copies data when necessary.
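You can verify the view-vs-copy behavior by comparing storage pointers; a sketch:

```python
import torch

a = torch.randn(3, 4)

# Contiguous input: .reshape() returns a view of the same storage
same = a.reshape(12)
print(same.data_ptr() == a.data_ptr())  # True: no copy

# Non-contiguous input (transposed): .reshape() silently copies
copy = a.t().reshape(12)
print(copy.data_ptr() == a.data_ptr())  # False: new storage
```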

Fix 2: Make Contiguous First

# If you need .view() specifically:
x = x.transpose(1, 2).contiguous().view(batch, -1)  # Works ✓
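Note that .contiguous() copies the data into standard row-major order, after which .view() works; on an already-contiguous tensor it is a no-op that returns the tensor itself:

```python
import torch

x = torch.randn(2, 3, 4)

# Copy into row-major layout, then .view() succeeds
y = x.transpose(1, 2).contiguous()
print(y.is_contiguous())   # True
print(y.view(2, -1).shape) # torch.Size([2, 12])

# No copy is made when the tensor is already contiguous
print(x.contiguous() is x)  # True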

Why This Happens

Operations such as .t(), .transpose(), and .permute() make tensors non-contiguous: they change the tensor's strides (how elements are indexed) without moving any data in memory:

x = torch.randn(3, 4)
print(x.is_contiguous())              # True
print(x.t().is_contiguous())          # False
print(x.t().contiguous().is_contiguous())  # True
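The strides make this concrete: transposing swaps the stride values without touching the underlying storage, so the memory order no longer matches the logical row-major order.

```python
import torch

x = torch.randn(3, 4)
print(x.stride())      # (4, 1): row-major, contiguous
print(x.t().stride())  # (1, 4): swapped strides, non-contiguous
```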

view() vs reshape()

In most cases, .reshape() is the safer choice because it works on any tensor. Use .view() only when you specifically want a guarantee that no data is copied: it raises an error rather than copying silently, so an unexpected copy can never hide a performance cost or break memory sharing.
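One consequence of a silent copy worth knowing about: in-place writes to the result of .reshape() may not propagate back to the original tensor. A sketch:

```python
import torch

a = torch.zeros(3, 4)
b = a.t().reshape(12)   # non-contiguous input, so b is a copy
b[0] = 99.0
print(a[0, 0].item())   # 0.0: a is untouched, because b does not share its storage
```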
