Krull–Schmidt–Azumaya theorem

Finite-length modules decompose into finite direct sums of indecomposable modules, uniquely up to permutation and isomorphism of the summands.

Krull–Schmidt–Azumaya theorem: Let $M$ be an $R$-module of finite length (equivalently, $M$ admits a composition series). Then:

  1. $M$ decomposes as a finite direct sum of indecomposable submodules.
  2. Any two decompositions of $M$ into finite direct sums of indecomposable modules are equivalent up to permutation and isomorphism of summands: if $M \cong \bigoplus_{i=1}^n M_i \cong \bigoplus_{j=1}^m N_j$ with all $M_i, N_j$ indecomposable, then $n = m$ and, after reindexing, $M_i \cong N_i$ for all $i$.
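
As a concrete illustration (not part of the statement), consider a finite abelian group viewed as a $\mathbb{Z}$-module: $\mathbb{Z}/60\mathbb{Z}$ has finite length, and its decomposition into indecomposables is

$$\mathbb{Z}/60\mathbb{Z} \;\cong\; \mathbb{Z}/4\mathbb{Z} \,\oplus\, \mathbb{Z}/3\mathbb{Z} \,\oplus\, \mathbb{Z}/5\mathbb{Z},$$

where each summand $\mathbb{Z}/p^k\mathbb{Z}$ is indecomposable. The theorem guarantees that these prime-power cyclic summands and their multiplicities are determined up to isomorphism and reordering.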

A common sufficient hypothesis for Krull–Schmidt is that $M$ is both Noetherian and Artinian (which implies finite length). The theorem underlies the uniqueness of indecomposable decompositions and contrasts with the stronger behavior of semisimple modules, where the summands can be taken to be simple.
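
A small example of the contrast (an illustration, not part of the statement): over $\mathbb{Z}$, the module $\mathbb{Z}/4\mathbb{Z}$ is indecomposable but not simple, since

$$0 \;\subsetneq\; 2\mathbb{Z}/4\mathbb{Z} \;\subsetneq\; \mathbb{Z}/4\mathbb{Z}.$$

It is indecomposable because $2\mathbb{Z}/4\mathbb{Z}$ lies inside every nonzero submodule, so no two nonzero submodules intersect trivially; hence its Krull–Schmidt decomposition is $\mathbb{Z}/4\mathbb{Z}$ itself, not a direct sum of simple modules.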

Proof sketch (optional): Existence follows from the chain conditions: split off indecomposable direct summands inductively; finite length guarantees the process terminates. Uniqueness uses the fact that endomorphism rings of indecomposable finite-length modules are local, which enables an exchange (cancellation) argument matching summands between two decompositions.
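
To make the locality step concrete, here is the standard Fitting's lemma argument in outline (a sketch, not a full proof): for an indecomposable module $N$ of finite length and any $f \in \operatorname{End}_R(N)$, Fitting's lemma gives, for $k$ large enough,

$$N \;=\; \ker(f^k) \,\oplus\, \operatorname{im}(f^k).$$

Indecomposability forces one summand to vanish: if $\operatorname{im}(f^k) = 0$ then $f$ is nilpotent, while if $\ker(f^k) = 0$ then $f$ is injective and hence, by finite length, an automorphism. So every endomorphism is nilpotent or invertible, from which one checks that the non-units of $\operatorname{End}_R(N)$ form a two-sided ideal, i.e. $\operatorname{End}_R(N)$ is local; Azumaya's exchange argument then matches and cancels summands between any two decompositions.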