Diffstat (limited to 'compiler/rustc_pattern_analysis/src')
-rw-r--r--  compiler/rustc_pattern_analysis/src/constructor.rs   988
-rw-r--r--  compiler/rustc_pattern_analysis/src/cx.rs            856
-rw-r--r--  compiler/rustc_pattern_analysis/src/errors.rs         95
-rw-r--r--  compiler/rustc_pattern_analysis/src/lib.rs            56
-rw-r--r--  compiler/rustc_pattern_analysis/src/lints.rs         290
-rw-r--r--  compiler/rustc_pattern_analysis/src/pat.rs           205
-rw-r--r--  compiler/rustc_pattern_analysis/src/usefulness.rs   1319
7 files changed, 3809 insertions, 0 deletions
diff --git a/compiler/rustc_pattern_analysis/src/constructor.rs b/compiler/rustc_pattern_analysis/src/constructor.rs
new file mode 100644
index 00000000000..6486ad8b483
--- /dev/null
+++ b/compiler/rustc_pattern_analysis/src/constructor.rs
@@ -0,0 +1,988 @@
+//! As explained in [`crate::usefulness`], values and patterns are made from constructors applied to
+//! fields. This file defines a `Constructor` enum and various operations to manipulate them.
+//!
+//! There are two important bits of core logic in this file: constructor inclusion and constructor
+//! splitting. Constructor inclusion, i.e. whether a constructor is included in/covered by another,
+//! is straightforward and defined in [`Constructor::is_covered_by`].
+//!
+//! Constructor splitting is mentioned in [`crate::usefulness`] but not detailed. We describe it
+//! precisely here.
+//!
+//!
+//!
+//! # Constructor grouping and splitting
+//!
+//! As explained in the corresponding section in [`crate::usefulness`], to make usefulness tractable
+//! we need to group together constructors that have the same effect when they are used to
+//! specialize the matrix.
+//!
+//! Example:
+//! ```compile_fail,E0004
+//! match (0, false) {
+//!     (0 ..=100, true) => {}
+//!     (50..=150, false) => {}
+//!     (0 ..=200, _) => {}
+//! }
+//! ```
+//!
+//! In this example we can restrict specialization to 5 cases: `0..50`, `50..=100`, `101..=150`,
+//! `151..=200` and `201..`.
+//!
+//! In [`crate::usefulness`], we had said that `specialize` only takes value-only constructors. We
+//! now relax this restriction: we allow `specialize` to take constructors like `0..50` as long as
+//! we're careful to only do that with constructors that make sense. For example, `specialize(0..50,
+//! (0..=100, true))` is sensible, but `specialize(50..=200, (0..=100, true))` is not.
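+//! The former behaves just like specializing with the single value `0` would: the row is kept and
+//! reduced to `(true)`.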
+//!
+//! Constructor splitting looks at the constructors in the first column of the matrix and constructs
+//! such a sensible set of constructors. Formally, we want to find a smallest disjoint set of
+//! constructors:
+//! - Whose union covers the whole type, and
+//! - That have no non-trivial intersection with any of the constructors in the column (i.e. they're
+//!     each either disjoint from or covered by any given column constructor).
+//!
+//! We compute this in two steps: first [`crate::cx::MatchCheckCtxt::ctors_for_ty`] determines the
+//! set of all possible constructors for the type. Then [`ConstructorSet::split`] looks at the
+//! column of constructors and splits the set into groups accordingly. The precise invariants of
+//! [`ConstructorSet::split`] are described in [`SplitConstructorSet`].
+//!
+//! Constructor splitting has two interesting special cases: integer range splitting (see
+//! [`IntRange::split`]) and slice splitting (see [`Slice::split`]).
+//!
+//!
+//!
+//! # The `Missing` constructor
+//!
+//! We detail a special case of constructor splitting that is a bit subtle. Take the following:
+//!
+//! ```
+//! enum Direction { North, South, East, West }
+//! # let wind = (Direction::North, 0u8);
+//! match wind {
+//!     (Direction::North, 50..) => {}
+//!     (_, _) => {}
+//! }
+//! ```
+//!
+//! Here we expect constructor splitting to output two cases: `North`, and "everything else". This
+//! "everything else" is represented by [`Constructor::Missing`]. Unlike other constructors, it's a
+//! bit contextual: to know the exact list of constructors it represents we have to look at the
+//! column. In practice however we don't need to, because by construction it only matches rows that
+//! have wildcards. This is how this constructor is special: the only constructor that covers it is
+//! `Wildcard`.
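+//! In the example above, specializing the first column with `Missing` therefore keeps only the
+//! second row (the one that starts with a wildcard).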
+//!
+//! The only place where we care about which constructors `Missing` represents is in diagnostics
+//! (see `crate::usefulness::WitnessMatrix::apply_constructor`).
+//!
+//! We choose whether to specialize with `Missing` in
+//! `crate::usefulness::compute_exhaustiveness_and_usefulness`.
+//!
+//!
+//!
+//! ## Empty types, empty constructors, and the `exhaustive_patterns` feature
+//!
+//! An empty type is a type that has no valid value, like `!`, `enum Void {}`, or `Result<!, !>`.
+//! They require careful handling.
+//!
+//! First, for soundness reasons related to the possible existence of invalid values, by default we
+//! don't treat empty types as empty. We force them to be matched with wildcards. Except if the
+//! `exhaustive_patterns` feature is turned on, in which case we do treat them as empty. And also
+//! except if the type has no constructors (like `enum Void {}` but not like `Result<!, !>`), in
+//! which case we specifically allow `match void {}` to be exhaustive. There are additionally
+//! considerations of
+//! place validity that are handled in `crate::usefulness`. Yes this is a bit tricky.
+//!
+//! The second thing is that regardless of the above, it is always allowed to use all the
+//! constructors of a type. For example, all of the following are ok:
+//!
+//! ```rust,ignore(example)
+//! # #![feature(never_type)]
+//! # #![feature(exhaustive_patterns)]
+//! fn foo(x: Option<!>) {
+//!   match x {
+//!     None => {}
+//!     Some(_) => {}
+//!   }
+//! }
+//! fn bar(x: &[!]) -> u32 {
+//!   match x {
+//!     [] => 1,
+//!     [_] => 2,
+//!     [_, _] => 3,
+//!   }
+//! }
+//! ```
+//!
+//! Moreover, take the following:
+//!
+//! ```rust
+//! # #![feature(never_type)]
+//! # #![feature(exhaustive_patterns)]
+//! # let x = None::<!>;
+//! match x {
+//!   None => {}
+//! }
+//! ```
+//!
+//! On a normal type, we would identify `Some` as missing and tell the user. If `x: Option<!>`
+//! however (and `exhaustive_patterns` is on), it's ok to omit `Some`. When listing the constructors
+//! of a type, we must therefore track which can be omitted.
+//!
+//! Let's call "empty" a constructor that matches no valid value for the type, like `Some` for the
+//! type `Option<!>`. What this all means is that `ConstructorSet` must know which constructors are
+//! empty. The difference between empty and nonempty constructors is that empty constructors need
+//! not be present for the match to be exhaustive.
+//!
+//! A final remark: empty constructors of arity 0 break specialization, so we must avoid them. The
+//! reason is that if we specialize by them, nothing remains to witness the emptiness; the rest of
+//! the algorithm can't distinguish them from a nonempty constructor. The only known case where this
+//! could happen is the `[..]` pattern on `[!; N]` with `N > 0`, so we must take care to not emit it.
+//!
+//! This is all handled by [`crate::cx::MatchCheckCtxt::ctors_for_ty`] and
+//! [`ConstructorSet::split`]. The invariants of [`SplitConstructorSet`] are also of interest.
+//!
+//!
+//!
+//! ## Opaque patterns
+//!
+//! Some patterns, such as constants that are not allowed to be matched structurally, cannot be
+//! inspected, which we handle with `Constructor::Opaque`. Since we know nothing of these patterns,
+//! we assume they never cover each other. In order to respect the invariants of
+//! [`SplitConstructorSet`], we give each `Opaque` constructor a unique id so we can recognize it.
+
+use std::cmp::{self, max, min, Ordering};
+use std::fmt;
+use std::iter::once;
+
+use smallvec::SmallVec;
+
+use rustc_apfloat::ieee::{DoubleS, IeeeFloat, SingleS};
+use rustc_data_structures::fx::FxHashSet;
+use rustc_hir::RangeEnd;
+use rustc_index::IndexVec;
+use rustc_middle::mir::Const;
+use rustc_target::abi::VariantIdx;
+
+use self::Constructor::*;
+use self::MaybeInfiniteInt::*;
+use self::SliceKind::*;
+
+use crate::usefulness::PatCtxt;
+
+/// Whether we have seen a constructor in the column or not.
+#[derive(Debug, Clone, Copy, PartialEq, Eq, PartialOrd, Ord)]
+enum Presence {
+    Unseen,
+    Seen,
+}
+
+/// A possibly infinite integer. Values are encoded such that the ordering on `u128` matches the
+/// natural order on the original type. For example, `-128i8` is encoded as `0` and `127i8` as
+/// `255`. See `new_finite_int` for details of the encoding.
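+///
+/// For illustration, here is the sign-bias trick for 8-bit signed integers in plain Rust (a
+/// standalone sketch, not part of this crate's API):
+/// ```
+/// // Flipping the sign bit maps the signed ordering onto the unsigned ordering.
+/// let bias = 1u128 << 7;
+/// assert_eq!((i8::MIN as u8 as u128) ^ bias, 0);
+/// assert_eq!((i8::MAX as u8 as u128) ^ bias, 255);
+/// ```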
+#[derive(Debug, Clone, Copy, PartialEq, Eq, PartialOrd, Ord)]
+pub enum MaybeInfiniteInt {
+    NegInfinity,
+    /// Encoded value. DO NOT CONSTRUCT BY HAND; use `new_finite_uint`/`new_finite_int`.
+    #[non_exhaustive]
+    Finite(u128),
+    /// The integer after `u128::MAX`. We need it to represent `x..=u128::MAX` as an exclusive range.
+    JustAfterMax,
+    PosInfinity,
+}
+
+impl MaybeInfiniteInt {
+    pub fn new_finite_uint(bits: u128) -> Self {
+        Finite(bits)
+    }
+    pub fn new_finite_int(bits: u128, size: u64) -> Self {
+        // Perform a shift if the underlying types are signed, which makes the interval arithmetic
+        // type-independent.
+        let bias = 1u128 << (size - 1);
+        Finite(bits ^ bias)
+    }
+
+    pub fn as_finite_uint(self) -> Option<u128> {
+        match self {
+            Finite(bits) => Some(bits),
+            _ => None,
+        }
+    }
+    pub fn as_finite_int(self, size: u64) -> Option<u128> {
+        // We decode the shift.
+        match self {
+            Finite(bits) => {
+                let bias = 1u128 << (size - 1);
+                Some(bits ^ bias)
+            }
+            _ => None,
+        }
+    }
+
+    /// Note: this will not turn a finite value into an infinite one or vice-versa.
+    pub fn minus_one(self) -> Self {
+        match self {
+            Finite(n) => match n.checked_sub(1) {
+                Some(m) => Finite(m),
+                None => bug!(),
+            },
+            JustAfterMax => Finite(u128::MAX),
+            x => x,
+        }
+    }
+    /// Note: this will not turn a finite value into an infinite one or vice-versa.
+    pub fn plus_one(self) -> Self {
+        match self {
+            Finite(n) => match n.checked_add(1) {
+                Some(m) => Finite(m),
+                None => JustAfterMax,
+            },
+            JustAfterMax => bug!(),
+            x => x,
+        }
+    }
+}
+
+/// An exclusive interval, used for precise integer exhaustiveness checking. `IntRange`s always
+/// store a contiguous range.
+///
+/// `IntRange` is never used to encode an empty range or a "range" that wraps around the (offset)
+/// space: i.e., `range.lo < range.hi`.
+#[derive(Clone, Copy, PartialEq, Eq)]
+pub struct IntRange {
+    pub lo: MaybeInfiniteInt, // Must not be `PosInfinity`.
+    pub hi: MaybeInfiniteInt, // Must not be `NegInfinity`.
+}
+
+impl IntRange {
+    /// Best effort; will not know that e.g. `255u8..` is a singleton.
+    pub(crate) fn is_singleton(&self) -> bool {
+        // Since `lo` and `hi` can't be the same `Infinity` and `plus_one` never changes from finite
+        // to infinite, this correctly only detects ranges that contain exactly one `Finite(x)`.
+        self.lo.plus_one() == self.hi
+    }
+
+    #[inline]
+    pub fn from_singleton(x: MaybeInfiniteInt) -> IntRange {
+        IntRange { lo: x, hi: x.plus_one() }
+    }
+
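+    /// Construct a range from its bounds. An included upper bound is converted to the exclusive
+    /// encoding used by `IntRange`, e.g. `2..=5` is stored as `2..6`.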
+    #[inline]
+    pub fn from_range(lo: MaybeInfiniteInt, mut hi: MaybeInfiniteInt, end: RangeEnd) -> IntRange {
+        if end == RangeEnd::Included {
+            hi = hi.plus_one();
+        }
+        if lo >= hi {
+            // This should have been caught earlier by E0030.
+            bug!("malformed range pattern: {lo:?}..{hi:?}");
+        }
+        IntRange { lo, hi }
+    }
+
+    fn is_subrange(&self, other: &Self) -> bool {
+        other.lo <= self.lo && self.hi <= other.hi
+    }
+
+    fn intersection(&self, other: &Self) -> Option<Self> {
+        if self.lo < other.hi && other.lo < self.hi {
+            Some(IntRange { lo: max(self.lo, other.lo), hi: min(self.hi, other.hi) })
+        } else {
+            None
+        }
+    }
+
+    /// Partition a range of integers into disjoint subranges. This does constructor splitting for
+    /// integer ranges as explained at the top of the file.
+    ///
+    /// This returns an output that covers `self`. The output is split so that the only
+    /// intersections between an output range and a column range are inclusions. No output range
+    /// straddles the boundary of one of the inputs.
+    ///
+    /// Additionally, we track for each output range whether it is covered by one of the column ranges or not.
+    ///
+    /// The following input:
+    /// ```text
+    ///   (--------------------------) // `self`
+    /// (------) (----------)    (-)
+    ///     (------) (--------)
+    /// ```
+    /// is first intersected with `self`:
+    /// ```text
+    ///   (--------------------------) // `self`
+    ///   (----) (----------)    (-)
+    ///     (------) (--------)
+    /// ```
+    /// and then iterated over as follows:
+    /// ```text
+    ///   (-(--)-(-)-(------)-)--(-)-
+    /// ```
+    /// where each sequence of dashes is an output range, and dashes outside parentheses are marked
+    /// as `Presence::Unseen`.
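+    ///
+    /// As a concrete (illustrative) example: if `self` is `0..=200` and the column contains
+    /// `0..=100` and `50..=150`, the output is `0..50`, `50..=100` and `101..=150` (all `Seen`),
+    /// followed by `151..=200` (`Unseen`).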
+    ///
+    /// ## `isize`/`usize`
+    ///
+    /// Whereas a wildcard of type `i32` stands for the range `i32::MIN..=i32::MAX`, a `usize`
+    /// wildcard stands for `0..PosInfinity` and an `isize` wildcard stands for
+    /// `NegInfinity..PosInfinity`. In other words, as far as `IntRange` is concerned, there are
+    /// values before `isize::MIN` and after `usize::MAX`/`isize::MAX`.
+    /// This is to prevent e.g. `0..(u32::MAX as usize)` from being exhaustive on one architecture
+    /// and not on others. This was decided in <https://github.com/rust-lang/rfcs/pull/2591>.
+    ///
+    /// These infinities affect splitting subtly: it is possible to get `NegInfinity..0` and
+    /// `usize::MAX+1..PosInfinity` in the output. Diagnostics must be careful to handle these
+    /// fictitious ranges sensibly.
+    fn split(
+        &self,
+        column_ranges: impl Iterator<Item = IntRange>,
+    ) -> impl Iterator<Item = (Presence, IntRange)> {
+        // The boundaries of ranges in `column_ranges` intersected with `self`.
+        // We do parenthesis matching for input ranges. A boundary counts as +1 if it starts
+        // a range and -1 if it ends it. When the count is > 0 between two boundaries, we
+        // are within an input range.
+        let mut boundaries: Vec<(MaybeInfiniteInt, isize)> = column_ranges
+            .filter_map(|r| self.intersection(&r))
+            .flat_map(|r| [(r.lo, 1), (r.hi, -1)])
+            .collect();
+        // We sort by boundary, and for each boundary we sort the "closing parentheses" first. The
+        // order of +1/-1 for the same boundary value is actually irrelevant, because we only look
+        // at the accumulated count between distinct boundary values.
+        boundaries.sort_unstable();
+
+        // Accumulate parenthesis counts.
+        let mut paren_counter = 0isize;
+        // Gather pairs of adjacent boundaries.
+        let mut prev_bdy = self.lo;
+        boundaries
+            .into_iter()
+            // End with the end of the range. The count is ignored.
+            .chain(once((self.hi, 0)))
+            // List pairs of adjacent boundaries and the count between them.
+            .map(move |(bdy, delta)| {
+                // `delta` affects the count as we cross `bdy`, so the relevant count between
+                // `prev_bdy` and `bdy` is untouched by `delta`.
+                let ret = (prev_bdy, paren_counter, bdy);
+                prev_bdy = bdy;
+                paren_counter += delta;
+                ret
+            })
+            // Skip empty ranges.
+            .filter(|&(prev_bdy, _, bdy)| prev_bdy != bdy)
+            // Convert back to ranges.
+            .map(move |(prev_bdy, paren_count, bdy)| {
+                use Presence::*;
+                let presence = if paren_count > 0 { Seen } else { Unseen };
+                let range = IntRange { lo: prev_bdy, hi: bdy };
+                (presence, range)
+            })
+    }
+}
+
+/// Note: this will render signed ranges incorrectly. To render properly, convert to a pattern
+/// first.
+impl fmt::Debug for IntRange {
+    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
+        if let Finite(lo) = self.lo {
+            write!(f, "{lo}")?;
+        }
+        write!(f, "{}", RangeEnd::Excluded)?;
+        if let Finite(hi) = self.hi {
+            write!(f, "{hi}")?;
+        }
+        Ok(())
+    }
+}
+
+#[derive(Copy, Clone, Debug, PartialEq, Eq)]
+pub enum SliceKind {
+    /// Patterns of length `n` (`[x, y]`).
+    FixedLen(usize),
+    /// Patterns using the `..` notation (`[x, .., y]`).
+    /// Captures any array constructor of length `>= i + j`, where `i` and `j` are the prefix and
+    /// suffix lengths.
+    /// In the case where `array_len` is `Some(_)`,
+    /// this indicates that we only care about the first `i` and the last `j` values of the array,
+    /// and everything in between is a wildcard `_`.
+    VarLen(usize, usize),
+}
+
+impl SliceKind {
+    fn arity(self) -> usize {
+        match self {
+            FixedLen(length) => length,
+            VarLen(prefix, suffix) => prefix + suffix,
+        }
+    }
+
+    /// Whether this pattern includes patterns of length `other_len`.
+    fn covers_length(self, other_len: usize) -> bool {
+        match self {
+            FixedLen(len) => len == other_len,
+            VarLen(prefix, suffix) => prefix + suffix <= other_len,
+        }
+    }
+}
+
+/// A constructor for array and slice patterns.
+#[derive(Copy, Clone, Debug, PartialEq, Eq)]
+pub struct Slice {
+    /// `None` if the matched value is a slice, `Some(n)` if it is an array of size `n`.
+    pub(crate) array_len: Option<usize>,
+    /// The kind of pattern it is: fixed-length `[x, y]` or variable length `[x, .., y]`.
+    pub(crate) kind: SliceKind,
+}
+
+impl Slice {
+    pub fn new(array_len: Option<usize>, kind: SliceKind) -> Self {
+        let kind = match (array_len, kind) {
+            // If the middle `..` has length 0, we effectively have a fixed-length pattern.
+            (Some(len), VarLen(prefix, suffix)) if prefix + suffix == len => FixedLen(len),
+            (Some(len), VarLen(prefix, suffix)) if prefix + suffix > len => bug!(
+                "Slice pattern of length {} longer than its array length {len}",
+                prefix + suffix
+            ),
+            _ => kind,
+        };
+        Slice { array_len, kind }
+    }
+
+    pub(crate) fn arity(self) -> usize {
+        self.kind.arity()
+    }
+
+    /// See `Constructor::is_covered_by`
+    fn is_covered_by(self, other: Self) -> bool {
+        other.kind.covers_length(self.arity())
+    }
+
+    /// This computes constructor splitting for variable-length slices, as explained at the top of
+    /// the file.
+    ///
+    /// A slice pattern `[x, .., y]` behaves like the infinite or-pattern `[x, y] | [x, _, y] | [x,
+    /// _, _, y] | etc`. The corresponding value constructors are fixed-length array constructors of
+    /// corresponding lengths. We obviously can't list this infinitude of constructors.
+    /// Thankfully, it turns out that for each finite set of slice patterns, all sufficiently large
+    /// array lengths are equivalent.
+    ///
+    /// Let's look at an example, where we are trying to split the last pattern:
+    /// ```
+    /// # fn foo(x: &[bool]) {
+    /// match x {
+    ///     [true, true, ..] => {}
+    ///     [.., false, false] => {}
+    ///     [..] => {}
+    /// }
+    /// # }
+    /// ```
+    /// Here are the results of specialization for the first few lengths:
+    /// ```
+    /// # fn foo(x: &[bool]) { match x {
+    /// // length 0
+    /// [] => {}
+    /// // length 1
+    /// [_] => {}
+    /// // length 2
+    /// [true, true] => {}
+    /// [false, false] => {}
+    /// [_, _] => {}
+    /// // length 3
+    /// [true, true,  _    ] => {}
+    /// [_,    false, false] => {}
+    /// [_,    _,     _    ] => {}
+    /// // length 4
+    /// [true, true, _,     _    ] => {}
+    /// [_,    _,    false, false] => {}
+    /// [_,    _,    _,     _    ] => {}
+    /// // length 5
+    /// [true, true, _, _,     _    ] => {}
+    /// [_,    _,    _, false, false] => {}
+    /// [_,    _,    _, _,     _    ] => {}
+    /// # _ => {}
+    /// # }}
+    /// ```
+    ///
+    /// We see that above length 4, we are simply inserting columns full of wildcards in the middle.
+    /// This means that specialization and witness computation with slices of length `l >= 4` will
+    /// give equivalent results regardless of `l`. This applies to any set of slice patterns: there
+    /// will be a length `L` above which all lengths behave the same. This is exactly what we need
+    /// for constructor splitting.
+    ///
+    /// A variable-length slice pattern covers all lengths from its arity up to infinity. As we just
+    /// saw, we can split this in two: lengths below `L` are treated individually with a
+    /// fixed-length slice each; lengths above `L` are grouped into a single variable-length slice
+    /// constructor.
+    ///
+    /// For each variable-length slice pattern `p` with a prefix of length `plₚ` and suffix of
+    /// length `slₚ`, only the first `plₚ` and the last `slₚ` elements are examined. Therefore, as
+    /// long as `L` is positive (to avoid concerns about empty types), all elements after the
+    /// maximum prefix length and before the maximum suffix length are not examined by any
+    /// variable-length pattern, and therefore can be ignored. This gives us a way to compute `L`.
+    ///
+    /// Additionally, if fixed-length patterns exist, we must pick an `L` large enough to miss them,
+    /// so we can pick `L = max(max(FIXED_LEN)+1, max(PREFIX_LEN) + max(SUFFIX_LEN))`.
+    /// `max_slice` below will be made to have this arity `L`.
+    ///
+    /// If `self` is fixed-length, it is returned as-is.
+    ///
+    /// Additionally, we track for each output slice whether it is covered by one of the column slices or not.
+    fn split(
+        self,
+        column_slices: impl Iterator<Item = Slice>,
+    ) -> impl Iterator<Item = (Presence, Slice)> {
+        // Range of lengths below `L`.
+        let smaller_lengths;
+        let arity = self.arity();
+        let mut max_slice = self.kind;
+        // Tracks the smallest variable-length slice we've seen. Any slice arity above it is
+        // therefore `Presence::Seen` in the column.
+        let mut min_var_len = usize::MAX;
+        // Tracks the fixed-length slices we've seen, to mark them as `Presence::Seen`.
+        let mut seen_fixed_lens = FxHashSet::default();
+        match &mut max_slice {
+            VarLen(max_prefix_len, max_suffix_len) => {
+                // A length larger than any fixed-length slice encountered.
+                // We start at 1 in case the subtype is empty because in that case the zero-length
+                // slice must be treated separately from the rest.
+                let mut fixed_len_upper_bound = 1;
+                // We grow `max_slice` to be larger than all slices encountered, as described above.
+                // `L` is `max_slice.arity()`. For diagnostics, we keep the prefix and suffix
+                // lengths separate.
+                for slice in column_slices {
+                    match slice.kind {
+                        FixedLen(len) => {
+                            fixed_len_upper_bound = cmp::max(fixed_len_upper_bound, len + 1);
+                            seen_fixed_lens.insert(len);
+                        }
+                        VarLen(prefix, suffix) => {
+                            *max_prefix_len = cmp::max(*max_prefix_len, prefix);
+                            *max_suffix_len = cmp::max(*max_suffix_len, suffix);
+                            min_var_len = cmp::min(min_var_len, prefix + suffix);
+                        }
+                    }
+                }
+                // If `fixed_len_upper_bound >= L`, we set `L` to `fixed_len_upper_bound`.
+                if let Some(delta) =
+                    fixed_len_upper_bound.checked_sub(*max_prefix_len + *max_suffix_len)
+                {
+                    *max_prefix_len += delta
+                }
+
+                // We cap the arity of `max_slice` at the array size.
+                match self.array_len {
+                    Some(len) if max_slice.arity() >= len => max_slice = FixedLen(len),
+                    _ => {}
+                }
+
+                smaller_lengths = match self.array_len {
+                    // The only admissible fixed-length slice is one of the array size. Whether `max_slice`
+                    // is fixed-length or variable-length, it will be the only relevant slice to output
+                    // here.
+                    Some(_) => 0..0, // empty range
+                    // We need to cover all arities in the range `(arity..infinity)`. We split that
+                    // range into two: lengths smaller than `max_slice.arity()` are treated
+                    // independently as fixed-length slices, and lengths above are captured by
+                    // `max_slice`.
+                    None => self.arity()..max_slice.arity(),
+                };
+            }
+            FixedLen(_) => {
+                // No need to split here. We only track presence.
+                for slice in column_slices {
+                    match slice.kind {
+                        FixedLen(len) => {
+                            if len == arity {
+                                seen_fixed_lens.insert(len);
+                            }
+                        }
+                        VarLen(prefix, suffix) => {
+                            min_var_len = cmp::min(min_var_len, prefix + suffix);
+                        }
+                    }
+                }
+                smaller_lengths = 0..0;
+            }
+        };
+
+        smaller_lengths.map(FixedLen).chain(once(max_slice)).map(move |kind| {
+            let arity = kind.arity();
+            let seen = if min_var_len <= arity || seen_fixed_lens.contains(&arity) {
+                Presence::Seen
+            } else {
+                Presence::Unseen
+            };
+            (seen, Slice::new(self.array_len, kind))
+        })
+    }
+}
+
+/// A globally unique id to distinguish `Opaque` patterns.
+#[derive(Clone, Debug, PartialEq, Eq)]
+pub struct OpaqueId(u32);
+
+impl OpaqueId {
+    pub fn new() -> Self {
+        use std::sync::atomic::{AtomicU32, Ordering};
+        static OPAQUE_ID: AtomicU32 = AtomicU32::new(0);
+        OpaqueId(OPAQUE_ID.fetch_add(1, Ordering::SeqCst))
+    }
+}
+
+/// A value can be decomposed into a constructor applied to some fields. This struct represents
+/// the constructor. See also `Fields`.
+///
+/// `pat_constructor` retrieves the constructor corresponding to a pattern.
+/// `specialize_constructor` returns the list of fields corresponding to a pattern, given a
+/// constructor. `Constructor::apply` reconstructs the pattern from a pair of `Constructor` and
+/// `Fields`.
+#[derive(Clone, Debug, PartialEq)]
+pub enum Constructor<'tcx> {
+    /// The constructor for patterns that have a single constructor, like tuples, struct patterns,
+    /// and references. Fixed-length arrays are treated separately with `Slice`.
+    Single,
+    /// Enum variants.
+    Variant(VariantIdx),
+    /// Booleans
+    Bool(bool),
+    /// Ranges of integer literal values (`2`, `2..=5` or `2..5`).
+    IntRange(IntRange),
+    /// Ranges of floating-point literal values (`2.0..=5.2`).
+    F32Range(IeeeFloat<SingleS>, IeeeFloat<SingleS>, RangeEnd),
+    F64Range(IeeeFloat<DoubleS>, IeeeFloat<DoubleS>, RangeEnd),
+    /// String literals. Strings are not quite the same as `&[u8]` so we treat them separately.
+    Str(Const<'tcx>),
+    /// Array and slice patterns.
+    Slice(Slice),
+    /// Constants that must not be matched structurally. They are treated as black boxes for the
+    /// purposes of exhaustiveness: we must not inspect them, and they don't count towards making a
+    /// match exhaustive.
+    /// Carries an id that must be unique within a match. We need this to ensure the invariants of
+    /// [`SplitConstructorSet`].
+    Opaque(OpaqueId),
+    /// Or-pattern.
+    Or,
+    /// Wildcard pattern.
+    Wildcard,
+    /// Fake extra constructor for enums that aren't allowed to be matched exhaustively. Also used
+    /// for those types for which we cannot list constructors explicitly, like `f64` and `str`.
+    NonExhaustive,
+    /// Fake extra constructor for variants that should not be mentioned in diagnostics.
+    /// We use this for variants behind an unstable gate as well as
+    /// `#[doc(hidden)]` ones.
+    Hidden,
+    /// Fake extra constructor for constructors that are not seen in the matrix, as explained at the
+    /// top of the file.
+    Missing,
+}
+
+impl<'tcx> Constructor<'tcx> {
+    pub(crate) fn is_non_exhaustive(&self) -> bool {
+        matches!(self, NonExhaustive)
+    }
+
+    pub(crate) fn as_variant(&self) -> Option<VariantIdx> {
+        match self {
+            Variant(i) => Some(*i),
+            _ => None,
+        }
+    }
+    fn as_bool(&self) -> Option<bool> {
+        match self {
+            Bool(b) => Some(*b),
+            _ => None,
+        }
+    }
+    pub(crate) fn as_int_range(&self) -> Option<&IntRange> {
+        match self {
+            IntRange(range) => Some(range),
+            _ => None,
+        }
+    }
+    fn as_slice(&self) -> Option<Slice> {
+        match self {
+            Slice(slice) => Some(*slice),
+            _ => None,
+        }
+    }
+
+    /// The number of fields for this constructor. This must be kept in sync with
+    /// `MatchCheckCtxt::ctor_wildcard_fields`.
+    pub(crate) fn arity(&self, pcx: &PatCtxt<'_, '_, 'tcx>) -> usize {
+        pcx.cx.ctor_arity(self, pcx.ty)
+    }
+
+    /// Returns whether `self` is covered by `other`, i.e. whether `self` is a subset of `other`.
+    /// For the simple cases, this is simply checking for equality. For the "grouped" constructors,
+    /// this checks for inclusion.
+    // We inline because this has a single call site in `Matrix::specialize_constructor`.
+    #[inline]
+    pub(crate) fn is_covered_by<'p>(&self, pcx: &PatCtxt<'_, 'p, 'tcx>, other: &Self) -> bool {
+        match (self, other) {
+            (Wildcard, _) => {
+                span_bug!(
+                    pcx.cx.scrut_span,
+                    "Constructor splitting should not have returned `Wildcard`"
+                )
+            }
+            // Wildcards cover anything
+            (_, Wildcard) => true,
+            // Only a wildcard pattern can match these special constructors.
+            (Missing { .. } | NonExhaustive | Hidden, _) => false,
+
+            (Single, Single) => true,
+            (Variant(self_id), Variant(other_id)) => self_id == other_id,
+            (Bool(self_b), Bool(other_b)) => self_b == other_b,
+
+            (IntRange(self_range), IntRange(other_range)) => self_range.is_subrange(other_range),
+            (F32Range(self_from, self_to, self_end), F32Range(other_from, other_to, other_end)) => {
+                self_from.ge(other_from)
+                    && match self_to.partial_cmp(other_to) {
+                        Some(Ordering::Less) => true,
+                        Some(Ordering::Equal) => other_end == self_end,
+                        _ => false,
+                    }
+            }
+            (F64Range(self_from, self_to, self_end), F64Range(other_from, other_to, other_end)) => {
+                self_from.ge(other_from)
+                    && match self_to.partial_cmp(other_to) {
+                        Some(Ordering::Less) => true,
+                        Some(Ordering::Equal) => other_end == self_end,
+                        _ => false,
+                    }
+            }
+            (Str(self_val), Str(other_val)) => {
+                // FIXME Once valtrees are available we can directly use the bytes
+                // in the `Str` variant of the valtree for the comparison here.
+                self_val == other_val
+            }
+            (Slice(self_slice), Slice(other_slice)) => self_slice.is_covered_by(*other_slice),
+
+            // Opaque constructors don't interact with anything unless they come from the
+            // syntactically identical pattern.
+            (Opaque(self_id), Opaque(other_id)) => self_id == other_id,
+            (Opaque(..), _) | (_, Opaque(..)) => false,
+
+            _ => span_bug!(
+                pcx.cx.scrut_span,
+                "trying to compare incompatible constructors {:?} and {:?}",
+                self,
+                other
+            ),
+        }
+    }
+}
+
+#[derive(Debug, Clone, Copy)]
+pub enum VariantVisibility {
+    /// Variant that doesn't fit the other cases, i.e. most variants.
+    Visible,
+    /// Variant behind an unstable gate or with the `#[doc(hidden)]` attribute. It will not be
+    /// mentioned in diagnostics unless the user mentioned it first.
+    Hidden,
+    /// Variant that matches no value. E.g. `Some::<Option<!>>` if the `exhaustive_patterns` feature
+    /// is enabled. Like `Hidden`, it will not be mentioned in diagnostics unless the user mentioned
+    /// it first.
+    Empty,
+}
+
+/// Describes the set of all constructors for a type. For details, in particular about the emptiness
+/// of constructors, see the top of the file.
+///
+/// In terms of division of responsibility, [`ConstructorSet::split`] handles all of the
+/// `exhaustive_patterns` feature.
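+///
+/// For example, `bool` gets `Bool`, `Option<T>` gets `Variants`, `&T` gets `Single`, `u32` gets
+/// `Integers`, `[T; 3]` gets `Slice`, and `f64` gets `Unlistable`.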
+#[derive(Debug)]
+pub enum ConstructorSet {
+    /// The type has a single constructor, e.g. `&T` or a struct. `empty` tracks whether the
+    /// constructor is empty.
+    Single { empty: bool },
+    /// This type has the following list of constructors. If `variants` is empty and
+    /// `non_exhaustive` is false, don't use this; use `NoConstructors` instead.
+    Variants { variants: IndexVec<VariantIdx, VariantVisibility>, non_exhaustive: bool },
+    /// Booleans.
+    Bool,
+    /// The type is spanned by integer values. The range or ranges give the set of allowed values.
+    /// The second range is only useful for `char`.
+    Integers { range_1: IntRange, range_2: Option<IntRange> },
+    /// The type is matched by slices. `array_len` is the compile-time length of the array, if
+    /// known. If `subtype_is_empty`, all constructors are empty except possibly the zero-length
+    /// slice `[]`.
+    Slice { array_len: Option<usize>, subtype_is_empty: bool },
+    /// The constructors cannot be listed, and the type cannot be matched exhaustively. E.g. `str`,
+    /// floats.
+    Unlistable,
+    /// The type has no constructors (not even empty ones). This is `!` and empty enums.
+    NoConstructors,
+}
+
+/// Describes the result of analyzing the constructors in a column of a match.
+///
+/// `present` is morally the set of constructors present in the column, and `missing` is the set of
+/// constructors that exist in the type but are not present in the column.
+///
+/// More formally, if we discard wildcards from the column, this respects the following constraints:
+/// 1. the union of `present`, `missing` and `missing_empty` covers all the constructors of the type
+/// 2. each constructor in `present` is covered by something in the column
+/// 3. no constructor in `missing` or `missing_empty` is covered by anything in the column
+/// 4. each constructor in the column is equal to the union of one or more constructors in `present`
+/// 5. `missing` does not contain empty constructors (see discussion about emptiness at the top of
+///    the file);
+/// 6. `missing_empty` contains only empty constructors
+/// 7. constructors in `present`, `missing` and `missing_empty` are split for the column; in other
+///    words, they are either fully included in or fully disjoint from each constructor in the
+///    column. In yet other words, there are no non-trivial intersections like between `0..10` and
+///    `5..15`.
+///
+/// We must be particularly careful with weird constructors like `Opaque`: they're not formally part
+/// of the `ConstructorSet` for the type, yet if we forgot to include them in `present` we would be
+/// ignoring any row with `Opaque`s in the algorithm. Hence the importance of point 4.
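+///
+/// As a small illustration, consider the first (and only) column of:
+/// ```
+/// match 0u8 {
+///     0..=100 => {}
+///     _ => {}
+/// }
+/// ```
+/// After discarding the wildcard, splitting the `u8` constructor set against this column yields
+/// `present = [0..=100]`, `missing = [101..=255]` and `missing_empty = []`.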
+#[derive(Debug)]
+pub(crate) struct SplitConstructorSet<'tcx> {
+    pub(crate) present: SmallVec<[Constructor<'tcx>; 1]>,
+    pub(crate) missing: Vec<Constructor<'tcx>>,
+    pub(crate) missing_empty: Vec<Constructor<'tcx>>,
+}
+
+impl ConstructorSet {
+    /// This analyzes a column of constructors to 1/ determine which constructors of the type (if
+    /// any) are missing; 2/ split constructors to handle non-trivial intersections e.g. on ranges
+    /// or slices. This can get subtle; see [`SplitConstructorSet`] for details of this operation
+    /// and its invariants.
+    #[instrument(level = "debug", skip(self, pcx, ctors), ret)]
+    pub(crate) fn split<'a, 'tcx>(
+        &self,
+        pcx: &PatCtxt<'_, '_, 'tcx>,
+        ctors: impl Iterator<Item = &'a Constructor<'tcx>> + Clone,
+    ) -> SplitConstructorSet<'tcx>
+    where
+        'tcx: 'a,
+    {
+        let mut present: SmallVec<[_; 1]> = SmallVec::new();
+        // Empty constructors found missing.
+        let mut missing_empty = Vec::new();
+        // Nonempty constructors found missing.
+        let mut missing = Vec::new();
+        // Constructors in `ctors`, except wildcards and opaques.
+        let mut seen = Vec::new();
+        for ctor in ctors.cloned() {
+            match ctor {
+                Opaque(..) => present.push(ctor),
+                Wildcard => {} // discard wildcards
+                _ => seen.push(ctor),
+            }
+        }
+
+        match self {
+            ConstructorSet::Single { empty } => {
+                if !seen.is_empty() {
+                    present.push(Single);
+                } else if *empty {
+                    missing_empty.push(Single);
+                } else {
+                    missing.push(Single);
+                }
+            }
+            ConstructorSet::Variants { variants, non_exhaustive } => {
+                let seen_set: FxHashSet<_> = seen.iter().map(|c| c.as_variant().unwrap()).collect();
+                let mut skipped_a_hidden_variant = false;
+
+                for (idx, visibility) in variants.iter_enumerated() {
+                    let ctor = Variant(idx);
+                    if seen_set.contains(&idx) {
+                        present.push(ctor);
+                    } else {
+                        // We only put visible variants directly into `missing`.
+                        match visibility {
+                            VariantVisibility::Visible => missing.push(ctor),
+                            VariantVisibility::Hidden => skipped_a_hidden_variant = true,
+                            VariantVisibility::Empty => missing_empty.push(ctor),
+                        }
+                    }
+                }
+
+                if skipped_a_hidden_variant {
+                    missing.push(Hidden);
+                }
+                if *non_exhaustive {
+                    missing.push(NonExhaustive);
+                }
+            }
+            ConstructorSet::Bool => {
+                let mut seen_false = false;
+                let mut seen_true = false;
+                for b in seen.iter().map(|ctor| ctor.as_bool().unwrap()) {
+                    if b {
+                        seen_true = true;
+                    } else {
+                        seen_false = true;
+                    }
+                }
+                if seen_false {
+                    present.push(Bool(false));
+                } else {
+                    missing.push(Bool(false));
+                }
+                if seen_true {
+                    present.push(Bool(true));
+                } else {
+                    missing.push(Bool(true));
+                }
+            }
+            ConstructorSet::Integers { range_1, range_2 } => {
+                let seen_ranges: Vec<_> =
+                    seen.iter().map(|ctor| *ctor.as_int_range().unwrap()).collect();
+                for (seen, splitted_range) in range_1.split(seen_ranges.iter().cloned()) {
+                    match seen {
+                        Presence::Unseen => missing.push(IntRange(splitted_range)),
+                        Presence::Seen => present.push(IntRange(splitted_range)),
+                    }
+                }
+                if let Some(range_2) = range_2 {
+                    for (seen, splitted_range) in range_2.split(seen_ranges.into_iter()) {
+                        match seen {
+                            Presence::Unseen => missing.push(IntRange(splitted_range)),
+                            Presence::Seen => present.push(IntRange(splitted_range)),
+                        }
+                    }
+                }
+            }
+            ConstructorSet::Slice { array_len, subtype_is_empty } => {
+                let seen_slices = seen.iter().map(|c| c.as_slice().unwrap());
+                let base_slice = Slice::new(*array_len, VarLen(0, 0));
+                for (seen, splitted_slice) in base_slice.split(seen_slices) {
+                    let ctor = Slice(splitted_slice);
+                    match seen {
+                        Presence::Seen => present.push(ctor),
+                        Presence::Unseen => {
+                            if *subtype_is_empty && splitted_slice.arity() != 0 {
+                                // We have subpatterns of an empty type, so the constructor is
+                                // empty.
+                                missing_empty.push(ctor);
+                            } else {
+                                missing.push(ctor);
+                            }
+                        }
+                    }
+                }
+            }
+            ConstructorSet::Unlistable => {
+                // Since we can't list constructors, we take the ones in the column. This might list
+                // some constructors several times but there's not much we can do.
+                present.extend(seen);
+                missing.push(NonExhaustive);
+            }
+            ConstructorSet::NoConstructors => {
+                // In a `MaybeInvalid` place even an empty pattern may be reachable. We therefore
+                // add a dummy empty constructor here, which will be ignored if the place is
+                // `ValidOnly`.
+                missing_empty.push(NonExhaustive);
+            }
+        }
+
+        // We have now grouped all the constructors into 3 buckets: present, missing, missing_empty.
+        // In the absence of the `exhaustive_patterns` feature however, we don't count nested empty
+        // types as empty. Only non-nested `!` or `enum Foo {}` are considered empty.
+        if !pcx.cx.tcx.features().exhaustive_patterns
+            && !(pcx.is_top_level && matches!(self, Self::NoConstructors))
+        {
+            // Treat all missing constructors as nonempty.
+            // This clears `missing_empty`.
+            missing.append(&mut missing_empty);
+        }
+
+        SplitConstructorSet { present, missing, missing_empty }
+    }
+}
diff --git a/compiler/rustc_pattern_analysis/src/cx.rs b/compiler/rustc_pattern_analysis/src/cx.rs
new file mode 100644
index 00000000000..8a4f39a1f4a
--- /dev/null
+++ b/compiler/rustc_pattern_analysis/src/cx.rs
@@ -0,0 +1,856 @@
+use std::fmt;
+use std::iter::once;
+
+use rustc_arena::TypedArena;
+use rustc_data_structures::captures::Captures;
+use rustc_hir::def_id::DefId;
+use rustc_hir::{HirId, RangeEnd};
+use rustc_index::Idx;
+use rustc_index::IndexVec;
+use rustc_middle::middle::stability::EvalResult;
+use rustc_middle::mir;
+use rustc_middle::mir::interpret::Scalar;
+use rustc_middle::thir::{FieldPat, Pat, PatKind, PatRange, PatRangeBoundary};
+use rustc_middle::ty::layout::IntegerExt;
+use rustc_middle::ty::{self, Ty, TyCtxt, VariantDef};
+use rustc_span::{Span, DUMMY_SP};
+use rustc_target::abi::{FieldIdx, Integer, VariantIdx, FIRST_VARIANT};
+use smallvec::SmallVec;
+
+use crate::constructor::{
+    Constructor, ConstructorSet, IntRange, MaybeInfiniteInt, OpaqueId, Slice, SliceKind,
+    VariantVisibility,
+};
+use crate::pat::{DeconstructedPat, WitnessPat};
+
+use Constructor::*;
+
+pub struct MatchCheckCtxt<'p, 'tcx> {
+    pub tcx: TyCtxt<'tcx>,
+    /// The module in which the match occurs. This is necessary for
+    /// checking inhabited-ness of types because whether a type is (visibly)
+    /// inhabited can depend on whether it was defined in the current module or
+    /// not. E.g., `struct Foo { _private: ! }` cannot be seen to be empty
+    /// outside its module and should not be matchable with an empty match statement.
+    pub module: DefId,
+    pub param_env: ty::ParamEnv<'tcx>,
+    pub pattern_arena: &'p TypedArena<DeconstructedPat<'p, 'tcx>>,
+    /// Lint level at the match.
+    pub match_lint_level: HirId,
+    /// The span of the whole match, if applicable.
+    pub whole_match_span: Option<Span>,
+    /// Span of the scrutinee.
+    pub scrut_span: Span,
+    /// Only produce `NON_EXHAUSTIVE_OMITTED_PATTERNS` lint on refutable patterns.
+    pub refutable: bool,
+    /// Whether the data at the scrutinee is known to be valid. This is false if the scrutinee comes
+    /// from a union field, a pointer deref, or a reference deref (pending opsem decisions).
+    pub known_valid_scrutinee: bool,
+}
+
+impl<'p, 'tcx> MatchCheckCtxt<'p, 'tcx> {
+    pub(super) fn is_uninhabited(&self, ty: Ty<'tcx>) -> bool {
+        !ty.is_inhabited_from(self.tcx, self.module, self.param_env)
+    }
+
+    /// Returns whether the given type is an enum from another crate declared `#[non_exhaustive]`.
+    pub fn is_foreign_non_exhaustive_enum(&self, ty: Ty<'tcx>) -> bool {
+        match ty.kind() {
+            ty::Adt(def, ..) => {
+                def.is_enum() && def.is_variant_list_non_exhaustive() && !def.did().is_local()
+            }
+            _ => false,
+        }
+    }
+
+    pub(crate) fn alloc_wildcard_slice(
+        &self,
+        tys: impl IntoIterator<Item = Ty<'tcx>>,
+    ) -> &'p [DeconstructedPat<'p, 'tcx>] {
+        self.pattern_arena
+            .alloc_from_iter(tys.into_iter().map(|ty| DeconstructedPat::wildcard(ty, DUMMY_SP)))
+    }
+
+    // In the cases of either a `#[non_exhaustive]` field list or a non-public field, we hide
+    // uninhabited fields in order not to reveal the uninhabitedness of the whole variant.
+    // This lists the fields we keep along with their types.
+    pub(crate) fn list_variant_nonhidden_fields<'a>(
+        &'a self,
+        ty: Ty<'tcx>,
+        variant: &'a VariantDef,
+    ) -> impl Iterator<Item = (FieldIdx, Ty<'tcx>)> + Captures<'p> + Captures<'a> {
+        let cx = self;
+        let ty::Adt(adt, args) = ty.kind() else { bug!() };
+        // Whether we must not match the fields of this variant exhaustively.
+        let is_non_exhaustive = variant.is_field_list_non_exhaustive() && !adt.did().is_local();
+
+        variant.fields.iter().enumerate().filter_map(move |(i, field)| {
+            let ty = field.ty(cx.tcx, args);
+            // `field.ty()` doesn't normalize after substituting.
+            let ty = cx.tcx.normalize_erasing_regions(cx.param_env, ty);
+            let is_visible = adt.is_enum() || field.vis.is_accessible_from(cx.module, cx.tcx);
+            let is_uninhabited = cx.tcx.features().exhaustive_patterns && cx.is_uninhabited(ty);
+
+            if is_uninhabited && (!is_visible || is_non_exhaustive) {
+                None
+            } else {
+                Some((FieldIdx::new(i), ty))
+            }
+        })
+    }
+
+    pub(crate) fn variant_index_for_adt(
+        ctor: &Constructor<'tcx>,
+        adt: ty::AdtDef<'tcx>,
+    ) -> VariantIdx {
+        match *ctor {
+            Variant(idx) => idx,
+            Single => {
+                assert!(!adt.is_enum());
+                FIRST_VARIANT
+            }
+            _ => bug!("bad constructor {:?} for adt {:?}", ctor, adt),
+        }
+    }
+
+    /// Creates a new list of wildcard fields for a given constructor. The result must have a length
+    /// of `ctor.arity()`.
+    #[instrument(level = "trace", skip(self))]
+    pub(crate) fn ctor_wildcard_fields(
+        &self,
+        ctor: &Constructor<'tcx>,
+        ty: Ty<'tcx>,
+    ) -> &'p [DeconstructedPat<'p, 'tcx>] {
+        let cx = self;
+        match ctor {
+            Single | Variant(_) => match ty.kind() {
+                ty::Tuple(fs) => cx.alloc_wildcard_slice(fs.iter()),
+                ty::Ref(_, rty, _) => cx.alloc_wildcard_slice(once(*rty)),
+                ty::Adt(adt, args) => {
+                    if adt.is_box() {
+                        // The only legal patterns of type `Box` (outside `std`) are `_` and box
+                        // patterns. If we're here we can assume this is a box pattern.
+                        cx.alloc_wildcard_slice(once(args.type_at(0)))
+                    } else {
+                        let variant =
+                            &adt.variant(MatchCheckCtxt::variant_index_for_adt(&ctor, *adt));
+                        let tys = cx.list_variant_nonhidden_fields(ty, variant).map(|(_, ty)| ty);
+                        cx.alloc_wildcard_slice(tys)
+                    }
+                }
+                _ => bug!("Unexpected type for `Single` constructor: {:?}", ty),
+            },
+            Slice(slice) => match *ty.kind() {
+                ty::Slice(ty) | ty::Array(ty, _) => {
+                    let arity = slice.arity();
+                    cx.alloc_wildcard_slice((0..arity).map(|_| ty))
+                }
+                _ => bug!("bad slice pattern {:?} {:?}", ctor, ty),
+            },
+            Bool(..)
+            | IntRange(..)
+            | F32Range(..)
+            | F64Range(..)
+            | Str(..)
+            | Opaque(..)
+            | NonExhaustive
+            | Hidden
+            | Missing { .. }
+            | Wildcard => &[],
+            Or => {
+                bug!("called `ctor_wildcard_fields` on an `Or` ctor")
+            }
+        }
+    }
+
+    /// The number of fields for this constructor. This must be kept in sync with
+    /// `ctor_wildcard_fields`.
+    pub(crate) fn ctor_arity(&self, ctor: &Constructor<'tcx>, ty: Ty<'tcx>) -> usize {
+        match ctor {
+            Single | Variant(_) => match ty.kind() {
+                ty::Tuple(fs) => fs.len(),
+                ty::Ref(..) => 1,
+                ty::Adt(adt, ..) => {
+                    if adt.is_box() {
+                        // The only legal patterns of type `Box` (outside `std`) are `_` and box
+                        // patterns. If we're here we can assume this is a box pattern.
+                        1
+                    } else {
+                        let variant =
+                            &adt.variant(MatchCheckCtxt::variant_index_for_adt(&ctor, *adt));
+                        self.list_variant_nonhidden_fields(ty, variant).count()
+                    }
+                }
+                _ => bug!("Unexpected type for `Single` constructor: {:?}", ty),
+            },
+            Slice(slice) => slice.arity(),
+            Bool(..)
+            | IntRange(..)
+            | F32Range(..)
+            | F64Range(..)
+            | Str(..)
+            | Opaque(..)
+            | NonExhaustive
+            | Hidden
+            | Missing { .. }
+            | Wildcard => 0,
+            Or => bug!("The `Or` constructor doesn't have a fixed arity"),
+        }
+    }
+
+    /// Creates a set that represents all the constructors of `ty`.
+    ///
+    /// See [`crate::constructor`] for considerations of emptiness.
+    #[instrument(level = "debug", skip(self), ret)]
+    pub fn ctors_for_ty(&self, ty: Ty<'tcx>) -> ConstructorSet {
+        let cx = self;
+        let make_uint_range = |start, end| {
+            IntRange::from_range(
+                MaybeInfiniteInt::new_finite_uint(start),
+                MaybeInfiniteInt::new_finite_uint(end),
+                RangeEnd::Included,
+            )
+        };
+        // This determines the set of all possible constructors for the type `ty`. For numbers,
+        // arrays and slices we use ranges and variable-length slices when appropriate.
+        match ty.kind() {
+            ty::Bool => ConstructorSet::Bool,
+            ty::Char => {
+                // The valid Unicode Scalar Value ranges.
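+                // (The two ranges exclude the surrogate code points `0xD800..=0xDFFF`.)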
+                ConstructorSet::Integers {
+                    range_1: make_uint_range('\u{0000}' as u128, '\u{D7FF}' as u128),
+                    range_2: Some(make_uint_range('\u{E000}' as u128, '\u{10FFFF}' as u128)),
+                }
+            }
+            &ty::Int(ity) => {
+                let range = if ty.is_ptr_sized_integral() {
+                    // The min/max values of `isize` are not allowed to be observed.
+                    IntRange {
+                        lo: MaybeInfiniteInt::NegInfinity,
+                        hi: MaybeInfiniteInt::PosInfinity,
+                    }
+                } else {
+                    let size = Integer::from_int_ty(&cx.tcx, ity).size().bits();
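+                    // Raw bit representations of `iN::MIN` and `iN::MAX` (e.g. `0x80` and `0x7f`
+                    // for `i8`); `new_finite_int` applies the sign bias to them.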
+                    let min = 1u128 << (size - 1);
+                    let max = min - 1;
+                    let min = MaybeInfiniteInt::new_finite_int(min, size);
+                    let max = MaybeInfiniteInt::new_finite_int(max, size);
+                    IntRange::from_range(min, max, RangeEnd::Included)
+                };
+                ConstructorSet::Integers { range_1: range, range_2: None }
+            }
+            &ty::Uint(uty) => {
+                let range = if ty.is_ptr_sized_integral() {
+                    // The max value of `usize` is not allowed to be observed.
+                    let lo = MaybeInfiniteInt::new_finite_uint(0);
+                    IntRange { lo, hi: MaybeInfiniteInt::PosInfinity }
+                } else {
+                    let size = Integer::from_uint_ty(&cx.tcx, uty).size();
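+                    // `truncate` keeps the low `size` bits, i.e. this is `uN::MAX`.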
+                    let max = size.truncate(u128::MAX);
+                    make_uint_range(0, max)
+                };
+                ConstructorSet::Integers { range_1: range, range_2: None }
+            }
+            ty::Slice(sub_ty) => ConstructorSet::Slice {
+                array_len: None,
+                subtype_is_empty: cx.is_uninhabited(*sub_ty),
+            },
+            ty::Array(sub_ty, len) => {
+                // We treat arrays of a constant but unknown length like slices.
+                ConstructorSet::Slice {
+                    array_len: len.try_eval_target_usize(cx.tcx, cx.param_env).map(|l| l as usize),
+                    subtype_is_empty: cx.is_uninhabited(*sub_ty),
+                }
+            }
+            ty::Adt(def, args) if def.is_enum() => {
+                let is_declared_nonexhaustive = cx.is_foreign_non_exhaustive_enum(ty);
+                if def.variants().is_empty() && !is_declared_nonexhaustive {
+                    ConstructorSet::NoConstructors
+                } else {
+                    let mut variants =
+                        IndexVec::from_elem(VariantVisibility::Visible, def.variants());
+                    for (idx, v) in def.variants().iter_enumerated() {
+                        let variant_def_id = def.variant(idx).def_id;
+                        // Visibly uninhabited variants.
+                        let is_inhabited = v
+                            .inhabited_predicate(cx.tcx, *def)
+                            .instantiate(cx.tcx, args)
+                            .apply(cx.tcx, cx.param_env, cx.module);
+                        // Variants that depend on a disabled unstable feature.
+                        let is_unstable = matches!(
+                            cx.tcx.eval_stability(variant_def_id, None, DUMMY_SP, None),
+                            EvalResult::Deny { .. }
+                        );
+                        // Foreign `#[doc(hidden)]` variants.
+                        let is_doc_hidden =
+                            cx.tcx.is_doc_hidden(variant_def_id) && !variant_def_id.is_local();
+                        let visibility = if !is_inhabited {
+                            // FIXME: handle empty+hidden
+                            VariantVisibility::Empty
+                        } else if is_unstable || is_doc_hidden {
+                            VariantVisibility::Hidden
+                        } else {
+                            VariantVisibility::Visible
+                        };
+                        variants[idx] = visibility;
+                    }
+
+                    ConstructorSet::Variants { variants, non_exhaustive: is_declared_nonexhaustive }
+                }
+            }
+            ty::Adt(..) | ty::Tuple(..) | ty::Ref(..) => {
+                ConstructorSet::Single { empty: cx.is_uninhabited(ty) }
+            }
+            ty::Never => ConstructorSet::NoConstructors,
+            // This type is one for which we cannot list constructors, like `str` or `f64`.
+            // FIXME(Nadrieril): which of these are actually allowed?
+            ty::Float(_)
+            | ty::Str
+            | ty::Foreign(_)
+            | ty::RawPtr(_)
+            | ty::FnDef(_, _)
+            | ty::FnPtr(_)
+            | ty::Dynamic(_, _, _)
+            | ty::Closure(_, _)
+            | ty::Coroutine(_, _, _)
+            | ty::Alias(_, _)
+            | ty::Param(_)
+            | ty::Error(_) => ConstructorSet::Unlistable,
+            ty::CoroutineWitness(_, _) | ty::Bound(_, _) | ty::Placeholder(_) | ty::Infer(_) => {
+                bug!("Encountered unexpected type in `ConstructorSet::for_ty`: {ty:?}")
+            }
+        }
+    }
+
+    pub(crate) fn lower_pat_range_bdy(
+        &self,
+        bdy: PatRangeBoundary<'tcx>,
+        ty: Ty<'tcx>,
+    ) -> MaybeInfiniteInt {
+        match bdy {
+            PatRangeBoundary::NegInfinity => MaybeInfiniteInt::NegInfinity,
+            PatRangeBoundary::Finite(value) => {
+                let bits = value.eval_bits(self.tcx, self.param_env);
+                match *ty.kind() {
+                    ty::Int(ity) => {
+                        let size = Integer::from_int_ty(&self.tcx, ity).size().bits();
+                        MaybeInfiniteInt::new_finite_int(bits, size)
+                    }
+                    _ => MaybeInfiniteInt::new_finite_uint(bits),
+                }
+            }
+            PatRangeBoundary::PosInfinity => MaybeInfiniteInt::PosInfinity,
+        }
+    }
+
+    /// Note: the input patterns must have been lowered through
+    /// `rustc_mir_build::thir::pattern::check_match::MatchVisitor::lower_pattern`.
+    pub fn lower_pat(&self, pat: &Pat<'tcx>) -> DeconstructedPat<'p, 'tcx> {
+        let singleton = |pat| std::slice::from_ref(self.pattern_arena.alloc(pat));
+        let cx = self;
+        let ctor;
+        let fields: &[_];
+        match &pat.kind {
+            PatKind::AscribeUserType { subpattern, .. }
+            | PatKind::InlineConstant { subpattern, .. } => return self.lower_pat(subpattern),
+            PatKind::Binding { subpattern: Some(subpat), .. } => return self.lower_pat(subpat),
+            PatKind::Binding { subpattern: None, .. } | PatKind::Wild => {
+                ctor = Wildcard;
+                fields = &[];
+            }
+            PatKind::Deref { subpattern } => {
+                ctor = Single;
+                fields = singleton(self.lower_pat(subpattern));
+            }
+            PatKind::Leaf { subpatterns } | PatKind::Variant { subpatterns, .. } => {
+                match pat.ty.kind() {
+                    ty::Tuple(fs) => {
+                        ctor = Single;
+                        let mut wilds: SmallVec<[_; 2]> =
+                            fs.iter().map(|ty| DeconstructedPat::wildcard(ty, pat.span)).collect();
+                        for pat in subpatterns {
+                            wilds[pat.field.index()] = self.lower_pat(&pat.pattern);
+                        }
+                        fields = cx.pattern_arena.alloc_from_iter(wilds);
+                    }
+                    ty::Adt(adt, args) if adt.is_box() => {
+                        // The only legal patterns of type `Box` (outside `std`) are `_` and box
+                        // patterns. If we're here we can assume this is a box pattern.
+                        // FIXME(Nadrieril): A `Box` can in theory be matched either with `Box(_,
+                        // _)` or a box pattern. As a hack to avoid an ICE with the former, we
+                        // ignore fields other than the first one. This will trigger an error later
+                        // anyway.
+                        // See https://github.com/rust-lang/rust/issues/82772 ,
+                        // explanation: https://github.com/rust-lang/rust/pull/82789#issuecomment-796921977
+                        // The problem is that we can't know from the type whether we'll match
+                        // normally or through box-patterns. We'll have to figure out a proper
+                        // solution when we introduce generalized deref patterns. Also need to
+                        // prevent mixing of those two options.
+                        let pattern = subpatterns.into_iter().find(|pat| pat.field.index() == 0);
+                        let pat = if let Some(pat) = pattern {
+                            self.lower_pat(&pat.pattern)
+                        } else {
+                            DeconstructedPat::wildcard(args.type_at(0), pat.span)
+                        };
+                        ctor = Single;
+                        fields = singleton(pat);
+                    }
+                    ty::Adt(adt, _) => {
+                        ctor = match pat.kind {
+                            PatKind::Leaf { .. } => Single,
+                            PatKind::Variant { variant_index, .. } => Variant(variant_index),
+                            _ => bug!(),
+                        };
+                        let variant =
+                            &adt.variant(MatchCheckCtxt::variant_index_for_adt(&ctor, *adt));
+                        // For each field in the variant, we store the relevant index into `self.fields` if any.
+                        let mut field_id_to_id: Vec<Option<usize>> =
+                            (0..variant.fields.len()).map(|_| None).collect();
+                        let tys = cx
+                            .list_variant_nonhidden_fields(pat.ty, variant)
+                            .enumerate()
+                            .map(|(i, (field, ty))| {
+                                field_id_to_id[field.index()] = Some(i);
+                                ty
+                            });
+                        let mut wilds: SmallVec<[_; 2]> =
+                            tys.map(|ty| DeconstructedPat::wildcard(ty, pat.span)).collect();
+                        for pat in subpatterns {
+                            if let Some(i) = field_id_to_id[pat.field.index()] {
+                                wilds[i] = self.lower_pat(&pat.pattern);
+                            }
+                        }
+                        fields = cx.pattern_arena.alloc_from_iter(wilds);
+                    }
+                    _ => bug!("pattern has unexpected type: pat: {:?}, ty: {:?}", pat, pat.ty),
+                }
+            }
+            PatKind::Constant { value } => {
+                match pat.ty.kind() {
+                    ty::Bool => {
+                        ctor = match value.try_eval_bool(cx.tcx, cx.param_env) {
+                            Some(b) => Bool(b),
+                            None => Opaque(OpaqueId::new()),
+                        };
+                        fields = &[];
+                    }
+                    ty::Char | ty::Int(_) | ty::Uint(_) => {
+                        ctor = match value.try_eval_bits(cx.tcx, cx.param_env) {
+                            Some(bits) => {
+                                let x = match *pat.ty.kind() {
+                                    ty::Int(ity) => {
+                                        let size = Integer::from_int_ty(&cx.tcx, ity).size().bits();
+                                        MaybeInfiniteInt::new_finite_int(bits, size)
+                                    }
+                                    _ => MaybeInfiniteInt::new_finite_uint(bits),
+                                };
+                                IntRange(IntRange::from_singleton(x))
+                            }
+                            None => Opaque(OpaqueId::new()),
+                        };
+                        fields = &[];
+                    }
+                    ty::Float(ty::FloatTy::F32) => {
+                        ctor = match value.try_eval_bits(cx.tcx, cx.param_env) {
+                            Some(bits) => {
+                                use rustc_apfloat::Float;
+                                let value = rustc_apfloat::ieee::Single::from_bits(bits);
+                                F32Range(value, value, RangeEnd::Included)
+                            }
+                            None => Opaque(OpaqueId::new()),
+                        };
+                        fields = &[];
+                    }
+                    ty::Float(ty::FloatTy::F64) => {
+                        ctor = match value.try_eval_bits(cx.tcx, cx.param_env) {
+                            Some(bits) => {
+                                use rustc_apfloat::Float;
+                                let value = rustc_apfloat::ieee::Double::from_bits(bits);
+                                F64Range(value, value, RangeEnd::Included)
+                            }
+                            None => Opaque(OpaqueId::new()),
+                        };
+                        fields = &[];
+                    }
+                    ty::Ref(_, t, _) if t.is_str() => {
+                        // We want a `&str` constant to behave like a `Deref` pattern, to be compatible
+                        // with other `Deref` patterns. This could have been done in `const_to_pat`,
+                        // but that causes issues with the rest of the matching code.
+                        // So here, the constructor for a `"foo"` pattern is `&` (represented by
+                        // `Single`), and has one field. That field has constructor `Str(value)` and no
+                        // fields.
+                        // Note: `t` is `str`, not `&str`.
+                        let subpattern = DeconstructedPat::new(Str(*value), &[], *t, pat.span);
+                        ctor = Single;
+                        fields = singleton(subpattern)
+                    }
+                    // All constants that can be structurally matched have already been expanded
+                    // into the corresponding `Pat`s by `const_to_pat`. Constants that remain are
+                    // opaque.
+                    _ => {
+                        ctor = Opaque(OpaqueId::new());
+                        fields = &[];
+                    }
+                }
+            }
+            PatKind::Range(patrange) => {
+                let PatRange { lo, hi, end, .. } = patrange.as_ref();
+                let ty = pat.ty;
+                ctor = match ty.kind() {
+                    ty::Char | ty::Int(_) | ty::Uint(_) => {
+                        let lo = cx.lower_pat_range_bdy(*lo, ty);
+                        let hi = cx.lower_pat_range_bdy(*hi, ty);
+                        IntRange(IntRange::from_range(lo, hi, *end))
+                    }
+                    ty::Float(fty) => {
+                        use rustc_apfloat::Float;
+                        let lo = lo.as_finite().map(|c| c.eval_bits(cx.tcx, cx.param_env));
+                        let hi = hi.as_finite().map(|c| c.eval_bits(cx.tcx, cx.param_env));
+                        match fty {
+                            ty::FloatTy::F32 => {
+                                use rustc_apfloat::ieee::Single;
+                                let lo = lo.map(Single::from_bits).unwrap_or(-Single::INFINITY);
+                                let hi = hi.map(Single::from_bits).unwrap_or(Single::INFINITY);
+                                F32Range(lo, hi, *end)
+                            }
+                            ty::FloatTy::F64 => {
+                                use rustc_apfloat::ieee::Double;
+                                let lo = lo.map(Double::from_bits).unwrap_or(-Double::INFINITY);
+                                let hi = hi.map(Double::from_bits).unwrap_or(Double::INFINITY);
+                                F64Range(lo, hi, *end)
+                            }
+                        }
+                    }
+                    _ => bug!("invalid type for range pattern: {}", ty),
+                };
+                fields = &[];
+            }
+            PatKind::Array { prefix, slice, suffix } | PatKind::Slice { prefix, slice, suffix } => {
+                let array_len = match pat.ty.kind() {
+                    ty::Array(_, length) => {
+                        Some(length.eval_target_usize(cx.tcx, cx.param_env) as usize)
+                    }
+                    ty::Slice(_) => None,
+                    _ => span_bug!(pat.span, "bad ty {:?} for slice pattern", pat.ty),
+                };
+                let kind = if slice.is_some() {
+                    SliceKind::VarLen(prefix.len(), suffix.len())
+                } else {
+                    SliceKind::FixedLen(prefix.len() + suffix.len())
+                };
+                ctor = Slice(Slice::new(array_len, kind));
+                fields = cx.pattern_arena.alloc_from_iter(
+                    prefix.iter().chain(suffix.iter()).map(|p| self.lower_pat(&*p)),
+                )
+            }
+            PatKind::Or { .. } => {
+                ctor = Or;
+                let pats = expand_or_pat(pat);
+                fields =
+                    cx.pattern_arena.alloc_from_iter(pats.into_iter().map(|p| self.lower_pat(p)))
+            }
+            PatKind::Never => {
+                // FIXME(never_patterns): handle `!` in exhaustiveness. This is a sane default
+                // in the meantime.
+                ctor = Wildcard;
+                fields = &[];
+            }
+            PatKind::Error(_) => {
+                ctor = Opaque(OpaqueId::new());
+                fields = &[];
+            }
+        }
+        DeconstructedPat::new(ctor, fields, pat.ty, pat.span)
+    }
+
+    /// Convert back to a `thir::PatRangeBoundary` for diagnostic purposes.
+    /// Note: it is possible to get `isize/usize::MAX+1` here, as explained in the doc for
+    /// [`IntRange::split`]. This cannot be represented as a `Const`, so we represent it with
+    /// `PosInfinity`.
+    pub(crate) fn hoist_pat_range_bdy(
+        &self,
+        miint: MaybeInfiniteInt,
+        ty: Ty<'tcx>,
+    ) -> PatRangeBoundary<'tcx> {
+        use MaybeInfiniteInt::*;
+        let tcx = self.tcx;
+        match miint {
+            NegInfinity => PatRangeBoundary::NegInfinity,
+            Finite(_) => {
+                let size = ty.primitive_size(tcx);
+                let bits = match *ty.kind() {
+                    ty::Int(_) => miint.as_finite_int(size.bits()).unwrap(),
+                    _ => miint.as_finite_uint().unwrap(),
+                };
+                match Scalar::try_from_uint(bits, size) {
+                    Some(scalar) => {
+                        let value = mir::Const::from_scalar(tcx, scalar, ty);
+                        PatRangeBoundary::Finite(value)
+                    }
+                    // The value doesn't fit. Since `x >= 0` and 0 always encodes the minimum value
+                    // for a type, the problem isn't that the value is too small. So it must be too
+                    // large.
+                    None => PatRangeBoundary::PosInfinity,
+                }
+            }
+            JustAfterMax | PosInfinity => PatRangeBoundary::PosInfinity,
+        }
+    }
+
+    /// Whether the range denotes the fictitious values before `isize::MIN` or after
+    /// `usize::MAX`/`isize::MAX` (see doc of [`IntRange::split`] for why these exist).
+    pub fn is_range_beyond_boundaries(&self, range: &IntRange, ty: Ty<'tcx>) -> bool {
+        ty.is_ptr_sized_integral() && {
+            // The two invalid ranges are `NegInfinity..isize::MIN` (represented as
+            // `NegInfinity..0`), and `{u,i}size::MAX+1..PosInfinity`. `hoist_pat_range_bdy`
+            // converts `MAX+1` to `PosInfinity`, and we couldn't have `PosInfinity` in `range.lo`
+            // otherwise.
+            let lo = self.hoist_pat_range_bdy(range.lo, ty);
+            matches!(lo, PatRangeBoundary::PosInfinity)
+                || matches!(range.hi, MaybeInfiniteInt::Finite(0))
+        }
+    }
+
+    /// Convert back to a `thir::Pat` for diagnostic purposes.
+    pub(crate) fn hoist_pat_range(&self, range: &IntRange, ty: Ty<'tcx>) -> Pat<'tcx> {
+        use MaybeInfiniteInt::*;
+        let cx = self;
+        let kind = if matches!((range.lo, range.hi), (NegInfinity, PosInfinity)) {
+            PatKind::Wild
+        } else if range.is_singleton() {
+            let lo = cx.hoist_pat_range_bdy(range.lo, ty);
+            let value = lo.as_finite().unwrap();
+            PatKind::Constant { value }
+        } else {
+            // We convert to an inclusive range for diagnostics.
+            let mut end = RangeEnd::Included;
+            let mut lo = cx.hoist_pat_range_bdy(range.lo, ty);
+            if matches!(lo, PatRangeBoundary::PosInfinity) {
+                // The only reason to get `PosInfinity` here is the special case where
+                // `hoist_pat_range_bdy` found `{u,i}size::MAX+1`. So the range denotes the
+                // fictitious values after `{u,i}size::MAX` (see [`IntRange::split`] for why we do
+                // this). We show this to the user as `usize::MAX..` which is slightly incorrect but
+                // probably clear enough.
+                let c = ty.numeric_max_val(cx.tcx).unwrap();
+                let value = mir::Const::from_ty_const(c, cx.tcx);
+                lo = PatRangeBoundary::Finite(value);
+            }
+            let hi = if matches!(range.hi, Finite(0)) {
+                // The range encodes `..ty::MIN`, so we can't convert it to an inclusive range.
+                end = RangeEnd::Excluded;
+                range.hi
+            } else {
+                range.hi.minus_one()
+            };
+            let hi = cx.hoist_pat_range_bdy(hi, ty);
+            PatKind::Range(Box::new(PatRange { lo, hi, end, ty }))
+        };
+
+        Pat { ty, span: DUMMY_SP, kind }
+    }
+    /// Convert back to a `thir::Pat` for diagnostic purposes. This panics for patterns that don't
+    /// appear in diagnostics, like float ranges.
+    pub fn hoist_witness_pat(&self, pat: &WitnessPat<'tcx>) -> Pat<'tcx> {
+        let cx = self;
+        let is_wildcard = |pat: &Pat<'_>| matches!(pat.kind, PatKind::Wild);
+        let mut subpatterns = pat.iter_fields().map(|p| Box::new(cx.hoist_witness_pat(p)));
+        let kind = match pat.ctor() {
+            Bool(b) => PatKind::Constant { value: mir::Const::from_bool(cx.tcx, *b) },
+            IntRange(range) => return self.hoist_pat_range(range, pat.ty()),
+            Single | Variant(_) => match pat.ty().kind() {
+                ty::Tuple(..) => PatKind::Leaf {
+                    subpatterns: subpatterns
+                        .enumerate()
+                        .map(|(i, pattern)| FieldPat { field: FieldIdx::new(i), pattern })
+                        .collect(),
+                },
+                ty::Adt(adt_def, _) if adt_def.is_box() => {
+                    // Without `box_patterns`, the only legal pattern of type `Box` is `_` (outside
+                    // of `std`). So this branch is only reachable when the feature is enabled and
+                    // the pattern is a box pattern.
+                    PatKind::Deref { subpattern: subpatterns.next().unwrap() }
+                }
+                ty::Adt(adt_def, args) => {
+                    let variant_index =
+                        MatchCheckCtxt::variant_index_for_adt(&pat.ctor(), *adt_def);
+                    let variant = &adt_def.variant(variant_index);
+                    let subpatterns = cx
+                        .list_variant_nonhidden_fields(pat.ty(), variant)
+                        .zip(subpatterns)
+                        .map(|((field, _ty), pattern)| FieldPat { field, pattern })
+                        .collect();
+
+                    if adt_def.is_enum() {
+                        PatKind::Variant { adt_def: *adt_def, args, variant_index, subpatterns }
+                    } else {
+                        PatKind::Leaf { subpatterns }
+                    }
+                }
+                // Note: given the expansion of `&str` patterns done in `expand_pattern`, we should
+                // be careful to reconstruct the correct constant pattern here. However a string
+                // literal pattern will never be reported as a non-exhaustiveness witness, so we
+                // ignore this issue.
+                ty::Ref(..) => PatKind::Deref { subpattern: subpatterns.next().unwrap() },
+                _ => bug!("unexpected ctor for type {:?} {:?}", pat.ctor(), pat.ty()),
+            },
+            Slice(slice) => {
+                match slice.kind {
+                    SliceKind::FixedLen(_) => PatKind::Slice {
+                        prefix: subpatterns.collect(),
+                        slice: None,
+                        suffix: Box::new([]),
+                    },
+                    SliceKind::VarLen(prefix, _) => {
+                        let mut subpatterns = subpatterns.peekable();
+                        let mut prefix: Vec<_> = subpatterns.by_ref().take(prefix).collect();
+                        if slice.array_len.is_some() {
+                            // Improves diagnostics a bit: if the type is a known-size array, instead
+                            // of reporting `[x, _, .., _, y]`, we prefer to report `[x, .., y]`.
+                            // This is incorrect if the size is not known, since `[_, ..]` captures
+                            // arrays of lengths `>= 1` whereas `[..]` captures any length.
+                            while !prefix.is_empty() && is_wildcard(prefix.last().unwrap()) {
+                                prefix.pop();
+                            }
+                            while subpatterns.peek().is_some()
+                                && is_wildcard(subpatterns.peek().unwrap())
+                            {
+                                subpatterns.next();
+                            }
+                        }
+                        let suffix: Box<[_]> = subpatterns.collect();
+                        let wild = Pat::wildcard_from_ty(pat.ty());
+                        PatKind::Slice {
+                            prefix: prefix.into_boxed_slice(),
+                            slice: Some(Box::new(wild)),
+                            suffix,
+                        }
+                    }
+                }
+            }
+            &Str(value) => PatKind::Constant { value },
+            Wildcard | NonExhaustive | Hidden => PatKind::Wild,
+            Missing { .. } => bug!(
+                "trying to convert a `Missing` constructor into a `Pat`; this is probably a bug,
+                `Missing` should have been processed in `apply_constructors`"
+            ),
+            F32Range(..) | F64Range(..) | Opaque(..) | Or => {
+                bug!("can't convert to pattern: {:?}", pat)
+            }
+        };
+
+        Pat { ty: pat.ty(), span: DUMMY_SP, kind }
+    }
+
+    /// Best-effort `Debug` implementation.
+    pub(crate) fn debug_pat(
+        f: &mut fmt::Formatter<'_>,
+        pat: &DeconstructedPat<'p, 'tcx>,
+    ) -> fmt::Result {
+        let mut first = true;
+        let mut start_or_continue = |s| {
+            if first {
+                first = false;
+                ""
+            } else {
+                s
+            }
+        };
+        let mut start_or_comma = || start_or_continue(", ");
+
+        match pat.ctor() {
+            Single | Variant(_) => match pat.ty().kind() {
+                ty::Adt(def, _) if def.is_box() => {
+                    // Without `box_patterns`, the only legal pattern of type `Box` is `_` (outside
+                    // of `std`). So this branch is only reachable when the feature is enabled and
+                    // the pattern is a box pattern.
+                    let subpattern = pat.iter_fields().next().unwrap();
+                    write!(f, "box {subpattern:?}")
+                }
+                ty::Adt(..) | ty::Tuple(..) => {
+                    let variant = match pat.ty().kind() {
+                        ty::Adt(adt, _) => Some(
+                            adt.variant(MatchCheckCtxt::variant_index_for_adt(pat.ctor(), *adt)),
+                        ),
+                        ty::Tuple(_) => None,
+                        _ => unreachable!(),
+                    };
+
+                    if let Some(variant) = variant {
+                        write!(f, "{}", variant.name)?;
+                    }
+
+                    // Without `cx`, we can't know which field corresponds to which, so we can't
+                    // get the names of the fields. Instead we just display everything as a tuple
+                    // struct, which should be good enough.
+                    write!(f, "(")?;
+                    for p in pat.iter_fields() {
+                        write!(f, "{}", start_or_comma())?;
+                        write!(f, "{p:?}")?;
+                    }
+                    write!(f, ")")
+                }
+                // Note: given the expansion of `&str` patterns done in `expand_pattern`, we should
+                // be careful to detect strings here. However a string literal pattern will never
+                // be reported as a non-exhaustiveness witness, so we can ignore this issue.
+                ty::Ref(_, _, mutbl) => {
+                    let subpattern = pat.iter_fields().next().unwrap();
+                    write!(f, "&{}{:?}", mutbl.prefix_str(), subpattern)
+                }
+                _ => write!(f, "_"),
+            },
+            Slice(slice) => {
+                let mut subpatterns = pat.iter_fields();
+                write!(f, "[")?;
+                match slice.kind {
+                    SliceKind::FixedLen(_) => {
+                        for p in subpatterns {
+                            write!(f, "{}{:?}", start_or_comma(), p)?;
+                        }
+                    }
+                    SliceKind::VarLen(prefix_len, _) => {
+                        for p in subpatterns.by_ref().take(prefix_len) {
+                            write!(f, "{}{:?}", start_or_comma(), p)?;
+                        }
+                        write!(f, "{}", start_or_comma())?;
+                        write!(f, "..")?;
+                        for p in subpatterns {
+                            write!(f, "{}{:?}", start_or_comma(), p)?;
+                        }
+                    }
+                }
+                write!(f, "]")
+            }
+            Bool(b) => write!(f, "{b}"),
+            // Best-effort, will render signed ranges incorrectly
+            IntRange(range) => write!(f, "{range:?}"),
+            F32Range(lo, hi, end) => write!(f, "{lo}{end}{hi}"),
+            F64Range(lo, hi, end) => write!(f, "{lo}{end}{hi}"),
+            Str(value) => write!(f, "{value}"),
+            Opaque(..) => write!(f, "<constant pattern>"),
+            Or => {
+                for pat in pat.iter_fields() {
+                    write!(f, "{}{:?}", start_or_continue(" | "), pat)?;
+                }
+                Ok(())
+            }
+            Wildcard | Missing { .. } | NonExhaustive | Hidden => write!(f, "_ : {:?}", pat.ty()),
+        }
+    }
+}
+
+/// Recursively expand this pattern into its subpatterns. Only useful for or-patterns.
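+/// For example (a purely illustrative sketch), `0 | (1 | 2)` expands to the three subpatterns `0`,
+/// `1` and `2`, while a non-or-pattern expands to just itself.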
+fn expand_or_pat<'p, 'tcx>(pat: &'p Pat<'tcx>) -> Vec<&'p Pat<'tcx>> {
+    fn expand<'p, 'tcx>(pat: &'p Pat<'tcx>, vec: &mut Vec<&'p Pat<'tcx>>) {
+        if let PatKind::Or { pats } = &pat.kind {
+            for pat in pats.iter() {
+                expand(pat, vec);
+            }
+        } else {
+            vec.push(pat)
+        }
+    }
+
+    let mut pats = Vec::new();
+    expand(pat, &mut pats);
+    pats
+}
diff --git a/compiler/rustc_pattern_analysis/src/errors.rs b/compiler/rustc_pattern_analysis/src/errors.rs
new file mode 100644
index 00000000000..0efa8a0ec08
--- /dev/null
+++ b/compiler/rustc_pattern_analysis/src/errors.rs
@@ -0,0 +1,95 @@
+use crate::{cx::MatchCheckCtxt, pat::WitnessPat};
+
+use rustc_errors::{AddToDiagnostic, Diagnostic, SubdiagnosticMessage};
+use rustc_macros::{LintDiagnostic, Subdiagnostic};
+use rustc_middle::thir::Pat;
+use rustc_middle::ty::Ty;
+use rustc_span::Span;
+
+#[derive(Subdiagnostic)]
+#[label(pattern_analysis_uncovered)]
+pub struct Uncovered<'tcx> {
+    #[primary_span]
+    span: Span,
+    count: usize,
+    witness_1: Pat<'tcx>,
+    witness_2: Pat<'tcx>,
+    witness_3: Pat<'tcx>,
+    remainder: usize,
+}
+
+impl<'tcx> Uncovered<'tcx> {
+    pub fn new<'p>(
+        span: Span,
+        cx: &MatchCheckCtxt<'p, 'tcx>,
+        witnesses: Vec<WitnessPat<'tcx>>,
+    ) -> Self {
+        let witness_1 = cx.hoist_witness_pat(witnesses.get(0).unwrap());
+        Self {
+            span,
+            count: witnesses.len(),
+            // Substitute dummy values if there are fewer than 3 witnesses. These will never be read.
+            witness_2: witnesses
+                .get(1)
+                .map(|w| cx.hoist_witness_pat(w))
+                .unwrap_or_else(|| witness_1.clone()),
+            witness_3: witnesses
+                .get(2)
+                .map(|w| cx.hoist_witness_pat(w))
+                .unwrap_or_else(|| witness_1.clone()),
+            witness_1,
+            remainder: witnesses.len().saturating_sub(3),
+        }
+    }
+}
+
+#[derive(LintDiagnostic)]
+#[diag(pattern_analysis_overlapping_range_endpoints)]
+#[note]
+pub struct OverlappingRangeEndpoints<'tcx> {
+    #[label]
+    pub range: Span,
+    #[subdiagnostic]
+    pub overlap: Vec<Overlap<'tcx>>,
+}
+
+pub struct Overlap<'tcx> {
+    pub span: Span,
+    pub range: Pat<'tcx>,
+}
+
+impl<'tcx> AddToDiagnostic for Overlap<'tcx> {
+    fn add_to_diagnostic_with<F>(self, diag: &mut Diagnostic, _: F)
+    where
+        F: Fn(&mut Diagnostic, SubdiagnosticMessage) -> SubdiagnosticMessage,
+    {
+        let Overlap { span, range } = self;
+
+        // FIXME(mejrs) unfortunately `#[derive(LintDiagnostic)]`
+        // does not support `#[subdiagnostic(eager)]`...
+        let message = format!("this range overlaps on `{range}`...");
+        diag.span_label(span, message);
+    }
+}
+
+#[derive(LintDiagnostic)]
+#[diag(pattern_analysis_non_exhaustive_omitted_pattern)]
+#[help]
+#[note]
+pub(crate) struct NonExhaustiveOmittedPattern<'tcx> {
+    pub scrut_ty: Ty<'tcx>,
+    #[subdiagnostic]
+    pub uncovered: Uncovered<'tcx>,
+}
+
+#[derive(LintDiagnostic)]
+#[diag(pattern_analysis_non_exhaustive_omitted_pattern_lint_on_arm)]
+#[help]
+pub(crate) struct NonExhaustiveOmittedPatternLintOnArm {
+    #[label]
+    pub lint_span: Span,
+    #[suggestion(code = "#[{lint_level}({lint_name})]\n", applicability = "maybe-incorrect")]
+    pub suggest_lint_on_match: Option<Span>,
+    pub lint_level: &'static str,
+    pub lint_name: &'static str,
+}
diff --git a/compiler/rustc_pattern_analysis/src/lib.rs b/compiler/rustc_pattern_analysis/src/lib.rs
new file mode 100644
index 00000000000..07730aa49d3
--- /dev/null
+++ b/compiler/rustc_pattern_analysis/src/lib.rs
@@ -0,0 +1,56 @@
+//! Analysis of patterns, notably match exhaustiveness checking.
+
+pub mod constructor;
+pub mod cx;
+pub mod errors;
+pub(crate) mod lints;
+pub mod pat;
+pub mod usefulness;
+
+#[macro_use]
+extern crate tracing;
+#[macro_use]
+extern crate rustc_middle;
+
+rustc_fluent_macro::fluent_messages! { "../messages.ftl" }
+
+use lints::PatternColumn;
+use rustc_hir::HirId;
+use rustc_middle::ty::Ty;
+use usefulness::{compute_match_usefulness, UsefulnessReport};
+
+use crate::cx::MatchCheckCtxt;
+use crate::lints::{lint_nonexhaustive_missing_variants, lint_overlapping_range_endpoints};
+use crate::pat::DeconstructedPat;
+
+/// The arm of a match expression.
+#[derive(Clone, Copy, Debug)]
+pub struct MatchArm<'p, 'tcx> {
+    /// The pattern must have been lowered through `check_match::MatchVisitor::lower_pattern`.
+    pub pat: &'p DeconstructedPat<'p, 'tcx>,
+    pub hir_id: HirId,
+    pub has_guard: bool,
+}
+
+/// The entrypoint for this crate. Computes whether a match is exhaustive and which of its arms are
+/// useful, and runs some lints.
+pub fn analyze_match<'p, 'tcx>(
+    cx: &MatchCheckCtxt<'p, 'tcx>,
+    arms: &[MatchArm<'p, 'tcx>],
+    scrut_ty: Ty<'tcx>,
+) -> UsefulnessReport<'p, 'tcx> {
+    let pat_column = PatternColumn::new(arms);
+
+    let report = compute_match_usefulness(cx, arms, scrut_ty);
+
+    // Lint on ranges that overlap on their endpoints, which is likely a mistake.
+    lint_overlapping_range_endpoints(cx, &pat_column);
+
+    // Run the non_exhaustive_omitted_patterns lint. Only run on refutable patterns to avoid hitting
+    // `if let`s. Only run if the match is exhaustive; otherwise the lint would be redundant with
+    // the non-exhaustiveness error.
+    if cx.refutable && report.non_exhaustiveness_witnesses.is_empty() {
+        lint_nonexhaustive_missing_variants(cx, arms, &pat_column, scrut_ty)
+    }
+
+    report
+}
diff --git a/compiler/rustc_pattern_analysis/src/lints.rs b/compiler/rustc_pattern_analysis/src/lints.rs
new file mode 100644
index 00000000000..8ab559c9e7a
--- /dev/null
+++ b/compiler/rustc_pattern_analysis/src/lints.rs
@@ -0,0 +1,290 @@
+use smallvec::SmallVec;
+
+use rustc_data_structures::captures::Captures;
+use rustc_middle::ty::{self, Ty};
+use rustc_session::lint;
+use rustc_session::lint::builtin::NON_EXHAUSTIVE_OMITTED_PATTERNS;
+use rustc_span::Span;
+
+use crate::constructor::{Constructor, IntRange, MaybeInfiniteInt, SplitConstructorSet};
+use crate::cx::MatchCheckCtxt;
+use crate::errors::{
+    NonExhaustiveOmittedPattern, NonExhaustiveOmittedPatternLintOnArm, Overlap,
+    OverlappingRangeEndpoints, Uncovered,
+};
+use crate::pat::{DeconstructedPat, WitnessPat};
+use crate::usefulness::PatCtxt;
+use crate::MatchArm;
+
+/// A column of patterns in the matrix, where a column is the intuitive notion of "subpatterns that
+/// inspect the same subvalue/place".
+/// This is used to traverse patterns column-by-column for lints. Despite similarities with the
+/// algorithm in [`crate::usefulness`], this does a different traversal. Notably this is linear in
+/// the depth of patterns, whereas `compute_exhaustiveness_and_usefulness` is worst-case exponential
+/// (exhaustiveness is NP-complete). The core difference is that we treat sub-columns separately.
+///
+/// This must not contain an or-pattern. `specialize` takes care to expand them.
+///
+/// This is not used in the main algorithm; only in lints.
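+///
+/// For example (a minimal sketch), in
+/// ```
+/// match (true, 0u8) {
+///     (true, 0) => {}
+///     (false, 1..=10) => {}
+///     _ => {}
+/// }
+/// ```
+/// the first column contains the subpatterns `true`, `false` and `_`, and the second column
+/// contains `0`, `1..=10` and `_`.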
+#[derive(Debug)]
+pub(crate) struct PatternColumn<'p, 'tcx> {
+    patterns: Vec<&'p DeconstructedPat<'p, 'tcx>>,
+}
+
+impl<'p, 'tcx> PatternColumn<'p, 'tcx> {
+    pub(crate) fn new(arms: &[MatchArm<'p, 'tcx>]) -> Self {
+        let mut patterns = Vec::with_capacity(arms.len());
+        for arm in arms {
+            if arm.pat.is_or_pat() {
+                patterns.extend(arm.pat.flatten_or_pat())
+            } else {
+                patterns.push(arm.pat)
+            }
+        }
+        Self { patterns }
+    }
+
+    fn is_empty(&self) -> bool {
+        self.patterns.is_empty()
+    }
+    fn head_ty(&self) -> Option<Ty<'tcx>> {
+        if self.patterns.len() == 0 {
+            return None;
+        }
+        // If the type is opaque and it is revealed anywhere in the column, we take the revealed
+        // version. Otherwise we could encounter constructors for the revealed type and crash.
+        let is_opaque = |ty: Ty<'tcx>| matches!(ty.kind(), ty::Alias(ty::Opaque, ..));
+        let first_ty = self.patterns[0].ty();
+        if is_opaque(first_ty) {
+            for pat in &self.patterns {
+                let ty = pat.ty();
+                if !is_opaque(ty) {
+                    return Some(ty);
+                }
+            }
+        }
+        Some(first_ty)
+    }
+
+    /// Do constructor splitting on the constructors of the column.
+    fn analyze_ctors(&self, pcx: &PatCtxt<'_, 'p, 'tcx>) -> SplitConstructorSet<'tcx> {
+        let column_ctors = self.patterns.iter().map(|p| p.ctor());
+        pcx.cx.ctors_for_ty(pcx.ty).split(pcx, column_ctors)
+    }
+
+    fn iter<'a>(&'a self) -> impl Iterator<Item = &'p DeconstructedPat<'p, 'tcx>> + Captures<'a> {
+        self.patterns.iter().copied()
+    }
+
+    /// Does specialization: given a constructor, this takes the patterns from the column that match
+    /// the constructor, and outputs their fields.
+    /// This returns one column per field of the constructor. They usually all have the same length
+    /// (the number of patterns in `self` that matched `ctor`), except that we expand or-patterns,
+    /// which may change the lengths.
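+    ///
+    /// For example (informal notation), specializing the column `[Some(0), Some(1..=10), None, _]`
+    /// with the `Some` constructor yields the single sub-column `[0, 1..=10, _]`: the `None`
+    /// pattern is filtered out since it cannot match `Some` values, and the trailing wildcard
+    /// contributes a wildcard subpattern.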
+    fn specialize(&self, pcx: &PatCtxt<'_, 'p, 'tcx>, ctor: &Constructor<'tcx>) -> Vec<Self> {
+        let arity = ctor.arity(pcx);
+        if arity == 0 {
+            return Vec::new();
+        }
+
+        // We specialize the column by `ctor`. This gives us `arity`-many columns of patterns. These
+        // columns may have different lengths in the presence of or-patterns (this is why we can't
+        // reuse `Matrix`).
+        let mut specialized_columns: Vec<_> =
+            (0..arity).map(|_| Self { patterns: Vec::new() }).collect();
+        let relevant_patterns =
+            self.patterns.iter().filter(|pat| ctor.is_covered_by(pcx, pat.ctor()));
+        for pat in relevant_patterns {
+            let specialized = pat.specialize(pcx, ctor);
+            for (subpat, column) in specialized.iter().zip(&mut specialized_columns) {
+                if subpat.is_or_pat() {
+                    column.patterns.extend(subpat.flatten_or_pat())
+                } else {
+                    column.patterns.push(subpat)
+                }
+            }
+        }
+
+        assert!(
+            !specialized_columns[0].is_empty(),
+            "ctor {ctor:?} was listed as present but isn't;
+            there is an inconsistency between `Constructor::is_covered_by` and `ConstructorSet::split`"
+        );
+        specialized_columns
+    }
+}
+
+/// Traverse the patterns to collect any variants of a non_exhaustive enum that fail to be mentioned
+/// in a given column.
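+///
+/// For example (a sketch; `std::io::ErrorKind` is a `#[non_exhaustive]` enum defined in another
+/// crate), given the column of this match:
+/// ```text
+/// match kind {
+///     std::io::ErrorKind::NotFound => {}
+///     _ => {}
+/// }
+/// ```
+/// only `NotFound` is mentioned, so every other visible `ErrorKind` variant is collected as a
+/// witness.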
+#[instrument(level = "debug", skip(cx), ret)]
+fn collect_nonexhaustive_missing_variants<'p, 'tcx>(
+    cx: &MatchCheckCtxt<'p, 'tcx>,
+    column: &PatternColumn<'p, 'tcx>,
+) -> Vec<WitnessPat<'tcx>> {
+    let Some(ty) = column.head_ty() else {
+        return Vec::new();
+    };
+    let pcx = &PatCtxt::new_dummy(cx, ty);
+
+    let set = column.analyze_ctors(pcx);
+    if set.present.is_empty() {
+        // We can't consistently handle the case where no constructors are present (it would
+        // require digging arbitrarily deep through any type in case there's a non_exhaustive enum
+        // somewhere), so for consistency we also refuse to handle the top-level case, even though
+        // we could handle it there.
+        return vec![];
+    }
+
+    let mut witnesses = Vec::new();
+    if cx.is_foreign_non_exhaustive_enum(ty) {
+        witnesses.extend(
+            set.missing
+                .into_iter()
+                // This will list missing visible variants.
+                .filter(|c| !matches!(c, Constructor::Hidden | Constructor::NonExhaustive))
+                .map(|missing_ctor| WitnessPat::wild_from_ctor(pcx, missing_ctor)),
+        )
+    }
+
+    // Recurse into the fields.
+    for ctor in set.present {
+        let specialized_columns = column.specialize(pcx, &ctor);
+        let wild_pat = WitnessPat::wild_from_ctor(pcx, ctor);
+        for (i, col_i) in specialized_columns.iter().enumerate() {
+            // Compute witnesses for each column.
+            let wits_for_col_i = collect_nonexhaustive_missing_variants(cx, col_i);
+            // For each witness, we build a new pattern in the shape of `ctor(_, _, wit, _, _)`,
+            // adding enough wildcards to match `arity`.
+            for wit in wits_for_col_i {
+                let mut pat = wild_pat.clone();
+                pat.fields[i] = wit;
+                witnesses.push(pat);
+            }
+        }
+    }
+    witnesses
+}
+
+pub(crate) fn lint_nonexhaustive_missing_variants<'p, 'tcx>(
+    cx: &MatchCheckCtxt<'p, 'tcx>,
+    arms: &[MatchArm<'p, 'tcx>],
+    pat_column: &PatternColumn<'p, 'tcx>,
+    scrut_ty: Ty<'tcx>,
+) {
+    if !matches!(
+        cx.tcx.lint_level_at_node(NON_EXHAUSTIVE_OMITTED_PATTERNS, cx.match_lint_level).0,
+        rustc_session::lint::Level::Allow
+    ) {
+        let witnesses = collect_nonexhaustive_missing_variants(cx, pat_column);
+        if !witnesses.is_empty() {
+            // Report that a match of a `non_exhaustive` enum marked with `non_exhaustive_omitted_patterns`
+            // is not exhaustive enough.
+            //
+            // NB: The partner lint for structs lives in `compiler/rustc_hir_analysis/src/check/pat.rs`.
+            cx.tcx.emit_spanned_lint(
+                NON_EXHAUSTIVE_OMITTED_PATTERNS,
+                cx.match_lint_level,
+                cx.scrut_span,
+                NonExhaustiveOmittedPattern {
+                    scrut_ty,
+                    uncovered: Uncovered::new(cx.scrut_span, cx, witnesses),
+                },
+            );
+        }
+    } else {
+        // We used to allow putting the `#[allow(non_exhaustive_omitted_patterns)]` on a match
+        // arm. This no longer makes sense so we warn users, to avoid silently breaking their
+        // usage of the lint.
+        for arm in arms {
+            let (lint_level, lint_level_source) =
+                cx.tcx.lint_level_at_node(NON_EXHAUSTIVE_OMITTED_PATTERNS, arm.hir_id);
+            if !matches!(lint_level, rustc_session::lint::Level::Allow) {
+                let decorator = NonExhaustiveOmittedPatternLintOnArm {
+                    lint_span: lint_level_source.span(),
+                    suggest_lint_on_match: cx.whole_match_span.map(|span| span.shrink_to_lo()),
+                    lint_level: lint_level.as_str(),
+                    lint_name: "non_exhaustive_omitted_patterns",
+                };
+
+                use rustc_errors::DecorateLint;
+                let mut err = cx.tcx.sess.struct_span_warn(arm.pat.span(), "");
+                err.set_primary_message(decorator.msg());
+                decorator.decorate_lint(&mut err);
+                err.emit();
+            }
+        }
+    }
+}
+
+/// Traverse the patterns to warn the user about ranges that overlap on their endpoints.
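+///
+/// For example (a minimal sketch):
+/// ```
+/// # fn f(x: u8) {
+/// match x {
+///     0..=10 => {}
+///     10..=20 => {} // both ranges contain `10`, which is what the lint warns about
+///     _ => {}
+/// }
+/// # }
+/// ```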
+#[instrument(level = "debug", skip(cx))]
+pub(crate) fn lint_overlapping_range_endpoints<'p, 'tcx>(
+    cx: &MatchCheckCtxt<'p, 'tcx>,
+    column: &PatternColumn<'p, 'tcx>,
+) {
+    let Some(ty) = column.head_ty() else {
+        return;
+    };
+    let pcx = &PatCtxt::new_dummy(cx, ty);
+
+    let set = column.analyze_ctors(pcx);
+
+    if matches!(ty.kind(), ty::Char | ty::Int(_) | ty::Uint(_)) {
+        let emit_lint = |overlap: &IntRange, this_span: Span, overlapped_spans: &[Span]| {
+            let overlap_as_pat = cx.hoist_pat_range(overlap, ty);
+            let overlaps: Vec<_> = overlapped_spans
+                .iter()
+                .copied()
+                .map(|span| Overlap { range: overlap_as_pat.clone(), span })
+                .collect();
+            cx.tcx.emit_spanned_lint(
+                lint::builtin::OVERLAPPING_RANGE_ENDPOINTS,
+                cx.match_lint_level,
+                this_span,
+                OverlappingRangeEndpoints { overlap: overlaps, range: this_span },
+            );
+        };
+
+        // If two ranges overlapped, the split set will contain their intersection as a singleton.
+        let split_int_ranges = set.present.iter().filter_map(|c| c.as_int_range());
+        for overlap_range in split_int_ranges.clone() {
+            if overlap_range.is_singleton() {
+                let overlap: MaybeInfiniteInt = overlap_range.lo;
+                // Ranges that look like `lo..=overlap`.
+                let mut prefixes: SmallVec<[_; 1]> = Default::default();
+                // Ranges that look like `overlap..=hi`.
+                let mut suffixes: SmallVec<[_; 1]> = Default::default();
+                // Iterate on patterns that contained `overlap`.
+                for pat in column.iter() {
+                    let this_span = pat.span();
+                    let Constructor::IntRange(this_range) = pat.ctor() else { continue };
+                    if this_range.is_singleton() {
+                        // Don't lint when one of the ranges is a singleton.
+                        continue;
+                    }
+                    if this_range.lo == overlap {
+                        // `this_range` looks like `overlap..=this_range.hi`; it overlaps with any
+                        // ranges that look like `lo..=overlap`.
+                        if !prefixes.is_empty() {
+                            emit_lint(overlap_range, this_span, &prefixes);
+                        }
+                        suffixes.push(this_span)
+                    } else if this_range.hi == overlap.plus_one() {
+                        // `this_range` looks like `this_range.lo..=overlap`; it overlaps with any
+                        // ranges that look like `overlap..=hi`.
+                        if !suffixes.is_empty() {
+                            emit_lint(overlap_range, this_span, &suffixes);
+                        }
+                        prefixes.push(this_span)
+                    }
+                }
+            }
+        }
+    } else {
+        // Recurse into the fields.
+        for ctor in set.present {
+            for col in column.specialize(pcx, &ctor) {
+                lint_overlapping_range_endpoints(cx, &col);
+            }
+        }
+    }
+}
diff --git a/compiler/rustc_pattern_analysis/src/pat.rs b/compiler/rustc_pattern_analysis/src/pat.rs
new file mode 100644
index 00000000000..404651124ad
--- /dev/null
+++ b/compiler/rustc_pattern_analysis/src/pat.rs
@@ -0,0 +1,205 @@
+//! As explained in [`crate::usefulness`], values and patterns are made from constructors applied to
+//! fields. This file defines types that represent patterns in this way.
+use std::cell::Cell;
+use std::fmt;
+
+use smallvec::{smallvec, SmallVec};
+
+use rustc_data_structures::captures::Captures;
+use rustc_middle::ty::{self, Ty};
+use rustc_span::{Span, DUMMY_SP};
+
+use self::Constructor::*;
+use self::SliceKind::*;
+
+use crate::constructor::{Constructor, SliceKind};
+use crate::cx::MatchCheckCtxt;
+use crate::usefulness::PatCtxt;
+
+/// Values and patterns can be represented as a constructor applied to some fields. This represents
+/// a pattern in this form.
+/// This also uses interior mutability to keep track of whether the pattern has been found reachable
+/// during analysis. For this reason it cannot be cloned.
+/// A `DeconstructedPat` will almost always come from user input; the only exceptions are some
+/// `Wildcard`s introduced during specialization.
+///
+/// Note that the number of fields may not match the fields declared in the original struct/variant.
+/// This happens if a private or `non_exhaustive` field is uninhabited, because the code mustn't
+/// observe that it is uninhabited. In that case that field is not included in `fields`. Care must
+/// be taken when converting to/from `thir::Pat`.
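+///
+/// As a rough illustration, the pattern `Some(0u32)` is deconstructed into a `Variant` constructor
+/// for `Some` with a single field, which is itself an `IntRange` constructor for the singleton
+/// range `0..=0` with no fields.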
+pub struct DeconstructedPat<'p, 'tcx> {
+    ctor: Constructor<'tcx>,
+    fields: &'p [DeconstructedPat<'p, 'tcx>],
+    ty: Ty<'tcx>,
+    span: Span,
+    /// Whether removing this arm would change the behavior of the match expression.
+    useful: Cell<bool>,
+}
+
+impl<'p, 'tcx> DeconstructedPat<'p, 'tcx> {
+    pub fn wildcard(ty: Ty<'tcx>, span: Span) -> Self {
+        Self::new(Wildcard, &[], ty, span)
+    }
+
+    pub fn new(
+        ctor: Constructor<'tcx>,
+        fields: &'p [DeconstructedPat<'p, 'tcx>],
+        ty: Ty<'tcx>,
+        span: Span,
+    ) -> Self {
+        DeconstructedPat { ctor, fields, ty, span, useful: Cell::new(false) }
+    }
+
+    pub(crate) fn is_or_pat(&self) -> bool {
+        matches!(self.ctor, Or)
+    }
+    /// Expand this (possibly-nested) or-pattern into its alternatives.
+    pub(crate) fn flatten_or_pat(&'p self) -> SmallVec<[&'p Self; 1]> {
+        if self.is_or_pat() {
+            self.iter_fields().flat_map(|p| p.flatten_or_pat()).collect()
+        } else {
+            smallvec![self]
+        }
+    }
+
+    pub fn ctor(&self) -> &Constructor<'tcx> {
+        &self.ctor
+    }
+    pub fn ty(&self) -> Ty<'tcx> {
+        self.ty
+    }
+    pub fn span(&self) -> Span {
+        self.span
+    }
+
+    pub fn iter_fields<'a>(
+        &'a self,
+    ) -> impl Iterator<Item = &'p DeconstructedPat<'p, 'tcx>> + Captures<'a> {
+        self.fields.iter()
+    }
+
+    /// Specialize this pattern with a constructor.
+    /// `other_ctor` can be different from `self.ctor`, but must be covered by it.
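+    ///
+    /// For example (informally), specializing the variable-length slice pattern `[a, .., z]` with
+    /// a fixed-length slice constructor of arity 4 yields the fields `[a, _, _, z]`: the middle is
+    /// padded with wildcards so the arities match.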
+    pub(crate) fn specialize<'a>(
+        &'a self,
+        pcx: &PatCtxt<'_, 'p, 'tcx>,
+        other_ctor: &Constructor<'tcx>,
+    ) -> SmallVec<[&'p DeconstructedPat<'p, 'tcx>; 2]> {
+        match (&self.ctor, other_ctor) {
+            (Wildcard, _) => {
+                // We return a wildcard for each field of `other_ctor`.
+                pcx.cx.ctor_wildcard_fields(other_ctor, pcx.ty).iter().collect()
+            }
+            (Slice(self_slice), Slice(other_slice))
+                if self_slice.arity() != other_slice.arity() =>
+            {
+                // The only tricky case: two slices of different arity. Since `self_slice` covers
+                // `other_slice`, `self_slice` must be `VarLen`, i.e. of the form
+                // `[prefix, .., suffix]`. Moreover `other_slice` is guaranteed to have a larger
+                // arity. So we fill the middle part with enough wildcards to reach the length of
+                // the new, larger slice.
+                match self_slice.kind {
+                    FixedLen(_) => bug!("{:?} doesn't cover {:?}", self_slice, other_slice),
+                    VarLen(prefix, suffix) => {
+                        let (ty::Slice(inner_ty) | ty::Array(inner_ty, _)) = *self.ty.kind() else {
+                            bug!("bad slice pattern {:?} {:?}", self.ctor, self.ty);
+                        };
+                        let prefix = &self.fields[..prefix];
+                        let suffix = &self.fields[self_slice.arity() - suffix..];
+                        let wildcard: &_ = pcx
+                            .cx
+                            .pattern_arena
+                            .alloc(DeconstructedPat::wildcard(inner_ty, DUMMY_SP));
+                        let extra_wildcards = other_slice.arity() - self_slice.arity();
+                        let extra_wildcards = (0..extra_wildcards).map(|_| wildcard);
+                        prefix.iter().chain(extra_wildcards).chain(suffix).collect()
+                    }
+                }
+            }
+            _ => self.fields.iter().collect(),
+        }
+    }
+
+    /// We keep track, for each pattern, of whether it was ever useful during the analysis. This
+    /// is used with `redundant_spans` to report redundant subpatterns arising from or-patterns.
+    pub(crate) fn set_useful(&self) {
+        self.useful.set(true)
+    }
+    pub(crate) fn is_useful(&self) -> bool {
+        if self.useful.get() {
+            true
+        } else if self.is_or_pat() && self.iter_fields().any(|f| f.is_useful()) {
+            // We always expand or patterns in the matrix, so we will never see the actual
+            // or-pattern (the one with constructor `Or`) in the column. As such, it will not be
+            // marked as useful itself, only its children will. We recover this information here.
+            self.set_useful();
+            true
+        } else {
+            false
+        }
+    }
+
+    /// Report the spans of subpatterns that were not useful, if any.
+    pub(crate) fn redundant_spans(&self) -> Vec<Span> {
+        let mut spans = Vec::new();
+        self.collect_redundant_spans(&mut spans);
+        spans
+    }
+    fn collect_redundant_spans(&self, spans: &mut Vec<Span>) {
+        // We don't look at subpatterns if we already reported the whole pattern as redundant.
+        if !self.is_useful() {
+            spans.push(self.span);
+        } else {
+            for p in self.iter_fields() {
+                p.collect_redundant_spans(spans);
+            }
+        }
+    }
+}
+
+/// This is mostly copied from the `Pat` impl. This is best effort and not good enough for a
+/// `Display` impl.
+impl<'p, 'tcx> fmt::Debug for DeconstructedPat<'p, 'tcx> {
+    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
+        MatchCheckCtxt::debug_pat(f, self)
+    }
+}
+
+/// Same idea as `DeconstructedPat`, except this is a fictitious pattern built up for diagnostic
+/// purposes. As such it doesn't use interning and can be cloned.
+#[derive(Debug, Clone)]
+pub struct WitnessPat<'tcx> {
+    ctor: Constructor<'tcx>,
+    pub(crate) fields: Vec<WitnessPat<'tcx>>,
+    ty: Ty<'tcx>,
+}
+
+impl<'tcx> WitnessPat<'tcx> {
+    pub(crate) fn new(ctor: Constructor<'tcx>, fields: Vec<Self>, ty: Ty<'tcx>) -> Self {
+        Self { ctor, fields, ty }
+    }
+    pub(crate) fn wildcard(ty: Ty<'tcx>) -> Self {
+        Self::new(Wildcard, Vec::new(), ty)
+    }
+
+    /// Construct a pattern that matches everything that starts with this constructor.
+    /// For example, if `ctor` is a `Constructor::Variant` for `Option::Some`, we get the pattern
+    /// `Some(_)`.
+    pub(crate) fn wild_from_ctor(pcx: &PatCtxt<'_, '_, 'tcx>, ctor: Constructor<'tcx>) -> Self {
+        let field_tys =
+            pcx.cx.ctor_wildcard_fields(&ctor, pcx.ty).iter().map(|deco_pat| deco_pat.ty());
+        let fields = field_tys.map(|ty| Self::wildcard(ty)).collect();
+        Self::new(ctor, fields, pcx.ty)
+    }
+
+    pub fn ctor(&self) -> &Constructor<'tcx> {
+        &self.ctor
+    }
+    pub fn ty(&self) -> Ty<'tcx> {
+        self.ty
+    }
+
+    pub fn iter_fields<'a>(&'a self) -> impl Iterator<Item = &'a WitnessPat<'tcx>> {
+        self.fields.iter()
+    }
+}
diff --git a/compiler/rustc_pattern_analysis/src/usefulness.rs b/compiler/rustc_pattern_analysis/src/usefulness.rs
new file mode 100644
index 00000000000..f268a551547
--- /dev/null
+++ b/compiler/rustc_pattern_analysis/src/usefulness.rs
@@ -0,0 +1,1319 @@
+//! # Match exhaustiveness and redundancy algorithm
+//!
+//! This file contains the logic for exhaustiveness and usefulness checking for pattern-matching.
+//! Specifically, given a list of patterns in a match, we can tell whether:
+//! (a) a given pattern is redundant
+//! (b) the patterns cover every possible value for the type (exhaustiveness)
+//!
+//! The algorithm implemented here is inspired from the one described in [this
+//! paper](http://moscova.inria.fr/~maranget/papers/warn/index.html). We have however changed it in
+//! various ways to accommodate the variety of patterns that Rust supports. We thus explain our
+//! version here, without being as precise.
+//!
+//! Fun fact: computing exhaustiveness is NP-complete, because we can encode a SAT problem as an
+//! exhaustiveness problem. See [here](https://niedzejkob.p4.team/rust-np) for the fun details.
+//!
+//!
+//! # Summary
+//!
+//! The algorithm is given as input a list of patterns, one for each arm of a match, and computes
+//! the following:
+//! - a set of values that match none of the patterns (if any),
+//! - for each subpattern (taking into account or-patterns), whether removing it would change
+//!     anything about how the match executes, i.e. whether it is useful/not redundant.
+//!
+//! To a first approximation, the algorithm works by exploring all possible values for the type
+//! being matched on, and determining which arm(s) catch which value. To make this tractable we
+//! cleverly group together values, as we'll see below.
+//!
+//! The entrypoint of this file is the [`compute_match_usefulness`] function, which computes
+//! usefulness for each subpattern and exhaustiveness for the whole match.
+//!
+//! In this page we explain the necessary concepts to understand how the algorithm works.
+//!
+//!
+//! # Usefulness
+//!
+//! The central concept of this file is the notion of "usefulness". Given some patterns `p_1 ..
+//! p_n`, a pattern `q` is said to be *useful* if there is a value that is matched by `q` and by
+//! none of the `p_i`. We write `usefulness(p_1 .. p_n, q)` for a function that returns a list of
+//! such values. The aim of this file is to compute it efficiently.
+//!
+//! This is enough to compute redundancy: a pattern in a `match` expression is redundant iff it is
+//! not useful w.r.t. the patterns above it:
+//! ```compile_fail,E0004
+//! # #![feature(exclusive_range_pattern)]
+//! # fn foo() {
+//! match Some(0u32) {
+//!     Some(0..100) => {},
+//!     Some(90..190) => {}, // useful: `Some(150)` is matched by this but not the branch above
+//!     Some(50..150) => {}, // redundant: all the values this matches are already matched by
+//!                          //   the branches above
+//!     None => {},          // useful: `None` is matched by this but not the branches above
+//! }
+//! # }
+//! ```
+//!
+//! This is also enough to compute exhaustiveness: a match is exhaustive iff the wildcard `_`
+//! pattern is _not_ useful w.r.t. the patterns in the match. The values returned by `usefulness`
+//! are used to tell the user which values are missing.
+//! ```compile_fail,E0004
+//! # fn foo(x: Option<u32>) {
+//! match x {
+//!     None => {},
+//!     Some(0) => {},
+//!     // not exhaustive: `_` is useful because it matches `Some(1)`
+//! }
+//! # }
+//! ```
+//!
+//!
+//! # Constructors and fields
+//!
+//! In the value `Pair(Some(0), true)`, `Pair` is called the constructor of the value, and `Some(0)`
+//! and `true` are its fields. Every matchable value can be decomposed in this way. Examples of
+//! constructors are: `Some`, `None`, `(,)` (the 2-tuple constructor), `Foo {..}` (the constructor
+//! for a struct `Foo`), and `2` (the constructor for the number `2`).
+//!
+//! Each constructor takes a fixed number of fields; this is called its arity. `Pair` and `(,)` have
+//! arity 2, `Some` has arity 1, `None` and `42` have arity 0. Each type has a known set of
+//! constructors. Some types have many constructors (like `u64`) or even infinitely many (like
+//! `&str` and `&[T]`).
+//!
+//! Patterns are similar: `Pair(Some(_), _)` has constructor `Pair` and two fields. The difference
+//! is that we get some extra pattern-only constructors, namely: the wildcard `_`, variable
+//! bindings, integer ranges like `0..=10`, and variable-length slices like `[_, .., _]`. We treat
+//! or-patterns separately, see the dedicated section below.
+//!
+//! Now to check if a value `v` matches a pattern `p`, we check if `v`'s constructor matches `p`'s
+//! constructor, then recursively compare their fields if necessary. A few representative examples:
+//!
+//! - `matches!(v, _) := true`
+//! - `matches!((v0,  v1), (p0,  p1)) := matches!(v0, p0) && matches!(v1, p1)`
+//! - `matches!(Foo { bar: v0, baz: v1 }, Foo { bar: p0, baz: p1 }) := matches!(v0, p0) && matches!(v1, p1)`
+//! - `matches!(Ok(v0), Ok(p0)) := matches!(v0, p0)`
+//! - `matches!(Ok(v0), Err(p0)) := false` (incompatible variants)
+//! - `matches!(v, 1..=100) := matches!(v, 1) || ... || matches!(v, 100)`
+//! - `matches!([v0], [p0, .., p1]) := false` (incompatible lengths)
+//! - `matches!([v0, v1, v2], [p0, .., p1]) := matches!(v0, p0) && matches!(v2, p1)`
+//!
+//! Constructors and relevant operations are defined in the [`crate::constructor`] module. A
+//! representation of patterns that uses constructors is available in [`crate::pat`]. The question
+//! of whether a constructor is matched by another one is answered by
+//! [`Constructor::is_covered_by`].
+//!
+//! Note 1: variable bindings (like the `x` in `Some(x)`) match anything, so we treat them as wildcards.
+//! Note 2: this only applies to matchable values. For example a value of type `Rc<u64>` can't be
+//! deconstructed that way.
+//!
+//!
+//!
+//! # Specialization
+//!
+//! The examples in the previous section motivate the operation at the heart of the algorithm:
+//! "specialization". It captures this idea of "removing one layer of constructor".
+//!
+//! `specialize(c, p)` takes a value-only constructor `c` and a pattern `p`, and returns a
+//! pattern-tuple or nothing. It works as follows:
+//!
+//! - Specializing for the wrong constructor returns nothing
+//!
+//!   - `specialize(None, Some(p0)) := <nothing>`
+//!   - `specialize([,,,], [p0]) := <nothing>`
+//!
+//! - Specializing for the correct constructor returns a tuple of the fields
+//!
+//!   - `specialize(Variant1, Variant1(p0, p1, p2)) := (p0, p1, p2)`
+//!   - `specialize(Foo{ bar, baz, quz }, Foo { bar: p0, baz: p1, .. }) := (p0, p1, _)`
+//!   - `specialize([,,,], [p0, .., p1]) := (p0, _, _, p1)`
+//!
+//! We get the following property: for any values `v_1, .., v_n` of appropriate types, we have:
+//! ```text
+//! matches!(c(v_1, .., v_n), p)
+//! <=> specialize(c, p) returns something
+//!     && matches!((v_1, .., v_n), specialize(c, p))
+//! ```
+//!
+//! We also extend specialization to pattern-tuples by applying it to the first pattern:
+//! `specialize(c, (p_0, .., p_m)) := specialize(c, p_0) ++ (p_1, .., p_m)`
+//! where `++` is concatenation of tuples.
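+//!
+//! For example, following the single-pattern cases above:
+//! ```text
+//! specialize(Some, (Some(p0), q0, q1)) := (p0, q0, q1)
+//! specialize(None, (Some(p0), q0, q1)) := <nothing>
+//! ```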
+//!
+//!
+//! The previous property extends to pattern-tuples:
+//! ```text
+//! matches!((c(v_1, .., v_n), w_1, .., w_m), (p_0, p_1, .., p_m))
+//! <=> specialize(c, p_0) returns something
+//!     && matches!((v_1, .., v_n, w_1, .., w_m), specialize(c, (p_0, p_1, .., p_m)))
+//! ```
+//!
+//! Whether specialization returns something or not is given by [`Constructor::is_covered_by`].
+//! Specialization of a pattern is computed in [`DeconstructedPat::specialize`]. Specialization for
+//! a pattern-tuple is computed in [`PatStack::pop_head_constructor`]. Finally, specialization for a
+//! set of pattern-tuples is computed in [`Matrix::specialize_constructor`].
+//!
+//!
+//!
+//! # Undoing specialization
+//!
+//! To construct witnesses we will need an inverse of specialization. If `c` is a constructor of
+//! arity `n`, we define `unspecialize` as:
+//! `unspecialize(c, (p_1, .., p_n, q_1, .., q_m)) := (c(p_1, .., p_n), q_1, .., q_m)`.
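+//!
+//! For example, with `Some` (arity 1) and `None` (arity 0):
+//! ```text
+//! unspecialize(Some, (p0, q0, q1)) := (Some(p0), q0, q1)
+//! unspecialize(None, (q0, q1)) := (None, q0, q1)
+//! ```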
+//!
+//! This is done for a single witness-tuple in [`WitnessStack::apply_constructor`], and for a set of
+//! witness-tuples in [`WitnessMatrix::apply_constructor`].
+//!
+//!
+//!
+//! # Computing usefulness
+//!
+//! We now present a naive version of the algorithm for computing usefulness. From now on we operate
+//! on pattern-tuples.
+//!
+//! Let `tp_1, .., tp_n` and `tq` be length-m tuples of patterns for the same type `(T_1, .., T_m)`.
+//! We compute `usefulness(tp_1, .., tp_n, tq)` as follows:
+//!
+//! - Base case: `m == 0`.
+//!     The pattern-tuples are all empty, i.e. they're all `()`. Thus `tq` is useful iff there are
+//!     no rows above it, i.e. if `n == 0`. In that case we return `()` as a witness-tuple of
+//!     usefulness of `tq`.
+//!
+//! - Inductive case: `m > 0`.
+//!     In this naive version, we list all the possible constructors for values of type `T_1` (we
+//!     will be more clever in the next section).
+//!
+//!     - For each such constructor `c` for which `specialize(c, tq)` is not nothing:
+//!         - We recursively compute `usefulness(specialize(c, tp_1) ... specialize(c, tp_n), specialize(c, tq))`,
+//!             where we discard any `specialize(c, tp_i)` that returns nothing.
+//!         - For each witness-tuple `w` found, we apply `unspecialize(c, w)` to it.
+//!
+//!     - We return all the witnesses found, if any.
+//!
+//!
+//! Let's take the following example:
+//! ```compile_fail,E0004
+//! # enum Enum { Variant1(()), Variant2(Option<bool>, u32)}
+//! # use Enum::*;
+//! # fn foo(x: Enum) {
+//! match x {
+//!     Variant1(_) => {} // `p1`
+//!     Variant2(None, 0) => {} // `p2`
+//!     Variant2(Some(true), 0) => {} // `q`
+//! }
+//! # }
+//! ```
+//!
+//! To compute the usefulness of `q`, we would proceed as follows:
+//! ```text
+//! Start:
+//!   `tp1 = [Variant1(_)]`
+//!   `tp2 = [Variant2(None, 0)]`
+//!   `tq  = [Variant2(Some(true), 0)]`
+//!
+//!   Constructors are `Variant1` and `Variant2`. Only `Variant2` can specialize `tq`.
+//!   Specialize with `Variant2`:
+//!     `tp2 = [None, 0]`
+//!     `tq  = [Some(true), 0]`
+//!
+//!     Constructors are `None` and `Some`. Only `Some` can specialize `tq`.
+//!     Specialize with `Some`:
+//!       `tq  = [true, 0]`
+//!
+//!       Constructors are `false` and `true`. Only `true` can specialize `tq`.
+//!       Specialize with `true`:
+//!         `tq  = [0]`
+//!
+//!         Constructors are `0`, `1`, .., up to `u32::MAX`. Only `0` can specialize `tq`.
+//!         Specialize with `0`:
+//!           `tq  = []`
+//!
+//!           m == 0 and n == 0, so `tq` is useful with witness `[]`.
+//!             `witness  = []`
+//!
+//!         Unspecialize with `0`:
+//!           `witness  = [0]`
+//!       Unspecialize with `true`:
+//!         `witness  = [true, 0]`
+//!     Unspecialize with `Some`:
+//!       `witness  = [Some(true), 0]`
+//!   Unspecialize with `Variant2`:
+//!     `witness  = [Variant2(Some(true), 0)]`
+//! ```
+//!
+//! Therefore `usefulness(tp_1, tp_2, tq)` returns the single witness-tuple `[Variant2(Some(true), 0)]`.
+//!
+//!
+//! Computing the set of constructors for a type is done in [`MatchCheckCtxt::ctors_for_ty`]. See
+//! the following sections for more accurate versions of the algorithm and corresponding links.
+//!
+//!
+//!
+//! # Computing usefulness and exhaustiveness in one go
+//!
+//! The algorithm we have described so far computes usefulness of each pattern in turn, and ends by
+//! checking if `_` is useful to determine exhaustiveness of the whole match. In practice, instead
+//! of doing "for each pattern { for each constructor { ... } }", we do "for each constructor { for
+//! each pattern { ... } }". This allows us to compute everything in one go.
+//!
+//! [`Matrix`] stores the set of pattern-tuples under consideration. We track usefulness of each
+//! row mutably in the matrix as we go along. We ignore witnesses of usefulness of the match rows.
+//! We gather witnesses of the usefulness of `_` in [`WitnessMatrix`]. The algorithm that computes
+//! all this is in [`compute_exhaustiveness_and_usefulness`].
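+//!
+//! As a rough sketch (glossing over the handling of missing and empty constructors), one call of
+//! [`compute_exhaustiveness_and_usefulness`] does the following:
+//! ```text
+//! if the matrix has no columns left:
+//!     mark each row useful from the top down, stopping after the first unguarded row;
+//!     if there was an unguarded row the match is exhaustive here: return no witnesses;
+//!     otherwise return the single empty witness `()`
+//! otherwise, for each constructor `c` obtained by splitting the first column:
+//!     specialize the matrix with `c` and recurse on the specialized matrix;
+//!     unspecialize the witnesses found with `c` and accumulate them;
+//!     mark a row useful if any of the rows it produced by specialization is useful
+//! ```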
+//!
+//! See the full example at the bottom of this documentation.
+//!
+//!
+//!
+//! # Making usefulness tractable: constructor splitting
+//!
+//! We're missing one last detail: which constructors do we list? Naively listing all value
+//! constructors cannot work for types like `u64` or `&str`, so we need to be more clever. The final
+//! clever idea for this algorithm is that we can group together constructors that behave the same.
+//!
+//! Examples:
+//! ```compile_fail,E0004
+//! match (0, false) {
+//!     (0 ..=100, true) => {}
+//!     (50..=150, false) => {}
+//!     (0 ..=200, _) => {}
+//! }
+//! ```
+//!
+//! In this example, trying any of `0`, `1`, .., `49` will give the same specialized matrix, and
+//! thus the same usefulness/exhaustiveness results. We can thus accelerate the algorithm by
+//! trying them all at once. In fact, the only cases we need to consider here are: `0..50`,
+//! `50..=100`, `101..=150`, `151..=200` and `201..`.
+//!
+//! ```
+//! enum Direction { North, South, East, West }
+//! # let wind = (Direction::North, 0u8);
+//! match wind {
+//!     (Direction::North, 50..) => {}
+//!     (_, _) => {}
+//! }
+//! ```
+//!
+//! In this example, trying any of `South`, `East`, `West` will give the same specialized matrix. By
+//! the same reasoning, we only need to try two cases: `North`, and "everything else".
+//!
+//! We call _constructor splitting_ the operation that computes such a minimal set of cases to try.
+//! This is done in [`ConstructorSet::split`] and explained in [`crate::constructor`].
+//!
+//!
+//!
+//! # Or-patterns
+//!
+//! What we have described so far works well if there are no or-patterns. To handle them, if the
+//! first pattern of a row in the matrix is an or-pattern, we expand it by duplicating the rest of
+//! the row as necessary. This is handled automatically in [`Matrix`].
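+//!
+//! For example, the row `[Some(0) | None, true]` is expanded into two rows before anything else
+//! looks at it:
+//! ```text
+//! [Some(0), true]
+//! [None, true]
+//! ```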
+//!
+//! This makes usefulness tracking subtle, because we also want to compute whether an alternative
+//! of an or-pattern is redundant, e.g. in `Some(_) | Some(0)`. We track usefulness of each
+//! subpattern by interior mutability in [`DeconstructedPat`] with `set_useful`/`is_useful`.
+//!
+//! It's unfortunate that we have to use interior mutability, but believe me (Nadrieril), I have
+//! tried [other](https://github.com/rust-lang/rust/pull/80104)
+//! [solutions](https://github.com/rust-lang/rust/pull/80632) and nothing is remotely as simple.
+//!
+//!
+//!
+//! # Constants and opaques
+//!
+//! There are two kinds of constants in patterns:
+//!
+//! * literals (`1`, `true`, `"foo"`)
+//! * named or inline consts (`FOO`, `const { 5 + 6 }`)
+//!
+//! The latter are converted into the corresponding patterns by a previous phase. For example
+//! `const_to_pat(const { [1, 2, 3] })` becomes an `Array(vec![Const(1), Const(2), Const(3)])`
+//! pattern. This gets problematic when comparing the constant via `==` would behave differently
+//! from matching on the constant converted to a pattern. The situation around this is currently
+//! unclear and the lang team is working on clarifying what we want to do there. In any case, there
+//! are constants we will not turn into patterns. We capture these with `Constructor::Opaque`. These
+//! `Opaque` patterns do not participate in exhaustiveness, specialization or overlap checking.
+//!
+//!
+//!
+//! # Usefulness vs reachability, validity, and empty patterns
+//!
+//! This is likely the subtlest aspect of the algorithm. To be fully precise, a match doesn't
+//! operate on a value, it operates on a place. In certain unsafe circumstances, it is possible for
+//! a place to not contain valid data for its type. This has subtle consequences for empty types.
+//! Take the following:
+//!
+//! ```rust
+//! enum Void {}
+//! let x: u8 = 0;
+//! let ptr: *const Void = &x as *const u8 as *const Void;
+//! unsafe {
+//!     match *ptr {
+//!         _ => println!("Reachable!"),
+//!     }
+//! }
+//! ```
+//!
+//! In this example, `ptr` is a valid pointer pointing to a place with invalid data. The `_` pattern
+//! does not look at the contents of `*ptr`, so this is ok and the arm is taken. In other words,
+//! despite the place we are inspecting being of type `Void`, there is a reachable arm. If the
+//! arm had a binding however:
+//!
+//! ```rust
+//! # #[derive(Copy, Clone)]
+//! # enum Void {}
+//! # let x: u8 = 0;
+//! # let ptr: *const Void = &x as *const u8 as *const Void;
+//! # unsafe {
+//! match *ptr {
+//!     _a => println!("Unreachable!"),
+//! }
+//! # }
+//! ```
+//!
+//! Here the binding loads the value of type `Void` from the `*ptr` place. In this example, this
+//! causes UB since the data is not valid. In the general case, this asserts validity of the data at
+//! `*ptr`. Either way, this arm will never be taken.
+//!
+//! Finally, let's consider the empty match `match *ptr {}`. If we consider this exhaustive, then
+//! having invalid data at `*ptr` is undefined behavior. In other words, the empty match is
+//! semantically equivalent to the `_a => ...` match. In the interest of explicitness, we prefer the
+//! case with an arm, hence we won't tell the user to remove the `_a` arm. The `_a` arm is thus
+//! unreachable yet not redundant. This is why we lint on redundant arms rather than unreachable
+//! arms, despite the fact that the lint says "unreachable".
+//!
+//! These considerations only affect certain places, namely those that can contain non-valid data
+//! without UB. These are: pointer dereferences, reference dereferences, and union field accesses.
+//! We track in the algorithm whether a given place is known to contain valid data. This is done
+//! first by inspecting the scrutinee syntactically (which gives us `cx.known_valid_scrutinee`), and
+//! then by tracking validity of each column of the matrix (each of which corresponds to a place) as
+//! we recurse into subpatterns. That second part is done through [`ValidityConstraint`], most
+//! notably [`ValidityConstraint::specialize`].
+//!
+//! Having said all that, in practice we don't fully follow what's been presented in this section.
+//! Under `exhaustive_patterns`, we allow omitting empty arms even in `!known_valid` places, for
+//! backwards-compatibility until we have a better alternative. Without `exhaustive_patterns`, we
+//! mostly treat empty types as inhabited, except specifically for a non-nested `!` or empty enum. In
+//! this specific case we also allow the empty match regardless of place validity, for
+//! backwards-compatibility. Hopefully we can eventually deprecate this.
+//!
+//!
+//!
+//! # Full example
+//!
+//! We illustrate a full run of the algorithm on the following match.
+//!
+//! ```compile_fail,E0004
+//! # struct Pair(Option<u32>, bool);
+//! # fn foo(x: Pair) -> u32 {
+//! match x {
+//!     Pair(Some(0), _) => 1,
+//!     Pair(_, false) => 2,
+//!     Pair(Some(0), false) => 3,
+//! }
+//! # }
+//! ```
+//!
+//! We keep track of the original row for illustration purposes; this is not what the algorithm
+//! actually does (it tracks usefulness as a boolean on each row).
+//!
+//! ```text
+//!  ┐ Patterns:
+//!  │   1. `[Pair(Some(0), _)]`
+//!  │   2. `[Pair(_, false)]`
+//!  │   3. `[Pair(Some(0), false)]`
+//!  │
+//!  │ Specialize with `Pair`:
+//!  ├─┐ Patterns:
+//!  │ │   1. `[Some(0), _]`
+//!  │ │   2. `[_, false]`
+//!  │ │   3. `[Some(0), false]`
+//!  │ │
+//!  │ │ Specialize with `Some`:
+//!  │ ├─┐ Patterns:
+//!  │ │ │   1. `[0, _]`
+//!  │ │ │   2. `[_, false]`
+//!  │ │ │   3. `[0, false]`
+//!  │ │ │
+//!  │ │ │ Specialize with `0`:
+//!  │ │ ├─┐ Patterns:
+//!  │ │ │ │   1. `[_]`
+//!  │ │ │ │   2. `[false]`
+//!  │ │ │ │   3. `[false]`
+//!  │ │ │ │
+//!  │ │ │ │ Specialize with `true`:
+//!  │ │ │ ├─┐ Patterns:
+//!  │ │ │ │ │   1. `[]`
+//!  │ │ │ │ │
+//!  │ │ │ │ │ We note arm 1 is useful (by `Pair(Some(0), true)`).
+//!  │ │ │ ├─┘
+//!  │ │ │ │
+//!  │ │ │ │ Specialize with `false`:
+//!  │ │ │ ├─┐ Patterns:
+//!  │ │ │ │ │   1. `[]`
+//!  │ │ │ │ │   2. `[]`
+//!  │ │ │ │ │   3. `[]`
+//!  │ │ │ │ │
+//!  │ │ │ │ │ We note arm 1 is useful (by `Pair(Some(0), false)`).
+//!  │ │ │ ├─┘
+//!  │ │ ├─┘
+//!  │ │ │
+//!  │ │ │ Specialize with `1..`:
+//!  │ │ ├─┐ Patterns:
+//!  │ │ │ │   2. `[false]`
+//!  │ │ │ │
+//!  │ │ │ │ Specialize with `true`:
+//!  │ │ │ ├─┐ Patterns:
+//!  │ │ │ │ │   // no rows left
+//!  │ │ │ │ │
+//!  │ │ │ │ │ We have found an unmatched value (`Pair(Some(1..), true)`)! This gives us a witness.
+//!  │ │ │ │ │ New witnesses:
+//!  │ │ │ │ │   `[]`
+//!  │ │ │ ├─┘
+//!  │ │ │ │ Unspecialize new witnesses with `true`:
+//!  │ │ │ │   `[true]`
+//!  │ │ │ │
+//!  │ │ │ │ Specialize with `false`:
+//!  │ │ │ ├─┐ Patterns:
+//!  │ │ │ │ │   2. `[]`
+//!  │ │ │ │ │
+//!  │ │ │ │ │ We note arm 2 is useful (by `Pair(Some(1..), false)`).
+//!  │ │ │ ├─┘
+//!  │ │ │ │
+//!  │ │ │ │ Total witnesses for `1..`:
+//!  │ │ │ │   `[true]`
+//!  │ │ ├─┘
+//!  │ │ │ Unspecialize new witnesses with `1..`:
+//!  │ │ │   `[1.., true]`
+//!  │ │ │
+//!  │ │ │ Total witnesses for `Some`:
+//!  │ │ │   `[1.., true]`
+//!  │ ├─┘
+//!  │ │ Unspecialize new witnesses with `Some`:
+//!  │ │   `[Some(1..), true]`
+//!  │ │
+//!  │ │ Specialize with `None`:
+//!  │ ├─┐ Patterns:
+//!  │ │ │   2. `[false]`
+//!  │ │ │
+//!  │ │ │ Specialize with `true`:
+//!  │ │ ├─┐ Patterns:
+//!  │ │ │ │   // no rows left
+//!  │ │ │ │
+//!  │ │ │ │ We have found an unmatched value (`Pair(None, true)`)! This gives us a witness.
+//!  │ │ │ │ New witnesses:
+//!  │ │ │ │   `[]`
+//!  │ │ ├─┘
+//!  │ │ │ Unspecialize new witnesses with `true`:
+//!  │ │ │   `[true]`
+//!  │ │ │
+//!  │ │ │ Specialize with `false`:
+//!  │ │ ├─┐ Patterns:
+//!  │ │ │ │   2. `[]`
+//!  │ │ │ │
+//!  │ │ │ │ We note arm 2 is useful (by `Pair(None, false)`).
+//!  │ │ ├─┘
+//!  │ │ │
+//!  │ │ │ Total witnesses for `None`:
+//!  │ │ │   `[true]`
+//!  │ ├─┘
+//!  │ │ Unspecialize new witnesses with `None`:
+//!  │ │   `[None, true]`
+//!  │ │
+//!  │ │ Total witnesses for `Pair`:
+//!  │ │   `[Some(1..), true]`
+//!  │ │   `[None, true]`
+//!  ├─┘
+//!  │ Unspecialize new witnesses with `Pair`:
+//!  │   `[Pair(Some(1..), true)]`
+//!  │   `[Pair(None, true)]`
+//!  │
+//!  │ Final witnesses:
+//!  │   `[Pair(Some(1..), true)]`
+//!  │   `[Pair(None, true)]`
+//!  ┘
+//! ```
+//!
+//! We conclude:
+//! - Arm 3 is redundant (it was never marked as useful);
+//! - The match is not exhaustive;
+//! - Adding arms with `Pair(Some(1..), true)` and `Pair(None, true)` would make the match exhaustive.
+//!
+//! Note that when we're deep in the algorithm, we don't know what specialization steps got us here.
+//! We can only figure out what our witnesses correspond to by unspecializing back up the stack.
+//!
+//!
+//! # Tests
+//!
+//! Note: tests specific to this file can be found in:
+//!
+//!   - `ui/pattern/usefulness`
+//!   - `ui/or-patterns`
+//!   - `ui/consts/const_in_pattern`
+//!   - `ui/rfc-2008-non-exhaustive`
+//!   - `ui/half-open-range-patterns`
+//!   - probably many others
+//!
+//! I (Nadrieril) prefer to put new tests in `ui/pattern/usefulness` unless there's a specific
+//! reason not to, for example if they crucially depend on a particular feature like `or_patterns`.
+
+use smallvec::{smallvec, SmallVec};
+use std::fmt;
+
+use rustc_data_structures::{captures::Captures, stack::ensure_sufficient_stack};
+use rustc_middle::ty::{self, Ty};
+use rustc_span::{Span, DUMMY_SP};
+
+use crate::constructor::{Constructor, ConstructorSet};
+use crate::cx::MatchCheckCtxt;
+use crate::pat::{DeconstructedPat, WitnessPat};
+use crate::MatchArm;
+
+use self::ValidityConstraint::*;
+
+#[derive(Copy, Clone)]
+pub(crate) struct PatCtxt<'a, 'p, 'tcx> {
+    pub(crate) cx: &'a MatchCheckCtxt<'p, 'tcx>,
+    /// Type of the current column under investigation.
+    pub(crate) ty: Ty<'tcx>,
+    /// Whether the current pattern is the whole pattern as found in a match arm, or if it's a
+    /// subpattern.
+    pub(crate) is_top_level: bool,
+}
+
+impl<'a, 'p, 'tcx> PatCtxt<'a, 'p, 'tcx> {
+    /// A `PatCtxt` when code other than `is_useful` needs one.
+    pub(crate) fn new_dummy(cx: &'a MatchCheckCtxt<'p, 'tcx>, ty: Ty<'tcx>) -> Self {
+        PatCtxt { cx, ty, is_top_level: false }
+    }
+}
+
+impl<'a, 'p, 'tcx> fmt::Debug for PatCtxt<'a, 'p, 'tcx> {
+    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
+        f.debug_struct("PatCtxt").field("ty", &self.ty).finish()
+    }
+}
+
+/// Serves two purposes:
+/// - in a wildcard, tracks whether the wildcard matches only valid values (i.e. is a binding `_a`)
+///     or also invalid values (i.e. is a true `_` pattern).
+/// - in the matrix, track whether a given place (aka column) is known to contain a valid value or
+///     not.
+#[derive(Debug, Copy, Clone, PartialEq, Eq)]
+enum ValidityConstraint {
+    ValidOnly,
+    MaybeInvalid,
+    /// Option for backwards compatibility: the place is not known to be valid but we allow omitting
+    /// `useful && !reachable` arms anyway.
+    MaybeInvalidButAllowOmittingArms,
+}
+
+impl ValidityConstraint {
+    fn from_bool(is_valid_only: bool) -> Self {
+        if is_valid_only { ValidOnly } else { MaybeInvalid }
+    }
+
+    fn allow_omitting_side_effecting_arms(self) -> Self {
+        match self {
+            MaybeInvalid | MaybeInvalidButAllowOmittingArms => MaybeInvalidButAllowOmittingArms,
+            // There are no side-effecting empty arms here, nothing to do.
+            ValidOnly => ValidOnly,
+        }
+    }
+
+    fn is_known_valid(self) -> bool {
+        matches!(self, ValidOnly)
+    }
+    fn allows_omitting_empty_arms(self) -> bool {
+        matches!(self, ValidOnly | MaybeInvalidButAllowOmittingArms)
+    }
+
+    /// If the place has validity given by `self` and we read that the value at the place has
+    /// constructor `ctor`, this computes what we can assume about the validity of the constructor
+    /// fields.
+    ///
+    /// Pending further opsem decisions, the current behavior is: validity is preserved, except
+    /// inside `&` and union fields where validity is reset to `MaybeInvalid`.
+    fn specialize<'tcx>(self, pcx: &PatCtxt<'_, '_, 'tcx>, ctor: &Constructor<'tcx>) -> Self {
+        // We preserve validity except when we go inside a reference or a union field.
+        if matches!(ctor, Constructor::Single)
+            && (matches!(pcx.ty.kind(), ty::Ref(..))
+                || matches!(pcx.ty.kind(), ty::Adt(def, ..) if def.is_union()))
+        {
+            // Validity of `x: &T` does not imply validity of `*x: T`.
+            MaybeInvalid
+        } else {
+            self
+        }
+    }
+}
+
+impl fmt::Display for ValidityConstraint {
+    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
+        let s = match self {
+            ValidOnly => "✓",
+            MaybeInvalid | MaybeInvalidButAllowOmittingArms => "?",
+        };
+        write!(f, "{s}")
+    }
+}
+
+/// Represents a pattern-tuple under investigation.
+#[derive(Clone)]
+struct PatStack<'p, 'tcx> {
+    // Rows of len 1 are very common, which is why `SmallVec<[_; 2]>` works well.
+    pats: SmallVec<[&'p DeconstructedPat<'p, 'tcx>; 2]>,
+}
+
+impl<'p, 'tcx> PatStack<'p, 'tcx> {
+    fn from_pattern(pat: &'p DeconstructedPat<'p, 'tcx>) -> Self {
+        PatStack { pats: smallvec![pat] }
+    }
+
+    fn is_empty(&self) -> bool {
+        self.pats.is_empty()
+    }
+
+    fn len(&self) -> usize {
+        self.pats.len()
+    }
+
+    fn head(&self) -> &'p DeconstructedPat<'p, 'tcx> {
+        self.pats[0]
+    }
+
+    fn iter(&self) -> impl Iterator<Item = &DeconstructedPat<'p, 'tcx>> {
+        self.pats.iter().copied()
+    }
+
+    // Recursively expand the first or-pattern into its subpatterns. Only useful if the pattern is
+    // an or-pattern. Panics if `self` is empty.
+    fn expand_or_pat<'a>(&'a self) -> impl Iterator<Item = PatStack<'p, 'tcx>> + Captures<'a> {
+        self.head().flatten_or_pat().into_iter().map(move |pat| {
+            let mut new = self.clone();
+            new.pats[0] = pat;
+            new
+        })
+    }
+
+    /// This computes `specialize(ctor, self)`. See top of the file for explanations.
+    /// Only call if `ctor.is_covered_by(self.head().ctor())` is true.
+    fn pop_head_constructor(
+        &self,
+        pcx: &PatCtxt<'_, 'p, 'tcx>,
+        ctor: &Constructor<'tcx>,
+    ) -> PatStack<'p, 'tcx> {
+        // We pop the head pattern and push the new fields extracted from the arguments of
+        // `self.head()`.
+        let mut new_pats = self.head().specialize(pcx, ctor);
+        new_pats.extend_from_slice(&self.pats[1..]);
+        PatStack { pats: new_pats }
+    }
+}
+
+impl<'p, 'tcx> fmt::Debug for PatStack<'p, 'tcx> {
+    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
+        // We pretty-print similarly to the `Debug` impl of `Matrix`.
+        write!(f, "+")?;
+        for pat in self.iter() {
+            write!(f, " {pat:?} +")?;
+        }
+        Ok(())
+    }
+}
+
+/// A row of the matrix.
+#[derive(Clone)]
+struct MatrixRow<'p, 'tcx> {
+    // The patterns in the row.
+    pats: PatStack<'p, 'tcx>,
+    /// Whether the original arm had a guard. This is inherited when specializing.
+    is_under_guard: bool,
+    /// When we specialize, we remember which row of the original matrix produced a given row of the
+    /// specialized matrix. When we unspecialize, we use this to propagate usefulness back up the
+    /// callstack.
+    parent_row: usize,
+    /// False when the matrix is just built. This is set to `true` by
+    /// [`compute_exhaustiveness_and_usefulness`] if the arm is found to be useful.
+    /// This is reset to `false` when specializing.
+    useful: bool,
+}
+
+impl<'p, 'tcx> MatrixRow<'p, 'tcx> {
+    fn is_empty(&self) -> bool {
+        self.pats.is_empty()
+    }
+
+    fn len(&self) -> usize {
+        self.pats.len()
+    }
+
+    fn head(&self) -> &'p DeconstructedPat<'p, 'tcx> {
+        self.pats.head()
+    }
+
+    fn iter(&self) -> impl Iterator<Item = &DeconstructedPat<'p, 'tcx>> {
+        self.pats.iter()
+    }
+
+    // Recursively expand the first or-pattern into its subpatterns. Only useful if the pattern is
+    // an or-pattern. Panics if `self` is empty.
+    fn expand_or_pat<'a>(&'a self) -> impl Iterator<Item = MatrixRow<'p, 'tcx>> + Captures<'a> {
+        self.pats.expand_or_pat().map(|patstack| MatrixRow {
+            pats: patstack,
+            parent_row: self.parent_row,
+            is_under_guard: self.is_under_guard,
+            useful: false,
+        })
+    }
+
+    /// This computes `specialize(ctor, self)`. See top of the file for explanations.
+    /// Only call if `ctor.is_covered_by(self.head().ctor())` is true.
+    fn pop_head_constructor(
+        &self,
+        pcx: &PatCtxt<'_, 'p, 'tcx>,
+        ctor: &Constructor<'tcx>,
+        parent_row: usize,
+    ) -> MatrixRow<'p, 'tcx> {
+        MatrixRow {
+            pats: self.pats.pop_head_constructor(pcx, ctor),
+            parent_row,
+            is_under_guard: self.is_under_guard,
+            useful: false,
+        }
+    }
+}
+
+impl<'p, 'tcx> fmt::Debug for MatrixRow<'p, 'tcx> {
+    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
+        self.pats.fmt(f)
+    }
+}
+
+/// A 2D matrix. Represents a list of pattern-tuples under investigation.
+///
+/// Invariant: each row must have the same length, and each column must have the same type.
+///
+/// Invariant: the first column must not contain or-patterns. This is handled by
+/// [`Matrix::expand_and_push`].
+///
+/// In fact each column corresponds to a place inside the scrutinee of the match. E.g. after
+/// specializing `(,)` and `Some` on a pattern of type `(Option<u32>, bool)`, the first column of
+/// the matrix will correspond to `scrutinee.0.Some.0` and the second column to `scrutinee.1`.
+#[derive(Clone)]
+struct Matrix<'p, 'tcx> {
+    /// Vector of rows. The rows must form a rectangular 2D array. Moreover, all the patterns of
+    /// each column must have the same type. Each column corresponds to a place within the
+    /// scrutinee.
+    rows: Vec<MatrixRow<'p, 'tcx>>,
+    /// Stores an extra fictitious row full of wildcards. Mostly used to keep track of the type of
+    /// each column. This must obey the same invariants as the real rows.
+    wildcard_row: PatStack<'p, 'tcx>,
+    /// Track for each column/place whether it contains a known valid value.
+    place_validity: SmallVec<[ValidityConstraint; 2]>,
+}
+
+impl<'p, 'tcx> Matrix<'p, 'tcx> {
+    /// Pushes a new row to the matrix. If the row starts with an or-pattern, this recursively
+    /// expands it. Internal method, prefer [`Matrix::new`].
+    fn expand_and_push(&mut self, row: MatrixRow<'p, 'tcx>) {
+        if !row.is_empty() && row.head().is_or_pat() {
+            // Expand nested or-patterns.
+            for new_row in row.expand_or_pat() {
+                self.rows.push(new_row);
+            }
+        } else {
+            self.rows.push(row);
+        }
+    }
+
+    /// Build a new matrix from an iterator of `MatchArm`s.
+    fn new<'a>(
+        cx: &MatchCheckCtxt<'p, 'tcx>,
+        arms: &[MatchArm<'p, 'tcx>],
+        scrut_ty: Ty<'tcx>,
+        scrut_validity: ValidityConstraint,
+    ) -> Self
+    where
+        'p: 'a,
+    {
+        let wild_pattern = cx.pattern_arena.alloc(DeconstructedPat::wildcard(scrut_ty, DUMMY_SP));
+        let wildcard_row = PatStack::from_pattern(wild_pattern);
+        let mut matrix = Matrix {
+            rows: Vec::with_capacity(arms.len()),
+            wildcard_row,
+            place_validity: smallvec![scrut_validity],
+        };
+        for (row_id, arm) in arms.iter().enumerate() {
+            let v = MatrixRow {
+                pats: PatStack::from_pattern(arm.pat),
+                parent_row: row_id, // dummy, we won't read it
+                is_under_guard: arm.has_guard,
+                useful: false,
+            };
+            matrix.expand_and_push(v);
+        }
+        matrix
+    }
+
+    fn head_ty(&self) -> Option<Ty<'tcx>> {
+        if self.column_count() == 0 {
+            return None;
+        }
+
+        let mut ty = self.wildcard_row.head().ty();
+        // If the type is opaque and it is revealed anywhere in the column, we take the revealed
+        // version. Otherwise we could encounter constructors for the revealed type and crash.
+        let is_opaque = |ty: Ty<'tcx>| matches!(ty.kind(), ty::Alias(ty::Opaque, ..));
+        if is_opaque(ty) {
+            for pat in self.heads() {
+                let pat_ty = pat.ty();
+                if !is_opaque(pat_ty) {
+                    ty = pat_ty;
+                    break;
+                }
+            }
+        }
+        Some(ty)
+    }
+    fn column_count(&self) -> usize {
+        self.wildcard_row.len()
+    }
+
+    fn rows<'a>(
+        &'a self,
+    ) -> impl Iterator<Item = &'a MatrixRow<'p, 'tcx>> + Clone + DoubleEndedIterator + ExactSizeIterator
+    {
+        self.rows.iter()
+    }
+    fn rows_mut<'a>(
+        &'a mut self,
+    ) -> impl Iterator<Item = &'a mut MatrixRow<'p, 'tcx>> + DoubleEndedIterator + ExactSizeIterator
+    {
+        self.rows.iter_mut()
+    }
+
+    /// Iterate over the first pattern of each row.
+    fn heads<'a>(
+        &'a self,
+    ) -> impl Iterator<Item = &'p DeconstructedPat<'p, 'tcx>> + Clone + Captures<'a> {
+        self.rows().map(|r| r.head())
+    }
+
+    /// This computes `specialize(ctor, self)`. See top of the file for explanations.
+    fn specialize_constructor(
+        &self,
+        pcx: &PatCtxt<'_, 'p, 'tcx>,
+        ctor: &Constructor<'tcx>,
+    ) -> Matrix<'p, 'tcx> {
+        let wildcard_row = self.wildcard_row.pop_head_constructor(pcx, ctor);
+        let new_validity = self.place_validity[0].specialize(pcx, ctor);
+        let new_place_validity = std::iter::repeat(new_validity)
+            .take(ctor.arity(pcx))
+            .chain(self.place_validity[1..].iter().copied())
+            .collect();
+        let mut matrix =
+            Matrix { rows: Vec::new(), wildcard_row, place_validity: new_place_validity };
+        for (i, row) in self.rows().enumerate() {
+            if ctor.is_covered_by(pcx, row.head().ctor()) {
+                let new_row = row.pop_head_constructor(pcx, ctor, i);
+                matrix.expand_and_push(new_row);
+            }
+        }
+        matrix
+    }
+}
+
+/// Pretty-printer for matrices of patterns, example:
+///
+/// ```text
+/// + _     + []                +
+/// + true  + [First]           +
+/// + true  + [Second(true)]    +
+/// + false + [_]               +
+/// + _     + [_, _, tail @ ..] +
+/// | ✓     | ?                 | // column validity
+/// ```
+impl<'p, 'tcx> fmt::Debug for Matrix<'p, 'tcx> {
+    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
+        write!(f, "\n")?;
+
+        let mut pretty_printed_matrix: Vec<Vec<String>> = self
+            .rows
+            .iter()
+            .map(|row| row.iter().map(|pat| format!("{pat:?}")).collect())
+            .collect();
+        pretty_printed_matrix
+            .push(self.place_validity.iter().map(|validity| format!("{validity}")).collect());
+
+        let column_count = self.column_count();
+        assert!(self.rows.iter().all(|row| row.len() == column_count));
+        assert!(self.place_validity.len() == column_count);
+        let column_widths: Vec<usize> = (0..column_count)
+            .map(|col| pretty_printed_matrix.iter().map(|row| row[col].len()).max().unwrap_or(0))
+            .collect();
+
+        for (row_i, row) in pretty_printed_matrix.into_iter().enumerate() {
+            let is_validity_row = row_i == self.rows.len();
+            let sep = if is_validity_row { "|" } else { "+" };
+            write!(f, "{sep}")?;
+            for (column, pat_str) in row.into_iter().enumerate() {
+                write!(f, " ")?;
+                write!(f, "{:1$}", pat_str, column_widths[column])?;
+                write!(f, " {sep}")?;
+            }
+            if is_validity_row {
+                write!(f, " // column validity")?;
+            }
+            write!(f, "\n")?;
+        }
+        Ok(())
+    }
+}
+
+/// A witness-tuple of non-exhaustiveness for error reporting, represented as a list of patterns (in
+/// reverse order of construction).
+///
+/// This mirrors `PatStack`: they function similarly, except `PatStack` contains user patterns we
+/// are inspecting, and `WitnessStack` contains witnesses we are constructing.
+/// FIXME(Nadrieril): use the same order of patterns for both.
+///
+/// A `WitnessStack` should have the same types and length as the `PatStack`s we are inspecting
+/// (except we store the patterns in reverse order). Just as a `PatStack` starts with length 1,
+/// this will have length 1 at the end of the algorithm. In the middle of the algorithm, it can
+/// contain multiple patterns.
+///
+/// For example, if we are constructing a witness for the match against
+///
+/// ```compile_fail,E0004
+/// struct Pair(Option<(u32, u32)>, bool);
+/// # fn foo(p: Pair) {
+/// match p {
+///    Pair(None, _) => {}
+///    Pair(_, false) => {}
+/// }
+/// # }
+/// ```
+///
+/// We'll perform the following steps (among others):
+/// ```text
+/// - Start with a matrix representing the match
+///     `PatStack(vec![Pair(None, _)])`
+///     `PatStack(vec![Pair(_, false)])`
+/// - Specialize with `Pair`
+///     `PatStack(vec![None, _])`
+///     `PatStack(vec![_, false])`
+/// - Specialize with `Some`
+///     `PatStack(vec![_, false])`
+/// - Specialize with `_`
+///     `PatStack(vec![false])`
+/// - Specialize with `true`
+///     // no patstacks left
+/// - This is a non-exhaustive match: we have the empty witness stack as a witness.
+///     `WitnessStack(vec![])`
+/// - Apply `true`
+///     `WitnessStack(vec![true])`
+/// - Apply `_`
+///     `WitnessStack(vec![true, _])`
+/// - Apply `Some`
+///     `WitnessStack(vec![true, Some(_)])`
+/// - Apply `Pair`
+///     `WitnessStack(vec![Pair(Some(_), true)])`
+/// ```
+///
+/// The final `Pair(Some(_), true)` is then the resulting witness.
+///
+/// See the top of the file for more detailed explanations and examples.
+#[derive(Debug, Clone)]
+struct WitnessStack<'tcx>(Vec<WitnessPat<'tcx>>);
+
+impl<'tcx> WitnessStack<'tcx> {
+    /// Asserts that the witness contains a single pattern, and returns it.
+    fn single_pattern(self) -> WitnessPat<'tcx> {
+        assert_eq!(self.0.len(), 1);
+        self.0.into_iter().next().unwrap()
+    }
+
+    /// Reverses specialization by the `Missing` constructor by pushing a whole new pattern.
+    fn push_pattern(&mut self, pat: WitnessPat<'tcx>) {
+        self.0.push(pat);
+    }
+
+    /// Reverses specialization. Given a witness obtained after specialization, this constructs a
+    /// new witness valid for before specialization. See the section on `unspecialize` at the top of
+    /// the file.
+    ///
+    /// Examples:
+    /// ```text
+    /// ctor: tuple of 2 elements
+    /// pats: [false, "foo", _, true]
+    /// result: [(false, "foo"), _, true]
+    ///
+    /// ctor: Enum::Variant { a: (bool, &'static str), b: usize}
+    /// pats: [(false, "foo"), _, true]
+    /// result: [Enum::Variant { a: (false, "foo"), b: _ }, true]
+    /// ```
+    fn apply_constructor(&mut self, pcx: &PatCtxt<'_, '_, 'tcx>, ctor: &Constructor<'tcx>) {
+        let len = self.0.len();
+        let arity = ctor.arity(pcx);
+        let fields = self.0.drain((len - arity)..).rev().collect();
+        let pat = WitnessPat::new(ctor.clone(), fields, pcx.ty);
+        self.0.push(pat);
+    }
+}
+
+/// Represents a set of pattern-tuples that are witnesses of non-exhaustiveness for error
+/// reporting. This has similar invariants as `Matrix` does.
+///
+/// The `WitnessMatrix` returned by [`compute_exhaustiveness_and_usefulness`] obeys the invariant
+/// that the union of the input `Matrix` and the output `WitnessMatrix` together matches the type
+/// exhaustively.
+///
+/// Just as the `Matrix` starts with a single column, by the end of the algorithm, this has a single
+/// column, which contains the patterns that are missing for the match to be exhaustive.
+#[derive(Debug, Clone)]
+struct WitnessMatrix<'tcx>(Vec<WitnessStack<'tcx>>);
+
+impl<'tcx> WitnessMatrix<'tcx> {
+    /// New matrix with no witnesses.
+    fn empty() -> Self {
+        WitnessMatrix(vec![])
+    }
+    /// New matrix with one `()` witness, i.e. with no columns.
+    fn unit_witness() -> Self {
+        WitnessMatrix(vec![WitnessStack(vec![])])
+    }
+
+    /// Whether this has any witnesses.
+    fn is_empty(&self) -> bool {
+        self.0.is_empty()
+    }
+    /// Asserts that there is a single column and returns the patterns in it.
+    fn single_column(self) -> Vec<WitnessPat<'tcx>> {
+        self.0.into_iter().map(|w| w.single_pattern()).collect()
+    }
+
+    /// Reverses specialization by the `Missing` constructor by pushing a whole new pattern.
+    fn push_pattern(&mut self, pat: WitnessPat<'tcx>) {
+        for witness in self.0.iter_mut() {
+            witness.push_pattern(pat.clone())
+        }
+    }
+
+    /// Reverses specialization by `ctor`. See the section on `unspecialize` at the top of the file.
+    fn apply_constructor(
+        &mut self,
+        pcx: &PatCtxt<'_, '_, 'tcx>,
+        missing_ctors: &[Constructor<'tcx>],
+        ctor: &Constructor<'tcx>,
+        report_individual_missing_ctors: bool,
+    ) {
+        if self.is_empty() {
+            return;
+        }
+        if matches!(ctor, Constructor::Missing) {
+            // We got the special `Missing` constructor that stands for the constructors not present
+            // in the match.
+            if !report_individual_missing_ctors {
+                // Report `_` as missing.
+                let pat = WitnessPat::wild_from_ctor(pcx, Constructor::Wildcard);
+                self.push_pattern(pat);
+            } else if missing_ctors.iter().any(|c| c.is_non_exhaustive()) {
+                // We need to report a `_` anyway, so listing other constructors would be redundant.
+                // `NonExhaustive` is displayed as `_` just like `Wildcard`, but it will be picked
+                // up by diagnostics to add a note about why `_` is required here.
+                let pat = WitnessPat::wild_from_ctor(pcx, Constructor::NonExhaustive);
+                self.push_pattern(pat);
+            } else {
+                // For each missing constructor `c`, we add a `c(_, _, _)` witness appropriately
+                // filled with wildcards.
+                let mut ret = Self::empty();
+                for ctor in missing_ctors {
+                    let pat = WitnessPat::wild_from_ctor(pcx, ctor.clone());
+                    // Clone `self` and add `c(_, _, _)` to each of its witnesses.
+                    let mut wit_matrix = self.clone();
+                    wit_matrix.push_pattern(pat);
+                    ret.extend(wit_matrix);
+                }
+                *self = ret;
+            }
+        } else {
+            // Any other constructor we unspecialize as expected.
+            for witness in self.0.iter_mut() {
+                witness.apply_constructor(pcx, ctor)
+            }
+        }
+    }
+
+    /// Merges the witnesses of two matrices. Their column types must match.
+    fn extend(&mut self, other: Self) {
+        self.0.extend(other.0)
+    }
+}
+
+/// The core of the algorithm.
+///
+/// This recursively computes witnesses of the non-exhaustiveness of `matrix` (if any). Also tracks
+/// usefulness of each row in the matrix (in `row.useful`). We track usefulness of each
+/// subpattern using interior mutability in `DeconstructedPat`.
+///
+/// The input `Matrix` and the output `WitnessMatrix` together match the type exhaustively.
+///
+/// The key steps are:
+/// - specialization, where we dig into the rows that have a specific constructor and call ourselves
+///     recursively;
+/// - unspecialization, where we lift the results from the previous step into results for this step
+///     (using `apply_constructor` and by updating `row.useful` for each parent row).
+/// This is all explained at the top of the file.
+#[instrument(level = "debug", skip(cx, is_top_level), ret)]
+fn compute_exhaustiveness_and_usefulness<'p, 'tcx>(
+    cx: &MatchCheckCtxt<'p, 'tcx>,
+    matrix: &mut Matrix<'p, 'tcx>,
+    is_top_level: bool,
+) -> WitnessMatrix<'tcx> {
+    debug_assert!(matrix.rows().all(|r| r.len() == matrix.column_count()));
+
+    let Some(ty) = matrix.head_ty() else {
+        // The base case: there are no columns in the matrix. We are morally pattern-matching on ().
+        // A row is useful iff it has no (unguarded) rows above it.
+        for row in matrix.rows_mut() {
+            // All rows are useful until they're not.
+            row.useful = true;
+            // When there's an unguarded row, the match is exhaustive and any subsequent row is not
+            // useful.
+            if !row.is_under_guard {
+                return WitnessMatrix::empty();
+            }
+        }
+        // No (unguarded) rows, so the match is not exhaustive. We return a new witness.
+        return WitnessMatrix::unit_witness();
+    };
+
+    debug!("ty: {ty:?}");
+    let pcx = &PatCtxt { cx, ty, is_top_level };
+
+    // Whether the place/column we are inspecting is known to contain valid data.
+    let place_validity = matrix.place_validity[0];
+    // For backwards compatibility we allow omitting some empty arms that we ideally shouldn't.
+    let place_validity = place_validity.allow_omitting_side_effecting_arms();
+
+    // Analyze the constructors present in this column.
+    let ctors = matrix.heads().map(|p| p.ctor());
+    let ctors_for_ty = &cx.ctors_for_ty(ty);
+    let is_integers = matches!(ctors_for_ty, ConstructorSet::Integers { .. }); // For diagnostics.
+    let split_set = ctors_for_ty.split(pcx, ctors);
+    let all_missing = split_set.present.is_empty();
+
+    // Build the set of constructors we will specialize with. It must cover the whole type.
+    let mut split_ctors = split_set.present;
+    if !split_set.missing.is_empty() {
+        // We need to iterate over a full set of constructors, so we add `Missing` to represent the
+        // missing ones. This is explained under "Constructor Splitting" at the top of this file.
+        split_ctors.push(Constructor::Missing);
+    } else if !split_set.missing_empty.is_empty() && !place_validity.is_known_valid() {
+        // The missing empty constructors are reachable if the place can contain invalid data.
+        split_ctors.push(Constructor::Missing);
+    }
+
+    // Decide what constructors to report.
+    let always_report_all = is_top_level && !is_integers;
+    // Whether we should report "Enum::A and Enum::C are missing" or "_ is missing".
+    let report_individual_missing_ctors = always_report_all || !all_missing;
+    // Which constructors are considered missing. We ensure that `!missing_ctors.is_empty() =>
+    // split_ctors.contains(Missing)`. The converse usually holds except in the
+    // `MaybeInvalidButAllowOmittingArms` backwards-compatibility case.
+    let mut missing_ctors = split_set.missing;
+    if !place_validity.allows_omitting_empty_arms() {
+        missing_ctors.extend(split_set.missing_empty);
+    }
+
+    let mut ret = WitnessMatrix::empty();
+    for ctor in split_ctors {
+        debug!("specialize({:?})", ctor);
+        // Dig into rows that match `ctor`.
+        let mut spec_matrix = matrix.specialize_constructor(pcx, &ctor);
+        let mut witnesses = ensure_sufficient_stack(|| {
+            compute_exhaustiveness_and_usefulness(cx, &mut spec_matrix, false)
+        });
+
+        let counts_for_exhaustiveness = match ctor {
+            Constructor::Missing => !missing_ctors.is_empty(),
+            // If there are missing constructors we'll report those instead. Since `Missing` matches
+            // only the wildcard rows, it matches fewer rows than this constructor, and is therefore
+            // guaranteed to result in the same or more witnesses. So skipping this does not
+            // jeopardize correctness.
+            _ => missing_ctors.is_empty(),
+        };
+        if counts_for_exhaustiveness {
+            // Transform witnesses for `spec_matrix` into witnesses for `matrix`.
+            witnesses.apply_constructor(
+                pcx,
+                &missing_ctors,
+                &ctor,
+                report_individual_missing_ctors,
+            );
+            // Accumulate the found witnesses.
+            ret.extend(witnesses);
+        }
+
+        // A parent row is useful if any of its children is.
+        for child_row in spec_matrix.rows() {
+            let parent_row = &mut matrix.rows[child_row.parent_row];
+            parent_row.useful = parent_row.useful || child_row.useful;
+        }
+    }
+
+    // Record usefulness in the patterns.
+    for row in matrix.rows() {
+        if row.useful {
+            row.head().set_useful();
+        }
+    }
+
+    ret
+}
+
+/// Indicates whether or not a given arm is useful.
+#[derive(Clone, Debug)]
+pub enum Usefulness {
+    /// The arm is useful. This additionally carries a set of or-pattern branches that have been
+    /// found to be redundant despite the overall arm being useful. Used only in the presence of
+    /// or-patterns, otherwise it stays empty.
+    Useful(Vec<Span>),
+    /// The arm is redundant and can be removed without changing the behavior of the match
+    /// expression.
+    Redundant,
+}
+
+/// The output of checking a match for exhaustiveness and arm usefulness.
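+///
+/// A rough sketch of how a consumer of this report might use it (illustrative only: `cx`, `arms`
+/// and `scrut_ty` are assumed to be in scope, and the actual consumers live elsewhere in the
+/// crate):
+/// ```ignore (illustrative)
+/// let report = compute_match_usefulness(cx, &arms, scrut_ty);
+/// for (arm, usefulness) in &report.arm_usefulness {
+///     match usefulness {
+///         // The arm catches no value that isn't already caught by the arms above it.
+///         Usefulness::Redundant => { /* lint the arm as unreachable */ }
+///         // The arm is useful, but some of its or-pattern branches may still be redundant.
+///         Usefulness::Useful(redundant_spans) => { /* lint each redundant branch */ }
+///     }
+/// }
+/// if !report.non_exhaustiveness_witnesses.is_empty() {
+///     /* error: non-exhaustive match; report the witnesses as examples of missing patterns */
+/// }
+/// ```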
+pub struct UsefulnessReport<'p, 'tcx> {
+    /// For each arm of the input, whether that arm is useful after the arms above it.
+    pub arm_usefulness: Vec<(MatchArm<'p, 'tcx>, Usefulness)>,
+    /// If the match is exhaustive, this is empty. If not, this contains witnesses for the lack of
+    /// exhaustiveness.
+    pub non_exhaustiveness_witnesses: Vec<WitnessPat<'tcx>>,
+}
+
+/// Computes whether a match is exhaustive and which of its arms are useful.
+#[instrument(skip(cx, arms), level = "debug")]
+pub(crate) fn compute_match_usefulness<'p, 'tcx>(
+    cx: &MatchCheckCtxt<'p, 'tcx>,
+    arms: &[MatchArm<'p, 'tcx>],
+    scrut_ty: Ty<'tcx>,
+) -> UsefulnessReport<'p, 'tcx> {
+    let scrut_validity = ValidityConstraint::from_bool(cx.known_valid_scrutinee);
+    let mut matrix = Matrix::new(cx, arms, scrut_ty, scrut_validity);
+    let non_exhaustiveness_witnesses = compute_exhaustiveness_and_usefulness(cx, &mut matrix, true);
+
+    let non_exhaustiveness_witnesses: Vec<_> = non_exhaustiveness_witnesses.single_column();
+    let arm_usefulness: Vec<_> = arms
+        .iter()
+        .copied()
+        .map(|arm| {
+            debug!(?arm);
+            // We warn when a pattern is not useful.
+            let usefulness = if arm.pat.is_useful() {
+                Usefulness::Useful(arm.pat.redundant_spans())
+            } else {
+                Usefulness::Redundant
+            };
+            (arm, usefulness)
+        })
+        .collect();
+    UsefulnessReport { arm_usefulness, non_exhaustiveness_witnesses }
+}