Auto merge of #54649 - nikomatsakis:universes-refactor-1, r=scalexm

adopt "placeholders" to represent universally quantified regions

This does a few preliminary refactorings that lay some groundwork for universe integration. Two things, primarily:

- Rename from "skolemized" to "placeholder"
- When instantiating `for<'a, 'b, 'c>`, just create one universe for all 3 regions, and distinguish them from one another using the `BoundRegion` (see the sketch below).
    - This is more accurate, and I think that in general we'll be moving towards a model of separating "binder" (universe, debruijn index) from "index within binder" in a number of places.
    - In principle, it feels like the current setup of making lots of universes could lead us to do the wrong thing, but I've not actually been able to come up with an example where this is so.
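
To illustrate the second bullet, here is a rough standalone sketch. The type names mirror the new `Placeholder { universe, name }` struct added in this PR, but `instantiate_binder` and the numeric indices are invented for illustration; this is not the compiler's code.

```rust
// Toy model: instantiating one `for<...>` binder creates a single fresh
// universe, and the placeholders within it are distinguished only by
// their bound-region name.

#[derive(Copy, Clone, Debug, PartialEq, Eq)]
struct UniverseIndex(u32);

impl UniverseIndex {
    fn subuniverse(self) -> UniverseIndex {
        UniverseIndex(self.0 + 1)
    }
}

#[derive(Copy, Clone, Debug, PartialEq, Eq)]
struct BoundRegion(u32); // 'a = 0, 'b = 1, 'c = 2

#[derive(Copy, Clone, Debug, PartialEq, Eq)]
struct Placeholder {
    universe: UniverseIndex,
    name: BoundRegion,
}

/// Replace all regions bound by one binder: one new universe,
/// one placeholder per bound region.
fn instantiate_binder(
    current: UniverseIndex,
    bound: &[BoundRegion],
) -> (UniverseIndex, Vec<Placeholder>) {
    let new_universe = current.subuniverse();
    let placeholders = bound
        .iter()
        .map(|&name| Placeholder { universe: new_universe, name })
        .collect();
    (new_universe, placeholders)
}

fn main() {
    // for<'a, 'b, 'c> ...
    let (u, ps) = instantiate_binder(
        UniverseIndex(0),
        &[BoundRegion(0), BoundRegion(1), BoundRegion(2)],
    );
    // All three placeholders share the one new universe and differ only
    // in their bound-region name.
    assert!(ps.iter().all(|p| p.universe == u));
    println!("{:?}", ps);
}
```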

r? @scalexm
cc @arielb1
bors 2018-10-04 20:28:57 +00:00
commit 8c4ad4e9e4
34 changed files with 1868 additions and 1532 deletions

View File

@ -131,7 +131,7 @@ for ty::RegionKind {
} }
ty::ReLateBound(..) | ty::ReLateBound(..) |
ty::ReVar(..) | ty::ReVar(..) |
ty::ReSkolemized(..) => { ty::RePlaceholder(..) => {
bug!("StableHasher: unexpected region {:?}", *self) bug!("StableHasher: unexpected region {:?}", *self)
} }
} }

View File

@ -224,7 +224,7 @@ impl<'cx, 'gcx, 'tcx> TypeFolder<'gcx, 'tcx> for Canonicalizer<'cx, 'gcx, 'tcx>
ty::ReEarlyBound(..) ty::ReEarlyBound(..)
| ty::ReFree(_) | ty::ReFree(_)
| ty::ReScope(_) | ty::ReScope(_)
| ty::ReSkolemized(..) | ty::RePlaceholder(..)
| ty::ReEmpty | ty::ReEmpty
| ty::ReErased => { | ty::ReErased => {
if self.canonicalize_region_mode.other_free_regions { if self.canonicalize_region_mode.other_free_regions {

View File

@ -458,9 +458,10 @@ impl<'cx, 'gcx, 'tcx> TypeRelation<'cx, 'gcx, 'tcx> for Generalizer<'cx, 'gcx, '
return Ok(r); return Ok(r);
} }
// Always make a fresh region variable for skolemized regions; // Always make a fresh region variable for placeholder
// the higher-ranked decision procedures rely on this. // regions; the higher-ranked decision procedures rely on
ty::ReSkolemized(..) => { } // this.
ty::RePlaceholder(..) => { }
// For anything else, we make a region variable, unless we // For anything else, we make a region variable, unless we
// are *equating*, in which case it's just wasteful. // are *equating*, in which case it's just wasteful.

View File

@ -142,12 +142,12 @@ impl<'a, 'gcx, 'tcx> TyCtxt<'a, 'gcx, 'tcx> {
ty::ReEmpty => ("the empty lifetime".to_owned(), None), ty::ReEmpty => ("the empty lifetime".to_owned(), None),
// FIXME(#13998) ReSkolemized should probably print like // FIXME(#13998) RePlaceholder should probably print like
// ReFree rather than dumping Debug output on the user. // ReFree rather than dumping Debug output on the user.
// //
// We shouldn't really be having unification failures with ReVar // We shouldn't really be having unification failures with ReVar
// and ReLateBound though. // and ReLateBound though.
ty::ReSkolemized(..) | ty::ReVar(_) | ty::ReLateBound(..) | ty::ReErased => { ty::RePlaceholder(..) | ty::ReVar(_) | ty::ReLateBound(..) | ty::ReErased => {
(format!("lifetime {:?}", region), None) (format!("lifetime {:?}", region), None)
} }

View File

@ -107,7 +107,7 @@ impl<'a, 'gcx, 'tcx> TypeFolder<'gcx, 'tcx> for TypeFreshener<'a, 'gcx, 'tcx> {
ty::ReFree(_) | ty::ReFree(_) |
ty::ReScope(_) | ty::ReScope(_) |
ty::ReVar(_) | ty::ReVar(_) |
ty::ReSkolemized(..) | ty::RePlaceholder(..) |
ty::ReEmpty | ty::ReEmpty |
ty::ReErased => { ty::ReErased => {
// replace all free regions with 'erased // replace all free regions with 'erased

View File

@ -72,11 +72,11 @@ the same lifetime, but not the reverse.
Here is the algorithm we use to perform the subtyping check: Here is the algorithm we use to perform the subtyping check:
1. Replace all bound regions in the subtype with new variables 1. Replace all bound regions in the subtype with new variables
2. Replace all bound regions in the supertype with skolemized 2. Replace all bound regions in the supertype with placeholder
equivalents. A "skolemized" region is just a new fresh region equivalents. A "placeholder" region is just a new fresh region
name. name.
3. Check that the parameter and return types match as normal 3. Check that the parameter and return types match as normal
4. Ensure that no skolemized regions 'leak' into region variables 4. Ensure that no placeholder regions 'leak' into region variables
visible from "the outside" visible from "the outside"
Let's walk through some examples and see how this algorithm plays out. Let's walk through some examples and see how this algorithm plays out.
@ -95,7 +95,7 @@ like so:
Here the upper case `&A` indicates a *region variable*, that is, a Here the upper case `&A` indicates a *region variable*, that is, a
region whose value is being inferred by the system. I also replaced region whose value is being inferred by the system. I also replaced
`&b` with `&x`---I'll use letters late in the alphabet (`x`, `y`, `z`) `&b` with `&x`---I'll use letters late in the alphabet (`x`, `y`, `z`)
to indicate skolemized region names. We can assume they don't appear to indicate placeholder region names. We can assume they don't appear
elsewhere. Note that neither the sub- nor the supertype bind any elsewhere. Note that neither the sub- nor the supertype bind any
region names anymore (as indicated by the absence of `<` and `>`). region names anymore (as indicated by the absence of `<` and `>`).
@ -133,7 +133,7 @@ match. This will ultimately require (as before) that `'a` <= `&x`
must hold: but this does not hold. `self` and `x` are both distinct must hold: but this does not hold. `self` and `x` are both distinct
free regions. So the subtype check fails. free regions. So the subtype check fails.
#### Checking for skolemization leaks #### Checking for placeholder leaks
You may be wondering about that mysterious last step in the algorithm. You may be wondering about that mysterious last step in the algorithm.
So far it has not been relevant. The purpose of that last step is to So far it has not been relevant. The purpose of that last step is to
@ -159,7 +159,7 @@ Now we compare the return types, which are covariant, and hence we have:
fn(&'A T) <: for<'b> fn(&'b T)? fn(&'A T) <: for<'b> fn(&'b T)?
Here we skolemize the bound region in the supertype to yield: Here we replace the bound region in the supertype with a placeholder to yield:
fn(&'A T) <: fn(&'x T)? fn(&'A T) <: fn(&'x T)?
@ -175,23 +175,23 @@ region `x` and think that everything is happy. In fact, this behavior
is *necessary*, it was key to the first example we walked through. is *necessary*, it was key to the first example we walked through.
The difference between this example and the first one is that the variable The difference between this example and the first one is that the variable
`A` already existed at the point where the skolemization occurred. In `A` already existed at the point where the placeholders were added. In
the first example, you had two functions: the first example, you had two functions:
for<'a> fn(&'a T) <: for<'b> fn(&'b T) for<'a> fn(&'a T) <: for<'b> fn(&'b T)
and hence `&A` and `&x` were created "together". In general, the and hence `&A` and `&x` were created "together". In general, the
intention of the skolemized names is that they are supposed to be intention of the placeholder names is that they are supposed to be
fresh names that could never be equal to anything from the outside. fresh names that could never be equal to anything from the outside.
But when inference comes into play, we might not be respecting this But when inference comes into play, we might not be respecting this
rule. rule.
So the way we solve this is to add a fourth step that examines the So the way we solve this is to add a fourth step that examines the
constraints that refer to skolemized names. Basically, consider a constraints that refer to placeholder names. Basically, consider a
non-directed version of the constraint graph. Let `Tainted(x)` be the non-directed version of the constraint graph. Let `Tainted(x)` be the
set of all things reachable from a skolemized variable `x`. set of all things reachable from a placeholder variable `x`.
`Tainted(x)` should not contain any regions that existed before the `Tainted(x)` should not contain any regions that existed before the
step at which the skolemization was performed. So this case here step at which the placeholders were created. So this case here
would fail because `&x` was created alone, but is relatable to `&A`. would fail because `&x` was created alone, but is relatable to `&A`.
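
To make that last step concrete, here is a small standalone sketch of the `Tainted(x)` check: build the undirected constraint graph, collect everything reachable from each placeholder, and reject the subtyping if that set contains any region that existed before the placeholders were created. The region names and graph representation are invented for illustration; this is not the compiler's implementation.

```rust
use std::collections::{HashMap, HashSet};

/// Toy version of the fourth step: `Tainted(p)` is everything reachable
/// from the placeholder `p` in the undirected constraint graph; the check
/// fails if it contains a region that pre-dates the placeholders.
fn leak_check<'r>(
    constraints: &[(&'r str, &'r str)],
    placeholders: &HashSet<&'r str>,
    pre_existing: &HashSet<&'r str>,
) -> Result<(), String> {
    // Undirected adjacency map over all constraint edges.
    let mut graph: HashMap<&'r str, Vec<&'r str>> = HashMap::new();
    for &(a, b) in constraints {
        graph.entry(a).or_default().push(b);
        graph.entry(b).or_default().push(a);
    }

    for &p in placeholders {
        // Depth-first walk to compute Tainted(p).
        let mut tainted = HashSet::new();
        let mut stack = vec![p];
        while let Some(r) = stack.pop() {
            if tainted.insert(r) {
                stack.extend(graph.get(r).into_iter().flatten().copied());
            }
        }
        // The placeholder must not be relatable to any pre-existing region.
        if let Some(&bad) = tainted.iter().find(|r| pre_existing.contains(*r)) {
            return Err(format!("placeholder {} leaks into {}", p, bad));
        }
    }
    Ok(())
}

fn main() {
    let placeholders = HashSet::from(["'x"]);
    let pre_existing = HashSet::from(["'A"]);
    // Relating `&'A T` with `&'x T` links `'A` and `'x`, so `'x` leaks.
    assert!(leak_check(&[("'A", "'x")], &placeholders, &pre_existing).is_err());
    // An edge only to a fresh variable created during the check is fine.
    assert!(leak_check(&[("'1", "'x")], &placeholders, &pre_existing).is_ok());
}
```
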
## Computing the LUB and GLB ## Computing the LUB and GLB

View File

@ -15,7 +15,7 @@ use super::{CombinedSnapshot,
InferCtxt, InferCtxt,
HigherRankedType, HigherRankedType,
SubregionOrigin, SubregionOrigin,
SkolemizationMap}; PlaceholderMap};
use super::combine::CombineFields; use super::combine::CombineFields;
use super::region_constraints::{TaintDirections}; use super::region_constraints::{TaintDirections};
@ -51,19 +51,21 @@ impl<'a, 'gcx, 'tcx> CombineFields<'a, 'gcx, 'tcx> {
return self.infcx.commit_if_ok(|snapshot| { return self.infcx.commit_if_ok(|snapshot| {
let span = self.trace.cause.span; let span = self.trace.cause.span;
// First, we instantiate each bound region in the subtype with a fresh // First, we instantiate each bound region in the supertype with a
// region variable. // fresh placeholder region.
let (b_prime, placeholder_map) =
self.infcx.replace_late_bound_regions_with_placeholders(b);
// Next, we instantiate each bound region in the subtype
// with a fresh region variable. These region variables --
// but no other pre-existing region variables -- can name
// the placeholders.
let (a_prime, _) = let (a_prime, _) =
self.infcx.replace_late_bound_regions_with_fresh_var( self.infcx.replace_late_bound_regions_with_fresh_var(
span, span,
HigherRankedType, HigherRankedType,
a); a);
// Second, we instantiate each bound region in the supertype with a
// fresh concrete region.
let (b_prime, skol_map) =
self.infcx.skolemize_late_bound_regions(b);
debug!("a_prime={:?}", a_prime); debug!("a_prime={:?}", a_prime);
debug!("b_prime={:?}", b_prime); debug!("b_prime={:?}", b_prime);
@ -71,12 +73,12 @@ impl<'a, 'gcx, 'tcx> CombineFields<'a, 'gcx, 'tcx> {
let result = self.sub(a_is_expected).relate(&a_prime, &b_prime)?; let result = self.sub(a_is_expected).relate(&a_prime, &b_prime)?;
// Presuming type comparison succeeds, we need to check // Presuming type comparison succeeds, we need to check
// that the skolemized regions do not "leak". // that the placeholder regions do not "leak".
self.infcx.leak_check(!a_is_expected, span, &skol_map, snapshot)?; self.infcx.leak_check(!a_is_expected, span, &placeholder_map, snapshot)?;
// We are finished with the skolemized regions now so pop // We are finished with the placeholder regions now so pop
// them off. // them off.
self.infcx.pop_skolemized(skol_map, snapshot); self.infcx.pop_placeholders(placeholder_map, snapshot);
debug!("higher_ranked_sub: OK result={:?}", result); debug!("higher_ranked_sub: OK result={:?}", result);
@ -112,68 +114,68 @@ impl<'a, 'gcx, 'tcx> CombineFields<'a, 'gcx, 'tcx> {
// created as part of this type comparison". // created as part of this type comparison".
return self.infcx.commit_if_ok(|snapshot| { return self.infcx.commit_if_ok(|snapshot| {
// First, we instantiate each bound region in the matcher // First, we instantiate each bound region in the matcher
// with a skolemized region. // with a placeholder region.
let ((a_match, a_value), skol_map) = let ((a_match, a_value), placeholder_map) =
self.infcx.skolemize_late_bound_regions(a_pair); self.infcx.replace_late_bound_regions_with_placeholders(a_pair);
debug!("higher_ranked_match: a_match={:?}", a_match); debug!("higher_ranked_match: a_match={:?}", a_match);
debug!("higher_ranked_match: skol_map={:?}", skol_map); debug!("higher_ranked_match: placeholder_map={:?}", placeholder_map);
// Equate types now that bound regions have been replaced. // Equate types now that bound regions have been replaced.
self.equate(a_is_expected).relate(&a_match, &b_match)?; self.equate(a_is_expected).relate(&a_match, &b_match)?;
// Map each skolemized region to a vector of other regions that it // Map each placeholder region to a vector of other regions that it
// must be equated with. (Note that this vector may include other // must be equated with. (Note that this vector may include other
// skolemized regions from `skol_map`.) // placeholder regions from `placeholder_map`.)
let skol_resolution_map: FxHashMap<_, _> = let placeholder_resolution_map: FxHashMap<_, _> =
skol_map placeholder_map
.iter() .iter()
.map(|(&br, &skol)| { .map(|(&br, &placeholder)| {
let tainted_regions = let tainted_regions =
self.infcx.tainted_regions(snapshot, self.infcx.tainted_regions(snapshot,
skol, placeholder,
TaintDirections::incoming()); // [1] TaintDirections::incoming()); // [1]
// [1] this routine executes after the skolemized // [1] this routine executes after the placeholder
// regions have been *equated* with something // regions have been *equated* with something
// else, so examining the incoming edges ought to // else, so examining the incoming edges ought to
// be enough to collect all constraints // be enough to collect all constraints
(skol, (br, tainted_regions)) (placeholder, (br, tainted_regions))
}) })
.collect(); .collect();
// For each skolemized region, pick a representative -- which can // For each placeholder region, pick a representative -- which can
// be any region from the sets above, except for other members of // be any region from the sets above, except for other members of
// `skol_map`. There should always be a representative if things // `placeholder_map`. There should always be a representative if things
// are properly well-formed. // are properly well-formed.
let skol_representatives: FxHashMap<_, _> = let placeholder_representatives: FxHashMap<_, _> =
skol_resolution_map placeholder_resolution_map
.iter() .iter()
.map(|(&skol, &(_, ref regions))| { .map(|(&placeholder, &(_, ref regions))| {
let representative = let representative =
regions.iter() regions.iter()
.filter(|&&r| !skol_resolution_map.contains_key(r)) .filter(|&&r| !placeholder_resolution_map.contains_key(r))
.cloned() .cloned()
.next() .next()
.unwrap_or_else(|| { .unwrap_or_else(|| {
bug!("no representative region for `{:?}` in `{:?}`", bug!("no representative region for `{:?}` in `{:?}`",
skol, regions) placeholder, regions)
}); });
(skol, representative) (placeholder, representative)
}) })
.collect(); .collect();
// Equate all the members of each skolemization set with the // Equate all the members of each placeholder set with the
// representative. // representative.
for (skol, &(_br, ref regions)) in &skol_resolution_map { for (placeholder, &(_br, ref regions)) in &placeholder_resolution_map {
let representative = &skol_representatives[skol]; let representative = &placeholder_representatives[placeholder];
debug!("higher_ranked_match: \ debug!("higher_ranked_match: \
skol={:?} representative={:?} regions={:?}", placeholder={:?} representative={:?} regions={:?}",
skol, representative, regions); placeholder, representative, regions);
for region in regions.iter() for region in regions.iter()
.filter(|&r| !skol_resolution_map.contains_key(r)) .filter(|&r| !placeholder_resolution_map.contains_key(r))
.filter(|&r| r != representative) .filter(|&r| r != representative)
{ {
let origin = SubregionOrigin::Subtype(self.trace.clone()); let origin = SubregionOrigin::Subtype(self.trace.clone());
@ -184,18 +186,18 @@ impl<'a, 'gcx, 'tcx> CombineFields<'a, 'gcx, 'tcx> {
} }
} }
// Replace the skolemized regions appearing in value with // Replace the placeholder regions appearing in value with
// their representatives // their representatives
let a_value = let a_value =
fold_regions_in( fold_regions_in(
self.tcx(), self.tcx(),
&a_value, &a_value,
|r, _| skol_representatives.get(&r).cloned().unwrap_or(r)); |r, _| placeholder_representatives.get(&r).cloned().unwrap_or(r));
debug!("higher_ranked_match: value={:?}", a_value); debug!("higher_ranked_match: value={:?}", a_value);
// We are now done with these skolemized variables. // We are now done with these placeholder variables.
self.infcx.pop_skolemized(skol_map, snapshot); self.infcx.pop_placeholders(placeholder_map, snapshot);
Ok(HrMatchResult { value: a_value }) Ok(HrMatchResult { value: a_value })
}); });
@ -500,7 +502,7 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
* started. This is used in the sub/lub/glb computations. The * started. This is used in the sub/lub/glb computations. The
* idea here is that when we are computing lub/glb of two * idea here is that when we are computing lub/glb of two
* regions, we sometimes create intermediate region variables. * regions, we sometimes create intermediate region variables.
* Those region variables may touch some of the skolemized or * Those region variables may touch some of the placeholder or
* other "forbidden" regions we created to replace bound * other "forbidden" regions we created to replace bound
* regions, but they don't really represent an "external" * regions, but they don't really represent an "external"
* constraint. * constraint.
@ -527,10 +529,10 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
* we're not careful, it will succeed. * we're not careful, it will succeed.
* *
* The reason is that when we walk through the subtyping * The reason is that when we walk through the subtyping
* algorithm, we begin by replacing `'a` with a skolemized * algorithm, we begin by replacing `'a` with a placeholder
* variable `'1`. We then have `fn(_#0t) <: fn(&'1 int)`. This * variable `'1`. We then have `fn(_#0t) <: fn(&'1 int)`. This
* can be made true by unifying `_#0t` with `&'1 int`. In the * can be made true by unifying `_#0t` with `&'1 int`. In the
* process, we create a fresh variable for the skolemized * process, we create a fresh variable for the placeholder
* region, `'$2`, and hence we have that `_#0t == &'$2 * region, `'$2`, and hence we have that `_#0t == &'$2
* int`. However, because `'$2` was created during the sub * int`. However, because `'$2` was created during the sub
* computation, if we're not careful we will erroneously * computation, if we're not careful we will erroneously
@ -568,33 +570,39 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
region_vars region_vars
} }
/// Replace all regions bound by `binder` with skolemized regions and /// Replace all regions bound by `binder` with placeholder regions and
/// return a map indicating which bound-region was replaced with what /// return a map indicating which bound-region was replaced with what
/// skolemized region. This is the first step of checking subtyping /// placeholder region. This is the first step of checking subtyping
/// when higher-ranked things are involved. /// when higher-ranked things are involved.
/// ///
/// **Important:** you must call this function from within a snapshot. /// **Important:** you must call this function from within a snapshot.
/// Moreover, before committing the snapshot, you must eventually call /// Moreover, before committing the snapshot, you must eventually call
/// either `plug_leaks` or `pop_skolemized` to remove the skolemized /// either `plug_leaks` or `pop_placeholders` to remove the placeholder
/// regions. If you rollback the snapshot (or are using a probe), then /// regions. If you rollback the snapshot (or are using a probe), then
/// the pop occurs as part of the rollback, so an explicit call is not /// the pop occurs as part of the rollback, so an explicit call is not
/// needed (but is also permitted). /// needed (but is also permitted).
/// ///
/// For more information about how skolemization for HRTBs works, see /// For more information about how placeholders and HRTBs work, see
/// the [rustc guide]. /// the [rustc guide].
/// ///
/// [rustc guide]: https://rust-lang-nursery.github.io/rustc-guide/traits/hrtb.html /// [rustc guide]: https://rust-lang-nursery.github.io/rustc-guide/traits/hrtb.html
pub fn skolemize_late_bound_regions<T>(&self, pub fn replace_late_bound_regions_with_placeholders<T>(
binder: &ty::Binder<T>) &self,
-> (T, SkolemizationMap<'tcx>) binder: &ty::Binder<T>,
where T : TypeFoldable<'tcx> ) -> (T, PlaceholderMap<'tcx>)
where
T : TypeFoldable<'tcx>,
{ {
let new_universe = self.create_subuniverse();
let (result, map) = self.tcx.replace_late_bound_regions(binder, |br| { let (result, map) = self.tcx.replace_late_bound_regions(binder, |br| {
self.universe.set(self.universe().subuniverse()); self.tcx.mk_region(ty::RePlaceholder(ty::Placeholder {
self.tcx.mk_region(ty::ReSkolemized(self.universe(), br)) universe: new_universe,
name: br,
}))
}); });
debug!("skolemize_bound_regions(binder={:?}, result={:?}, map={:?})", debug!("replace_late_bound_regions_with_placeholders(binder={:?}, result={:?}, map={:?})",
binder, binder,
result, result,
map); map);
@ -603,19 +611,19 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
} }
/// Searches the region constraints created since `snapshot` was started /// Searches the region constraints created since `snapshot` was started
/// and checks to determine whether any of the skolemized regions created /// and checks to determine whether any of the placeholder regions created
/// in `skol_map` would "escape" -- meaning that they are related to /// in `placeholder_map` would "escape" -- meaning that they are related to
/// other regions in some way. If so, the higher-ranked subtyping doesn't /// other regions in some way. If so, the higher-ranked subtyping doesn't
/// hold. See `README.md` for more details. /// hold. See `README.md` for more details.
pub fn leak_check(&self, pub fn leak_check(&self,
overly_polymorphic: bool, overly_polymorphic: bool,
_span: Span, _span: Span,
skol_map: &SkolemizationMap<'tcx>, placeholder_map: &PlaceholderMap<'tcx>,
snapshot: &CombinedSnapshot<'a, 'tcx>) snapshot: &CombinedSnapshot<'a, 'tcx>)
-> RelateResult<'tcx, ()> -> RelateResult<'tcx, ()>
{ {
debug!("leak_check: skol_map={:?}", debug!("leak_check: placeholder_map={:?}",
skol_map); placeholder_map);
// If the user gave `-Zno-leak-check`, then skip the leak // If the user gave `-Zno-leak-check`, then skip the leak
// check completely. This is wildly unsound and also not // check completely. This is wildly unsound and also not
@ -630,14 +638,14 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
} }
let new_vars = self.region_vars_confined_to_snapshot(snapshot); let new_vars = self.region_vars_confined_to_snapshot(snapshot);
for (&skol_br, &skol) in skol_map { for (&placeholder_br, &placeholder) in placeholder_map {
// The inputs to a skolemized variable can only // The inputs to a placeholder variable can only
// be itself or other new variables. // be itself or other new variables.
let incoming_taints = self.tainted_regions(snapshot, let incoming_taints = self.tainted_regions(snapshot,
skol, placeholder,
TaintDirections::both()); TaintDirections::both());
for &tainted_region in &incoming_taints { for &tainted_region in &incoming_taints {
// Each skolemized should only be relatable to itself // Each placeholder should only be relatable to itself
// or new variables: // or new variables:
match *tainted_region { match *tainted_region {
ty::ReVar(vid) => { ty::ReVar(vid) => {
@ -646,21 +654,21 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
} }
} }
_ => { _ => {
if tainted_region == skol { continue; } if tainted_region == placeholder { continue; }
} }
}; };
debug!("{:?} (which replaced {:?}) is tainted by {:?}", debug!("{:?} (which replaced {:?}) is tainted by {:?}",
skol, placeholder,
skol_br, placeholder_br,
tainted_region); tainted_region);
return Err(if overly_polymorphic { return Err(if overly_polymorphic {
debug!("Overly polymorphic!"); debug!("Overly polymorphic!");
TypeError::RegionsOverlyPolymorphic(skol_br, tainted_region) TypeError::RegionsOverlyPolymorphic(placeholder_br, tainted_region)
} else { } else {
debug!("Not as polymorphic!"); debug!("Not as polymorphic!");
TypeError::RegionsInsufficientlyPolymorphic(skol_br, tainted_region) TypeError::RegionsInsufficientlyPolymorphic(placeholder_br, tainted_region)
}) })
} }
} }
@ -668,9 +676,9 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
Ok(()) Ok(())
} }
/// This code converts from skolemized regions back to late-bound /// This code converts from placeholder regions back to late-bound
/// regions. It works by replacing each region in the taint set of a /// regions. It works by replacing each region in the taint set of a
/// skolemized region with a bound-region. The bound region will be bound /// placeholder region with a bound-region. The bound region will be bound
/// by the outer-most binder in `value`; the caller must ensure that there is /// by the outer-most binder in `value`; the caller must ensure that there is
/// such a binder and it is the right place. /// such a binder and it is the right place.
/// ///
@ -687,7 +695,7 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
/// where A : Clone /// where A : Clone
/// { ... } /// { ... }
/// ///
/// Here we will have replaced `'a` with a skolemized region /// Here we will have replaced `'a` with a placeholder region
/// `'0`. This means that our substitution will be `{A=>&'0 /// `'0`. This means that our substitution will be `{A=>&'0
/// int, R=>&'0 int}`. /// int, R=>&'0 int}`.
/// ///
@ -697,65 +705,65 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
/// to the depth of the predicate, in this case 1, so that the final /// to the depth of the predicate, in this case 1, so that the final
/// predicate is `for<'a> &'a int : Clone`. /// predicate is `for<'a> &'a int : Clone`.
pub fn plug_leaks<T>(&self, pub fn plug_leaks<T>(&self,
skol_map: SkolemizationMap<'tcx>, placeholder_map: PlaceholderMap<'tcx>,
snapshot: &CombinedSnapshot<'a, 'tcx>, snapshot: &CombinedSnapshot<'a, 'tcx>,
value: T) -> T value: T) -> T
where T : TypeFoldable<'tcx> where T : TypeFoldable<'tcx>
{ {
debug!("plug_leaks(skol_map={:?}, value={:?})", debug!("plug_leaks(placeholder_map={:?}, value={:?})",
skol_map, placeholder_map,
value); value);
if skol_map.is_empty() { if placeholder_map.is_empty() {
return value; return value;
} }
// Compute a mapping from the "taint set" of each skolemized // Compute a mapping from the "taint set" of each placeholder
// region back to the `ty::BoundRegion` that it originally // region back to the `ty::BoundRegion` that it originally
// represented. Because `leak_check` passed, we know that // represented. Because `leak_check` passed, we know that
// these taint sets are mutually disjoint. // these taint sets are mutually disjoint.
let inv_skol_map: FxHashMap<ty::Region<'tcx>, ty::BoundRegion> = let inv_placeholder_map: FxHashMap<ty::Region<'tcx>, ty::BoundRegion> =
skol_map placeholder_map
.iter() .iter()
.flat_map(|(&skol_br, &skol)| { .flat_map(|(&placeholder_br, &placeholder)| {
self.tainted_regions(snapshot, skol, TaintDirections::both()) self.tainted_regions(snapshot, placeholder, TaintDirections::both())
.into_iter() .into_iter()
.map(move |tainted_region| (tainted_region, skol_br)) .map(move |tainted_region| (tainted_region, placeholder_br))
}) })
.collect(); .collect();
debug!("plug_leaks: inv_skol_map={:?}", debug!("plug_leaks: inv_placeholder_map={:?}",
inv_skol_map); inv_placeholder_map);
// Remove any instantiated type variables from `value`; those can hide // Remove any instantiated type variables from `value`; those can hide
// references to regions from the `fold_regions` code below. // references to regions from the `fold_regions` code below.
let value = self.resolve_type_vars_if_possible(&value); let value = self.resolve_type_vars_if_possible(&value);
// Map any skolemization byproducts back to a late-bound // Map any placeholder byproducts back to a late-bound
// region. Put that late-bound region at whatever the outermost // region. Put that late-bound region at whatever the outermost
// binder is that we encountered in `value`. The caller is // binder is that we encountered in `value`. The caller is
// responsible for ensuring that (a) `value` contains at least one // responsible for ensuring that (a) `value` contains at least one
// binder and (b) that binder is the one we want to use. // binder and (b) that binder is the one we want to use.
let result = self.tcx.fold_regions(&value, &mut false, |r, current_depth| { let result = self.tcx.fold_regions(&value, &mut false, |r, current_depth| {
match inv_skol_map.get(&r) { match inv_placeholder_map.get(&r) {
None => r, None => r,
Some(br) => { Some(br) => {
// It is the responsibility of the caller to ensure // It is the responsibility of the caller to ensure
// that each skolemized region appears within a // that each placeholder region appears within a
// binder. In practice, this routine is only used by // binder. In practice, this routine is only used by
// trait checking, and all of the skolemized regions // trait checking, and all of the placeholder regions
// appear inside predicates, which always have // appear inside predicates, which always have
// binders, so this assert is satisfied. // binders, so this assert is satisfied.
assert!(current_depth > ty::INNERMOST); assert!(current_depth > ty::INNERMOST);
// since leak-check passed, this skolemized region // since leak-check passed, this placeholder region
// should only have incoming edges from variables // should only have incoming edges from variables
// (which ought not to escape the snapshot, but we // (which ought not to escape the snapshot, but we
// don't check that) or itself // don't check that) or itself
assert!( assert!(
match *r { match *r {
ty::ReVar(_) => true, ty::ReVar(_) => true,
ty::ReSkolemized(_, ref br1) => br == br1, ty::RePlaceholder(index) => index.name == *br,
_ => false, _ => false,
}, },
"leak-check would have us replace {:?} with {:?}", "leak-check would have us replace {:?} with {:?}",
@ -769,31 +777,36 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
} }
}); });
self.pop_skolemized(skol_map, snapshot); self.pop_placeholders(placeholder_map, snapshot);
debug!("plug_leaks: result={:?}", result); debug!("plug_leaks: result={:?}", result);
result result
} }
/// Pops the skolemized regions found in `skol_map` from the region /// Pops the placeholder regions found in `placeholder_map` from the region
/// inference context. Whenever you create skolemized regions via /// inference context. Whenever you create placeholder regions via
/// `skolemize_late_bound_regions`, they must be popped before you /// `replace_late_bound_regions_with_placeholders`, they must be popped before you
/// commit the enclosing snapshot (if you do not commit, e.g. within a /// commit the enclosing snapshot (if you do not commit, e.g. within a
/// probe or as a result of an error, then this is not necessary, as /// probe or as a result of an error, then this is not necessary, as
/// popping happens as part of the rollback). /// popping happens as part of the rollback).
/// ///
/// Note: popping also occurs implicitly as part of `leak_check`. /// Note: popping also occurs implicitly as part of `leak_check`.
pub fn pop_skolemized(&self, pub fn pop_placeholders(
skol_map: SkolemizationMap<'tcx>, &self,
snapshot: &CombinedSnapshot<'a, 'tcx>) { placeholder_map: PlaceholderMap<'tcx>,
debug!("pop_skolemized({:?})", skol_map); snapshot: &CombinedSnapshot<'a, 'tcx>,
let skol_regions: FxHashSet<_> = skol_map.values().cloned().collect(); ) {
debug!("pop_placeholders({:?})", placeholder_map);
let placeholder_regions: FxHashSet<_> = placeholder_map.values().cloned().collect();
self.borrow_region_constraints() self.borrow_region_constraints()
.pop_skolemized(self.universe(), &skol_regions, &snapshot.region_constraints_snapshot); .pop_placeholders(
&placeholder_regions,
&snapshot.region_constraints_snapshot,
);
self.universe.set(snapshot.universe); self.universe.set(snapshot.universe);
if !skol_map.is_empty() { if !placeholder_map.is_empty() {
self.projection_cache.borrow_mut().rollback_skolemized( self.projection_cache.borrow_mut().rollback_placeholder(
&snapshot.projection_cache_snapshot); &snapshot.projection_cache_snapshot);
} }
} }

View File

@ -28,7 +28,7 @@ use std::u32;
use ty::fold::TypeFoldable; use ty::fold::TypeFoldable;
use ty::{self, Ty, TyCtxt}; use ty::{self, Ty, TyCtxt};
use ty::{ReEarlyBound, ReEmpty, ReErased, ReFree, ReStatic}; use ty::{ReEarlyBound, ReEmpty, ReErased, ReFree, ReStatic};
use ty::{ReLateBound, ReScope, ReSkolemized, ReVar}; use ty::{ReLateBound, ReScope, RePlaceholder, ReVar};
use ty::{Region, RegionVid}; use ty::{Region, RegionVid};
mod graphviz; mod graphviz;
@ -341,7 +341,7 @@ impl<'cx, 'gcx, 'tcx> LexicalResolver<'cx, 'gcx, 'tcx> {
// For these types, we cannot define any additional // For these types, we cannot define any additional
// relationship: // relationship:
(&ReSkolemized(..), _) | (_, &ReSkolemized(..)) => if a == b { (&RePlaceholder(..), _) | (_, &RePlaceholder(..)) => if a == b {
a a
} else { } else {
tcx.types.re_static tcx.types.re_static

View File

@ -229,9 +229,10 @@ pub struct InferCtxt<'a, 'gcx: 'a + 'tcx, 'tcx: 'a> {
universe: Cell<ty::UniverseIndex>, universe: Cell<ty::UniverseIndex>,
} }
/// A map returned by `skolemize_late_bound_regions()` indicating the skolemized /// A map returned by `replace_late_bound_regions_with_placeholders()`
/// region that each late-bound region was replaced with. /// indicating the placeholder region that each late-bound region was
pub type SkolemizationMap<'tcx> = BTreeMap<ty::BoundRegion, ty::Region<'tcx>>; /// replaced with.
pub type PlaceholderMap<'tcx> = BTreeMap<ty::BoundRegion, ty::Region<'tcx>>;
/// See `error_reporting` module for more details /// See `error_reporting` module for more details
#[derive(Clone, Debug)] #[derive(Clone, Debug)]
@ -405,12 +406,14 @@ pub enum RegionVariableOrigin {
#[derive(Copy, Clone, Debug, PartialEq, Eq, Hash)] #[derive(Copy, Clone, Debug, PartialEq, Eq, Hash)]
pub enum NLLRegionVariableOrigin { pub enum NLLRegionVariableOrigin {
// During NLL region processing, we create variables for free /// During NLL region processing, we create variables for free
// regions that we encounter in the function signature and /// regions that we encounter in the function signature and
// elsewhere. This origin indices we've got one of those. /// elsewhere. This origin indices we've got one of those.
FreeRegion, FreeRegion,
BoundRegion(ty::UniverseIndex), /// "Universal" instantiation of a higher-ranked region (e.g.,
/// from a `for<'a> T` binder). Meant to represent "any region".
Placeholder(ty::Placeholder),
Existential, Existential,
} }
@ -419,7 +422,7 @@ impl NLLRegionVariableOrigin {
pub fn is_universal(self) -> bool { pub fn is_universal(self) -> bool {
match self { match self {
NLLRegionVariableOrigin::FreeRegion => true, NLLRegionVariableOrigin::FreeRegion => true,
NLLRegionVariableOrigin::BoundRegion(..) => true, NLLRegionVariableOrigin::Placeholder(..) => true,
NLLRegionVariableOrigin::Existential => false, NLLRegionVariableOrigin::Existential => false,
} }
} }
@ -913,13 +916,13 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
a, a,
b, b,
}, },
skol_map, placeholder_map,
) = self.skolemize_late_bound_regions(predicate); ) = self.replace_late_bound_regions_with_placeholders(predicate);
let cause_span = cause.span; let cause_span = cause.span;
let ok = self.at(cause, param_env).sub_exp(a_is_expected, a, b)?; let ok = self.at(cause, param_env).sub_exp(a_is_expected, a, b)?;
self.leak_check(false, cause_span, &skol_map, snapshot)?; self.leak_check(false, cause_span, &placeholder_map, snapshot)?;
self.pop_skolemized(skol_map, snapshot); self.pop_placeholders(placeholder_map, snapshot);
Ok(ok.unit()) Ok(ok.unit())
})) }))
} }
@ -930,14 +933,14 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
predicate: &ty::PolyRegionOutlivesPredicate<'tcx>, predicate: &ty::PolyRegionOutlivesPredicate<'tcx>,
) -> UnitResult<'tcx> { ) -> UnitResult<'tcx> {
self.commit_if_ok(|snapshot| { self.commit_if_ok(|snapshot| {
let (ty::OutlivesPredicate(r_a, r_b), skol_map) = let (ty::OutlivesPredicate(r_a, r_b), placeholder_map) =
self.skolemize_late_bound_regions(predicate); self.replace_late_bound_regions_with_placeholders(predicate);
let origin = SubregionOrigin::from_obligation_cause(cause, || { let origin = SubregionOrigin::from_obligation_cause(cause, || {
RelateRegionParamBound(cause.span) RelateRegionParamBound(cause.span)
}); });
self.sub_regions(origin, r_b, r_a); // `b : a` ==> `a <= b` self.sub_regions(origin, r_b, r_a); // `b : a` ==> `a <= b`
self.leak_check(false, cause.span, &skol_map, snapshot)?; self.leak_check(false, cause.span, &placeholder_map, snapshot)?;
Ok(self.pop_skolemized(skol_map, snapshot)) Ok(self.pop_placeholders(placeholder_map, snapshot))
}) })
} }

View File

@ -10,19 +10,19 @@
//! See README.md //! See README.md
use self::UndoLogEntry::*;
use self::CombineMapType::*; use self::CombineMapType::*;
use self::UndoLogEntry::*;
use super::{MiscVariable, RegionVariableOrigin, SubregionOrigin};
use super::unify_key; use super::unify_key;
use super::{MiscVariable, RegionVariableOrigin, SubregionOrigin};
use rustc_data_structures::indexed_vec::IndexVec;
use rustc_data_structures::fx::{FxHashMap, FxHashSet}; use rustc_data_structures::fx::{FxHashMap, FxHashSet};
use rustc_data_structures::indexed_vec::IndexVec;
use rustc_data_structures::unify as ut; use rustc_data_structures::unify as ut;
use ty::{self, Ty, TyCtxt};
use ty::{Region, RegionVid};
use ty::ReStatic; use ty::ReStatic;
use ty::{self, Ty, TyCtxt};
use ty::{BrFresh, ReLateBound, ReVar}; use ty::{BrFresh, ReLateBound, ReVar};
use ty::{Region, RegionVid};
use std::collections::BTreeMap; use std::collections::BTreeMap;
use std::{cmp, fmt, mem, u32}; use std::{cmp, fmt, mem, u32};
@ -306,10 +306,10 @@ pub struct RegionSnapshot {
any_unifications: bool, any_unifications: bool,
} }
/// When working with skolemized regions, we often wish to find all of /// When working with placeholder regions, we often wish to find all of
/// the regions that are either reachable from a skolemized region, or /// the regions that are either reachable from a placeholder region, or
/// which can reach a skolemized region, or both. We call such regions /// which can reach a placeholder region, or both. We call such regions
/// *tained* regions. This struct allows you to decide what set of /// *tainted* regions. This struct allows you to decide what set of
/// tainted regions you want. /// tainted regions you want.
#[derive(Debug)] #[derive(Debug)]
pub struct TaintDirections { pub struct TaintDirections {
@ -495,13 +495,12 @@ impl<'tcx> RegionConstraintCollector<'tcx> {
} }
} }
pub fn new_region_var(&mut self, pub fn new_region_var(
&mut self,
universe: ty::UniverseIndex, universe: ty::UniverseIndex,
origin: RegionVariableOrigin) -> RegionVid { origin: RegionVariableOrigin,
let vid = self.var_infos.push(RegionVariableInfo { ) -> RegionVid {
origin, let vid = self.var_infos.push(RegionVariableInfo { origin, universe });
universe,
});
let u_vid = self.unification_table let u_vid = self.unification_table
.new_key(unify_key::RegionVidKey { min_vid: vid }); .new_key(unify_key::RegionVidKey { min_vid: vid });
@ -511,8 +510,7 @@ impl<'tcx> RegionConstraintCollector<'tcx> {
} }
debug!( debug!(
"created new region variable {:?} with origin {:?}", "created new region variable {:?} with origin {:?}",
vid, vid, origin
origin
); );
return vid; return vid;
} }
@ -527,51 +525,25 @@ impl<'tcx> RegionConstraintCollector<'tcx> {
self.var_infos[vid].origin self.var_infos[vid].origin
} }
/// Removes all the edges to/from the skolemized regions that are /// Removes all the edges to/from the placeholder regions that are
/// in `skols`. This is used after a higher-ranked operation /// in `skols`. This is used after a higher-ranked operation
/// completes to remove all trace of the skolemized regions /// completes to remove all trace of the placeholder regions
/// created in that time. /// created in that time.
pub fn pop_skolemized( pub fn pop_placeholders(
&mut self, &mut self,
skolemization_count: ty::UniverseIndex, placeholders: &FxHashSet<ty::Region<'tcx>>,
skols: &FxHashSet<ty::Region<'tcx>>,
snapshot: &RegionSnapshot, snapshot: &RegionSnapshot,
) { ) {
debug!("pop_skolemized_regions(skols={:?})", skols); debug!("pop_placeholders(placeholders={:?})", placeholders);
assert!(self.in_snapshot()); assert!(self.in_snapshot());
assert!(self.undo_log[snapshot.length] == OpenSnapshot); assert!(self.undo_log[snapshot.length] == OpenSnapshot);
assert!(
skolemization_count.as_usize() >= skols.len(),
"popping more skolemized variables than actually exist, \
sc now = {:?}, skols.len = {:?}",
skolemization_count,
skols.len()
);
let last_to_pop = skolemization_count.subuniverse();
let first_to_pop = ty::UniverseIndex::from(last_to_pop.as_u32() - skols.len() as u32);
debug_assert! {
skols.iter()
.all(|&k| match *k {
ty::ReSkolemized(universe, _) =>
universe >= first_to_pop &&
universe < last_to_pop,
_ =>
false
}),
"invalid skolemization keys or keys out of range ({:?}..{:?}): {:?}",
first_to_pop,
last_to_pop,
skols
}
let constraints_to_kill: Vec<usize> = self.undo_log let constraints_to_kill: Vec<usize> = self.undo_log
.iter() .iter()
.enumerate() .enumerate()
.rev() .rev()
.filter(|&(_, undo_entry)| kill_constraint(skols, undo_entry)) .filter(|&(_, undo_entry)| kill_constraint(placeholders, undo_entry))
.map(|(index, _)| index) .map(|(index, _)| index)
.collect(); .collect();
@ -583,20 +555,20 @@ impl<'tcx> RegionConstraintCollector<'tcx> {
return; return;
fn kill_constraint<'tcx>( fn kill_constraint<'tcx>(
skols: &FxHashSet<ty::Region<'tcx>>, placeholders: &FxHashSet<ty::Region<'tcx>>,
undo_entry: &UndoLogEntry<'tcx>, undo_entry: &UndoLogEntry<'tcx>,
) -> bool { ) -> bool {
match undo_entry { match undo_entry {
&AddConstraint(Constraint::VarSubVar(..)) => false, &AddConstraint(Constraint::VarSubVar(..)) => false,
&AddConstraint(Constraint::RegSubVar(a, _)) => skols.contains(&a), &AddConstraint(Constraint::RegSubVar(a, _)) => placeholders.contains(&a),
&AddConstraint(Constraint::VarSubReg(_, b)) => skols.contains(&b), &AddConstraint(Constraint::VarSubReg(_, b)) => placeholders.contains(&b),
&AddConstraint(Constraint::RegSubReg(a, b)) => { &AddConstraint(Constraint::RegSubReg(a, b)) => {
skols.contains(&a) || skols.contains(&b) placeholders.contains(&a) || placeholders.contains(&b)
} }
&AddGiven(..) => false, &AddGiven(..) => false,
&AddVerify(_) => false, &AddVerify(_) => false,
&AddCombination(_, ref two_regions) => { &AddCombination(_, ref two_regions) => {
skols.contains(&two_regions.a) || skols.contains(&two_regions.b) placeholders.contains(&two_regions.a) || placeholders.contains(&two_regions.b)
} }
&AddVar(..) | &OpenSnapshot | &Purged | &CommitedSnapshot => false, &AddVar(..) | &OpenSnapshot | &Purged | &CommitedSnapshot => false,
} }
@ -713,9 +685,7 @@ impl<'tcx> RegionConstraintCollector<'tcx> {
// cannot add constraints once regions are resolved // cannot add constraints once regions are resolved
debug!( debug!(
"RegionConstraintCollector: make_subregion({:?}, {:?}) due to {:?}", "RegionConstraintCollector: make_subregion({:?}, {:?}) due to {:?}",
sub, sub, sup, origin
sup,
origin
); );
match (sub, sup) { match (sub, sup) {
@ -854,19 +824,19 @@ impl<'tcx> RegionConstraintCollector<'tcx> {
fn universe(&self, region: Region<'tcx>) -> ty::UniverseIndex { fn universe(&self, region: Region<'tcx>) -> ty::UniverseIndex {
match *region { match *region {
ty::ReScope(..) | ty::ReScope(..)
ty::ReStatic | | ty::ReStatic
ty::ReEmpty | | ty::ReEmpty
ty::ReErased | | ty::ReErased
ty::ReFree(..) | | ty::ReFree(..)
ty::ReEarlyBound(..) => ty::UniverseIndex::ROOT, | ty::ReEarlyBound(..) => ty::UniverseIndex::ROOT,
ty::ReSkolemized(universe, _) => universe, ty::RePlaceholder(placeholder) => placeholder.universe,
ty::ReClosureBound(vid) | ty::ReClosureBound(vid) | ty::ReVar(vid) => self.var_universe(vid),
ty::ReVar(vid) => self.var_universe(vid), ty::ReLateBound(..) => bug!("universe(): encountered bound region {:?}", region),
ty::ReLateBound(..) => ty::ReCanonical(..) => bug!(
bug!("universe(): encountered bound region {:?}", region), "region_universe(): encountered canonical region {:?}",
ty::ReCanonical(..) => region
bug!("region_universe(): encountered canonical region {:?}", region), ),
} }
} }
@ -886,7 +856,7 @@ impl<'tcx> RegionConstraintCollector<'tcx> {
/// relations are considered. For example, one can say that only /// relations are considered. For example, one can say that only
/// "incoming" edges to `r0` are desired, in which case one will /// "incoming" edges to `r0` are desired, in which case one will
/// get the set of regions `{r|r <= r0}`. This is used when /// get the set of regions `{r|r <= r0}`. This is used when
/// checking whether skolemized regions are being improperly /// checking whether placeholder regions are being improperly
/// related to other regions. /// related to other regions.
pub fn tainted( pub fn tainted(
&self, &self,
@ -897,9 +867,7 @@ impl<'tcx> RegionConstraintCollector<'tcx> {
) -> FxHashSet<ty::Region<'tcx>> { ) -> FxHashSet<ty::Region<'tcx>> {
debug!( debug!(
"tainted(mark={:?}, r0={:?}, directions={:?})", "tainted(mark={:?}, r0={:?}, directions={:?})",
mark, mark, r0, directions
r0,
directions
); );
// `result_set` acts as a worklist: we explore all outgoing // `result_set` acts as a worklist: we explore all outgoing

View File

@ -117,7 +117,7 @@ fn overlap<'cx, 'gcx, 'tcx>(selcx: &mut SelectionContext<'cx, 'gcx, 'tcx>,
{ {
debug!("overlap(a_def_id={:?}, b_def_id={:?})", a_def_id, b_def_id); debug!("overlap(a_def_id={:?}, b_def_id={:?})", a_def_id, b_def_id);
// For the purposes of this check, we don't bring any skolemized // For the purposes of this check, we don't bring any placeholder
// types into scope; instead, we replace the generic types with // types into scope; instead, we replace the generic types with
// fresh type variables, and hence we do our evaluations in an // fresh type variables, and hence we do our evaluations in an
// empty environment. // empty environment.

View File

@ -206,15 +206,15 @@ pub fn poly_project_and_unify_type<'cx, 'gcx, 'tcx>(
let infcx = selcx.infcx(); let infcx = selcx.infcx();
infcx.commit_if_ok(|snapshot| { infcx.commit_if_ok(|snapshot| {
let (skol_predicate, skol_map) = let (placeholder_predicate, placeholder_map) =
infcx.skolemize_late_bound_regions(&obligation.predicate); infcx.replace_late_bound_regions_with_placeholders(&obligation.predicate);
let skol_obligation = obligation.with(skol_predicate); let skol_obligation = obligation.with(placeholder_predicate);
let r = match project_and_unify_type(selcx, &skol_obligation) { let r = match project_and_unify_type(selcx, &skol_obligation) {
Ok(result) => { Ok(result) => {
let span = obligation.cause.span; let span = obligation.cause.span;
match infcx.leak_check(false, span, &skol_map, snapshot) { match infcx.leak_check(false, span, &placeholder_map, snapshot) {
Ok(()) => Ok(infcx.plug_leaks(skol_map, snapshot, result)), Ok(()) => Ok(infcx.plug_leaks(placeholder_map, snapshot, result)),
Err(e) => { Err(e) => {
debug!("poly_project_and_unify_type: leak check encountered error {:?}", e); debug!("poly_project_and_unify_type: leak check encountered error {:?}", e);
Err(MismatchedProjectionTypes { err: e }) Err(MismatchedProjectionTypes { err: e })
@ -1571,11 +1571,11 @@ fn assoc_ty_def<'cx, 'gcx, 'tcx>(
// # Cache // # Cache
/// The projection cache. Unlike the standard caches, this can /// The projection cache. Unlike the standard caches, this can include
/// include infcx-dependent type variables - therefore, we have to roll /// infcx-dependent type variables - therefore, we have to roll the
/// the cache back each time we roll a snapshot back, to avoid assumptions /// cache back each time we roll a snapshot back, to avoid assumptions
/// on yet-unresolved inference variables. Types with skolemized regions /// on yet-unresolved inference variables. Types with placeholder
/// also have to be removed when the respective snapshot ends. /// regions also have to be removed when the respective snapshot ends.
/// ///
/// Because of that, projection cache entries can be "stranded" and left /// Because of that, projection cache entries can be "stranded" and left
/// inaccessible when type variables inside the key are resolved. We make no /// inaccessible when type variables inside the key are resolved. We make no
@ -1661,7 +1661,7 @@ impl<'tcx> ProjectionCache<'tcx> {
self.map.rollback_to(&snapshot.snapshot); self.map.rollback_to(&snapshot.snapshot);
} }
pub fn rollback_skolemized(&mut self, snapshot: &ProjectionCacheSnapshot) { pub fn rollback_placeholder(&mut self, snapshot: &ProjectionCacheSnapshot) {
self.map.partial_rollback(&snapshot.snapshot, &|k| k.ty.has_re_skol()); self.map.partial_rollback(&snapshot.snapshot, &|k| k.ty.has_re_skol());
} }

File diff suppressed because it is too large

View File

@ -184,7 +184,7 @@ pub(super) fn specializes<'a, 'tcx>(tcx: TyCtxt<'a, 'tcx, 'tcx>,
return false; return false;
} }
// create a parameter environment corresponding to a (skolemized) instantiation of impl1 // create a parameter environment corresponding to a (placeholder) instantiation of impl1
let penv = tcx.param_env(impl1_def_id); let penv = tcx.param_env(impl1_def_id);
let impl1_trait_ref = tcx.impl_trait_ref(impl1_def_id).unwrap(); let impl1_trait_ref = tcx.impl_trait_ref(impl1_def_id).unwrap();

View File

@ -218,9 +218,9 @@ impl<'a, 'gcx, 'lcx, 'tcx> ty::TyS<'tcx> {
ty::Infer(ty::IntVar(_)) => "integral variable".to_string(), ty::Infer(ty::IntVar(_)) => "integral variable".to_string(),
ty::Infer(ty::FloatVar(_)) => "floating-point variable".to_string(), ty::Infer(ty::FloatVar(_)) => "floating-point variable".to_string(),
ty::Infer(ty::CanonicalTy(_)) | ty::Infer(ty::CanonicalTy(_)) |
ty::Infer(ty::FreshTy(_)) => "skolemized type".to_string(), ty::Infer(ty::FreshTy(_)) => "fresh type".to_string(),
ty::Infer(ty::FreshIntTy(_)) => "skolemized integral type".to_string(), ty::Infer(ty::FreshIntTy(_)) => "fresh integral type".to_string(),
ty::Infer(ty::FreshFloatTy(_)) => "skolemized floating-point type".to_string(), ty::Infer(ty::FreshFloatTy(_)) => "fresh floating-point type".to_string(),
ty::Projection(_) => "associated type".to_string(), ty::Projection(_) => "associated type".to_string(),
ty::UnnormalizedProjection(_) => "non-normalized associated type".to_string(), ty::UnnormalizedProjection(_) => "non-normalized associated type".to_string(),
ty::Param(ref p) => { ty::Param(ref p) => {

View File

@ -667,12 +667,14 @@ pub fn shift_regions<'a, 'gcx, 'tcx, T>(tcx: TyCtxt<'a, 'gcx, 'tcx>,
/// we already use the term "free region". It refers to the regions that we use to represent bound /// we already use the term "free region". It refers to the regions that we use to represent bound
/// regions on a fn definition while we are typechecking its body. /// regions on a fn definition while we are typechecking its body.
/// ///
/// To clarify, conceptually there is no particular difference between an "escaping" region and a /// To clarify, conceptually there is no particular difference between
/// "free" region. However, there is a big difference in practice. Basically, when "entering" a /// an "escaping" region and a "free" region. However, there is a big
/// binding level, one is generally required to do some sort of processing to a bound region, such /// difference in practice. Basically, when "entering" a binding
/// as replacing it with a fresh/skolemized region, or making an entry in the environment to /// level, one is generally required to do some sort of processing to
/// represent the scope to which it is attached, etc. An escaping region represents a bound region /// a bound region, such as replacing it with a fresh/placeholder
/// for which this processing has not yet been done. /// region, or making an entry in the environment to represent the
/// scope to which it is attached, etc. An escaping region represents
/// a bound region for which this processing has not yet been done.
struct HasEscapingRegionsVisitor { struct HasEscapingRegionsVisitor {
/// Anything bound by `outer_index` or "above" is escaping /// Anything bound by `outer_index` or "above" is escaping
outer_index: ty::DebruijnIndex, outer_index: ty::DebruijnIndex,

View File

@ -1479,18 +1479,17 @@ impl<'tcx> InstantiatedPredicates<'tcx> {
/// region `'a` is in a subuniverse U2 of U1, because we can name it /// region `'a` is in a subuniverse U2 of U1, because we can name it
/// inside the fn type but not outside. /// inside the fn type but not outside.
/// ///
/// Universes are related to **skolemization** -- which is a way of /// Universes are used to do type- and trait-checking around these
/// doing type- and trait-checking around these "forall" binders (also /// "forall" binders (also called **universal quantification**). The
/// called **universal quantification**). The idea is that when, in /// idea is that when, in the body of `bar`, we refer to `T` as a
/// the body of `bar`, we refer to `T` as a type, we aren't referring /// type, we aren't referring to any type in particular, but rather a
/// to any type in particular, but rather a kind of "fresh" type that /// kind of "fresh" type that is distinct from all other types we have
/// is distinct from all other types we have actually declared. This /// actually declared. This is called a **placeholder** type, and we
/// is called a **skolemized** type, and we use universes to talk /// use universes to talk about this. In other words, a type name in
/// about this. In other words, a type name in universe 0 always /// universe 0 always corresponds to some "ground" type that the user
/// corresponds to some "ground" type that the user declared, but a /// declared, but a type name in a non-zero universe is a placeholder
/// type name in a non-zero universe is a skolemized type -- an /// type -- an idealized representative of "types in general" that we
/// idealized representative of "types in general" that we use for /// use for checking generic functions.
/// checking generic functions.
#[derive(Copy, Clone, PartialEq, Eq, PartialOrd, Ord, Hash, RustcEncodable, RustcDecodable)] #[derive(Copy, Clone, PartialEq, Eq, PartialOrd, Ord, Hash, RustcEncodable, RustcDecodable)]
pub struct UniverseIndex(u32); pub struct UniverseIndex(u32);
@ -1553,6 +1552,18 @@ impl From<u32> for UniverseIndex {
} }
} }
/// The "placeholder index" fully defines a placeholder region.
/// Placeholder regions are identified by both a **universe** as well
/// as a "bound-region" within that universe. The `bound_region` is
/// basically a name -- distinct bound regions within the same
/// universe are just two regions with an unknown relationship to one
/// another.
#[derive(Copy, Clone, Debug, PartialEq, Eq, Hash, RustcEncodable, RustcDecodable, PartialOrd, Ord)]
pub struct Placeholder {
pub universe: UniverseIndex,
pub name: BoundRegion,
}
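
A standalone toy illustration of the identity rules this comment describes, treating universes as a simple chain of subuniverses as they are created here. These are not the compiler's types; `can_name` is an invented helper for the "can name it inside the fn type but not outside" relationship mentioned above.

```rust
// Toy types only; `can_name` is an invented helper.

#[derive(Copy, Clone, Debug, PartialEq, Eq, PartialOrd, Ord)]
struct UniverseIndex(u32);

impl UniverseIndex {
    const ROOT: UniverseIndex = UniverseIndex(0);

    fn subuniverse(self) -> UniverseIndex {
        UniverseIndex(self.0 + 1)
    }

    /// A name from universe `other` is visible in `self` only if `other`
    /// is `self` or one of its ancestors.
    fn can_name(self, other: UniverseIndex) -> bool {
        other <= self
    }
}

#[derive(Copy, Clone, Debug, PartialEq, Eq)]
struct BoundRegion(u32);

#[derive(Copy, Clone, Debug, PartialEq, Eq)]
struct Placeholder {
    universe: UniverseIndex,
    name: BoundRegion,
}

fn main() {
    let u1 = UniverseIndex::ROOT.subuniverse();
    let a = Placeholder { universe: u1, name: BoundRegion(0) };
    let b = Placeholder { universe: u1, name: BoundRegion(1) };
    // Same universe, different bound-region names: two distinct
    // placeholders with no known relationship to each other.
    assert_ne!(a, b);
    // The root universe cannot name the subuniverse's placeholders,
    // but the subuniverse can name everything from the root.
    assert!(!UniverseIndex::ROOT.can_name(u1));
    assert!(u1.can_name(UniverseIndex::ROOT));
}
```
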
/// When type checking, we use the `ParamEnv` to track /// When type checking, we use the `ParamEnv` to track
/// details about the set of where-clauses that are in scope at this /// details about the set of where-clauses that are in scope at this
/// particular point. /// particular point.

View File

@ -709,7 +709,7 @@ impl<'a, 'gcx, 'tcx> ExistentialTraitRef<'tcx> {
/// Object types don't have a self-type specified. Therefore, when /// Object types don't have a self-type specified. Therefore, when
/// we convert the principal trait-ref into a normal trait-ref, /// we convert the principal trait-ref into a normal trait-ref,
/// you must give *some* self-type. A common choice is `mk_err()` /// you must give *some* self-type. A common choice is `mk_err()`
/// or some skolemized type. /// or some placeholder type.
pub fn with_self_ty(&self, tcx: TyCtxt<'a, 'gcx, 'tcx>, self_ty: Ty<'tcx>) pub fn with_self_ty(&self, tcx: TyCtxt<'a, 'gcx, 'tcx>, self_ty: Ty<'tcx>)
-> ty::TraitRef<'tcx> { -> ty::TraitRef<'tcx> {
// otherwise the escaping regions would be captured by the binder // otherwise the escaping regions would be captured by the binder
@ -732,7 +732,7 @@ impl<'tcx> PolyExistentialTraitRef<'tcx> {
/// Object types don't have a self-type specified. Therefore, when /// Object types don't have a self-type specified. Therefore, when
/// we convert the principal trait-ref into a normal trait-ref, /// we convert the principal trait-ref into a normal trait-ref,
/// you must give *some* self-type. A common choice is `mk_err()` /// you must give *some* self-type. A common choice is `mk_err()`
/// or some skolemized type. /// or some placeholder type.
pub fn with_self_ty(&self, tcx: TyCtxt<'_, '_, 'tcx>, pub fn with_self_ty(&self, tcx: TyCtxt<'_, '_, 'tcx>,
self_ty: Ty<'tcx>) self_ty: Ty<'tcx>)
-> ty::PolyTraitRef<'tcx> { -> ty::PolyTraitRef<'tcx> {
@ -743,7 +743,7 @@ impl<'tcx> PolyExistentialTraitRef<'tcx> {
/// Binder is a binder for higher-ranked lifetimes. It is part of the /// Binder is a binder for higher-ranked lifetimes. It is part of the
/// compiler's representation for things like `for<'a> Fn(&'a isize)` /// compiler's representation for things like `for<'a> Fn(&'a isize)`
/// (which would be represented by the type `PolyTraitRef == /// (which would be represented by the type `PolyTraitRef ==
/// Binder<TraitRef>`). Note that when we skolemize, instantiate, /// Binder<TraitRef>`). Note that when we instantiate,
/// erase, or otherwise "discharge" these bound regions, we change the /// erase, or otherwise "discharge" these bound regions, we change the
/// type from `Binder<T>` to just `T` (see /// type from `Binder<T>` to just `T` (see
/// e.g. `liberate_late_bound_regions`). /// e.g. `liberate_late_bound_regions`).
@ -1066,10 +1066,10 @@ pub type Region<'tcx> = &'tcx RegionKind;
/// ///
/// Unlike Param-s, bound regions are not supposed to exist "in the wild" /// Unlike Param-s, bound regions are not supposed to exist "in the wild"
/// outside their binder, e.g. in types passed to type inference, and /// outside their binder, e.g. in types passed to type inference, and
/// should first be substituted (by skolemized regions, free regions, /// should first be substituted (by placeholder regions, free regions,
/// or region variables). /// or region variables).
/// ///
/// ## Skolemized and Free Regions /// ## Placeholder and Free Regions
/// ///
/// One often wants to work with bound regions without knowing their precise /// One often wants to work with bound regions without knowing their precise
/// identity. For example, when checking a function, the lifetime of a borrow /// identity. For example, when checking a function, the lifetime of a borrow
@ -1077,12 +1077,11 @@ pub type Region<'tcx> = &'tcx RegionKind;
/// it must be ensured that bounds on the region can't be accidentally /// it must be ensured that bounds on the region can't be accidentally
/// assumed without being checked. /// assumed without being checked.
/// ///
/// The process of doing that is called "skolemization". The bound regions /// To do this, we replace the bound regions with placeholder markers,
/// are replaced by skolemized markers, which don't satisfy any relation /// which don't satisfy any relation not explicitly provided.
/// not explicitly provided.
/// ///
/// There are 2 kinds of skolemized regions in rustc: `ReFree` and /// There are 2 kinds of placeholder regions in rustc: `ReFree` and
/// `ReSkolemized`. When checking an item's body, `ReFree` is supposed /// `RePlaceholder`. When checking an item's body, `ReFree` is supposed
/// to be used. These also support explicit bounds: both the internally-stored /// to be used. These also support explicit bounds: both the internally-stored
/// *scope*, which the region is assumed to outlive, as well as other /// *scope*, which the region is assumed to outlive, as well as other
/// relations stored in the `FreeRegionMap`. Note that these relations /// relations stored in the `FreeRegionMap`. Note that these relations
@ -1091,14 +1090,14 @@ pub type Region<'tcx> = &'tcx RegionKind;
/// ///
/// When working with higher-ranked types, some region relations aren't /// When working with higher-ranked types, some region relations aren't
/// yet known, so you can't just call `resolve_regions_and_report_errors`. /// yet known, so you can't just call `resolve_regions_and_report_errors`.
/// `ReSkolemized` is designed for this purpose. In these contexts, /// `RePlaceholder` is designed for this purpose. In these contexts,
/// there's also the risk that some inference variable lying around will /// there's also the risk that some inference variable lying around will
/// get unified with your skolemized region: if you want to check whether /// get unified with your placeholder region: if you want to check whether
/// `for<'a> Foo<'_>: 'a`, and you substitute your bound region `'a` /// `for<'a> Foo<'_>: 'a`, and you substitute your bound region `'a`
/// with a skolemized region `'%a`, the variable `'_` would just be /// with a placeholder region `'%a`, the variable `'_` would just be
/// instantiated to the skolemized region `'%a`, which is wrong because /// instantiated to the placeholder region `'%a`, which is wrong because
/// the inference variable is supposed to satisfy the relation /// the inference variable is supposed to satisfy the relation
/// *for every value of the skolemized region*. To ensure that doesn't /// *for every value of the placeholder region*. To ensure that doesn't
/// happen, you can use `leak_check`. This is more clearly explained /// happen, you can use `leak_check`. This is more clearly explained
/// by the [rustc guide]. /// by the [rustc guide].
/// ///
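As a surface-level illustration of the situation this comment describes (not taken from this diff): proving a `for<'a> ...` obligation replaces `'a` with a placeholder region, and the rest of the proof must then go through for *every* such placeholder, not for one conveniently chosen region. The `Outlives` trait below is hypothetical and exists only for the example.

    trait Outlives<'a> {}
    impl<'a, 'b: 'a> Outlives<'a> for &'b u32 {}

    // Requires the bound to hold for *every* 'a -- checked by replacing
    // `'a` with a placeholder region and proving the impl applies to it.
    fn check<T: for<'a> Outlives<'a>>() {}

    fn main() {
        // `'static` outlives any placeholder chosen for `'a`, so this is fine.
        check::<&'static u32>();

        // For some shorter region `'x`, `check::<&'x u32>()` would demand
        // `'x: '%a` for every placeholder `'%a`, i.e. effectively
        // `'x: 'static`, so it would be rejected.
    }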
@ -1132,9 +1131,9 @@ pub enum RegionKind {
/// A region variable. Should not exist after typeck. /// A region variable. Should not exist after typeck.
ReVar(RegionVid), ReVar(RegionVid),
/// A skolemized region - basically the higher-ranked version of ReFree. /// A placeholder region - basically the higher-ranked version of ReFree.
/// Should not exist after typeck. /// Should not exist after typeck.
ReSkolemized(ty::UniverseIndex, BoundRegion), RePlaceholder(ty::Placeholder),
/// Empty lifetime is for data that is never accessed. /// Empty lifetime is for data that is never accessed.
/// Bottom in the region lattice. We treat ReEmpty somewhat /// Bottom in the region lattice. We treat ReEmpty somewhat
@ -1338,7 +1337,7 @@ impl RegionKind {
RegionKind::ReScope(..) => false, RegionKind::ReScope(..) => false,
RegionKind::ReStatic => true, RegionKind::ReStatic => true,
RegionKind::ReVar(..) => false, RegionKind::ReVar(..) => false,
RegionKind::ReSkolemized(_, br) => br.is_named(), RegionKind::RePlaceholder(placeholder) => placeholder.name.is_named(),
RegionKind::ReEmpty => false, RegionKind::ReEmpty => false,
RegionKind::ReErased => false, RegionKind::ReErased => false,
RegionKind::ReClosureBound(..) => false, RegionKind::ReClosureBound(..) => false,
@ -1410,7 +1409,7 @@ impl RegionKind {
flags = flags | TypeFlags::HAS_FREE_REGIONS; flags = flags | TypeFlags::HAS_FREE_REGIONS;
flags = flags | TypeFlags::HAS_RE_INFER; flags = flags | TypeFlags::HAS_RE_INFER;
} }
ty::ReSkolemized(..) => { ty::RePlaceholder(..) => {
flags = flags | TypeFlags::HAS_FREE_REGIONS; flags = flags | TypeFlags::HAS_FREE_REGIONS;
flags = flags | TypeFlags::HAS_RE_SKOL; flags = flags | TypeFlags::HAS_RE_SKOL;
} }

View File

@ -520,7 +520,7 @@ pub fn object_region_bounds<'a, 'gcx, 'tcx>(
{ {
// Since we don't actually *know* the self type for an object, // Since we don't actually *know* the self type for an object,
// this "open(err)" serves as a kind of dummy standin -- basically // this "open(err)" serves as a kind of dummy standin -- basically
// a skolemized type. // a placeholder type.
let open_ty = tcx.mk_infer(ty::FreshTy(0)); let open_ty = tcx.mk_infer(ty::FreshTy(0));
let predicates = existential_predicates.iter().filter_map(|predicate| { let predicates = existential_predicates.iter().filter_map(|predicate| {

View File

@ -803,7 +803,7 @@ define_print! {
} }
ty::ReLateBound(_, br) | ty::ReLateBound(_, br) |
ty::ReFree(ty::FreeRegion { bound_region: br, .. }) | ty::ReFree(ty::FreeRegion { bound_region: br, .. }) |
ty::ReSkolemized(_, br) => { ty::RePlaceholder(ty::Placeholder { name: br, .. }) => {
write!(f, "{}", br) write!(f, "{}", br)
} }
ty::ReScope(scope) if cx.identify_regions => { ty::ReScope(scope) if cx.identify_regions => {
@ -872,8 +872,8 @@ define_print! {
write!(f, "'?{}", c.index()) write!(f, "'?{}", c.index())
} }
ty::ReSkolemized(universe, ref bound_region) => { ty::RePlaceholder(placeholder) => {
write!(f, "ReSkolemized({:?}, {:?})", universe, bound_region) write!(f, "RePlaceholder({:?})", placeholder)
} }
ty::ReEmpty => write!(f, "ReEmpty"), ty::ReEmpty => write!(f, "ReEmpty"),

View File

@ -426,7 +426,7 @@ impl<'a, 'tcx> CheckLoanCtxt<'a, 'tcx> {
// These cannot exist in borrowck // These cannot exist in borrowck
RegionKind::ReVar(..) | RegionKind::ReVar(..) |
RegionKind::ReCanonical(..) | RegionKind::ReCanonical(..) |
RegionKind::ReSkolemized(..) | RegionKind::RePlaceholder(..) |
RegionKind::ReClosureBound(..) | RegionKind::ReClosureBound(..) |
RegionKind::ReErased => span_bug!(borrow_span, RegionKind::ReErased => span_bug!(borrow_span,
"unexpected region in borrowck {:?}", "unexpected region in borrowck {:?}",

View File

@ -368,7 +368,7 @@ impl<'a, 'tcx> GatherLoanCtxt<'a, 'tcx> {
ty::ReClosureBound(..) | ty::ReClosureBound(..) |
ty::ReLateBound(..) | ty::ReLateBound(..) |
ty::ReVar(..) | ty::ReVar(..) |
ty::ReSkolemized(..) | ty::RePlaceholder(..) |
ty::ReErased => { ty::ReErased => {
span_bug!( span_bug!(
cmt.span, cmt.span,

View File

@ -1789,9 +1789,11 @@ impl<'tcx> AnnotatedBorrowFnSignature<'tcx> {
// lifetimes without names with the value `'0`. // lifetimes without names with the value `'0`.
match ty.sty { match ty.sty {
ty::TyKind::Ref(ty::RegionKind::ReLateBound(_, br), _, _) ty::TyKind::Ref(ty::RegionKind::ReLateBound(_, br), _, _)
| ty::TyKind::Ref(ty::RegionKind::ReSkolemized(_, br), _, _) => { | ty::TyKind::Ref(
with_highlight_region_for_bound_region(*br, counter, || format!("{}", ty)) ty::RegionKind::RePlaceholder(ty::Placeholder { name: br, .. }),
} _,
_,
) => with_highlight_region_for_bound_region(*br, counter, || format!("{}", ty)),
_ => format!("{}", ty), _ => format!("{}", ty),
} }
} }
@ -1801,7 +1803,8 @@ impl<'tcx> AnnotatedBorrowFnSignature<'tcx> {
fn get_region_name_for_ty(&self, ty: ty::Ty<'tcx>, counter: usize) -> String { fn get_region_name_for_ty(&self, ty: ty::Ty<'tcx>, counter: usize) -> String {
match ty.sty { match ty.sty {
ty::TyKind::Ref(region, _, _) => match region { ty::TyKind::Ref(region, _, _) => match region {
ty::RegionKind::ReLateBound(_, br) | ty::RegionKind::ReSkolemized(_, br) => { ty::RegionKind::ReLateBound(_, br)
| ty::RegionKind::RePlaceholder(ty::Placeholder { name: br, .. }) => {
with_highlight_region_for_bound_region(*br, counter, || format!("{}", region)) with_highlight_region_for_bound_region(*br, counter, || format!("{}", region))
} }
_ => format!("{}", region), _ => format!("{}", region),

View File

@ -107,6 +107,7 @@ pub(in borrow_check) fn compute_regions<'cx, 'gcx, 'tcx>(
// Run the MIR type-checker. // Run the MIR type-checker.
let MirTypeckResults { let MirTypeckResults {
constraints, constraints,
placeholder_indices,
universal_region_relations, universal_region_relations,
} = type_check::type_check( } = type_check::type_check(
infcx, infcx,
@ -122,6 +123,8 @@ pub(in borrow_check) fn compute_regions<'cx, 'gcx, 'tcx>(
elements, elements,
); );
let placeholder_indices = Rc::new(placeholder_indices);
if let Some(all_facts) = &mut all_facts { if let Some(all_facts) = &mut all_facts {
all_facts all_facts
.universal_region .universal_region
@ -150,6 +153,7 @@ pub(in borrow_check) fn compute_regions<'cx, 'gcx, 'tcx>(
let mut regioncx = RegionInferenceContext::new( let mut regioncx = RegionInferenceContext::new(
var_origins, var_origins,
universal_regions, universal_regions,
placeholder_indices,
universal_region_relations, universal_region_relations,
mir, mir,
outlives_constraints, outlives_constraints,

View File

@ -274,7 +274,7 @@ impl<'tcx> RegionInferenceContext<'tcx> {
ty::ReLateBound(..) ty::ReLateBound(..)
| ty::ReScope(..) | ty::ReScope(..)
| ty::ReVar(..) | ty::ReVar(..)
| ty::ReSkolemized(..) | ty::RePlaceholder(..)
| ty::ReEmpty | ty::ReEmpty
| ty::ReErased | ty::ReErased
| ty::ReClosureBound(..) | ty::ReClosureBound(..)

View File

@ -11,7 +11,7 @@
use super::universal_regions::UniversalRegions; use super::universal_regions::UniversalRegions;
use borrow_check::nll::constraints::graph::NormalConstraintGraph; use borrow_check::nll::constraints::graph::NormalConstraintGraph;
use borrow_check::nll::constraints::{ConstraintSccIndex, ConstraintSet, OutlivesConstraint}; use borrow_check::nll::constraints::{ConstraintSccIndex, ConstraintSet, OutlivesConstraint};
use borrow_check::nll::region_infer::values::{RegionElement, ToElementIndex}; use borrow_check::nll::region_infer::values::{PlaceholderIndices, RegionElement, ToElementIndex};
use borrow_check::nll::type_check::free_region_relations::UniversalRegionRelations; use borrow_check::nll::type_check::free_region_relations::UniversalRegionRelations;
use borrow_check::nll::type_check::Locations; use borrow_check::nll::type_check::Locations;
use rustc::hir::def_id::DefId; use rustc::hir::def_id::DefId;
@ -183,6 +183,7 @@ impl<'tcx> RegionInferenceContext<'tcx> {
pub(crate) fn new( pub(crate) fn new(
var_infos: VarInfos, var_infos: VarInfos,
universal_regions: Rc<UniversalRegions<'tcx>>, universal_regions: Rc<UniversalRegions<'tcx>>,
placeholder_indices: Rc<PlaceholderIndices>,
universal_region_relations: Rc<UniversalRegionRelations<'tcx>>, universal_region_relations: Rc<UniversalRegionRelations<'tcx>>,
_mir: &Mir<'tcx>, _mir: &Mir<'tcx>,
outlives_constraints: ConstraintSet, outlives_constraints: ConstraintSet,
@ -196,19 +197,13 @@ impl<'tcx> RegionInferenceContext<'tcx> {
.map(|info| RegionDefinition::new(info.universe, info.origin)) .map(|info| RegionDefinition::new(info.universe, info.origin))
.collect(); .collect();
// Compute the max universe used anywhere amongst the regions.
let max_universe = definitions
.iter()
.map(|d| d.universe)
.max()
.unwrap_or(ty::UniverseIndex::ROOT);
let constraints = Rc::new(outlives_constraints); // freeze constraints let constraints = Rc::new(outlives_constraints); // freeze constraints
let constraint_graph = Rc::new(constraints.graph(definitions.len())); let constraint_graph = Rc::new(constraints.graph(definitions.len()));
let fr_static = universal_regions.fr_static; let fr_static = universal_regions.fr_static;
let constraint_sccs = Rc::new(constraints.compute_sccs(&constraint_graph, fr_static)); let constraint_sccs = Rc::new(constraints.compute_sccs(&constraint_graph, fr_static));
let mut scc_values = RegionValues::new(elements, universal_regions.len(), max_universe); let mut scc_values =
RegionValues::new(elements, universal_regions.len(), &placeholder_indices);
for region in liveness_constraints.rows() { for region in liveness_constraints.rows() {
let scc = constraint_sccs.scc(region); let scc = constraint_sccs.scc(region);
@ -329,17 +324,14 @@ impl<'tcx> RegionInferenceContext<'tcx> {
self.scc_values.add_element(scc, variable); self.scc_values.add_element(scc, variable);
} }
NLLRegionVariableOrigin::BoundRegion(ui) => { NLLRegionVariableOrigin::Placeholder(placeholder) => {
// Each placeholder region X outlives its // Each placeholder region is only visible from
// associated universe but nothing else. Every // its universe `ui` and its superuniverses. So we
// placeholder region is always in a universe that // can't just add it into `scc` unless the
// contains `ui` -- but when placeholder regions // universe of the scc can name this region.
// are placed into an SCC, that SCC may include
// things from other universes that do not include
// `ui`.
let scc_universe = self.scc_universes[scc]; let scc_universe = self.scc_universes[scc];
if ui.is_subset_of(scc_universe) { if placeholder.universe.is_subset_of(scc_universe) {
self.scc_values.add_element(scc, ui); self.scc_values.add_element(scc, placeholder);
} else { } else {
self.add_incompatible_universe(scc); self.add_incompatible_universe(scc);
} }
@ -544,8 +536,8 @@ impl<'tcx> RegionInferenceContext<'tcx> {
// B's value, and check whether all of them are nameable // B's value, and check whether all of them are nameable
// from universe_a // from universe_a
self.scc_values self.scc_values
.subuniverses_contained_in(scc_b) .placeholders_contained_in(scc_b)
.all(|u| u.is_subset_of(universe_a)) .all(|p| p.universe.is_subset_of(universe_a))
} }
/// Extend `scc` so that it can outlive some placeholder region /// Extend `scc` so that it can outlive some placeholder region
@ -1076,8 +1068,8 @@ impl<'tcx> RegionInferenceContext<'tcx> {
); );
} }
NLLRegionVariableOrigin::BoundRegion(universe) => { NLLRegionVariableOrigin::Placeholder(placeholder) => {
self.check_bound_universal_region(infcx, mir, mir_def_id, fr, universe); self.check_bound_universal_region(infcx, mir, mir_def_id, fr, placeholder);
} }
NLLRegionVariableOrigin::Existential => { NLLRegionVariableOrigin::Existential => {
@ -1113,7 +1105,7 @@ impl<'tcx> RegionInferenceContext<'tcx> {
assert!(self.scc_universes[longer_fr_scc] == ty::UniverseIndex::ROOT); assert!(self.scc_universes[longer_fr_scc] == ty::UniverseIndex::ROOT);
debug_assert!( debug_assert!(
self.scc_values self.scc_values
.subuniverses_contained_in(longer_fr_scc) .placeholders_contained_in(longer_fr_scc)
.next() .next()
.is_none() .is_none()
); );
@ -1181,9 +1173,12 @@ impl<'tcx> RegionInferenceContext<'tcx> {
mir: &Mir<'tcx>, mir: &Mir<'tcx>,
_mir_def_id: DefId, _mir_def_id: DefId,
longer_fr: RegionVid, longer_fr: RegionVid,
universe: ty::UniverseIndex, placeholder: ty::Placeholder,
) { ) {
debug!("check_bound_universal_region(fr={:?})", longer_fr); debug!(
"check_bound_universal_region(fr={:?}, placeholder={:?})",
longer_fr, placeholder,
);
let longer_fr_scc = self.constraint_sccs.scc(longer_fr); let longer_fr_scc = self.constraint_sccs.scc(longer_fr);
@ -1196,7 +1191,7 @@ impl<'tcx> RegionInferenceContext<'tcx> {
.find(|element| match element { .find(|element| match element {
RegionElement::Location(_) => true, RegionElement::Location(_) => true,
RegionElement::RootUniversalRegion(_) => true, RegionElement::RootUniversalRegion(_) => true,
RegionElement::SubUniversalRegion(ui) => *ui != universe, RegionElement::PlaceholderRegion(placeholder1) => placeholder != *placeholder1,
}) })
} { } {
Some(v) => v, Some(v) => v,
@ -1207,10 +1202,10 @@ impl<'tcx> RegionInferenceContext<'tcx> {
let error_region = match error_element { let error_region = match error_element {
RegionElement::Location(l) => self.find_sub_region_live_at(longer_fr, l), RegionElement::Location(l) => self.find_sub_region_live_at(longer_fr, l),
RegionElement::RootUniversalRegion(r) => r, RegionElement::RootUniversalRegion(r) => r,
RegionElement::SubUniversalRegion(error_ui) => self.definitions RegionElement::PlaceholderRegion(error_placeholder) => self.definitions
.iter_enumerated() .iter_enumerated()
.filter_map(|(r, definition)| match definition.origin { .filter_map(|(r, definition)| match definition.origin {
NLLRegionVariableOrigin::BoundRegion(ui) if error_ui == ui => Some(r), NLLRegionVariableOrigin::Placeholder(p) if p == error_placeholder => Some(r),
_ => None, _ => None,
}) })
.next() .next()

View File

@ -11,6 +11,7 @@
use rustc::mir::{BasicBlock, Location, Mir}; use rustc::mir::{BasicBlock, Location, Mir};
use rustc::ty::{self, RegionVid}; use rustc::ty::{self, RegionVid};
use rustc_data_structures::bit_set::{HybridBitSet, SparseBitMatrix}; use rustc_data_structures::bit_set::{HybridBitSet, SparseBitMatrix};
use rustc_data_structures::fx::FxHashMap;
use rustc_data_structures::indexed_vec::Idx; use rustc_data_structures::indexed_vec::Idx;
use rustc_data_structures::indexed_vec::IndexVec; use rustc_data_structures::indexed_vec::IndexVec;
use std::fmt::Debug; use std::fmt::Debug;
@ -31,8 +32,7 @@ crate struct RegionValueElements {
impl RegionValueElements { impl RegionValueElements {
crate fn new(mir: &Mir<'_>) -> Self { crate fn new(mir: &Mir<'_>) -> Self {
let mut num_points = 0; let mut num_points = 0;
let statements_before_block: IndexVec<BasicBlock, usize> = mir let statements_before_block: IndexVec<BasicBlock, usize> = mir.basic_blocks()
.basic_blocks()
.iter() .iter()
.map(|block_data| { .map(|block_data| {
let v = num_points; let v = num_points;
@ -48,7 +48,7 @@ impl RegionValueElements {
let mut basic_blocks = IndexVec::with_capacity(num_points); let mut basic_blocks = IndexVec::with_capacity(num_points);
for (bb, bb_data) in mir.basic_blocks().iter_enumerated() { for (bb, bb_data) in mir.basic_blocks().iter_enumerated() {
basic_blocks.extend((0 .. bb_data.statements.len() + 1).map(|_| bb)); basic_blocks.extend((0..bb_data.statements.len() + 1).map(|_| bb));
} }
Self { Self {
@ -85,7 +85,10 @@ impl RegionValueElements {
let block = self.basic_blocks[index]; let block = self.basic_blocks[index];
let start_index = self.statements_before_block[block]; let start_index = self.statements_before_block[block];
let statement_index = index.index() - start_index; let statement_index = index.index() - start_index;
Location { block, statement_index } Location {
block,
statement_index,
}
} }
/// Sometimes we get point-indices back from bitsets that may be /// Sometimes we get point-indices back from bitsets that may be
@ -103,13 +106,15 @@ impl RegionValueElements {
index: PointIndex, index: PointIndex,
stack: &mut Vec<PointIndex>, stack: &mut Vec<PointIndex>,
) { ) {
let Location { block, statement_index } = self.to_location(index); let Location {
block,
statement_index,
} = self.to_location(index);
if statement_index == 0 { if statement_index == 0 {
// If this is a basic block head, then the predecessors are // If this is a basic block head, then the predecessors are
// the terminators of other basic blocks // the terminators of other basic blocks
stack.extend( stack.extend(
mir mir.predecessors_for(block)
.predecessors_for(block)
.iter() .iter()
.map(|&pred_bb| mir.terminator_loc(pred_bb)) .map(|&pred_bb| mir.terminator_loc(pred_bb))
.map(|pred_loc| self.point_from_location(pred_loc)), .map(|pred_loc| self.point_from_location(pred_loc)),
@ -127,10 +132,7 @@ newtype_index! {
pub struct PointIndex { DEBUG_FORMAT = "PointIndex({})" } pub struct PointIndex { DEBUG_FORMAT = "PointIndex({})" }
} }
/// A single integer representing a (non-zero) `UniverseIndex`. /// A single integer representing a `ty::Placeholder`.
/// Computed just by subtracting one from `UniverseIndex`; this is
/// because the `0` value for `UniverseIndex` represents the root
/// universe, and we don't need/want a bit for that one.
newtype_index! { newtype_index! {
pub struct PlaceholderIndex { DEBUG_FORMAT = "PlaceholderIndex({})" } pub struct PlaceholderIndex { DEBUG_FORMAT = "PlaceholderIndex({})" }
} }
@ -148,7 +150,7 @@ crate enum RegionElement {
/// A subuniverse from a subuniverse (e.g., instantiated from a /// A subuniverse from a subuniverse (e.g., instantiated from a
/// `for<'a> fn(&'a u32)` type). /// `for<'a> fn(&'a u32)` type).
SubUniversalRegion(ty::UniverseIndex), PlaceholderRegion(ty::Placeholder),
} }
/// When we initially compute liveness, we use a bit matrix storing /// When we initially compute liveness, we use a bit matrix storing
@ -185,7 +187,10 @@ impl<N: Idx> LivenessValues<N> {
/// Adds all the elements in the given bit array into the given /// Adds all the elements in the given bit array into the given
/// region. Returns true if any of them are newly added. /// region. Returns true if any of them are newly added.
crate fn add_elements(&mut self, row: N, locations: &HybridBitSet<PointIndex>) -> bool { crate fn add_elements(&mut self, row: N, locations: &HybridBitSet<PointIndex>) -> bool {
debug!("LivenessValues::add_elements(row={:?}, locations={:?})", row, locations); debug!(
"LivenessValues::add_elements(row={:?}, locations={:?})",
row, locations
);
self.points.union_into_row(row, locations) self.points.union_into_row(row, locations)
} }
@ -214,6 +219,52 @@ impl<N: Idx> LivenessValues<N> {
} }
} }
/// Maps from `ty::Placeholder` values that are used in the rest of
/// rustc to the internal `PlaceholderIndex` values that are used in
/// NLL.
#[derive(Default)]
crate struct PlaceholderIndices {
to_index: FxHashMap<ty::Placeholder, PlaceholderIndex>,
from_index: IndexVec<PlaceholderIndex, ty::Placeholder>,
}
impl PlaceholderIndices {
crate fn insert(&mut self, placeholder: ty::Placeholder) -> PlaceholderIndex {
let PlaceholderIndices {
to_index,
from_index,
} = self;
*to_index
.entry(placeholder)
.or_insert_with(|| from_index.push(placeholder))
}
crate fn lookup_index(&self, placeholder: ty::Placeholder) -> PlaceholderIndex {
self.to_index[&placeholder]
}
crate fn lookup_placeholder(&self, placeholder: PlaceholderIndex) -> ty::Placeholder {
self.from_index[placeholder]
}
crate fn len(&self) -> usize {
self.from_index.len()
}
}
impl ::std::iter::FromIterator<ty::Placeholder> for PlaceholderIndices {
fn from_iter<I>(iter: I) -> Self
where
I: IntoIterator<Item = ty::Placeholder>,
{
let mut result = Self::default();
iter.into_iter().for_each(|p| {
result.insert(p);
});
result
}
}
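The interning scheme above (a hash map for placeholder-to-index plus an index vector for the reverse direction) can be exercised on its own. The sketch below uses plain `HashMap`/`Vec` and a simplified `Placeholder` in place of rustc's `FxHashMap`/`IndexVec`/`ty::Placeholder`, so the names are illustrative only.

    use std::collections::HashMap;

    type PlaceholderIndex = usize;
    type Placeholder = (u32, u32); // (universe, bound-region name), simplified

    #[derive(Default)]
    struct PlaceholderIndices {
        to_index: HashMap<Placeholder, PlaceholderIndex>,
        from_index: Vec<Placeholder>,
    }

    impl PlaceholderIndices {
        // Intern `p` if it is new; either way, return its dense index.
        fn insert(&mut self, p: Placeholder) -> PlaceholderIndex {
            let PlaceholderIndices { to_index, from_index } = self;
            *to_index.entry(p).or_insert_with(|| {
                from_index.push(p);
                from_index.len() - 1
            })
        }
    }

    fn main() {
        let mut indices = PlaceholderIndices::default();
        let a = indices.insert((1, 0));
        let b = indices.insert((1, 1));
        assert_eq!(indices.insert((1, 0)), a); // re-inserting is a no-op
        assert_eq!((a, b), (0, 1));
        assert_eq!(indices.from_index[b], (1, 1)); // reverse lookup
    }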
/// Stores the full values for a set of regions (in contrast to /// Stores the full values for a set of regions (in contrast to
/// `LivenessValues`, which only stores those points where a /// `LivenessValues`, which only stores those points where a
/// region is live). The full value for a region may contain points in /// region is live). The full value for a region may contain points in
@ -235,6 +286,7 @@ impl<N: Idx> LivenessValues<N> {
#[derive(Clone)] #[derive(Clone)]
crate struct RegionValues<N: Idx> { crate struct RegionValues<N: Idx> {
elements: Rc<RegionValueElements>, elements: Rc<RegionValueElements>,
placeholder_indices: Rc<PlaceholderIndices>,
points: SparseBitMatrix<N, PointIndex>, points: SparseBitMatrix<N, PointIndex>,
free_regions: SparseBitMatrix<N, RegionVid>, free_regions: SparseBitMatrix<N, RegionVid>,
@ -250,12 +302,13 @@ impl<N: Idx> RegionValues<N> {
crate fn new( crate fn new(
elements: &Rc<RegionValueElements>, elements: &Rc<RegionValueElements>,
num_universal_regions: usize, num_universal_regions: usize,
max_universe: ty::UniverseIndex, placeholder_indices: &Rc<PlaceholderIndices>,
) -> Self { ) -> Self {
let num_placeholders = max_universe.as_usize(); let num_placeholders = placeholder_indices.len();
Self { Self {
elements: elements.clone(), elements: elements.clone(),
points: SparseBitMatrix::new(elements.num_points), points: SparseBitMatrix::new(elements.num_points),
placeholder_indices: placeholder_indices.clone(),
free_regions: SparseBitMatrix::new(num_universal_regions), free_regions: SparseBitMatrix::new(num_universal_regions),
placeholders: SparseBitMatrix::new(num_placeholders), placeholders: SparseBitMatrix::new(num_placeholders),
} }
@ -313,10 +366,7 @@ impl<N: Idx> RegionValues<N> {
/// Returns the locations contained within a given region `r`. /// Returns the locations contained within a given region `r`.
crate fn locations_outlived_by<'a>(&'a self, r: N) -> impl Iterator<Item = Location> + 'a { crate fn locations_outlived_by<'a>(&'a self, r: N) -> impl Iterator<Item = Location> + 'a {
self.points self.points.row(r).into_iter().flat_map(move |set| {
.row(r)
.into_iter()
.flat_map(move |set| {
set.iter() set.iter()
.take_while(move |&p| self.elements.point_in_range(p)) .take_while(move |&p| self.elements.point_in_range(p))
.map(move |p| self.elements.to_location(p)) .map(move |p| self.elements.to_location(p))
@ -335,32 +385,30 @@ impl<N: Idx> RegionValues<N> {
} }
/// Returns all the elements contained in a given region's value. /// Returns all the elements contained in a given region's value.
crate fn subuniverses_contained_in<'a>( crate fn placeholders_contained_in<'a>(
&'a self, &'a self,
r: N, r: N,
) -> impl Iterator<Item = ty::UniverseIndex> + 'a { ) -> impl Iterator<Item = ty::Placeholder> + 'a {
self.placeholders self.placeholders
.row(r) .row(r)
.into_iter() .into_iter()
.flat_map(|set| set.iter()) .flat_map(|set| set.iter())
.map(|p| ty::UniverseIndex::from_u32((p.index() + 1) as u32)) .map(move |p| self.placeholder_indices.lookup_placeholder(p))
} }
/// Returns all the elements contained in a given region's value. /// Returns all the elements contained in a given region's value.
crate fn elements_contained_in<'a>(&'a self, r: N) -> impl Iterator<Item = RegionElement> + 'a { crate fn elements_contained_in<'a>(&'a self, r: N) -> impl Iterator<Item = RegionElement> + 'a {
let points_iter = self.locations_outlived_by(r).map(RegionElement::Location); let points_iter = self.locations_outlived_by(r).map(RegionElement::Location);
let free_regions_iter = self let free_regions_iter = self.universal_regions_outlived_by(r)
.universal_regions_outlived_by(r)
.map(RegionElement::RootUniversalRegion); .map(RegionElement::RootUniversalRegion);
let subuniverses_iter = self let placeholder_universes_iter = self.placeholders_contained_in(r)
.subuniverses_contained_in(r) .map(RegionElement::PlaceholderRegion);
.map(RegionElement::SubUniversalRegion);
points_iter points_iter
.chain(free_regions_iter) .chain(free_regions_iter)
.chain(subuniverses_iter) .chain(placeholder_universes_iter)
} }
/// Returns a "pretty" string value of the region. Meant for debugging. /// Returns a "pretty" string value of the region. Meant for debugging.
@ -397,14 +445,14 @@ impl ToElementIndex for RegionVid {
} }
} }
impl ToElementIndex for ty::UniverseIndex { impl ToElementIndex for ty::Placeholder {
fn add_to_row<N: Idx>(self, values: &mut RegionValues<N>, row: N) -> bool { fn add_to_row<N: Idx>(self, values: &mut RegionValues<N>, row: N) -> bool {
let index = PlaceholderIndex::new(self.as_usize() - 1); let index = values.placeholder_indices.lookup_index(self);
values.placeholders.insert(row, index) values.placeholders.insert(row, index)
} }
fn contained_in_row<N: Idx>(self, values: &RegionValues<N>, row: N) -> bool { fn contained_in_row<N: Idx>(self, values: &RegionValues<N>, row: N) -> bool {
let index = PlaceholderIndex::new(self.as_usize() - 1); let index = values.placeholder_indices.lookup_index(self);
values.placeholders.contains(row, index) values.placeholders.contains(row, index)
} }
} }
@ -467,7 +515,7 @@ fn region_value_str(elements: impl IntoIterator<Item = RegionElement>) -> String
result.push_str(&format!("{:?}", fr)); result.push_str(&format!("{:?}", fr));
} }
RegionElement::SubUniversalRegion(ur) => { RegionElement::PlaceholderRegion(placeholder) => {
if let Some((location1, location2)) = open_location { if let Some((location1, location2)) = open_location {
push_sep(&mut result); push_sep(&mut result);
push_location_range(&mut result, location1, location2); push_location_range(&mut result, location1, location2);
@ -475,7 +523,7 @@ fn region_value_str(elements: impl IntoIterator<Item = RegionElement>) -> String
} }
push_sep(&mut result); push_sep(&mut result);
result.push_str(&format!("{:?}", ur)); result.push_str(&format!("{:?}", placeholder));
} }
} }
} }

View File

@ -15,7 +15,9 @@ use borrow_check::borrow_set::BorrowSet;
use borrow_check::location::LocationTable; use borrow_check::location::LocationTable;
use borrow_check::nll::constraints::{ConstraintCategory, ConstraintSet, OutlivesConstraint}; use borrow_check::nll::constraints::{ConstraintCategory, ConstraintSet, OutlivesConstraint};
use borrow_check::nll::facts::AllFacts; use borrow_check::nll::facts::AllFacts;
use borrow_check::nll::region_infer::values::{LivenessValues, RegionValueElements}; use borrow_check::nll::region_infer::values::LivenessValues;
use borrow_check::nll::region_infer::values::PlaceholderIndices;
use borrow_check::nll::region_infer::values::RegionValueElements;
use borrow_check::nll::region_infer::{ClosureRegionRequirementsExt, TypeTest}; use borrow_check::nll::region_infer::{ClosureRegionRequirementsExt, TypeTest};
use borrow_check::nll::renumber; use borrow_check::nll::renumber;
use borrow_check::nll::type_check::free_region_relations::{ use borrow_check::nll::type_check::free_region_relations::{
@ -42,13 +44,13 @@ use rustc::traits::{ObligationCause, PredicateObligations};
use rustc::ty::fold::TypeFoldable; use rustc::ty::fold::TypeFoldable;
use rustc::ty::subst::Subst; use rustc::ty::subst::Subst;
use rustc::ty::{self, CanonicalTy, RegionVid, ToPolyTraitRef, Ty, TyCtxt, TyKind}; use rustc::ty::{self, CanonicalTy, RegionVid, ToPolyTraitRef, Ty, TyCtxt, TyKind};
use std::{fmt, iter};
use std::rc::Rc; use std::rc::Rc;
use std::{fmt, iter};
use syntax_pos::{Span, DUMMY_SP}; use syntax_pos::{Span, DUMMY_SP};
use transform::{MirPass, MirSource}; use transform::{MirPass, MirSource};
use rustc_data_structures::fx::FxHashSet;
use either::Either; use either::Either;
use rustc_data_structures::fx::FxHashSet;
macro_rules! span_mirbug { macro_rules! span_mirbug {
($context:expr, $elem:expr, $($message:tt)*) => ({ ($context:expr, $elem:expr, $($message:tt)*) => ({
@ -128,6 +130,7 @@ pub(crate) fn type_check<'gcx, 'tcx>(
outlives_constraints: ConstraintSet::default(), outlives_constraints: ConstraintSet::default(),
type_tests: Vec::default(), type_tests: Vec::default(),
}; };
let mut placeholder_indices = PlaceholderIndices::default();
let CreateResult { let CreateResult {
universal_region_relations, universal_region_relations,
@ -147,6 +150,7 @@ pub(crate) fn type_check<'gcx, 'tcx>(
borrow_set, borrow_set,
all_facts, all_facts,
constraints: &mut constraints, constraints: &mut constraints,
placeholder_indices: &mut placeholder_indices,
}; };
type_check_internal( type_check_internal(
@ -162,12 +166,15 @@ pub(crate) fn type_check<'gcx, 'tcx>(
cx.equate_inputs_and_outputs(mir, universal_regions, &normalized_inputs_and_output); cx.equate_inputs_and_outputs(mir, universal_regions, &normalized_inputs_and_output);
liveness::generate(cx, mir, elements, flow_inits, move_data, location_table); liveness::generate(cx, mir, elements, flow_inits, move_data, location_table);
cx.borrowck_context.as_mut().map(|bcx| translate_outlives_facts(bcx)); cx.borrowck_context
.as_mut()
.map(|bcx| translate_outlives_facts(bcx));
}, },
); );
MirTypeckResults { MirTypeckResults {
constraints, constraints,
placeholder_indices,
universal_region_relations, universal_region_relations,
} }
} }
@ -210,8 +217,10 @@ fn type_check_internal<'a, 'gcx, 'tcx, R>(
fn translate_outlives_facts(cx: &mut BorrowCheckContext) { fn translate_outlives_facts(cx: &mut BorrowCheckContext) {
if let Some(facts) = cx.all_facts { if let Some(facts) = cx.all_facts {
let location_table = cx.location_table; let location_table = cx.location_table;
facts.outlives.extend( facts
cx.constraints.outlives_constraints.iter().flat_map(|constraint: &OutlivesConstraint| { .outlives
.extend(cx.constraints.outlives_constraints.iter().flat_map(
|constraint: &OutlivesConstraint| {
if let Some(from_location) = constraint.locations.from_location() { if let Some(from_location) = constraint.locations.from_location() {
Either::Left(iter::once(( Either::Left(iter::once((
constraint.sup, constraint.sup,
@ -219,12 +228,14 @@ fn translate_outlives_facts(cx: &mut BorrowCheckContext) {
location_table.mid_index(from_location), location_table.mid_index(from_location),
))) )))
} else { } else {
Either::Right(location_table.all_points().map(move |location| { Either::Right(
(constraint.sup, constraint.sub, location) location_table
})) .all_points()
.map(move |location| (constraint.sup, constraint.sub, location)),
)
} }
}) },
); ));
} }
} }
@ -718,10 +729,12 @@ struct BorrowCheckContext<'a, 'tcx: 'a> {
all_facts: &'a mut Option<AllFacts>, all_facts: &'a mut Option<AllFacts>,
borrow_set: &'a BorrowSet<'tcx>, borrow_set: &'a BorrowSet<'tcx>,
constraints: &'a mut MirTypeckRegionConstraints<'tcx>, constraints: &'a mut MirTypeckRegionConstraints<'tcx>,
placeholder_indices: &'a mut PlaceholderIndices,
} }
crate struct MirTypeckResults<'tcx> { crate struct MirTypeckResults<'tcx> {
crate constraints: MirTypeckRegionConstraints<'tcx>, crate constraints: MirTypeckRegionConstraints<'tcx>,
crate placeholder_indices: PlaceholderIndices,
crate universal_region_relations: Rc<UniversalRegionRelations<'tcx>>, crate universal_region_relations: Rc<UniversalRegionRelations<'tcx>>,
} }

View File

@ -146,18 +146,27 @@ trait TypeRelatingDelegate<'tcx> {
/// delegate. /// delegate.
fn push_outlives(&mut self, sup: ty::Region<'tcx>, sub: ty::Region<'tcx>); fn push_outlives(&mut self, sup: ty::Region<'tcx>, sub: ty::Region<'tcx>);
/// Creates a new region variable representing an instantiated /// Creates a new universe index. Used when instantiating placeholders.
/// higher-ranked region; this will be either existential or fn next_subuniverse(&mut self) -> ty::UniverseIndex;
/// universal depending on the context. So e.g. if you have
/// `for<'a> fn(..) <: for<'b> fn(..)`, then we will first /// Creates a new region variable representing a higher-ranked
/// instantiate `'b` with a universally quantified region and /// region that is instantiated existentially. This creates an
/// then `'a` with an existentially quantified region (the order /// inference variable, typically.
/// is important so that the existential region `'a` can see the ///
/// universal one). /// So e.g. if you have `for<'a> fn(..) <: for<'b> fn(..)`, then
fn next_region_var( /// we will invoke this method to instantiate `'a` with an
&mut self, /// inference variable (though `'b` would be instantiated first,
universally_quantified: UniversallyQuantified, /// as a placeholder).
) -> ty::Region<'tcx>; fn next_existential_region_var(&mut self) -> ty::Region<'tcx>;
/// Creates a new region variable representing a
/// higher-ranked region that is instantiated universally.
/// This creates a new region placeholder, typically.
///
/// So e.g. if you have `for<'a> fn(..) <: for<'b> fn(..)`, then
/// we will invoke this method to instantiate `'b` with a
/// placeholder region.
fn next_placeholder_region(&mut self, placeholder: ty::Placeholder) -> ty::Region<'tcx>;
/// Creates a new existential region in the given universe. This /// Creates a new existential region in the given universe. This
/// is used when handling subtyping and type variables -- if we /// is used when handling subtyping and type variables -- if we
@ -197,15 +206,20 @@ impl NllTypeRelatingDelegate<'me, 'bccx, 'gcx, 'tcx> {
} }
impl TypeRelatingDelegate<'tcx> for NllTypeRelatingDelegate<'_, '_, '_, 'tcx> { impl TypeRelatingDelegate<'tcx> for NllTypeRelatingDelegate<'_, '_, '_, 'tcx> {
fn next_region_var( fn next_subuniverse(&mut self) -> ty::UniverseIndex {
&mut self, self.infcx.create_subuniverse()
universally_quantified: UniversallyQuantified, }
) -> ty::Region<'tcx> {
let origin = if universally_quantified.0 { fn next_existential_region_var(&mut self) -> ty::Region<'tcx> {
NLLRegionVariableOrigin::BoundRegion(self.infcx.create_subuniverse()) let origin = NLLRegionVariableOrigin::Existential;
} else { self.infcx.next_nll_region_var(origin)
NLLRegionVariableOrigin::Existential }
};
fn next_placeholder_region(&mut self, placeholder: ty::Placeholder) -> ty::Region<'tcx> {
let origin = NLLRegionVariableOrigin::Placeholder(placeholder);
if let Some(borrowck_context) = &mut self.borrowck_context {
borrowck_context.placeholder_indices.insert(placeholder);
}
self.infcx.next_nll_region_var(origin) self.infcx.next_nll_region_var(origin)
} }
@ -286,12 +300,37 @@ where
universally_quantified: UniversallyQuantified, universally_quantified: UniversallyQuantified,
) -> BoundRegionScope<'tcx> { ) -> BoundRegionScope<'tcx> {
let mut scope = BoundRegionScope::default(); let mut scope = BoundRegionScope::default();
// Create a callback that creates (via the delegate) either an
// existential or placeholder region as needed.
let mut next_region = {
let delegate = &mut self.delegate;
let mut lazy_universe = None;
move |br: ty::BoundRegion| {
if universally_quantified.0 {
// The first time this closure is called, create a
// new universe for the placeholders we will make
// from here out.
let universe = lazy_universe.unwrap_or_else(|| {
let universe = delegate.next_subuniverse();
lazy_universe = Some(universe);
universe
});
let placeholder = ty::Placeholder { universe, name: br };
delegate.next_placeholder_region(placeholder)
} else {
delegate.next_existential_region_var()
}
}
};
value.skip_binder().visit_with(&mut ScopeInstantiator { value.skip_binder().visit_with(&mut ScopeInstantiator {
delegate: &mut self.delegate, next_region: &mut next_region,
target_index: ty::INNERMOST, target_index: ty::INNERMOST,
universally_quantified,
bound_region_scope: &mut scope, bound_region_scope: &mut scope,
}); });
scope scope
} }
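The closure above allocates the universe lazily: the first placeholder instantiated from a binder creates a new universe, and every later placeholder from the same binder reuses it, differing only in its `BoundRegion` name. A standalone sketch of that pattern, with simplified stand-ins for the compiler's types:

    #[derive(Copy, Clone, Debug, PartialEq, Eq)]
    struct UniverseIndex(u32);

    #[derive(Copy, Clone, Debug, PartialEq, Eq)]
    struct Placeholder { universe: UniverseIndex, name: u32 }

    fn main() {
        let mut universe_counter = 0u32;
        // Created only when the first bound region of the binder is
        // actually instantiated, then reused for the rest of the binder.
        let mut lazy_universe: Option<UniverseIndex> = None;
        let mut next_placeholder = |name: u32| {
            let universe = *lazy_universe.get_or_insert_with(|| {
                universe_counter += 1;
                UniverseIndex(universe_counter)
            });
            Placeholder { universe, name }
        };

        // Instantiating a binder with two bound regions: one universe, two names.
        let a = next_placeholder(0);
        let b = next_placeholder(1);
        assert_eq!(a.universe, b.universe);
        assert_ne!(a, b);
        assert_eq!(universe_counter, 1); // only one universe was created
    }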
@ -604,21 +643,14 @@ where
/// binder depth, and finds late-bound regions targeting the /// binder depth, and finds late-bound regions targeting the
/// `for<..`>. For each of those, it creates an entry in /// `for<..`>. For each of those, it creates an entry in
/// `bound_region_scope`. /// `bound_region_scope`.
struct ScopeInstantiator<'me, 'tcx: 'me, D> struct ScopeInstantiator<'me, 'tcx: 'me> {
where next_region: &'me mut dyn FnMut(ty::BoundRegion) -> ty::Region<'tcx>,
D: TypeRelatingDelegate<'tcx> + 'me,
{
delegate: &'me mut D,
// The debruijn index of the scope we are instantiating. // The debruijn index of the scope we are instantiating.
target_index: ty::DebruijnIndex, target_index: ty::DebruijnIndex,
universally_quantified: UniversallyQuantified,
bound_region_scope: &'me mut BoundRegionScope<'tcx>, bound_region_scope: &'me mut BoundRegionScope<'tcx>,
} }
impl<'me, 'tcx, D> TypeVisitor<'tcx> for ScopeInstantiator<'me, 'tcx, D> impl<'me, 'tcx> TypeVisitor<'tcx> for ScopeInstantiator<'me, 'tcx> {
where
D: TypeRelatingDelegate<'tcx>,
{
fn visit_binder<T: TypeFoldable<'tcx>>(&mut self, t: &ty::Binder<T>) -> bool { fn visit_binder<T: TypeFoldable<'tcx>>(&mut self, t: &ty::Binder<T>) -> bool {
self.target_index.shift_in(1); self.target_index.shift_in(1);
t.super_visit_with(self); t.super_visit_with(self);
@ -629,9 +661,8 @@ where
fn visit_region(&mut self, r: ty::Region<'tcx>) -> bool { fn visit_region(&mut self, r: ty::Region<'tcx>) -> bool {
let ScopeInstantiator { let ScopeInstantiator {
universally_quantified,
bound_region_scope, bound_region_scope,
delegate, next_region,
.. ..
} = self; } = self;
@ -640,7 +671,7 @@ where
bound_region_scope bound_region_scope
.map .map
.entry(*br) .entry(*br)
.or_insert_with(|| delegate.next_region_var(*universally_quantified)); .or_insert_with(|| next_region(*br));
} }
_ => {} _ => {}

View File

@ -128,14 +128,14 @@ fn compare_predicate_entailment<'a, 'tcx>(tcx: TyCtxt<'a, 'tcx, 'tcx>,
// We create a mapping `dummy_substs` that maps from the impl type // We create a mapping `dummy_substs` that maps from the impl type
// parameters to fresh types and regions. For type parameters, // parameters to fresh types and regions. For type parameters,
// this is the identity transform, but we could as well use any // this is the identity transform, but we could as well use any
// skolemized types. For regions, we convert from bound to free // placeholder types. For regions, we convert from bound to free
// regions (Note: but only early-bound regions, i.e., those // regions (Note: but only early-bound regions, i.e., those
// declared on the impl or used in type parameter bounds). // declared on the impl or used in type parameter bounds).
// //
// impl_to_skol_substs = {'i => 'i0, U => U0, N => N0 } // impl_to_skol_substs = {'i => 'i0, U => U0, N => N0 }
// //
// Now we can apply skol_substs to the type of the impl method // Now we can apply skol_substs to the type of the impl method
// to yield a new function type in terms of our fresh, skolemized // to yield a new function type in terms of our fresh, placeholder
// types: // types:
// //
// <'b> fn(t: &'i0 U0, m: &'b) -> Foo // <'b> fn(t: &'i0 U0, m: &'b) -> Foo
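For orientation, here is a hedged surface-Rust example (not from this diff) of the kind of trait/impl method pair whose signatures this comparison is about: the impl's own generics are mapped to fresh placeholder types and regions, and the impl method's signature is then checked against the trait's declaration under that mapping.

    trait Collection<T> {
        fn insert_ref<'a>(&mut self, value: &'a T);
    }

    struct MyVec<U>(Vec<U>);

    impl<U: Clone> Collection<U> for MyVec<U> {
        // This signature is compared against the trait's declaration with
        // `U` (and the impl's regions) replaced by fresh placeholders, so
        // the check holds for any instantiation of the impl.
        fn insert_ref<'b>(&mut self, value: &'b U) {
            self.0.push(value.clone());
        }
    }

    fn main() {
        let mut v = MyVec(Vec::new());
        v.insert_ref(&42);
        assert_eq!(v.0, vec![42]);
    }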
@ -163,15 +163,15 @@ fn compare_predicate_entailment<'a, 'tcx>(tcx: TyCtxt<'a, 'tcx, 'tcx>,
// We do this by creating a parameter environment which contains a // We do this by creating a parameter environment which contains a
// substitution corresponding to impl_to_skol_substs. We then build // substitution corresponding to impl_to_skol_substs. We then build
// trait_to_skol_substs and use it to convert the predicates contained // trait_to_skol_substs and use it to convert the predicates contained
// in the trait_m.generics to the skolemized form. // in the trait_m.generics to the placeholder form.
// //
// Finally we register each of these predicates as an obligation in // Finally we register each of these predicates as an obligation in
// a fresh FulfillmentCtxt, and invoke select_all_or_error. // a fresh FulfillmentCtxt, and invoke select_all_or_error.
// Create mapping from impl to skolemized. // Create mapping from impl to placeholder.
let impl_to_skol_substs = Substs::identity_for_item(tcx, impl_m.def_id); let impl_to_skol_substs = Substs::identity_for_item(tcx, impl_m.def_id);
// Create mapping from trait to skolemized. // Create mapping from trait to placeholder.
let trait_to_skol_substs = impl_to_skol_substs.rebase_onto(tcx, let trait_to_skol_substs = impl_to_skol_substs.rebase_onto(tcx,
impl_m.container.id(), impl_m.container.id(),
trait_to_impl_substs); trait_to_impl_substs);
@ -212,7 +212,7 @@ fn compare_predicate_entailment<'a, 'tcx>(tcx: TyCtxt<'a, 'tcx, 'tcx>,
hybrid_preds.predicates.extend( hybrid_preds.predicates.extend(
trait_m_predicates.instantiate_own(tcx, trait_to_skol_substs).predicates); trait_m_predicates.instantiate_own(tcx, trait_to_skol_substs).predicates);
// Construct trait parameter environment and then shift it into the skolemized viewpoint. // Construct trait parameter environment and then shift it into the placeholder viewpoint.
// The key step here is to update the caller_bounds's predicates to be // The key step here is to update the caller_bounds's predicates to be
// the new hybrid bounds we computed. // the new hybrid bounds we computed.
let normalize_cause = traits::ObligationCause::misc(impl_m_span, impl_m_node_id); let normalize_cause = traits::ObligationCause::misc(impl_m_span, impl_m_node_id);
@ -259,7 +259,7 @@ fn compare_predicate_entailment<'a, 'tcx>(tcx: TyCtxt<'a, 'tcx, 'tcx>,
// any associated types appearing in the fn arguments or return // any associated types appearing in the fn arguments or return
// type. // type.
// Compute skolemized form of impl and trait method tys. // Compute placeholder form of impl and trait method tys.
let tcx = infcx.tcx; let tcx = infcx.tcx;
let (impl_sig, _) = let (impl_sig, _) =
@ -894,7 +894,7 @@ pub fn compare_const_impl<'a, 'tcx>(tcx: TyCtxt<'a, 'tcx, 'tcx>,
// method. // method.
let impl_c_node_id = tcx.hir.as_local_node_id(impl_c.def_id).unwrap(); let impl_c_node_id = tcx.hir.as_local_node_id(impl_c.def_id).unwrap();
// Compute skolemized form of impl and trait const tys. // Compute placeholder form of impl and trait const tys.
let impl_ty = tcx.type_of(impl_c.def_id); let impl_ty = tcx.type_of(impl_c.def_id);
let trait_ty = tcx.type_of(trait_c.def_id).subst(tcx, trait_to_impl_substs); let trait_ty = tcx.type_of(trait_c.def_id).subst(tcx, trait_to_impl_substs);
let mut cause = ObligationCause::misc(impl_c_span, impl_c_node_id); let mut cause = ObligationCause::misc(impl_c_span, impl_c_node_id);

View File

@ -170,7 +170,7 @@ fn is_free_region<'tcx>(tcx: TyCtxt<'_, 'tcx, 'tcx>, region: Region<'_>) -> bool
| RegionKind::ReCanonical(..) | RegionKind::ReCanonical(..)
| RegionKind::ReScope(..) | RegionKind::ReScope(..)
| RegionKind::ReVar(..) | RegionKind::ReVar(..)
| RegionKind::ReSkolemized(..) | RegionKind::RePlaceholder(..)
| RegionKind::ReFree(..) => { | RegionKind::ReFree(..) => {
bug!("unexpected region in outlives inference: {:?}", region); bug!("unexpected region in outlives inference: {:?}", region);
} }

View File

@ -431,7 +431,7 @@ impl<'a, 'tcx> ConstraintContext<'a, 'tcx> {
ty::ReClosureBound(..) | ty::ReClosureBound(..) |
ty::ReScope(..) | ty::ReScope(..) |
ty::ReVar(..) | ty::ReVar(..) |
ty::ReSkolemized(..) | ty::RePlaceholder(..) |
ty::ReEmpty | ty::ReEmpty |
ty::ReErased => { ty::ReErased => {
// We don't expect to see anything but 'static or bound // We don't expect to see anything but 'static or bound

View File

@ -1258,7 +1258,7 @@ impl Clean<Option<Lifetime>> for ty::RegionKind {
ty::ReFree(..) | ty::ReFree(..) |
ty::ReScope(..) | ty::ReScope(..) |
ty::ReVar(..) | ty::ReVar(..) |
ty::ReSkolemized(..) | ty::RePlaceholder(..) |
ty::ReEmpty | ty::ReEmpty |
ty::ReClosureBound(_) | ty::ReClosureBound(_) |
ty::ReCanonical(_) | ty::ReCanonical(_) |

View File

@ -10,8 +10,8 @@
#![allow(dead_code)] #![allow(dead_code)]
// Regression test for #37154: the problem here was that the cache // Regression test for #37154: the problem here was that the cache
// results in a false error because it was caching skolemized results // results in a false error because it was caching placeholder results
// even after those skolemized regions had been popped. // even after those placeholder regions had been popped.
trait Foo { trait Foo {
fn method(&self) {} fn method(&self) {}