Auto merge of #45825 - nikomatsakis:nll-factor-region-inference, r=arielb1
integrate MIR type-checker with NLL inference

This branch refactors NLL type inference so that it uses the MIR type-checker to gather constraints. Along the way, it also mildly refactors how region constraints are gathered in the normal inference context. The new setup is like this:

- What used to be `region_inference` is split into two parts:
  - `region_constraints`, which just collects up sets of constraints
  - `lexical_region_resolve`, which does the iterative, lexical region resolution
- When `resolve_regions_and_report_errors` is invoked, the inference engine converts the constraints into final values.
- In the MIR type checker, however, we do not invoke this method, but instead periodically take the region constraints and package them up for the NLL solver to use later.
  - This allows us to track when and where those constraints were incurred.
- We also remove the central fulfillment context from the MIR type checker, instead instantiating new fulfillment contexts at each point. This allows us to capture the set of obligations that occurred at a particular point, and also to ensure that if the same obligation arises at two points, we will enforce the region constraints at both locations.
- The MIR type checker is also enhanced to instantiate late-bound regions with fresh variables and handle a few other corner cases that arose.
- I also extracted some of the 'outlives' logic from the regionck, which will be needed later (see future work) to handle the type-outlives relationships.

One concern I have with this branch: since the MIR type checker is used even without the `-Znll` switch, I'm not sure if it will impact performance. One simple fix here would be to only enable the MIR type-checker if debug-assertions are enabled, since it just serves to validate the MIR. Longer term I hope to address this by improving the interface to the trait solver to be more query-based (ongoing work).

There is plenty of future work left. Here are two things that leap to mind:

- **Type-region outlives.** Currently, the NLL solver will ICE if it is required to handle a constraint like `T: 'a`. Fixing this will require a small amount of refactoring to extract the implied bounds code. I plan to file a follow-up bug on this (hopefully with mentoring instructions).
- **Testing.** It's a good idea to enumerate some of the tricky scenarios that need testing, but I think it'd be nice to try and parallelize some of the actual test writing (and resulting bug fixing):
  - Same obligation occurring at two points.
  - Well-formedness and trait obligations of various kinds (which are not all processed by the current MIR type-checker).
  - More tests for how subtyping and region inference interact.
  - More suggestions welcome!

r? @arielb1
@ -1,239 +1,227 @@
# Type inference engine

This is loosely based on standard HM-type inference, but with an extension to try and accommodate subtyping. There is nothing principled about this extension; it's sound---I hope!---but it's ultimately a heuristic, and does not guarantee that it finds a valid typing even if one exists (in fact, there are known scenarios where it fails, some of which may eventually become problematic).

The type inference is based on standard HM-type inference, but extended in various ways to accommodate subtyping, region inference, and higher-ranked types.
## Key idea

The main change is that each type variable T is associated with a lower-bound L and an upper-bound U. L and U begin as bottom and top, respectively, but gradually narrow in response to new constraints being introduced. When a variable is finally resolved to a concrete type, it can (theoretically) select any type that is a supertype of L and a subtype of U.

There are several critical invariants which we maintain:

- the upper-bound of a variable only becomes lower and the lower-bound only becomes higher over time;
- the lower-bound L is always a subtype of the upper-bound U;
- the lower-bound L and upper-bound U never refer to other type variables, but only to types (though those types may contain type variables).

## A note on terminology

We use the notation `?T` to refer to inference variables, also called existential variables.

We use the terms "region" and "lifetime" interchangeably. Both refer to the `'a` in `&'a T`.

The term "bound region" refers to a region bound in a function signature, such as the `'a` in `for<'a> fn(&'a u32)`. A region is "free" if it is not bound.
> An aside: if the terms upper- and lower-bound confuse you, think of
> "supertype" and "subtype". The upper-bound is a "supertype"
> (super = upper in Latin, or something like that anyway) and the
> lower-bound is a "subtype" (sub = lower in Latin). I find it helps to
> visualize a simple class hierarchy, like Java minus interfaces and
> primitive types. The class Object is at the root (top) and other
> types lie in between. The bottom type is then the Null type.
> So the tree looks like:
>
> ```text
>         Object
>         /    \
>     String   Other
>         \    /
>        (null)
> ```
>
> So the upper-bound type is the "supertype" and the lower-bound is the "subtype".
## Creating an inference context

## Satisfying constraints

At a primitive level, there is only one form of constraint that the inference understands: a subtype relation. So the outside world can say "make type A a subtype of type B". If there are variables involved, the inferencer will adjust their upper- and lower-bounds as needed to ensure that this relation is satisfied. (We also allow "make type A equal to type B", but this is translated into "A <: B" and "B <: A".)

As stated above, we always maintain the invariant that type bounds never refer to other variables. This keeps the inference relatively simple, avoiding the scenario of having a kind of graph where we have to pump constraints along and reach a fixed point, but it does impose some heuristics in the case where the user is relating two type variables A <: B.

Combining two variables such that variable A will forever be a subtype of variable B is the trickiest part of the algorithm, because there is often no right choice---that is, the right choice will depend on future constraints which we do not yet know. The problem comes about because both A and B have bounds that can be adjusted in the future. Let's look at some of the cases that can come up.
Imagine, to start, the best case, where both A and B have an upper and lower bound (that is, the bounds are not top nor bot respectively). In that case, if we're lucky, A.ub <: B.lb, and so we know that whatever A and B should become, they will forever have the desired subtyping relation. We can just leave things as they are.

### Option 1: Unify

However, suppose that A.ub is *not* a subtype of B.lb. In that case, we must make a decision. One option is to unify A and B so that they are one variable whose bounds are:

```text
UB = GLB(A.ub, B.ub)
LB = LUB(A.lb, B.lb)
```

(Note that we will have to verify that LB <: UB; if it does not hold, the types are not intersecting and there is an error.) In that case, A <: B holds trivially because A == B. However, we have now lost some flexibility, because perhaps the user intended for A and B to end up as different types and not the same type.

Pictorially, what this does is to take two distinct variables with (hopefully not completely) distinct type ranges and produce one with the intersection.
```text
      B.ub                        B.ub
       /\                          /
A.ub  /  \                  A.ub  /
    \ /    \                    \ /
     X      \                    UB
    / \      \                  /  \
   /   \      \                /    \
   \    \     /                \    /
    \    X   /                   LB
     \  / \ /                   /  \
      \/   \/                  /    \
    A.lb   B.lb              A.lb    B.lb
```

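To make the bound arithmetic concrete, here is a small self-contained sketch (illustrative only, not the compiler's implementation) that models bounds on a toy totally-ordered lattice of integers, where `LUB` is `max` and `GLB` is `min`. Unification intersects the two ranges and fails if the intersection is empty, mirroring the `LB <: UB` check above.

```rust
// Toy model of "Option 1: Unify": each inference variable carries a
// lower and upper bound drawn from a simple totally-ordered lattice
// (integers), where LUB = max and GLB = min. Illustrative sketch only,
// not the rustc implementation.
#[derive(Debug, Clone, Copy, PartialEq)]
struct Bounds {
    lb: i32, // lower bound (only grows over time)
    ub: i32, // upper bound (only shrinks over time)
}

fn unify(a: Bounds, b: Bounds) -> Result<Bounds, String> {
    // UB = GLB(A.ub, B.ub), LB = LUB(A.lb, B.lb)
    let merged = Bounds {
        lb: a.lb.max(b.lb),
        ub: a.ub.min(b.ub),
    };
    // We must verify that LB <: UB; otherwise the ranges do not
    // intersect and unification is an error.
    if merged.lb <= merged.ub {
        Ok(merged)
    } else {
        Err(format!("no intersection between {:?} and {:?}", a, b))
    }
}

fn main() {
    let a = Bounds { lb: 1, ub: 10 };
    let b = Bounds { lb: 5, ub: 20 };
    // Overlapping ranges: unification succeeds with the intersection.
    assert_eq!(unify(a, b), Ok(Bounds { lb: 5, ub: 10 }));

    let c = Bounds { lb: 15, ub: 20 };
    // Disjoint ranges: LB > UB, so unification reports an error.
    assert!(unify(a, c).is_err());
}
```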
### Option 2: Relate UB/LB

Another option is to keep A and B as distinct variables but set their bounds in such a way that, whatever happens, we know that A <: B will hold. This can be achieved by ensuring that A.ub <: B.lb. In practice there are two ways to do that, depicted pictorially here:

```text
      Before                 Option #1                Option #2

      B.ub                     B.ub                     B.ub
       /\                      /  \                     /  \
A.ub  /  \              A.ub /(B') \             A.ub /(B') \
    \ /    \                \ /    /                 \ /    /
     X      \               __UB__/                   UB   /
    / \      \              |     |                   |   /
   /   \      \             |     |                   |  /
   \    \     /             |(A') |                   | /
    \    X   /              |     LB              ____LB/
     \  / \ /               |    /  \             /(A')/ \
      \/   \/                \  /    \            \   /   \
    A.lb   B.lb             A.lb    B.lb          A.lb     B.lb
```

In these diagrams, UB and LB are defined as before. As you can see, the new ranges `A'` and `B'` are quite different from the range that would be produced by unifying the variables.

### What we do now

Our current technique is to *try* (transactionally) to relate the existing bounds of A and B, if there are any (i.e., if `UB(A) != top && LB(B) != bot`). If that succeeds, we're done. If it fails, then we merge A and B into the same variable.

This is not clearly the correct course. For example, if `UB(A) != top` but `LB(B) == bot`, we could conceivably set `LB(B)` to `UB(A)` and leave the variables unmerged. This is sometimes the better course; it depends on the program.
The main case that fails today, and which I would like to support, is:

```rust
fn foo<T>(x: T, y: T) { ... }

fn bar() {
    let x: @mut int = @mut 3;
    let y: @int = @3;
    foo(x, y);
}
```

In principle, the inferencer ought to find that the parameter `T` to `foo(x, y)` is `@const int`. Today, however, it does not; this is because the type variable `T` is merged with the type variable for `x`, and thus inherits its UB/LB of `@mut int`. This leaves no flexibility for `T` to later adjust to accommodate `@int`.

Note: `@` and `@mut` are replaced with `Rc<T>` and `Rc<RefCell<T>>` in current Rust.

You create and "enter" an inference context by doing something like the following:

```rust
tcx.infer_ctxt().enter(|infcx| {
    // use the inference context `infcx` in here
})
```

Each inference context creates a short-lived type arena to store the fresh types and things that it will create, as described in [the README in the ty module][ty-readme]. This arena is created by the `enter` function and disposed of after it returns.

[ty-readme]: src/librustc/ty/README.md
### What to do when not all bounds are present

In the prior discussion we assumed that A.ub was not top and B.lb was not bot. Unfortunately this is rarely the case. Often type variables have "lopsided" bounds. For example, if a variable in the program has been initialized but has not been used, then its corresponding type variable will have a lower bound but no upper bound. When that variable is then used, we would like to know its upper bound---but we don't have one! In this case we'll do different things depending on how the variable is being used.

Within the closure, the `infcx` will have the type `InferCtxt<'cx, 'gcx, 'tcx>` for some fresh `'cx` and `'tcx` -- the latter corresponds to the lifetime of this temporary arena, and the `'cx` is the lifetime of the `InferCtxt` itself. (Again, see [that ty README][ty-readme] for more details on this setup.)

The `tcx.infer_ctxt` method actually returns a builder, which means there are some kinds of configuration you can do before the `infcx` is created. See `InferCtxtBuilder` for more information.
## Transactional support

Whenever we merge variables or adjust their bounds, we always keep a record of the old value. This allows the changes to be undone.

## Inference variables

The main purpose of the inference context is to house a bunch of **inference variables** -- these represent types or regions whose precise value is not yet known, but will be uncovered as we perform type-checking.
## Regions

I've only talked about type variables here, but region variables follow the same principle. They have upper- and lower-bounds. A region A is a subregion of a region B if A being valid implies that B is valid. This basically corresponds to the block nesting structure: the regions for outer block scopes are superregions of those for inner block scopes.

If you're familiar with the basic ideas of unification from H-M type systems, or logic languages like Prolog, this is the same concept. If you're not, you might want to read a tutorial on how H-M type inference works, or perhaps this blog post on [unification in the Chalk project].

[unification in the Chalk project]: http://smallcultfollowing.com/babysteps/blog/2017/03/25/unification-in-chalk-part-1/
## Integral and floating-point type variables

There is a third variety of type variable that we use only for inferring the types of unsuffixed integer literals. Integral type variables differ from general-purpose type variables in that there's no subtyping relationship among the various integral types, so instead of associating each variable with an upper and lower bound, we just use simple unification. Each integer variable is associated with at most one integer type. Floating-point types are handled similarly to integral types.

All told, the inference context stores four kinds of inference variables as of this writing:

- Type variables, which come in three varieties:
  - General type variables (the most common). These can be unified with any type.
  - Integral type variables, which can only be unified with an integral type, and arise from an integer literal expression like `22` (see the example below).
  - Float type variables, which can only be unified with a float type, and arise from a float literal expression like `22.0`.
- Region variables, which represent lifetimes, and arise all over the dang place.

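For instance, here is ordinary (compilable) Rust showing an unsuffixed integer literal whose integral type variable is resolved by a later use, and a float literal that falls back to `f64`:

```rust
fn main() {
    // `22` starts out with an integral type variable; it could still
    // become any integer type at this point.
    let x = 22;

    // Passing `x` where a `u8` is expected unifies that variable with
    // `u8`, so `x: u8` throughout.
    takes_u8(x);

    // `22.0` similarly starts with a float type variable and defaults
    // to `f64` here, since nothing else constrains it.
    let y = 22.0;
    println!("{} {}", x, y);
}

fn takes_u8(n: u8) -> u8 {
    n
}
```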
## GLB/LUB

Computing the greatest-lower-bound and least-upper-bound of two types/regions is generally straightforward except when type variables are involved. In that case, we follow a similar "try to use the bounds when possible but otherwise merge the variables" strategy. In other words, `GLB(A, B)` where `A` and `B` are variables will often result in `A` and `B` being merged and the result being `A`.

All the type variables work in much the same way: you can create a new type variable, and what you get is a `Ty<'tcx>` representing an unresolved type `?T`. Then later you can apply the various operations that the inferencer supports, such as equality or subtyping, and it will possibly **instantiate** (or **bind**) that `?T` to a specific value as a result.

The region variables work somewhat differently, and are described below in a separate section.
## Type coercion

We have a notion of assignability which differs somewhat from subtyping; in particular it may cause region borrowing to occur. See the big comment later in this file on Type Coercion for specifics.

### In conclusion

I showed you three ways to relate `A` and `B`. There are also more, of course, though I'm not sure if there are any more sensible options. The main point is that there are various options, each of which produces a distinct range of types for `A` and `B`. Depending on what the correct values for A and B are, one of these options will be the right choice: but of course we don't know the right values for A and B yet; that's what we're trying to find! In our code, we opt to unify (Option #1).

# Implementation details

We make use of a trait-like implementation strategy to consolidate duplicated code between subtyping, GLB, and LUB computations. See the section on "Type Combining" in combine.rs for more details.

## Enforcing equality / subtyping

The most basic operation you can perform in the type inferencer is **equality**, which forces two types `T` and `U` to be the same. The recommended way to add an equality constraint is to use the `at` method, roughly like so:

```
infcx.at(...).eq(t, u);
```

The first `at()` call provides a bit of context, i.e., why you are doing this unification, and in what environment, and the `eq` method performs the actual equality constraint.

When you equate things, you force them to be precisely equal. Equating returns an `InferResult` -- if it returns `Err(err)`, then equating failed, and the enclosing `TypeError` will tell you what went wrong.

The success case is perhaps more interesting. The "primary" return type of `eq` is `()` -- that is, when it succeeds, it doesn't return a value of any particular interest. Rather, it is executed for its side-effects of constraining type variables and so forth. However, the actual return type is not `()`, but rather `InferOk<()>`. The `InferOk` type is used to carry extra trait obligations -- your job is to ensure that these are fulfilled (typically by enrolling them in a fulfillment context). See the [trait README] for more background here.

[trait README]: ../traits/README.md

You can also enforce subtyping through `infcx.at(..).sub(..)`. The same basic concepts apply as above.
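To illustrate the `InferOk` pattern described above in isolation, here is a toy, self-contained sketch (the names below are stand-ins, not the real rustc types): the interesting result is the side effect, and any leftover obligations are handed back for the caller to process.

```rust
// Toy sketch of the InferOk pattern: an operation succeeds "for its
// side effects" but hands back extra obligations that the caller must
// remember to process (e.g., by enrolling them in a fulfillment
// context). Illustrative only; not the rustc types.
#[derive(Debug)]
struct InferOk<T> {
    value: T,
    obligations: Vec<String>, // stand-in for trait obligations
}

type InferResult<T> = Result<InferOk<T>, String>;

fn equate(a: &str, b: &str) -> InferResult<()> {
    if a == b {
        // Success: no interesting value, but possibly follow-up work.
        Ok(InferOk {
            value: (),
            obligations: vec![format!("{}: SomeTrait", a)],
        })
    } else {
        Err(format!("cannot equate `{}` and `{}`", a, b))
    }
}

fn main() {
    let mut pending: Vec<String> = Vec::new();
    match equate("?T", "?T") {
        Ok(InferOk { value: (), obligations }) => {
            // The caller's job: don't drop the obligations on the floor.
            pending.extend(obligations);
        }
        Err(err) => eprintln!("type error: {}", err),
    }
    assert_eq!(pending.len(), 1);
}
```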
## "Trying" equality
|
||||
|
||||
Sometimes you would like to know if it is *possible* to equate two
|
||||
types without error. You can test that with `infcx.can_eq` (or
|
||||
`infcx.can_sub` for subtyping). If this returns `Ok`, then equality
|
||||
is possible -- but in all cases, any side-effects are reversed.
|
||||
|
||||
Be aware though that the success or failure of these methods is always
|
||||
**modulo regions**. That is, two types `&'a u32` and `&'b u32` will
|
||||
return `Ok` for `can_eq`, even if `'a != 'b`. This falls out from the
|
||||
"two-phase" nature of how we solve region constraints.
|
||||
|
||||
## Snapshots
|
||||
|
||||
As described in the previous section on `can_eq`, often it is useful
|
||||
to be able to do a series of operations and then roll back their
|
||||
side-effects. This is done for various reasons: one of them is to be
|
||||
able to backtrack, trying out multiple possibilities before settling
|
||||
on which path to take. Another is in order to ensure that a series of
|
||||
smaller changes take place atomically or not at all.
|
||||
|
||||
To allow for this, the inference context supports a `snapshot` method.
|
||||
When you call it, it will start recording changes that occur from the
|
||||
operations you perform. When you are done, you can either invoke
|
||||
`rollback_to`, which will undo those changes, or else `confirm`, which
|
||||
will make the permanent. Snapshots can be nested as long as you follow
|
||||
a stack-like discipline.
|
||||
|
||||
Rather than use snapshots directly, it is often helpful to use the
|
||||
methods like `commit_if_ok` or `probe` that encapsulte higher-level
|
||||
patterns.
|
||||
|
||||
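The following self-contained sketch (a toy model, not the actual `InferCtxt` machinery) illustrates the snapshot discipline: changes are recorded in an undo log, and a snapshot either commits or rolls the log back to a saved length.

```rust
// Toy model of snapshot/rollback over a set of inference bindings.
// Illustrative only; the real InferCtxt tracks many kinds of undo
// entries (type variables, region constraints, and so on).
use std::collections::HashMap;

struct ToyTable {
    bindings: HashMap<u32, &'static str>,
    undo_log: Vec<u32>, // keys bound since the table was created
}

struct Snapshot {
    undo_len: usize,
}

impl ToyTable {
    fn new() -> Self {
        ToyTable { bindings: HashMap::new(), undo_log: Vec::new() }
    }

    fn bind(&mut self, var: u32, value: &'static str) {
        self.bindings.insert(var, value);
        self.undo_log.push(var);
    }

    fn snapshot(&self) -> Snapshot {
        Snapshot { undo_len: self.undo_log.len() }
    }

    // Like `rollback_to`: undo everything recorded after the snapshot.
    fn rollback_to(&mut self, snapshot: Snapshot) {
        while self.undo_log.len() > snapshot.undo_len {
            let var = self.undo_log.pop().unwrap();
            self.bindings.remove(&var);
        }
    }

    // Like `commit_if_ok`: run a closure under a snapshot and keep the
    // changes only if it succeeds.
    fn commit_if_ok<T, E>(
        &mut self,
        op: impl FnOnce(&mut Self) -> Result<T, E>,
    ) -> Result<T, E> {
        let snapshot = self.snapshot();
        match op(&mut *self) {
            Ok(v) => Ok(v), // keep the recorded changes
            Err(e) => {
                self.rollback_to(snapshot);
                Err(e)
            }
        }
    }
}

fn main() {
    let mut table = ToyTable::new();
    table.bind(0, "i32");

    // A failed operation leaves no trace behind.
    let res: Result<(), &str> = table.commit_if_ok(|t| {
        t.bind(1, "bool");
        Err("constraint violated")
    });
    assert!(res.is_err());
    assert!(!table.bindings.contains_key(&1));
    assert!(table.bindings.contains_key(&0));
}
```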
## Subtyping obligations

One thing worth discussing is subtyping obligations. When you force two types to be a subtype, like `?T <: i32`, we can often convert those into equality constraints. This follows from Rust's rather limited notion of subtyping: so, in the above case, `?T <: i32` is equivalent to `?T = i32`.

However, in some cases we have to be more careful: for example, when regions are involved. So if you have `?T <: &'a i32`, what we would do is to first "generalize" `&'a i32` into a type with a region variable: `&'?b i32`, and then unify `?T` with that (`?T = &'?b i32`). We then relate this new variable with the original bound:

```text
&'?b i32 <: &'a i32
```

This will result in a region constraint (see below) of `'?b: 'a`.

One final interesting case is relating two unbound type variables, like `?T <: ?U`. In that case, we can't make progress, so we enqueue an obligation `Subtype(?T, ?U)` and return it via the `InferOk` mechanism. You'll have to try again when more details about `?T` or `?U` are known.
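As a surface-level illustration in ordinary Rust (not compiler internals): `&'static str` is a subtype of `&'a str` for any `'a`, so a `'static` reference can flow into a parameter with a shorter lifetime, and inference records the corresponding outlives constraint behind the scenes.

```rust
// `&'static str <: &'a str` for any 'a: a 'static reference may be
// supplied where a shorter-lived one is expected, and the subtyping
// is recorded as an outlives relation ('static: 'a).
fn pick<'a>(a: &'a str, b: &'a str, first: bool) -> &'a str {
    if first { a } else { b }
}

fn main() {
    let greeting: &'static str = "hello"; // lives for the whole program
    let local = String::from("world");    // lives until the end of main

    // Both arguments unify with the single region parameter 'a of
    // `pick`; the 'static reference is usable because 'static: 'a.
    let chosen = pick(greeting, &local, true);
    println!("{}", chosen);
}
```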
## Region constraints

Regions are inferred somewhat differently from types. Rather than eagerly unifying things, we simply collect constraints as we go, but make (almost) no attempt to solve regions. These constraints have the form of an outlives constraint:

```text
'a: 'b
```

Actually the code tends to view them as a subregion relation, but it's the same idea:

```text
'b <= 'a
```

(There are various other kinds of constraints, such as "verifys"; see the `region_constraints` module for details.)

There is one case where we do some amount of eager unification. If you have an equality constraint between two regions

```text
'a = 'b
```

we will record that fact in a unification table. You can then use `opportunistic_resolve_var` to convert `'b` to `'a` (or vice versa). This is sometimes needed to ensure termination of fixed-point algorithms.
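In surface Rust, such outlives relationships also appear as explicit bounds. The following ordinary function (not compiler code) declares `'b: 'a`, i.e. `'a <= 'b` as a subregion relation:

```rust
// The bound `'b: 'a` says "'b outlives 'a", i.e. 'a <= 'b as a
// subregion relation, so a &'b str can be returned where a &'a str
// is expected.
fn shorten<'a, 'b: 'a>(long_lived: &'b str) -> &'a str {
    long_lived
}

fn main() {
    let owned = String::from("outlives example");
    let r: &str = shorten(&owned);
    println!("{}", r);
}
```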
## Extracting region constraints

Ultimately, region constraints are only solved at the very end of type-checking, once all other constraints are known. There are two ways to solve region constraints right now: lexical and non-lexical. Eventually there will only be one.

To solve **lexical** region constraints, you invoke `resolve_regions_and_report_errors`. This will "close" the region constraint process and invoke the `lexical_region_resolve` code. Once this is done, any further attempt to equate or create a subtyping relationship will yield an ICE.

Non-lexical region constraints are not handled within the inference context. Instead, the NLL solver (actually, the MIR type-checker) invokes `take_and_reset_region_constraints` periodically. This extracts all of the outlives constraints from the region solver, but leaves the set of variables intact. This is used to get *just* the region constraints that resulted from some particular point in the program, since the NLL solver needs to know not just *what* regions were subregions but *where*. Finally, the NLL solver invokes `take_region_var_origins`, which "closes" the region constraint process in the same way as normal solving.
## Lexical region resolution

Lexical region resolution is done by initially assigning each region variable an empty value. We then process each outlives constraint repeatedly, growing region variables until a fixed point is reached. Region variables can be grown using a least-upper-bound relation on the region lattice in a fairly straightforward fashion.

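Here is a minimal sketch of that fixed-point expansion, using a toy lattice of scope depths in place of real regions (illustrative only; the real implementation is the `lexical_region_resolve` module added in this commit):

```rust
// Toy model of lexical region resolution: regions are represented as
// scope depths (0 = outermost, larger = more deeply nested), the LUB
// of two regions is the outermost (smaller) depth, and variables start
// at the "empty" value and only grow. Illustrative sketch only.
#[derive(Clone, Copy, Debug, PartialEq)]
enum Value {
    Empty,       // nothing required yet; identity for LUB
    Scope(u32),  // region covering this depth and everything outer
}

enum Constraint {
    RegSubVar(u32, usize),   // concrete region <= variable
    VarSubVar(usize, usize), // variable <= variable
}

fn lub(a: Value, b: Value) -> Value {
    match (a, b) {
        (Value::Empty, v) | (v, Value::Empty) => v,
        // The outermost (smallest depth) scope encloses both.
        (Value::Scope(x), Value::Scope(y)) => Value::Scope(x.min(y)),
    }
}

fn resolve(num_vars: usize, constraints: &[Constraint]) -> Vec<Value> {
    let mut values = vec![Value::Empty; num_vars];
    // Grow variables until no constraint changes anything.
    let mut changed = true;
    while changed {
        changed = false;
        for c in constraints {
            let (lower, target) = match *c {
                Constraint::RegSubVar(depth, v) => (Value::Scope(depth), v),
                Constraint::VarSubVar(a, b) => (values[a], b),
            };
            let grown = lub(values[target], lower);
            if grown != values[target] {
                values[target] = grown;
                changed = true;
            }
        }
    }
    values
}

fn main() {
    // '0 must cover scope depth 5, and '0 must be covered by '1.
    let constraints = [
        Constraint::RegSubVar(5, 0),
        Constraint::VarSubVar(0, 1),
    ];
    let values = resolve(2, &constraints);
    assert_eq!(values, vec![Value::Scope(5), Value::Scope(5)]);
}
```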
@ -104,7 +104,8 @@ impl<'combine, 'infcx, 'gcx, 'tcx> TypeRelation<'infcx, 'gcx, 'tcx>
|
||||
a,
|
||||
b);
|
||||
let origin = Subtype(self.fields.trace.clone());
|
||||
self.fields.infcx.region_vars.make_eqregion(origin, a, b);
|
||||
self.fields.infcx.borrow_region_constraints()
|
||||
.make_eqregion(origin, a, b);
|
||||
Ok(a)
|
||||
}
|
||||
|
||||
|
@ -13,8 +13,8 @@
|
||||
use hir;
|
||||
use infer::InferCtxt;
|
||||
use ty::{self, Region};
|
||||
use infer::region_inference::RegionResolutionError::*;
|
||||
use infer::region_inference::RegionResolutionError;
|
||||
use infer::lexical_region_resolve::RegionResolutionError::*;
|
||||
use infer::lexical_region_resolve::RegionResolutionError;
|
||||
use hir::map as hir_map;
|
||||
use middle::resolve_lifetime as rl;
|
||||
use hir::intravisit::{self, Visitor, NestedVisitorMap};
|
||||
|
@ -57,8 +57,8 @@
|
||||
|
||||
use infer;
|
||||
use super::{InferCtxt, TypeTrace, SubregionOrigin, RegionVariableOrigin, ValuePairs};
|
||||
use super::region_inference::{RegionResolutionError, ConcreteFailure, SubSupConflict,
|
||||
GenericBoundFailure, GenericKind};
|
||||
use super::region_constraints::GenericKind;
|
||||
use super::lexical_region_resolve::RegionResolutionError;
|
||||
|
||||
use std::fmt;
|
||||
use hir;
|
||||
@ -177,13 +177,7 @@ impl<'a, 'gcx, 'tcx> TyCtxt<'a, 'gcx, 'tcx> {
|
||||
|
||||
ty::ReEarlyBound(_) |
|
||||
ty::ReFree(_) => {
|
||||
let scope = match *region {
|
||||
ty::ReEarlyBound(ref br) => {
|
||||
self.parent_def_id(br.def_id).unwrap()
|
||||
}
|
||||
ty::ReFree(ref fr) => fr.scope,
|
||||
_ => bug!()
|
||||
};
|
||||
let scope = region.free_region_binding_scope(self);
|
||||
let prefix = match *region {
|
||||
ty::ReEarlyBound(ref br) => {
|
||||
format!("the lifetime {} as defined on", br.name)
|
||||
@ -293,8 +287,8 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
|
||||
debug!("report_region_errors: error = {:?}", error);
|
||||
|
||||
if !self.try_report_named_anon_conflict(&error) &&
|
||||
!self.try_report_anon_anon_conflict(&error) {
|
||||
|
||||
!self.try_report_anon_anon_conflict(&error)
|
||||
{
|
||||
match error.clone() {
|
||||
// These errors could indicate all manner of different
|
||||
// problems with many different solutions. Rather
|
||||
@ -303,15 +297,19 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
|
||||
// scenarios and try to find the best way to present
|
||||
// the error. If all of these fails, we fall back to a rather
|
||||
// general bit of code that displays the error information
|
||||
ConcreteFailure(origin, sub, sup) => {
|
||||
RegionResolutionError::ConcreteFailure(origin, sub, sup) => {
|
||||
self.report_concrete_failure(region_scope_tree, origin, sub, sup).emit();
|
||||
}
|
||||
|
||||
GenericBoundFailure(kind, param_ty, sub) => {
|
||||
RegionResolutionError::GenericBoundFailure(kind, param_ty, sub) => {
|
||||
self.report_generic_bound_failure(region_scope_tree, kind, param_ty, sub);
|
||||
}
|
||||
|
||||
SubSupConflict(var_origin, sub_origin, sub_r, sup_origin, sup_r) => {
|
||||
RegionResolutionError::SubSupConflict(var_origin,
|
||||
sub_origin,
|
||||
sub_r,
|
||||
sup_origin,
|
||||
sup_r) => {
|
||||
self.report_sub_sup_conflict(region_scope_tree,
|
||||
var_origin,
|
||||
sub_origin,
|
||||
@ -351,9 +349,9 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
|
||||
// the only thing in the list.
|
||||
|
||||
let is_bound_failure = |e: &RegionResolutionError<'tcx>| match *e {
|
||||
ConcreteFailure(..) => false,
|
||||
SubSupConflict(..) => false,
|
||||
GenericBoundFailure(..) => true,
|
||||
RegionResolutionError::GenericBoundFailure(..) => true,
|
||||
RegionResolutionError::ConcreteFailure(..) |
|
||||
RegionResolutionError::SubSupConflict(..) => false,
|
||||
};
|
||||
|
||||
|
||||
@ -365,9 +363,9 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
|
||||
|
||||
// sort the errors by span, for better error message stability.
|
||||
errors.sort_by_key(|u| match *u {
|
||||
ConcreteFailure(ref sro, _, _) => sro.span(),
|
||||
GenericBoundFailure(ref sro, _, _) => sro.span(),
|
||||
SubSupConflict(ref rvo, _, _, _, _) => rvo.span(),
|
||||
RegionResolutionError::ConcreteFailure(ref sro, _, _) => sro.span(),
|
||||
RegionResolutionError::GenericBoundFailure(ref sro, _, _) => sro.span(),
|
||||
RegionResolutionError::SubSupConflict(ref rvo, _, _, _, _) => rvo.span(),
|
||||
});
|
||||
errors
|
||||
}
|
||||
@ -880,14 +878,13 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
|
||||
};
|
||||
|
||||
if let SubregionOrigin::CompareImplMethodObligation {
|
||||
span, item_name, impl_item_def_id, trait_item_def_id, lint_id
|
||||
span, item_name, impl_item_def_id, trait_item_def_id,
|
||||
} = origin {
|
||||
self.report_extra_impl_obligation(span,
|
||||
item_name,
|
||||
impl_item_def_id,
|
||||
trait_item_def_id,
|
||||
&format!("`{}: {}`", bound_kind, sub),
|
||||
lint_id)
|
||||
&format!("`{}: {}`", bound_kind, sub))
|
||||
.emit();
|
||||
return;
|
||||
}
|
||||
@ -1026,6 +1023,7 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
|
||||
let var_name = self.tcx.hir.name(var_node_id);
|
||||
format!(" for capture of `{}` by closure", var_name)
|
||||
}
|
||||
infer::NLL(..) => bug!("NLL variable found in lexical phase"),
|
||||
};
|
||||
|
||||
struct_span_err!(self.tcx.sess, var_origin.span(), E0495,
|
||||
|
@ -11,8 +11,8 @@
|
||||
//! Error Reporting for Anonymous Region Lifetime Errors
|
||||
//! where one region is named and the other is anonymous.
|
||||
use infer::InferCtxt;
|
||||
use infer::region_inference::RegionResolutionError::*;
|
||||
use infer::region_inference::RegionResolutionError;
|
||||
use infer::lexical_region_resolve::RegionResolutionError::*;
|
||||
use infer::lexical_region_resolve::RegionResolutionError;
|
||||
use ty;
|
||||
|
||||
impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
|
||||
|
@ -445,14 +445,12 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
|
||||
infer::CompareImplMethodObligation { span,
|
||||
item_name,
|
||||
impl_item_def_id,
|
||||
trait_item_def_id,
|
||||
lint_id } => {
|
||||
trait_item_def_id } => {
|
||||
self.report_extra_impl_obligation(span,
|
||||
item_name,
|
||||
impl_item_def_id,
|
||||
trait_item_def_id,
|
||||
&format!("`{}: {}`", sup, sub),
|
||||
lint_id)
|
||||
&format!("`{}: {}`", sup, sub))
|
||||
}
|
||||
}
|
||||
}
|
||||
|
@ -78,8 +78,8 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
|
||||
self.type_variables.borrow_mut().types_created_since_snapshot(
|
||||
&snapshot.type_snapshot);
|
||||
let region_vars =
|
||||
self.region_vars.vars_created_since_snapshot(
|
||||
&snapshot.region_vars_snapshot);
|
||||
self.borrow_region_constraints().vars_created_since_snapshot(
|
||||
&snapshot.region_constraints_snapshot);
|
||||
|
||||
Ok((type_variables, region_vars, value))
|
||||
}
|
||||
|
@ -67,7 +67,7 @@ impl<'combine, 'infcx, 'gcx, 'tcx> TypeRelation<'infcx, 'gcx, 'tcx>
|
||||
b);
|
||||
|
||||
let origin = Subtype(self.fields.trace.clone());
|
||||
Ok(self.fields.infcx.region_vars.glb_regions(origin, a, b))
|
||||
Ok(self.fields.infcx.borrow_region_constraints().glb_regions(self.tcx(), origin, a, b))
|
||||
}
|
||||
|
||||
fn binders<T>(&mut self, a: &ty::Binder<T>, b: &ty::Binder<T>)
|
||||
|
@ -17,7 +17,7 @@ use super::{CombinedSnapshot,
|
||||
SubregionOrigin,
|
||||
SkolemizationMap};
|
||||
use super::combine::CombineFields;
|
||||
use super::region_inference::{TaintDirections};
|
||||
use super::region_constraints::{TaintDirections};
|
||||
|
||||
use ty::{self, TyCtxt, Binder, TypeFoldable};
|
||||
use ty::error::TypeError;
|
||||
@ -176,7 +176,8 @@ impl<'a, 'gcx, 'tcx> CombineFields<'a, 'gcx, 'tcx> {
|
||||
.filter(|&r| r != representative)
|
||||
{
|
||||
let origin = SubregionOrigin::Subtype(self.trace.clone());
|
||||
self.infcx.region_vars.make_eqregion(origin,
|
||||
self.infcx.borrow_region_constraints()
|
||||
.make_eqregion(origin,
|
||||
*representative,
|
||||
*region);
|
||||
}
|
||||
@ -427,7 +428,7 @@ impl<'a, 'gcx, 'tcx> CombineFields<'a, 'gcx, 'tcx> {
|
||||
fn fresh_bound_variable<'a, 'gcx, 'tcx>(infcx: &InferCtxt<'a, 'gcx, 'tcx>,
|
||||
debruijn: ty::DebruijnIndex)
|
||||
-> ty::Region<'tcx> {
|
||||
infcx.region_vars.new_bound(debruijn)
|
||||
infcx.borrow_region_constraints().new_bound(infcx.tcx, debruijn)
|
||||
}
|
||||
}
|
||||
}
|
||||
@ -481,7 +482,11 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
|
||||
r: ty::Region<'tcx>,
|
||||
directions: TaintDirections)
|
||||
-> FxHashSet<ty::Region<'tcx>> {
|
||||
self.region_vars.tainted(&snapshot.region_vars_snapshot, r, directions)
|
||||
self.borrow_region_constraints().tainted(
|
||||
self.tcx,
|
||||
&snapshot.region_constraints_snapshot,
|
||||
r,
|
||||
directions)
|
||||
}
|
||||
|
||||
fn region_vars_confined_to_snapshot(&self,
|
||||
@ -539,7 +544,8 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
|
||||
*/
|
||||
|
||||
let mut region_vars =
|
||||
self.region_vars.vars_created_since_snapshot(&snapshot.region_vars_snapshot);
|
||||
self.borrow_region_constraints().vars_created_since_snapshot(
|
||||
&snapshot.region_constraints_snapshot);
|
||||
|
||||
let escaping_types =
|
||||
self.type_variables.borrow_mut().types_escaping_snapshot(&snapshot.type_snapshot);
|
||||
@ -581,7 +587,8 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
|
||||
where T : TypeFoldable<'tcx>
|
||||
{
|
||||
let (result, map) = self.tcx.replace_late_bound_regions(binder, |br| {
|
||||
self.region_vars.push_skolemized(br, &snapshot.region_vars_snapshot)
|
||||
self.borrow_region_constraints()
|
||||
.push_skolemized(self.tcx, br, &snapshot.region_constraints_snapshot)
|
||||
});
|
||||
|
||||
debug!("skolemize_bound_regions(binder={:?}, result={:?}, map={:?})",
|
||||
@ -766,7 +773,8 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
|
||||
{
|
||||
debug!("pop_skolemized({:?})", skol_map);
|
||||
let skol_regions: FxHashSet<_> = skol_map.values().cloned().collect();
|
||||
self.region_vars.pop_skolemized(&skol_regions, &snapshot.region_vars_snapshot);
|
||||
self.borrow_region_constraints()
|
||||
.pop_skolemized(self.tcx, &skol_regions, &snapshot.region_constraints_snapshot);
|
||||
if !skol_map.is_empty() {
|
||||
self.projection_cache.borrow_mut().rollback_skolemized(
|
||||
&snapshot.projection_cache_snapshot);
|
||||
|
@ -1,10 +1,13 @@
|
||||
Region inference
|
||||
# Region inference
|
||||
|
||||
# Terminology
|
||||
## Terminology
|
||||
|
||||
Note that we use the terms region and lifetime interchangeably.
|
||||
|
||||
# Introduction
|
||||
## Introduction
|
||||
|
||||
See the [general inference README](../README.md) for an overview of
|
||||
how lexical-region-solving fits into the bigger picture.
|
||||
|
||||
Region inference uses a somewhat more involved algorithm than type
|
||||
inference. It is not the most efficient thing ever written though it
|
||||
@ -16,63 +19,6 @@ it's worth spending more time on a more involved analysis. Moreover,
|
||||
regions are a simpler case than types: they don't have aggregate
|
||||
structure, for example.
|
||||
|
||||
Unlike normal type inference, which is similar in spirit to H-M and thus works progressively, the region type inference works by accumulating constraints over the course of a function. Finally, at the end of processing a function, we process and solve the constraints all at once.

The constraints are always of one of three possible forms:

- `ConstrainVarSubVar(Ri, Rj)` states that region variable Ri must be a subregion of Rj
- `ConstrainRegSubVar(R, Ri)` states that the concrete region R (which must not be a variable) must be a subregion of the variable Ri
- `ConstrainVarSubReg(Ri, R)` states that the variable Ri should be less than the concrete region R. This is kind of deprecated and ought to be replaced with a verify (they essentially play the same role).

In addition to constraints, we also gather up a set of "verifys" (what, you don't think Verify is a noun? Get used to it my friend!). These represent relations that must hold but which don't influence inference proper. These take the form of:

- `VerifyRegSubReg(Ri, Rj)` indicates that Ri <= Rj must hold, where Rj is not an inference variable (and Ri may or may not contain one). This doesn't influence inference because we will already have inferred Ri to be as small as possible, so then we just test whether that result was less than Rj or not.
- `VerifyGenericBound(R, Vb)` is a more complex expression which tests that the region R must satisfy the bound `Vb`. The bounds themselves may have structure like "must outlive one of the following regions" or "must outlive ALL of the following regions". These bounds arise from constraints like `T: 'a` -- if we know that `T: 'b` and `T: 'c` (say, from where clauses), then we can conclude that `T: 'a` if `'b: 'a` *or* `'c: 'a` (see the sketch below).

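A compact sketch of the verify-bound idea (a toy model with assumed shapes, not the actual `VerifyBound` type): a bound is met if the region outlives *any* region in one set, or *all* regions in another.

```rust
// Toy model of "verify" bounds: a region must outlive any region in
// one list, or all regions in another. Regions are modeled as scope
// depths where a smaller depth means a bigger (longer-lived) region.
// Illustrative sketch only, not the real VerifyBound type.
type Region = u32;

enum VerifyBound {
    AnyRegion(Vec<Region>),
    AllRegions(Vec<Region>),
}

// `outlives(a, b)` means `a: b`, i.e. region `a` encloses region `b`.
fn outlives(a: Region, b: Region) -> bool {
    a <= b
}

fn bound_is_met(bound: &VerifyBound, min: Region) -> bool {
    match bound {
        VerifyBound::AnyRegion(rs) => rs.iter().any(|&r| outlives(r, min)),
        VerifyBound::AllRegions(rs) => rs.iter().all(|&r| outlives(r, min)),
    }
}

fn main() {
    // "T: 'a" where we know "T: 'b" and "T: 'c": it suffices that
    // 'b: 'a *or* 'c: 'a holds.
    let t_outlives = VerifyBound::AnyRegion(vec![3, 7]); // 'b, 'c
    let a = 5; // 'a
    assert!(bound_is_met(&t_outlives, a)); // 'b (depth 3) outlives 'a
}
```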
# Building up the constraints

Variables and constraints are created using the following methods:

- `new_region_var()` creates a new, unconstrained region variable;
- `make_subregion(Ri, Rj)` states that Ri is a subregion of Rj
- `lub_regions(Ri, Rj) -> Rk` returns a region Rk which is the smallest region that is greater than both Ri and Rj
- `glb_regions(Ri, Rj) -> Rk` returns a region Rk which is the greatest region that is smaller than both Ri and Rj

The actual region resolution algorithm is not entirely obvious, though it is also not overly complex.

## Snapshotting

It is also permitted to try (and roll back) changes to the graph. This is done by invoking `start_snapshot()`, which returns a value. Then later you can call `rollback_to()`, which undoes the work. Alternatively, you can call `commit()`, which ends all snapshots. Snapshots can be recursive---so you can start a snapshot when another is in progress, but only the root snapshot can "commit".
## The problem
|
||||
|
||||
Basically our input is a directed graph where nodes can be divided
|
||||
@ -109,9 +55,9 @@ step where we walk over the verify bounds and check that they are
|
||||
satisfied. These bounds represent the "maximal" values that a region
|
||||
variable can take on, basically.
|
||||
|
||||
# The Region Hierarchy
|
||||
## The Region Hierarchy
|
||||
|
||||
## Without closures
|
||||
### Without closures
|
||||
|
||||
Let's first consider the region hierarchy without thinking about
|
||||
closures, because they add a lot of complications. The region
|
||||
@ -141,7 +87,7 @@ Within that, there are sublifetimes for the assignment pattern and
|
||||
also the expression `x + y`. The expression itself has sublifetimes
|
||||
for evaluating `x` and `y`.
|
||||
|
||||
## Function calls
|
||||
### Function calls
|
||||
|
||||
Function calls are a bit tricky. I will describe how we handle them
|
||||
*now* and then a bit about how we can improve them (Issue #6268).
|
||||
@ -259,7 +205,7 @@ there is a reference created whose lifetime does not enclose
|
||||
the borrow expression, we must issue sufficient restrictions to ensure
|
||||
that the pointee remains valid.
|
||||
|
||||
## Modeling closures
|
||||
### Modeling closures
|
||||
|
||||
Integrating closures properly into the model is a bit of
|
||||
work-in-progress. In an ideal world, we would model closures as
|
||||
@ -314,8 +260,3 @@ handling of closures, there are no known cases where this leads to a
|
||||
type-checking accepting incorrect code (though it sometimes rejects
|
||||
what might be considered correct code; see rust-lang/rust#22557), but
|
||||
it still doesn't feel like the right approach.
|
||||
|
||||
### Skolemization
|
||||
|
||||
For a discussion on skolemization and higher-ranked subtyping, please
|
||||
see the module `middle::infer::higher_ranked::doc`.
|
@ -9,7 +9,7 @@
|
||||
// except according to those terms.
|
||||
|
||||
//! This module provides linkage between libgraphviz traits and
|
||||
//! `rustc::middle::typeck::infer::region_inference`, generating a
|
||||
//! `rustc::middle::typeck::infer::region_constraints`, generating a
|
||||
//! rendering of the graph represented by the list of `Constraint`
|
||||
//! instances (which make up the edges of the graph), as well as the
|
||||
//! origin for each constraint (which are attached to the labels on
|
||||
@ -25,7 +25,7 @@ use middle::free_region::RegionRelations;
|
||||
use middle::region;
|
||||
use super::Constraint;
|
||||
use infer::SubregionOrigin;
|
||||
use infer::region_inference::RegionVarBindings;
|
||||
use infer::region_constraints::RegionConstraintData;
|
||||
use util::nodemap::{FxHashMap, FxHashSet};
|
||||
|
||||
use std::borrow::Cow;
|
||||
@ -57,12 +57,13 @@ graphs will be printed. \n\
|
||||
}
|
||||
|
||||
pub fn maybe_print_constraints_for<'a, 'gcx, 'tcx>(
|
||||
region_vars: &RegionVarBindings<'a, 'gcx, 'tcx>,
|
||||
region_data: &RegionConstraintData<'tcx>,
|
||||
region_rels: &RegionRelations<'a, 'gcx, 'tcx>)
|
||||
{
|
||||
let tcx = region_rels.tcx;
|
||||
let context = region_rels.context;
|
||||
|
||||
if !region_vars.tcx.sess.opts.debugging_opts.print_region_graph {
|
||||
if !tcx.sess.opts.debugging_opts.print_region_graph {
|
||||
return;
|
||||
}
|
||||
|
||||
@ -112,12 +113,11 @@ pub fn maybe_print_constraints_for<'a, 'gcx, 'tcx>(
|
||||
}
|
||||
};
|
||||
|
||||
let constraints = &*region_vars.constraints.borrow();
|
||||
match dump_region_constraints_to(region_rels, constraints, &output_path) {
|
||||
match dump_region_data_to(region_rels, ®ion_data.constraints, &output_path) {
|
||||
Ok(()) => {}
|
||||
Err(e) => {
|
||||
let msg = format!("io error dumping region constraints: {}", e);
|
||||
region_vars.tcx.sess.err(&msg)
|
||||
tcx.sess.err(&msg)
|
||||
}
|
||||
}
|
||||
}
|
||||
@ -212,13 +212,13 @@ impl<'a, 'gcx, 'tcx> dot::Labeller<'a> for ConstraintGraph<'a, 'gcx, 'tcx> {
|
||||
|
||||
fn constraint_to_nodes(c: &Constraint) -> (Node, Node) {
|
||||
match *c {
|
||||
Constraint::ConstrainVarSubVar(rv_1, rv_2) =>
|
||||
Constraint::VarSubVar(rv_1, rv_2) =>
|
||||
(Node::RegionVid(rv_1), Node::RegionVid(rv_2)),
|
||||
Constraint::ConstrainRegSubVar(r_1, rv_2) =>
|
||||
Constraint::RegSubVar(r_1, rv_2) =>
|
||||
(Node::Region(*r_1), Node::RegionVid(rv_2)),
|
||||
Constraint::ConstrainVarSubReg(rv_1, r_2) =>
|
||||
Constraint::VarSubReg(rv_1, r_2) =>
|
||||
(Node::RegionVid(rv_1), Node::Region(*r_2)),
|
||||
Constraint::ConstrainRegSubReg(r_1, r_2) =>
|
||||
Constraint::RegSubReg(r_1, r_2) =>
|
||||
(Node::Region(*r_1), Node::Region(*r_2)),
|
||||
}
|
||||
}
|
||||
@ -267,15 +267,15 @@ impl<'a, 'gcx, 'tcx> dot::GraphWalk<'a> for ConstraintGraph<'a, 'gcx, 'tcx> {
|
||||
|
||||
pub type ConstraintMap<'tcx> = BTreeMap<Constraint<'tcx>, SubregionOrigin<'tcx>>;
|
||||
|
||||
fn dump_region_constraints_to<'a, 'gcx, 'tcx>(region_rels: &RegionRelations<'a, 'gcx, 'tcx>,
|
||||
fn dump_region_data_to<'a, 'gcx, 'tcx>(region_rels: &RegionRelations<'a, 'gcx, 'tcx>,
|
||||
map: &ConstraintMap<'tcx>,
|
||||
path: &str)
|
||||
-> io::Result<()> {
|
||||
debug!("dump_region_constraints map (len: {}) path: {}",
|
||||
debug!("dump_region_data map (len: {}) path: {}",
|
||||
map.len(),
|
||||
path);
|
||||
let g = ConstraintGraph::new(format!("region_constraints"), region_rels, map);
|
||||
debug!("dump_region_constraints calling render");
|
||||
let g = ConstraintGraph::new(format!("region_data"), region_rels, map);
|
||||
debug!("dump_region_data calling render");
|
||||
let mut v = Vec::new();
|
||||
dot::render(&g, &mut v).unwrap();
|
||||
File::create(path).and_then(|mut f| f.write_all(&v))
|
src/librustc/infer/lexical_region_resolve/mod.rs (new file, 766 lines)
@ -0,0 +1,766 @@
|
||||
// Copyright 2012-2014 The Rust Project Developers. See the COPYRIGHT
|
||||
// file at the top-level directory of this distribution and at
|
||||
// http://rust-lang.org/COPYRIGHT.
|
||||
//
|
||||
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
|
||||
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
|
||||
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
|
||||
// option. This file may not be copied, modified, or distributed
|
||||
// except according to those terms.
|
||||
|
||||
//! The code to do lexical region resolution.
|
||||
|
||||
use infer::SubregionOrigin;
|
||||
use infer::RegionVariableOrigin;
|
||||
use infer::region_constraints::Constraint;
|
||||
use infer::region_constraints::GenericKind;
|
||||
use infer::region_constraints::RegionConstraintData;
|
||||
use infer::region_constraints::VarOrigins;
|
||||
use infer::region_constraints::VerifyBound;
|
||||
use middle::free_region::RegionRelations;
|
||||
use rustc_data_structures::indexed_vec::{Idx, IndexVec};
|
||||
use rustc_data_structures::fx::FxHashSet;
|
||||
use rustc_data_structures::graph::{self, Direction, NodeIndex, OUTGOING};
|
||||
use std::fmt;
|
||||
use std::u32;
|
||||
use ty::{self, TyCtxt};
|
||||
use ty::{Region, RegionVid};
|
||||
use ty::{ReEarlyBound, ReEmpty, ReErased, ReFree, ReStatic};
|
||||
use ty::{ReLateBound, ReScope, ReSkolemized, ReVar};
|
||||
|
||||
mod graphviz;
|
||||
|
||||
/// This function performs lexical region resolution given a complete
|
||||
/// set of constraints and variable origins. It performs a fixed-point
|
||||
/// iteration to find region values which satisfy all constraints,
|
||||
/// assuming such values can be found. It returns the final values of
|
||||
/// all the variables as well as a set of errors that must be reported.
|
||||
pub fn resolve<'tcx>(
|
||||
region_rels: &RegionRelations<'_, '_, 'tcx>,
|
||||
var_origins: VarOrigins,
|
||||
data: RegionConstraintData<'tcx>,
|
||||
) -> (
|
||||
LexicalRegionResolutions<'tcx>,
|
||||
Vec<RegionResolutionError<'tcx>>,
|
||||
) {
|
||||
debug!("RegionConstraintData: resolve_regions()");
|
||||
let mut errors = vec![];
|
||||
let mut resolver = LexicalResolver {
|
||||
region_rels,
|
||||
var_origins,
|
||||
data,
|
||||
};
|
||||
let values = resolver.infer_variable_values(&mut errors);
|
||||
(values, errors)
|
||||
}
|
||||
|
||||
/// Contains the result of lexical region resolution. Offers methods
|
||||
/// to look up the final value of a region variable.
|
||||
pub struct LexicalRegionResolutions<'tcx> {
|
||||
values: IndexVec<RegionVid, VarValue<'tcx>>,
|
||||
error_region: ty::Region<'tcx>,
|
||||
}
|
||||
|
||||
#[derive(Copy, Clone, Debug)]
|
||||
enum VarValue<'tcx> {
|
||||
Value(Region<'tcx>),
|
||||
ErrorValue,
|
||||
}
|
||||
|
||||
#[derive(Clone, Debug)]
|
||||
pub enum RegionResolutionError<'tcx> {
|
||||
/// `ConcreteFailure(o, a, b)`:
|
||||
///
|
||||
/// `o` requires that `a <= b`, but this does not hold
|
||||
ConcreteFailure(SubregionOrigin<'tcx>, Region<'tcx>, Region<'tcx>),
|
||||
|
||||
/// `GenericBoundFailure(p, s, a)
|
||||
///
|
||||
/// The parameter/associated-type `p` must be known to outlive the lifetime
|
||||
/// `a` (but none of the known bounds are sufficient).
|
||||
GenericBoundFailure(SubregionOrigin<'tcx>, GenericKind<'tcx>, Region<'tcx>),
|
||||
|
||||
/// `SubSupConflict(v, sub_origin, sub_r, sup_origin, sup_r)`:
|
||||
///
|
||||
/// Could not infer a value for `v` because `sub_r <= v` (due to
|
||||
/// `sub_origin`) but `v <= sup_r` (due to `sup_origin`) and
|
||||
/// `sub_r <= sup_r` does not hold.
|
||||
SubSupConflict(
|
||||
RegionVariableOrigin,
|
||||
SubregionOrigin<'tcx>,
|
||||
Region<'tcx>,
|
||||
SubregionOrigin<'tcx>,
|
||||
Region<'tcx>,
|
||||
),
|
||||
}
|
||||
|
||||
struct RegionAndOrigin<'tcx> {
|
||||
region: Region<'tcx>,
|
||||
origin: SubregionOrigin<'tcx>,
|
||||
}
|
||||
|
||||
type RegionGraph<'tcx> = graph::Graph<(), Constraint<'tcx>>;
|
||||
|
||||
struct LexicalResolver<'cx, 'gcx: 'tcx, 'tcx: 'cx> {
|
||||
region_rels: &'cx RegionRelations<'cx, 'gcx, 'tcx>,
|
||||
var_origins: VarOrigins,
|
||||
data: RegionConstraintData<'tcx>,
|
||||
}
|
||||
|
||||
impl<'cx, 'gcx, 'tcx> LexicalResolver<'cx, 'gcx, 'tcx> {
|
||||
fn infer_variable_values(
|
||||
&mut self,
|
||||
errors: &mut Vec<RegionResolutionError<'tcx>>,
|
||||
) -> LexicalRegionResolutions<'tcx> {
|
||||
let mut var_data = self.construct_var_data(self.region_rels.tcx);
|
||||
|
||||
// Dorky hack to cause `dump_constraints` to only get called
|
||||
// if debug mode is enabled:
|
||||
debug!(
|
||||
"----() End constraint listing (context={:?}) {:?}---",
|
||||
self.region_rels.context,
|
||||
self.dump_constraints(self.region_rels)
|
||||
);
|
||||
graphviz::maybe_print_constraints_for(&self.data, self.region_rels);
|
||||
|
||||
let graph = self.construct_graph();
|
||||
self.expand_givens(&graph);
|
||||
self.expansion(&mut var_data);
|
||||
self.collect_errors(&mut var_data, errors);
|
||||
self.collect_var_errors(&var_data, &graph, errors);
|
||||
var_data
|
||||
}
|
||||
|
||||
fn num_vars(&self) -> usize {
|
||||
self.var_origins.len()
|
||||
}
|
||||
|
||||
/// Initially, the value for all variables is set to `'empty`, the
|
||||
/// empty region. The `expansion` phase will grow this larger.
|
||||
fn construct_var_data(&self, tcx: TyCtxt<'_, '_, 'tcx>) -> LexicalRegionResolutions<'tcx> {
|
||||
LexicalRegionResolutions {
|
||||
error_region: tcx.types.re_static,
|
||||
values: (0..self.num_vars())
|
||||
.map(|_| VarValue::Value(tcx.types.re_empty))
|
||||
.collect(),
|
||||
}
|
||||
}
|
||||
|
||||
fn dump_constraints(&self, free_regions: &RegionRelations<'_, '_, 'tcx>) {
|
||||
debug!(
|
||||
"----() Start constraint listing (context={:?}) ()----",
|
||||
free_regions.context
|
||||
);
|
||||
for (idx, (constraint, _)) in self.data.constraints.iter().enumerate() {
|
||||
debug!("Constraint {} => {:?}", idx, constraint);
|
||||
}
|
||||
}
|
||||
|
||||
fn expand_givens(&mut self, graph: &RegionGraph) {
|
||||
// Givens are a kind of horrible hack to account for
|
||||
// constraints like 'c <= '0 that are known to hold due to
|
||||
// closure signatures (see the comment above on the `givens`
|
||||
// field). They should go away. But until they do, the role
|
||||
// of this fn is to account for the transitive nature:
|
||||
//
|
||||
// Given 'c <= '0
|
||||
// and '0 <= '1
|
||||
// then 'c <= '1
|
||||
|
||||
let seeds: Vec<_> = self.data.givens.iter().cloned().collect();
|
||||
for (r, vid) in seeds {
|
||||
// While all things transitively reachable in the graph
|
||||
// from the variable (`'0` in the example above).
|
||||
let seed_index = NodeIndex(vid.index as usize);
|
||||
for succ_index in graph.depth_traverse(seed_index, OUTGOING) {
|
||||
let succ_index = succ_index.0;
|
||||
|
||||
// The first N nodes correspond to the region
|
||||
// variables. Other nodes correspond to constant
|
||||
// regions.
|
||||
if succ_index < self.num_vars() {
|
||||
let succ_vid = RegionVid::new(succ_index);
|
||||
|
||||
// Add `'c <= '1`.
|
||||
self.data.givens.insert((r, succ_vid));
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
fn expansion(&self, var_values: &mut LexicalRegionResolutions<'tcx>) {
|
||||
self.iterate_until_fixed_point("Expansion", |constraint, origin| {
|
||||
debug!("expansion: constraint={:?} origin={:?}", constraint, origin);
|
||||
match *constraint {
|
||||
Constraint::RegSubVar(a_region, b_vid) => {
|
||||
let b_data = var_values.value_mut(b_vid);
|
||||
self.expand_node(a_region, b_vid, b_data)
|
||||
}
|
||||
Constraint::VarSubVar(a_vid, b_vid) => match *var_values.value(a_vid) {
|
||||
VarValue::ErrorValue => false,
|
||||
VarValue::Value(a_region) => {
|
||||
let b_node = var_values.value_mut(b_vid);
|
||||
self.expand_node(a_region, b_vid, b_node)
|
||||
}
|
||||
},
|
||||
Constraint::RegSubReg(..) | Constraint::VarSubReg(..) => {
|
||||
// These constraints are checked after expansion
|
||||
// is done, in `collect_errors`.
|
||||
false
|
||||
}
|
||||
}
|
||||
})
|
||||
}
|
||||
|
||||
fn expand_node(
|
||||
&self,
|
||||
a_region: Region<'tcx>,
|
||||
b_vid: RegionVid,
|
||||
b_data: &mut VarValue<'tcx>,
|
||||
) -> bool {
|
||||
debug!("expand_node({:?}, {:?} == {:?})", a_region, b_vid, b_data);
|
||||
|
||||
// Check if this relationship is implied by a given.
|
||||
match *a_region {
|
||||
ty::ReEarlyBound(_) | ty::ReFree(_) => if self.data.givens.contains(&(a_region, b_vid))
|
||||
{
|
||||
debug!("given");
|
||||
return false;
|
||||
},
|
||||
_ => {}
|
||||
}
|
||||
|
||||
match *b_data {
|
||||
VarValue::Value(cur_region) => {
|
||||
let lub = self.lub_concrete_regions(a_region, cur_region);
|
||||
if lub == cur_region {
|
||||
return false;
|
||||
}
|
||||
|
||||
debug!(
|
||||
"Expanding value of {:?} from {:?} to {:?}",
|
||||
b_vid,
|
||||
cur_region,
|
||||
lub
|
||||
);
|
||||
|
||||
*b_data = VarValue::Value(lub);
|
||||
return true;
|
||||
}
|
||||
|
||||
VarValue::ErrorValue => {
|
||||
return false;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
fn lub_concrete_regions(&self, a: Region<'tcx>, b: Region<'tcx>) -> Region<'tcx> {
|
||||
let tcx = self.region_rels.tcx;
|
||||
match (a, b) {
|
||||
(&ReLateBound(..), _) | (_, &ReLateBound(..)) | (&ReErased, _) | (_, &ReErased) => {
|
||||
bug!("cannot relate region: LUB({:?}, {:?})", a, b);
|
||||
}
|
||||
|
||||
(r @ &ReStatic, _) | (_, r @ &ReStatic) => {
|
||||
r // nothing lives longer than static
|
||||
}
|
||||
|
||||
(&ReEmpty, r) | (r, &ReEmpty) => {
|
||||
r // everything lives longer than empty
|
||||
}
|
||||
|
||||
(&ReVar(v_id), _) | (_, &ReVar(v_id)) => {
|
||||
span_bug!(
|
||||
self.var_origins[v_id].span(),
|
||||
"lub_concrete_regions invoked with non-concrete \
|
||||
regions: {:?}, {:?}",
|
||||
a,
|
||||
b
|
||||
);
|
||||
}
|
||||
|
||||
(&ReEarlyBound(_), &ReScope(s_id)) |
|
||||
(&ReScope(s_id), &ReEarlyBound(_)) |
|
||||
(&ReFree(_), &ReScope(s_id)) |
|
||||
(&ReScope(s_id), &ReFree(_)) => {
|
||||
// A "free" region can be interpreted as "some region
|
||||
// at least as big as fr.scope". So, we can
|
||||
// reasonably compare free regions and scopes:
|
||||
let fr_scope = match (a, b) {
|
||||
(&ReEarlyBound(ref br), _) | (_, &ReEarlyBound(ref br)) => self.region_rels
|
||||
.region_scope_tree
|
||||
.early_free_scope(self.region_rels.tcx, br),
|
||||
(&ReFree(ref fr), _) | (_, &ReFree(ref fr)) => self.region_rels
|
||||
.region_scope_tree
|
||||
.free_scope(self.region_rels.tcx, fr),
|
||||
_ => bug!(),
|
||||
};
|
||||
let r_id = self.region_rels
|
||||
.region_scope_tree
|
||||
.nearest_common_ancestor(fr_scope, s_id);
|
||||
if r_id == fr_scope {
|
||||
// if the free region's scope `fr.scope` is bigger than
|
||||
// the scope region `s_id`, then the LUB is the free
|
||||
// region itself:
|
||||
match (a, b) {
|
||||
(_, &ReScope(_)) => return a,
|
||||
(&ReScope(_), _) => return b,
|
||||
_ => bug!(),
|
||||
}
|
||||
}
|
||||
|
||||
// otherwise, we don't know what the free region is,
|
||||
// so we must conservatively say the LUB is static:
|
||||
tcx.types.re_static
|
||||
}
|
||||
|
||||
(&ReScope(a_id), &ReScope(b_id)) => {
|
||||
// The region corresponding to an outer block is a
|
||||
// subtype of the region corresponding to an inner
|
||||
// block.
|
||||
let lub = self.region_rels
|
||||
.region_scope_tree
|
||||
.nearest_common_ancestor(a_id, b_id);
|
||||
tcx.mk_region(ReScope(lub))
|
||||
}
|
||||
|
||||
(&ReEarlyBound(_), &ReEarlyBound(_)) |
|
||||
(&ReFree(_), &ReEarlyBound(_)) |
|
||||
(&ReEarlyBound(_), &ReFree(_)) |
|
||||
(&ReFree(_), &ReFree(_)) => self.region_rels.lub_free_regions(a, b),
|
||||
|
||||
// For these types, we cannot define any additional
|
||||
// relationship:
|
||||
(&ReSkolemized(..), _) | (_, &ReSkolemized(..)) => if a == b {
|
||||
a
|
||||
} else {
|
||||
tcx.types.re_static
|
||||
},
|
||||
}
|
||||
}

    /// After expansion is complete, go and check upper bounds (i.e.,
    /// cases where the region cannot grow larger than a fixed point)
    /// and check that they are satisfied.
    fn collect_errors(
        &self,
        var_data: &mut LexicalRegionResolutions<'tcx>,
        errors: &mut Vec<RegionResolutionError<'tcx>>,
    ) {
        for (constraint, origin) in &self.data.constraints {
            debug!(
                "collect_errors: constraint={:?} origin={:?}",
                constraint,
                origin
            );
            match *constraint {
                Constraint::RegSubVar(..) | Constraint::VarSubVar(..) => {
                    // Expansion will ensure that these constraints hold. Ignore.
                }

                Constraint::RegSubReg(sub, sup) => {
                    if self.region_rels.is_subregion_of(sub, sup) {
                        continue;
                    }

                    debug!(
                        "collect_errors: region error at {:?}: \
                         cannot verify that {:?} <= {:?}",
                        origin,
                        sub,
                        sup
                    );

                    errors.push(RegionResolutionError::ConcreteFailure(
                        (*origin).clone(),
                        sub,
                        sup,
                    ));
                }

                Constraint::VarSubReg(a_vid, b_region) => {
                    let a_data = var_data.value_mut(a_vid);
                    debug!("contraction: {:?} == {:?}, {:?}", a_vid, a_data, b_region);

                    let a_region = match *a_data {
                        VarValue::ErrorValue => continue,
                        VarValue::Value(a_region) => a_region,
                    };

                    // Do not report these errors immediately:
                    // instead, set the variable value to error and
                    // collect them later.
                    if !self.region_rels.is_subregion_of(a_region, b_region) {
                        debug!(
                            "collect_errors: region error at {:?}: \
                             cannot verify that {:?}={:?} <= {:?}",
                            origin,
                            a_vid,
                            a_region,
                            b_region
                        );
                        *a_data = VarValue::ErrorValue;
                    }
                }
            }
        }

        for verify in &self.data.verifys {
            debug!("collect_errors: verify={:?}", verify);
            let sub = var_data.normalize(verify.region);

            // This was an inference variable which didn't get
            // constrained, therefore it can be assumed to hold.
            if let ty::ReEmpty = *sub {
                continue;
            }

            if self.bound_is_met(&verify.bound, var_data, sub) {
                continue;
            }

            debug!(
                "collect_errors: region error at {:?}: \
                 cannot verify that {:?} <= {:?}",
                verify.origin,
                verify.region,
                verify.bound
            );

            errors.push(RegionResolutionError::GenericBoundFailure(
                verify.origin.clone(),
                verify.kind.clone(),
                sub,
            ));
        }
    }

    /// Go over the variables that were declared to be error variables
    /// and create a `RegionResolutionError` for each of them.
    fn collect_var_errors(
        &self,
        var_data: &LexicalRegionResolutions<'tcx>,
        graph: &RegionGraph<'tcx>,
        errors: &mut Vec<RegionResolutionError<'tcx>>,
    ) {
        debug!("collect_var_errors");

        // This is the best way that I have found to suppress
        // duplicate and related errors. Basically we keep a set of
        // flags for every node. Whenever an error occurs, we will
        // walk some portion of the graph looking to find pairs of
        // conflicting regions to report to the user. As we walk, we
        // trip the flags from false to true, and if we find that
        // we've already reported an error involving any particular
        // node we just stop and don't report the current error. The
        // idea is to report errors that derive from independent
        // regions of the graph, but not those that derive from
        // overlapping locations.
        let mut dup_vec = vec![u32::MAX; self.num_vars()];

        for (node_vid, value) in var_data.values.iter_enumerated() {
            match *value {
                VarValue::Value(_) => { /* Inference successful */ }
                VarValue::ErrorValue => {
                    /* Inference impossible, this value contains
                       inconsistent constraints.

                       I think that in this case we should report an
                       error now---unlike the case above, we can't
                       wait to see whether the user needs the result
                       of this variable. The reason is that the mere
                       existence of this variable implies that the
                       region graph is inconsistent, whether or not it
                       is used.

                       For example, we may have created a region
                       variable that is the GLB of two other regions
                       which do not have a GLB. Even if that variable
                       is not used, it implies that those two regions
                       *should* have a GLB.

                       At least I think this is true. It may be that
                       the mere existence of a conflict in a region variable
                       that is not used is not a problem, so if this rule
                       starts to create problems we'll have to revisit
                       this portion of the code and think hard about it. =) */
                    self.collect_error_for_expanding_node(graph, &mut dup_vec, node_vid, errors);
                }
            }
        }
    }

    fn construct_graph(&self) -> RegionGraph<'tcx> {
        let num_vars = self.num_vars();

        let mut graph = graph::Graph::new();

        for _ in 0..num_vars {
            graph.add_node(());
        }

        // Issue #30438: two distinct dummy nodes, one for incoming
        // edges (dummy_source) and another for outgoing edges
        // (dummy_sink). In `dummy -> a -> b -> dummy`, using one
        // dummy node leads one to think (erroneously) there exists a
        // path from `b` to `a`. Two dummy nodes sidestep the issue.
        let dummy_source = graph.add_node(());
        let dummy_sink = graph.add_node(());

        for (constraint, _) in &self.data.constraints {
            match *constraint {
                Constraint::VarSubVar(a_id, b_id) => {
                    graph.add_edge(
                        NodeIndex(a_id.index as usize),
                        NodeIndex(b_id.index as usize),
                        *constraint,
                    );
                }
                Constraint::RegSubVar(_, b_id) => {
                    graph.add_edge(dummy_source, NodeIndex(b_id.index as usize), *constraint);
                }
                Constraint::VarSubReg(a_id, _) => {
                    graph.add_edge(NodeIndex(a_id.index as usize), dummy_sink, *constraint);
                }
                Constraint::RegSubReg(..) => {
                    // this would be an edge from `dummy_source` to
                    // `dummy_sink`; just ignore it.
                }
            }
        }

        return graph;
    }
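
    // For orientation: with the encoding above, a constraint set such as
    // `{ 'static <= '0, '0 <= '1, '1 <= 'a }` (hypothetical variable
    // indices, for illustration only) becomes the edges
    // `dummy_source -> 0`, `0 -> 1`, and `1 -> dummy_sink`;
    // concrete-to-concrete constraints contribute no edge.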

    fn collect_error_for_expanding_node(
        &self,
        graph: &RegionGraph<'tcx>,
        dup_vec: &mut [u32],
        node_idx: RegionVid,
        errors: &mut Vec<RegionResolutionError<'tcx>>,
    ) {
        // Errors in expanding nodes result from a lower-bound that is
        // not contained by an upper-bound.
        let (mut lower_bounds, lower_dup) =
            self.collect_concrete_regions(graph, node_idx, graph::INCOMING, dup_vec);
        let (mut upper_bounds, upper_dup) =
            self.collect_concrete_regions(graph, node_idx, graph::OUTGOING, dup_vec);

        if lower_dup || upper_dup {
            return;
        }

        // We place free regions first because we are special casing
        // SubSupConflict(ReFree, ReFree) when reporting error, and so
        // the user will more likely get a specific suggestion.
        fn region_order_key(x: &RegionAndOrigin) -> u8 {
            match *x.region {
                ReEarlyBound(_) => 0,
                ReFree(_) => 1,
                _ => 2,
            }
        }
        lower_bounds.sort_by_key(region_order_key);
        upper_bounds.sort_by_key(region_order_key);

        for lower_bound in &lower_bounds {
            for upper_bound in &upper_bounds {
                if !self.region_rels
                    .is_subregion_of(lower_bound.region, upper_bound.region)
                {
                    let origin = self.var_origins[node_idx].clone();
                    debug!(
                        "region inference error at {:?} for {:?}: SubSupConflict sub: {:?} \
                         sup: {:?}",
                        origin,
                        node_idx,
                        lower_bound.region,
                        upper_bound.region
                    );
                    errors.push(RegionResolutionError::SubSupConflict(
                        origin,
                        lower_bound.origin.clone(),
                        lower_bound.region,
                        upper_bound.origin.clone(),
                        upper_bound.region,
                    ));
                    return;
                }
            }
        }

        span_bug!(
            self.var_origins[node_idx].span(),
            "collect_error_for_expanding_node() could not find \
             error for var {:?}, lower_bounds={:?}, \
             upper_bounds={:?}",
            node_idx,
            lower_bounds,
            upper_bounds
        );
    }

    fn collect_concrete_regions(
        &self,
        graph: &RegionGraph<'tcx>,
        orig_node_idx: RegionVid,
        dir: Direction,
        dup_vec: &mut [u32],
    ) -> (Vec<RegionAndOrigin<'tcx>>, bool) {
        struct WalkState<'tcx> {
            set: FxHashSet<RegionVid>,
            stack: Vec<RegionVid>,
            result: Vec<RegionAndOrigin<'tcx>>,
            dup_found: bool,
        }
        let mut state = WalkState {
            set: FxHashSet(),
            stack: vec![orig_node_idx],
            result: Vec::new(),
            dup_found: false,
        };
        state.set.insert(orig_node_idx);

        // to start off the process, walk the source node in the
        // direction specified
        process_edges(&self.data, &mut state, graph, orig_node_idx, dir);

        while !state.stack.is_empty() {
            let node_idx = state.stack.pop().unwrap();

            // check whether we've visited this node on some previous walk
            if dup_vec[node_idx.index as usize] == u32::MAX {
                dup_vec[node_idx.index as usize] = orig_node_idx.index;
            } else if dup_vec[node_idx.index as usize] != orig_node_idx.index {
                state.dup_found = true;
            }

            debug!(
                "collect_concrete_regions(orig_node_idx={:?}, node_idx={:?})",
                orig_node_idx,
                node_idx
            );

            process_edges(&self.data, &mut state, graph, node_idx, dir);
        }

        let WalkState {
            result, dup_found, ..
        } = state;
        return (result, dup_found);

        fn process_edges<'tcx>(
            this: &RegionConstraintData<'tcx>,
            state: &mut WalkState<'tcx>,
            graph: &RegionGraph<'tcx>,
            source_vid: RegionVid,
            dir: Direction,
        ) {
            debug!("process_edges(source_vid={:?}, dir={:?})", source_vid, dir);

            let source_node_index = NodeIndex(source_vid.index as usize);
            for (_, edge) in graph.adjacent_edges(source_node_index, dir) {
                match edge.data {
                    Constraint::VarSubVar(from_vid, to_vid) => {
                        let opp_vid = if from_vid == source_vid {
                            to_vid
                        } else {
                            from_vid
                        };
                        if state.set.insert(opp_vid) {
                            state.stack.push(opp_vid);
                        }
                    }

                    Constraint::RegSubVar(region, _) | Constraint::VarSubReg(_, region) => {
                        state.result.push(RegionAndOrigin {
                            region,
                            origin: this.constraints.get(&edge.data).unwrap().clone(),
                        });
                    }

                    Constraint::RegSubReg(..) => panic!(
                        "cannot reach reg-sub-reg edge in region inference \
                         post-processing"
                    ),
                }
            }
        }
    }

    fn iterate_until_fixed_point<F>(&self, tag: &str, mut body: F)
    where
        F: FnMut(&Constraint<'tcx>, &SubregionOrigin<'tcx>) -> bool,
    {
        let mut iteration = 0;
        let mut changed = true;
        while changed {
            changed = false;
            iteration += 1;
            debug!("---- {} Iteration {}{}", "#", tag, iteration);
            for (constraint, origin) in &self.data.constraints {
                let edge_changed = body(constraint, origin);
                if edge_changed {
                    debug!("Updated due to constraint {:?}", constraint);
                    changed = true;
                }
            }
        }
        debug!("---- {} Complete after {} iteration(s)", tag, iteration);
    }

    fn bound_is_met(
        &self,
        bound: &VerifyBound<'tcx>,
        var_values: &LexicalRegionResolutions<'tcx>,
        min: ty::Region<'tcx>,
    ) -> bool {
        match bound {
            VerifyBound::AnyRegion(rs) => rs.iter()
                .map(|&r| var_values.normalize(r))
                .any(|r| self.region_rels.is_subregion_of(min, r)),

            VerifyBound::AllRegions(rs) => rs.iter()
                .map(|&r| var_values.normalize(r))
                .all(|r| self.region_rels.is_subregion_of(min, r)),

            VerifyBound::AnyBound(bs) => bs.iter().any(|b| self.bound_is_met(b, var_values, min)),

            VerifyBound::AllBounds(bs) => bs.iter().all(|b| self.bound_is_met(b, var_values, min)),
        }
    }
}
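
// Illustrative reading of `bound_is_met` (hypothetical regions, not taken
// from the sources above): for `min = 'x` and
//
//     VerifyBound::AnyBound(vec![
//         VerifyBound::AllRegions(vec!['a, 'b]),
//         VerifyBound::AnyRegion(vec!['static]),
//     ])
//
// the bound is met if either `'x <= 'a && 'x <= 'b` holds, or `'x <= 'static`
// does (which is always true), mirroring the any/all recursion above.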

impl<'tcx> fmt::Debug for RegionAndOrigin<'tcx> {
    fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
        write!(f, "RegionAndOrigin({:?},{:?})", self.region, self.origin)
    }
}

impl<'tcx> LexicalRegionResolutions<'tcx> {
    fn normalize(&self, r: ty::Region<'tcx>) -> ty::Region<'tcx> {
        match *r {
            ty::ReVar(rid) => self.resolve_var(rid),
            _ => r,
        }
    }

    fn value(&self, rid: RegionVid) -> &VarValue<'tcx> {
        &self.values[rid]
    }

    fn value_mut(&mut self, rid: RegionVid) -> &mut VarValue<'tcx> {
        &mut self.values[rid]
    }

    pub fn resolve_var(&self, rid: RegionVid) -> ty::Region<'tcx> {
        let result = match self.values[rid] {
            VarValue::Value(r) => r,
            VarValue::ErrorValue => self.error_region,
        };
        debug!("resolve_var({:?}) = {:?}", rid, result);
        result
    }
}
@ -67,7 +67,7 @@ impl<'combine, 'infcx, 'gcx, 'tcx> TypeRelation<'infcx, 'gcx, 'tcx>
|
||||
b);
|
||||
|
||||
let origin = Subtype(self.fields.trace.clone());
|
||||
Ok(self.fields.infcx.region_vars.lub_regions(origin, a, b))
|
||||
Ok(self.fields.infcx.borrow_region_constraints().lub_regions(self.tcx(), origin, a, b))
|
||||
}
|
||||
|
||||
fn binders<T>(&mut self, a: &ty::Binder<T>, b: &ty::Binder<T>)
|
||||
|
@ -16,7 +16,6 @@ pub use self::SubregionOrigin::*;
|
||||
pub use self::ValuePairs::*;
|
||||
pub use ty::IntVarValue;
|
||||
pub use self::freshen::TypeFreshener;
|
||||
pub use self::region_inference::{GenericKind, VerifyBound};
|
||||
|
||||
use hir::def_id::DefId;
|
||||
use middle::free_region::{FreeRegionMap, RegionRelations};
|
||||
@ -31,7 +30,7 @@ use ty::fold::{TypeFoldable, TypeFolder, TypeVisitor};
|
||||
use ty::relate::RelateResult;
|
||||
use traits::{self, ObligationCause, PredicateObligations, Reveal};
|
||||
use rustc_data_structures::unify::{self, UnificationTable};
|
||||
use std::cell::{Cell, RefCell, Ref};
|
||||
use std::cell::{Cell, RefCell, Ref, RefMut};
|
||||
use std::fmt;
|
||||
use syntax::ast;
|
||||
use errors::DiagnosticBuilder;
|
||||
@ -41,7 +40,9 @@ use arena::DroplessArena;
|
||||
|
||||
use self::combine::CombineFields;
|
||||
use self::higher_ranked::HrMatchResult;
|
||||
use self::region_inference::{RegionVarBindings, RegionSnapshot};
|
||||
use self::region_constraints::{RegionConstraintCollector, RegionSnapshot};
|
||||
use self::region_constraints::{GenericKind, VerifyBound, RegionConstraintData, VarOrigins};
|
||||
use self::lexical_region_resolve::LexicalRegionResolutions;
|
||||
use self::type_variable::TypeVariableOrigin;
|
||||
use self::unify_key::ToType;
|
||||
|
||||
@ -54,13 +55,17 @@ mod glb;
|
||||
mod higher_ranked;
|
||||
pub mod lattice;
|
||||
mod lub;
|
||||
pub mod region_inference;
|
||||
pub mod region_constraints;
|
||||
mod lexical_region_resolve;
|
||||
mod outlives;
|
||||
pub mod resolve;
|
||||
mod freshen;
|
||||
mod sub;
|
||||
pub mod type_variable;
|
||||
pub mod unify_key;
|
||||
|
||||
pub use self::outlives::env::OutlivesEnvironment;
|
||||
|
||||
#[must_use]
|
||||
pub struct InferOk<'tcx, T> {
|
||||
pub value: T,
|
||||
@ -98,8 +103,15 @@ pub struct InferCtxt<'a, 'gcx: 'a+'tcx, 'tcx: 'a> {
|
||||
// Map from floating variable to the kind of float it represents
|
||||
float_unification_table: RefCell<UnificationTable<ty::FloatVid>>,
|
||||
|
||||
// For region variables.
|
||||
region_vars: RegionVarBindings<'a, 'gcx, 'tcx>,
|
||||
// Tracks the set of region variables and the constraints between
|
||||
// them. This is initially `Some(_)` but when
|
||||
// `resolve_regions_and_report_errors` is invoked, this gets set
|
||||
// to `None` -- further attempts to perform unification etc may
|
||||
// fail if new region constraints would've been added.
|
||||
region_constraints: RefCell<Option<RegionConstraintCollector<'tcx>>>,
|
||||
|
||||
// Once region inference is done, the values for each variable.
|
||||
lexical_region_resolutions: RefCell<Option<LexicalRegionResolutions<'tcx>>>,
|
||||
|
||||
/// Caches the results of trait selection. This cache is used
|
||||
/// for things that have to do with the parameters in scope.
|
||||
@ -135,6 +147,39 @@ pub struct InferCtxt<'a, 'gcx: 'a+'tcx, 'tcx: 'a> {
|
||||
|
||||
// This flag is true while there is an active snapshot.
|
||||
in_snapshot: Cell<bool>,
|
||||
|
||||
// A set of constraints that regionck must validate. Each
|
||||
// constraint has the form `T:'a`, meaning "some type `T` must
|
||||
// outlive the lifetime 'a". These constraints derive from
|
||||
// instantiated type parameters. So if you had a struct defined
|
||||
// like
|
||||
//
|
||||
// struct Foo<T:'static> { ... }
|
||||
//
|
||||
// then in some expression `let x = Foo { ... }` it will
|
||||
// instantiate the type parameter `T` with a fresh type `$0`. At
|
||||
// the same time, it will record a region obligation of
|
||||
// `$0:'static`. This will get checked later by regionck. (We
|
||||
// can't generally check these things right away because we have
|
||||
// to wait until types are resolved.)
|
||||
//
|
||||
// These are stored in a map keyed to the id of the innermost
|
||||
// enclosing fn body / static initializer expression. This is
|
||||
// because the location where the obligation was incurred can be
|
||||
// relevant with respect to which sublifetime assumptions are in
|
||||
// place. The reason that we store under the fn-id, and not
|
||||
// something more fine-grained, is so that it is easier for
|
||||
// regionck to be sure that it has found *all* the region
|
||||
// obligations (otherwise, it's easy to fail to walk to a
|
||||
// particular node-id).
|
||||
//
|
||||
// Before running `resolve_regions_and_report_errors`, the creator
|
||||
// of the inference context is expected to invoke
|
||||
// `process_region_obligations` (defined in `self::region_obligations`)
|
||||
// for each body-id in this map, which will process the
|
||||
// obligations within. This is expected to be done 'late enough'
|
||||
// that all type inference variables have been bound and so forth.
|
||||
region_obligations: RefCell<Vec<(ast::NodeId, RegionObligation<'tcx>)>>,
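    // A rough sketch of the expected flow (the actual entry points live in
    // `infer::outlives`; the values below are purely illustrative):
    //
    //     // while type-checking some body `body_id`:
    //     infcx.register_region_obligation(body_id, RegionObligation {
    //         sup_type,                 // e.g. the fresh `$0` above
    //         sub_region,               // e.g. `'static`
    //         cause,
    //     });
    //
    //     // later, from regionck, once type variables are resolved:
    //     infcx.process_registered_region_obligations(
    //         region_bound_pairs, implicit_region_bound, param_env, body_id);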
|
||||
}
|
||||
|
||||
/// A map returned by `skolemize_late_bound_regions()` indicating the skolemized
|
||||
@ -248,10 +293,6 @@ pub enum SubregionOrigin<'tcx> {
|
||||
item_name: ast::Name,
|
||||
impl_item_def_id: DefId,
|
||||
trait_item_def_id: DefId,
|
||||
|
||||
// this is `Some(_)` if this error arises from the bug fix for
|
||||
// #18937. This is a temporary measure.
|
||||
lint_id: Option<ast::NodeId>,
|
||||
},
|
||||
}
|
||||
|
||||
@ -280,7 +321,7 @@ pub enum LateBoundRegionConversionTime {
|
||||
/// Reasons to create a region inference variable
|
||||
///
|
||||
/// See `error_reporting` module for more details
|
||||
#[derive(Clone, Debug)]
|
||||
#[derive(Copy, Clone, Debug)]
|
||||
pub enum RegionVariableOrigin {
|
||||
// Region variables created for ill-categorized reasons,
|
||||
// mostly indicates places in need of refactoring
|
||||
@ -308,6 +349,20 @@ pub enum RegionVariableOrigin {
|
||||
UpvarRegion(ty::UpvarId, Span),
|
||||
|
||||
BoundRegionInCoherence(ast::Name),
|
||||
|
||||
// This origin is used for the inference variables that we create
|
||||
// during NLL region processing.
|
||||
NLL(NLLRegionVariableOrigin),
|
||||
}
|
||||
|
||||
#[derive(Copy, Clone, Debug, PartialEq, Eq, Hash)]
|
||||
pub enum NLLRegionVariableOrigin {
|
||||
// During NLL region processing, we create variables for free
|
||||
// regions that we encounter in the function signature and
|
||||
// elsewhere. This origin indicates we've got one of those.
|
||||
FreeRegion,
|
||||
|
||||
Inferred(::mir::visit::TyContext),
|
||||
}
|
||||
|
||||
#[derive(Copy, Clone, Debug)]
|
||||
@ -317,6 +372,14 @@ pub enum FixupError {
|
||||
UnresolvedTy(TyVid)
|
||||
}
|
||||
|
||||
/// See the `region_obligations` field for more information.
|
||||
#[derive(Clone)]
|
||||
pub struct RegionObligation<'tcx> {
|
||||
pub sub_region: ty::Region<'tcx>,
|
||||
pub sup_type: Ty<'tcx>,
|
||||
pub cause: ObligationCause<'tcx>,
|
||||
}
|
||||
|
||||
impl fmt::Display for FixupError {
|
||||
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
|
||||
use self::FixupError::*;
|
||||
@ -379,13 +442,15 @@ impl<'a, 'gcx, 'tcx> InferCtxtBuilder<'a, 'gcx, 'tcx> {
|
||||
type_variables: RefCell::new(type_variable::TypeVariableTable::new()),
|
||||
int_unification_table: RefCell::new(UnificationTable::new()),
|
||||
float_unification_table: RefCell::new(UnificationTable::new()),
|
||||
region_vars: RegionVarBindings::new(tcx),
|
||||
region_constraints: RefCell::new(Some(RegionConstraintCollector::new())),
|
||||
lexical_region_resolutions: RefCell::new(None),
|
||||
selection_cache: traits::SelectionCache::new(),
|
||||
evaluation_cache: traits::EvaluationCache::new(),
|
||||
reported_trait_errors: RefCell::new(FxHashMap()),
|
||||
tainted_by_errors_flag: Cell::new(false),
|
||||
err_count_on_creation: tcx.sess.err_count(),
|
||||
in_snapshot: Cell::new(false),
|
||||
region_obligations: RefCell::new(vec![]),
|
||||
}))
|
||||
}
|
||||
}
|
||||
@ -412,7 +477,8 @@ pub struct CombinedSnapshot<'a, 'tcx:'a> {
|
||||
type_snapshot: type_variable::Snapshot,
|
||||
int_snapshot: unify::Snapshot<ty::IntVid>,
|
||||
float_snapshot: unify::Snapshot<ty::FloatVid>,
|
||||
region_vars_snapshot: RegionSnapshot,
|
||||
region_constraints_snapshot: RegionSnapshot,
|
||||
region_obligations_snapshot: usize,
|
||||
was_in_snapshot: bool,
|
||||
_in_progress_tables: Option<Ref<'a, ty::TypeckTables<'tcx>>>,
|
||||
}
|
||||
@ -720,7 +786,8 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
|
||||
type_snapshot: self.type_variables.borrow_mut().snapshot(),
|
||||
int_snapshot: self.int_unification_table.borrow_mut().snapshot(),
|
||||
float_snapshot: self.float_unification_table.borrow_mut().snapshot(),
|
||||
region_vars_snapshot: self.region_vars.start_snapshot(),
|
||||
region_constraints_snapshot: self.borrow_region_constraints().start_snapshot(),
|
||||
region_obligations_snapshot: self.region_obligations.borrow().len(),
|
||||
was_in_snapshot: in_snapshot,
|
||||
// Borrow tables "in progress" (i.e. during typeck)
|
||||
// to ban writes from within a snapshot to them.
|
||||
@ -736,7 +803,8 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
|
||||
type_snapshot,
|
||||
int_snapshot,
|
||||
float_snapshot,
|
||||
region_vars_snapshot,
|
||||
region_constraints_snapshot,
|
||||
region_obligations_snapshot,
|
||||
was_in_snapshot,
|
||||
_in_progress_tables } = snapshot;
|
||||
|
||||
@ -754,8 +822,11 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
|
||||
self.float_unification_table
|
||||
.borrow_mut()
|
||||
.rollback_to(float_snapshot);
|
||||
self.region_vars
|
||||
.rollback_to(region_vars_snapshot);
|
||||
self.region_obligations
|
||||
.borrow_mut()
|
||||
.truncate(region_obligations_snapshot);
|
||||
self.borrow_region_constraints()
|
||||
.rollback_to(region_constraints_snapshot);
|
||||
}
|
||||
|
||||
fn commit_from(&self, snapshot: CombinedSnapshot) {
|
||||
@ -764,7 +835,8 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
|
||||
type_snapshot,
|
||||
int_snapshot,
|
||||
float_snapshot,
|
||||
region_vars_snapshot,
|
||||
region_constraints_snapshot,
|
||||
region_obligations_snapshot: _,
|
||||
was_in_snapshot,
|
||||
_in_progress_tables } = snapshot;
|
||||
|
||||
@ -782,8 +854,8 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
|
||||
self.float_unification_table
|
||||
.borrow_mut()
|
||||
.commit(float_snapshot);
|
||||
self.region_vars
|
||||
.commit(region_vars_snapshot);
|
||||
self.borrow_region_constraints()
|
||||
.commit(region_constraints_snapshot);
|
||||
}
|
||||
|
||||
/// Execute `f` and commit the bindings
|
||||
@ -838,7 +910,7 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
|
||||
sub: ty::Region<'tcx>,
|
||||
sup: ty::RegionVid)
|
||||
{
|
||||
self.region_vars.add_given(sub, sup);
|
||||
self.borrow_region_constraints().add_given(sub, sup);
|
||||
}
|
||||
|
||||
pub fn can_sub<T>(&self,
|
||||
@ -878,7 +950,7 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
|
||||
a: ty::Region<'tcx>,
|
||||
b: ty::Region<'tcx>) {
|
||||
debug!("sub_regions({:?} <: {:?})", a, b);
|
||||
self.region_vars.make_subregion(origin, a, b);
|
||||
self.borrow_region_constraints().make_subregion(origin, a, b);
|
||||
}
|
||||
|
||||
pub fn equality_predicate(&self,
|
||||
@ -979,9 +1051,21 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
|
||||
.new_key(None)
|
||||
}
|
||||
|
||||
/// Create a fresh region variable with the next available index.
|
||||
///
|
||||
/// # Parameters
|
||||
///
|
||||
/// - `origin`: information about why we created this variable, for use
|
||||
/// during diagnostics / error-reporting.
|
||||
pub fn next_region_var(&self, origin: RegionVariableOrigin)
|
||||
-> ty::Region<'tcx> {
|
||||
self.tcx.mk_region(ty::ReVar(self.region_vars.new_region_var(origin)))
|
||||
self.tcx.mk_region(ty::ReVar(self.borrow_region_constraints().new_region_var(origin)))
|
||||
}
|
||||
|
||||
/// Just a convenient wrapper of `next_region_var` for using during NLL.
|
||||
pub fn next_nll_region_var(&self, origin: NLLRegionVariableOrigin)
|
||||
-> ty::Region<'tcx> {
|
||||
self.next_region_var(RegionVariableOrigin::NLL(origin))
|
||||
}
|
||||
|
||||
/// Create a region inference variable for the given
|
||||
@ -1040,10 +1124,6 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
|
||||
})
|
||||
}
|
||||
|
||||
pub fn fresh_bound_region(&self, debruijn: ty::DebruijnIndex) -> ty::Region<'tcx> {
|
||||
self.region_vars.new_bound(debruijn)
|
||||
}
|
||||
|
||||
/// True if errors have been reported since this infcx was
|
||||
/// created. This is sometimes used as a heuristic to skip
|
||||
/// reporting errors that often occur as a result of earlier
|
||||
@ -1069,15 +1149,31 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
|
||||
self.tainted_by_errors_flag.set(true)
|
||||
}
|
||||
|
||||
/// Process the region constraints and report any errors that
|
||||
/// result. After this, no more unification operations should be
|
||||
/// done -- or the compiler will panic -- but it is legal to use
|
||||
/// `resolve_type_vars_if_possible` as well as `fully_resolve`.
|
||||
pub fn resolve_regions_and_report_errors(&self,
|
||||
region_context: DefId,
|
||||
region_map: ®ion::ScopeTree,
|
||||
free_regions: &FreeRegionMap<'tcx>) {
|
||||
let region_rels = RegionRelations::new(self.tcx,
|
||||
assert!(self.is_tainted_by_errors() || self.region_obligations.borrow().is_empty(),
|
||||
"region_obligations not empty: {:#?}",
|
||||
self.region_obligations.borrow());
|
||||
|
||||
let region_rels = &RegionRelations::new(self.tcx,
|
||||
region_context,
|
||||
region_map,
|
||||
free_regions);
|
||||
let errors = self.region_vars.resolve_regions(®ion_rels);
|
||||
let (var_origins, data) = self.region_constraints.borrow_mut()
|
||||
.take()
|
||||
.expect("regions already resolved")
|
||||
.into_origins_and_data();
|
||||
let (lexical_region_resolutions, errors) =
|
||||
lexical_region_resolve::resolve(region_rels, var_origins, data);
|
||||
|
||||
let old_value = self.lexical_region_resolutions.replace(Some(lexical_region_resolutions));
|
||||
assert!(old_value.is_none());
|
||||
|
||||
if !self.is_tainted_by_errors() {
|
||||
// As a heuristic, just skip reporting region errors
|
||||
@ -1089,6 +1185,34 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
|
||||
}
|
||||
}
|
||||
|
||||
/// Obtains (and clears) the current set of region
|
||||
/// constraints. The inference context is still usable: further
|
||||
/// unifications will simply add new constraints.
|
||||
///
|
||||
/// This method is not meant to be used with normal lexical region
|
||||
/// resolution. Rather, it is used in the NLL mode as a kind of
|
||||
/// interim hack: basically we run normal type-check and generate
|
||||
/// region constraints as normal, but then we take them and
|
||||
/// translate them into the form that the NLL solver
|
||||
/// understands. See the NLL module for more details.
|
||||
pub fn take_and_reset_region_constraints(&self) -> RegionConstraintData<'tcx> {
|
||||
self.borrow_region_constraints().take_and_reset_data()
|
||||
}
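
    // Sketch of the intended NLL-side usage (illustrative only):
    //
    //     let data: RegionConstraintData<'tcx> =
    //         infcx.take_and_reset_region_constraints();
    //     // ... translate `data.constraints` and `data.verifys` into the
    //     // NLL solver's own constraint form, then keep type-checking ...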
|
||||
|
||||
/// Takes ownership of the list of variable regions. This implies
|
||||
/// that all the region constraints have already been taken, and
|
||||
/// hence that `resolve_regions_and_report_errors` can never be
|
||||
/// called. This is used only during NLL processing to "hand off" ownership
|
||||
/// of the set of region variables into the NLL region context.
|
||||
pub fn take_region_var_origins(&self) -> VarOrigins {
|
||||
let (var_origins, data) = self.region_constraints.borrow_mut()
|
||||
.take()
|
||||
.expect("regions already resolved")
|
||||
.into_origins_and_data();
|
||||
assert!(data.is_empty());
|
||||
var_origins
|
||||
}
|
||||
|
||||
pub fn ty_to_string(&self, t: Ty<'tcx>) -> String {
|
||||
self.resolve_type_vars_if_possible(&t).to_string()
|
||||
}
|
||||
@ -1301,7 +1425,7 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
|
||||
Ok(InferOk { value: result, obligations: combine.obligations })
|
||||
}
|
||||
|
||||
/// See `verify_generic_bound` method in `region_inference`
|
||||
/// See `verify_generic_bound` method in `region_constraints`
|
||||
pub fn verify_generic_bound(&self,
|
||||
origin: SubregionOrigin<'tcx>,
|
||||
kind: GenericKind<'tcx>,
|
||||
@ -1312,7 +1436,7 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
|
||||
a,
|
||||
bound);
|
||||
|
||||
self.region_vars.verify_generic_bound(origin, kind, a, bound);
|
||||
self.borrow_region_constraints().verify_generic_bound(origin, kind, a, bound);
|
||||
}
|
||||
|
||||
pub fn type_moves_by_default(&self,
|
||||
@ -1389,6 +1513,33 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
|
||||
|
||||
self.tcx.generator_sig(def_id)
|
||||
}
|
||||
|
||||
/// Normalizes associated types in `value`, potentially returning
|
||||
/// new obligations that must further be processed.
|
||||
pub fn partially_normalize_associated_types_in<T>(&self,
|
||||
span: Span,
|
||||
body_id: ast::NodeId,
|
||||
param_env: ty::ParamEnv<'tcx>,
|
||||
value: &T)
|
||||
-> InferOk<'tcx, T>
|
||||
where T : TypeFoldable<'tcx>
|
||||
{
|
||||
debug!("partially_normalize_associated_types_in(value={:?})", value);
|
||||
let mut selcx = traits::SelectionContext::new(self);
|
||||
let cause = ObligationCause::misc(span, body_id);
|
||||
let traits::Normalized { value, obligations } =
|
||||
traits::normalize(&mut selcx, param_env, cause, value);
|
||||
debug!("partially_normalize_associated_types_in: result={:?} predicates={:?}",
|
||||
value,
|
||||
obligations);
|
||||
InferOk { value, obligations }
|
||||
}
|
||||
|
||||
fn borrow_region_constraints(&self) -> RefMut<'_, RegionConstraintCollector<'tcx>> {
|
||||
RefMut::map(
|
||||
self.region_constraints.borrow_mut(),
|
||||
|c| c.as_mut().expect("region constraints already solved"))
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a, 'gcx, 'tcx> TypeTrace<'tcx> {
|
||||
@ -1466,14 +1617,12 @@ impl<'tcx> SubregionOrigin<'tcx> {
|
||||
|
||||
traits::ObligationCauseCode::CompareImplMethodObligation { item_name,
|
||||
impl_item_def_id,
|
||||
trait_item_def_id,
|
||||
lint_id } =>
|
||||
trait_item_def_id, } =>
|
||||
SubregionOrigin::CompareImplMethodObligation {
|
||||
span: cause.span,
|
||||
item_name,
|
||||
impl_item_def_id,
|
||||
trait_item_def_id,
|
||||
lint_id,
|
||||
},
|
||||
|
||||
_ => default(),
|
||||
@ -1492,7 +1641,8 @@ impl RegionVariableOrigin {
|
||||
EarlyBoundRegion(a, ..) => a,
|
||||
LateBoundRegion(a, ..) => a,
|
||||
BoundRegionInCoherence(_) => syntax_pos::DUMMY_SP,
|
||||
UpvarRegion(_, a) => a
|
||||
UpvarRegion(_, a) => a,
|
||||
NLL(..) => bug!("NLL variable used with `span`"),
|
||||
}
|
||||
}
|
||||
}
|
||||
@ -1533,3 +1683,12 @@ impl<'tcx> TypeFoldable<'tcx> for TypeTrace<'tcx> {
|
||||
self.cause.visit_with(visitor) || self.values.visit_with(visitor)
|
||||
}
|
||||
}
|
||||
|
||||
impl<'tcx> fmt::Debug for RegionObligation<'tcx> {
|
||||
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
|
||||
write!(f, "RegionObligation(sub_region={:?}, sup_type={:?})",
|
||||
self.sub_region,
|
||||
self.sup_type)
|
||||
}
|
||||
}
|
||||
|
||||
|
src/librustc/infer/outlives/env.rs (new file, +355 lines)
@ -0,0 +1,355 @@
|
||||
// Copyright 2012-2014 The Rust Project Developers. See the COPYRIGHT
|
||||
// file at the top-level directory of this distribution and at
|
||||
// http://rust-lang.org/COPYRIGHT.
|
||||
//
|
||||
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
|
||||
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
|
||||
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
|
||||
// option. This file may not be copied, modified, or distributed
|
||||
// except according to those terms.
|
||||
|
||||
use middle::free_region::FreeRegionMap;
|
||||
use infer::{InferCtxt, GenericKind};
|
||||
use traits::FulfillmentContext;
|
||||
use ty::{self, Ty, TypeFoldable};
|
||||
use ty::outlives::Component;
|
||||
use ty::wf;
|
||||
|
||||
use syntax::ast;
|
||||
use syntax_pos::Span;
|
||||
|
||||
/// The `OutlivesEnvironment` collects information about what outlives
|
||||
/// what in a given type-checking setting. For example, if we have a
|
||||
/// where-clause like `where T: 'a` in scope, then the
|
||||
/// `OutlivesEnvironment` would record that (in its
|
||||
/// `region_bound_pairs` field). Similarly, it contains methods for
|
||||
/// processing and adding implied bounds into the outlives
|
||||
/// environment.
|
||||
///
|
||||
/// Other code at present does not typically take a
|
||||
/// `&OutlivesEnvironment`, but rather takes some of its fields (e.g.,
|
||||
/// `process_registered_region_obligations` wants the
|
||||
/// region-bound-pairs). There is no mistaking it: the current setup
|
||||
/// of tracking region information is quite scattered! The
|
||||
/// `OutlivesEnvironment`, for example, needs to sometimes be combined
|
||||
/// with the `middle::RegionRelations`, to yield a full picture of how
|
||||
/// (lexical) lifetimes interact. However, I'm reluctant to do more
|
||||
/// refactoring here, since the setup with NLL is quite different.
|
||||
/// For example, NLL has no need of `RegionRelations`, and is solely
|
||||
/// interested in the `OutlivesEnvironment`. -nmatsakis
|
||||
#[derive(Clone)]
|
||||
pub struct OutlivesEnvironment<'tcx> {
|
||||
param_env: ty::ParamEnv<'tcx>,
|
||||
free_region_map: FreeRegionMap<'tcx>,
|
||||
region_bound_pairs: Vec<(ty::Region<'tcx>, GenericKind<'tcx>)>,
|
||||
}
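
// A rough usage sketch (hypothetical regionck-like caller; illustrative only):
//
//     let mut outlives_env = OutlivesEnvironment::new(param_env);
//     outlives_env.add_implied_bounds(infcx, fn_sig_tys, body_id, span);
//     // ... run region checking ...
//     let free_region_map = outlives_env.into_free_region_map();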
|
||||
|
||||
/// Implied bounds are region relationships that we deduce
|
||||
/// automatically. The idea is that (e.g.) a caller must check that a
|
||||
/// function's argument types are well-formed immediately before
|
||||
/// calling that fn, and hence the *callee* can assume that its
|
||||
/// argument types are well-formed. This may imply certain relationships
|
||||
/// between generic parameters. For example:
|
||||
///
|
||||
/// fn foo<'a,T>(x: &'a T)
|
||||
///
|
||||
/// can only be called with a `'a` and `T` such that `&'a T` is WF.
|
||||
/// For `&'a T` to be WF, `T: 'a` must hold. So we can assume `T: 'a`.
|
||||
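///
/// As an illustrative sketch (the precise encoding is computed by
/// `implied_bounds_from_components` below), a signature like
///
///     fn foo<'a, 'b, T>(x: &'a &'b [T])
///
/// gives rise to roughly:
///
///     ImpliedBound::RegionSubRegion('a, 'b)   // i.e. `'b: 'a`
///     ImpliedBound::RegionSubParam('b, T)     // i.e. `T: 'b`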
#[derive(Debug)]
|
||||
enum ImpliedBound<'tcx> {
|
||||
RegionSubRegion(ty::Region<'tcx>, ty::Region<'tcx>),
|
||||
RegionSubParam(ty::Region<'tcx>, ty::ParamTy),
|
||||
RegionSubProjection(ty::Region<'tcx>, ty::ProjectionTy<'tcx>),
|
||||
}
|
||||
|
||||
impl<'a, 'gcx: 'tcx, 'tcx: 'a> OutlivesEnvironment<'tcx> {
|
||||
pub fn new(param_env: ty::ParamEnv<'tcx>) -> Self {
|
||||
let mut free_region_map = FreeRegionMap::new();
|
||||
free_region_map.relate_free_regions_from_predicates(¶m_env.caller_bounds);
|
||||
|
||||
OutlivesEnvironment {
|
||||
param_env,
|
||||
free_region_map,
|
||||
region_bound_pairs: vec![],
|
||||
}
|
||||
}
|
||||
|
||||
/// Borrows current value of the `free_region_map`.
|
||||
pub fn free_region_map(&self) -> &FreeRegionMap<'tcx> {
|
||||
&self.free_region_map
|
||||
}
|
||||
|
||||
/// Borrows current value of the `region_bound_pairs`.
|
||||
pub fn region_bound_pairs(&self) -> &[(ty::Region<'tcx>, GenericKind<'tcx>)] {
|
||||
&self.region_bound_pairs
|
||||
}
|
||||
|
||||
/// Returns ownership of the `free_region_map`.
|
||||
pub fn into_free_region_map(self) -> FreeRegionMap<'tcx> {
|
||||
self.free_region_map
|
||||
}
|
||||
|
||||
/// This is a hack to support the old-skool regionck, which
|
||||
/// processes region constraints from the main function and the
|
||||
/// closure together. In that context, when we enter a closure, we
|
||||
/// want to be able to "save" the state of the surrounding a
|
||||
/// function. We can then add implied bounds and the like from the
|
||||
/// closure arguments into the environment -- these should only
|
||||
/// apply in the closure body, so once we exit, we invoke
|
||||
/// `pop_snapshot_post_closure` to remove them.
|
||||
///
|
||||
/// Example:
|
||||
///
|
||||
/// ```
|
||||
/// fn foo<T>() {
|
||||
/// callback(for<'a> |x: &'a T| {
|
||||
/// // ^^^^^^^ not legal syntax, but probably should be
|
||||
/// // within this closure body, `T: 'a` holds
|
||||
/// })
|
||||
/// }
|
||||
/// ```
|
||||
///
|
||||
/// This "containment" of closure's effects only works so well. In
|
||||
/// particular, we (intentionally) leak relationships between free
|
||||
/// regions that are created by the closure's bounds. The case
|
||||
/// where this is useful is when you have (e.g.) a closure with a
|
||||
/// signature like `for<'a, 'b> fn(x: &'a &'b u32)` -- in this
|
||||
/// case, we want to keep the relationship `'b: 'a` in the
|
||||
/// free-region-map, so that later if we have to take `LUB('b,
|
||||
/// 'a)` we can get the result `'b`.
|
||||
///
|
||||
/// I have opted to keep **all modifications** to the
|
||||
/// free-region-map, however, and not just those that concern free
|
||||
/// variables bound in the closure. The latter seems more correct,
|
||||
/// but it is not the existing behavior, and I could not find a
|
||||
/// case where the existing behavior went wrong. In any case, it
|
||||
/// seems like it'd be readily fixed if we wanted. There are
|
||||
/// similar leaks around givens that seem equally suspicious, to
|
||||
/// be honest. --nmatsakis
|
||||
pub fn push_snapshot_pre_closure(&self) -> usize {
|
||||
self.region_bound_pairs.len()
|
||||
}
|
||||
|
||||
/// See `push_snapshot_pre_closure`.
|
||||
pub fn pop_snapshot_post_closure(&mut self, len: usize) {
|
||||
self.region_bound_pairs.truncate(len);
|
||||
}
|
||||
|
||||
/// This method adds "implied bounds" into the outlives environment.
|
||||
/// Implied bounds are outlives relationships that we can deduce
|
||||
/// on the basis that certain types must be well-formed -- these are
|
||||
/// either the types that appear in the function signature or else
|
||||
/// the input types to an impl. For example, if you have a function
|
||||
/// like
|
||||
///
|
||||
/// ```
|
||||
/// fn foo<'a, 'b, T>(x: &'a &'b [T]) { }
|
||||
/// ```
|
||||
///
|
||||
/// we can assume in the caller's body that `'b: 'a` and that `T:
|
||||
/// 'b` (and hence, transitively, that `T: 'a`). This method would
|
||||
/// add those assumptions into the outlives-environment.
|
||||
///
|
||||
/// Tests: `src/test/compile-fail/regions-free-region-ordering-*.rs`
|
||||
pub fn add_implied_bounds(
|
||||
&mut self,
|
||||
infcx: &InferCtxt<'a, 'gcx, 'tcx>,
|
||||
fn_sig_tys: &[Ty<'tcx>],
|
||||
body_id: ast::NodeId,
|
||||
span: Span,
|
||||
) {
|
||||
debug!("add_implied_bounds()");
|
||||
|
||||
for &ty in fn_sig_tys {
|
||||
let ty = infcx.resolve_type_vars_if_possible(&ty);
|
||||
debug!("add_implied_bounds: ty = {}", ty);
|
||||
let implied_bounds = self.implied_bounds(infcx, body_id, ty, span);
|
||||
|
||||
// But also record other relationships, such as `T:'x`,
|
||||
// that don't go into the free-region-map but which we use
|
||||
// here.
|
||||
for implication in implied_bounds {
|
||||
debug!("add_implied_bounds: implication={:?}", implication);
|
||||
match implication {
|
||||
ImpliedBound::RegionSubRegion(
|
||||
r_a @ &ty::ReEarlyBound(_),
|
||||
&ty::ReVar(vid_b),
|
||||
) |
|
||||
ImpliedBound::RegionSubRegion(r_a @ &ty::ReFree(_), &ty::ReVar(vid_b)) => {
|
||||
infcx.add_given(r_a, vid_b);
|
||||
}
|
||||
ImpliedBound::RegionSubParam(r_a, param_b) => {
|
||||
self.region_bound_pairs
|
||||
.push((r_a, GenericKind::Param(param_b)));
|
||||
}
|
||||
ImpliedBound::RegionSubProjection(r_a, projection_b) => {
|
||||
self.region_bound_pairs
|
||||
.push((r_a, GenericKind::Projection(projection_b)));
|
||||
}
|
||||
ImpliedBound::RegionSubRegion(r_a, r_b) => {
|
||||
// In principle, we could record (and take
|
||||
// advantage of) every relationship here, but
|
||||
// we are also free not to -- it simply means
|
||||
// strictly less that we can successfully type
|
||||
// check. Right now we only look for
|
||||
// relationships between free regions. (It may
|
||||
// also be that we should revise our inference
|
||||
// system to be more general and to make use
|
||||
// of *every* relationship that arises here,
|
||||
// but presently we do not.)
|
||||
self.free_region_map.relate_regions(r_a, r_b);
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/// Compute the implied bounds that a callee/impl can assume based on
|
||||
/// the fact that caller/projector has ensured that `ty` is WF. See
|
||||
/// the `ImpliedBound` type for more details.
|
||||
fn implied_bounds(
|
||||
&mut self,
|
||||
infcx: &InferCtxt<'a, 'gcx, 'tcx>,
|
||||
body_id: ast::NodeId,
|
||||
ty: Ty<'tcx>,
|
||||
span: Span,
|
||||
) -> Vec<ImpliedBound<'tcx>> {
|
||||
let tcx = infcx.tcx;
|
||||
|
||||
// Sometimes when we ask what it takes for T: WF, we get back that
|
||||
// U: WF is required; in that case, we push U onto this stack and
|
||||
// process it next. Currently (at least) these resulting
|
||||
// predicates are always guaranteed to be a subset of the original
|
||||
// type, so we need not fear non-termination.
|
||||
let mut wf_types = vec![ty];
|
||||
|
||||
let mut implied_bounds = vec![];
|
||||
|
||||
let mut fulfill_cx = FulfillmentContext::new();
|
||||
|
||||
while let Some(ty) = wf_types.pop() {
|
||||
// Compute the obligations for `ty` to be well-formed. If `ty` is
|
||||
// an unresolved inference variable, just substituted an empty set
|
||||
// -- because the return type here is going to be things we *add*
|
||||
// to the environment, it's always ok for this set to be smaller
|
||||
// than the ultimate set. (Note: normally there won't be
|
||||
// unresolved inference variables here anyway, but there might be
|
||||
// during typeck under some circumstances.)
|
||||
let obligations =
|
||||
wf::obligations(infcx, self.param_env, body_id, ty, span).unwrap_or(vec![]);
|
||||
|
||||
// NB: All of these predicates *ought* to be easily proven
|
||||
// true. In fact, their correctness is (mostly) implied by
|
||||
// other parts of the program. However, in #42552, we had
|
||||
// an annoying scenario where:
|
||||
//
|
||||
// - Some `T::Foo` gets normalized, resulting in a
|
||||
// variable `_1` and a `T: Trait<Foo=_1>` constraint
|
||||
// (not sure why it couldn't immediately get
|
||||
// solved). This result of `_1` got cached.
|
||||
// - These obligations were dropped on the floor here,
|
||||
// rather than being registered.
|
||||
// - Then later we would get a request to normalize
|
||||
// `T::Foo` which would result in `_1` being used from
|
||||
// the cache, but hence without the `T: Trait<Foo=_1>`
|
||||
// constraint. As a result, `_1` never gets resolved,
|
||||
// and we get an ICE (in dropck).
|
||||
//
|
||||
// Therefore, we register any predicates involving
|
||||
// inference variables. We restrict ourselves to those
|
||||
// involving inference variables both for efficiency and
|
||||
// to avoid duplicate errors that otherwise show up.
|
||||
fulfill_cx.register_predicate_obligations(
|
||||
infcx,
|
||||
obligations
|
||||
.iter()
|
||||
.filter(|o| o.predicate.has_infer_types())
|
||||
.cloned());
|
||||
|
||||
// From the full set of obligations, just filter down to the
|
||||
// region relationships.
|
||||
implied_bounds.extend(obligations.into_iter().flat_map(|obligation| {
|
||||
assert!(!obligation.has_escaping_regions());
|
||||
match obligation.predicate {
|
||||
ty::Predicate::Trait(..) |
|
||||
ty::Predicate::Equate(..) |
|
||||
ty::Predicate::Subtype(..) |
|
||||
ty::Predicate::Projection(..) |
|
||||
ty::Predicate::ClosureKind(..) |
|
||||
ty::Predicate::ObjectSafe(..) |
|
||||
ty::Predicate::ConstEvaluatable(..) => vec![],
|
||||
|
||||
ty::Predicate::WellFormed(subty) => {
|
||||
wf_types.push(subty);
|
||||
vec![]
|
||||
}
|
||||
|
||||
ty::Predicate::RegionOutlives(ref data) => {
|
||||
match tcx.no_late_bound_regions(data) {
|
||||
None => vec![],
|
||||
Some(ty::OutlivesPredicate(r_a, r_b)) => {
|
||||
vec![ImpliedBound::RegionSubRegion(r_b, r_a)]
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
ty::Predicate::TypeOutlives(ref data) => {
|
||||
match tcx.no_late_bound_regions(data) {
|
||||
None => vec![],
|
||||
Some(ty::OutlivesPredicate(ty_a, r_b)) => {
|
||||
let ty_a = infcx.resolve_type_vars_if_possible(&ty_a);
|
||||
let components = tcx.outlives_components(ty_a);
|
||||
self.implied_bounds_from_components(r_b, components)
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}));
|
||||
}
|
||||
|
||||
// Ensure that those obligations that we had to solve
|
||||
// get solved *here*.
|
||||
match fulfill_cx.select_all_or_error(infcx) {
|
||||
Ok(()) => (),
|
||||
Err(errors) => infcx.report_fulfillment_errors(&errors, None),
|
||||
}
|
||||
|
||||
implied_bounds
|
||||
}
|
||||
|
||||
/// When we have an implied bound that `T: 'a`, we can further break
|
||||
/// this down to determine what relationships would have to hold for
|
||||
/// `T: 'a` to hold. We get to assume that the caller has validated
|
||||
/// those relationships.
|
||||
fn implied_bounds_from_components(
|
||||
&self,
|
||||
sub_region: ty::Region<'tcx>,
|
||||
sup_components: Vec<Component<'tcx>>,
|
||||
) -> Vec<ImpliedBound<'tcx>> {
|
||||
sup_components
|
||||
.into_iter()
|
||||
.flat_map(|component| {
|
||||
match component {
|
||||
Component::Region(r) =>
|
||||
vec![ImpliedBound::RegionSubRegion(sub_region, r)],
|
||||
Component::Param(p) =>
|
||||
vec![ImpliedBound::RegionSubParam(sub_region, p)],
|
||||
Component::Projection(p) =>
|
||||
vec![ImpliedBound::RegionSubProjection(sub_region, p)],
|
||||
Component::EscapingProjection(_) =>
|
||||
// If the projection has escaping regions, don't
|
||||
// try to infer any implied bounds even for its
|
||||
// free components. This is conservative, because
|
||||
// the caller will still have to prove that those
|
||||
// free components outlive `sub_region`. But the
|
||||
// idea is that the WAY that the caller proves
|
||||
// that may change in the future and we want to
|
||||
// give ourselves room to get smarter here.
|
||||
vec![],
|
||||
Component::UnresolvedInferenceVariable(..) =>
|
||||
vec![],
|
||||
}
|
||||
})
|
||||
.collect()
|
||||
}
|
||||
}
|
src/librustc/infer/outlives/mod.rs (new file, +12 lines)
@ -0,0 +1,12 @@
|
||||
// Copyright 2012-2014 The Rust Project Developers. See the COPYRIGHT
|
||||
// file at the top-level directory of this distribution and at
|
||||
// http://rust-lang.org/COPYRIGHT.
|
||||
//
|
||||
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
|
||||
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
|
||||
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
|
||||
// option. This file may not be copied, modified, or distributed
|
||||
// except according to those terms.
|
||||
|
||||
pub mod env;
|
||||
mod obligations;
|
src/librustc/infer/outlives/obligations.rs (new file, +623 lines)
@ -0,0 +1,623 @@
|
||||
// Copyright 2012-2014 The Rust Project Developers. See the COPYRIGHT
|
||||
// file at the top-level directory of this distribution and at
|
||||
// http://rust-lang.org/COPYRIGHT.
|
||||
//
|
||||
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
|
||||
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
|
||||
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
|
||||
// option. This file may not be copied, modified, or distributed
|
||||
// except according to those terms.
|
||||
|
||||
//! Code that handles "type-outlives" constraints like `T: 'a`. This
|
||||
//! is based on the `outlives_components` function defined on the tcx,
|
||||
//! but it adds a bit of heuristics on top, in particular to deal with
|
||||
//! associated types and projections.
|
||||
//!
|
||||
//! When we process a given `T: 'a` obligation, we may produce two
|
||||
//! kinds of constraints for the region inferencer:
|
||||
//!
|
||||
//! - Relationships between inference variables and other regions.
|
||||
//! For example, if we have `&'?0 u32: 'a`, then we would produce
|
||||
//! a constraint that `'a <= '?0`.
|
||||
//! - "Verifys" that must be checked after inferencing is done.
|
||||
//! For example, if we know that, for some type parameter `T`,
|
||||
//! `T: 'a + 'b`, and we have a requirement that `T: '?1`,
|
||||
//! then we add a "verify" that checks that `'?1 <= 'a || '?1 <= 'b`.
|
||||
//! - Note the difference with the previous case: here, the region
|
||||
//! variable must be less than something else, so this doesn't
|
||||
//! affect how inference works (it finds the smallest region that
|
||||
//! will do); it's just a post-condition that we have to check.
|
||||
//!
|
||||
//! **The key point is that once this function is done, we have
|
||||
//! reduced all of our "type-region outlives" obligations into relationships
|
||||
//! between individual regions.**
|
||||
//!
|
||||
//! One key input to this function is the set of "region-bound pairs".
|
||||
//! These are basically the relationships between type parameters and
|
||||
//! regions that are in scope at the point where the outlives
|
||||
//! obligation was incurred. **When type-checking a function,
|
||||
//! particularly in the face of closures, this is not known until
|
||||
//! regionck runs!** This is because some of those bounds come
|
||||
//! from things we have yet to infer.
|
||||
//!
|
||||
//! Consider:
|
||||
//!
|
||||
//! ```
|
||||
//! fn bar<T>(a: T, b: impl for<'a> Fn(&'a T));
|
||||
//! fn foo<T>(x: T) {
|
||||
//! bar(x, |y| { ... })
|
||||
//! // ^ closure arg
|
||||
//! }
|
||||
//! ```
|
||||
//!
|
||||
//! Here, the type of `y` may involve inference variables and the
|
||||
//! like, and it may also contain implied bounds that are needed to
|
||||
//! type-check the closure body (e.g., here it informs us that `T`
|
||||
//! outlives the late-bound region `'a`).
|
||||
//!
|
||||
//! Note that by delaying the gathering of implied bounds until all
|
||||
//! inference information is known, we may find relationships between
|
||||
//! bound regions and other regions in the environment. For example,
|
||||
//! when we first check a closure like the one expected as argument
|
||||
//! to `foo`:
|
||||
//!
|
||||
//! ```
|
||||
//! fn foo<U, F: for<'a> FnMut(&'a U)>(_f: F) {}
|
||||
//! ```
|
||||
//!
|
||||
//! the type of the closure's first argument would be `&'a ?U`. We
|
||||
//! might later infer `?U` to something like `&'b u32`, which would
|
||||
//! imply that `'b: 'a`.
|
||||
|
||||
use hir::def_id::DefId;
|
||||
use infer::{self, GenericKind, InferCtxt, RegionObligation, SubregionOrigin, VerifyBound};
|
||||
use traits;
|
||||
use ty::{self, Ty, TyCtxt, TypeFoldable};
|
||||
use ty::subst::{Subst, Substs};
|
||||
use ty::outlives::Component;
|
||||
use syntax::ast;
|
||||
|
||||
impl<'cx, 'gcx, 'tcx> InferCtxt<'cx, 'gcx, 'tcx> {
|
||||
/// Registers that the given region obligation must be resolved
|
||||
/// from within the scope of `body_id`. These regions are enqueued
|
||||
/// and later processed by regionck, when full type information is
|
||||
/// available (see `region_obligations` field for more
|
||||
/// information).
|
||||
pub fn register_region_obligation(
|
||||
&self,
|
||||
body_id: ast::NodeId,
|
||||
obligation: RegionObligation<'tcx>,
|
||||
) {
|
||||
self.region_obligations
|
||||
.borrow_mut()
|
||||
.push((body_id, obligation));
|
||||
}
|
||||
|
||||
/// Process the region obligations that must be proven (during
|
||||
/// `regionck`) for the given `body_id`, given information about
|
||||
/// the region bounds in scope and so forth. This function must be
|
||||
/// invoked for all relevant body-ids before region inference is
|
||||
/// done (or else an assert will fire).
|
||||
///
|
||||
/// See the `region_obligations` field of `InferCtxt` for some
|
||||
/// comments about how this function fits into the overall expected
|
||||
/// flow of the inferencer. The key point is that it is
|
||||
/// invoked after all type-inference variables have been bound --
|
||||
/// towards the end of regionck. This also ensures that the
|
||||
/// region-bound-pairs are available (see comments above regarding
|
||||
/// closures).
|
||||
///
|
||||
/// # Parameters
|
||||
///
|
||||
/// - `region_bound_pairs`: the set of region bounds implied by
|
||||
/// the parameters and where-clauses. In particular, each pair
|
||||
/// `('a, K)` in this list tells us that the bounds in scope
|
||||
/// indicate that `K: 'a`, where `K` is either a generic
|
||||
/// parameter like `T` or a projection like `T::Item`.
|
||||
/// - `implicit_region_bound`: if some, this is a region bound
|
||||
/// that is considered to hold for all type parameters (the
|
||||
/// function body).
|
||||
/// - `param_env` is the parameter environment for the enclosing function.
|
||||
/// - `body_id` is the body-id whose region obligations are being
|
||||
/// processed.
|
||||
///
|
||||
/// # Returns
|
||||
///
|
||||
/// This function may have to perform normalizations, and hence it
|
||||
/// returns an `InferOk` with subobligations that must be
|
||||
/// processed.
|
||||
pub fn process_registered_region_obligations(
|
||||
&self,
|
||||
region_bound_pairs: &[(ty::Region<'tcx>, GenericKind<'tcx>)],
|
||||
implicit_region_bound: Option<ty::Region<'tcx>>,
|
||||
param_env: ty::ParamEnv<'tcx>,
|
||||
body_id: ast::NodeId,
|
||||
) {
|
||||
assert!(
|
||||
!self.in_snapshot.get(),
|
||||
"cannot process registered region obligations in a snapshot"
|
||||
);
|
||||
|
||||
// pull out the region obligations with the given `body_id` (leaving the rest)
|
||||
let mut my_region_obligations = Vec::with_capacity(self.region_obligations.borrow().len());
|
||||
{
|
||||
let mut r_o = self.region_obligations.borrow_mut();
|
||||
for (_, obligation) in r_o.drain_filter(|(ro_body_id, _)| *ro_body_id == body_id) {
|
||||
my_region_obligations.push(obligation);
|
||||
}
|
||||
}
|
||||
|
||||
let outlives =
|
||||
TypeOutlives::new(self, region_bound_pairs, implicit_region_bound, param_env);
|
||||
|
||||
for RegionObligation {
|
||||
sup_type,
|
||||
sub_region,
|
||||
cause,
|
||||
} in my_region_obligations
|
||||
{
|
||||
let origin = SubregionOrigin::from_obligation_cause(
|
||||
&cause,
|
||||
|| infer::RelateParamBound(cause.span, sup_type),
|
||||
);
|
||||
|
||||
outlives.type_must_outlive(origin, sup_type, sub_region);
|
||||
}
|
||||
}
|
||||
|
||||
/// Processes a single ad-hoc region obligation that was not
|
||||
/// registered in advance.
|
||||
pub fn type_must_outlive(
|
||||
&self,
|
||||
region_bound_pairs: &[(ty::Region<'tcx>, GenericKind<'tcx>)],
|
||||
implicit_region_bound: Option<ty::Region<'tcx>>,
|
||||
param_env: ty::ParamEnv<'tcx>,
|
||||
origin: infer::SubregionOrigin<'tcx>,
|
||||
ty: Ty<'tcx>,
|
||||
region: ty::Region<'tcx>,
|
||||
) {
|
||||
let outlives =
|
||||
TypeOutlives::new(self, region_bound_pairs, implicit_region_bound, param_env);
|
||||
outlives.type_must_outlive(origin, ty, region);
|
||||
}
|
||||
|
||||
/// Ignore the region obligations, not bothering to prove
|
||||
/// them. This function should not really exist; it is used to
|
||||
/// accommodate some older code for the time being.
|
||||
pub fn ignore_region_obligations(&self) {
|
||||
assert!(
|
||||
!self.in_snapshot.get(),
|
||||
"cannot ignore registered region obligations in a snapshot"
|
||||
);
|
||||
|
||||
self.region_obligations.borrow_mut().clear();
|
||||
}
|
||||
}
|
||||
|
||||
#[must_use] // you ought to invoke `into_accrued_obligations` when you are done =)
|
||||
struct TypeOutlives<'cx, 'gcx: 'tcx, 'tcx: 'cx> {
|
||||
// See the comments on `process_registered_region_obligations` for the meaning
|
||||
// of these fields.
|
||||
infcx: &'cx InferCtxt<'cx, 'gcx, 'tcx>,
|
||||
region_bound_pairs: &'cx [(ty::Region<'tcx>, GenericKind<'tcx>)],
|
||||
implicit_region_bound: Option<ty::Region<'tcx>>,
|
||||
param_env: ty::ParamEnv<'tcx>,
|
||||
}
|
||||
|
||||
impl<'cx, 'gcx, 'tcx> TypeOutlives<'cx, 'gcx, 'tcx> {
|
||||
fn new(
|
||||
infcx: &'cx InferCtxt<'cx, 'gcx, 'tcx>,
|
||||
region_bound_pairs: &'cx [(ty::Region<'tcx>, GenericKind<'tcx>)],
|
||||
implicit_region_bound: Option<ty::Region<'tcx>>,
|
||||
param_env: ty::ParamEnv<'tcx>,
|
||||
) -> Self {
|
||||
Self {
|
||||
infcx,
|
||||
region_bound_pairs,
|
||||
implicit_region_bound,
|
||||
param_env,
|
||||
}
|
||||
}
|
||||
|
||||
/// Adds constraints to inference such that `T: 'a` holds (or
|
||||
/// reports an error if it cannot).
|
||||
///
|
||||
/// # Parameters
|
||||
///
|
||||
/// - `origin`, the reason we need this constraint
|
||||
/// - `ty`, the type `T`
|
||||
/// - `region`, the region `'a`
|
||||
fn type_must_outlive(
|
||||
&self,
|
||||
origin: infer::SubregionOrigin<'tcx>,
|
||||
ty: Ty<'tcx>,
|
||||
region: ty::Region<'tcx>,
|
||||
) {
|
||||
let ty = self.infcx.resolve_type_vars_if_possible(&ty);
|
||||
|
||||
debug!(
|
||||
"type_must_outlive(ty={:?}, region={:?}, origin={:?})",
|
||||
ty,
|
||||
region,
|
||||
origin
|
||||
);
|
||||
|
||||
assert!(!ty.has_escaping_regions());
|
||||
|
||||
let components = self.tcx().outlives_components(ty);
|
||||
self.components_must_outlive(origin, components, region);
|
||||
}
|
||||
|
||||
fn tcx(&self) -> TyCtxt<'cx, 'gcx, 'tcx> {
|
||||
self.infcx.tcx
|
||||
}
|
||||
|
||||
fn components_must_outlive(
|
||||
&self,
|
||||
origin: infer::SubregionOrigin<'tcx>,
|
||||
components: Vec<Component<'tcx>>,
|
||||
region: ty::Region<'tcx>,
|
||||
) {
|
||||
for component in components {
|
||||
let origin = origin.clone();
|
||||
match component {
|
||||
Component::Region(region1) => {
|
||||
self.infcx.sub_regions(origin, region, region1);
|
||||
}
|
||||
Component::Param(param_ty) => {
|
||||
self.param_ty_must_outlive(origin, region, param_ty);
|
||||
}
|
||||
Component::Projection(projection_ty) => {
|
||||
self.projection_must_outlive(origin, region, projection_ty);
|
||||
}
|
||||
Component::EscapingProjection(subcomponents) => {
|
||||
self.components_must_outlive(origin, subcomponents, region);
|
||||
}
|
||||
Component::UnresolvedInferenceVariable(v) => {
|
||||
// ignore this, we presume it will yield an error
|
||||
// later, since if a type variable is not resolved by
|
||||
// this point it never will be
|
||||
self.infcx.tcx.sess.delay_span_bug(
|
||||
origin.span(),
|
||||
&format!("unresolved inference variable in outlives: {:?}", v),
|
||||
);
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
fn param_ty_must_outlive(
|
||||
&self,
|
||||
origin: infer::SubregionOrigin<'tcx>,
|
||||
region: ty::Region<'tcx>,
|
||||
param_ty: ty::ParamTy,
|
||||
) {
|
||||
debug!(
|
||||
"param_ty_must_outlive(region={:?}, param_ty={:?}, origin={:?})",
|
||||
region,
|
||||
param_ty,
|
||||
origin
|
||||
);
|
||||
|
||||
let verify_bound = self.param_bound(param_ty);
|
||||
let generic = GenericKind::Param(param_ty);
|
||||
self.infcx
|
||||
.verify_generic_bound(origin, generic, region, verify_bound);
|
||||
}
|
||||
|
||||
fn projection_must_outlive(
|
||||
&self,
|
||||
origin: infer::SubregionOrigin<'tcx>,
|
||||
region: ty::Region<'tcx>,
|
||||
projection_ty: ty::ProjectionTy<'tcx>,
|
||||
) {
|
||||
debug!(
|
||||
"projection_must_outlive(region={:?}, projection_ty={:?}, origin={:?})",
|
||||
region,
|
||||
projection_ty,
|
||||
origin
|
||||
);
|
||||
|
||||
// This case is thorny for inference. The fundamental problem is
|
||||
// that there are many cases where we have choice, and inference
|
||||
// doesn't like choice (the current region inference in
|
||||
// particular). :) First off, we have to choose between using the
|
||||
// OutlivesProjectionEnv, OutlivesProjectionTraitDef, and
|
||||
// OutlivesProjectionComponent rules, any one of which is
|
||||
// sufficient. If there are no inference variables involved, it's
|
||||
// not hard to pick the right rule, but if there are, we're in a
|
||||
// bit of a catch 22: if we picked which rule we were going to
|
||||
// use, we could add constraints to the region inference graph
|
||||
// that make it apply, but if we don't add those constraints, the
|
||||
// rule might not apply (but another rule might). For now, we err
|
||||
// on the side of adding too few edges into the graph.
|
||||
|
||||
// Compute the bounds we can derive from the environment or trait
|
||||
// definition. We know that the projection outlives all the
|
||||
// regions in this list.
|
||||
let env_bounds = self.projection_declared_bounds(projection_ty);
|
||||
|
||||
debug!("projection_must_outlive: env_bounds={:?}", env_bounds);
|
||||
|
||||
// If we know that the projection outlives 'static, then we're
|
||||
// done here.
|
||||
if env_bounds.contains(&&ty::ReStatic) {
|
||||
debug!("projection_must_outlive: 'static as declared bound");
|
||||
return;
|
||||
}
|
||||
|
||||
// If the declared bounds list is empty, the only applicable rule is
|
||||
// OutlivesProjectionComponent. If there are inference variables,
|
||||
// then, we can break down the outlives into more primitive
|
||||
// components without adding unnecessary edges.
|
||||
//
|
||||
// If there are *no* inference variables, however, we COULD do
|
||||
// this, but we choose not to, because the error messages are less
|
||||
// good. For example, a requirement like `T::Item: 'r` would be
|
||||
// translated to a requirement that `T: 'r`; when this is reported
|
||||
// to the user, it will thus say "T: 'r must hold so that T::Item:
|
||||
// 'r holds". But that makes it sound like the only way to fix
|
||||
// the problem is to add `T: 'r`, which isn't true. So, if there are no
|
||||
// inference variables, we use a verify constraint instead of adding
|
||||
// edges, which winds up enforcing the same condition.
|
||||
let needs_infer = projection_ty.needs_infer();
|
||||
if env_bounds.is_empty() && needs_infer {
|
||||
debug!("projection_must_outlive: no declared bounds");
|
||||
|
||||
for component_ty in projection_ty.substs.types() {
|
||||
self.type_must_outlive(origin.clone(), component_ty, region);
|
||||
}
|
||||
|
||||
for r in projection_ty.substs.regions() {
|
||||
self.infcx.sub_regions(origin.clone(), region, r);
|
||||
}
|
||||
|
||||
return;
|
||||
}
|
||||
|
||||
// If we find that there is a unique declared bound `'b`, and this bound
|
||||
// appears in the trait reference, then the best action is to require that `'b:'r`,
|
||||
// so do that. This is best no matter what rule we use:
|
||||
//
|
||||
// - OutlivesProjectionEnv or OutlivesProjectionTraitDef: these would translate to
|
||||
// the requirement that `'b:'r`
|
||||
// - OutlivesProjectionComponent: this would require `'b:'r` in addition to
|
||||
// other conditions
|
||||
if !env_bounds.is_empty() && env_bounds[1..].iter().all(|b| *b == env_bounds[0]) {
|
||||
let unique_bound = env_bounds[0];
|
||||
debug!(
|
||||
"projection_must_outlive: unique declared bound = {:?}",
|
||||
unique_bound
|
||||
);
|
||||
if projection_ty
|
||||
.substs
|
||||
.regions()
|
||||
.any(|r| env_bounds.contains(&r))
|
||||
{
|
||||
debug!("projection_must_outlive: unique declared bound appears in trait ref");
|
||||
self.infcx.sub_regions(origin.clone(), region, unique_bound);
|
||||
return;
|
||||
}
|
||||
}
|
||||
|
||||
// Fallback to verifying after the fact that there exists a
|
||||
// declared bound, or that all the components appearing in the
|
||||
// projection outlive; in some cases, this may add insufficient
|
||||
// edges into the inference graph, leading to inference failures
|
||||
// even though a satisfactory solution exists.
|
||||
let verify_bound = self.projection_bound(env_bounds, projection_ty);
|
||||
let generic = GenericKind::Projection(projection_ty);
|
||||
self.infcx
|
||||
.verify_generic_bound(origin, generic.clone(), region, verify_bound);
|
||||
}
|
||||
|
||||
fn type_bound(&self, ty: Ty<'tcx>) -> VerifyBound<'tcx> {
|
||||
match ty.sty {
|
||||
ty::TyParam(p) => self.param_bound(p),
|
||||
ty::TyProjection(data) => {
|
||||
let declared_bounds = self.projection_declared_bounds(data);
|
||||
self.projection_bound(declared_bounds, data)
|
||||
}
|
||||
_ => self.recursive_type_bound(ty),
|
||||
}
|
||||
}
|
||||
|
||||
fn param_bound(&self, param_ty: ty::ParamTy) -> VerifyBound<'tcx> {
|
||||
debug!("param_bound(param_ty={:?})", param_ty);
|
||||
|
||||
let mut param_bounds = self.declared_generic_bounds_from_env(GenericKind::Param(param_ty));
|
||||
|
||||
// Add in the default bound of fn body that applies to all in
|
||||
// scope type parameters:
|
||||
param_bounds.extend(self.implicit_region_bound);
|
||||
|
||||
VerifyBound::AnyRegion(param_bounds)
|
||||
}
|
||||
|
||||
fn projection_declared_bounds(
|
||||
&self,
|
||||
projection_ty: ty::ProjectionTy<'tcx>,
|
||||
) -> Vec<ty::Region<'tcx>> {
|
||||
// First assemble bounds from where clauses and traits.
|
||||
|
||||
let mut declared_bounds =
|
||||
self.declared_generic_bounds_from_env(GenericKind::Projection(projection_ty));
|
||||
|
||||
declared_bounds
|
||||
.extend_from_slice(&self.declared_projection_bounds_from_trait(projection_ty));
|
||||
|
||||
declared_bounds
|
||||
}
|
||||
|
||||
fn projection_bound(
|
||||
&self,
|
||||
declared_bounds: Vec<ty::Region<'tcx>>,
|
||||
projection_ty: ty::ProjectionTy<'tcx>,
|
||||
) -> VerifyBound<'tcx> {
|
||||
debug!(
|
||||
"projection_bound(declared_bounds={:?}, projection_ty={:?})",
|
||||
declared_bounds,
|
||||
projection_ty
|
||||
);
|
||||
|
||||
// see the extensive comment in projection_must_outlive
|
||||
let ty = self.infcx
|
||||
.tcx
|
||||
.mk_projection(projection_ty.item_def_id, projection_ty.substs);
|
||||
let recursive_bound = self.recursive_type_bound(ty);
|
||||
|
||||
VerifyBound::AnyRegion(declared_bounds).or(recursive_bound)
|
||||
}
|
||||
|
||||
fn recursive_type_bound(&self, ty: Ty<'tcx>) -> VerifyBound<'tcx> {
|
||||
let mut bounds = vec![];
|
||||
|
||||
for subty in ty.walk_shallow() {
|
||||
bounds.push(self.type_bound(subty));
|
||||
}
|
||||
|
||||
let mut regions = ty.regions();
|
||||
regions.retain(|r| !r.is_late_bound()); // ignore late-bound regions
|
||||
bounds.push(VerifyBound::AllRegions(regions));
|
||||
|
||||
// remove bounds that must hold, since they are not interesting
|
||||
bounds.retain(|b| !b.must_hold());
|
||||
|
||||
if bounds.len() == 1 {
|
||||
bounds.pop().unwrap()
|
||||
} else {
|
||||
VerifyBound::AllBounds(bounds)
|
||||
}
|
||||
}
|
||||
|
||||
fn declared_generic_bounds_from_env(
|
||||
&self,
|
||||
generic: GenericKind<'tcx>,
|
||||
) -> Vec<ty::Region<'tcx>> {
|
||||
let tcx = self.tcx();
|
||||
|
||||
// To start, collect bounds from user environment. Note that
|
||||
// parameter environments are already elaborated, so we don't
|
||||
// have to worry about that. Comparing using `==` is a bit
|
||||
// dubious for projections, but it will work for simple cases
|
||||
// like `T` and `T::Item`. It may not work as well for things
|
||||
// like `<T as Foo<'a>>::Item`.
|
||||
let generic_ty = generic.to_ty(tcx);
|
||||
let c_b = self.param_env.caller_bounds;
|
||||
let mut param_bounds = self.collect_outlives_from_predicate_list(generic_ty, c_b);
|
||||
|
||||
// Next, collect regions we scraped from the well-formedness
|
||||
// constraints in the fn signature. To do that, we walk the list
|
||||
// of known relations from the fn ctxt.
|
||||
//
|
||||
// This is crucial because otherwise code like this fails:
|
||||
//
|
||||
// fn foo<'a, A>(x: &'a A) { x.bar() }
|
||||
//
|
||||
// The problem is that the type of `x` is `&'a A`. To be
|
||||
// well-formed, then, A must outlive `'a`, but we
|
||||
// don't know that this holds from first principles.
|
||||
for &(r, p) in self.region_bound_pairs {
|
||||
debug!("generic={:?} p={:?}", generic, p);
|
||||
if generic == p {
|
||||
param_bounds.push(r);
|
||||
}
|
||||
}
|
||||
|
||||
param_bounds
|
||||
}
|
||||
|
||||
/// Given a projection like `<T as Foo<'x>>::Bar`, returns any bounds
|
||||
/// declared in the trait definition. For example, if the trait were
|
||||
///
|
||||
/// ```rust
|
||||
/// trait Foo<'a> {
|
||||
/// type Bar: 'a;
|
||||
/// }
|
||||
/// ```
|
||||
///
|
||||
/// then this function would return `'x`. This is subject to the
|
||||
/// limitations around higher-ranked bounds described in
|
||||
/// `region_bounds_declared_on_associated_item`.
|
||||
fn declared_projection_bounds_from_trait(
|
||||
&self,
|
||||
projection_ty: ty::ProjectionTy<'tcx>,
|
||||
) -> Vec<ty::Region<'tcx>> {
|
||||
debug!("projection_bounds(projection_ty={:?})", projection_ty);
|
||||
let mut bounds = self.region_bounds_declared_on_associated_item(projection_ty.item_def_id);
|
||||
for r in &mut bounds {
|
||||
*r = r.subst(self.tcx(), projection_ty.substs);
|
||||
}
|
||||
bounds
|
||||
}
|
||||
|
||||
/// Given the def-id of an associated item, returns any region
|
||||
/// bounds attached to that associated item from the trait definition.
|
||||
///
|
||||
/// For example:
|
||||
///
|
||||
/// ```rust
|
||||
/// trait Foo<'a> {
|
||||
/// type Bar: 'a;
|
||||
/// }
|
||||
/// ```
|
||||
///
|
||||
/// If we were given the def-id of `Foo::Bar`, we would return
|
||||
/// `'a`. You could then apply the substitutions from the
|
||||
/// projection to convert this into your namespace. This also
|
||||
/// works if the user writes `where <Self as Foo<'a>>::Bar: 'a` on
|
||||
/// the trait. In fact, it works by searching for just such a
|
||||
/// where-clause.
|
||||
///
|
||||
/// It will not, however, work for higher-ranked bounds like:
|
||||
///
|
||||
/// ```rust
|
||||
/// trait Foo<'a, 'b>
|
||||
/// where for<'x> <Self as Foo<'x, 'b>>::Bar: 'x
|
||||
/// {
|
||||
/// type Bar;
|
||||
/// }
|
||||
/// ```
|
||||
///
|
||||
/// This is for simplicity, and because we are not really smart
|
||||
/// enough to cope with such bounds anywhere.
|
||||
fn region_bounds_declared_on_associated_item(
|
||||
&self,
|
||||
assoc_item_def_id: DefId,
|
||||
) -> Vec<ty::Region<'tcx>> {
|
||||
let tcx = self.tcx();
|
||||
let assoc_item = tcx.associated_item(assoc_item_def_id);
|
||||
let trait_def_id = assoc_item.container.assert_trait();
|
||||
let trait_predicates = tcx.predicates_of(trait_def_id);
|
||||
let identity_substs = Substs::identity_for_item(tcx, assoc_item_def_id);
|
||||
let identity_proj = tcx.mk_projection(assoc_item_def_id, identity_substs);
|
||||
self.collect_outlives_from_predicate_list(
|
||||
identity_proj,
|
||||
traits::elaborate_predicates(tcx, trait_predicates.predicates),
|
||||
)
|
||||
}
|
||||
|
||||
/// Searches through a predicate list for a predicate `T: 'a`.
|
||||
///
|
||||
/// Careful: does not elaborate predicates, and just uses `==`
|
||||
/// when comparing `ty` for equality, so `ty` must be something
|
||||
/// that does not involve inference variables and where you
|
||||
/// otherwise want a precise match.
|
||||
fn collect_outlives_from_predicate_list<I, P>(
|
||||
&self,
|
||||
ty: Ty<'tcx>,
|
||||
predicates: I,
|
||||
) -> Vec<ty::Region<'tcx>>
|
||||
where
|
||||
I: IntoIterator<Item = P>,
|
||||
P: AsRef<ty::Predicate<'tcx>>,
|
||||
{
|
||||
predicates
|
||||
.into_iter()
|
||||
.filter_map(|p| p.as_ref().to_opt_type_outlives())
|
||||
.filter_map(|p| self.tcx().no_late_bound_regions(&p))
|
||||
.filter(|p| p.0 == ty)
|
||||
.map(|p| p.1)
|
||||
.collect()
|
||||
}
|
||||
}
|
src/librustc/infer/region_constraints/README.md (new file, 70 lines)
@ -0,0 +1,70 @@
|
||||
# Region constraint collection
|
||||
|
||||
## Terminology
|
||||
|
||||
Note that we use the terms region and lifetime interchangeably.
|
||||
|
||||
## Introduction
|
||||
|
||||
As described in the [inference README](../README.md), and unlike
|
||||
normal type inference, which is similar in spirit to H-M and thus
|
||||
works progressively, the region type inference works by accumulating
|
||||
constraints over the course of a function. Finally, at the end of
|
||||
processing a function, we process and solve the constraints all at
|
||||
once.
|
||||
|
||||
The constraints are always of one of three possible forms:
|
||||
|
||||
- `ConstrainVarSubVar(Ri, Rj)` states that region variable Ri must be
|
||||
a subregion of Rj
|
||||
- `ConstrainRegSubVar(R, Ri)` states that the concrete region R (which
|
||||
must not be a variable) must be a subregion of the variable Ri
|
||||
- `ConstrainVarSubReg(Ri, R)` states that the variable Ri should be less
|
||||
than the concrete region R. This is kind of deprecated and ought to
|
||||
be replaced with a verify (they essentially play the same role).
|
||||
|
||||
In addition to constraints, we also gather up a set of "verifys"
|
||||
(what, you don't think Verify is a noun? Get used to it my
|
||||
friend!). These represent relations that must hold but which don't
|
||||
influence inference proper. These take the form of:
|
||||
|
||||
- `VerifyRegSubReg(Ri, Rj)` indicates that Ri <= Rj must hold,
|
||||
where Rj is not an inference variable (and Ri may or may not contain
|
||||
one). This doesn't influence inference because we will already have
|
||||
inferred Ri to be as small as possible, so then we just test whether
|
||||
that result was less than Rj or not.
|
||||
- `VerifyGenericBound(R, Vb)` is a more complex expression which tests
|
||||
that the region R must satisfy the bound `Vb`. The bounds themselves
|
||||
may have structure like "must outlive one of the following regions"
|
||||
or "must outlive ALL of the following regions. These bounds arise
|
||||
from constraints like `T: 'a` -- if we know that `T: 'b` and `T: 'c`
|
||||
(say, from where clauses), then we can conclude that `T: 'a` if `'b:
|
||||
'a` *or* `'c: 'a`. A concrete example is sketched below.
|
||||
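To make the `VerifyGenericBound` case concrete, here is a small,
self-contained example (hypothetical user code, not part of this commit):
the struct field forces `T: 'a`, the function only declares `T: 'b`, and
the requirement is discharged because `'b: 'a` is also declared, which is
exactly the "outlives one of these known regions" check described above.

```rust
// Hypothetical illustration: building `Holder<'a, T>` requires `T: 'a`.
// The bounds only state `T: 'b`, but together with `'b: 'a` that suffices;
// checking "does `'b: 'a` (or some other known bound) hold?" is what a
// verify step performs once region inference has produced final values.
struct Holder<'a, T: 'a> {
    value: &'a T,
}

fn make_holder<'a, 'b: 'a, T: 'b>(value: &'a T) -> Holder<'a, T> {
    Holder { value }
}

fn main() {
    let x = 22;
    let holder = make_holder(&x);
    println!("{}", holder.value);
}
```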
|
||||
## Building up the constraints
|
||||
|
||||
Variables and constraints are created using the following methods:
|
||||
|
||||
- `new_region_var()` creates a new, unconstrained region variable;
|
||||
- `make_subregion(Ri, Rj)` states that Ri is a subregion of Rj
|
||||
- `lub_regions(Ri, Rj) -> Rk` returns a region Rk which is
|
||||
the smallest region that is greater than both Ri and Rj
|
||||
- `glb_regions(Ri, Rj) -> Rk` returns a region Rk which is
|
||||
the greatest region that is smaller than both Ri and Rj
|
||||
|
||||
The actual region resolution algorithm is not entirely
|
||||
obvious, though it is also not overly complex.
|
||||
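The sketch below is a toy model, written for illustration only (it is not
the rustc implementation and its names are made up): variables are created
up front, `<=` constraints are merely recorded as they are encountered, and
`resolve` solves everything at the end by growing each variable's value to
a fixed point, in the spirit of the lexical resolver.

```rust
use std::collections::HashMap;

#[derive(Clone, Copy, PartialEq, Eq, Hash, Debug)]
struct RegionVid(usize);

#[derive(Default)]
struct ToyCollector {
    num_vars: usize,
    var_sub_var: Vec<(RegionVid, RegionVid)>, // constraints Ri <= Rj
    reg_sub_var: Vec<(u32, RegionVid)>,       // constraints R <= Ri, R concrete
}

impl ToyCollector {
    fn new_region_var(&mut self) -> RegionVid {
        let vid = RegionVid(self.num_vars);
        self.num_vars += 1;
        vid
    }

    fn make_subregion(&mut self, sub: RegionVid, sup: RegionVid) {
        self.var_sub_var.push((sub, sup));
    }

    fn add_lower_bound(&mut self, region: u32, var: RegionVid) {
        self.reg_sub_var.push((region, var));
    }

    /// Solve all constraints at once: start every variable at the smallest
    /// value and keep growing lower bounds until nothing changes.
    fn resolve(&self) -> HashMap<RegionVid, u32> {
        let mut values: HashMap<RegionVid, u32> =
            (0..self.num_vars).map(|i| (RegionVid(i), 0)).collect();
        loop {
            let mut changed = false;
            for &(r, v) in &self.reg_sub_var {
                let slot = values.get_mut(&v).unwrap();
                if *slot < r {
                    *slot = r;
                    changed = true;
                }
            }
            for &(sub, sup) in &self.var_sub_var {
                let lower = values[&sub];
                let slot = values.get_mut(&sup).unwrap();
                if *slot < lower {
                    *slot = lower;
                    changed = true;
                }
            }
            if !changed {
                return values;
            }
        }
    }
}

fn main() {
    let mut collector = ToyCollector::default();
    let a = collector.new_region_var();
    let b = collector.new_region_var();
    collector.add_lower_bound(3, a); // some concrete region must fit below `a`
    collector.make_subregion(a, b);  // `a <= b`, so `b` must grow to at least 3
    println!("{:?}", collector.resolve());
}
```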
|
||||
## Snapshotting
|
||||
|
||||
It is also permitted to try (and rollback) changes to the graph. This
|
||||
is done by invoking `start_snapshot()`, which returns a value. Then
|
||||
later you can call `rollback_to()` which undoes the work.
|
||||
Alternatively, you can call `commit()` which ends all snapshots.
|
||||
Snapshots can be recursive---so you can start a snapshot when another
|
||||
is in progress, but only the root snapshot can "commit".
|
||||
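As a rough sketch of that discipline (an illustrative toy, not the real
collector, which stores richer `UndoLogEntry` values), a snapshot can simply
remember the current undo-log length, and rolling back replays the log in
reverse down to that length:

```rust
// Toy illustration of the snapshot / rollback pattern described above.
struct Snapshot {
    length: usize,
}

enum UndoEntry {
    AddedEdge, // the most recent edge in `edges` was added and can be popped
}

struct Graph {
    edges: Vec<(u32, u32)>,
    undo_log: Vec<UndoEntry>,
}

impl Graph {
    fn start_snapshot(&mut self) -> Snapshot {
        Snapshot { length: self.undo_log.len() }
    }

    fn add_edge(&mut self, a: u32, b: u32) {
        self.edges.push((a, b));
        self.undo_log.push(UndoEntry::AddedEdge);
    }

    fn rollback_to(&mut self, snapshot: Snapshot) {
        while self.undo_log.len() > snapshot.length {
            match self.undo_log.pop().unwrap() {
                UndoEntry::AddedEdge => {
                    self.edges.pop();
                }
            }
        }
    }

    fn commit(&mut self, snapshot: Snapshot) {
        // Keep the changes; only a root snapshot may clear the log.
        if snapshot.length == 0 {
            self.undo_log.clear();
        }
    }
}

fn main() {
    let mut graph = Graph { edges: vec![], undo_log: vec![] };

    let root = graph.start_snapshot();
    graph.add_edge(0, 1);

    let nested = graph.start_snapshot();
    graph.add_edge(1, 2);
    graph.rollback_to(nested); // undoes only the work done since `nested`

    assert_eq!(graph.edges, vec![(0, 1)]);
    graph.commit(root);
}
```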
|
||||
## Skolemization
|
||||
|
||||
For a discussion on skolemization and higher-ranked subtyping, please
|
||||
see the module `middle::infer::higher_ranked::doc`.
|
src/librustc/infer/region_constraints/mod.rs (new file, 956 lines)
@ -0,0 +1,956 @@
|
||||
// Copyright 2012-2014 The Rust Project Developers. See the COPYRIGHT
|
||||
// file at the top-level directory of this distribution and at
|
||||
// http://rust-lang.org/COPYRIGHT.
|
||||
//
|
||||
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
|
||||
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
|
||||
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
|
||||
// option. This file may not be copied, modified, or distributed
|
||||
// except according to those terms.
|
||||
|
||||
//! See README.md
|
||||
|
||||
use self::UndoLogEntry::*;
|
||||
use self::CombineMapType::*;
|
||||
|
||||
use super::{MiscVariable, RegionVariableOrigin, SubregionOrigin};
|
||||
use super::unify_key;
|
||||
|
||||
use rustc_data_structures::indexed_vec::IndexVec;
|
||||
use rustc_data_structures::fx::{FxHashMap, FxHashSet};
|
||||
use rustc_data_structures::unify::{self, UnificationTable};
|
||||
use ty::{self, Ty, TyCtxt};
|
||||
use ty::{Region, RegionVid};
|
||||
use ty::ReStatic;
|
||||
use ty::{BrFresh, ReLateBound, ReSkolemized, ReVar};
|
||||
|
||||
use std::collections::BTreeMap;
|
||||
use std::fmt;
|
||||
use std::mem;
|
||||
use std::u32;
|
||||
|
||||
mod taint;
|
||||
|
||||
pub struct RegionConstraintCollector<'tcx> {
|
||||
/// For each `RegionVid`, the corresponding `RegionVariableOrigin`.
|
||||
var_origins: IndexVec<RegionVid, RegionVariableOrigin>,
|
||||
|
||||
data: RegionConstraintData<'tcx>,
|
||||
|
||||
/// For a given pair of regions (R1, R2), maps to a region R3 that
|
||||
/// is designated as their LUB (edges R1 <= R3 and R2 <= R3
|
||||
/// exist). This prevents us from making many such regions.
|
||||
lubs: CombineMap<'tcx>,
|
||||
|
||||
/// For a given pair of regions (R1, R2), maps to a region R3 that
|
||||
/// is designated as their GLB (edges R3 <= R1 and R3 <= R2
|
||||
/// exist). This prevents us from making many such regions.
|
||||
glbs: CombineMap<'tcx>,
|
||||
|
||||
/// Number of skolemized variables currently active.
|
||||
skolemization_count: u32,
|
||||
|
||||
/// Global counter used during the GLB algorithm to create unique
|
||||
/// names for fresh bound regions
|
||||
bound_count: u32,
|
||||
|
||||
/// The undo log records actions that might later be undone.
|
||||
///
|
||||
/// Note: when the undo_log is empty, we are not actively
|
||||
/// snapshotting. When the `start_snapshot()` method is called, we
|
||||
/// push an OpenSnapshot entry onto the list to indicate that we
|
||||
/// are now actively snapshotting. The reason for this is that
|
||||
/// otherwise we end up adding entries for things like the lower
|
||||
/// bound on a variable and so forth, which can never be rolled
|
||||
/// back.
|
||||
undo_log: Vec<UndoLogEntry<'tcx>>,
|
||||
|
||||
/// When we add an R1 == R2 constraint, we currently add (a) edges
|
||||
/// R1 <= R2 and R2 <= R1 and (b) we unify the two regions in this
|
||||
/// table. You can then call `opportunistic_resolve_var` early
|
||||
/// which will map R1 and R2 to some common region (i.e., either
|
||||
/// R1 or R2). This is important when dropck and other such code
|
||||
/// is iterating to a fixed point, because otherwise we sometimes
|
||||
/// would wind up with a fresh stream of region variables that
|
||||
/// have been equated but appear distinct.
|
||||
unification_table: UnificationTable<ty::RegionVid>,
|
||||
}
|
||||
|
||||
pub type VarOrigins = IndexVec<RegionVid, RegionVariableOrigin>;
|
||||
|
||||
/// The full set of region constraints gathered up by the collector.
|
||||
/// Describes constraints between the region variables and other
|
||||
/// regions, as well as other conditions that must be verified, or
|
||||
/// assumptions that can be made.
|
||||
#[derive(Default)]
|
||||
pub struct RegionConstraintData<'tcx> {
|
||||
/// Constraints of the form `A <= B`, where either `A` or `B` can
|
||||
/// be a region variable (or neither, as it happens).
|
||||
pub constraints: BTreeMap<Constraint<'tcx>, SubregionOrigin<'tcx>>,
|
||||
|
||||
/// A "verify" is something that we need to verify after inference
|
||||
/// is done, but which does not directly affect inference in any
|
||||
/// way.
|
||||
///
|
||||
/// An example is a `A <= B` where neither `A` nor `B` are
|
||||
/// inference variables.
|
||||
pub verifys: Vec<Verify<'tcx>>,
|
||||
|
||||
/// A "given" is a relationship that is known to hold. In
|
||||
/// particular, we often know from closure fn signatures that a
|
||||
/// particular free region must be a subregion of a region
|
||||
/// variable:
|
||||
///
|
||||
/// foo.iter().filter(<'a> |x: &'a &'b T| ...)
|
||||
///
|
||||
/// In situations like this, `'b` is in fact a region variable
|
||||
/// introduced by the call to `iter()`, and `'a` is a bound region
|
||||
/// on the closure (as indicated by the `<'a>` prefix). If we are
|
||||
/// naive, we wind up inferring that `'b` must be `'static`,
|
||||
/// because we require that it be greater than `'a` and we do not
|
||||
/// know what `'a` is precisely.
|
||||
///
|
||||
/// This hashmap is used to avoid that naive scenario. Basically
|
||||
/// we record the fact that `'a <= 'b` is implied by the fn
|
||||
/// signature, and then ignore the constraint when solving
|
||||
/// equations. This is a bit of a hack but seems to work.
|
||||
pub givens: FxHashSet<(Region<'tcx>, ty::RegionVid)>,
|
||||
}
|
||||
|
||||
/// A constraint that influences the inference process.
|
||||
#[derive(Clone, Copy, PartialEq, Eq, Hash, Debug, PartialOrd, Ord)]
|
||||
pub enum Constraint<'tcx> {
|
||||
/// One region variable is subregion of another
|
||||
VarSubVar(RegionVid, RegionVid),
|
||||
|
||||
/// Concrete region is subregion of region variable
|
||||
RegSubVar(Region<'tcx>, RegionVid),
|
||||
|
||||
/// Region variable is subregion of concrete region. This does not
|
||||
/// directly affect inference, but instead is checked after
|
||||
/// inference is complete.
|
||||
VarSubReg(RegionVid, Region<'tcx>),
|
||||
|
||||
/// A constraint where neither side is a variable. This does not
|
||||
/// directly affect inference, but instead is checked after
|
||||
/// inference is complete.
|
||||
RegSubReg(Region<'tcx>, Region<'tcx>),
|
||||
}
|
||||
|
||||
/// VerifyGenericBound(T, _, R, RS): The parameter type `T` (or
|
||||
/// associated type) must outlive the region `R`. `T` is known to
|
||||
/// outlive `RS`. Therefore verify that `R <= RS[i]` for some
|
||||
/// `i`. Inference variables may be involved (but this verification
|
||||
/// step doesn't influence inference).
|
||||
#[derive(Debug)]
|
||||
pub struct Verify<'tcx> {
|
||||
pub kind: GenericKind<'tcx>,
|
||||
pub origin: SubregionOrigin<'tcx>,
|
||||
pub region: Region<'tcx>,
|
||||
pub bound: VerifyBound<'tcx>,
|
||||
}
|
||||
|
||||
#[derive(Copy, Clone, PartialEq, Eq)]
|
||||
pub enum GenericKind<'tcx> {
|
||||
Param(ty::ParamTy),
|
||||
Projection(ty::ProjectionTy<'tcx>),
|
||||
}
|
||||
|
||||
/// When we introduce a verification step, we wish to test that a
|
||||
/// particular region (let's call it `'min`) meets some bound.
|
||||
/// The bound is described by the following grammar:
|
||||
#[derive(Debug)]
|
||||
pub enum VerifyBound<'tcx> {
|
||||
/// B = exists {R} --> some 'r in {R} must outlive 'min
|
||||
///
|
||||
/// Put another way, the subject value is known to outlive all
|
||||
/// regions in {R}, so if any of those outlives 'min, then the
|
||||
/// bound is met.
|
||||
AnyRegion(Vec<Region<'tcx>>),
|
||||
|
||||
/// B = forall {R} --> all 'r in {R} must outlive 'min
|
||||
///
|
||||
/// Put another way, the subject value is known to outlive some
|
||||
/// region in {R}, so if all of those outlive 'min, then the bound
|
||||
/// is met.
|
||||
AllRegions(Vec<Region<'tcx>>),
|
||||
|
||||
/// B = exists {B} --> 'min must meet some bound b in {B}
|
||||
AnyBound(Vec<VerifyBound<'tcx>>),
|
||||
|
||||
/// B = forall {B} --> 'min must meet all bounds b in {B}
|
||||
AllBounds(Vec<VerifyBound<'tcx>>),
|
||||
}
|
||||
|
||||
#[derive(Copy, Clone, PartialEq, Eq, Hash)]
|
||||
struct TwoRegions<'tcx> {
|
||||
a: Region<'tcx>,
|
||||
b: Region<'tcx>,
|
||||
}
|
||||
|
||||
#[derive(Copy, Clone, PartialEq)]
|
||||
enum UndoLogEntry<'tcx> {
|
||||
/// Pushed when we start a snapshot.
|
||||
OpenSnapshot,
|
||||
|
||||
/// Replaces an `OpenSnapshot` when a snapshot is committed, but
|
||||
/// that snapshot is not the root. If the root snapshot is
|
||||
/// unrolled, all nested snapshots must be committed.
|
||||
CommitedSnapshot,
|
||||
|
||||
/// We added `RegionVid`
|
||||
AddVar(RegionVid),
|
||||
|
||||
/// We added the given `constraint`
|
||||
AddConstraint(Constraint<'tcx>),
|
||||
|
||||
/// We added the given `verify`
|
||||
AddVerify(usize),
|
||||
|
||||
/// We added the given `given`
|
||||
AddGiven(Region<'tcx>, ty::RegionVid),
|
||||
|
||||
/// We added a GLB/LUB "combination variable"
|
||||
AddCombination(CombineMapType, TwoRegions<'tcx>),
|
||||
|
||||
/// During skolemization, we sometimes purge entries from the undo
|
||||
/// log in a kind of minisnapshot (unlike other snapshots, this
|
||||
/// purging actually takes place *on success*). In that case, we
|
||||
/// replace the corresponding entry with `Noop` so as to avoid the
|
||||
/// need to do a bunch of swapping. (We can't use `swap_remove` as
|
||||
/// the order of the vector is important.)
|
||||
Purged,
|
||||
}
|
||||
|
||||
#[derive(Copy, Clone, PartialEq)]
|
||||
enum CombineMapType {
|
||||
Lub,
|
||||
Glb,
|
||||
}
|
||||
|
||||
type CombineMap<'tcx> = FxHashMap<TwoRegions<'tcx>, RegionVid>;
|
||||
|
||||
pub struct RegionSnapshot {
|
||||
length: usize,
|
||||
region_snapshot: unify::Snapshot<ty::RegionVid>,
|
||||
skolemization_count: u32,
|
||||
}
|
||||
|
||||
/// When working with skolemized regions, we often wish to find all of
|
||||
/// the regions that are either reachable from a skolemized region, or
|
||||
/// which can reach a skolemized region, or both. We call such regions
|
||||
/// *tainted* regions. This struct allows you to decide what set of
|
||||
/// tainted regions you want.
|
||||
#[derive(Debug)]
|
||||
pub struct TaintDirections {
|
||||
incoming: bool,
|
||||
outgoing: bool,
|
||||
}
|
||||
|
||||
impl TaintDirections {
|
||||
pub fn incoming() -> Self {
|
||||
TaintDirections {
|
||||
incoming: true,
|
||||
outgoing: false,
|
||||
}
|
||||
}
|
||||
|
||||
pub fn outgoing() -> Self {
|
||||
TaintDirections {
|
||||
incoming: false,
|
||||
outgoing: true,
|
||||
}
|
||||
}
|
||||
|
||||
pub fn both() -> Self {
|
||||
TaintDirections {
|
||||
incoming: true,
|
||||
outgoing: true,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl<'tcx> RegionConstraintCollector<'tcx> {
|
||||
pub fn new() -> RegionConstraintCollector<'tcx> {
|
||||
RegionConstraintCollector {
|
||||
var_origins: VarOrigins::default(),
|
||||
data: RegionConstraintData::default(),
|
||||
lubs: FxHashMap(),
|
||||
glbs: FxHashMap(),
|
||||
skolemization_count: 0,
|
||||
bound_count: 0,
|
||||
undo_log: Vec::new(),
|
||||
unification_table: UnificationTable::new(),
|
||||
}
|
||||
}
|
||||
|
||||
pub fn var_origins(&self) -> &VarOrigins {
|
||||
&self.var_origins
|
||||
}
|
||||
|
||||
/// Once all the constraints have been gathered, extract out the final data.
|
||||
///
|
||||
/// Not legal during a snapshot.
|
||||
pub fn into_origins_and_data(self) -> (VarOrigins, RegionConstraintData<'tcx>) {
|
||||
assert!(!self.in_snapshot());
|
||||
(self.var_origins, self.data)
|
||||
}
|
||||
|
||||
/// Takes (and clears) the current set of constraints. Note that
|
||||
/// the set of variables remains intact, but all relationships
|
||||
/// between them are reset. This is used during NLL checking to
|
||||
/// grab the set of constraints that arose from a particular
|
||||
/// operation.
|
||||
///
|
||||
/// We don't want to leak relationships between variables across
|
||||
/// points: just because (say) `r1 == r2` was true at some
|
||||
/// point P in the graph doesn't imply that it will be true at
|
||||
/// some other point Q, in NLL.
|
||||
///
|
||||
/// Not legal during a snapshot.
|
||||
pub fn take_and_reset_data(&mut self) -> RegionConstraintData<'tcx> {
|
||||
assert!(!self.in_snapshot());
|
||||
|
||||
// If you add a new field to `RegionConstraintCollector`, you
|
||||
// should think carefully about whether it needs to be cleared
|
||||
// or updated in some way.
|
||||
let RegionConstraintCollector {
|
||||
var_origins,
|
||||
data,
|
||||
lubs,
|
||||
glbs,
|
||||
skolemization_count,
|
||||
bound_count: _,
|
||||
undo_log: _,
|
||||
unification_table,
|
||||
} = self;
|
||||
|
||||
assert_eq!(*skolemization_count, 0);
|
||||
|
||||
// Clear the tables of (lubs, glbs), so that we will create
|
||||
// fresh regions if we do a LUB operation. As it happens,
|
||||
// LUB/GLB are not performed by the MIR type-checker, which is
|
||||
// the one that uses this method, but it's good to be correct.
|
||||
lubs.clear();
|
||||
glbs.clear();
|
||||
|
||||
// Clear all unifications and recreate the variables in a "now
|
||||
// un-unified" state. Note that when we unify `a` and `b`, we
|
||||
// also insert `a <= b` and `b <= a` edges, so the
|
||||
// `RegionConstraintData` contains the relationship here.
|
||||
*unification_table = UnificationTable::new();
|
||||
for vid in var_origins.indices() {
|
||||
unification_table.new_key(unify_key::RegionVidKey { min_vid: vid });
|
||||
}
|
||||
|
||||
mem::replace(data, RegionConstraintData::default())
|
||||
}
|
||||
|
||||
fn in_snapshot(&self) -> bool {
|
||||
!self.undo_log.is_empty()
|
||||
}
|
||||
|
||||
pub fn start_snapshot(&mut self) -> RegionSnapshot {
|
||||
let length = self.undo_log.len();
|
||||
debug!("RegionConstraintCollector: start_snapshot({})", length);
|
||||
self.undo_log.push(OpenSnapshot);
|
||||
RegionSnapshot {
|
||||
length,
|
||||
region_snapshot: self.unification_table.snapshot(),
|
||||
skolemization_count: self.skolemization_count,
|
||||
}
|
||||
}
|
||||
|
||||
pub fn commit(&mut self, snapshot: RegionSnapshot) {
|
||||
debug!("RegionConstraintCollector: commit({})", snapshot.length);
|
||||
assert!(self.undo_log.len() > snapshot.length);
|
||||
assert!(self.undo_log[snapshot.length] == OpenSnapshot);
|
||||
assert!(
|
||||
self.skolemization_count == snapshot.skolemization_count,
|
||||
"failed to pop skolemized regions: {} now vs {} at start",
|
||||
self.skolemization_count,
|
||||
snapshot.skolemization_count
|
||||
);
|
||||
|
||||
if snapshot.length == 0 {
|
||||
self.undo_log.truncate(0);
|
||||
} else {
|
||||
(*self.undo_log)[snapshot.length] = CommitedSnapshot;
|
||||
}
|
||||
self.unification_table.commit(snapshot.region_snapshot);
|
||||
}
|
||||
|
||||
pub fn rollback_to(&mut self, snapshot: RegionSnapshot) {
|
||||
debug!("RegionConstraintCollector: rollback_to({:?})", snapshot);
|
||||
assert!(self.undo_log.len() > snapshot.length);
|
||||
assert!(self.undo_log[snapshot.length] == OpenSnapshot);
|
||||
while self.undo_log.len() > snapshot.length + 1 {
|
||||
let undo_entry = self.undo_log.pop().unwrap();
|
||||
self.rollback_undo_entry(undo_entry);
|
||||
}
|
||||
let c = self.undo_log.pop().unwrap();
|
||||
assert!(c == OpenSnapshot);
|
||||
self.skolemization_count = snapshot.skolemization_count;
|
||||
self.unification_table.rollback_to(snapshot.region_snapshot);
|
||||
}
|
||||
|
||||
fn rollback_undo_entry(&mut self, undo_entry: UndoLogEntry<'tcx>) {
|
||||
match undo_entry {
|
||||
OpenSnapshot => {
|
||||
panic!("Failure to observe stack discipline");
|
||||
}
|
||||
Purged | CommitedSnapshot => {
|
||||
// nothing to do here
|
||||
}
|
||||
AddVar(vid) => {
|
||||
self.var_origins.pop().unwrap();
|
||||
assert_eq!(self.var_origins.len(), vid.index as usize);
|
||||
}
|
||||
AddConstraint(ref constraint) => {
|
||||
self.data.constraints.remove(constraint);
|
||||
}
|
||||
AddVerify(index) => {
|
||||
self.data.verifys.pop();
|
||||
assert_eq!(self.data.verifys.len(), index);
|
||||
}
|
||||
AddGiven(sub, sup) => {
|
||||
self.data.givens.remove(&(sub, sup));
|
||||
}
|
||||
AddCombination(Glb, ref regions) => {
|
||||
self.glbs.remove(regions);
|
||||
}
|
||||
AddCombination(Lub, ref regions) => {
|
||||
self.lubs.remove(regions);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
pub fn new_region_var(&mut self, origin: RegionVariableOrigin) -> RegionVid {
|
||||
let vid = self.var_origins.push(origin.clone());
|
||||
|
||||
let u_vid = self.unification_table
|
||||
.new_key(unify_key::RegionVidKey { min_vid: vid });
|
||||
assert_eq!(vid, u_vid);
|
||||
if self.in_snapshot() {
|
||||
self.undo_log.push(AddVar(vid));
|
||||
}
|
||||
debug!(
|
||||
"created new region variable {:?} with origin {:?}",
|
||||
vid,
|
||||
origin
|
||||
);
|
||||
return vid;
|
||||
}
|
||||
|
||||
/// Returns the origin for the given variable.
|
||||
pub fn var_origin(&self, vid: RegionVid) -> RegionVariableOrigin {
|
||||
self.var_origins[vid].clone()
|
||||
}
|
||||
|
||||
/// Creates a new skolemized region. Skolemized regions are fresh
|
||||
/// regions used when performing higher-ranked computations. They
|
||||
/// must be used in a very particular way and are never supposed
|
||||
/// to "escape" out into error messages or the code at large.
|
||||
///
|
||||
/// The idea is to always create a snapshot. Skolemized regions
|
||||
/// can be created in the context of this snapshot, but before the
|
||||
/// snapshot is committed or rolled back, they must be popped
|
||||
/// (using `pop_skolemized_regions`), so that their numbers can be
|
||||
/// recycled. Normally you don't have to think about this: you use
|
||||
/// the APIs in `higher_ranked/mod.rs`, such as
|
||||
/// `skolemize_late_bound_regions` and `plug_leaks`, which will
|
||||
/// guide you on this path (ensure that the `SkolemizationMap` is
|
||||
/// consumed and you are good). There are also somewhat extensive
|
||||
/// comments in `higher_ranked/README.md`.
|
||||
///
|
||||
/// The `snapshot` argument to this function is not really used;
|
||||
/// it's just there to make it explicit which snapshot bounds the
|
||||
/// skolemized region that results. It should always be the top-most snapshot.
|
||||
pub fn push_skolemized(
|
||||
&mut self,
|
||||
tcx: TyCtxt<'_, '_, 'tcx>,
|
||||
br: ty::BoundRegion,
|
||||
snapshot: &RegionSnapshot,
|
||||
) -> Region<'tcx> {
|
||||
assert!(self.in_snapshot());
|
||||
assert!(self.undo_log[snapshot.length] == OpenSnapshot);
|
||||
|
||||
let sc = self.skolemization_count;
|
||||
self.skolemization_count = sc + 1;
|
||||
tcx.mk_region(ReSkolemized(ty::SkolemizedRegionVid { index: sc }, br))
|
||||
}
|
||||
|
||||
/// Removes all the edges to/from the skolemized regions that are
|
||||
/// in `skols`. This is used after a higher-ranked operation
|
||||
/// completes to remove all trace of the skolemized regions
|
||||
/// created in that time.
|
||||
pub fn pop_skolemized(
|
||||
&mut self,
|
||||
_tcx: TyCtxt<'_, '_, 'tcx>,
|
||||
skols: &FxHashSet<ty::Region<'tcx>>,
|
||||
snapshot: &RegionSnapshot,
|
||||
) {
|
||||
debug!("pop_skolemized_regions(skols={:?})", skols);
|
||||
|
||||
assert!(self.in_snapshot());
|
||||
assert!(self.undo_log[snapshot.length] == OpenSnapshot);
|
||||
assert!(
|
||||
self.skolemization_count as usize >= skols.len(),
|
||||
"popping more skolemized variables than actually exist, \
|
||||
sc now = {}, skols.len = {}",
|
||||
self.skolemization_count,
|
||||
skols.len()
|
||||
);
|
||||
|
||||
let last_to_pop = self.skolemization_count;
|
||||
let first_to_pop = last_to_pop - (skols.len() as u32);
|
||||
|
||||
assert!(
|
||||
first_to_pop >= snapshot.skolemization_count,
|
||||
"popping more regions than snapshot contains, \
|
||||
sc now = {}, sc then = {}, skols.len = {}",
|
||||
self.skolemization_count,
|
||||
snapshot.skolemization_count,
|
||||
skols.len()
|
||||
);
|
||||
debug_assert! {
|
||||
skols.iter()
|
||||
.all(|&k| match *k {
|
||||
ty::ReSkolemized(index, _) =>
|
||||
index.index >= first_to_pop &&
|
||||
index.index < last_to_pop,
|
||||
_ =>
|
||||
false
|
||||
}),
|
||||
"invalid skolemization keys or keys out of range ({}..{}): {:?}",
|
||||
snapshot.skolemization_count,
|
||||
self.skolemization_count,
|
||||
skols
|
||||
}
|
||||
|
||||
let constraints_to_kill: Vec<usize> = self.undo_log
|
||||
.iter()
|
||||
.enumerate()
|
||||
.rev()
|
||||
.filter(|&(_, undo_entry)| kill_constraint(skols, undo_entry))
|
||||
.map(|(index, _)| index)
|
||||
.collect();
|
||||
|
||||
for index in constraints_to_kill {
|
||||
let undo_entry = mem::replace(&mut self.undo_log[index], Purged);
|
||||
self.rollback_undo_entry(undo_entry);
|
||||
}
|
||||
|
||||
self.skolemization_count = snapshot.skolemization_count;
|
||||
return;
|
||||
|
||||
fn kill_constraint<'tcx>(
|
||||
skols: &FxHashSet<ty::Region<'tcx>>,
|
||||
undo_entry: &UndoLogEntry<'tcx>,
|
||||
) -> bool {
|
||||
match undo_entry {
|
||||
&AddConstraint(Constraint::VarSubVar(..)) => false,
|
||||
&AddConstraint(Constraint::RegSubVar(a, _)) => skols.contains(&a),
|
||||
&AddConstraint(Constraint::VarSubReg(_, b)) => skols.contains(&b),
|
||||
&AddConstraint(Constraint::RegSubReg(a, b)) => {
|
||||
skols.contains(&a) || skols.contains(&b)
|
||||
}
|
||||
&AddGiven(..) => false,
|
||||
&AddVerify(_) => false,
|
||||
&AddCombination(_, ref two_regions) => {
|
||||
skols.contains(&two_regions.a) || skols.contains(&two_regions.b)
|
||||
}
|
||||
&AddVar(..) | &OpenSnapshot | &Purged | &CommitedSnapshot => false,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
pub fn new_bound(
|
||||
&mut self,
|
||||
tcx: TyCtxt<'_, '_, 'tcx>,
|
||||
debruijn: ty::DebruijnIndex,
|
||||
) -> Region<'tcx> {
|
||||
// Creates a fresh bound variable for use in GLB computations.
|
||||
// See discussion of GLB computation in the large comment at
|
||||
// the top of this file for more details.
|
||||
//
|
||||
// This computation is potentially wrong in the face of
|
||||
// rollover. It's conceivable, if unlikely, that one might
|
||||
// wind up with accidental capture for nested functions in
|
||||
// that case, if the outer function had bound regions created
|
||||
// a very long time before and the inner function somehow
|
||||
// wound up rolling over such that supposedly fresh
|
||||
// identifiers were in fact shadowed. For now, we just assert
|
||||
// that there is no rollover -- eventually we should try to be
|
||||
// robust against this possibility, either by checking the set
|
||||
// of bound identifiers that appear in a given expression and
|
||||
// ensure that we generate one that is distinct, or by
|
||||
// changing the representation of bound regions in a fn
|
||||
// declaration
|
||||
|
||||
let sc = self.bound_count;
|
||||
self.bound_count = sc + 1;
|
||||
|
||||
if sc >= self.bound_count {
|
||||
bug!("rollover in RegionInference new_bound()");
|
||||
}
|
||||
|
||||
tcx.mk_region(ReLateBound(debruijn, BrFresh(sc)))
|
||||
}
|
||||
|
||||
fn add_constraint(&mut self, constraint: Constraint<'tcx>, origin: SubregionOrigin<'tcx>) {
|
||||
// cannot add constraints once regions are resolved
|
||||
debug!(
|
||||
"RegionConstraintCollector: add_constraint({:?})",
|
||||
constraint
|
||||
);
|
||||
|
||||
// never overwrite an existing (constraint, origin) - only insert one if it isn't
|
||||
// present in the map yet. This prevents origins from outside the snapshot being
|
||||
// replaced with "less informative" origins e.g. during calls to `can_eq`
|
||||
let in_snapshot = self.in_snapshot();
|
||||
let undo_log = &mut self.undo_log;
|
||||
self.data.constraints.entry(constraint).or_insert_with(|| {
|
||||
if in_snapshot {
|
||||
undo_log.push(AddConstraint(constraint));
|
||||
}
|
||||
origin
|
||||
});
|
||||
}
|
||||
|
||||
fn add_verify(&mut self, verify: Verify<'tcx>) {
|
||||
// cannot add verifys once regions are resolved
|
||||
debug!("RegionConstraintCollector: add_verify({:?})", verify);
|
||||
|
||||
// skip no-op cases known to be satisfied
|
||||
match verify.bound {
|
||||
VerifyBound::AllBounds(ref bs) if bs.len() == 0 => {
|
||||
return;
|
||||
}
|
||||
_ => {}
|
||||
}
|
||||
|
||||
let index = self.data.verifys.len();
|
||||
self.data.verifys.push(verify);
|
||||
if self.in_snapshot() {
|
||||
self.undo_log.push(AddVerify(index));
|
||||
}
|
||||
}
|
||||
|
||||
pub fn add_given(&mut self, sub: Region<'tcx>, sup: ty::RegionVid) {
|
||||
// cannot add givens once regions are resolved
|
||||
if self.data.givens.insert((sub, sup)) {
|
||||
debug!("add_given({:?} <= {:?})", sub, sup);
|
||||
|
||||
if self.in_snapshot() {
|
||||
self.undo_log.push(AddGiven(sub, sup));
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
pub fn make_eqregion(
|
||||
&mut self,
|
||||
origin: SubregionOrigin<'tcx>,
|
||||
sub: Region<'tcx>,
|
||||
sup: Region<'tcx>,
|
||||
) {
|
||||
if sub != sup {
|
||||
// Eventually, it would be nice to add direct support for
|
||||
// equating regions.
|
||||
self.make_subregion(origin.clone(), sub, sup);
|
||||
self.make_subregion(origin, sup, sub);
|
||||
|
||||
if let (ty::ReVar(sub), ty::ReVar(sup)) = (*sub, *sup) {
|
||||
self.unification_table.union(sub, sup);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
pub fn make_subregion(
|
||||
&mut self,
|
||||
origin: SubregionOrigin<'tcx>,
|
||||
sub: Region<'tcx>,
|
||||
sup: Region<'tcx>,
|
||||
) {
|
||||
// cannot add constraints once regions are resolved
|
||||
debug!(
|
||||
"RegionConstraintCollector: make_subregion({:?}, {:?}) due to {:?}",
|
||||
sub,
|
||||
sup,
|
||||
origin
|
||||
);
|
||||
|
||||
match (sub, sup) {
|
||||
(&ReLateBound(..), _) | (_, &ReLateBound(..)) => {
|
||||
span_bug!(
|
||||
origin.span(),
|
||||
"cannot relate bound region: {:?} <= {:?}",
|
||||
sub,
|
||||
sup
|
||||
);
|
||||
}
|
||||
(_, &ReStatic) => {
|
||||
// all regions are subregions of static, so we can ignore this
|
||||
}
|
||||
(&ReVar(sub_id), &ReVar(sup_id)) => {
|
||||
self.add_constraint(Constraint::VarSubVar(sub_id, sup_id), origin);
|
||||
}
|
||||
(_, &ReVar(sup_id)) => {
|
||||
self.add_constraint(Constraint::RegSubVar(sub, sup_id), origin);
|
||||
}
|
||||
(&ReVar(sub_id), _) => {
|
||||
self.add_constraint(Constraint::VarSubReg(sub_id, sup), origin);
|
||||
}
|
||||
_ => {
|
||||
self.add_constraint(Constraint::RegSubReg(sub, sup), origin);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/// See `Verify::VerifyGenericBound`
|
||||
pub fn verify_generic_bound(
|
||||
&mut self,
|
||||
origin: SubregionOrigin<'tcx>,
|
||||
kind: GenericKind<'tcx>,
|
||||
sub: Region<'tcx>,
|
||||
bound: VerifyBound<'tcx>,
|
||||
) {
|
||||
self.add_verify(Verify {
|
||||
kind,
|
||||
origin,
|
||||
region: sub,
|
||||
bound,
|
||||
});
|
||||
}
|
||||
|
||||
pub fn lub_regions(
|
||||
&mut self,
|
||||
tcx: TyCtxt<'_, '_, 'tcx>,
|
||||
origin: SubregionOrigin<'tcx>,
|
||||
a: Region<'tcx>,
|
||||
b: Region<'tcx>,
|
||||
) -> Region<'tcx> {
|
||||
// cannot add constraints once regions are resolved
|
||||
debug!("RegionConstraintCollector: lub_regions({:?}, {:?})", a, b);
|
||||
match (a, b) {
|
||||
(r @ &ReStatic, _) | (_, r @ &ReStatic) => {
|
||||
r // nothing lives longer than static
|
||||
}
|
||||
|
||||
_ if a == b => {
|
||||
a // LUB(a,a) = a
|
||||
}
|
||||
|
||||
_ => self.combine_vars(tcx, Lub, a, b, origin.clone()),
|
||||
}
|
||||
}
|
||||
|
||||
pub fn glb_regions(
|
||||
&mut self,
|
||||
tcx: TyCtxt<'_, '_, 'tcx>,
|
||||
origin: SubregionOrigin<'tcx>,
|
||||
a: Region<'tcx>,
|
||||
b: Region<'tcx>,
|
||||
) -> Region<'tcx> {
|
||||
// cannot add constraints once regions are resolved
|
||||
debug!("RegionConstraintCollector: glb_regions({:?}, {:?})", a, b);
|
||||
match (a, b) {
|
||||
(&ReStatic, r) | (r, &ReStatic) => {
|
||||
r // static lives longer than everything else
|
||||
}
|
||||
|
||||
_ if a == b => {
|
||||
a // GLB(a,a) = a
|
||||
}
|
||||
|
||||
_ => self.combine_vars(tcx, Glb, a, b, origin.clone()),
|
||||
}
|
||||
}
|
||||
|
||||
pub fn opportunistic_resolve_var(
|
||||
&mut self,
|
||||
tcx: TyCtxt<'_, '_, 'tcx>,
|
||||
rid: RegionVid,
|
||||
) -> ty::Region<'tcx> {
|
||||
let vid = self.unification_table.find_value(rid).min_vid;
|
||||
tcx.mk_region(ty::ReVar(vid))
|
||||
}
|
||||
|
||||
fn combine_map(&mut self, t: CombineMapType) -> &mut CombineMap<'tcx> {
|
||||
match t {
|
||||
Glb => &mut self.glbs,
|
||||
Lub => &mut self.lubs,
|
||||
}
|
||||
}
|
||||
|
||||
fn combine_vars(
|
||||
&mut self,
|
||||
tcx: TyCtxt<'_, '_, 'tcx>,
|
||||
t: CombineMapType,
|
||||
a: Region<'tcx>,
|
||||
b: Region<'tcx>,
|
||||
origin: SubregionOrigin<'tcx>,
|
||||
) -> Region<'tcx> {
|
||||
let vars = TwoRegions { a: a, b: b };
|
||||
if let Some(&c) = self.combine_map(t).get(&vars) {
|
||||
return tcx.mk_region(ReVar(c));
|
||||
}
|
||||
let c = self.new_region_var(MiscVariable(origin.span()));
|
||||
self.combine_map(t).insert(vars, c);
|
||||
if self.in_snapshot() {
|
||||
self.undo_log.push(AddCombination(t, vars));
|
||||
}
|
||||
let new_r = tcx.mk_region(ReVar(c));
|
||||
for &old_r in &[a, b] {
|
||||
match t {
|
||||
Glb => self.make_subregion(origin.clone(), new_r, old_r),
|
||||
Lub => self.make_subregion(origin.clone(), old_r, new_r),
|
||||
}
|
||||
}
|
||||
debug!("combine_vars() c={:?}", c);
|
||||
new_r
|
||||
}
|
||||
|
||||
pub fn vars_created_since_snapshot(&self, mark: &RegionSnapshot) -> Vec<RegionVid> {
|
||||
self.undo_log[mark.length..]
|
||||
.iter()
|
||||
.filter_map(|&elt| match elt {
|
||||
AddVar(vid) => Some(vid),
|
||||
_ => None,
|
||||
})
|
||||
.collect()
|
||||
}
|
||||
|
||||
/// Computes all regions that have been related to `r0` since the
|
||||
/// mark `mark` was made---`r0` itself will be the first
|
||||
/// entry. The `directions` parameter controls what kind of
|
||||
/// relations are considered. For example, one can say that only
|
||||
/// "incoming" edges to `r0` are desired, in which case one will
|
||||
/// get the set of regions `{r|r <= r0}`. This is used when
|
||||
/// checking whether skolemized regions are being improperly
|
||||
/// related to other regions.
|
||||
pub fn tainted(
|
||||
&self,
|
||||
tcx: TyCtxt<'_, '_, 'tcx>,
|
||||
mark: &RegionSnapshot,
|
||||
r0: Region<'tcx>,
|
||||
directions: TaintDirections,
|
||||
) -> FxHashSet<ty::Region<'tcx>> {
|
||||
debug!(
|
||||
"tainted(mark={:?}, r0={:?}, directions={:?})",
|
||||
mark,
|
||||
r0,
|
||||
directions
|
||||
);
|
||||
|
||||
// `taint_set` acts as a worklist: we explore all outgoing
|
||||
// edges and add any new regions we find to taint_set. This
|
||||
// is not a terribly efficient implementation.
|
||||
let mut taint_set = taint::TaintSet::new(directions, r0);
|
||||
taint_set.fixed_point(tcx, &self.undo_log[mark.length..], &self.data.verifys);
|
||||
debug!("tainted: result={:?}", taint_set);
|
||||
return taint_set.into_set();
|
||||
}
|
||||
}
|
||||
|
||||
impl fmt::Debug for RegionSnapshot {
|
||||
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
|
||||
write!(
|
||||
f,
|
||||
"RegionSnapshot(length={},skolemization={})",
|
||||
self.length,
|
||||
self.skolemization_count
|
||||
)
|
||||
}
|
||||
}
|
||||
|
||||
impl<'tcx> fmt::Debug for GenericKind<'tcx> {
|
||||
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
|
||||
match *self {
|
||||
GenericKind::Param(ref p) => write!(f, "{:?}", p),
|
||||
GenericKind::Projection(ref p) => write!(f, "{:?}", p),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl<'tcx> fmt::Display for GenericKind<'tcx> {
|
||||
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
|
||||
match *self {
|
||||
GenericKind::Param(ref p) => write!(f, "{}", p),
|
||||
GenericKind::Projection(ref p) => write!(f, "{}", p),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a, 'gcx, 'tcx> GenericKind<'tcx> {
|
||||
pub fn to_ty(&self, tcx: TyCtxt<'a, 'gcx, 'tcx>) -> Ty<'tcx> {
|
||||
match *self {
|
||||
GenericKind::Param(ref p) => p.to_ty(tcx),
|
||||
GenericKind::Projection(ref p) => tcx.mk_projection(p.item_def_id, p.substs),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a, 'gcx, 'tcx> VerifyBound<'tcx> {
|
||||
fn for_each_region(&self, f: &mut FnMut(ty::Region<'tcx>)) {
|
||||
match self {
|
||||
&VerifyBound::AnyRegion(ref rs) | &VerifyBound::AllRegions(ref rs) => for &r in rs {
|
||||
f(r);
|
||||
},
|
||||
|
||||
&VerifyBound::AnyBound(ref bs) | &VerifyBound::AllBounds(ref bs) => for b in bs {
|
||||
b.for_each_region(f);
|
||||
},
|
||||
}
|
||||
}
|
||||
|
||||
pub fn must_hold(&self) -> bool {
|
||||
match self {
|
||||
&VerifyBound::AnyRegion(ref bs) => bs.contains(&&ty::ReStatic),
|
||||
&VerifyBound::AllRegions(ref bs) => bs.is_empty(),
|
||||
&VerifyBound::AnyBound(ref bs) => bs.iter().any(|b| b.must_hold()),
|
||||
&VerifyBound::AllBounds(ref bs) => bs.iter().all(|b| b.must_hold()),
|
||||
}
|
||||
}
|
||||
|
||||
pub fn cannot_hold(&self) -> bool {
|
||||
match self {
|
||||
&VerifyBound::AnyRegion(ref bs) => bs.is_empty(),
|
||||
&VerifyBound::AllRegions(ref bs) => bs.contains(&&ty::ReEmpty),
|
||||
&VerifyBound::AnyBound(ref bs) => bs.iter().all(|b| b.cannot_hold()),
|
||||
&VerifyBound::AllBounds(ref bs) => bs.iter().any(|b| b.cannot_hold()),
|
||||
}
|
||||
}
|
||||
|
||||
pub fn or(self, vb: VerifyBound<'tcx>) -> VerifyBound<'tcx> {
|
||||
if self.must_hold() || vb.cannot_hold() {
|
||||
self
|
||||
} else if self.cannot_hold() || vb.must_hold() {
|
||||
vb
|
||||
} else {
|
||||
VerifyBound::AnyBound(vec![self, vb])
|
||||
}
|
||||
}
|
||||
|
||||
pub fn and(self, vb: VerifyBound<'tcx>) -> VerifyBound<'tcx> {
|
||||
if self.must_hold() && vb.must_hold() {
|
||||
self
|
||||
} else if self.cannot_hold() && vb.cannot_hold() {
|
||||
self
|
||||
} else {
|
||||
VerifyBound::AllBounds(vec![self, vb])
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl<'tcx> RegionConstraintData<'tcx> {
|
||||
/// True if this region constraint data contains no constraints.
|
||||
pub fn is_empty(&self) -> bool {
|
||||
let RegionConstraintData {
|
||||
constraints,
|
||||
verifys,
|
||||
givens,
|
||||
} = self;
|
||||
constraints.is_empty() && verifys.is_empty() && givens.is_empty()
|
||||
}
|
||||
}
|
src/librustc/infer/region_constraints/taint.rs (new file, 96 lines)
@ -0,0 +1,96 @@
|
||||
// Copyright 2012-2014 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.

use super::*;

#[derive(Debug)]
pub(super) struct TaintSet<'tcx> {
    directions: TaintDirections,
    regions: FxHashSet<ty::Region<'tcx>>
}

impl<'tcx> TaintSet<'tcx> {
    pub(super) fn new(directions: TaintDirections,
                      initial_region: ty::Region<'tcx>)
                      -> Self {
        let mut regions = FxHashSet();
        regions.insert(initial_region);
        TaintSet { directions: directions, regions: regions }
    }

    pub(super) fn fixed_point(&mut self,
                              tcx: TyCtxt<'_, '_, 'tcx>,
                              undo_log: &[UndoLogEntry<'tcx>],
                              verifys: &[Verify<'tcx>]) {
        let mut prev_len = 0;
        while prev_len < self.len() {
            debug!("tainted: prev_len = {:?} new_len = {:?}",
                   prev_len, self.len());

            prev_len = self.len();

            for undo_entry in undo_log {
                match undo_entry {
                    &AddConstraint(Constraint::VarSubVar(a, b)) => {
                        self.add_edge(tcx.mk_region(ReVar(a)),
                                      tcx.mk_region(ReVar(b)));
                    }
                    &AddConstraint(Constraint::RegSubVar(a, b)) => {
                        self.add_edge(a, tcx.mk_region(ReVar(b)));
                    }
                    &AddConstraint(Constraint::VarSubReg(a, b)) => {
                        self.add_edge(tcx.mk_region(ReVar(a)), b);
                    }
                    &AddConstraint(Constraint::RegSubReg(a, b)) => {
                        self.add_edge(a, b);
                    }
                    &AddGiven(a, b) => {
                        self.add_edge(a, tcx.mk_region(ReVar(b)));
                    }
                    &AddVerify(i) => {
                        verifys[i].bound.for_each_region(&mut |b| {
                            self.add_edge(verifys[i].region, b);
                        });
                    }
                    &Purged |
                    &AddCombination(..) |
                    &AddVar(..) |
                    &OpenSnapshot |
                    &CommitedSnapshot => {}
                }
            }
        }
    }

    pub(super) fn into_set(self) -> FxHashSet<ty::Region<'tcx>> {
        self.regions
    }

    fn len(&self) -> usize {
        self.regions.len()
    }

    fn add_edge(&mut self,
                source: ty::Region<'tcx>,
                target: ty::Region<'tcx>) {
        if self.directions.incoming {
            if self.regions.contains(&target) {
                self.regions.insert(source);
            }
        }

        if self.directions.outgoing {
            if self.regions.contains(&source) {
                self.regions.insert(target);
            }
        }
    }
}
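The `fixed_point` loop above keeps replaying the recorded constraint edges until the tainted set stops growing. A minimal standalone sketch of the same idea, using plain integer "regions" and `(sub, sup)` edges rather than the compiler's own types, might look like this:

use std::collections::HashSet;

/// Grow `tainted` by following edges until no new regions are added,
/// mirroring the shape of TaintSet::fixed_point. A region is just a u32
/// here, and an edge (a, b) means "a is a subregion of b".
fn taint_fixed_point(initial: u32,
                     edges: &[(u32, u32)],
                     incoming: bool,
                     outgoing: bool) -> HashSet<u32> {
    let mut tainted: HashSet<u32> = HashSet::new();
    tainted.insert(initial);

    let mut prev_len = 0;
    while prev_len < tainted.len() {
        prev_len = tainted.len();
        for &(source, target) in edges {
            // Walk edges backwards: anything flowing *into* a tainted region.
            if incoming && tainted.contains(&target) {
                tainted.insert(source);
            }
            // Walk edges forwards: anything a tainted region flows *into*.
            if outgoing && tainted.contains(&source) {
                tainted.insert(target);
            }
        }
    }
    tainted
}

fn main() {
    let edges = [(0, 1), (1, 2), (3, 1)];
    // Starting from region 1 and following edges in both directions,
    // every region in this little graph ends up tainted.
    println!("{:?}", taint_fixed_point(1, &edges, true, true));
}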
(File diff suppressed because it is too large.)
@@ -74,8 +74,11 @@ impl<'a, 'gcx, 'tcx> TypeFolder<'gcx, 'tcx> for OpportunisticTypeAndRegionResolv

     fn fold_region(&mut self, r: ty::Region<'tcx>) -> ty::Region<'tcx> {
         match *r {
-            ty::ReVar(rid) => self.infcx.region_vars.opportunistic_resolve_var(rid),
-            _ => r,
+            ty::ReVar(rid) =>
+                self.infcx.borrow_region_constraints()
+                          .opportunistic_resolve_var(self.tcx(), rid),
+            _ =>
+                r,
         }
     }
 }

@@ -185,7 +188,11 @@ impl<'a, 'gcx, 'tcx> TypeFolder<'gcx, 'tcx> for FullTypeResolver<'a, 'gcx, 'tcx>

     fn fold_region(&mut self, r: ty::Region<'tcx>) -> ty::Region<'tcx> {
         match *r {
-            ty::ReVar(rid) => self.infcx.region_vars.resolve_var(rid),
+            ty::ReVar(rid) => self.infcx.lexical_region_resolutions
+                                        .borrow()
+                                        .as_ref()
+                                        .expect("region resolution not performed")
+                                        .resolve_var(rid),
             _ => r,
         }
     }
@@ -137,7 +137,8 @@ impl<'combine, 'infcx, 'gcx, 'tcx> TypeRelation<'infcx, 'gcx, 'tcx>
         // from the "cause" field, we could perhaps give more tailored
         // error messages.
         let origin = SubregionOrigin::Subtype(self.fields.trace.clone());
-        self.fields.infcx.region_vars.make_subregion(origin, a, b);
+        self.fields.infcx.borrow_region_constraints()
+                         .make_subregion(origin, a, b);

         Ok(a)
     }
@@ -45,17 +45,21 @@
 #![feature(conservative_impl_trait)]
 #![feature(const_fn)]
 #![feature(core_intrinsics)]
+#![feature(drain_filter)]
 #![feature(i128_type)]
+#![feature(match_default_bindings)]
 #![feature(inclusive_range_syntax)]
 #![cfg_attr(windows, feature(libc))]
 #![feature(macro_vis_matcher)]
 #![feature(never_type)]
 #![feature(nonzero)]
 #![feature(quote)]
+#![feature(refcell_replace_swap)]
 #![feature(rustc_diagnostic_macros)]
 #![feature(slice_patterns)]
 #![feature(specialization)]
 #![feature(unboxed_closures)]
+#![feature(underscore_lifetimes)]
 #![feature(trace_macros)]
 #![feature(test)]
 #![feature(const_atomic_bool_new)]
@@ -161,12 +161,6 @@ declare_lint! {
     "patterns in functions without body were erroneously allowed"
 }

-declare_lint! {
-    pub EXTRA_REQUIREMENT_IN_IMPL,
-    Deny,
-    "detects extra requirements in impls that were erroneously allowed"
-}
-
 declare_lint! {
     pub LEGACY_DIRECTORY_OWNERSHIP,
     Deny,
@@ -254,7 +248,6 @@ impl LintPass for HardwiredLints {
         RESOLVE_TRAIT_ON_DEFAULTED_UNIT,
         SAFE_EXTERN_STATICS,
         PATTERNS_IN_FNS_WITHOUT_BODY,
-        EXTRA_REQUIREMENT_IN_IMPL,
         LEGACY_DIRECTORY_OWNERSHIP,
         LEGACY_IMPORTS,
         LEGACY_CONSTRUCTOR_VISIBILITY,
@@ -192,7 +192,7 @@ impl<'tcx> FreeRegionMap<'tcx> {
     ///
     /// if `r_a` represents `'a`, this function would return `{'b, 'c}`.
     pub fn regions_that_outlive<'a, 'gcx>(&self, r_a: Region<'tcx>) -> Vec<&Region<'tcx>> {
-        assert!(is_free(r_a));
+        assert!(is_free(r_a) || *r_a == ty::ReStatic);
         self.relation.greater_than(&r_a)
     }
 }
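For intuition, `regions_that_outlive` is just a lookup in a transitively closed "outlives" relation. A hedged, self-contained sketch using string region names (not the compiler's `TransitiveRelation`) might look like:

use std::collections::{HashMap, HashSet};

/// Toy outlives relation: maps a region name to the set of regions known
/// to outlive it (its "greater than" set), already transitively closed.
struct OutlivesMap {
    greater: HashMap<&'static str, HashSet<&'static str>>,
}

impl OutlivesMap {
    /// Same idea as FreeRegionMap::regions_that_outlive: all regions R
    /// such that `R: r_a` holds.
    fn regions_that_outlive(&self, r_a: &str) -> Vec<&'static str> {
        self.greater
            .get(r_a)
            .map(|set| set.iter().copied().collect())
            .unwrap_or_default()
    }
}

fn main() {
    // 'b: 'a and 'c: 'a, so asking about 'a yields {'b, 'c}.
    let mut greater = HashMap::new();
    greater.insert("'a", ["'b", "'c"].iter().copied().collect::<HashSet<_>>());
    let map = OutlivesMap { greater };
    println!("{:?}", map.regions_that_outlive("'a"));
}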
@@ -12,7 +12,7 @@
 //! the parent links in the region hierarchy.
 //!
 //! Most of the documentation on regions can be found in
-//! `middle/infer/region_inference/README.md`
+//! `middle/infer/region_constraints/README.md`

 use ich::{StableHashingContext, NodeIdHashingMode};
 use util::nodemap::{FxHashMap, FxHashSet};
@@ -320,7 +320,7 @@ pub struct ScopeTree {
     /// hierarchy based on their lexical mapping. This is used to
     /// handle the relationships between regions in a fn and in a
     /// closure defined by that fn. See the "Modeling closures"
-    /// section of the README in infer::region_inference for
+    /// section of the README in infer::region_constraints for
     /// more details.
     closure_tree: FxHashMap<hir::ItemLocalId, hir::ItemLocalId>,

@@ -407,7 +407,7 @@ pub struct Context {
     /// of the innermost fn body. Each fn forms its own disjoint tree
     /// in the region hierarchy. These fn bodies are themselves
     /// arranged into a tree. See the "Modeling closures" section of
-    /// the README in infer::region_inference for more
+    /// the README in infer::region_constraints for more
     /// details.
     root_id: Option<hir::ItemLocalId>,

@@ -646,7 +646,7 @@ impl<'tcx> ScopeTree {
         // different functions. Compare those fn for lexical
         // nesting. The reasoning behind this is subtle. See the
         // "Modeling closures" section of the README in
-        // infer::region_inference for more details.
+        // infer::region_constraints for more details.
         let a_root_scope = a_ancestors[a_index];
         let b_root_scope = a_ancestors[a_index];
         return match (a_root_scope.data(), b_root_scope.data()) {
@@ -555,6 +555,15 @@ pub struct UpvarDecl {

 newtype_index!(BasicBlock { DEBUG_FORMAT = "bb{}" });

+impl BasicBlock {
+    pub fn start_location(self) -> Location {
+        Location {
+            block: self,
+            statement_index: 0,
+        }
+    }
+}
+
 ///////////////////////////////////////////////////////////////////////////
 // BasicBlockData and Terminator
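A small hedged illustration of what the new helper gives call sites, written against toy stand-ins for the MIR index types rather than the real `BasicBlock`/`Location`:

// Illustrative stand-ins only; the real types live in rustc::mir.
#[derive(Copy, Clone, Debug, PartialEq)]
struct BasicBlock(u32);

#[derive(Copy, Clone, Debug, PartialEq)]
struct Location {
    block: BasicBlock,
    statement_index: usize,
}

impl BasicBlock {
    // Same shape as the helper added above: the location of a block's
    // first statement.
    fn start_location(self) -> Location {
        Location { block: self, statement_index: 0 }
    }
}

fn main() {
    let bb2 = BasicBlock(2);
    assert_eq!(bb2.start_location(),
               Location { block: bb2, statement_index: 0 });
    println!("{:?}", bb2.start_location());
}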
@@ -638,7 +647,32 @@ pub enum TerminatorKind<'tcx> {
         unwind: Option<BasicBlock>
     },

-    /// Drop the Lvalue and assign the new value over it
+    /// Drop the Lvalue and assign the new value over it. This ensures
+    /// that the assignment to LV occurs *even if* the destructor for
+    /// lvalue unwinds. Its semantics are best explained by the
+    /// elaboration:
+    ///
+    /// ```
+    /// BB0 {
+    ///   DropAndReplace(LV <- RV, goto BB1, unwind BB2)
+    /// }
+    /// ```
+    ///
+    /// becomes
+    ///
+    /// ```
+    /// BB0 {
+    ///   Drop(LV, goto BB1, unwind BB2)
+    /// }
+    /// BB1 {
+    ///   // LV is now uninitialized
+    ///   LV <- RV
+    /// }
+    /// BB2 {
+    ///   // LV is now uninitialized -- its dtor panicked
+    ///   LV <- RV
+    /// }
+    /// ```
     DropAndReplace {
         location: Lvalue<'tcx>,
         value: Operand<'tcx>,
@ -292,11 +292,10 @@ macro_rules! make_mir_visitor {
|
||||
self.visit_visibility_scope_data(scope);
|
||||
}
|
||||
|
||||
let lookup = TyContext::SourceInfo(SourceInfo {
|
||||
self.visit_ty(&$($mutability)* mir.return_ty, TyContext::ReturnTy(SourceInfo {
|
||||
span: mir.span,
|
||||
scope: ARGUMENT_VISIBILITY_SCOPE,
|
||||
});
|
||||
self.visit_ty(&$($mutability)* mir.return_ty, lookup);
|
||||
}));
|
||||
|
||||
for local in mir.local_decls.indices() {
|
||||
self.visit_local_decl(local, & $($mutability)* mir.local_decls[local]);
|
||||
@ -811,7 +810,7 @@ make_mir_visitor!(MutVisitor,mut);
|
||||
|
||||
/// Extra information passed to `visit_ty` and friends to give context
|
||||
/// about where the type etc appears.
|
||||
#[derive(Copy, Clone, Debug)]
|
||||
#[derive(Copy, Clone, Debug, PartialEq, Eq, Hash)]
|
||||
pub enum TyContext {
|
||||
LocalDecl {
|
||||
/// The index of the local variable we are visiting.
|
||||
@ -821,9 +820,11 @@ pub enum TyContext {
|
||||
source_info: SourceInfo,
|
||||
},
|
||||
|
||||
Location(Location),
|
||||
/// The return type of the function.
|
||||
ReturnTy(SourceInfo),
|
||||
|
||||
SourceInfo(SourceInfo),
|
||||
/// A type found at some location.
|
||||
Location(Location),
|
||||
}
|
||||
|
||||
#[derive(Copy, Clone, Debug, PartialEq, Eq)]
|
||||
|
@ -33,7 +33,6 @@ use hir::def_id::DefId;
|
||||
use infer::{self, InferCtxt};
|
||||
use infer::type_variable::TypeVariableOrigin;
|
||||
use middle::const_val;
|
||||
use rustc::lint::builtin::EXTRA_REQUIREMENT_IN_IMPL;
|
||||
use std::fmt;
|
||||
use syntax::ast;
|
||||
use session::DiagnosticMessageId;
|
||||
@ -481,30 +480,14 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
|
||||
item_name: ast::Name,
|
||||
_impl_item_def_id: DefId,
|
||||
trait_item_def_id: DefId,
|
||||
requirement: &fmt::Display,
|
||||
lint_id: Option<ast::NodeId>) // (*)
|
||||
requirement: &fmt::Display)
|
||||
-> DiagnosticBuilder<'tcx>
|
||||
{
|
||||
// (*) This parameter is temporary and used only for phasing
|
||||
// in the bug fix to #18937. If it is `Some`, it has a kind of
|
||||
// weird effect -- the diagnostic is reported as a lint, and
|
||||
// the builder which is returned is marked as canceled.
|
||||
|
||||
let msg = "impl has stricter requirements than trait";
|
||||
let mut err = match lint_id {
|
||||
Some(node_id) => {
|
||||
self.tcx.struct_span_lint_node(EXTRA_REQUIREMENT_IN_IMPL,
|
||||
node_id,
|
||||
error_span,
|
||||
msg)
|
||||
}
|
||||
None => {
|
||||
struct_span_err!(self.tcx.sess,
|
||||
let mut err = struct_span_err!(self.tcx.sess,
|
||||
error_span,
|
||||
E0276,
|
||||
"{}", msg)
|
||||
}
|
||||
};
|
||||
"{}", msg);
|
||||
|
||||
if let Some(trait_item_span) = self.tcx.hir.span_if_local(trait_item_def_id) {
|
||||
let span = self.tcx.sess.codemap().def_span(trait_item_span);
|
||||
@ -543,15 +526,14 @@ impl<'a, 'gcx, 'tcx> InferCtxt<'a, 'gcx, 'tcx> {
|
||||
let mut err = match *error {
|
||||
SelectionError::Unimplemented => {
|
||||
if let ObligationCauseCode::CompareImplMethodObligation {
|
||||
item_name, impl_item_def_id, trait_item_def_id, lint_id
|
||||
item_name, impl_item_def_id, trait_item_def_id,
|
||||
} = obligation.cause.code {
|
||||
self.report_extra_impl_obligation(
|
||||
span,
|
||||
item_name,
|
||||
impl_item_def_id,
|
||||
trait_item_def_id,
|
||||
&format!("`{}`", obligation.predicate),
|
||||
lint_id)
|
||||
&format!("`{}`", obligation.predicate))
|
||||
.emit();
|
||||
return;
|
||||
}
|
||||
|
@ -8,14 +8,12 @@
|
||||
// option. This file may not be copied, modified, or distributed
|
||||
// except according to those terms.
|
||||
|
||||
use infer::{InferCtxt, InferOk};
|
||||
use infer::{RegionObligation, InferCtxt, InferOk};
|
||||
use ty::{self, Ty, TypeFoldable, ToPolyTraitRef, ToPredicate};
|
||||
use ty::error::ExpectedFound;
|
||||
use rustc_data_structures::obligation_forest::{ObligationForest, Error};
|
||||
use rustc_data_structures::obligation_forest::{ForestObligation, ObligationProcessor};
|
||||
use std::marker::PhantomData;
|
||||
use syntax::ast;
|
||||
use util::nodemap::NodeMap;
|
||||
use hir::def_id::DefId;
|
||||
|
||||
use super::CodeAmbiguity;
|
||||
@ -48,39 +46,6 @@ pub struct FulfillmentContext<'tcx> {
|
||||
// A list of all obligations that have been registered with this
|
||||
// fulfillment context.
|
||||
predicates: ObligationForest<PendingPredicateObligation<'tcx>>,
|
||||
|
||||
// A set of constraints that regionck must validate. Each
|
||||
// constraint has the form `T:'a`, meaning "some type `T` must
|
||||
// outlive the lifetime 'a". These constraints derive from
|
||||
// instantiated type parameters. So if you had a struct defined
|
||||
// like
|
||||
//
|
||||
// struct Foo<T:'static> { ... }
|
||||
//
|
||||
// then in some expression `let x = Foo { ... }` it will
|
||||
// instantiate the type parameter `T` with a fresh type `$0`. At
|
||||
// the same time, it will record a region obligation of
|
||||
// `$0:'static`. This will get checked later by regionck. (We
|
||||
// can't generally check these things right away because we have
|
||||
// to wait until types are resolved.)
|
||||
//
|
||||
// These are stored in a map keyed to the id of the innermost
|
||||
// enclosing fn body / static initializer expression. This is
|
||||
// because the location where the obligation was incurred can be
|
||||
// relevant with respect to which sublifetime assumptions are in
|
||||
// place. The reason that we store under the fn-id, and not
|
||||
// something more fine-grained, is so that it is easier for
|
||||
// regionck to be sure that it has found *all* the region
|
||||
// obligations (otherwise, it's easy to fail to walk to a
|
||||
// particular node-id).
|
||||
region_obligations: NodeMap<Vec<RegionObligation<'tcx>>>,
|
||||
}
|
||||
|
||||
#[derive(Clone)]
|
||||
pub struct RegionObligation<'tcx> {
|
||||
pub sub_region: ty::Region<'tcx>,
|
||||
pub sup_type: Ty<'tcx>,
|
||||
pub cause: ObligationCause<'tcx>,
|
||||
}
|
||||
|
||||
#[derive(Clone, Debug)]
|
||||
@ -94,7 +59,6 @@ impl<'a, 'gcx, 'tcx> FulfillmentContext<'tcx> {
|
||||
pub fn new() -> FulfillmentContext<'tcx> {
|
||||
FulfillmentContext {
|
||||
predicates: ObligationForest::new(),
|
||||
region_obligations: NodeMap(),
|
||||
}
|
||||
}
|
||||
|
||||
@ -157,14 +121,6 @@ impl<'a, 'gcx, 'tcx> FulfillmentContext<'tcx> {
|
||||
});
|
||||
}
|
||||
|
||||
pub fn register_region_obligation(&mut self,
|
||||
t_a: Ty<'tcx>,
|
||||
r_b: ty::Region<'tcx>,
|
||||
cause: ObligationCause<'tcx>)
|
||||
{
|
||||
register_region_obligation(t_a, r_b, cause, &mut self.region_obligations);
|
||||
}
|
||||
|
||||
pub fn register_predicate_obligation(&mut self,
|
||||
infcx: &InferCtxt<'a, 'gcx, 'tcx>,
|
||||
obligation: PredicateObligation<'tcx>)
|
||||
@ -183,26 +139,16 @@ impl<'a, 'gcx, 'tcx> FulfillmentContext<'tcx> {
|
||||
});
|
||||
}
|
||||
|
||||
pub fn register_predicate_obligations(&mut self,
|
||||
pub fn register_predicate_obligations<I>(&mut self,
|
||||
infcx: &InferCtxt<'a, 'gcx, 'tcx>,
|
||||
obligations: Vec<PredicateObligation<'tcx>>)
|
||||
obligations: I)
|
||||
where I: IntoIterator<Item = PredicateObligation<'tcx>>
|
||||
{
|
||||
for obligation in obligations {
|
||||
self.register_predicate_obligation(infcx, obligation);
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
pub fn region_obligations(&self,
|
||||
body_id: ast::NodeId)
|
||||
-> &[RegionObligation<'tcx>]
|
||||
{
|
||||
match self.region_obligations.get(&body_id) {
|
||||
None => Default::default(),
|
||||
Some(vec) => vec,
|
||||
}
|
||||
}
|
||||
|
||||
pub fn select_all_or_error(&mut self,
|
||||
infcx: &InferCtxt<'a, 'gcx, 'tcx>)
|
||||
-> Result<(),Vec<FulfillmentError<'tcx>>>
|
||||
@ -245,10 +191,7 @@ impl<'a, 'gcx, 'tcx> FulfillmentContext<'tcx> {
|
||||
debug!("select: starting another iteration");
|
||||
|
||||
// Process pending obligations.
|
||||
let outcome = self.predicates.process_obligations(&mut FulfillProcessor {
|
||||
selcx,
|
||||
region_obligations: &mut self.region_obligations,
|
||||
});
|
||||
let outcome = self.predicates.process_obligations(&mut FulfillProcessor { selcx });
|
||||
debug!("select: outcome={:?}", outcome);
|
||||
|
||||
// FIXME: if we kept the original cache key, we could mark projection
|
||||
@ -277,7 +220,6 @@ impl<'a, 'gcx, 'tcx> FulfillmentContext<'tcx> {
|
||||
|
||||
struct FulfillProcessor<'a, 'b: 'a, 'gcx: 'tcx, 'tcx: 'b> {
|
||||
selcx: &'a mut SelectionContext<'b, 'gcx, 'tcx>,
|
||||
region_obligations: &'a mut NodeMap<Vec<RegionObligation<'tcx>>>,
|
||||
}
|
||||
|
||||
impl<'a, 'b, 'gcx, 'tcx> ObligationProcessor for FulfillProcessor<'a, 'b, 'gcx, 'tcx> {
|
||||
@ -288,9 +230,7 @@ impl<'a, 'b, 'gcx, 'tcx> ObligationProcessor for FulfillProcessor<'a, 'b, 'gcx,
|
||||
obligation: &mut Self::Obligation)
|
||||
-> Result<Option<Vec<Self::Obligation>>, Self::Error>
|
||||
{
|
||||
process_predicate(self.selcx,
|
||||
obligation,
|
||||
self.region_obligations)
|
||||
process_predicate(self.selcx, obligation)
|
||||
.map(|os| os.map(|os| os.into_iter().map(|o| PendingPredicateObligation {
|
||||
obligation: o,
|
||||
stalled_on: vec![]
|
||||
@ -329,8 +269,7 @@ fn trait_ref_type_vars<'a, 'gcx, 'tcx>(selcx: &mut SelectionContext<'a, 'gcx, 't
|
||||
/// - `Err` if the predicate does not hold
|
||||
fn process_predicate<'a, 'gcx, 'tcx>(
|
||||
selcx: &mut SelectionContext<'a, 'gcx, 'tcx>,
|
||||
pending_obligation: &mut PendingPredicateObligation<'tcx>,
|
||||
region_obligations: &mut NodeMap<Vec<RegionObligation<'tcx>>>)
|
||||
pending_obligation: &mut PendingPredicateObligation<'tcx>)
|
||||
-> Result<Option<Vec<PredicateObligation<'tcx>>>,
|
||||
FulfillmentErrorCode<'tcx>>
|
||||
{
|
||||
@ -452,18 +391,26 @@ fn process_predicate<'a, 'gcx, 'tcx>(
|
||||
// `for<'a> T: 'a where 'a not in T`, which we can treat as `T: 'static`.
|
||||
Some(t_a) => {
|
||||
let r_static = selcx.tcx().types.re_static;
|
||||
register_region_obligation(t_a, r_static,
|
||||
obligation.cause.clone(),
|
||||
region_obligations);
|
||||
selcx.infcx().register_region_obligation(
|
||||
obligation.cause.body_id,
|
||||
RegionObligation {
|
||||
sup_type: t_a,
|
||||
sub_region: r_static,
|
||||
cause: obligation.cause.clone(),
|
||||
});
|
||||
Ok(Some(vec![]))
|
||||
}
|
||||
}
|
||||
}
|
||||
// If there aren't, register the obligation.
|
||||
Some(ty::OutlivesPredicate(t_a, r_b)) => {
|
||||
register_region_obligation(t_a, r_b,
|
||||
obligation.cause.clone(),
|
||||
region_obligations);
|
||||
selcx.infcx().register_region_obligation(
|
||||
obligation.cause.body_id,
|
||||
RegionObligation {
|
||||
sup_type: t_a,
|
||||
sub_region: r_b,
|
||||
cause: obligation.cause.clone()
|
||||
});
|
||||
Ok(Some(vec![]))
|
||||
}
|
||||
}
|
||||
@ -566,25 +513,6 @@ fn process_predicate<'a, 'gcx, 'tcx>(
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
fn register_region_obligation<'tcx>(t_a: Ty<'tcx>,
|
||||
r_b: ty::Region<'tcx>,
|
||||
cause: ObligationCause<'tcx>,
|
||||
region_obligations: &mut NodeMap<Vec<RegionObligation<'tcx>>>)
|
||||
{
|
||||
let region_obligation = RegionObligation { sup_type: t_a,
|
||||
sub_region: r_b,
|
||||
cause: cause };
|
||||
|
||||
debug!("register_region_obligation({:?}, cause={:?})",
|
||||
region_obligation, region_obligation.cause);
|
||||
|
||||
region_obligations.entry(region_obligation.cause.body_id)
|
||||
.or_insert(vec![])
|
||||
.push(region_obligation);
|
||||
|
||||
}
|
||||
|
||||
fn to_fulfillment_error<'tcx>(
|
||||
error: Error<PendingPredicateObligation<'tcx>, FulfillmentErrorCode<'tcx>>)
|
||||
-> FulfillmentError<'tcx>
|
||||
|
@ -30,7 +30,7 @@ use syntax::ast;
|
||||
use syntax_pos::{Span, DUMMY_SP};
|
||||
|
||||
pub use self::coherence::{orphan_check, overlapping_impls, OrphanCheckErr, OverlapResult};
|
||||
pub use self::fulfill::{FulfillmentContext, RegionObligation};
|
||||
pub use self::fulfill::FulfillmentContext;
|
||||
pub use self::project::MismatchedProjectionTypes;
|
||||
pub use self::project::{normalize, normalize_projection_type, Normalized};
|
||||
pub use self::project::{ProjectionCache, ProjectionCacheSnapshot, Reveal};
|
||||
@ -152,7 +152,6 @@ pub enum ObligationCauseCode<'tcx> {
|
||||
item_name: ast::Name,
|
||||
impl_item_def_id: DefId,
|
||||
trait_item_def_id: DefId,
|
||||
lint_id: Option<ast::NodeId>,
|
||||
},
|
||||
|
||||
/// Checking that this expression can be assigned where it needs to be
|
||||
@ -537,6 +536,17 @@ pub fn normalize_param_env_or_error<'a, 'tcx>(tcx: TyCtxt<'a, 'tcx, 'tcx>,
|
||||
|
||||
let region_scope_tree = region::ScopeTree::default();
|
||||
let free_regions = FreeRegionMap::new();
|
||||
|
||||
// FIXME. We should really... do something with these region
|
||||
// obligations. But this call just continues the older
|
||||
// behavior (i.e., doesn't cause any new bugs), and it would
|
||||
// take some further refactoring to actually solve them. In
|
||||
// particular, we would have to handle implied bounds
|
||||
// properly, and that code is currently largely confined to
|
||||
// regionck (though I made some efforts to extract it
|
||||
// out). -nmatsakis
|
||||
let _ = infcx.ignore_region_obligations();
|
||||
|
||||
infcx.resolve_regions_and_report_errors(region_context, ®ion_scope_tree, &free_regions);
|
||||
let predicates = match infcx.fully_resolve(&predicates) {
|
||||
Ok(predicates) => predicates,
|
||||
|
@ -26,13 +26,6 @@ impl<'tcx, T: fmt::Debug> fmt::Debug for Normalized<'tcx, T> {
|
||||
}
|
||||
}
|
||||
|
||||
impl<'tcx> fmt::Debug for traits::RegionObligation<'tcx> {
|
||||
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
|
||||
write!(f, "RegionObligation(sub_region={:?}, sup_type={:?})",
|
||||
self.sub_region,
|
||||
self.sup_type)
|
||||
}
|
||||
}
|
||||
impl<'tcx, O: fmt::Debug> fmt::Debug for traits::Obligation<'tcx, O> {
|
||||
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
|
||||
write!(f, "Obligation(predicate={:?},depth={})",
|
||||
@ -221,13 +214,11 @@ impl<'a, 'tcx> Lift<'tcx> for traits::ObligationCauseCode<'a> {
|
||||
}
|
||||
super::CompareImplMethodObligation { item_name,
|
||||
impl_item_def_id,
|
||||
trait_item_def_id,
|
||||
lint_id } => {
|
||||
trait_item_def_id } => {
|
||||
Some(super::CompareImplMethodObligation {
|
||||
item_name,
|
||||
impl_item_def_id,
|
||||
trait_item_def_id,
|
||||
lint_id,
|
||||
})
|
||||
}
|
||||
super::ExprAssignable => Some(super::ExprAssignable),
|
||||
|
@ -144,6 +144,15 @@ pub enum AssociatedItemContainer {
|
||||
}
|
||||
|
||||
impl AssociatedItemContainer {
|
||||
/// Asserts that this is the def-id of an associated item declared
|
||||
/// in a trait, and returns the trait def-id.
|
||||
pub fn assert_trait(&self) -> DefId {
|
||||
match *self {
|
||||
TraitContainer(id) => id,
|
||||
_ => bug!("associated item has wrong container type: {:?}", self)
|
||||
}
|
||||
}
|
||||
|
||||
pub fn id(&self) -> DefId {
|
||||
match *self {
|
||||
TraitContainer(id) => id,
|
||||
@ -895,6 +904,12 @@ pub enum Predicate<'tcx> {
|
||||
ConstEvaluatable(DefId, &'tcx Substs<'tcx>),
|
||||
}
|
||||
|
||||
impl<'tcx> AsRef<Predicate<'tcx>> for Predicate<'tcx> {
|
||||
fn as_ref(&self) -> &Predicate<'tcx> {
|
||||
self
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a, 'gcx, 'tcx> Predicate<'tcx> {
|
||||
/// Performs a substitution suitable for going from a
|
||||
/// poly-trait-ref to supertraits that must hold if that
|
||||
@ -1200,6 +1215,25 @@ impl<'tcx> Predicate<'tcx> {
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
pub fn to_opt_type_outlives(&self) -> Option<PolyTypeOutlivesPredicate<'tcx>> {
|
||||
match *self {
|
||||
Predicate::TypeOutlives(data) => {
|
||||
Some(data)
|
||||
}
|
||||
Predicate::Trait(..) |
|
||||
Predicate::Projection(..) |
|
||||
Predicate::Equate(..) |
|
||||
Predicate::Subtype(..) |
|
||||
Predicate::RegionOutlives(..) |
|
||||
Predicate::WellFormed(..) |
|
||||
Predicate::ObjectSafe(..) |
|
||||
Predicate::ClosureKind(..) |
|
||||
Predicate::ConstEvaluatable(..) => {
|
||||
None
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/// Represents the bounds declared on a particular set of type
|
||||
|
@ -14,6 +14,7 @@ use hir::def_id::DefId;
|
||||
|
||||
use middle::const_val::ConstVal;
|
||||
use middle::region;
|
||||
use rustc_data_structures::indexed_vec::Idx;
|
||||
use ty::subst::{Substs, Subst};
|
||||
use ty::{self, AdtDef, TypeFlags, Ty, TyCtxt, TypeFoldable};
|
||||
use ty::{Slice, TyS};
|
||||
@ -898,6 +899,18 @@ pub struct RegionVid {
|
||||
pub index: u32,
|
||||
}
|
||||
|
||||
// FIXME: We could convert this to use `newtype_index!`
|
||||
impl Idx for RegionVid {
|
||||
fn new(value: usize) -> Self {
|
||||
assert!(value < ::std::u32::MAX as usize);
|
||||
RegionVid { index: value as u32 }
|
||||
}
|
||||
|
||||
fn index(self) -> usize {
|
||||
self.index as usize
|
||||
}
|
||||
}
|
||||
|
||||
#[derive(Clone, Copy, PartialEq, Eq, Hash, RustcEncodable, RustcDecodable, PartialOrd, Ord)]
|
||||
pub struct SkolemizedRegionVid {
|
||||
pub index: u32,
|
||||
@ -1037,6 +1050,35 @@ impl RegionKind {
|
||||
|
||||
flags
|
||||
}
|
||||
|
||||
/// Given an early-bound or free region, returns the def-id where it was bound.
|
||||
/// For example, consider the regions in this snippet of code:
|
||||
///
|
||||
/// ```
|
||||
/// impl<'a> Foo {
|
||||
/// ^^ -- early bound, declared on an impl
|
||||
///
|
||||
/// fn bar<'b, 'c>(x: &self, y: &'b u32, z: &'c u64) where 'static: 'c
|
||||
/// ^^ ^^ ^ anonymous, late-bound
|
||||
/// | early-bound, appears in where-clauses
|
||||
/// late-bound, appears only in fn args
|
||||
/// {..}
|
||||
/// }
|
||||
/// ```
|
||||
///
|
||||
/// Here, `free_region_binding_scope('a)` would return the def-id
|
||||
/// of the impl, and for all the other highlighted regions, it
|
||||
/// would return the def-id of the function. In other cases (not shown), this
|
||||
/// function might return the def-id of a closure.
|
||||
pub fn free_region_binding_scope(&self, tcx: TyCtxt<'_, '_, '_>) -> DefId {
|
||||
match self {
|
||||
ty::ReEarlyBound(br) => {
|
||||
tcx.parent_def_id(br.def_id).unwrap()
|
||||
}
|
||||
ty::ReFree(fr) => fr.scope,
|
||||
_ => bug!("free_region_binding_scope invoked on inappropriate region: {:?}", self),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/// Type utilities
|
||||
|
@ -384,6 +384,11 @@ impl<I: Idx, T> IndexVec<I, T> {
|
||||
idx
|
||||
}
|
||||
|
||||
#[inline]
|
||||
pub fn pop(&mut self) -> Option<T> {
|
||||
self.raw.pop()
|
||||
}
|
||||
|
||||
#[inline]
|
||||
pub fn len(&self) -> usize {
|
||||
self.raw.len()
|
||||
@ -411,7 +416,7 @@ impl<I: Idx, T> IndexVec<I, T> {
|
||||
}
|
||||
|
||||
#[inline]
|
||||
pub fn iter_enumerated(&self) -> Enumerated<I, slice::Iter<T>>
|
||||
pub fn iter_enumerated(&self) -> Enumerated<I, slice::Iter<'_, T>>
|
||||
{
|
||||
self.raw.iter().enumerate().map(IntoIdx { _marker: PhantomData })
|
||||
}
|
||||
@ -427,7 +432,7 @@ impl<I: Idx, T> IndexVec<I, T> {
|
||||
}
|
||||
|
||||
#[inline]
|
||||
pub fn iter_enumerated_mut(&mut self) -> Enumerated<I, slice::IterMut<T>>
|
||||
pub fn iter_enumerated_mut(&mut self) -> Enumerated<I, slice::IterMut<'_, T>>
|
||||
{
|
||||
self.raw.iter_mut().enumerate().map(IntoIdx { _marker: PhantomData })
|
||||
}
|
||||
|
@ -31,6 +31,7 @@
|
||||
#![feature(i128)]
|
||||
#![feature(conservative_impl_trait)]
|
||||
#![feature(specialization)]
|
||||
#![feature(underscore_lifetimes)]
|
||||
|
||||
#![cfg_attr(unix, feature(libc))]
|
||||
#![cfg_attr(test, feature(test))]
|
||||
|
@ -207,10 +207,6 @@ pub fn register_builtins(store: &mut lint::LintStore, sess: Option<&Session>) {
|
||||
id: LintId::of(INVALID_TYPE_PARAM_DEFAULT),
|
||||
reference: "issue #36887 <https://github.com/rust-lang/rust/issues/36887>",
|
||||
},
|
||||
FutureIncompatibleInfo {
|
||||
id: LintId::of(EXTRA_REQUIREMENT_IN_IMPL),
|
||||
reference: "issue #37166 <https://github.com/rust-lang/rust/issues/37166>",
|
||||
},
|
||||
FutureIncompatibleInfo {
|
||||
id: LintId::of(LEGACY_DIRECTORY_OWNERSHIP),
|
||||
reference: "issue #37872 <https://github.com/rust-lang/rust/issues/37872>",
|
||||
@ -276,4 +272,6 @@ pub fn register_builtins(store: &mut lint::LintStore, sess: Option<&Session>) {
|
||||
"converted into hard error, see https://github.com/rust-lang/rust/issues/36891");
|
||||
store.register_removed("lifetime_underscore",
|
||||
"converted into hard error, see https://github.com/rust-lang/rust/issues/36892");
|
||||
store.register_removed("extra_requirement_in_impl",
|
||||
"converted into hard error, see https://github.com/rust-lang/rust/issues/37166");
|
||||
}
|
||||
|
@ -112,7 +112,7 @@ fn do_mir_borrowck<'a, 'gcx, 'tcx>(infcx: &InferCtxt<'a, 'gcx, 'tcx>,
|
||||
let opt_regioncx = if !tcx.sess.opts.debugging_opts.nll {
|
||||
None
|
||||
} else {
|
||||
Some(nll::compute_regions(infcx, def_id, mir))
|
||||
Some(nll::compute_regions(infcx, def_id, param_env, mir))
|
||||
};
|
||||
|
||||
let mdpe = MoveDataParamEnv { move_data: move_data, param_env: param_env };
|
||||
@ -136,7 +136,6 @@ fn do_mir_borrowck<'a, 'gcx, 'tcx>(infcx: &InferCtxt<'a, 'gcx, 'tcx>,
|
||||
node_id: id,
|
||||
move_data: &mdpe.move_data,
|
||||
param_env: param_env,
|
||||
fake_infer_ctxt: &infcx,
|
||||
};
|
||||
|
||||
let mut state = InProgress::new(flow_borrows,
|
||||
@ -148,13 +147,12 @@ fn do_mir_borrowck<'a, 'gcx, 'tcx>(infcx: &InferCtxt<'a, 'gcx, 'tcx>,
|
||||
}
|
||||
|
||||
#[allow(dead_code)]
|
||||
pub struct MirBorrowckCtxt<'c, 'b, 'a: 'b+'c, 'gcx: 'a+'tcx, 'tcx: 'a> {
|
||||
tcx: TyCtxt<'a, 'gcx, 'tcx>,
|
||||
mir: &'b Mir<'tcx>,
|
||||
pub struct MirBorrowckCtxt<'cx, 'gcx: 'tcx, 'tcx: 'cx> {
|
||||
tcx: TyCtxt<'cx, 'gcx, 'tcx>,
|
||||
mir: &'cx Mir<'tcx>,
|
||||
node_id: ast::NodeId,
|
||||
move_data: &'b MoveData<'tcx>,
|
||||
param_env: ParamEnv<'tcx>,
|
||||
fake_infer_ctxt: &'c InferCtxt<'c, 'gcx, 'tcx>,
|
||||
move_data: &'cx MoveData<'tcx>,
|
||||
param_env: ParamEnv<'gcx>,
|
||||
}
|
||||
|
||||
// (forced to be `pub` due to its use as an associated type below.)
|
||||
@ -177,12 +175,10 @@ struct FlowInProgress<BD> where BD: BitDenotation {
|
||||
// 2. loans made in overlapping scopes do not conflict
|
||||
// 3. assignments do not affect things loaned out as immutable
|
||||
// 4. moves do not affect things loaned out in any way
|
||||
impl<'c, 'b, 'a: 'b+'c, 'gcx, 'tcx: 'a> DataflowResultsConsumer<'b, 'tcx>
|
||||
for MirBorrowckCtxt<'c, 'b, 'a, 'gcx, 'tcx>
|
||||
{
|
||||
type FlowState = InProgress<'b, 'gcx, 'tcx>;
|
||||
impl<'cx, 'gcx, 'tcx> DataflowResultsConsumer<'cx, 'tcx> for MirBorrowckCtxt<'cx, 'gcx, 'tcx> {
|
||||
type FlowState = InProgress<'cx, 'gcx, 'tcx>;
|
||||
|
||||
fn mir(&self) -> &'b Mir<'tcx> { self.mir }
|
||||
fn mir(&self) -> &'cx Mir<'tcx> { self.mir }
|
||||
|
||||
fn reset_to_entry_of(&mut self, bb: BasicBlock, flow_state: &mut Self::FlowState) {
|
||||
flow_state.each_flow(|b| b.reset_to_entry_of(bb),
|
||||
@ -437,12 +433,12 @@ enum WriteKind {
|
||||
Move,
|
||||
}
|
||||
|
||||
impl<'c, 'b, 'a: 'b+'c, 'gcx, 'tcx: 'a> MirBorrowckCtxt<'c, 'b, 'a, 'gcx, 'tcx> {
|
||||
impl<'cx, 'gcx, 'tcx> MirBorrowckCtxt<'cx, 'gcx, 'tcx> {
|
||||
fn access_lvalue(&mut self,
|
||||
context: Context,
|
||||
lvalue_span: (&Lvalue<'tcx>, Span),
|
||||
kind: (ShallowOrDeep, ReadOrWrite),
|
||||
flow_state: &InProgress<'b, 'gcx, 'tcx>) {
|
||||
flow_state: &InProgress<'cx, 'gcx, 'tcx>) {
|
||||
|
||||
let (sd, rw) = kind;
|
||||
|
||||
@ -501,7 +497,7 @@ impl<'c, 'b, 'a: 'b+'c, 'gcx, 'tcx: 'a> MirBorrowckCtxt<'c, 'b, 'a, 'gcx, 'tcx>
|
||||
lvalue_span: (&Lvalue<'tcx>, Span),
|
||||
kind: ShallowOrDeep,
|
||||
mode: MutateMode,
|
||||
flow_state: &InProgress<'b, 'gcx, 'tcx>) {
|
||||
flow_state: &InProgress<'cx, 'gcx, 'tcx>) {
|
||||
// Write of P[i] or *P, or WriteAndRead of any P, requires P init'd.
|
||||
match mode {
|
||||
MutateMode::WriteAndRead => {
|
||||
@ -522,7 +518,7 @@ impl<'c, 'b, 'a: 'b+'c, 'gcx, 'tcx: 'a> MirBorrowckCtxt<'c, 'b, 'a, 'gcx, 'tcx>
|
||||
context: Context,
|
||||
(rvalue, span): (&Rvalue<'tcx>, Span),
|
||||
_location: Location,
|
||||
flow_state: &InProgress<'b, 'gcx, 'tcx>) {
|
||||
flow_state: &InProgress<'cx, 'gcx, 'tcx>) {
|
||||
match *rvalue {
|
||||
Rvalue::Ref(_/*rgn*/, bk, ref lvalue) => {
|
||||
let access_kind = match bk {
|
||||
@ -579,7 +575,7 @@ impl<'c, 'b, 'a: 'b+'c, 'gcx, 'tcx: 'a> MirBorrowckCtxt<'c, 'b, 'a, 'gcx, 'tcx>
|
||||
context: Context,
|
||||
consume_via_drop: ConsumeKind,
|
||||
(operand, span): (&Operand<'tcx>, Span),
|
||||
flow_state: &InProgress<'b, 'gcx, 'tcx>) {
|
||||
flow_state: &InProgress<'cx, 'gcx, 'tcx>) {
|
||||
match *operand {
|
||||
Operand::Consume(ref lvalue) => {
|
||||
self.consume_lvalue(context, consume_via_drop, (lvalue, span), flow_state)
|
||||
@ -592,11 +588,22 @@ impl<'c, 'b, 'a: 'b+'c, 'gcx, 'tcx: 'a> MirBorrowckCtxt<'c, 'b, 'a, 'gcx, 'tcx>
|
||||
context: Context,
|
||||
consume_via_drop: ConsumeKind,
|
||||
lvalue_span: (&Lvalue<'tcx>, Span),
|
||||
flow_state: &InProgress<'b, 'gcx, 'tcx>) {
|
||||
flow_state: &InProgress<'cx, 'gcx, 'tcx>) {
|
||||
let lvalue = lvalue_span.0;
|
||||
|
||||
let ty = lvalue.ty(self.mir, self.tcx).to_ty(self.tcx);
|
||||
let moves_by_default =
|
||||
self.fake_infer_ctxt.type_moves_by_default(self.param_env, ty, DUMMY_SP);
|
||||
|
||||
// Erase the regions in type before checking whether it moves by
|
||||
// default. There are a few reasons to do this:
|
||||
//
|
||||
// - They should not affect the result.
|
||||
// - It avoids adding new region constraints into the surrounding context,
|
||||
// which would trigger an ICE, since the infcx will have been "frozen" by
|
||||
// the NLL region context.
|
||||
let gcx = self.tcx.global_tcx();
|
||||
let erased_ty = gcx.lift(&self.tcx.erase_regions(&ty)).unwrap();
|
||||
let moves_by_default = erased_ty.moves_by_default(gcx, self.param_env, DUMMY_SP);
|
||||
|
||||
if moves_by_default {
|
||||
// move of lvalue: check if this is move of already borrowed path
|
||||
self.access_lvalue(context, lvalue_span, (Deep, Write(WriteKind::Move)), flow_state);
|
||||
@ -619,11 +626,11 @@ impl<'c, 'b, 'a: 'b+'c, 'gcx, 'tcx: 'a> MirBorrowckCtxt<'c, 'b, 'a, 'gcx, 'tcx>
|
||||
}
|
||||
}
|
||||
|
||||
impl<'c, 'b, 'a: 'b+'c, 'gcx, 'tcx: 'a> MirBorrowckCtxt<'c, 'b, 'a, 'gcx, 'tcx> {
|
||||
impl<'cx, 'gcx, 'tcx> MirBorrowckCtxt<'cx, 'gcx, 'tcx> {
|
||||
fn check_if_reassignment_to_immutable_state(&mut self,
|
||||
context: Context,
|
||||
(lvalue, span): (&Lvalue<'tcx>, Span),
|
||||
flow_state: &InProgress<'b, 'gcx, 'tcx>) {
|
||||
flow_state: &InProgress<'cx, 'gcx, 'tcx>) {
|
||||
let move_data = self.move_data;
|
||||
|
||||
// determine if this path has a non-mut owner (and thus needs checking).
|
||||
@ -674,7 +681,7 @@ impl<'c, 'b, 'a: 'b+'c, 'gcx, 'tcx: 'a> MirBorrowckCtxt<'c, 'b, 'a, 'gcx, 'tcx>
|
||||
context: Context,
|
||||
desired_action: &str,
|
||||
lvalue_span: (&Lvalue<'tcx>, Span),
|
||||
flow_state: &InProgress<'b, 'gcx, 'tcx>) {
|
||||
flow_state: &InProgress<'cx, 'gcx, 'tcx>) {
|
||||
// FIXME: analogous code in check_loans first maps `lvalue` to
|
||||
// its base_path ... but is that what we want here?
|
||||
let lvalue = self.base_path(lvalue_span.0);
|
||||
@ -802,7 +809,7 @@ impl<'c, 'b, 'a: 'b+'c, 'gcx, 'tcx: 'a> MirBorrowckCtxt<'c, 'b, 'a, 'gcx, 'tcx>
|
||||
fn check_if_assigned_path_is_moved(&mut self,
|
||||
context: Context,
|
||||
(lvalue, span): (&Lvalue<'tcx>, Span),
|
||||
flow_state: &InProgress<'b, 'gcx, 'tcx>) {
|
||||
flow_state: &InProgress<'cx, 'gcx, 'tcx>) {
|
||||
// recur down lvalue; dispatch to check_if_path_is_moved when necessary
|
||||
let mut lvalue = lvalue;
|
||||
loop {
|
||||
@ -1015,11 +1022,11 @@ enum NoMovePathFound {
|
||||
ReachedStatic,
|
||||
}
|
||||
|
||||
impl<'c, 'b, 'a: 'b+'c, 'gcx, 'tcx: 'a> MirBorrowckCtxt<'c, 'b, 'a, 'gcx, 'tcx> {
|
||||
impl<'cx, 'gcx, 'tcx> MirBorrowckCtxt<'cx, 'gcx, 'tcx> {
|
||||
fn each_borrow_involving_path<F>(&mut self,
|
||||
_context: Context,
|
||||
access_lvalue: (ShallowOrDeep, &Lvalue<'tcx>),
|
||||
flow_state: &InProgress<'b, 'gcx, 'tcx>,
|
||||
flow_state: &InProgress<'cx, 'gcx, 'tcx>,
|
||||
mut op: F)
|
||||
where F: FnMut(&mut Self, BorrowIndex, &BorrowData<'tcx>, &Lvalue<'tcx>) -> Control
|
||||
{
|
||||
@ -1119,11 +1126,11 @@ mod prefixes {
|
||||
}
|
||||
|
||||
|
||||
pub(super) struct Prefixes<'c, 'gcx: 'tcx, 'tcx: 'c> {
|
||||
mir: &'c Mir<'tcx>,
|
||||
tcx: TyCtxt<'c, 'gcx, 'tcx>,
|
||||
pub(super) struct Prefixes<'cx, 'gcx: 'tcx, 'tcx: 'cx> {
|
||||
mir: &'cx Mir<'tcx>,
|
||||
tcx: TyCtxt<'cx, 'gcx, 'tcx>,
|
||||
kind: PrefixSet,
|
||||
next: Option<&'c Lvalue<'tcx>>,
|
||||
next: Option<&'cx Lvalue<'tcx>>,
|
||||
}
|
||||
|
||||
#[derive(Copy, Clone, PartialEq, Eq, Debug)]
|
||||
@ -1137,21 +1144,21 @@ mod prefixes {
|
||||
Supporting,
|
||||
}
|
||||
|
||||
impl<'c, 'b, 'a: 'b+'c, 'gcx, 'tcx: 'a> MirBorrowckCtxt<'c, 'b, 'a, 'gcx, 'tcx> {
|
||||
impl<'cx, 'gcx, 'tcx> MirBorrowckCtxt<'cx, 'gcx, 'tcx> {
|
||||
/// Returns an iterator over the prefixes of `lvalue`
|
||||
/// (inclusive) from longest to smallest, potentially
|
||||
/// terminating the iteration early based on `kind`.
|
||||
pub(super) fn prefixes<'d>(&self,
|
||||
lvalue: &'d Lvalue<'tcx>,
|
||||
pub(super) fn prefixes(&self,
|
||||
lvalue: &'cx Lvalue<'tcx>,
|
||||
kind: PrefixSet)
|
||||
-> Prefixes<'d, 'gcx, 'tcx> where 'b: 'd
|
||||
-> Prefixes<'cx, 'gcx, 'tcx>
|
||||
{
|
||||
Prefixes { next: Some(lvalue), kind, mir: self.mir, tcx: self.tcx }
|
||||
}
|
||||
}
|
||||
|
||||
impl<'c, 'gcx, 'tcx> Iterator for Prefixes<'c, 'gcx, 'tcx> {
|
||||
type Item = &'c Lvalue<'tcx>;
|
||||
impl<'cx, 'gcx, 'tcx> Iterator for Prefixes<'cx, 'gcx, 'tcx> {
|
||||
type Item = &'cx Lvalue<'tcx>;
|
||||
fn next(&mut self) -> Option<Self::Item> {
|
||||
let mut cursor = match self.next {
|
||||
None => return None,
|
||||
@ -1244,7 +1251,7 @@ mod prefixes {
|
||||
}
|
||||
}
|
||||
|
||||
impl<'c, 'b, 'a: 'b+'c, 'gcx, 'tcx: 'a> MirBorrowckCtxt<'c, 'b, 'a, 'gcx, 'tcx> {
|
||||
impl<'cx, 'gcx, 'tcx> MirBorrowckCtxt<'cx, 'gcx, 'tcx> {
|
||||
fn report_use_of_moved_or_uninitialized(&mut self,
|
||||
_context: Context,
|
||||
desired_action: &str,
|
||||
@ -1483,7 +1490,7 @@ impl<'c, 'b, 'a: 'b+'c, 'gcx, 'tcx: 'a> MirBorrowckCtxt<'c, 'b, 'a, 'gcx, 'tcx>
|
||||
}
|
||||
}
|
||||
|
||||
impl<'c, 'b, 'a: 'b+'c, 'gcx, 'tcx: 'a> MirBorrowckCtxt<'c, 'b, 'a, 'gcx, 'tcx> {
|
||||
impl<'cx, 'gcx, 'tcx> MirBorrowckCtxt<'cx, 'gcx, 'tcx> {
|
||||
// End-user visible description of `lvalue`
|
||||
fn describe_lvalue(&self, lvalue: &Lvalue<'tcx>) -> String {
|
||||
let mut buf = String::new();
|
||||
@ -1641,7 +1648,7 @@ impl<'c, 'b, 'a: 'b+'c, 'gcx, 'tcx: 'a> MirBorrowckCtxt<'c, 'b, 'a, 'gcx, 'tcx>
|
||||
}
|
||||
}
|
||||
|
||||
impl<'c, 'b, 'a: 'b+'c, 'gcx, 'tcx: 'a> MirBorrowckCtxt<'c, 'b, 'a, 'gcx, 'tcx> {
|
||||
impl<'cx, 'gcx, 'tcx> MirBorrowckCtxt<'cx, 'gcx, 'tcx> {
|
||||
// FIXME (#16118): function intended to allow the borrow checker
|
||||
// to be less precise in its handling of Box while still allowing
|
||||
// moves out of a Box. They should be removed when/if we stop
|
||||
|
@ -22,7 +22,7 @@ use rustc_data_structures::indexed_vec::{IndexVec};
|
||||
use dataflow::{BitDenotation, BlockSets, DataflowOperator};
|
||||
pub use dataflow::indexes::BorrowIndex;
|
||||
use transform::nll::region_infer::RegionInferenceContext;
|
||||
use transform::nll::ToRegionIndex;
|
||||
use transform::nll::ToRegionVid;
|
||||
|
||||
use syntax_pos::Span;
|
||||
|
||||
@ -145,7 +145,7 @@ impl<'a, 'gcx, 'tcx> Borrows<'a, 'gcx, 'tcx> {
|
||||
location: Location) {
|
||||
if let Some(regioncx) = self.nonlexical_regioncx {
|
||||
for (borrow_index, borrow_data) in self.borrows.iter_enumerated() {
|
||||
let borrow_region = borrow_data.region.to_region_index();
|
||||
let borrow_region = borrow_data.region.to_region_vid();
|
||||
if !regioncx.region_contains_point(borrow_region, location) {
|
||||
// The region checker really considers the borrow
|
||||
// to start at the point **after** the location of
|
||||
|
@ -23,6 +23,7 @@ Rust MIR: a lowered representation of Rust. Also: an experiment!
|
||||
#![feature(core_intrinsics)]
|
||||
#![feature(decl_macro)]
|
||||
#![feature(i128_type)]
|
||||
#![feature(match_default_bindings)]
|
||||
#![feature(rustc_diagnostic_macros)]
|
||||
#![feature(placement_in_syntax)]
|
||||
#![feature(collection_placement)]
|
||||
|
@ -9,7 +9,7 @@
|
||||
// except according to those terms.
|
||||
|
||||
use rustc::hir;
|
||||
use rustc::mir::{BasicBlock, BorrowKind, Location, Lvalue, Mir, Rvalue, Statement, StatementKind};
|
||||
use rustc::mir::{Location, Lvalue, Mir, Rvalue};
|
||||
use rustc::mir::visit::Visitor;
|
||||
use rustc::mir::Lvalue::Projection;
|
||||
use rustc::mir::{LvalueProjection, ProjectionElem};
|
||||
@ -21,9 +21,8 @@ use rustc::util::common::ErrorReported;
|
||||
use rustc_data_structures::fx::FxHashSet;
|
||||
use syntax::codemap::DUMMY_SP;
|
||||
|
||||
use super::subtype;
|
||||
use super::LivenessResults;
|
||||
use super::ToRegionIndex;
|
||||
use super::ToRegionVid;
|
||||
use super::region_infer::RegionInferenceContext;
|
||||
|
||||
pub(super) fn generate_constraints<'a, 'gcx, 'tcx>(
|
||||
@ -102,7 +101,7 @@ impl<'cx, 'gcx, 'tcx> ConstraintGeneration<'cx, 'gcx, 'tcx> {
|
||||
self.infcx
|
||||
.tcx
|
||||
.for_each_free_region(&live_ty, |live_region| {
|
||||
let vid = live_region.to_region_index();
|
||||
let vid = live_region.to_region_vid();
|
||||
self.regioncx.add_live_point(vid, location);
|
||||
});
|
||||
}
|
||||
@ -179,29 +178,6 @@ impl<'cx, 'gcx, 'tcx> ConstraintGeneration<'cx, 'gcx, 'tcx> {
|
||||
self.visit_mir(self.mir);
|
||||
}
|
||||
|
||||
fn add_borrow_constraint(
|
||||
&mut self,
|
||||
location: Location,
|
||||
destination_lv: &Lvalue<'tcx>,
|
||||
borrow_region: ty::Region<'tcx>,
|
||||
_borrow_kind: BorrowKind,
|
||||
_borrowed_lv: &Lvalue<'tcx>,
|
||||
) {
|
||||
let tcx = self.infcx.tcx;
|
||||
let span = self.mir.source_info(location).span;
|
||||
let destination_ty = destination_lv.ty(self.mir, tcx).to_ty(tcx);
|
||||
|
||||
let destination_region = match destination_ty.sty {
|
||||
ty::TyRef(r, _) => r,
|
||||
_ => bug!()
|
||||
};
|
||||
|
||||
self.regioncx.add_outlives(span,
|
||||
borrow_region.to_region_index(),
|
||||
destination_region.to_region_index(),
|
||||
location.successor_within_block());
|
||||
}
|
||||
|
||||
fn add_reborrow_constraint(
|
||||
&mut self,
|
||||
location: Location,
|
||||
@ -227,8 +203,8 @@ impl<'cx, 'gcx, 'tcx> ConstraintGeneration<'cx, 'gcx, 'tcx> {
|
||||
|
||||
let span = self.mir.source_info(location).span;
|
||||
self.regioncx.add_outlives(span,
|
||||
base_region.to_region_index(),
|
||||
borrow_region.to_region_index(),
|
||||
base_region.to_region_vid(),
|
||||
borrow_region.to_region_vid(),
|
||||
location.successor_within_block());
|
||||
}
|
||||
}
|
||||
@ -237,35 +213,22 @@ impl<'cx, 'gcx, 'tcx> ConstraintGeneration<'cx, 'gcx, 'tcx> {
|
||||
}
|
||||
|
||||
impl<'cx, 'gcx, 'tcx> Visitor<'tcx> for ConstraintGeneration<'cx, 'gcx, 'tcx> {
|
||||
fn visit_statement(&mut self,
|
||||
block: BasicBlock,
|
||||
statement: &Statement<'tcx>,
|
||||
fn visit_rvalue(&mut self,
|
||||
rvalue: &Rvalue<'tcx>,
|
||||
location: Location) {
|
||||
debug!("visit_rvalue(rvalue={:?}, location={:?})", rvalue, location);
|
||||
|
||||
debug!("visit_statement(statement={:?}, location={:?})", statement, location);
|
||||
|
||||
// Look for a statement like:
|
||||
// Look for an rvalue like:
|
||||
//
|
||||
// D = & L
|
||||
// & L
|
||||
//
|
||||
// where D is the path to which we are assigning, and
|
||||
// L is the path that is borrowed.
|
||||
if let StatementKind::Assign(ref destination_lv, ref rv) = statement.kind {
|
||||
if let Rvalue::Ref(region, bk, ref borrowed_lv) = *rv {
|
||||
self.add_borrow_constraint(location, destination_lv, region, bk, borrowed_lv);
|
||||
// where L is the path that is borrowed. In that case, we have
|
||||
// to add the reborrow constraints (which don't fall out
|
||||
// naturally from the type-checker).
|
||||
if let Rvalue::Ref(region, _bk, ref borrowed_lv) = *rvalue {
|
||||
self.add_reborrow_constraint(location, region, borrowed_lv);
|
||||
}
|
||||
|
||||
let tcx = self.infcx.tcx;
|
||||
let destination_ty = destination_lv.ty(self.mir, tcx).to_ty(tcx);
|
||||
let rv_ty = rv.ty(self.mir, tcx);
|
||||
|
||||
let span = self.mir.source_info(location).span;
|
||||
for (a, b) in subtype::outlives_pairs(tcx, rv_ty, destination_ty) {
|
||||
self.regioncx.add_outlives(span, a, b, location.successor_within_block());
|
||||
}
|
||||
}
|
||||
|
||||
self.super_statement(block, statement, location);
|
||||
self.super_rvalue(rvalue, location);
|
||||
}
|
||||
}
|
||||
|
@ -25,17 +25,18 @@
|
||||
use rustc::hir::def_id::DefId;
|
||||
use rustc::infer::InferCtxt;
|
||||
use rustc::middle::free_region::FreeRegionMap;
|
||||
use rustc::ty;
|
||||
use rustc::ty::{self, RegionVid};
|
||||
use rustc::ty::subst::Substs;
|
||||
use rustc::util::nodemap::FxHashMap;
|
||||
use rustc_data_structures::indexed_vec::Idx;
|
||||
|
||||
#[derive(Debug)]
|
||||
pub struct FreeRegions<'tcx> {
|
||||
/// Given a free region defined on this function (either early- or
|
||||
/// late-bound), this maps it to its internal region index. The
|
||||
/// corresponding variable will be "capped" so that it cannot
|
||||
/// grow.
|
||||
pub indices: FxHashMap<ty::Region<'tcx>, usize>,
|
||||
/// late-bound), this maps it to its internal region index. When
|
||||
/// the region context is created, the first N variables will be
|
||||
/// created based on these indices.
|
||||
pub indices: FxHashMap<ty::Region<'tcx>, RegionVid>,
|
||||
|
||||
/// The map from the typeck tables telling us how to relate free regions.
|
||||
pub free_region_map: &'tcx FreeRegionMap<'tcx>,
|
||||
@ -49,6 +50,9 @@ pub fn free_regions<'a, 'gcx, 'tcx>(
|
||||
|
||||
let mut indices = FxHashMap();
|
||||
|
||||
// `'static` is always free.
|
||||
insert_free_region(&mut indices, infcx.tcx.types.re_static);
|
||||
|
||||
// Extract the early regions.
|
||||
let item_substs = Substs::identity_for_item(infcx.tcx, item_def_id);
|
||||
for item_subst in item_substs {
|
||||
@ -78,9 +82,9 @@ pub fn free_regions<'a, 'gcx, 'tcx>(
|
||||
}
|
||||
|
||||
fn insert_free_region<'tcx>(
|
||||
free_regions: &mut FxHashMap<ty::Region<'tcx>, usize>,
|
||||
free_regions: &mut FxHashMap<ty::Region<'tcx>, RegionVid>,
|
||||
region: ty::Region<'tcx>,
|
||||
) {
|
||||
let len = free_regions.len();
|
||||
free_regions.entry(region).or_insert(len);
|
||||
let next = RegionVid::new(free_regions.len());
|
||||
free_regions.entry(region).or_insert(next);
|
||||
}
|
||||
|
@ -11,19 +11,19 @@
|
||||
use rustc::hir::def_id::DefId;
|
||||
use rustc::mir::Mir;
|
||||
use rustc::infer::InferCtxt;
|
||||
use rustc::ty::{self, RegionKind};
|
||||
use rustc::ty::{self, RegionKind, RegionVid};
|
||||
use rustc::util::nodemap::FxHashMap;
|
||||
use rustc_data_structures::indexed_vec::Idx;
|
||||
use std::collections::BTreeSet;
|
||||
use transform::MirSource;
|
||||
use transform::type_check;
|
||||
use util::liveness::{self, LivenessMode, LivenessResult, LocalSet};
|
||||
|
||||
use util as mir_util;
|
||||
use self::mir_util::PassWhere;
|
||||
|
||||
mod constraint_generation;
|
||||
mod subtype_constraint_generation;
|
||||
mod free_regions;
|
||||
mod subtype;
|
||||
|
||||
pub(crate) mod region_infer;
|
||||
use self::region_infer::RegionInferenceContext;
|
||||
@ -36,13 +36,24 @@ mod renumber;
|
||||
pub fn compute_regions<'a, 'gcx, 'tcx>(
|
||||
infcx: &InferCtxt<'a, 'gcx, 'tcx>,
|
||||
def_id: DefId,
|
||||
param_env: ty::ParamEnv<'gcx>,
|
||||
mir: &mut Mir<'tcx>,
|
||||
) -> RegionInferenceContext<'tcx> {
|
||||
// Compute named region information.
|
||||
let free_regions = &free_regions::free_regions(infcx, def_id);
|
||||
|
||||
// Replace all regions with fresh inference variables.
|
||||
let num_region_variables = renumber::renumber_mir(infcx, free_regions, mir);
|
||||
renumber::renumber_mir(infcx, free_regions, mir);
|
||||
|
||||
// Run the MIR type-checker.
|
||||
let mir_node_id = infcx.tcx.hir.as_local_node_id(def_id).unwrap();
|
||||
let constraint_sets = &type_check::type_check(infcx, mir_node_id, param_env, mir);
|
||||
|
||||
// Create the region inference context, taking ownership of the region inference
|
||||
// data that was contained in `infcx`.
|
||||
let var_origins = infcx.take_region_var_origins();
|
||||
let mut regioncx = RegionInferenceContext::new(var_origins, free_regions, mir);
|
||||
subtype_constraint_generation::generate(&mut regioncx, free_regions, mir, constraint_sets);
|
||||
|
||||
// Compute what is live where.
|
||||
let liveness = &LivenessResults {
|
||||
@ -63,11 +74,10 @@ pub fn compute_regions<'a, 'gcx, 'tcx>(
|
||||
),
|
||||
};
|
||||
|
||||
// Create the region inference context, generate the constraints,
|
||||
// and then solve them.
|
||||
let mut regioncx = RegionInferenceContext::new(free_regions, num_region_variables, mir);
|
||||
let param_env = infcx.tcx.param_env(def_id);
|
||||
// Generate non-subtyping constraints.
|
||||
constraint_generation::generate_constraints(infcx, &mut regioncx, &mir, param_env, liveness);
|
||||
|
||||
// Solve the region constraints.
|
||||
regioncx.solve(infcx, &mir);
|
||||
|
||||
// Dump MIR results into a file, if that is enabled. This let us
|
||||
@ -123,12 +133,7 @@ fn dump_mir_results<'a, 'gcx, 'tcx>(
|
||||
match pass_where {
|
||||
// Before the CFG, dump out the values for each region variable.
|
||||
PassWhere::BeforeCFG => for region in regioncx.regions() {
|
||||
writeln!(
|
||||
out,
|
||||
"| {:?}: {:?}",
|
||||
region,
|
||||
regioncx.region_value(region)
|
||||
)?;
|
||||
writeln!(out, "| {:?}: {:?}", region, regioncx.region_value(region))?;
|
||||
},
|
||||
|
||||
// Before each basic block, dump out the values
|
||||
@ -152,23 +157,19 @@ fn dump_mir_results<'a, 'gcx, 'tcx>(
|
||||
});
|
||||
}
|
||||
|
||||
newtype_index!(RegionIndex {
|
||||
DEBUG_FORMAT = "'_#{}r",
|
||||
});
|
||||
|
||||
/// Right now, we piggy back on the `ReVar` to store our NLL inference
|
||||
/// regions. These are indexed with `RegionIndex`. This method will
|
||||
/// assert that the region is a `ReVar` and convert the internal index
|
||||
/// into a `RegionIndex`. This is reasonable because in our MIR we
|
||||
/// replace all free regions with inference variables.
|
||||
pub trait ToRegionIndex {
|
||||
fn to_region_index(&self) -> RegionIndex;
|
||||
/// regions. These are indexed with `RegionVid`. This method will
|
||||
/// assert that the region is a `ReVar` and extract its interal index.
|
||||
/// This is reasonable because in our MIR we replace all free regions
|
||||
/// with inference variables.
|
||||
pub trait ToRegionVid {
|
||||
fn to_region_vid(&self) -> RegionVid;
|
||||
}
|
||||
|
||||
impl ToRegionIndex for RegionKind {
|
||||
fn to_region_index(&self) -> RegionIndex {
|
||||
impl ToRegionVid for RegionKind {
|
||||
fn to_region_vid(&self) -> RegionVid {
|
||||
if let &ty::ReVar(vid) = self {
|
||||
RegionIndex::new(vid.index as usize)
|
||||
vid
|
||||
} else {
|
||||
bug!("region is not an ReVar: {:?}", self)
|
||||
}
|
||||
|
@ -8,12 +8,14 @@
|
||||
// option. This file may not be copied, modified, or distributed
|
||||
// except according to those terms.
|
||||
|
||||
use super::RegionIndex;
|
||||
use super::free_regions::FreeRegions;
|
||||
use rustc::infer::InferCtxt;
|
||||
use rustc::infer::RegionVariableOrigin;
|
||||
use rustc::infer::NLLRegionVariableOrigin;
|
||||
use rustc::infer::region_constraints::VarOrigins;
|
||||
use rustc::mir::{Location, Mir};
|
||||
use rustc::ty;
|
||||
use rustc_data_structures::indexed_vec::{Idx, IndexVec};
|
||||
use rustc::ty::{self, RegionVid};
|
||||
use rustc_data_structures::indexed_vec::IndexVec;
|
||||
use rustc_data_structures::fx::FxHashSet;
|
||||
use std::collections::BTreeSet;
|
||||
use std::fmt;
|
||||
@ -21,28 +23,22 @@ use syntax_pos::Span;
|
||||
|
||||
pub struct RegionInferenceContext<'tcx> {
|
||||
/// Contains the definition for every region variable. Region
|
||||
/// variables are identified by their index (`RegionIndex`). The
|
||||
/// variables are identified by their index (`RegionVid`). The
|
||||
/// definition contains information about where the region came
|
||||
/// from as well as its final inferred value.
|
||||
definitions: IndexVec<RegionIndex, RegionDefinition<'tcx>>,
|
||||
|
||||
/// The indices of all "free regions" in scope. These are the
|
||||
/// lifetime parameters (anonymous and named) declared in the
|
||||
/// function signature:
|
||||
///
|
||||
/// fn foo<'a, 'b>(x: &Foo<'a, 'b>)
|
||||
/// ^^ ^^ ^
|
||||
///
|
||||
/// These indices will be from 0..N, as it happens, but we collect
|
||||
/// them into a vector for convenience.
|
||||
free_regions: Vec<RegionIndex>,
|
||||
definitions: IndexVec<RegionVid, RegionDefinition<'tcx>>,
|
||||
|
||||
/// The constraints we have accumulated and used during solving.
|
||||
constraints: Vec<Constraint>,
|
||||
}
|
||||
|
||||
#[derive(Default)]
|
||||
struct RegionDefinition<'tcx> {
|
||||
/// Why we created this variable. Mostly these will be
|
||||
/// `RegionVariableOrigin::NLL`, but some variables get created
|
||||
/// elsewhere in the code with other causes (e.g., instantiation
|
||||
/// late-bound-regions).
|
||||
origin: RegionVariableOrigin,
|
||||
|
||||
/// If this is a free-region, then this is `Some(X)` where `X` is
|
||||
/// the name of the region.
|
||||
name: Option<ty::Region<'tcx>>,
|
||||
@ -66,7 +62,7 @@ struct RegionDefinition<'tcx> {
|
||||
#[derive(Clone, Default, PartialEq, Eq)]
|
||||
struct Region {
|
||||
points: BTreeSet<Location>,
|
||||
free_regions: BTreeSet<RegionIndex>,
|
||||
free_regions: BTreeSet<RegionVid>,
|
||||
}
|
||||
|
||||
impl fmt::Debug for Region {
|
||||
@ -84,7 +80,7 @@ impl Region {
|
||||
self.points.insert(point)
|
||||
}
|
||||
|
||||
fn add_free_region(&mut self, region: RegionIndex) -> bool {
|
||||
fn add_free_region(&mut self, region: RegionVid) -> bool {
|
||||
self.free_regions.insert(region)
|
||||
}
|
||||
|
||||
@ -93,19 +89,24 @@ impl Region {
|
||||
}
|
||||
}
|
||||
|
||||
#[derive(Copy, Clone, Debug, PartialEq, Eq, Hash)]
|
||||
#[derive(Copy, Clone, PartialEq, Eq, PartialOrd, Ord, Hash)]
|
||||
pub struct Constraint {
|
||||
/// Where did this constraint arise?
|
||||
span: Span,
|
||||
// NB. The ordering here is not significant for correctness, but
|
||||
// it is for convenience. Before we dump the constraints in the
|
||||
// debugging logs, we sort them, and we'd like the "super region"
|
||||
// to be first, etc. (In particular, span should remain last.)
|
||||
|
||||
/// The region SUP must outlive SUB...
|
||||
sup: RegionIndex,
|
||||
sup: RegionVid,
|
||||
|
||||
/// Region that must be outlived.
|
||||
sub: RegionIndex,
|
||||
sub: RegionVid,
|
||||
|
||||
/// At this location.
|
||||
point: Location,
|
||||
|
||||
/// Where did this constraint arise?
|
||||
span: Span,
|
||||
}
|
||||
|
||||
impl<'a, 'gcx, 'tcx> RegionInferenceContext<'tcx> {
|
||||
@ -113,17 +114,16 @@ impl<'a, 'gcx, 'tcx> RegionInferenceContext<'tcx> {
|
||||
/// `num_region_variables` valid inference variables; the first N
|
||||
/// of those will be constant regions representing the free
|
||||
/// regions defined in `free_regions`.
|
||||
pub fn new(
|
||||
free_regions: &FreeRegions<'tcx>,
|
||||
num_region_variables: usize,
|
||||
mir: &Mir<'tcx>,
|
||||
) -> Self {
|
||||
pub fn new(var_origins: VarOrigins, free_regions: &FreeRegions<'tcx>, mir: &Mir<'tcx>) -> Self {
|
||||
// Create a RegionDefinition for each inference variable.
|
||||
let definitions = var_origins
|
||||
.into_iter()
|
||||
.map(|origin| RegionDefinition::new(origin))
|
||||
.collect();
|
||||
|
||||
let mut result = Self {
|
||||
definitions: (0..num_region_variables)
|
||||
.map(|_| RegionDefinition::default())
|
||||
.collect(),
|
||||
definitions: definitions,
|
||||
constraints: Vec::new(),
|
||||
free_regions: Vec::new(),
|
||||
};
|
||||
|
||||
result.init_free_regions(free_regions, mir);
|
||||
@ -151,16 +151,18 @@ impl<'a, 'gcx, 'tcx> RegionInferenceContext<'tcx> {
|
||||
/// is just itself. R1 (`'b`) in contrast also outlives `'a` and
|
||||
/// hence contains R0 and R1.
|
||||
fn init_free_regions(&mut self, free_regions: &FreeRegions<'tcx>, mir: &Mir<'tcx>) {
|
||||
let &FreeRegions {
|
||||
ref indices,
|
||||
ref free_region_map,
|
||||
let FreeRegions {
|
||||
indices,
|
||||
free_region_map,
|
||||
} = free_regions;
|
||||
|
||||
// For each free region X:
|
||||
for (free_region, index) in indices {
|
||||
let variable = RegionIndex::new(*index);
|
||||
|
||||
self.free_regions.push(variable);
|
||||
for (free_region, &variable) in indices {
|
||||
// These should be free-region variables.
|
||||
assert!(match self.definitions[variable].origin {
|
||||
RegionVariableOrigin::NLL(NLLRegionVariableOrigin::FreeRegion) => true,
|
||||
_ => false,
|
||||
});
|
||||
|
||||
// Initialize the name and a few other details.
|
||||
self.definitions[variable].name = Some(free_region);
|
||||
@ -181,10 +183,19 @@ impl<'a, 'gcx, 'tcx> RegionInferenceContext<'tcx> {
|
||||
// Add `end(X)` into the set for X.
|
||||
self.definitions[variable].value.add_free_region(variable);
|
||||
|
||||
// `'static` outlives all other free regions as well.
|
||||
if let ty::ReStatic = free_region {
|
||||
for &other_variable in indices.values() {
|
||||
self.definitions[variable]
|
||||
.value
|
||||
.add_free_region(other_variable);
|
||||
}
|
||||
}
|
||||
|
||||
// Go through each region Y that outlives X (i.e., where
|
||||
// Y: X is true). Add `end(X)` into the set for `Y`.
|
||||
for superregion in free_region_map.regions_that_outlive(&free_region) {
|
||||
let superregion_index = RegionIndex::new(indices[superregion]);
|
||||
let superregion_index = indices[superregion];
|
||||
self.definitions[superregion_index]
|
||||
.value
|
||||
.add_free_region(variable);
|
||||
@ -200,24 +211,24 @@ impl<'a, 'gcx, 'tcx> RegionInferenceContext<'tcx> {
|
||||
}
|
||||
|
||||
/// Returns an iterator over all the region indices.
|
||||
pub fn regions(&self) -> impl Iterator<Item = RegionIndex> {
|
||||
pub fn regions(&self) -> impl Iterator<Item = RegionVid> {
|
||||
self.definitions.indices()
|
||||
}
|
||||
|
||||
/// Returns true if the region `r` contains the point `p`.
|
||||
///
|
||||
/// Until `solve()` executes, this value is not particularly meaningful.
|
||||
pub fn region_contains_point(&self, r: RegionIndex, p: Location) -> bool {
|
||||
pub fn region_contains_point(&self, r: RegionVid, p: Location) -> bool {
|
||||
self.definitions[r].value.contains_point(p)
|
||||
}
|
||||
|
||||
/// Returns access to the value of `r` for debugging purposes.
|
||||
pub(super) fn region_value(&self, r: RegionIndex) -> &fmt::Debug {
|
||||
pub(super) fn region_value(&self, r: RegionVid) -> &fmt::Debug {
|
||||
&self.definitions[r].value
|
||||
}
|
||||
|
||||
/// Indicates that the region variable `v` is live at the point `point`.
|
||||
pub(super) fn add_live_point(&mut self, v: RegionIndex, point: Location) {
|
||||
pub(super) fn add_live_point(&mut self, v: RegionVid, point: Location) {
|
||||
debug!("add_live_point({:?}, {:?})", v, point);
|
||||
let definition = &mut self.definitions[v];
|
||||
if !definition.constant {
|
||||
@ -233,8 +244,8 @@ impl<'a, 'gcx, 'tcx> RegionInferenceContext<'tcx> {
|
||||
pub(super) fn add_outlives(
|
||||
&mut self,
|
||||
span: Span,
|
||||
sup: RegionIndex,
|
||||
sub: RegionIndex,
|
||||
sup: RegionVid,
|
||||
sub: RegionVid,
|
||||
point: Location,
|
||||
) {
|
||||
debug!("add_outlives({:?}: {:?} @ {:?}", sup, sub, point);
|
||||
@ -267,23 +278,28 @@ impl<'a, 'gcx, 'tcx> RegionInferenceContext<'tcx> {
|
||||
/// for each region variable until all the constraints are
|
||||
/// satisfied. Note that some values may grow **too** large to be
|
||||
/// feasible, but we check this later.
|
||||
fn propagate_constraints(
|
||||
&mut self,
|
||||
mir: &Mir<'tcx>,
|
||||
) -> Vec<(RegionIndex, Span, RegionIndex)> {
|
||||
fn propagate_constraints(&mut self, mir: &Mir<'tcx>) -> Vec<(RegionVid, Span, RegionVid)> {
|
||||
let mut changed = true;
|
||||
let mut dfs = Dfs::new(mir);
|
||||
let mut error_regions = FxHashSet();
|
||||
let mut errors = vec![];
|
||||
|
||||
debug!("propagate_constraints()");
|
||||
debug!("propagate_constraints: constraints={:#?}", {
|
||||
let mut constraints: Vec<_> = self.constraints.iter().collect();
|
||||
constraints.sort();
|
||||
constraints
|
||||
});
|
||||
|
||||
while changed {
|
||||
changed = false;
|
||||
for constraint in &self.constraints {
|
||||
debug!("constraint: {:?}", constraint);
|
||||
debug!("propagate_constraints: constraint={:?}", constraint);
|
||||
let sub = &self.definitions[constraint.sub].value.clone();
|
||||
let sup_def = &mut self.definitions[constraint.sup];
|
||||
|
||||
debug!(" sub (before): {:?}", sub);
|
||||
debug!(" sup (before): {:?}", sup_def.value);
|
||||
debug!("propagate_constraints: sub (before): {:?}", sub);
|
||||
debug!("propagate_constraints: sup (before): {:?}", sup_def.value);
|
||||
|
||||
if !sup_def.constant {
|
||||
// If this is not a constant, then grow the value as needed to
|
||||
@ -293,8 +309,8 @@ impl<'a, 'gcx, 'tcx> RegionInferenceContext<'tcx> {
|
||||
changed = true;
|
||||
}
|
||||
|
||||
debug!(" sup (after) : {:?}", sup_def.value);
|
||||
debug!(" changed : {:?}", changed);
|
||||
debug!("propagate_constraints: sup (after) : {:?}", sup_def.value);
|
||||
debug!("propagate_constraints: changed : {:?}", changed);
|
||||
} else {
|
||||
// If this is a constant, check whether it *would
|
||||
// have* to grow in order for the constraint to be
|
||||
@ -310,7 +326,7 @@ impl<'a, 'gcx, 'tcx> RegionInferenceContext<'tcx> {
|
||||
.difference(&sup_def.value.free_regions)
|
||||
.next()
|
||||
.unwrap();
|
||||
debug!(" new_region : {:?}", new_region);
|
||||
debug!("propagate_constraints: new_region : {:?}", new_region);
|
||||
if error_regions.insert(constraint.sup) {
|
||||
errors.push((constraint.sup, constraint.span, new_region));
|
||||
}
|
||||
@ -398,3 +414,30 @@ impl<'a, 'tcx> Dfs<'a, 'tcx> {
|
||||
changed
|
||||
}
|
||||
}
|
||||
|
||||
impl<'tcx> RegionDefinition<'tcx> {
|
||||
fn new(origin: RegionVariableOrigin) -> Self {
|
||||
// Create a new region definition. Note that, for free
|
||||
// regions, these fields get updated later in
|
||||
// `init_free_regions`.
|
||||
Self {
|
||||
origin,
|
||||
name: None,
|
||||
constant: false,
|
||||
value: Region::default(),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl fmt::Debug for Constraint {
|
||||
fn fmt(&self, formatter: &mut fmt::Formatter) -> Result<(), fmt::Error> {
|
||||
write!(
|
||||
formatter,
|
||||
"({:?}: {:?} @ {:?}) due to {:?}",
|
||||
self.sup,
|
||||
self.sub,
|
||||
self.point,
|
||||
self.span
|
||||
)
|
||||
}
|
||||
}
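The `propagate_constraints` loop above is a classic grow-until-stable fixed point. A minimal, self-contained sketch of the same idea — plain point sets instead of the DFS-over-MIR values, with invented `Point`/`OutlivesConstraint` types, nothing from rustc — looks like this:

```rust
use std::collections::BTreeSet;

// Hypothetical stand-ins: a "point" is just a label, and a region's value is
// the set of points it contains.
#[derive(Clone, Copy, Debug, PartialEq, Eq, PartialOrd, Ord)]
struct Point(u32);

struct OutlivesConstraint {
    sup: usize, // region that must outlive...
    sub: usize, // ...this region
}

fn propagate(values: &mut Vec<BTreeSet<Point>>, constraints: &[OutlivesConstraint]) {
    let mut changed = true;
    while changed {
        changed = false;
        for c in constraints {
            // Grow `sup` until it contains every point of `sub`; keep
            // re-running all constraints until nothing grows any more.
            let sub_points: Vec<Point> = values[c.sub].iter().copied().collect();
            for p in sub_points {
                if values[c.sup].insert(p) {
                    changed = true;
                }
            }
        }
    }
}

fn main() {
    // R0 is live at point 0, R1 at point 1, and we require `R1: R0`.
    let mut values = vec![BTreeSet::from([Point(0)]), BTreeSet::from([Point(1)])];
    let constraints = [OutlivesConstraint { sup: 1, sub: 0 }];
    propagate(&mut values, &constraints);
    assert!(values[1].contains(&Point(0))); // R1 grew to contain R0's points
    assert_eq!(values[0].len(), 1);         // R0 itself is unchanged
}
```

The real solver additionally treats free-region variables as constants: instead of growing them it records an error, which is what the `error_regions`/`errors` bookkeeping above is for.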
|
||||
|
@ -8,15 +8,14 @@
|
||||
// option. This file may not be copied, modified, or distributed
|
||||
// except according to those terms.
|
||||
|
||||
use rustc_data_structures::indexed_vec::Idx;
|
||||
use rustc::ty::subst::{Kind, Substs};
|
||||
use rustc::ty::{self, ClosureSubsts, RegionKind, RegionVid, Ty, TypeFoldable};
|
||||
use rustc::mir::{BasicBlock, Local, Location, Mir, Rvalue, Statement, StatementKind};
|
||||
use rustc_data_structures::indexed_vec::{Idx, IndexVec};
|
||||
use rustc::ty::subst::Substs;
|
||||
use rustc::ty::{self, ClosureSubsts, RegionVid, Ty, TypeFoldable};
|
||||
use rustc::mir::{BasicBlock, Local, Location, Mir, Statement, StatementKind};
|
||||
use rustc::mir::visit::{MutVisitor, TyContext};
|
||||
use rustc::infer::{self as rustc_infer, InferCtxt};
|
||||
use syntax_pos::DUMMY_SP;
|
||||
use std::collections::HashMap;
|
||||
use rustc::infer::{InferCtxt, NLLRegionVariableOrigin};
|
||||
|
||||
use super::ToRegionVid;
|
||||
use super::free_regions::FreeRegions;
|
||||
|
||||
/// Replaces all free regions appearing in the MIR with fresh
|
||||
@ -25,33 +24,35 @@ pub fn renumber_mir<'a, 'gcx, 'tcx>(
|
||||
infcx: &InferCtxt<'a, 'gcx, 'tcx>,
|
||||
free_regions: &FreeRegions<'tcx>,
|
||||
mir: &mut Mir<'tcx>,
|
||||
) -> usize {
|
||||
) {
|
||||
// Create inference variables for each of the free regions
|
||||
// declared on the function signature.
|
||||
let free_region_inference_vars = (0..free_regions.indices.len())
|
||||
.map(|_| {
|
||||
infcx.next_region_var(rustc_infer::MiscVariable(DUMMY_SP))
|
||||
.map(RegionVid::new)
|
||||
.map(|vid_expected| {
|
||||
let r = infcx.next_nll_region_var(NLLRegionVariableOrigin::FreeRegion);
|
||||
assert_eq!(vid_expected, r.to_region_vid());
|
||||
r
|
||||
})
|
||||
.collect();
|
||||
|
||||
debug!("renumber_mir()");
|
||||
debug!("renumber_mir: free_regions={:#?}", free_regions);
|
||||
debug!("renumber_mir: mir.arg_count={:?}", mir.arg_count);
|
||||
|
||||
let mut visitor = NLLVisitor {
|
||||
infcx,
|
||||
lookup_map: HashMap::new(),
|
||||
num_region_variables: free_regions.indices.len(),
|
||||
free_regions,
|
||||
free_region_inference_vars,
|
||||
arg_count: mir.arg_count,
|
||||
};
|
||||
visitor.visit_mir(mir);
|
||||
visitor.num_region_variables
|
||||
}
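To make the renumbering idea concrete outside the compiler, here is a toy sketch (a made-up `Ty` with numeric region labels, not rustc's `fold_regions`): every region occurrence, even a repeated one, is replaced by a fresh inference variable.

```rust
// Toy types for illustration only.
#[derive(Debug, PartialEq)]
enum Ty {
    Ref(u32, Box<Ty>), // &'N inner, where N is a region label
    Unit,
}

struct Renumberer {
    next: u32,
}

impl Renumberer {
    fn fresh(&mut self) -> u32 {
        let r = self.next;
        self.next += 1;
        r
    }

    // Analogue of folding over a type and replacing each region that appears
    // with a fresh inference variable.
    fn renumber(&mut self, ty: &Ty) -> Ty {
        match ty {
            Ty::Ref(_, inner) => Ty::Ref(self.fresh(), Box::new(self.renumber(inner))),
            Ty::Unit => Ty::Unit,
        }
    }
}

fn main() {
    let mut r = Renumberer { next: 0 };
    // Both occurrences of region 7 become distinct fresh variables 0 and 1.
    let ty = Ty::Ref(7, Box::new(Ty::Ref(7, Box::new(Ty::Unit))));
    let renumbered = r.renumber(&ty);
    assert_eq!(renumbered, Ty::Ref(0, Box::new(Ty::Ref(1, Box::new(Ty::Unit)))));
    assert_eq!(r.next, 2);
}
```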
|
||||
|
||||
struct NLLVisitor<'a, 'gcx: 'a + 'tcx, 'tcx: 'a> {
|
||||
lookup_map: HashMap<RegionVid, TyContext>,
|
||||
num_region_variables: usize,
|
||||
infcx: &'a InferCtxt<'a, 'gcx, 'tcx>,
|
||||
free_regions: &'a FreeRegions<'tcx>,
|
||||
free_region_inference_vars: Vec<ty::Region<'tcx>>,
|
||||
free_region_inference_vars: IndexVec<RegionVid, ty::Region<'tcx>>,
|
||||
arg_count: usize,
|
||||
}
|
||||
|
||||
@ -59,16 +60,17 @@ impl<'a, 'gcx, 'tcx> NLLVisitor<'a, 'gcx, 'tcx> {
|
||||
/// Replaces all regions appearing in `value` with fresh inference
|
||||
/// variables. This is what we do for almost the entire MIR, with
|
||||
/// the exception of the declared types of our arguments.
|
||||
fn renumber_regions<T>(&mut self, value: &T) -> T
|
||||
fn renumber_regions<T>(&mut self, ty_context: TyContext, value: &T) -> T
|
||||
where
|
||||
T: TypeFoldable<'tcx>,
|
||||
{
|
||||
debug!("renumber_regions(value={:?})", value);
|
||||
|
||||
self.infcx
|
||||
.tcx
|
||||
.fold_regions(value, &mut false, |_region, _depth| {
|
||||
self.num_region_variables += 1;
|
||||
self.infcx
|
||||
.next_region_var(rustc_infer::MiscVariable(DUMMY_SP))
|
||||
let origin = NLLRegionVariableOrigin::Inferred(ty_context);
|
||||
self.infcx.next_nll_region_var(origin)
|
||||
})
|
||||
}
|
||||
|
||||
@ -78,6 +80,8 @@ impl<'a, 'gcx, 'tcx> NLLVisitor<'a, 'gcx, 'tcx> {
|
||||
where
|
||||
T: TypeFoldable<'tcx>,
|
||||
{
|
||||
debug!("renumber_free_regions(value={:?})", value);
|
||||
|
||||
self.infcx
|
||||
.tcx
|
||||
.fold_regions(value, &mut false, |region, _depth| {
|
||||
@ -86,26 +90,6 @@ impl<'a, 'gcx, 'tcx> NLLVisitor<'a, 'gcx, 'tcx> {
|
||||
})
|
||||
}
|
||||
|
||||
fn store_region(&mut self, region: &RegionKind, lookup: TyContext) {
|
||||
if let RegionKind::ReVar(rid) = *region {
|
||||
self.lookup_map.entry(rid).or_insert(lookup);
|
||||
}
|
||||
}
|
||||
|
||||
fn store_ty_regions(&mut self, ty: &Ty<'tcx>, ty_context: TyContext) {
|
||||
for region in ty.regions() {
|
||||
self.store_region(region, ty_context);
|
||||
}
|
||||
}
|
||||
|
||||
fn store_kind_regions(&mut self, kind: &'tcx Kind, ty_context: TyContext) {
|
||||
if let Some(ty) = kind.as_type() {
|
||||
self.store_ty_regions(&ty, ty_context);
|
||||
} else if let Some(region) = kind.as_region() {
|
||||
self.store_region(region, ty_context);
|
||||
}
|
||||
}
|
||||
|
||||
fn is_argument_or_return_slot(&self, local: Local) -> bool {
|
||||
// The first argument is return slot, next N are arguments.
|
||||
local.index() <= self.arg_count
|
||||
@ -116,56 +100,55 @@ impl<'a, 'gcx, 'tcx> MutVisitor<'tcx> for NLLVisitor<'a, 'gcx, 'tcx> {
|
||||
fn visit_ty(&mut self, ty: &mut Ty<'tcx>, ty_context: TyContext) {
|
||||
let is_arg = match ty_context {
|
||||
TyContext::LocalDecl { local, .. } => self.is_argument_or_return_slot(local),
|
||||
_ => false,
|
||||
TyContext::ReturnTy(..) => true,
|
||||
TyContext::Location(..) => false,
|
||||
};
|
||||
debug!(
|
||||
"visit_ty(ty={:?}, is_arg={:?}, ty_context={:?})",
|
||||
ty,
|
||||
is_arg,
|
||||
ty_context
|
||||
);
|
||||
|
||||
let old_ty = *ty;
|
||||
*ty = if is_arg {
|
||||
self.renumber_free_regions(&old_ty)
|
||||
} else {
|
||||
self.renumber_regions(&old_ty)
|
||||
self.renumber_regions(ty_context, &old_ty)
|
||||
};
|
||||
self.store_ty_regions(ty, ty_context);
|
||||
debug!("visit_ty: ty={:?}", ty);
|
||||
}
|
||||
|
||||
fn visit_substs(&mut self, substs: &mut &'tcx Substs<'tcx>, location: Location) {
|
||||
*substs = self.renumber_regions(&{ *substs });
|
||||
debug!("visit_substs(substs={:?}, location={:?})", substs, location);
|
||||
|
||||
let ty_context = TyContext::Location(location);
|
||||
for kind in *substs {
|
||||
self.store_kind_regions(kind, ty_context);
|
||||
}
|
||||
*substs = self.renumber_regions(ty_context, &{ *substs });
|
||||
|
||||
debug!("visit_substs: substs={:?}", substs);
|
||||
}
|
||||
|
||||
fn visit_rvalue(&mut self, rvalue: &mut Rvalue<'tcx>, location: Location) {
|
||||
match *rvalue {
|
||||
Rvalue::Ref(ref mut r, _, _) => {
|
||||
let old_r = *r;
|
||||
*r = self.renumber_regions(&old_r);
|
||||
fn visit_region(&mut self, region: &mut ty::Region<'tcx>, location: Location) {
|
||||
debug!("visit_region(region={:?}, location={:?})", region, location);
|
||||
|
||||
let old_region = *region;
|
||||
let ty_context = TyContext::Location(location);
|
||||
self.store_region(r, ty_context);
|
||||
}
|
||||
Rvalue::Use(..) |
|
||||
Rvalue::Repeat(..) |
|
||||
Rvalue::Len(..) |
|
||||
Rvalue::Cast(..) |
|
||||
Rvalue::BinaryOp(..) |
|
||||
Rvalue::CheckedBinaryOp(..) |
|
||||
Rvalue::UnaryOp(..) |
|
||||
Rvalue::Discriminant(..) |
|
||||
Rvalue::NullaryOp(..) |
|
||||
Rvalue::Aggregate(..) => {
|
||||
// These variants don't contain regions.
|
||||
}
|
||||
}
|
||||
self.super_rvalue(rvalue, location);
|
||||
*region = self.renumber_regions(ty_context, &old_region);
|
||||
|
||||
debug!("visit_region: region={:?}", region);
|
||||
}
|
||||
|
||||
fn visit_closure_substs(&mut self, substs: &mut ClosureSubsts<'tcx>, location: Location) {
|
||||
*substs = self.renumber_regions(substs);
|
||||
debug!(
|
||||
"visit_closure_substs(substs={:?}, location={:?})",
|
||||
substs,
|
||||
location
|
||||
);
|
||||
|
||||
let ty_context = TyContext::Location(location);
|
||||
for kind in substs.substs {
|
||||
self.store_kind_regions(kind, ty_context);
|
||||
}
|
||||
*substs = self.renumber_regions(ty_context, substs);
|
||||
|
||||
debug!("visit_closure_substs: substs={:?}", substs);
|
||||
}
|
||||
|
||||
fn visit_statement(
|
||||
|
@ -1,99 +0,0 @@
|
||||
// Copyright 2017 The Rust Project Developers. See the COPYRIGHT
|
||||
// file at the top-level directory of this distribution and at
|
||||
// http://rust-lang.org/COPYRIGHT.
|
||||
//
|
||||
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
|
||||
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
|
||||
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
|
||||
// option. This file may not be copied, modified, or distributed
|
||||
// except according to those terms.
|
||||
|
||||
use super::RegionIndex;
|
||||
use transform::nll::ToRegionIndex;
|
||||
use rustc::ty::{self, Ty, TyCtxt};
|
||||
use rustc::ty::relate::{self, Relate, RelateResult, TypeRelation};
|
||||
|
||||
pub fn outlives_pairs<'a, 'gcx, 'tcx>(tcx: TyCtxt<'a, 'gcx, 'tcx>,
|
||||
a: Ty<'tcx>,
|
||||
b: Ty<'tcx>)
|
||||
-> Vec<(RegionIndex, RegionIndex)>
|
||||
{
|
||||
let mut subtype = Subtype::new(tcx);
|
||||
match subtype.relate(&a, &b) {
|
||||
Ok(_) => subtype.outlives_pairs,
|
||||
|
||||
Err(_) => bug!("Fail to relate a = {:?} and b = {:?}", a, b)
|
||||
}
|
||||
}
|
||||
|
||||
struct Subtype<'a, 'gcx: 'a+'tcx, 'tcx: 'a> {
|
||||
tcx: TyCtxt<'a, 'gcx, 'tcx>,
|
||||
outlives_pairs: Vec<(RegionIndex, RegionIndex)>,
|
||||
ambient_variance: ty::Variance,
|
||||
}
|
||||
|
||||
impl<'a, 'gcx, 'tcx> Subtype<'a, 'gcx, 'tcx> {
|
||||
pub fn new(tcx: TyCtxt<'a, 'gcx, 'tcx>) -> Subtype<'a, 'gcx, 'tcx> {
|
||||
Subtype {
|
||||
tcx,
|
||||
outlives_pairs: vec![],
|
||||
ambient_variance: ty::Covariant,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl<'a, 'gcx, 'tcx> TypeRelation<'a, 'gcx, 'tcx> for Subtype<'a, 'gcx, 'tcx> {
|
||||
fn tag(&self) -> &'static str { "Subtype" }
|
||||
fn tcx(&self) -> TyCtxt<'a, 'gcx, 'tcx> { self.tcx }
|
||||
fn a_is_expected(&self) -> bool { true } // irrelevant
|
||||
|
||||
fn relate_with_variance<T: Relate<'tcx>>(&mut self,
|
||||
variance: ty::Variance,
|
||||
a: &T,
|
||||
b: &T)
|
||||
-> RelateResult<'tcx, T>
|
||||
{
|
||||
let old_ambient_variance = self.ambient_variance;
|
||||
self.ambient_variance = self.ambient_variance.xform(variance);
|
||||
|
||||
let result = self.relate(a, b);
|
||||
self.ambient_variance = old_ambient_variance;
|
||||
result
|
||||
}
|
||||
|
||||
fn tys(&mut self, t: Ty<'tcx>, t2: Ty<'tcx>) -> RelateResult<'tcx, Ty<'tcx>> {
|
||||
relate::super_relate_tys(self, t, t2)
|
||||
}
|
||||
|
||||
fn regions(&mut self, r_a: ty::Region<'tcx>, r_b: ty::Region<'tcx>)
|
||||
-> RelateResult<'tcx, ty::Region<'tcx>> {
|
||||
let a = r_a.to_region_index();
|
||||
let b = r_b.to_region_index();
|
||||
|
||||
match self.ambient_variance {
|
||||
ty::Covariant => {
|
||||
self.outlives_pairs.push((b, a));
|
||||
},
|
||||
|
||||
ty::Invariant => {
|
||||
self.outlives_pairs.push((a, b));
|
||||
self.outlives_pairs.push((b, a));
|
||||
},
|
||||
|
||||
ty::Contravariant => {
|
||||
self.outlives_pairs.push((a, b));
|
||||
},
|
||||
|
||||
ty::Bivariant => {},
|
||||
}
|
||||
|
||||
Ok(r_a)
|
||||
}
|
||||
|
||||
fn binders<T>(&mut self, _a: &ty::Binder<T>, _b: &ty::Binder<T>)
|
||||
-> RelateResult<'tcx, ty::Binder<T>>
|
||||
where T: Relate<'tcx>
|
||||
{
|
||||
unimplemented!();
|
||||
}
|
||||
}
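This file is deleted by the PR (its job is taken over by constraints gathered through the MIR type-checker), but the variance bookkeeping it performed is worth seeing in isolation. The sketch below uses plain integer labels for regions and a hand-written variance table; it mirrors the `relate_with_variance`/`regions` logic above rather than reproducing rustc's actual types.

```rust
#[derive(Clone, Copy, Debug, PartialEq)]
enum Variance {
    Covariant,
    Contravariant,
    Invariant,
    Bivariant,
}

impl Variance {
    // Compose the ambient variance with the variance of the position we are
    // descending into; e.g. a contravariant position flips the direction.
    fn xform(self, v: Variance) -> Variance {
        use Variance::*;
        match (self, v) {
            (Covariant, v) => v,
            (Contravariant, Covariant) => Contravariant,
            (Contravariant, Contravariant) => Covariant,
            (Contravariant, v) => v,
            (Invariant, _) => Invariant,
            (Bivariant, _) => Bivariant,
        }
    }
}

type Region = u32;

// Relating region `a` to region `b` under the ambient variance yields these
// `(sup, sub)` outlives pairs, as in the `fn regions` match above.
fn outlives_pairs(ambient: Variance, a: Region, b: Region) -> Vec<(Region, Region)> {
    match ambient {
        Variance::Covariant => vec![(b, a)],         // a <: b, so b must outlive a
        Variance::Contravariant => vec![(a, b)],     // direction is flipped
        Variance::Invariant => vec![(a, b), (b, a)], // both directions
        Variance::Bivariant => vec![],
    }
}

fn main() {
    assert_eq!(
        Variance::Covariant.xform(Variance::Contravariant),
        Variance::Contravariant
    );
    assert_eq!(outlives_pairs(Variance::Contravariant, 0, 1), vec![(0, 1)]);
    assert_eq!(outlives_pairs(Variance::Invariant, 0, 1), vec![(0, 1), (1, 0)]);
    assert!(outlives_pairs(Variance::Bivariant, 0, 1).is_empty());
}
```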
|
112
src/librustc_mir/transform/nll/subtype_constraint_generation.rs
Normal file
@ -0,0 +1,112 @@
|
||||
// Copyright 2017 The Rust Project Developers. See the COPYRIGHT
|
||||
// file at the top-level directory of this distribution and at
|
||||
// http://rust-lang.org/COPYRIGHT.
|
||||
//
|
||||
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
|
||||
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
|
||||
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
|
||||
// option. This file may not be copied, modified, or distributed
|
||||
// except according to those terms.
|
||||
|
||||
use rustc::mir::Mir;
|
||||
use rustc::infer::region_constraints::Constraint;
|
||||
use rustc::infer::region_constraints::RegionConstraintData;
|
||||
use rustc::ty;
|
||||
use transform::type_check::MirTypeckRegionConstraints;
|
||||
use transform::type_check::OutlivesSet;
|
||||
|
||||
use super::free_regions::FreeRegions;
|
||||
use super::region_infer::RegionInferenceContext;
|
||||
|
||||
/// When the MIR type-checker executes, it validates all the types in
/// the MIR, and in the process generates a set of constraints that
/// must hold regarding the regions in the MIR, along with locations
/// *where* they must hold. This code takes those constraints and adds
/// them into the NLL `RegionInferenceContext`.
pub(super) fn generate<'tcx>(
|
||||
regioncx: &mut RegionInferenceContext<'tcx>,
|
||||
free_regions: &FreeRegions<'tcx>,
|
||||
mir: &Mir<'tcx>,
|
||||
constraints: &MirTypeckRegionConstraints<'tcx>,
|
||||
) {
|
||||
SubtypeConstraintGenerator {
|
||||
regioncx,
|
||||
free_regions,
|
||||
mir,
|
||||
}.generate(constraints);
|
||||
}
|
||||
|
||||
struct SubtypeConstraintGenerator<'cx, 'tcx: 'cx> {
|
||||
regioncx: &'cx mut RegionInferenceContext<'tcx>,
|
||||
free_regions: &'cx FreeRegions<'tcx>,
|
||||
mir: &'cx Mir<'tcx>,
|
||||
}
|
||||
|
||||
impl<'cx, 'tcx> SubtypeConstraintGenerator<'cx, 'tcx> {
|
||||
fn generate(&mut self, constraints: &MirTypeckRegionConstraints<'tcx>) {
|
||||
let MirTypeckRegionConstraints {
|
||||
liveness_set,
|
||||
outlives_sets,
|
||||
} = constraints;
|
||||
|
||||
debug!(
|
||||
"generate(liveness_set={} items, outlives_sets={} items)",
|
||||
liveness_set.len(),
|
||||
outlives_sets.len()
|
||||
);
|
||||
|
||||
for (region, location) in liveness_set {
|
||||
debug!("generate: {:#?} is live at {:#?}", region, location);
|
||||
let region_vid = self.to_region_vid(region);
|
||||
self.regioncx.add_live_point(region_vid, *location);
|
||||
}
|
||||
|
||||
for OutlivesSet { locations, data } in outlives_sets {
|
||||
debug!("generate: constraints at: {:#?}", locations);
|
||||
let RegionConstraintData {
|
||||
constraints,
|
||||
verifys,
|
||||
givens,
|
||||
} = data;
|
||||
|
||||
for constraint in constraints.keys() {
|
||||
debug!("generate: constraint: {:?}", constraint);
|
||||
let (a_vid, b_vid) = match constraint {
|
||||
Constraint::VarSubVar(a_vid, b_vid) => (*a_vid, *b_vid),
|
||||
Constraint::RegSubVar(a_r, b_vid) => (self.to_region_vid(a_r), *b_vid),
|
||||
Constraint::VarSubReg(a_vid, b_r) => (*a_vid, self.to_region_vid(b_r)),
|
||||
Constraint::RegSubReg(a_r, b_r) => {
|
||||
(self.to_region_vid(a_r), self.to_region_vid(b_r))
|
||||
}
|
||||
};
|
||||
|
||||
// We have the constraint that `a_vid <= b_vid`. Add
|
||||
// `b_vid: a_vid` to our region checker. Note that we
|
||||
// reverse direction, because `regioncx` talks about
|
||||
// "outlives" (`>=`) whereas the region constraints
|
||||
// talk about `<=`.
|
||||
let span = self.mir.source_info(locations.from_location).span;
|
||||
self.regioncx
|
||||
.add_outlives(span, b_vid, a_vid, locations.at_location);
|
||||
}
|
||||
|
||||
assert!(verifys.is_empty(), "verifys not yet implemented");
|
||||
assert!(
|
||||
givens.is_empty(),
|
||||
"MIR type-checker does not use givens (thank goodness)"
|
||||
);
|
||||
}
|
||||
}
|
||||
|
||||
fn to_region_vid(&self, r: ty::Region<'tcx>) -> ty::RegionVid {
|
||||
// Every region that we see in the constraints came from the
|
||||
// MIR or from the parameter environment. If the former, it
|
||||
// will be a region variable. If the latter, it will be in
|
||||
// the set of free regions *somewhere*.
|
||||
if let ty::ReVar(vid) = r {
|
||||
*vid
|
||||
} else {
|
||||
self.free_regions.indices[&r]
|
||||
}
|
||||
}
|
||||
}
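The direction reversal called out in the comment above (`<=` on the region-constraint side becomes "outlives" on the NLL side) is easy to get backwards; a tiny illustrative sketch (hypothetical plain types, not rustc's) pins it down:

```rust
// Illustrative only: `RegionVid` is a bare index here.
type RegionVid = u32;

enum SubConstraint {
    VarSubVar(RegionVid, RegionVid), // "a <= b": a is a subregion of b
}

#[derive(Debug, PartialEq)]
struct Outlives {
    sup: RegionVid,
    sub: RegionVid,
}

// "a <= b" is recorded for the solver as "b outlives a".
fn translate(c: &SubConstraint) -> Outlives {
    match *c {
        SubConstraint::VarSubVar(a, b) => Outlives { sup: b, sub: a },
    }
}

fn main() {
    let c = SubConstraint::VarSubVar(0, 1);
    assert_eq!(translate(&c), Outlives { sup: 1, sub: 0 });
}
```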
|
File diff suppressed because it is too large
@ -348,7 +348,7 @@ pub fn write_mir_intro<'a, 'gcx, 'tcx>(tcx: TyCtxt<'a, 'gcx, 'tcx>,
    let indented_retptr = format!("{}let mut {:?}: {};",
                                  INDENT,
                                  RETURN_POINTER,
-                                 mir.return_ty);
+                                 mir.local_decls[RETURN_POINTER].ty);
    writeln!(w, "{0:1$} // return pointer",
             indented_retptr,
             ALIGN)?;

@ -471,7 +471,7 @@ impl<'a, 'gcx, 'tcx> FnCtxt<'a, 'gcx, 'tcx> {
        //
        // 2. Things go horribly wrong if we use subtype. The reason for
        // THIS is a fairly subtle case involving bound regions. See the
-       // `givens` field in `region_inference`, as well as the test
+       // `givens` field in `region_constraints`, as well as the test
        // `regions-relate-bound-regions-on-closures-to-inference-variables.rs`,
        // for details. Short version is that we must sometimes detect
        // relationships between specific region variables and regions

@ -10,8 +10,6 @@
|
||||
|
||||
use rustc::hir::{self, ImplItemKind, TraitItemKind};
|
||||
use rustc::infer::{self, InferOk};
|
||||
use rustc::middle::free_region::FreeRegionMap;
|
||||
use rustc::middle::region;
|
||||
use rustc::ty::{self, TyCtxt};
|
||||
use rustc::ty::util::ExplicitSelf;
|
||||
use rustc::traits::{self, ObligationCause, ObligationCauseCode, Reveal};
|
||||
@ -38,8 +36,7 @@ pub fn compare_impl_method<'a, 'tcx>(tcx: TyCtxt<'a, 'tcx, 'tcx>,
|
||||
impl_m_span: Span,
|
||||
trait_m: &ty::AssociatedItem,
|
||||
impl_trait_ref: ty::TraitRef<'tcx>,
|
||||
trait_item_span: Option<Span>,
|
||||
old_broken_mode: bool) {
|
||||
trait_item_span: Option<Span>) {
|
||||
debug!("compare_impl_method(impl_trait_ref={:?})",
|
||||
impl_trait_ref);
|
||||
|
||||
@ -79,8 +76,7 @@ pub fn compare_impl_method<'a, 'tcx>(tcx: TyCtxt<'a, 'tcx, 'tcx>,
|
||||
impl_m,
|
||||
impl_m_span,
|
||||
trait_m,
|
||||
impl_trait_ref,
|
||||
old_broken_mode) {
|
||||
impl_trait_ref) {
|
||||
return;
|
||||
}
|
||||
}
|
||||
@ -89,8 +85,7 @@ fn compare_predicate_entailment<'a, 'tcx>(tcx: TyCtxt<'a, 'tcx, 'tcx>,
|
||||
impl_m: &ty::AssociatedItem,
|
||||
impl_m_span: Span,
|
||||
trait_m: &ty::AssociatedItem,
|
||||
impl_trait_ref: ty::TraitRef<'tcx>,
|
||||
old_broken_mode: bool)
|
||||
impl_trait_ref: ty::TraitRef<'tcx>)
|
||||
-> Result<(), ErrorReported> {
|
||||
let trait_to_impl_substs = impl_trait_ref.substs;
|
||||
|
||||
@ -106,7 +101,6 @@ fn compare_predicate_entailment<'a, 'tcx>(tcx: TyCtxt<'a, 'tcx, 'tcx>,
|
||||
item_name: impl_m.name,
|
||||
impl_item_def_id: impl_m.def_id,
|
||||
trait_item_def_id: trait_m.def_id,
|
||||
lint_id: if !old_broken_mode { Some(impl_m_node_id) } else { None },
|
||||
},
|
||||
};
|
||||
|
||||
@ -342,22 +336,8 @@ fn compare_predicate_entailment<'a, 'tcx>(tcx: TyCtxt<'a, 'tcx, 'tcx>,
|
||||
|
||||
// Finally, resolve all regions. This catches wily misuses of
|
||||
// lifetime parameters.
|
||||
if old_broken_mode {
|
||||
// FIXME(#18937) -- this is how the code used to
|
||||
// work. This is buggy because the fulfillment cx creates
|
||||
// region obligations that get overlooked. The right
|
||||
// thing to do is the code below. But we keep this old
|
||||
// pass around temporarily.
|
||||
let region_scope_tree = region::ScopeTree::default();
|
||||
let mut free_regions = FreeRegionMap::new();
|
||||
free_regions.relate_free_regions_from_predicates(¶m_env.caller_bounds);
|
||||
infcx.resolve_regions_and_report_errors(impl_m.def_id,
|
||||
®ion_scope_tree,
|
||||
&free_regions);
|
||||
} else {
|
||||
let fcx = FnCtxt::new(&inh, param_env, impl_m_node_id);
|
||||
fcx.regionck_item(impl_m_node_id, impl_m_span, &[]);
|
||||
}
|
||||
|
||||
Ok(())
|
||||
})
|
||||
|
@ -137,7 +137,7 @@ mod autoderef;
pub mod dropck;
pub mod _match;
pub mod writeback;
-pub mod regionck;
+mod regionck;
pub mod coercion;
pub mod demand;
pub mod method;

@ -658,29 +658,10 @@ impl<'a, 'gcx, 'tcx> Inherited<'a, 'gcx, 'tcx> {
|
||||
value: &T) -> T
|
||||
where T : TypeFoldable<'tcx>
|
||||
{
|
||||
let ok = self.normalize_associated_types_in_as_infer_ok(span, body_id, param_env, value);
|
||||
let ok = self.partially_normalize_associated_types_in(span, body_id, param_env, value);
|
||||
self.register_infer_ok_obligations(ok)
|
||||
}
|
||||
|
||||
fn normalize_associated_types_in_as_infer_ok<T>(&self,
|
||||
span: Span,
|
||||
body_id: ast::NodeId,
|
||||
param_env: ty::ParamEnv<'tcx>,
|
||||
value: &T)
|
||||
-> InferOk<'tcx, T>
|
||||
where T : TypeFoldable<'tcx>
|
||||
{
|
||||
debug!("normalize_associated_types_in(value={:?})", value);
|
||||
let mut selcx = traits::SelectionContext::new(self);
|
||||
let cause = ObligationCause::misc(span, body_id);
|
||||
let traits::Normalized { value, obligations } =
|
||||
traits::normalize(&mut selcx, param_env, cause, value);
|
||||
debug!("normalize_associated_types_in: result={:?} predicates={:?}",
|
||||
value,
|
||||
obligations);
|
||||
InferOk { value, obligations }
|
||||
}
|
||||
|
||||
/// Replace any late-bound regions bound in `value` with
|
||||
/// free variants attached to `all_outlive_scope`.
|
||||
fn liberate_late_bound_regions<T>(&self,
|
||||
@ -1340,24 +1321,12 @@ fn check_impl_items_against_trait<'a, 'tcx>(tcx: TyCtxt<'a, 'tcx, 'tcx>,
|
||||
hir::ImplItemKind::Method(..) => {
|
||||
let trait_span = tcx.hir.span_if_local(ty_trait_item.def_id);
|
||||
if ty_trait_item.kind == ty::AssociatedKind::Method {
|
||||
let err_count = tcx.sess.err_count();
|
||||
compare_impl_method(tcx,
|
||||
&ty_impl_item,
|
||||
impl_item.span,
|
||||
&ty_trait_item,
|
||||
impl_trait_ref,
|
||||
trait_span,
|
||||
true); // start with old-broken-mode
|
||||
if err_count == tcx.sess.err_count() {
|
||||
// old broken mode did not report an error. Try with the new mode.
|
||||
compare_impl_method(tcx,
|
||||
&ty_impl_item,
|
||||
impl_item.span,
|
||||
&ty_trait_item,
|
||||
impl_trait_ref,
|
||||
trait_span,
|
||||
false); // use the new mode
|
||||
}
|
||||
trait_span);
|
||||
} else {
|
||||
let mut err = struct_span_err!(tcx.sess, impl_item.span, E0324,
|
||||
"item `{}` is an associated method, \
|
||||
@ -1986,7 +1955,7 @@ impl<'a, 'gcx, 'tcx> FnCtxt<'a, 'gcx, 'tcx> {
|
||||
-> InferOk<'tcx, T>
|
||||
where T : TypeFoldable<'tcx>
|
||||
{
|
||||
self.inh.normalize_associated_types_in_as_infer_ok(span,
|
||||
self.inh.partially_normalize_associated_types_in(span,
|
||||
self.body_id,
|
||||
self.param_env,
|
||||
value)
|
||||
|
@ -84,18 +84,14 @@
|
||||
|
||||
use check::dropck;
|
||||
use check::FnCtxt;
|
||||
use middle::free_region::FreeRegionMap;
|
||||
use middle::mem_categorization as mc;
|
||||
use middle::mem_categorization::Categorization;
|
||||
use middle::region;
|
||||
use rustc::hir::def_id::DefId;
|
||||
use rustc::ty::subst::Substs;
|
||||
use rustc::traits;
|
||||
use rustc::ty::{self, Ty, TypeFoldable};
|
||||
use rustc::infer::{self, GenericKind, SubregionOrigin, VerifyBound};
|
||||
use rustc::ty::{self, Ty};
|
||||
use rustc::infer::{self, OutlivesEnvironment};
|
||||
use rustc::ty::adjustment;
|
||||
use rustc::ty::outlives::Component;
|
||||
use rustc::ty::wf;
|
||||
|
||||
use std::mem;
|
||||
use std::ops::Deref;
|
||||
@ -117,7 +113,11 @@ impl<'a, 'gcx, 'tcx> FnCtxt<'a, 'gcx, 'tcx> {
|
||||
pub fn regionck_expr(&self, body: &'gcx hir::Body) {
|
||||
let subject = self.tcx.hir.body_owner_def_id(body.id());
|
||||
let id = body.value.id;
|
||||
let mut rcx = RegionCtxt::new(self, RepeatingScope(id), id, Subject(subject));
|
||||
let mut rcx = RegionCtxt::new(self,
|
||||
RepeatingScope(id),
|
||||
id,
|
||||
Subject(subject),
|
||||
self.param_env);
|
||||
if self.err_count_since_creation() == 0 {
|
||||
// regionck assumes typeck succeeded
|
||||
rcx.visit_body(body);
|
||||
@ -126,7 +126,7 @@ impl<'a, 'gcx, 'tcx> FnCtxt<'a, 'gcx, 'tcx> {
|
||||
rcx.resolve_regions_and_report_errors();
|
||||
|
||||
assert!(self.tables.borrow().free_region_map.is_empty());
|
||||
self.tables.borrow_mut().free_region_map = rcx.free_region_map;
|
||||
self.tables.borrow_mut().free_region_map = rcx.outlives_environment.into_free_region_map();
|
||||
}
|
||||
|
||||
/// Region checking during the WF phase for items. `wf_tys` are the
|
||||
@ -137,37 +137,48 @@ impl<'a, 'gcx, 'tcx> FnCtxt<'a, 'gcx, 'tcx> {
|
||||
wf_tys: &[Ty<'tcx>]) {
|
||||
debug!("regionck_item(item.id={:?}, wf_tys={:?}", item_id, wf_tys);
|
||||
let subject = self.tcx.hir.local_def_id(item_id);
|
||||
let mut rcx = RegionCtxt::new(self, RepeatingScope(item_id), item_id, Subject(subject));
|
||||
rcx.free_region_map.relate_free_regions_from_predicates(
|
||||
&self.param_env.caller_bounds);
|
||||
rcx.relate_free_regions(wf_tys, item_id, span);
|
||||
let mut rcx = RegionCtxt::new(self,
|
||||
RepeatingScope(item_id),
|
||||
item_id,
|
||||
Subject(subject),
|
||||
self.param_env);
|
||||
rcx.outlives_environment.add_implied_bounds(self, wf_tys, item_id, span);
|
||||
rcx.visit_region_obligations(item_id);
|
||||
rcx.resolve_regions_and_report_errors();
|
||||
}
|
||||
|
||||
/// Region check a function body. Not invoked on closures, but
|
||||
/// only on the "root" fn item (in which closures may be
|
||||
/// embedded). Walks the function body and adds various add'l
|
||||
/// constraints that are needed for region inference. This is
|
||||
/// separated both to isolate "pure" region constraints from the
|
||||
/// rest of type check and because sometimes we need type
|
||||
/// inference to have completed before we can determine which
|
||||
/// constraints to add.
|
||||
pub fn regionck_fn(&self,
|
||||
fn_id: ast::NodeId,
|
||||
body: &'gcx hir::Body) {
|
||||
debug!("regionck_fn(id={})", fn_id);
|
||||
let subject = self.tcx.hir.body_owner_def_id(body.id());
|
||||
let node_id = body.value.id;
|
||||
let mut rcx = RegionCtxt::new(self, RepeatingScope(node_id), node_id, Subject(subject));
|
||||
let mut rcx = RegionCtxt::new(self,
|
||||
RepeatingScope(node_id),
|
||||
node_id,
|
||||
Subject(subject),
|
||||
self.param_env);
|
||||
|
||||
if self.err_count_since_creation() == 0 {
|
||||
// regionck assumes typeck succeeded
|
||||
rcx.visit_fn_body(fn_id, body, self.tcx.hir.span(fn_id));
|
||||
}
|
||||
|
||||
rcx.free_region_map.relate_free_regions_from_predicates(
|
||||
&self.param_env.caller_bounds);
|
||||
|
||||
rcx.resolve_regions_and_report_errors();
|
||||
|
||||
// In this mode, we also copy the free-region-map into the
|
||||
// tables of the enclosing fcx. In the other regionck modes
|
||||
// (e.g., `regionck_item`), we don't have an enclosing tables.
|
||||
assert!(self.tables.borrow().free_region_map.is_empty());
|
||||
self.tables.borrow_mut().free_region_map = rcx.free_region_map;
|
||||
self.tables.borrow_mut().free_region_map = rcx.outlives_environment.into_free_region_map();
|
||||
}
|
||||
}
|
||||
|
||||
@ -177,11 +188,9 @@ impl<'a, 'gcx, 'tcx> FnCtxt<'a, 'gcx, 'tcx> {
|
||||
pub struct RegionCtxt<'a, 'gcx: 'a+'tcx, 'tcx: 'a> {
|
||||
pub fcx: &'a FnCtxt<'a, 'gcx, 'tcx>,
|
||||
|
||||
region_bound_pairs: Vec<(ty::Region<'tcx>, GenericKind<'tcx>)>,
|
||||
|
||||
pub region_scope_tree: Rc<region::ScopeTree>,
|
||||
|
||||
free_region_map: FreeRegionMap<'tcx>,
|
||||
outlives_environment: OutlivesEnvironment<'tcx>,
|
||||
|
||||
// id of innermost fn body id
|
||||
body_id: ast::NodeId,
|
||||
@ -197,24 +206,6 @@ pub struct RegionCtxt<'a, 'gcx: 'a+'tcx, 'tcx: 'a> {
|
||||
|
||||
}
|
||||
|
||||
-/// Implied bounds are region relationships that we deduce
-/// automatically. The idea is that (e.g.) a caller must check that a
-/// function's argument types are well-formed immediately before
-/// calling that fn, and hence the *callee* can assume that its
-/// argument types are well-formed. This may imply certain relationships
-/// between generic parameters. For example:
-///
-///     fn foo<'a,T>(x: &'a T)
-///
-/// can only be called with a `'a` and `T` such that `&'a T` is WF.
-/// For `&'a T` to be WF, `T: 'a` must hold. So we can assume `T: 'a`.
-#[derive(Debug)]
-enum ImpliedBound<'tcx> {
-    RegionSubRegion(ty::Region<'tcx>, ty::Region<'tcx>),
-    RegionSubParam(ty::Region<'tcx>, ty::ParamTy),
-    RegionSubProjection(ty::Region<'tcx>, ty::ProjectionTy<'tcx>),
-}
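The doc comment above (removed here, since the implied-bounds machinery is being pulled out of regionck into the shared outlives environment) describes a property that can be demonstrated with ordinary user code, no compiler internals involved: `foo` never declares `T: 'a`, yet its body may rely on it.

```rust
fn requires_outlives<'a, T: 'a>(x: &'a T) -> &'a T {
    x
}

fn foo<'a, T>(x: &'a T) -> &'a T {
    // Allowed: `T: 'a` is implied by the well-formedness of the parameter
    // type `&'a T`, even though `foo` never states that bound.
    requires_outlives(x)
}

fn main() {
    let s = String::from("hi");
    assert_eq!(*foo(&s), "hi");
}
```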
|
||||
|
||||
impl<'a, 'gcx, 'tcx> Deref for RegionCtxt<'a, 'gcx, 'tcx> {
|
||||
type Target = FnCtxt<'a, 'gcx, 'tcx>;
|
||||
fn deref(&self) -> &Self::Target {
|
||||
@ -229,8 +220,11 @@ impl<'a, 'gcx, 'tcx> RegionCtxt<'a, 'gcx, 'tcx> {
|
||||
pub fn new(fcx: &'a FnCtxt<'a, 'gcx, 'tcx>,
|
||||
RepeatingScope(initial_repeating_scope): RepeatingScope,
|
||||
initial_body_id: ast::NodeId,
|
||||
Subject(subject): Subject) -> RegionCtxt<'a, 'gcx, 'tcx> {
|
||||
Subject(subject): Subject,
|
||||
param_env: ty::ParamEnv<'tcx>)
|
||||
-> RegionCtxt<'a, 'gcx, 'tcx> {
|
||||
let region_scope_tree = fcx.tcx.region_scope_tree(subject);
|
||||
let outlives_environment = OutlivesEnvironment::new(param_env);
|
||||
RegionCtxt {
|
||||
fcx,
|
||||
region_scope_tree,
|
||||
@ -238,20 +232,10 @@ impl<'a, 'gcx, 'tcx> RegionCtxt<'a, 'gcx, 'tcx> {
|
||||
body_id: initial_body_id,
|
||||
call_site_scope: None,
|
||||
subject_def_id: subject,
|
||||
region_bound_pairs: Vec::new(),
|
||||
free_region_map: FreeRegionMap::new(),
|
||||
outlives_environment,
|
||||
}
|
||||
}
|
||||
|
||||
fn set_call_site_scope(&mut self, call_site_scope: Option<region::Scope>)
|
||||
-> Option<region::Scope> {
|
||||
mem::replace(&mut self.call_site_scope, call_site_scope)
|
||||
}
|
||||
|
||||
fn set_body_id(&mut self, body_id: ast::NodeId) -> ast::NodeId {
|
||||
mem::replace(&mut self.body_id, body_id)
|
||||
}
|
||||
|
||||
fn set_repeating_scope(&mut self, scope: ast::NodeId) -> ast::NodeId {
|
||||
mem::replace(&mut self.repeating_scope, scope)
|
||||
}
|
||||
@ -295,6 +279,18 @@ impl<'a, 'gcx, 'tcx> RegionCtxt<'a, 'gcx, 'tcx> {
|
||||
self.resolve_type(ty)
|
||||
}
|
||||
|
||||
/// This is the "main" function when region-checking a function item or a closure
|
||||
/// within a function item. It begins by updating various fields (e.g., `call_site_scope`
|
||||
/// and `outlives_environment`) to be appropriate to the function and then adds constraints
|
||||
/// derived from the function body.
|
||||
///
|
||||
/// Note that it does **not** restore the state of the fields that
|
||||
/// it updates! This is intentional, since -- for the main
|
||||
/// function -- we wish to be able to read the final
|
||||
/// `outlives_environment` and other fields from the caller. For
|
||||
/// closures, however, we save and restore any "scoped state"
|
||||
/// before we invoke this function. (See `visit_fn` in the
|
||||
/// `intravisit::Visitor` impl below.)
|
||||
fn visit_fn_body(&mut self,
|
||||
id: ast::NodeId, // the id of the fn itself
|
||||
body: &'gcx hir::Body,
|
||||
@ -304,9 +300,10 @@ impl<'a, 'gcx, 'tcx> RegionCtxt<'a, 'gcx, 'tcx> {
|
||||
debug!("visit_fn_body(id={})", id);
|
||||
|
||||
let body_id = body.id();
|
||||
self.body_id = body_id.node_id;
|
||||
|
||||
let call_site = region::Scope::CallSite(body.value.hir_id.local_id);
|
||||
let old_call_site_scope = self.set_call_site_scope(Some(call_site));
|
||||
self.call_site_scope = Some(call_site);
|
||||
|
||||
let fn_sig = {
|
||||
let fn_hir_id = self.tcx.hir.node_to_hir_id(id);
|
||||
@ -318,8 +315,6 @@ impl<'a, 'gcx, 'tcx> RegionCtxt<'a, 'gcx, 'tcx> {
|
||||
}
|
||||
};
|
||||
|
||||
let old_region_bounds_pairs_len = self.region_bound_pairs.len();
|
||||
|
||||
// Collect the types from which we create inferred bounds.
|
||||
// For the return type, if diverging, substitute `bool` just
|
||||
// because it will have no effect.
|
||||
@ -328,8 +323,11 @@ impl<'a, 'gcx, 'tcx> RegionCtxt<'a, 'gcx, 'tcx> {
|
||||
let fn_sig_tys: Vec<_> =
|
||||
fn_sig.inputs().iter().cloned().chain(Some(fn_sig.output())).collect();
|
||||
|
||||
let old_body_id = self.set_body_id(body_id.node_id);
|
||||
self.relate_free_regions(&fn_sig_tys[..], body_id.node_id, span);
|
||||
self.outlives_environment.add_implied_bounds(
|
||||
self.fcx,
|
||||
&fn_sig_tys[..],
|
||||
body_id.node_id,
|
||||
span);
|
||||
self.link_fn_args(region::Scope::Node(body.value.hir_id.local_id), &body.arguments);
|
||||
self.visit_body(body);
|
||||
self.visit_region_obligations(body_id.node_id);
|
||||
@ -342,11 +340,6 @@ impl<'a, 'gcx, 'tcx> RegionCtxt<'a, 'gcx, 'tcx> {
|
||||
self.type_of_node_must_outlive(infer::CallReturn(span),
|
||||
body_hir_id,
|
||||
call_site_region);
|
||||
|
||||
self.region_bound_pairs.truncate(old_region_bounds_pairs_len);
|
||||
|
||||
self.set_body_id(old_body_id);
|
||||
self.set_call_site_scope(old_call_site_scope);
|
||||
}
|
||||
|
||||
fn visit_region_obligations(&mut self, node_id: ast::NodeId)
|
||||
@ -358,231 +351,17 @@ impl<'a, 'gcx, 'tcx> RegionCtxt<'a, 'gcx, 'tcx> {
|
||||
// obligations. So make sure we process those.
|
||||
self.select_all_obligations_or_error();
|
||||
|
||||
// Make a copy of the region obligations vec because we'll need
|
||||
// to be able to borrow the fulfillment-cx below when projecting.
|
||||
let region_obligations =
|
||||
self.fulfillment_cx
|
||||
.borrow()
|
||||
.region_obligations(node_id)
|
||||
.to_vec();
|
||||
|
||||
for r_o in ®ion_obligations {
|
||||
debug!("visit_region_obligations: r_o={:?} cause={:?}",
|
||||
r_o, r_o.cause);
|
||||
let sup_type = self.resolve_type(r_o.sup_type);
|
||||
let origin = self.code_to_origin(&r_o.cause, sup_type);
|
||||
self.type_must_outlive(origin, sup_type, r_o.sub_region);
|
||||
}
|
||||
|
||||
// Processing the region obligations should not cause the list to grow further:
|
||||
assert_eq!(region_obligations.len(),
|
||||
self.fulfillment_cx.borrow().region_obligations(node_id).len());
|
||||
}
|
||||
|
||||
fn code_to_origin(&self,
|
||||
cause: &traits::ObligationCause<'tcx>,
|
||||
sup_type: Ty<'tcx>)
|
||||
-> SubregionOrigin<'tcx> {
|
||||
SubregionOrigin::from_obligation_cause(cause,
|
||||
|| infer::RelateParamBound(cause.span, sup_type))
|
||||
}
|
||||
|
||||
/// This method populates the region map's `free_region_map`. It walks over the transformed
|
||||
/// argument and return types for each function just before we check the body of that function,
|
||||
/// looking for types where you have a borrowed pointer to other borrowed data (e.g., `&'a &'b
|
||||
/// [usize]`. We do not allow references to outlive the things they point at, so we can assume
|
||||
/// that `'a <= 'b`. This holds for both the argument and return types, basically because, on
|
||||
/// the caller side, the caller is responsible for checking that the type of every expression
|
||||
/// (including the actual values for the arguments, as well as the return type of the fn call)
|
||||
/// is well-formed.
|
||||
///
|
||||
/// Tests: `src/test/compile-fail/regions-free-region-ordering-*.rs`
|
||||
fn relate_free_regions(&mut self,
|
||||
fn_sig_tys: &[Ty<'tcx>],
|
||||
body_id: ast::NodeId,
|
||||
span: Span) {
|
||||
debug!("relate_free_regions >>");
|
||||
|
||||
for &ty in fn_sig_tys {
|
||||
let ty = self.resolve_type(ty);
|
||||
debug!("relate_free_regions(t={:?})", ty);
|
||||
let implied_bounds = self.implied_bounds(body_id, ty, span);
|
||||
|
||||
// But also record other relationships, such as `T:'x`,
|
||||
// that don't go into the free-region-map but which we use
|
||||
// here.
|
||||
for implication in implied_bounds {
|
||||
debug!("implication: {:?}", implication);
|
||||
match implication {
|
||||
ImpliedBound::RegionSubRegion(r_a @ &ty::ReEarlyBound(_),
|
||||
&ty::ReVar(vid_b)) |
|
||||
ImpliedBound::RegionSubRegion(r_a @ &ty::ReFree(_),
|
||||
&ty::ReVar(vid_b)) => {
|
||||
self.add_given(r_a, vid_b);
|
||||
}
|
||||
ImpliedBound::RegionSubParam(r_a, param_b) => {
|
||||
self.region_bound_pairs.push((r_a, GenericKind::Param(param_b)));
|
||||
}
|
||||
ImpliedBound::RegionSubProjection(r_a, projection_b) => {
|
||||
self.region_bound_pairs.push((r_a, GenericKind::Projection(projection_b)));
|
||||
}
|
||||
ImpliedBound::RegionSubRegion(r_a, r_b) => {
|
||||
// In principle, we could record (and take
|
||||
// advantage of) every relationship here, but
|
||||
// we are also free not to -- it simply means
|
||||
// strictly less that we can successfully type
|
||||
// check. Right now we only look for
|
||||
// relationships between free regions. (It may
|
||||
// also be that we should revise our inference
|
||||
// system to be more general and to make use
|
||||
// of *every* relationship that arises here,
|
||||
// but presently we do not.)
|
||||
self.free_region_map.relate_regions(r_a, r_b);
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
debug!("<< relate_free_regions");
|
||||
}
|
||||
|
||||
/// Compute the implied bounds that a callee/impl can assume based on
|
||||
/// the fact that caller/projector has ensured that `ty` is WF. See
|
||||
/// the `ImpliedBound` type for more details.
|
||||
fn implied_bounds(&mut self, body_id: ast::NodeId, ty: Ty<'tcx>, span: Span)
|
||||
-> Vec<ImpliedBound<'tcx>> {
|
||||
// Sometimes when we ask what it takes for T: WF, we get back that
|
||||
// U: WF is required; in that case, we push U onto this stack and
|
||||
// process it next. Currently (at least) these resulting
|
||||
// predicates are always guaranteed to be a subset of the original
|
||||
// type, so we need not fear non-termination.
|
||||
let mut wf_types = vec![ty];
|
||||
|
||||
let mut implied_bounds = vec![];
|
||||
|
||||
while let Some(ty) = wf_types.pop() {
|
||||
// Compute the obligations for `ty` to be well-formed. If `ty` is
|
||||
// an unresolved inference variable, just substituted an empty set
|
||||
// -- because the return type here is going to be things we *add*
|
||||
// to the environment, it's always ok for this set to be smaller
|
||||
// than the ultimate set. (Note: normally there won't be
|
||||
// unresolved inference variables here anyway, but there might be
|
||||
// during typeck under some circumstances.)
|
||||
let obligations =
|
||||
wf::obligations(self, self.fcx.param_env, body_id, ty, span)
|
||||
.unwrap_or(vec![]);
|
||||
|
||||
// NB: All of these predicates *ought* to be easily proven
|
||||
// true. In fact, their correctness is (mostly) implied by
|
||||
// other parts of the program. However, in #42552, we had
|
||||
// an annoying scenario where:
|
||||
//
|
||||
// - Some `T::Foo` gets normalized, resulting in a
|
||||
// variable `_1` and a `T: Trait<Foo=_1>` constraint
|
||||
// (not sure why it couldn't immediately get
|
||||
// solved). This result of `_1` got cached.
|
||||
// - These obligations were dropped on the floor here,
|
||||
// rather than being registered.
|
||||
// - Then later we would get a request to normalize
|
||||
// `T::Foo` which would result in `_1` being used from
|
||||
// the cache, but hence without the `T: Trait<Foo=_1>`
|
||||
// constraint. As a result, `_1` never gets resolved,
|
||||
// and we get an ICE (in dropck).
|
||||
//
|
||||
// Therefore, we register any predicates involving
|
||||
// inference variables. We restrict ourselves to those
|
||||
// involving inference variables both for efficiency and
|
||||
// to avoids duplicate errors that otherwise show up.
|
||||
self.fcx.register_predicates(
|
||||
obligations.iter()
|
||||
.filter(|o| o.predicate.has_infer_types())
|
||||
.cloned());
|
||||
|
||||
// From the full set of obligations, just filter down to the
|
||||
// region relationships.
|
||||
implied_bounds.extend(
|
||||
obligations
|
||||
.into_iter()
|
||||
.flat_map(|obligation| {
|
||||
assert!(!obligation.has_escaping_regions());
|
||||
match obligation.predicate {
|
||||
ty::Predicate::Trait(..) |
|
||||
ty::Predicate::Equate(..) |
|
||||
ty::Predicate::Subtype(..) |
|
||||
ty::Predicate::Projection(..) |
|
||||
ty::Predicate::ClosureKind(..) |
|
||||
ty::Predicate::ObjectSafe(..) |
|
||||
ty::Predicate::ConstEvaluatable(..) =>
|
||||
vec![],
|
||||
|
||||
ty::Predicate::WellFormed(subty) => {
|
||||
wf_types.push(subty);
|
||||
vec![]
|
||||
}
|
||||
|
||||
ty::Predicate::RegionOutlives(ref data) =>
|
||||
match self.tcx.no_late_bound_regions(data) {
|
||||
None =>
|
||||
vec![],
|
||||
Some(ty::OutlivesPredicate(r_a, r_b)) =>
|
||||
vec![ImpliedBound::RegionSubRegion(r_b, r_a)],
|
||||
},
|
||||
|
||||
ty::Predicate::TypeOutlives(ref data) =>
|
||||
match self.tcx.no_late_bound_regions(data) {
|
||||
None => vec![],
|
||||
Some(ty::OutlivesPredicate(ty_a, r_b)) => {
|
||||
let ty_a = self.resolve_type_vars_if_possible(&ty_a);
|
||||
let components = self.tcx.outlives_components(ty_a);
|
||||
self.implied_bounds_from_components(r_b, components)
|
||||
}
|
||||
},
|
||||
}}));
|
||||
}
|
||||
|
||||
implied_bounds
|
||||
}
|
||||
|
||||
/// When we have an implied bound that `T: 'a`, we can further break
|
||||
/// this down to determine what relationships would have to hold for
|
||||
/// `T: 'a` to hold. We get to assume that the caller has validated
|
||||
/// those relationships.
|
||||
fn implied_bounds_from_components(&self,
|
||||
sub_region: ty::Region<'tcx>,
|
||||
sup_components: Vec<Component<'tcx>>)
|
||||
-> Vec<ImpliedBound<'tcx>>
|
||||
{
|
||||
sup_components
|
||||
.into_iter()
|
||||
.flat_map(|component| {
|
||||
match component {
|
||||
Component::Region(r) =>
|
||||
vec![ImpliedBound::RegionSubRegion(sub_region, r)],
|
||||
Component::Param(p) =>
|
||||
vec![ImpliedBound::RegionSubParam(sub_region, p)],
|
||||
Component::Projection(p) =>
|
||||
vec![ImpliedBound::RegionSubProjection(sub_region, p)],
|
||||
Component::EscapingProjection(_) =>
|
||||
// If the projection has escaping regions, don't
|
||||
// try to infer any implied bounds even for its
|
||||
// free components. This is conservative, because
|
||||
// the caller will still have to prove that those
|
||||
// free components outlive `sub_region`. But the
|
||||
// idea is that the WAY that the caller proves
|
||||
// that may change in the future and we want to
|
||||
// give ourselves room to get smarter here.
|
||||
vec![],
|
||||
Component::UnresolvedInferenceVariable(..) =>
|
||||
vec![],
|
||||
}
|
||||
})
|
||||
.collect()
|
||||
self.infcx.process_registered_region_obligations(
|
||||
self.outlives_environment.region_bound_pairs(),
|
||||
self.implicit_region_bound,
|
||||
self.param_env,
|
||||
self.body_id);
|
||||
}
|
||||
|
||||
    fn resolve_regions_and_report_errors(&self) {
        self.fcx.resolve_regions_and_report_errors(self.subject_def_id,
                                                   &self.region_scope_tree,
-                                                  &self.free_region_map);
+                                                  self.outlives_environment.free_region_map());
    }
|
||||
|
||||
fn constrain_bindings_in_pat(&mut self, pat: &hir::Pat) {
|
||||
@ -638,10 +417,28 @@ impl<'a, 'gcx, 'tcx> Visitor<'gcx> for RegionCtxt<'a, 'gcx, 'tcx> {
|
||||
NestedVisitorMap::None
|
||||
}
|
||||
|
||||
fn visit_fn(&mut self, _fk: intravisit::FnKind<'gcx>, _: &'gcx hir::FnDecl,
|
||||
b: hir::BodyId, span: Span, id: ast::NodeId) {
|
||||
let body = self.tcx.hir.body(b);
|
||||
self.visit_fn_body(id, body, span)
|
||||
fn visit_fn(&mut self,
|
||||
fk: intravisit::FnKind<'gcx>,
|
||||
_: &'gcx hir::FnDecl,
|
||||
body_id: hir::BodyId,
|
||||
span: Span,
|
||||
id: ast::NodeId) {
|
||||
assert!(match fk { intravisit::FnKind::Closure(..) => true, _ => false },
|
||||
"visit_fn invoked for something other than a closure");
|
||||
|
||||
// Save state of current function before invoking
|
||||
// `visit_fn_body`. We will restore afterwards.
|
||||
let old_body_id = self.body_id;
|
||||
let old_call_site_scope = self.call_site_scope;
|
||||
let env_snapshot = self.outlives_environment.push_snapshot_pre_closure();
|
||||
|
||||
let body = self.tcx.hir.body(body_id);
|
||||
self.visit_fn_body(id, body, span);
|
||||
|
||||
// Restore state from previous function.
|
||||
self.outlives_environment.pop_snapshot_post_closure(env_snapshot);
|
||||
self.call_site_scope = old_call_site_scope;
|
||||
self.body_id = old_body_id;
|
||||
}
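The save/restore dance in `visit_fn` above (snapshot before descending into a closure, roll back afterwards) is the standard pattern for scoped state; here is a minimal sketch with an invented `Env` type standing in for the outlives environment:

```rust
// Invented stand-in: the only scoped state is a vector of region-bound pairs.
struct Env {
    bound_pairs: Vec<(u32, u32)>,
}

impl Env {
    // "Snapshot" is just the current length...
    fn push_snapshot_pre_closure(&self) -> usize {
        self.bound_pairs.len()
    }

    // ...and rolling back truncates whatever the closure body added.
    fn pop_snapshot_post_closure(&mut self, len: usize) {
        self.bound_pairs.truncate(len);
    }
}

fn main() {
    let mut env = Env { bound_pairs: vec![(0, 1)] };

    let snapshot = env.push_snapshot_pre_closure();
    env.bound_pairs.push((2, 3)); // bounds discovered while inside the closure
    assert_eq!(env.bound_pairs.len(), 2);

    env.pop_snapshot_post_closure(snapshot);
    assert_eq!(env.bound_pairs, vec![(0, 1)]); // outer state is restored
}
```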
|
||||
|
||||
//visit_pat: visit_pat, // (..) see above
|
||||
@ -1137,6 +934,27 @@ impl<'a, 'gcx, 'tcx> RegionCtxt<'a, 'gcx, 'tcx> {
|
||||
self.type_must_outlive(origin, ty, minimum_lifetime);
|
||||
}
|
||||
|
||||
/// Adds constraints to inference such that `T: 'a` holds (or
|
||||
/// reports an error if it cannot).
|
||||
///
|
||||
/// # Parameters
|
||||
///
|
||||
/// - `origin`, the reason we need this constraint
|
||||
/// - `ty`, the type `T`
|
||||
/// - `region`, the region `'a`
|
||||
pub fn type_must_outlive(&self,
|
||||
origin: infer::SubregionOrigin<'tcx>,
|
||||
ty: Ty<'tcx>,
|
||||
region: ty::Region<'tcx>)
|
||||
{
|
||||
self.infcx.type_must_outlive(self.outlives_environment.region_bound_pairs(),
|
||||
self.implicit_region_bound,
|
||||
self.param_env,
|
||||
origin,
|
||||
ty,
|
||||
region);
|
||||
}
|
||||
|
||||
/// Computes the guarantor for an expression `&base` and then ensures that the lifetime of the
|
||||
/// resulting pointer is linked to the lifetime of its guarantor (if any).
|
||||
fn link_addr_of(&mut self, expr: &hir::Expr,
|
||||
@ -1492,345 +1310,4 @@ impl<'a, 'gcx, 'tcx> RegionCtxt<'a, 'gcx, 'tcx> {
|
||||
self.type_must_outlive(origin.clone(), ty, expr_region);
|
||||
}
|
||||
}
|
||||
|
||||
/// Ensures that type is well-formed in `region`, which implies (among
|
||||
/// other things) that all borrowed data reachable via `ty` outlives
|
||||
/// `region`.
|
||||
pub fn type_must_outlive(&self,
|
||||
origin: infer::SubregionOrigin<'tcx>,
|
||||
ty: Ty<'tcx>,
|
||||
region: ty::Region<'tcx>)
|
||||
{
|
||||
let ty = self.resolve_type(ty);
|
||||
|
||||
debug!("type_must_outlive(ty={:?}, region={:?}, origin={:?})",
|
||||
ty,
|
||||
region,
|
||||
origin);
|
||||
|
||||
assert!(!ty.has_escaping_regions());
|
||||
|
||||
let components = self.tcx.outlives_components(ty);
|
||||
self.components_must_outlive(origin, components, region);
|
||||
}
|
||||
|
||||
fn components_must_outlive(&self,
|
||||
origin: infer::SubregionOrigin<'tcx>,
|
||||
components: Vec<Component<'tcx>>,
|
||||
region: ty::Region<'tcx>)
|
||||
{
|
||||
for component in components {
|
||||
let origin = origin.clone();
|
||||
match component {
|
||||
Component::Region(region1) => {
|
||||
self.sub_regions(origin, region, region1);
|
||||
}
|
||||
Component::Param(param_ty) => {
|
||||
self.param_ty_must_outlive(origin, region, param_ty);
|
||||
}
|
||||
Component::Projection(projection_ty) => {
|
||||
self.projection_must_outlive(origin, region, projection_ty);
|
||||
}
|
||||
Component::EscapingProjection(subcomponents) => {
|
||||
self.components_must_outlive(origin, subcomponents, region);
|
||||
}
|
||||
Component::UnresolvedInferenceVariable(v) => {
|
||||
// ignore this, we presume it will yield an error
|
||||
// later, since if a type variable is not resolved by
|
||||
// this point it never will be
|
||||
self.tcx.sess.delay_span_bug(
|
||||
origin.span(),
|
||||
&format!("unresolved inference variable in outlives: {:?}", v));
|
||||
}
|
||||
}
|
||||
}
|
||||
}
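`type_must_outlive` and `components_must_outlive` above work by decomposing a type into "outlives components" and constraining each one. A toy decomposition (invented `SimpleTy`, and deliberately ignoring projections and inference variables, which are the hard cases handled above) shows the shape of it:

```rust
#[derive(Debug, PartialEq)]
enum Component {
    Region(&'static str),
    Param(&'static str),
}

#[derive(Debug)]
enum SimpleTy {
    Ref(&'static str, Box<SimpleTy>), // &'r inner
    Param(&'static str),              // a generic type parameter
    Tuple(Vec<SimpleTy>),
}

// Collect the components whose outlives obligations together imply `ty: 'x`.
fn components(ty: &SimpleTy, out: &mut Vec<Component>) {
    match ty {
        SimpleTy::Ref(r, inner) => {
            out.push(Component::Region(*r));
            components(inner, out);
        }
        SimpleTy::Param(p) => out.push(Component::Param(*p)),
        SimpleTy::Tuple(tys) => {
            for t in tys {
                components(t, out);
            }
        }
    }
}

fn main() {
    // "&'a (T, &'b U) : 'x" decomposes into 'a: 'x, T: 'x, 'b: 'x, U: 'x.
    let ty = SimpleTy::Ref(
        "'a",
        Box::new(SimpleTy::Tuple(vec![
            SimpleTy::Param("T"),
            SimpleTy::Ref("'b", Box::new(SimpleTy::Param("U"))),
        ])),
    );
    let mut out = vec![];
    components(&ty, &mut out);
    assert_eq!(
        out,
        vec![
            Component::Region("'a"),
            Component::Param("T"),
            Component::Region("'b"),
            Component::Param("U"),
        ]
    );
}
```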
|
||||
|
||||
fn param_ty_must_outlive(&self,
|
||||
origin: infer::SubregionOrigin<'tcx>,
|
||||
region: ty::Region<'tcx>,
|
||||
param_ty: ty::ParamTy) {
|
||||
debug!("param_ty_must_outlive(region={:?}, param_ty={:?}, origin={:?})",
|
||||
region, param_ty, origin);
|
||||
|
||||
let verify_bound = self.param_bound(param_ty);
|
||||
let generic = GenericKind::Param(param_ty);
|
||||
self.verify_generic_bound(origin, generic, region, verify_bound);
|
||||
}
|
||||
|
||||
fn projection_must_outlive(&self,
|
||||
origin: infer::SubregionOrigin<'tcx>,
|
||||
region: ty::Region<'tcx>,
|
||||
projection_ty: ty::ProjectionTy<'tcx>)
|
||||
{
|
||||
debug!("projection_must_outlive(region={:?}, projection_ty={:?}, origin={:?})",
|
||||
region, projection_ty, origin);
|
||||
|
||||
// This case is thorny for inference. The fundamental problem is
|
||||
// that there are many cases where we have choice, and inference
|
||||
// doesn't like choice (the current region inference in
|
||||
// particular). :) First off, we have to choose between using the
|
||||
// OutlivesProjectionEnv, OutlivesProjectionTraitDef, and
|
||||
// OutlivesProjectionComponent rules, any one of which is
|
||||
// sufficient. If there are no inference variables involved, it's
|
||||
// not hard to pick the right rule, but if there are, we're in a
|
||||
// bit of a catch 22: if we picked which rule we were going to
|
||||
// use, we could add constraints to the region inference graph
|
||||
// that make it apply, but if we don't add those constraints, the
|
||||
// rule might not apply (but another rule might). For now, we err
|
||||
// on the side of adding too few edges into the graph.
|
||||
|
||||
// Compute the bounds we can derive from the environment or trait
|
||||
// definition. We know that the projection outlives all the
|
||||
// regions in this list.
|
||||
let env_bounds = self.projection_declared_bounds(origin.span(), projection_ty);
|
||||
|
||||
debug!("projection_must_outlive: env_bounds={:?}",
|
||||
env_bounds);
|
||||
|
||||
// If we know that the projection outlives 'static, then we're
|
||||
// done here.
|
||||
if env_bounds.contains(&&ty::ReStatic) {
|
||||
debug!("projection_must_outlive: 'static as declared bound");
|
||||
return;
|
||||
}
|
||||
|
||||
// If declared bounds list is empty, the only applicable rule is
|
||||
// OutlivesProjectionComponent. If there are inference variables,
|
||||
// then, we can break down the outlives into more primitive
|
||||
// components without adding unnecessary edges.
|
||||
//
|
||||
// If there are *no* inference variables, however, we COULD do
|
||||
// this, but we choose not to, because the error messages are less
|
||||
// good. For example, a requirement like `T::Item: 'r` would be
|
||||
// translated to a requirement that `T: 'r`; when this is reported
|
||||
// to the user, it will thus say "T: 'r must hold so that T::Item:
|
||||
// 'r holds". But that makes it sound like the only way to fix
|
||||
// the problem is to add `T: 'r`, which isn't true. So, if there are no
|
||||
// inference variables, we use a verify constraint instead of adding
|
||||
// edges, which winds up enforcing the same condition.
|
||||
let needs_infer = projection_ty.needs_infer();
|
||||
if env_bounds.is_empty() && needs_infer {
|
||||
debug!("projection_must_outlive: no declared bounds");
|
||||
|
||||
for component_ty in projection_ty.substs.types() {
|
||||
self.type_must_outlive(origin.clone(), component_ty, region);
|
||||
}
|
||||
|
||||
for r in projection_ty.substs.regions() {
|
||||
self.sub_regions(origin.clone(), region, r);
|
||||
}
|
||||
|
||||
return;
|
||||
}
|
||||
|
||||
// If we find that there is a unique declared bound `'b`, and this bound
|
||||
// appears in the trait reference, then the best action is to require that `'b:'r`,
|
||||
// so do that. This is best no matter what rule we use:
|
||||
//
|
||||
// - OutlivesProjectionEnv or OutlivesProjectionTraitDef: these would translate to
|
||||
// the requirement that `'b:'r`
|
||||
// - OutlivesProjectionComponent: this would require `'b:'r` in addition to
|
||||
// other conditions
|
||||
if !env_bounds.is_empty() && env_bounds[1..].iter().all(|b| *b == env_bounds[0]) {
|
||||
let unique_bound = env_bounds[0];
|
||||
debug!("projection_must_outlive: unique declared bound = {:?}", unique_bound);
|
||||
if projection_ty.substs.regions().any(|r| env_bounds.contains(&r)) {
|
||||
debug!("projection_must_outlive: unique declared bound appears in trait ref");
|
||||
self.sub_regions(origin.clone(), region, unique_bound);
|
||||
return;
|
||||
}
|
||||
}
|
||||
|
||||
// Fallback to verifying after the fact that there exists a
|
||||
// declared bound, or that all the components appearing in the
|
||||
// projection outlive; in some cases, this may add insufficient
|
||||
// edges into the inference graph, leading to inference failures
|
||||
// even though a satisfactory solution exists.
|
||||
let verify_bound = self.projection_bound(origin.span(), env_bounds, projection_ty);
|
||||
let generic = GenericKind::Projection(projection_ty);
|
||||
self.verify_generic_bound(origin, generic.clone(), region, verify_bound);
|
||||
}

    fn type_bound(&self, span: Span, ty: Ty<'tcx>) -> VerifyBound<'tcx> {
        match ty.sty {
            ty::TyParam(p) => {
                self.param_bound(p)
            }
            ty::TyProjection(data) => {
                let declared_bounds = self.projection_declared_bounds(span, data);
                self.projection_bound(span, declared_bounds, data)
            }
            _ => {
                self.recursive_type_bound(span, ty)
            }
        }
    }

    fn param_bound(&self, param_ty: ty::ParamTy) -> VerifyBound<'tcx> {
        debug!("param_bound(param_ty={:?})",
               param_ty);

        let mut param_bounds = self.declared_generic_bounds_from_env(GenericKind::Param(param_ty));

        // Add in the default bound of fn body that applies to all in
        // scope type parameters:
        param_bounds.extend(self.implicit_region_bound);

        VerifyBound::AnyRegion(param_bounds)
    }

    fn projection_declared_bounds(&self,
                                  span: Span,
                                  projection_ty: ty::ProjectionTy<'tcx>)
                                  -> Vec<ty::Region<'tcx>>
    {
        // First assemble bounds from where clauses and traits.

        let mut declared_bounds =
            self.declared_generic_bounds_from_env(GenericKind::Projection(projection_ty));

        declared_bounds.extend_from_slice(
            &self.declared_projection_bounds_from_trait(span, projection_ty));

        declared_bounds
    }

    fn projection_bound(&self,
                        span: Span,
                        declared_bounds: Vec<ty::Region<'tcx>>,
                        projection_ty: ty::ProjectionTy<'tcx>)
                        -> VerifyBound<'tcx> {
        debug!("projection_bound(declared_bounds={:?}, projection_ty={:?})",
               declared_bounds, projection_ty);

        // see the extensive comment in projection_must_outlive
        let ty = self.tcx.mk_projection(projection_ty.item_def_id, projection_ty.substs);
        let recursive_bound = self.recursive_type_bound(span, ty);

        VerifyBound::AnyRegion(declared_bounds).or(recursive_bound)
    }

    fn recursive_type_bound(&self, span: Span, ty: Ty<'tcx>) -> VerifyBound<'tcx> {
        let mut bounds = vec![];

        for subty in ty.walk_shallow() {
            bounds.push(self.type_bound(span, subty));
        }

        let mut regions = ty.regions();
        regions.retain(|r| !r.is_late_bound()); // ignore late-bound regions
        bounds.push(VerifyBound::AllRegions(regions));

        // remove bounds that must hold, since they are not interesting
        bounds.retain(|b| !b.must_hold());

        if bounds.len() == 1 {
            bounds.pop().unwrap()
        } else {
            VerifyBound::AllBounds(bounds)
        }
    }
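The `VerifyBound` values assembled above are, roughly, small boolean formulas over regions that are checked once region inference has produced final values. A simplified sketch of the shape assumed by the code above (the compiler's real definition carries more variants and detail):

```rust
// Simplified sketch, not the compiler's actual definition. `R` stands in
// for an interned region.
enum VerifyBound<R> {
    AnyRegion(Vec<R>),              // ok if *some* listed region outlives the tested region
    AllRegions(Vec<R>),             // ok if *every* listed region outlives the tested region
    AnyBound(Vec<VerifyBound<R>>),  // ok if *any* sub-bound holds
    AllBounds(Vec<VerifyBound<R>>), // ok if *all* sub-bounds hold
}

impl<R> VerifyBound<R> {
    // Corresponds to the `.or(..)` used in `projection_bound` above:
    // either alternative is sufficient.
    fn or(self, other: VerifyBound<R>) -> VerifyBound<R> {
        VerifyBound::AnyBound(vec![self, other])
    }
}

fn main() {
    // Toy usage with strings standing in for regions.
    let b = VerifyBound::AnyRegion(vec!["'a"])
        .or(VerifyBound::AllRegions(vec!["'b", "'c"]));
    if let VerifyBound::AnyBound(parts) = b {
        println!("either of {} alternatives suffices", parts.len());
    }
}
```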

    fn declared_generic_bounds_from_env(&self, generic: GenericKind<'tcx>)
                                        -> Vec<ty::Region<'tcx>>
    {
        let param_env = &self.param_env;

        // To start, collect bounds from the user:
        let mut param_bounds = self.tcx.required_region_bounds(generic.to_ty(self.tcx),
                                                               param_env.caller_bounds.to_vec());

        // Next, collect regions we scraped from the well-formedness
        // constraints in the fn signature. To do that, we walk the list
        // of known relations from the fn ctxt.
        //
        // This is crucial because otherwise code like this fails:
        //
        //     fn foo<'a, A>(x: &'a A) { x.bar() }
        //
        // The problem is that the type of `x` is `&'a A`. For that type to
        // be well-formed, `A` must outlive `'a`, but we don't know that
        // this holds from first principles.
        for &(r, p) in &self.region_bound_pairs {
            debug!("generic={:?} p={:?}",
                   generic,
                   p);
            if generic == p {
                param_bounds.push(r);
            }
        }

        param_bounds
    }
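The `fn foo<'a, A>(x: &'a A)` example from the comment can be written out as a complete program (illustrative only; the `Bar` trait and its `impl` are made up). There is no explicit `A: 'a` where-clause; the bound is scraped from the well-formedness of the argument type `&'a A`, exactly as described above:

```rust
trait Bar {
    fn bar(&self) {}
}

impl Bar for i32 {}

fn foo<'a, A: Bar>(x: &'a A) {
    // Usable without declaring `A: 'a`, thanks to the bound implied by
    // the signature type `&'a A`.
    x.bar()
}

fn main() {
    foo(&42i32);
}
```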

    fn declared_projection_bounds_from_trait(&self,
                                             span: Span,
                                             projection_ty: ty::ProjectionTy<'tcx>)
                                             -> Vec<ty::Region<'tcx>>
    {
        debug!("projection_bounds(projection_ty={:?})",
               projection_ty);
        let ty = self.tcx.mk_projection(projection_ty.item_def_id, projection_ty.substs);

        // Say we have a projection `<T as SomeTrait<'a>>::SomeType`. We are interested
        // in looking for a trait definition like:
        //
        // ```
        // trait SomeTrait<'a> {
        //     type SomeType : 'a;
        // }
        // ```
        //
        // we can thus deduce that `<T as SomeTrait<'a>>::SomeType : 'a`.
        let trait_predicates = self.tcx.predicates_of(projection_ty.trait_ref(self.tcx).def_id);
        assert_eq!(trait_predicates.parent, None);
        let predicates = trait_predicates.predicates.as_slice().to_vec();
        traits::elaborate_predicates(self.tcx, predicates)
            .filter_map(|predicate| {
                // we're only interested in `T : 'a` style predicates:
                let outlives = match predicate {
                    ty::Predicate::TypeOutlives(data) => data,
                    _ => { return None; }
                };

                debug!("projection_bounds: outlives={:?} (1)",
                       outlives);

                // apply the substitutions (and normalize any projected types)
                let outlives = self.instantiate_type_scheme(span,
                                                            projection_ty.substs,
                                                            &outlives);

                debug!("projection_bounds: outlives={:?} (2)",
                       outlives);

                let region_result = self.commit_if_ok(|_| {
                    let (outlives, _) =
                        self.replace_late_bound_regions_with_fresh_var(
                            span,
                            infer::AssocTypeProjection(projection_ty.item_def_id),
                            &outlives);

                    debug!("projection_bounds: outlives={:?} (3)",
                           outlives);

                    // check whether this predicate applies to our current projection
                    let cause = self.fcx.misc(span);
                    match self.at(&cause, self.fcx.param_env).eq(outlives.0, ty) {
                        Ok(ok) => Ok((ok, outlives.1)),
                        Err(_) => Err(())
                    }
                }).map(|(ok, result)| {
                    self.register_infer_ok_obligations(ok);
                    result
                });

                debug!("projection_bounds: region_result={:?}",
                       region_result);

                region_result.ok()
            })
            .collect()
    }
}
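The `SomeTrait` sketch from the comment, written out as a runnable program (hypothetical names throughout). Because the trait declares `type SomeType: 'a`, the checker may conclude `<T as SomeTrait<'a>>::SomeType: 'a` for every implementor, which is what makes the reference type in `pick` well-formed without any extra where-clause:

```rust
trait SomeTrait<'a> {
    // Declared bound: every implementor's `SomeType` outlives `'a`.
    type SomeType: 'a;
}

struct Holder;

impl<'a> SomeTrait<'a> for Holder {
    type SomeType = &'a u32;
}

// `<T as SomeTrait<'a>>::SomeType: 'a` follows from the trait declaration
// alone, so `&'a T::SomeType` is well-formed here.
fn pick<'a, T: SomeTrait<'a>>(x: &'a T::SomeType) -> &'a T::SomeType {
    x
}

fn main() {
    let n = 1u32;
    let m = &n;
    let r = pick::<Holder>(&m);
    assert_eq!(**r, 1);
}
```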

@@ -75,6 +75,7 @@ This API is completely unstable and subject to change.
#![feature(advanced_slice_patterns)]
#![feature(box_patterns)]
#![feature(box_syntax)]
#![feature(crate_visibility_modifier)]
#![feature(conservative_impl_trait)]
#![feature(match_default_bindings)]
#![feature(never_type)]

@@ -27,7 +27,6 @@ trait A<'a> {

impl<'a> A<'a> for B {
    fn foo<F>(&mut self, f: F) //~ ERROR impl has stricter
    //~^ WARNING future release
        where F: fmt::Debug + 'static,
    {
        self.list.push(Box::new(f));
@@ -26,9 +26,9 @@ fn main() {

// END RUST SOURCE
// START rustc.use_x.nll.0.mir
// | '_#0r: {bb0[0], bb0[1], '_#0r}
// | '_#1r: {bb0[0], bb0[1], '_#0r, '_#1r}
// | '_#2r: {bb0[0], bb0[1], '_#2r}
// ...
// fn use_x(_1: &'_#0r mut i32, _2: &'_#1r u32, _3: &'_#0r u32, _4: &'_#2r u32) -> bool {
// | '_#0r: {bb0[0], bb0[1], '_#0r, '_#1r, '_#2r, '_#3r}
// | '_#1r: {bb0[0], bb0[1], '_#1r}
// | '_#2r: {bb0[0], bb0[1], '_#1r, '_#2r}
// | '_#3r: {bb0[0], bb0[1], '_#3r}
// fn use_x(_1: &'_#1r mut i32, _2: &'_#2r u32, _3: &'_#1r u32, _4: &'_#3r u32) -> bool {
// END rustc.use_x.nll.0.mir

@@ -28,12 +28,12 @@ fn main() {

// END RUST SOURCE
// START rustc.main.nll.0.mir
// | '_#5r: {bb0[6], bb0[7], bb0[8], bb0[9], bb0[10], bb0[11], bb0[12], bb0[13], bb0[14]}
// | '_#6r: {bb0[6], bb0[7], bb0[8], bb0[9], bb0[10], bb0[11], bb0[12], bb0[13], bb0[14]}
// ...
// | '_#7r: {bb0[11], bb0[12], bb0[13], bb0[14]}
// | '_#8r: {bb0[11], bb0[12], bb0[13], bb0[14]}
// END rustc.main.nll.0.mir
// START rustc.main.nll.0.mir
// let _2: &'_#5r mut i32;
// let _2: &'_#6r mut i32;
// ...
// let _4: &'_#7r mut i32;
// let _4: &'_#8r mut i32;
// END rustc.main.nll.0.mir

@@ -31,15 +31,15 @@ fn main() {

// END RUST SOURCE
// START rustc.main.nll.0.mir
// | '_#0r: {bb1[1], bb2[0], bb2[1]}
// | '_#1r: {bb1[1], bb2[0], bb2[1]}
// | '_#2r: {bb1[1], bb2[0], bb2[1]}
// ...
// let _2: &'_#1r usize;
// let _2: &'_#2r usize;
// END rustc.main.nll.0.mir
// START rustc.main.nll.0.mir
// bb1: {
//     | Live variables at bb1[0]: [_1, _3]
//     _2 = &'_#0r _1[_3];
//     _2 = &'_#1r _1[_3];
//     | Live variables at bb1[1]: [_2]
//     switchInt(const true) -> [0u8: bb3, otherwise: bb2];
// }

@@ -44,5 +44,5 @@ unsafe impl<#[may_dangle] T> Drop for Wrap<T> {

// END RUST SOURCE
// START rustc.main.nll.0.mir
// | '_#4r: {bb1[3], bb1[4], bb1[5], bb2[0], bb2[1]}
// | '_#5r: {bb1[3], bb1[4], bb1[5], bb2[0], bb2[1]}
// END rustc.main.nll.0.mir

@@ -46,5 +46,5 @@ impl<T> Drop for Wrap<T> {

// END RUST SOURCE
// START rustc.main.nll.0.mir
// | '_#4r: {bb1[3], bb1[4], bb1[5], bb2[0], bb2[1], bb2[2], bb3[0], bb3[1], bb3[2], bb4[0], bb4[1], bb4[2], bb6[0], bb7[0], bb7[1], bb7[2], bb8[0]}
// | '_#5r: {bb1[3], bb1[4], bb1[5], bb2[0], bb2[1], bb2[2], bb3[0], bb3[1], bb3[2], bb4[0], bb4[1], bb4[2], bb6[0], bb7[0], bb7[1], bb7[2], bb8[0]}
// END rustc.main.nll.0.mir

@@ -36,14 +36,14 @@ fn main() {

// END RUST SOURCE
// START rustc.main.nll.0.mir
// | '_#0r: {bb1[1], bb2[0], bb2[1]}
// | '_#1r: {bb1[1], bb2[0], bb2[1]}
// ...
// | '_#2r: {bb7[2], bb7[3], bb7[4]}
// | '_#3r: {bb1[1], bb2[0], bb2[1], bb7[2], bb7[3], bb7[4]}
// | '_#3r: {bb7[2], bb7[3], bb7[4]}
// | '_#4r: {bb1[1], bb2[0], bb2[1], bb7[2], bb7[3], bb7[4]}
// ...
// let mut _2: &'_#3r usize;
// let mut _2: &'_#4r usize;
// ...
// _2 = &'_#0r _1[_3];
// _2 = &'_#1r _1[_3];
// ...
// _2 = &'_#2r (*_11);
// _2 = &'_#3r (*_11);
// END rustc.main.nll.0.mir

@@ -32,16 +32,16 @@ fn main() {

// END RUST SOURCE
// START rustc.main.nll.0.mir
// | '_#0r: {bb1[1], bb1[2], bb1[3], bb1[4], bb1[5], bb1[6], bb2[0], bb2[1]}
// | '_#1r: {bb1[1], bb1[2], bb1[3], bb1[4], bb1[5], bb1[6], bb2[0], bb2[1]}
// | '_#2r: {bb1[5], bb1[6], bb2[0], bb2[1]}
// | '_#2r: {bb1[1], bb1[2], bb1[3], bb1[4], bb1[5], bb1[6], bb2[0], bb2[1]}
// | '_#3r: {bb1[5], bb1[6], bb2[0], bb2[1]}
// END rustc.main.nll.0.mir
// START rustc.main.nll.0.mir
// let _2: &'_#1r usize;
// let _2: &'_#2r usize;
// ...
// let _6: &'_#2r usize;
// let _6: &'_#3r usize;
// ...
// _2 = &'_#0r _1[_3];
// _2 = &'_#1r _1[_3];
// ...
// _7 = _2;
// ...

src/test/run-pass/implied-bounds-closure-arg-outlives.rs (new file, 44 lines)
@@ -0,0 +1,44 @@
// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.

// Test that we are able to handle the relationships between free
// regions bound in a closure callback.

#[derive(Copy, Clone)]
struct MyCx<'short, 'long: 'short> {
    short: &'short u32,
    long: &'long u32,
}

impl<'short, 'long> MyCx<'short, 'long> {
    fn short(self) -> &'short u32 { self.short }
    fn long(self) -> &'long u32 { self.long }
    fn set_short(&mut self, v: &'short u32) { self.short = v; }
}

fn with<F, R>(op: F) -> R
where
    F: for<'short, 'long> FnOnce(MyCx<'short, 'long>) -> R,
{
    op(MyCx {
        short: &22,
        long: &22,
    })
}

fn main() {
    with(|mut cx| {
        // For this to type-check, we need to be able to deduce that
        // the lifetime of `l` can be `'short`, even though it has
        // input from `'long`.
        let l = if true { cx.long() } else { cx.short() };
        cx.set_short(l);
    });
}
@@ -42,7 +42,7 @@ impl<'a,'tcx> Foo<'a,'tcx> {
            // inferring `'_2` to be `'static` in this case, because
            // it is created outside the closure but then related to
            // regions bound by the closure itself. See the
            // `region_inference.rs` file (and the `givens` field, in
            // `region_constraints.rs` file (and the `givens` field, in
            // particular) for more details.
            this.foo()
        }))
@@ -1,4 +1,4 @@
error: impl has stricter requirements than trait
error[E0276]: impl has stricter requirements than trait
--> $DIR/proj-outlives-region.rs:19:5
|
14 | fn foo() where T: 'a;
@@ -6,10 +6,6 @@ error: impl has stricter requirements than trait
...
19 | fn foo() where U: 'a { } //~ ERROR E0276
| ^^^^^^^^^^^^^^^^^^^^^^^^ impl has extra requirement `U: 'a`
|
= note: #[deny(extra_requirement_in_impl)] on by default
= warning: this was previously accepted by the compiler but is being phased out; it will become a hard error in a future release!
= note: for more information, see issue #37166 <https://github.com/rust-lang/rust/issues/37166>

error: aborting due to previous error

@@ -1,4 +1,4 @@
error: impl has stricter requirements than trait
error[E0276]: impl has stricter requirements than trait
--> $DIR/region-unrelated.rs:19:5
|
14 | fn foo() where T: 'a;
@@ -6,10 +6,6 @@ error: impl has stricter requirements than trait
...
19 | fn foo() where V: 'a { }
| ^^^^^^^^^^^^^^^^^^^^^^^^ impl has extra requirement `V: 'a`
|
= note: #[deny(extra_requirement_in_impl)] on by default
= warning: this was previously accepted by the compiler but is being phased out; it will become a hard error in a future release!
= note: for more information, see issue #37166 <https://github.com/rust-lang/rust/issues/37166>

error: aborting due to previous error

src/test/ui/nll/get_default.rs (new file, 53 lines)
@@ -0,0 +1,53 @@
// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.

// Basic test for free regions in the NLL code. This test ought to
// report an error due to a reborrowing constraint. Right now, we get
// a variety of errors from the older, AST-based machinery (notably
// borrowck), and then we get the NLL error at the end.

// compile-flags:-Znll -Zborrowck-mir

struct Map {
}

impl Map {
    fn get(&self) -> Option<&String> { None }
    fn set(&mut self, v: String) { }
}

fn ok(map: &mut Map) -> &String {
    loop {
        match map.get() {
            Some(v) => {
                return v;
            }
            None => {
                map.set(String::new()); // Just AST errors here
            }
        }
    }
}

fn err(map: &mut Map) -> &String {
    loop {
        match map.get() {
            Some(v) => {
                map.set(String::new()); // Both AST and MIR error here
                return v;
            }
            None => {
                map.set(String::new()); // Just AST errors here
            }
        }
    }
}

fn main() { }

src/test/ui/nll/get_default.stderr (new file, 47 lines)
@@ -0,0 +1,47 @@
error[E0502]: cannot borrow `*map` as mutable because it is also borrowed as immutable (Ast)
--> $DIR/get_default.rs:33:17
|
28 | match map.get() {
| --- immutable borrow occurs here
...
33 | map.set(String::new()); // Just AST errors here
| ^^^ mutable borrow occurs here
...
37 | }
| - immutable borrow ends here

error[E0502]: cannot borrow `*map` as mutable because it is also borrowed as immutable (Ast)
--> $DIR/get_default.rs:43:17
|
41 | match map.get() {
| --- immutable borrow occurs here
42 | Some(v) => {
43 | map.set(String::new()); // Both AST and MIR error here
| ^^^ mutable borrow occurs here
...
51 | }
| - immutable borrow ends here

error[E0502]: cannot borrow `*map` as mutable because it is also borrowed as immutable (Ast)
--> $DIR/get_default.rs:47:17
|
41 | match map.get() {
| --- immutable borrow occurs here
...
47 | map.set(String::new()); // Just AST errors here
| ^^^ mutable borrow occurs here
...
51 | }
| - immutable borrow ends here

error[E0502]: cannot borrow `(*map)` as mutable because it is also borrowed as immutable (Mir)
--> $DIR/get_default.rs:43:17
|
41 | match map.get() {
| --- immutable borrow occurs here
42 | Some(v) => {
43 | map.set(String::new()); // Both AST and MIR error here
| ^^^ mutable borrow occurs here

error: aborting due to 4 previous errors