Auto merge of #44505 - nikomatsakis:lotsa-comments, r=steveklabnik

rework the README.md for rustc and add other readmes

OK, so, long ago I committed to the idea of trying to write some high-level documentation for rustc. This has proved to be much harder for me to get done than I thought it would! This PR is far from as complete as I had hoped, but I wanted to open it so that people can give me feedback on the conventions that it establishes. If this seems like a good way forward, we can land it and I will open an issue with a good check-list of things to write (and try to take down some of them myself).

Here are the conventions I established on which I would like feedback.

**Use README.md files**. First off, I'm aiming to keep most of the high-level docs in `README.md` files, rather than entries on forge. My thought is that such files are (a) more discoverable than forge and (b) closer to the code, and hence can be edited in a single PR. However, since they are not *in the code*, they will naturally get out of date, so the intention is to focus on the highest-level details, which are least likely to bitrot. I've included a few examples of common functions and so forth, but I never tried to exhaustively list every function by name.
    - I would like to use the tidy scripts to try and check that these do not go out of date. Future work.

**librustc/README.md as the main entrypoint.** This seems like the most natural place people will look first. It lays out how the crates are structured and **is intended** to give pointers to the main data structures of the compiler (I didn't update that yet; the existing material is terribly dated).

**A glossary listing abbreviations and things.** It's much harder to read code if you don't know what some obscure set of letters like `infcx` stands for.

**Major modules each have their own README.md that documents the high-level idea.** For example, I wrote some stuff about `hir` and `ty`. Both of them have many missing topics, but I think that is roughly the level of depth that would be good. The idea is to give people a "feeling" for what the code does.

What is missing primarily here is lots of content. =) Here are some things I'd like to see:

- A description of what a QUERY is and how to define one
    - Some comments for `librustc/ty/maps.rs`
- An overview of how compilation proceeds now (i.e., the hybrid demand-driven and forward model) and how we would like to see it going in the future (all demand-driven)
- Some coverage of how incremental will work under red-green
- An updated list of the major IRs in use in the compiler (AST, HIR, TypeckTables, MIR) and major bits of interesting code (typeck, borrowck, etc)
- More advice on how to use `x.py`, or at least pointers to that
- Good choice for `config.toml`
- How to use `RUST_LOG` and other debugging flags (e.g., `-Zverbose`, `-Ztreat-err-as-bug`)
- Helpful conventions for `debug!` statement formatting

cc @rust-lang/compiler @mgattozzi
bors 2017-09-19 22:43:58 +00:00
commit f60bc3ac0c
20 changed files with 2555 additions and 1741 deletions


@@ -13,162 +13,191 @@ https://github.com/rust-lang/rust/issues
Your concerns are probably the same as someone else's.
You may also be interested in the
[Rust Forge](https://forge.rust-lang.org/), which includes a number of
interesting bits of information.
Finally, at the end of this file is a GLOSSARY defining a number of
common (and not necessarily obvious!) names that are used in the Rust
compiler code. If you see some funky name and you'd like to know what
it stands for, check there!
The crates of rustc
===================
Rustc consists of a number of crates, including `libsyntax`,
`librustc`, `librustc_back`, `librustc_trans`, and `librustc_driver`
(the names and divisions are not set in stone and may change;
in general, a finer-grained division of crates is preferable):
Rustc consists of a number of crates, including `syntax`,
`rustc`, `rustc_back`, `rustc_trans`, `rustc_driver`, and
many more. The source for each crate can be found in a directory
like `src/libXXX`, where `XXX` is the crate name.
- [`libsyntax`][libsyntax] contains those things concerned purely with syntax --
that is, the AST, parser, pretty-printer, lexer, macro expander, and
utilities for traversing ASTs. These live in a separate crate called
"syntax", whose files are in `./../libsyntax`, where `.` is the
current directory (that is, the parent directory of front/, middle/,
back/, and so on).
(NB. The names and divisions of these crates are not set in
stone and may change over time -- for the time being, we tend towards
a finer-grained division to help with compilation time, though as
incremental improves that may change.)
- `librustc` (the current directory) contains the high-level analysis
passes, such as the type checker, borrow checker, and so forth.
It is the heart of the compiler.
The dependency structure of these crates is roughly a diamond:
- [`librustc_back`][back] contains some very low-level details that are
specific to different LLVM targets and so forth.
```
                     rustc_driver
                      /   |   \
                     /    |    \
                    /     |     \
                   /      v      \
     rustc_trans  rustc_borrowck ...  rustc_metadata
                   \      |      /
                    \     |     /
                     \    |    /
                      \   v   /
                        rustc
                          |
                          v
                       syntax
                       /    \
                      /      \
               syntax_pos  syntax_ext
```
- [`librustc_trans`][trans] contains the code to convert from Rust IR into LLVM
IR, and then from LLVM IR into machine code, as well as the main
driver that orchestrates all the other passes and various other bits
of miscellany. In general it contains code that runs towards the
end of the compilation process.
The `rustc_driver` crate, at the top of this lattice, is effectively
the "main" function for the rust compiler. It doesn't have much "real
code", but instead ties together all of the code defined in the other
crates and defines the overall flow of execution. (As we transition
more and more to the [query model](ty/maps/README.md), however, the
"flow" of compilation is becoming less centrally defined.)
- [`librustc_driver`][driver] invokes the compiler from
[`libsyntax`][libsyntax], then the analysis phases from `librustc`, and
finally the lowering and codegen passes from [`librustc_trans`][trans].
At the other extreme, the `rustc` crate defines the common and
pervasive data structures that all the rest of the compiler uses
(e.g., how to represent types, traits, and the program itself). It
also contains some amount of the compiler itself, although that is
relatively limited.
Roughly speaking the "order" of the three crates is as follows:
Finally, all the crates in the bulge in the middle define the bulk of
the compiler -- they all depend on `rustc`, so that they can make use
of the various types defined there, and they export public routines
that `rustc_driver` will invoke as needed (more and more, what these
crates export are "query definitions", but those are covered later
on).
                  librustc_driver
                         |
   +---------------------+---------------------+
   |                                           |
libsyntax   ->   librustc   ->   librustc_trans
Below `rustc` lie various crates that make up the parser and error
reporting mechanism. For historical reasons, these crates do not have
the `rustc_` prefix, but they are really just as much an internal part
of the compiler and not intended to be stable (though they do wind up
getting used by some crates in the wild; a practice we hope to
gradually phase out).
Each crate has a `README.md` file that describes, at a high-level,
what it contains, and tries to give some kind of explanation (some
better than others).
The compiler process:
=====================
The compiler process
====================
The Rust compiler is comprised of six main compilation phases.
The Rust compiler is in a bit of transition right now. It used to be a
purely "pass-based" compiler, where we ran a number of passes over the
entire program, and each did a particular check or transformation.
1. Parsing input
2. Configuration & expanding (cfg rules & syntax extension expansion)
3. Running analysis passes
4. Translation to LLVM
5. LLVM passes
6. Linking
We are gradually replacing this pass-based code with an alternative
setup based on on-demand **queries**. In the query-model, we work
backwards, executing a *query* that expresses our ultimate goal (e.g.,
"compile this crate"). This query in turn may make other queries
(e.g., "get me a list of all modules in the crate"). Those queries
make other queries that ultimately bottom out in the base operations,
like parsing the input, running the type-checker, and so forth. This
on-demand model permits us to do exciting things like only do the
minimal amount of work needed to type-check a single function. It also
helps with incremental compilation. (For details on defining queries,
check out `src/librustc/ty/maps/README.md`.)
Phase one is responsible for parsing & lexing the input to the compiler. The
output of this phase is an abstract syntax tree (AST). The AST at this point
includes all macro uses & attributes. This means code which will be later
expanded and/or removed due to `cfg` attributes is still present in this
version of the AST. Parsing abstracts away details about individual files which
have been read into the AST.
Regardless of the general setup, the basic operations that the
compiler must perform are the same. The only thing that changes is
whether these operations are invoked front-to-back, or on demand. In
order to compile a Rust crate, these are the general steps that we
take:
Phase two handles configuration and macro expansion. You can think of this
phase as a function acting on the AST from the previous phase. The input for
this phase is the unexpanded AST from phase one, and the output is an expanded
version of the same AST. This phase will expand all macros & syntax
extensions and will evaluate all `cfg` attributes, potentially removing some
code. The resulting AST will not contain any macros or `macro_use` statements.
1. **Parsing input**
- this processes the `.rs` files and produces the AST ("abstract syntax tree")
- the AST is defined in `syntax/ast.rs`. It is intended to match the lexical
syntax of the Rust language quite closely.
2. **Name resolution, macro expansion, and configuration**
- once parsing is complete, we process the AST recursively, resolving paths
and expanding macros. This same process also processes `#[cfg]` nodes, and hence
may strip things out of the AST as well.
3. **Lowering to HIR**
- Once name resolution completes, we convert the AST into the HIR,
or "high-level IR". The HIR is defined in `src/librustc/hir/`; that module also includes
the lowering code.
- The HIR is a lightly desugared variant of the AST. It is more processed than the
AST and more suitable for the analyses that follow. It is **not** required to match
the syntax of the Rust language.
- As a simple example, in the **AST**, we preserve the parentheses
that the user wrote, so `((1 + 2) + 3)` and `1 + 2 + 3` parse
into distinct trees, even though they are equivalent. In the
HIR, however, parentheses nodes are removed, and those two
expressions are represented in the same way.
4. **Type-checking and subsequent analyses**
- An important step in processing the HIR is to perform type
checking. This process assigns types to every HIR expression,
for example, and also is responsible for resolving some
"type-dependent" paths, such as field accesses (`x.f` -- we
can't know what field `f` is being accessed until we know the
type of `x`) and associated type references (`T::Item` -- we
can't know what type `Item` is until we know what `T` is).
- Type checking creates "side-tables" (`TypeckTables`) that include
the types of expressions, the way to resolve methods, and so forth.
- After type-checking, we can do other analyses, such as privacy checking.
5. **Lowering to MIR and post-processing**
- Once type-checking is done, we can lower the HIR into MIR ("middle IR"), which
is a **very** desugared version of Rust, well suited to borrowck but also to
certain high-level optimizations.
6. **Translation to LLVM and LLVM optimizations**
- From MIR, we can produce LLVM IR.
- LLVM then runs its various optimizations, which produces a number of `.o` files
(one for each "codegen unit").
7. **Linking**
- Finally, those `.o` files are linked together.
The code for these first two phases is in [`libsyntax`][libsyntax].
Glossary
========
After this phase, the compiler allocates IDs to each node in the AST
(technically not every node, but most of them). If we are writing out
dependencies, that happens now.
The compiler uses a number of...idiosyncratic abbreviations and
things. This glossary attempts to list them and give you a few
pointers for understanding them better.
The third phase is analysis. This is the most complex phase in the compiler,
and makes up much of the code. This phase includes name resolution, type
checking, borrow checking, type & lifetime inference, trait selection, method
selection, linting and so on. Most of the error detection in the compiler comes
from this phase (with the exception of parse errors which arise during
parsing). The "output" of this phase is a set of side tables containing
semantic information about the source program. The analysis code is in
[`librustc`][rustc] and some other crates with the `librustc_` prefix.
The fourth phase is translation. This phase translates the AST (and the side
tables from the previous phase) into LLVM IR (intermediate representation).
This is achieved by calling into the LLVM libraries. The code for this is in
[`librustc_trans`][trans].
Phase five runs the LLVM backend. This runs LLVM's optimization passes on the
generated IR and generates machine code resulting in object files. This phase
is not really part of the Rust compiler, as LLVM carries out all the work.
The interface between LLVM and Rust is in [`librustc_llvm`][llvm].
The final phase, phase six, links the object files into an executable. This is
again outsourced to other tools and not performed by the Rust compiler
directly. The interface is in [`librustc_back`][back] (which also contains some
things used primarily during translation).
A module called the driver coordinates all these phases. It handles all the
highest level coordination of compilation from parsing command line arguments
all the way to invoking the linker to produce an executable.
Modules in the librustc crate
=============================
The librustc crate itself consists of the following submodules
(mostly, but not entirely, in their own directories):
- session: options and data that pertain to the compilation session as
a whole
- middle: middle-end: name resolution, typechecking, LLVM code
generation
- metadata: encoder and decoder for data required by separate
compilation
- plugin: infrastructure for compiler plugins
- lint: infrastructure for compiler warnings
- util: ubiquitous types and helper functions
- lib: bindings to LLVM
The entry-point for the compiler is main() in the [`librustc_driver`][driver]
crate.
The 3 central data structures:
------------------------------
1. `./../libsyntax/ast.rs` defines the AST. The AST is treated as
immutable after parsing, but it depends on mutable context data
structures (mainly hash maps) to give it meaning.
- Many though not all nodes within this data structure are
wrapped in the type `spanned<T>`, meaning that the front-end has
marked the input coordinates of that node. The member `node` is
the data itself, the member `span` is the input location (file,
line, column; both low and high).
- Many other nodes within this data structure carry a
`def_id`. These nodes represent the 'target' of some name
reference elsewhere in the tree. When the AST is resolved, by
`middle/resolve.rs`, all names wind up acquiring a def that they
point to. So anything that can be pointed-to by a name winds
up with a `def_id`.
2. `middle/ty.rs` defines the datatype `sty`. This is the type that
represents types after they have been resolved and normalized by
the middle-end. The typeck phase converts every ast type to a
`ty::sty`, and the latter is used to drive later phases of
compilation. Most variants in the `ast::ty` tag have a
corresponding variant in the `ty::sty` tag.
3. `./../librustc_llvm/lib.rs` defines the exported types
`ValueRef`, `TypeRef`, `BasicBlockRef`, and several others.
Each of these is an opaque pointer to an LLVM type,
manipulated through the `lib::llvm` interface.
[libsyntax]: https://github.com/rust-lang/rust/tree/master/src/libsyntax/
[trans]: https://github.com/rust-lang/rust/tree/master/src/librustc_trans/
[llvm]: https://github.com/rust-lang/rust/tree/master/src/librustc_llvm/
[back]: https://github.com/rust-lang/rust/tree/master/src/librustc_back/
[rustc]: https://github.com/rust-lang/rust/tree/master/src/librustc/
[driver]: https://github.com/rust-lang/rust/tree/master/src/librustc_driver
- AST -- the **abstract syntax tree** produced by the `syntax` crate; reflects user syntax
very closely.
- codegen unit -- when we produce LLVM IR, we group the Rust code into a number of codegen
units. Each of these units is processed by LLVM independently from one another,
enabling parallelism. They are also the unit of incremental re-use.
- cx -- we tend to use "cx" as an abbreviation for context. See also tcx, infcx, etc.
- `DefId` -- an index identifying a **definition** (see `librustc/hir/def_id.rs`). Uniquely
identifies a `DefPath`.
- HIR -- the **High-level IR**, created by lowering and desugaring the AST. See `librustc/hir`.
- `HirId` -- identifies a particular node in the HIR by combining a
def-id with an "intra-definition offset".
- `'gcx` -- the lifetime of the global arena (see `librustc/ty`).
- generics -- the set of generic type parameters defined on a type or item
- ICE -- internal compiler error. When the compiler crashes.
- infcx -- the inference context (see `librustc/infer`)
- MIR -- the **Mid-level IR** that is created after type-checking for use by borrowck and trans.
Defined in the `src/librustc/mir/` module, but much of the code that manipulates it is
found in `src/librustc_mir`.
- obligation -- something that must be proven by the trait system; see `librustc/traits`.
- local crate -- the crate currently being compiled.
- node-id or `NodeId` -- an index identifying a particular node in the
AST or HIR; gradually being phased out and replaced with `HirId`.
- query -- perhaps some sub-computation during compilation; see `librustc/maps`.
- provider -- the function that executes a query; see `librustc/maps`.
- sess -- the **compiler session**, which stores global data used throughout compilation
- side tables -- because the AST and HIR are immutable once created, we often carry extra
information about them in the form of hashtables, indexed by the id of a particular node.
- span -- a location in the user's source code, used for error
reporting primarily. These are like a file-name/line-number/column
tuple on steroids: they carry a start/end point, and also track
macro expansions and compiler desugaring. All while being packed
into a few bytes (really, it's an index into a table). See the
`Span` datatype for more.
- substs -- the **substitutions** for a given generic type or item
(e.g., the `i32, u32` in `HashMap<i32, u32>`)
- tcx -- the "typing context", main data structure of the compiler (see `librustc/ty`).
- trans -- the code to **translate** MIR into LLVM IR.
- trait reference -- a trait and values for its type parameters (see `librustc/ty`).
- ty -- the internal representation of a **type** (see `librustc/ty`).

src/librustc/hir/README.md Normal file

@@ -0,0 +1,119 @@
# Introduction to the HIR
The HIR -- "High-level IR" -- is the primary IR used in most of
rustc. It is a desugared version of the "abstract syntax tree" (AST)
that is generated after parsing, macro expansion, and name resolution
have completed. Many parts of HIR resemble Rust surface syntax quite
closely, with the exception that some of Rust's expression forms have
been desugared away (as an example, `for` loops are converted into a
`loop` and do not appear in the HIR).
This README covers the main concepts of the HIR.
### Out-of-band storage and the `Crate` type
The top-level data-structure in the HIR is the `Crate`, which stores
the contents of the crate currently being compiled (we only ever
construct HIR for the current crate). Whereas in the AST the crate
data structure basically just contains the root module, the HIR
`Crate` structure contains a number of maps and other things that
serve to organize the content of the crate for easier access.
For example, the contents of individual items (e.g., modules,
functions, traits, impls, etc) in the HIR are not immediately
accessible in the parents. So, for example, if we had a module item `foo`
containing a function `bar()`:
```
mod foo {
    fn bar() { }
}
```
Then in the HIR the representation of module `foo` (the `Mod`
struct) would have only the **`ItemId`** `I` of `bar()`. To get the
details of the function `bar()`, we would lookup `I` in the
`items` map.
One nice result from this representation is that one can iterate
over all items in the crate by iterating over the key-value pairs
in these maps (without the need to trawl through the IR in total).
There are similar maps for things like trait items and impl items,
as well as "bodies" (explained below).
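For example, a rough sketch of walking every item via those maps (assuming the
in-tree `use hir;` import and a `krate: &hir::Crate` in scope; the exact key
type of the `items` map -- `ItemId` vs. a plain node-id -- is glossed over here):
```rust
// Sketch only: iterate the crate's items without traversing the HIR tree.
fn for_each_item(krate: &hir::Crate) {
    for (id, item) in &krate.items {
        // `item` is the full `hir::Item`; `id` is the key it is stored under.
        let _ = (id, item);
    }
}
```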
The other reason to set up the representation this way is for better
integration with incremental compilation. This way, if you gain access
to a `&hir::Item` (e.g. for the mod `foo`), you do not immediately
gain access to the contents of the function `bar()`. Instead, you only
gain access to the **id** for `bar()`, and you must invoke some
function to lookup the contents of `bar()` given its id; this gives us
a chance to observe that you accessed the data for `bar()` and record
the dependency.
### Identifiers in the HIR
Most of the code that has to deal with things in HIR tends not to
carry around references into the HIR, but rather to carry around
*identifier numbers* (or just "ids"). Right now, you will find four
sorts of identifiers in active use (a rough sketch of their shapes
follows this list):
- `DefId`, which primarily name "definitions" or top-level items.
- You can think of a `DefId` as being shorthand for a very explicit
and complete path, like `std::collections::HashMap`. However,
these paths are able to name things that are not nameable in
normal Rust (e.g., impls), and they also include extra information
about the crate (such as its version number, as two versions of
the same crate can co-exist).
- A `DefId` really consists of two parts, a `CrateNum` (which
identifies the crate) and a `DefIndex` (which indexes into a list
of items that is maintained per crate).
- `HirId`, which combines the index of a particular item with an
offset within that item.
- the key point of a `HirId` is that it is *relative* to some item (which is named
via a `DefId`).
- `BodyId`, this is an absolute identifier that refers to a specific
body (definition of a function or constant) in the crate. It is currently
effectively a "newtype'd" `NodeId`.
- `NodeId`, which is an absolute id that identifies a single node in the HIR tree.
- While these are still in common use, **they are being slowly phased out**.
- Since they are absolute within the crate, adding a new node
anywhere in the tree causes the node-ids of all subsequent code in
the crate to change. This is terrible for incremental compilation,
as you can perhaps imagine.
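To make those descriptions concrete, here is a rough conceptual sketch of the
shape of these identifiers. The field names and placeholder aliases are
illustrative only; see `librustc/hir/def_id.rs` and `librustc/hir/mod.rs` for
the real definitions.
```rust
// Placeholder aliases so the sketch is self-contained; the real types are
// dedicated index newtypes, not plain integers.
type CrateNum = u32;
type DefIndex = u32;
type ItemLocalId = u32;

// Absolute, cross-crate name for a definition.
struct DefId {
    krate: CrateNum, // which crate the definition lives in
    index: DefIndex, // index into that crate's list of definitions
}

// Relative identifier: "this node inside that definition".
struct HirId {
    owner: DefIndex,       // the item (definition) the node belongs to
    local_id: ItemLocalId, // the "intra-definition offset" within that item
}
```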
### HIR Map
Most of the time when you are working with the HIR, you will do so via
the **HIR Map**, accessible in the tcx via `tcx.hir` (and defined in
the `hir::map` module). The HIR map contains a number of methods to
convert between ids of various kinds and to lookup data associated
with a HIR node.
For example, if you have a `DefId`, and you would like to convert it
to a `NodeId`, you can use `tcx.hir.as_local_node_id(def_id)`. This
returns an `Option<NodeId>` -- this will be `None` if the def-id
refers to something outside of the current crate (since then it has no
HIR node), but otherwise returns `Some(n)` where `n` is the node-id of
the definition.
Similarly, you can use `tcx.hir.find(n)` to lookup the node for a
`NodeId`. This returns an `Option<Node<'tcx>>`, where `Node` is an enum
defined in the map; by matching on this you can find out what sort of
node the node-id referred to and also get a pointer to the data
itself. Often, you know what sort of node `n` is -- e.g., if you know
that `n` must be some HIR expression, you can do
`tcx.hir.expect_expr(n)`, which will extract and return the
`&hir::Expr`, panicking if `n` is not in fact an expression.
Finally, you can use the HIR map to find the parents of nodes, via
calls like `tcx.hir.get_parent_node(n)`.
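Putting those calls together, a hedged sketch of typical HIR-map usage
(assuming the usual `librustc` imports for `TyCtxt` and `DefId`):
```rust
// Sketch only: the calls are the ones named above; error handling is minimal.
fn inspect_local_def<'a, 'gcx, 'tcx>(tcx: TyCtxt<'a, 'gcx, 'tcx>, def_id: DefId) {
    // `None` means the def-id refers to another crate, which has no HIR here.
    if let Some(node_id) = tcx.hir.as_local_node_id(def_id) {
        if let Some(node) = tcx.hir.find(node_id) {
            // `node` says what kind of HIR node this is (item, expr, ...).
            let parent_id = tcx.hir.get_parent_node(node_id);
            let _ = (node, parent_id);
        }
    }
}
```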
### HIR Bodies
A **body** represents some kind of executable code, such as the body
of a function/closure or the definition of a constant. Bodies are
associated with an **owner**, which is typically some kind of item
(e.g., a `fn()` or `const`), but could also be a closure expression
(e.g., `|x, y| x + y`). You can use the HIR map to find the body
associated with a given def-id (`maybe_body_owned_by()`) or to find
the owner of a body (`body_owner_def_id()`).
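For example, a hedged sketch of going from the node of a function-like owner to
its body and back (the exact id types these helpers expect -- node-id vs.
def-id -- are assumptions here, as are the imports for `TyCtxt` and `NodeId`):
```rust
// Sketch only: `tcx` and `node_id` are assumed to be passed in by the caller.
fn inspect_body<'a, 'gcx, 'tcx>(tcx: TyCtxt<'a, 'gcx, 'tcx>, node_id: NodeId) {
    if let Some(body_id) = tcx.hir.maybe_body_owned_by(node_id) {
        let body = tcx.hir.body(body_id); // argument patterns + the value expression
        let owner_def_id = tcx.hir.body_owner_def_id(body_id); // the owning fn/const/closure
        let _ = (body, owner_def_id);
    }
}
```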


@@ -0,0 +1,4 @@
The HIR map, accessible via `tcx.hir`, allows you to quickly navigate the
HIR and convert between various forms of identifiers. See [the HIR README] for more information.
[the HIR README]: ../README.md


@@ -413,6 +413,10 @@ pub struct WhereEqPredicate {
pub type CrateConfig = HirVec<P<MetaItem>>;
/// The top-level data structure that stores the entire contents of
/// the crate currently being compiled.
///
/// For more details, see [the module-level README](README.md).
#[derive(Clone, PartialEq, Eq, RustcEncodable, RustcDecodable, Debug)]
pub struct Crate {
pub module: Mod,
@@ -927,7 +931,27 @@ pub struct BodyId {
pub node_id: NodeId,
}
/// The body of a function or constant value.
/// The body of a function, closure, or constant value. In the case of
/// a function, the body contains not only the function body itself
/// (which is an expression), but also the argument patterns, since
/// those are something that the caller doesn't really care about.
///
/// # Examples
///
/// ```
/// fn foo((x, y): (u32, u32)) -> u32 {
///     x + y
/// }
/// ```
///
/// Here, the `Body` associated with `foo()` would contain:
///
/// - an `arguments` array containing the `(x, y)` pattern
/// - a `value` containing the `x + y` expression (maybe wrapped in a block)
/// - `is_generator` would be false
///
/// All bodies have an **owner**, which can be accessed via the HIR
/// map using `body_owner_def_id()`.
#[derive(Clone, PartialEq, Eq, RustcEncodable, RustcDecodable, Hash, Debug)]
pub struct Body {
pub arguments: HirVec<Arg>,


@@ -8,7 +8,28 @@
// option. This file may not be copied, modified, or distributed
// except according to those terms.
//! The Rust compiler.
//! The "main crate" of the Rust compiler. This crate contains common
//! type definitions that are used by the other crates in the rustc
//! "family". Some prominent examples follow (note that each of these modules
//! has its own README with further details):
//!
//! - **HIR.** The "high-level (H) intermediate representation (IR)" is
//! defined in the `hir` module.
//! - **MIR.** The "mid-level (M) intermediate representation (IR)" is
//! defined in the `mir` module. This module contains only the
//! *definition* of the MIR; the passes that transform and operate
//! on MIR are found in the `librustc_mir` crate.
//! - **Types.** The internal representation of types used in rustc is
//! defined in the `ty` module. This includes the **type context**
//! (or `tcx`), which is the central context during most of
//! compilation, containing the interners and other things.
//! - **Traits.** Trait resolution is implemented in the `traits` module.
//! - **Type inference.** The type inference code can be found in the `infer` module;
//! this code handles low-level equality and subtyping operations. The
//! type check pass in the compiler is found in the `librustc_typeck` crate.
//!
//! For a deeper explanation of how the compiler works and is
//! organized, see the README.md file in this directory.
//!
//! # Note
//!

src/librustc/ty/README.md Normal file

@@ -0,0 +1,165 @@
# Types and the Type Context
The `ty` module defines how the Rust compiler represents types
internally. It also defines the *typing context* (`tcx` or `TyCtxt`),
which is the central data structure in the compiler.
## The tcx and how it uses lifetimes
The `tcx` ("typing context") is the central data structure in the
compiler. It is the context that you use to perform all manner of
queries. The struct `TyCtxt` defines a reference to this shared context:
```rust
tcx: TyCtxt<'a, 'gcx, 'tcx>
//          --  ----  ----
//          |   |     |
//          |   |     innermost arena lifetime (if any)
//          |   "global arena" lifetime
//          lifetime of this reference
```
As you can see, the `TyCtxt` type takes three lifetime parameters.
These lifetimes are perhaps the most complex thing to understand about
the tcx. During Rust compilation, we allocate most of our memory in
**arenas**, which are basically pools of memory that get freed all at
once. When you see a reference with a lifetime like `'tcx` or `'gcx`,
you know that it refers to arena-allocated data (or data that lives as
long as the arenas, anyhow).
We use two distinct levels of arenas. The outer level is the "global
arena". This arena lasts for the entire compilation: so anything you
allocate in there is only freed once compilation is basically over
(actually, when we shift to executing LLVM).
To reduce peak memory usage, when we do type inference, we also use an
inner level of arena. These arenas get thrown away once type inference
is over. This is done because type inference generates a lot of
"throw-away" types that are not particularly interesting after type
inference completes, so keeping around those allocations would be
wasteful.
Often, we wish to write code that explicitly asserts that it is not
taking place during inference. In that case, there is no "local"
arena, and all the types that you can access are allocated in the
global arena. To express this, the idea is to use the same lifetime
for the `'gcx` and `'tcx` parameters of `TyCtxt`. Just to be a touch
confusing, we tend to use the name `'tcx` in such contexts. Here is an
example:
```rust
fn not_in_inference<'a, 'tcx>(tcx: TyCtxt<'a, 'tcx, 'tcx>, def_id: DefId) {
    //                                        ----  ----
    //                                        Using the same lifetime here asserts
    //                                        that the innermost arena accessible through
    //                                        this reference *is* the global arena.
}
```
In contrast, if we want code that can be used during type inference, then we
need to declare distinct `'gcx` and `'tcx` lifetime parameters:
```rust
fn maybe_in_inference<'a, 'gcx, 'tcx>(tcx: TyCtxt<'a, 'gcx, 'tcx>, def_id: DefId) {
    //                                                ----  ----
    //                                                Using different lifetimes here means that
    //                                                the innermost arena *may* be distinct
    //                                                from the global arena (but doesn't have to be).
}
```
### Allocating and working with types
Rust types are represented using the `Ty<'tcx>` defined in the `ty`
module (not to be confused with the `Ty` struct from [the HIR]). This
is in fact a simple type alias for a reference with `'tcx` lifetime:
```rust
pub type Ty<'tcx> = &'tcx TyS<'tcx>;
```
[the HIR]: ../hir/README.md
You can basically ignore the `TyS` struct -- you will basically never
access it explicitly. We always pass it by reference using the
`Ty<'tcx>` alias -- the only exception I think is to define inherent
methods on types. Instances of `TyS` are only ever allocated in one of
the rustc arenas (never e.g. on the stack).
One common operation on types is to **match** and see what kinds of
types they are. This is done by doing `match ty.sty`, sort of like this:
```rust
fn test_type<'tcx>(ty: Ty<'tcx>) {
    match ty.sty {
        ty::TyArray(elem_ty, len) => { ... }
        ...
    }
}
```
The `sty` field (the origin of this name is unclear to me; perhaps
structural type?) is of type `TypeVariants<'tcx>`, which is an enum
defining all of the different kinds of types in the compiler.
> NB: inspecting the `sty` field on types during type inference can be
> risky, as there may be inference variables and other things to
> consider, or sometimes types are not yet known but will become
> known later.
To allocate a new type, you can use the various `mk_` methods defined
on the `tcx`. These have names that correspond mostly to the various kinds
of type variants. For example:
```rust
let array_ty = tcx.mk_array(elem_ty, len * 2);
```
These methods all return a `Ty<'tcx>` -- note that the lifetime you
get back is the lifetime of the innermost arena that this `tcx` has
access to. In fact, types are always canonicalized and interned (so we
never allocate exactly the same type twice) and are always allocated
in the outermost arena where they can be (so, if they do not contain
any inference variables or other "temporary" types, they will be
allocated in the global arena). However, the lifetime `'tcx` is always
a safe approximation, so that is what you get back.
> NB. Because types are interned, it is possible to compare them for
> equality efficiently using `==` -- however, this is almost never what
> you want to do unless you happen to be hashing and looking for
> duplicates. This is because often in Rust there are multiple ways to
> represent the same type, particularly once inference is involved. If
> you are going to be testing for type equality, you probably need to
> start looking into the inference code to do it right.
You can also find various common types in the tcx itself by accessing
`tcx.types.bool`, `tcx.types.char`, etc (see `CommonTypes` for more).
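As a small hedged sketch tying these pieces together (assuming a `tcx` in
scope, as in the examples above):
```rust
let bool_ty = tcx.types.bool;            // a pre-interned common type
let array_ty = tcx.mk_array(bool_ty, 4); // the type `[bool; 4]`

// Because types are interned, constructing the "same" type again yields the
// same value, so `==` here is effectively a pointer comparison (but see the
// caveat above about when comparing types with `==` is actually meaningful).
let array_ty_again = tcx.mk_array(tcx.types.bool, 4);
assert!(array_ty == array_ty_again);
```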
### Beyond types: Other kinds of arena-allocated data structures
In addition to types, there are a number of other arena-allocated data
structures that you can allocate, and which are found in this
module. Here are a few examples:
- `Substs`, allocated with `mk_substs` -- this will intern a slice of types, often used to
specify the values to be substituted for generics (e.g., `HashMap<i32, u32>`
would be represented as a slice `&'tcx [tcx.types.i32, tcx.types.u32]`).
- `TraitRef`, typically passed by value -- a **trait reference**
consists of a reference to a trait along with its various type
parameters (including `Self`), like `i32: Display` (here, the def-id
would reference the `Display` trait, and the substs would contain
`i32`).
- `Predicate` defines something the trait system has to prove (see `traits` module).
### Import conventions
Although there is no hard and fast rule, the `ty` module tends to be used like so:
```rust
use ty::{self, Ty, TyCtxt};
```
In particular, since they are so common, the `Ty` and `TyCtxt` types
are imported directly. Other types are often referenced with an
explicit `ty::` prefix (e.g., `ty::TraitRef<'tcx>`). But some modules
choose to import a larger or smaller set of names explicitly.


@@ -793,9 +793,10 @@ impl<'tcx> CommonTypes<'tcx> {
}
}
/// The data structure to keep track of all the information that typechecker
/// generates so that so that it can be reused and doesn't have to be redone
/// later on.
/// The central data structure of the compiler. It stores references
/// to the various **arenas** and also houses the results of the
/// various **compiler queries** that have been performed. See [the
/// README](README.md) for more details.
#[derive(Copy, Clone)]
pub struct TyCtxt<'a, 'gcx: 'a+'tcx, 'tcx: 'a> {
gcx: &'a GlobalCtxt<'gcx>,

File diff suppressed because it is too large.


@@ -0,0 +1,302 @@
# The Rust Compiler Query System
The Compiler Query System is the key to our new demand-driven
organization. The idea is pretty simple. You have various queries
that compute things about the input -- for example, there is a query
called `type_of(def_id)` that, given the def-id of some item, will
compute the type of that item and return it to you.
Query execution is **memoized** -- so the first time you invoke a
query, it will go do the computation, but the next time, the result is
returned from a hashtable. Moreover, query execution fits nicely into
**incremental computation**; the idea is roughly that, when you do a
query, the result **may** be returned to you by loading stored data
from disk (but that's a separate topic we won't discuss further here).
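For example (a hedged sketch using the `type_of` query described below, and
assuming a `tcx` and a `def_id` in scope):
```rust
let ty1 = tcx.type_of(def_id); // first call: runs the provider and caches the result
let ty2 = tcx.type_of(def_id); // second call: answered from the in-memory cache
assert!(ty1 == ty2);
```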
The overall vision is that, eventually, the entire compiler
control-flow will be query driven. There will effectively be one
top-level query ("compile") that will run compilation on a crate; this
will in turn demand information about that crate, starting from the
*end*. For example:
- This "compile" query might demand to get a list of codegen-units
(i.e., modules that need to be compiled by LLVM).
- But computing the list of codegen-units would invoke some subquery
that returns the list of all modules defined in the Rust source.
- That query in turn would invoke something asking for the HIR.
- This keeps going further and further back until we wind up doing the
actual parsing.
However, that vision is not fully realized. Still, big chunks of the
compiler (for example, generating MIR) work exactly like this.
### Invoking queries
To invoke a query is simple. The tcx ("type context") offers a method
for each defined query. So, for example, to invoke the `type_of`
query, you would just do this:
```rust
let ty = tcx.type_of(some_def_id);
```
### Cycles between queries
Currently, cycles during query execution should always result in a
compilation error. Typically, they arise because of illegal programs
that contain cyclic references they shouldn't (though sometimes they
arise because of compiler bugs, in which case we need to factor our
queries in a more fine-grained fashion to avoid them).
However, it is nonetheless often useful to *recover* from a cycle
(after reporting an error, say) and try to soldier on, so as to give a
better user experience. In order to recover from a cycle, you don't
get to use the nice method-call-style syntax. Instead, you invoke
using the `try_get` method, which looks roughly like this:
```rust
use ty::maps::queries;
...
match queries::type_of::try_get(tcx, DUMMY_SP, self.did) {
    Ok(result) => {
        // no cycle occurred! You can use `result`
    }
    Err(err) => {
        // A cycle occurred! The error value `err` is a `DiagnosticBuilder`,
        // meaning essentially an "in-progress", not-yet-reported error message.
        // See below for more details on what to do here.
    }
}
```
So, if you get back an `Err` from `try_get`, then a cycle *did* occur. This means that
you must ensure that a compiler error message is reported. You can do that in two ways:
The simplest is to invoke `err.emit()`. This will emit the cycle error to the user.
However, often cycles happen because of an illegal program, and you
know at that point that an error either already has been reported or
will be reported due to this cycle by some other bit of code. In that
case, you can invoke `err.cancel()` to not emit any error. It is
traditional to then invoke:
```
tcx.sess.delay_span_bug(some_span, "some message")
```
`delay_span_bug()` is a helper that says: we expect a compilation
error to have happened or to happen in the future; so, if compilation
ultimately succeeds, make an ICE with the message `"some
message"`. This is basically just a precaution in case you are wrong.
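Putting those pieces together, recovering from a cycle tends to look roughly
like this (a sketch built from the `try_get` example above; the span and the
message are placeholders):
```rust
match queries::type_of::try_get(tcx, DUMMY_SP, self.did) {
    Ok(ty) => {
        // No cycle: use the result as normal.
        let _ = ty;
    }
    Err(mut err) => {
        // A cycle occurred. We assume some other error has been (or will be)
        // reported for it, so cancel this diagnostic and leave a delayed bug
        // as a safety net in case that assumption turns out to be wrong.
        err.cancel();
        tcx.sess.delay_span_bug(DUMMY_SP, "cycle while computing the type of this item");
    }
}
```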
### How the compiler executes a query
So you may be wondering what happens when you invoke a query
method. The answer is that, for each query, the compiler maintains a
cache -- if your query has already been executed, then the answer is
simple: we clone the return value out of the cache and return it
(therefore, you should try to ensure that the return types of queries
are cheaply cloneable; insert an `Rc` if necessary).
#### Providers
If, however, the query is *not* in the cache, then the compiler will
try to find a suitable **provider**. A provider is a function, defined
and linked into the compiler somewhere, that contains the code to
compute the result of the query.
**Providers are defined per-crate.** The compiler maintains,
internally, a table of providers for every crate, at least
conceptually. Right now, there are really two sets: the providers for
queries about the **local crate** (that is, the one being compiled)
and providers for queries about **external crates** (that is,
dependencies of the local crate). Note that what determines the crate
that a query is targeting is not the *kind* of query, but the *key*.
For example, when you invoke `tcx.type_of(def_id)`, that could be a
local query or an external query, depending on what crate the `def_id`
is referring to (see the `self::keys::Key` trait for more information
on how that works).
Providers always have the same signature:
```rust
fn provider<'cx, 'tcx>(tcx: TyCtxt<'cx, 'tcx, 'tcx>,
                       key: QUERY_KEY)
                       -> QUERY_RESULT
{
    ...
}
```
Providers take two arguments: the `tcx` and the query key. Note also
that they take the *global* tcx (i.e., they use the `'tcx` lifetime
twice), rather than taking a tcx with some active inference context.
They return the result of the query.
#### How providers are setup
When the tcx is created, it is given the providers by its creator using
the `Providers` struct. This struct is generated by the macros here, but it
is basically a big list of function pointers:
```rust
struct Providers {
    type_of: for<'cx, 'tcx> fn(TyCtxt<'cx, 'tcx, 'tcx>, DefId) -> Ty<'tcx>,
    ...
}
```
At present, we have one copy of the struct for local crates, and one
for external crates, though the plan is that we may eventually have
one per crate.
These `Provider` structs are ultimately created and populated by
`librustc_driver`, but it does this by distributing the work
throughout the other `rustc_*` crates. This is done by invoking
various `provide` functions. These functions tend to look something
like this:
```rust
pub fn provide(providers: &mut Providers) {
    *providers = Providers {
        type_of,
        ..*providers
    };
}
```
That is, they take an `&mut Providers` and mutate it in place. Usually
we use the formulation above just because it looks nice, but you could
as well do `providers.type_of = type_of`, which would be equivalent.
(Here, `type_of` would be a top-level function, defined as we saw
before.) So, if we wanted to add a provider for some other query,
let's call it `fubar`, into the crate above, we might modify the `provide()`
function like so:
```rust
pub fn provide(providers: &mut Providers) {
    *providers = Providers {
        type_of,
        fubar,
        ..*providers
    };
}

fn fubar<'cx, 'tcx>(tcx: TyCtxt<'cx, 'tcx, 'tcx>, key: DefId) -> Fubar<'tcx> { .. }
```
NB. Most of the `rustc_*` crates only provide **local
providers**. Almost all **extern providers** wind up going through the
`rustc_metadata` crate, which loads the information from the crate
metadata. But in some cases there are crates that provide queries for
*both* local and external crates, in which case they define both a
`provide` and a `provide_extern` function that `rustc_driver` can
invoke.
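As a hedged sketch, such a crate might expose two entry points like the
following, using the `providers.type_of = type_of` assignment form mentioned
earlier (`type_of_from_metadata` is a hypothetical helper, not a real function,
and the exact `provide_extern` signature is an assumption):
```rust
pub fn provide(providers: &mut Providers) {
    // Queries answered from the local crate's sources.
    providers.type_of = type_of;
}

pub fn provide_extern(providers: &mut Providers) {
    // Queries answered by decoding another crate's metadata.
    providers.type_of = type_of_from_metadata; // hypothetical
}
```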
### Adding a new kind of query
So suppose you want to add a new kind of query, how do you do so?
Well, defining a query takes place in two steps:
1. first, you have to specify the query name and arguments; and then,
2. you have to supply query providers where needed.
To specify the query name and arguments, you simply add an entry
to the big macro invocation in `mod.rs`. This will probably have changed
by the time you read this README, but at present it looks something
like:
```
define_maps! { <'tcx>
    /// Records the type of every item.
    [] fn type_of: TypeOfItem(DefId) -> Ty<'tcx>,

    ...
}
```
Each line of the macro defines one query. The name is broken up like this:
```
[] fn type_of: TypeOfItem(DefId) -> Ty<'tcx>,
^^    ^^^^^^^  ^^^^^^^^^^ ^^^^^     ^^^^^^^^
|     |        |          |         |
|     |        |          |         result type of query
|     |        |          query key type
|     |        dep-node constructor
|     name of query
query flags
```
Let's go over them one by one:
- **Query flags:** these are largely unused right now, but the intention
is that we'll be able to customize various aspects of how the query is
processed.
- **Name of query:** the name of the query method
(`tcx.type_of(..)`). Also used as the name of a struct
(`ty::maps::queries::type_of`) that will be generated to represent
this query.
- **Dep-node constructor:** indicates the constructor function that
connects this query to incremental compilation. Typically, this is a
`DepNode` variant, which can be added by modifying the
`define_dep_nodes!` macro invocation in
`librustc/dep_graph/dep_node.rs`.
- However, sometimes we use a custom function, in which case the
name will be in snake case and the function will be defined at the
bottom of the file. This is typically used when the query key is
not a def-id, or just not the type that the dep-node expects.
- **Query key type:** the type of the argument to this query.
This type must implement the `ty::maps::keys::Key` trait, which
defines (for example) how to map it to a crate, and so forth.
- **Result type of query:** the type produced by this query. This type
should (a) not use `RefCell` or other interior mutability and (b) be
cheaply cloneable. Interning or using `Rc` or `Arc` is recommended for
non-trivial data types.
- The one exception to those rules is the `ty::steal::Steal` type,
which is used to cheaply modify MIR in place. See the definition
of `Steal` for more details. New uses of `Steal` should **not** be
added without alerting `@rust-lang/compiler`.
So, to add a query (a hypothetical worked example follows this list):
- Add an entry to `define_maps!` using the format above.
- Possibly add a corresponding entry to the dep-node macro.
- Link the provider by modifying the appropriate `provide` method;
or add a new one if needed and ensure that `rustc_driver` is invoking it.
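Concretely, continuing the hypothetical `fubar` query from the `provide()`
example earlier, the `define_maps!` entry might look like this (the `Fubar`
dep-node name and the `Fubar<'tcx>` result type are assumptions made for the
sake of the example):
```
define_maps! { <'tcx>
    ...

    /// A hypothetical query: computes the "fubar" of the given item.
    [] fn fubar: Fubar(DefId) -> Fubar<'tcx>,
}
```
You would then add a `Fubar` variant to `define_dep_nodes!` and hook up the
provider as shown in the earlier `provide()` example.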
#### Query structs and descriptions
For each query kind, the `define_maps` macro will generate a "query struct"
named after the query. This struct is a kind of a place-holder
describing the query. Each such struct implements the
`self::config::QueryConfig` trait, which has associated types for the
key/value of that particular query. Basically the code generated looks something
like this:
```rust
// Dummy struct representing a particular kind of query:
pub struct type_of<'tcx> { phantom: PhantomData<&'tcx ()> }

impl<'tcx> QueryConfig for type_of<'tcx> {
    type Key = DefId;
    type Value = Ty<'tcx>;
}
```
There is an additional trait that you may wish to implement called
`self::config::QueryDescription`. This trait is used during cycle
errors to give a "human readable" name for the query, so that we can
summarize what was happening when the cycle occurred. Implementing
this trait is optional if the query key is `DefId`, but if you *don't*
implement it, you get a pretty generic error ("processing `foo`...").
You can put new impls into the `config` module. They look something like this:
```rust
impl<'tcx> QueryDescription for queries::type_of<'tcx> {
    fn describe(tcx: TyCtxt, key: DefId) -> String {
        format!("computing the type of `{}`", tcx.item_path_str(key))
    }
}
```


@@ -0,0 +1,492 @@
// Copyright 2012-2015 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
use hir::def_id::{CrateNum, DefId, DefIndex};
use ty::{self, Ty, TyCtxt};
use ty::maps::queries;
use ty::subst::Substs;
use std::hash::Hash;
use syntax_pos::symbol::InternedString;
/// Query configuration and description traits.
pub trait QueryConfig {
type Key: Eq + Hash + Clone;
type Value;
}
pub(super) trait QueryDescription: QueryConfig {
fn describe(tcx: TyCtxt, key: Self::Key) -> String;
}
impl<M: QueryConfig<Key=DefId>> QueryDescription for M {
default fn describe(tcx: TyCtxt, def_id: DefId) -> String {
format!("processing `{}`", tcx.item_path_str(def_id))
}
}
impl<'tcx> QueryDescription for queries::is_copy_raw<'tcx> {
fn describe(_tcx: TyCtxt, env: ty::ParamEnvAnd<'tcx, Ty<'tcx>>) -> String {
format!("computing whether `{}` is `Copy`", env.value)
}
}
impl<'tcx> QueryDescription for queries::is_sized_raw<'tcx> {
fn describe(_tcx: TyCtxt, env: ty::ParamEnvAnd<'tcx, Ty<'tcx>>) -> String {
format!("computing whether `{}` is `Sized`", env.value)
}
}
impl<'tcx> QueryDescription for queries::is_freeze_raw<'tcx> {
fn describe(_tcx: TyCtxt, env: ty::ParamEnvAnd<'tcx, Ty<'tcx>>) -> String {
format!("computing whether `{}` is freeze", env.value)
}
}
impl<'tcx> QueryDescription for queries::needs_drop_raw<'tcx> {
fn describe(_tcx: TyCtxt, env: ty::ParamEnvAnd<'tcx, Ty<'tcx>>) -> String {
format!("computing whether `{}` needs drop", env.value)
}
}
impl<'tcx> QueryDescription for queries::layout_raw<'tcx> {
fn describe(_tcx: TyCtxt, env: ty::ParamEnvAnd<'tcx, Ty<'tcx>>) -> String {
format!("computing layout of `{}`", env.value)
}
}
impl<'tcx> QueryDescription for queries::super_predicates_of<'tcx> {
fn describe(tcx: TyCtxt, def_id: DefId) -> String {
format!("computing the supertraits of `{}`",
tcx.item_path_str(def_id))
}
}
impl<'tcx> QueryDescription for queries::type_param_predicates<'tcx> {
fn describe(tcx: TyCtxt, (_, def_id): (DefId, DefId)) -> String {
let id = tcx.hir.as_local_node_id(def_id).unwrap();
format!("computing the bounds for type parameter `{}`",
tcx.hir.ty_param_name(id))
}
}
impl<'tcx> QueryDescription for queries::coherent_trait<'tcx> {
fn describe(tcx: TyCtxt, (_, def_id): (CrateNum, DefId)) -> String {
format!("coherence checking all impls of trait `{}`",
tcx.item_path_str(def_id))
}
}
impl<'tcx> QueryDescription for queries::crate_inherent_impls<'tcx> {
fn describe(_: TyCtxt, k: CrateNum) -> String {
format!("all inherent impls defined in crate `{:?}`", k)
}
}
impl<'tcx> QueryDescription for queries::crate_inherent_impls_overlap_check<'tcx> {
fn describe(_: TyCtxt, _: CrateNum) -> String {
format!("check for overlap between inherent impls defined in this crate")
}
}
impl<'tcx> QueryDescription for queries::crate_variances<'tcx> {
fn describe(_tcx: TyCtxt, _: CrateNum) -> String {
format!("computing the variances for items in this crate")
}
}
impl<'tcx> QueryDescription for queries::mir_shims<'tcx> {
fn describe(tcx: TyCtxt, def: ty::InstanceDef<'tcx>) -> String {
format!("generating MIR shim for `{}`",
tcx.item_path_str(def.def_id()))
}
}
impl<'tcx> QueryDescription for queries::privacy_access_levels<'tcx> {
fn describe(_: TyCtxt, _: CrateNum) -> String {
format!("privacy access levels")
}
}
impl<'tcx> QueryDescription for queries::typeck_item_bodies<'tcx> {
fn describe(_: TyCtxt, _: CrateNum) -> String {
format!("type-checking all item bodies")
}
}
impl<'tcx> QueryDescription for queries::reachable_set<'tcx> {
fn describe(_: TyCtxt, _: CrateNum) -> String {
format!("reachability")
}
}
impl<'tcx> QueryDescription for queries::const_eval<'tcx> {
fn describe(tcx: TyCtxt, key: ty::ParamEnvAnd<'tcx, (DefId, &'tcx Substs<'tcx>)>) -> String {
format!("const-evaluating `{}`", tcx.item_path_str(key.value.0))
}
}
impl<'tcx> QueryDescription for queries::mir_keys<'tcx> {
fn describe(_: TyCtxt, _: CrateNum) -> String {
format!("getting a list of all mir_keys")
}
}
impl<'tcx> QueryDescription for queries::symbol_name<'tcx> {
fn describe(_tcx: TyCtxt, instance: ty::Instance<'tcx>) -> String {
format!("computing the symbol for `{}`", instance)
}
}
impl<'tcx> QueryDescription for queries::describe_def<'tcx> {
fn describe(_: TyCtxt, _: DefId) -> String {
bug!("describe_def")
}
}
impl<'tcx> QueryDescription for queries::def_span<'tcx> {
fn describe(_: TyCtxt, _: DefId) -> String {
bug!("def_span")
}
}
impl<'tcx> QueryDescription for queries::lookup_stability<'tcx> {
fn describe(_: TyCtxt, _: DefId) -> String {
bug!("stability")
}
}
impl<'tcx> QueryDescription for queries::lookup_deprecation_entry<'tcx> {
fn describe(_: TyCtxt, _: DefId) -> String {
bug!("deprecation")
}
}
impl<'tcx> QueryDescription for queries::item_attrs<'tcx> {
fn describe(_: TyCtxt, _: DefId) -> String {
bug!("item_attrs")
}
}
impl<'tcx> QueryDescription for queries::is_exported_symbol<'tcx> {
fn describe(_: TyCtxt, _: DefId) -> String {
bug!("is_exported_symbol")
}
}
impl<'tcx> QueryDescription for queries::fn_arg_names<'tcx> {
fn describe(_: TyCtxt, _: DefId) -> String {
bug!("fn_arg_names")
}
}
impl<'tcx> QueryDescription for queries::impl_parent<'tcx> {
fn describe(_: TyCtxt, _: DefId) -> String {
bug!("impl_parent")
}
}
impl<'tcx> QueryDescription for queries::trait_of_item<'tcx> {
fn describe(_: TyCtxt, _: DefId) -> String {
bug!("trait_of_item")
}
}
impl<'tcx> QueryDescription for queries::item_body_nested_bodies<'tcx> {
fn describe(tcx: TyCtxt, def_id: DefId) -> String {
format!("nested item bodies of `{}`", tcx.item_path_str(def_id))
}
}
impl<'tcx> QueryDescription for queries::const_is_rvalue_promotable_to_static<'tcx> {
fn describe(tcx: TyCtxt, def_id: DefId) -> String {
format!("const checking if rvalue is promotable to static `{}`",
tcx.item_path_str(def_id))
}
}
impl<'tcx> QueryDescription for queries::is_mir_available<'tcx> {
fn describe(tcx: TyCtxt, def_id: DefId) -> String {
format!("checking if item is mir available: `{}`",
tcx.item_path_str(def_id))
}
}
impl<'tcx> QueryDescription for queries::trait_impls_of<'tcx> {
fn describe(tcx: TyCtxt, def_id: DefId) -> String {
format!("trait impls of `{}`", tcx.item_path_str(def_id))
}
}
impl<'tcx> QueryDescription for queries::is_object_safe<'tcx> {
fn describe(tcx: TyCtxt, def_id: DefId) -> String {
format!("determine object safety of trait `{}`", tcx.item_path_str(def_id))
}
}
impl<'tcx> QueryDescription for queries::is_const_fn<'tcx> {
fn describe(tcx: TyCtxt, def_id: DefId) -> String {
format!("checking if item is const fn: `{}`", tcx.item_path_str(def_id))
}
}
impl<'tcx> QueryDescription for queries::dylib_dependency_formats<'tcx> {
fn describe(_: TyCtxt, _: CrateNum) -> String {
"dylib dependency formats of crate".to_string()
}
}
impl<'tcx> QueryDescription for queries::is_panic_runtime<'tcx> {
fn describe(_: TyCtxt, _: CrateNum) -> String {
"checking if the crate is_panic_runtime".to_string()
}
}
impl<'tcx> QueryDescription for queries::is_compiler_builtins<'tcx> {
fn describe(_: TyCtxt, _: CrateNum) -> String {
"checking if the crate is_compiler_builtins".to_string()
}
}
impl<'tcx> QueryDescription for queries::has_global_allocator<'tcx> {
fn describe(_: TyCtxt, _: CrateNum) -> String {
"checking if the crate has_global_allocator".to_string()
}
}
impl<'tcx> QueryDescription for queries::extern_crate<'tcx> {
fn describe(_: TyCtxt, _: DefId) -> String {
"getting crate's ExternCrateData".to_string()
}
}
impl<'tcx> QueryDescription for queries::lint_levels<'tcx> {
fn describe(_tcx: TyCtxt, _: CrateNum) -> String {
format!("computing the lint levels for items in this crate")
}
}
impl<'tcx> QueryDescription for queries::specializes<'tcx> {
fn describe(_tcx: TyCtxt, _: (DefId, DefId)) -> String {
format!("computing whether impls specialize one another")
}
}
impl<'tcx> QueryDescription for queries::in_scope_traits_map<'tcx> {
fn describe(_tcx: TyCtxt, _: DefIndex) -> String {
format!("traits in scope at a block")
}
}
impl<'tcx> QueryDescription for queries::is_no_builtins<'tcx> {
fn describe(_tcx: TyCtxt, _: CrateNum) -> String {
format!("test whether a crate has #![no_builtins]")
}
}
impl<'tcx> QueryDescription for queries::panic_strategy<'tcx> {
fn describe(_tcx: TyCtxt, _: CrateNum) -> String {
format!("query a crate's configured panic strategy")
}
}
impl<'tcx> QueryDescription for queries::is_profiler_runtime<'tcx> {
fn describe(_tcx: TyCtxt, _: CrateNum) -> String {
format!("query a crate is #![profiler_runtime]")
}
}
impl<'tcx> QueryDescription for queries::is_sanitizer_runtime<'tcx> {
fn describe(_tcx: TyCtxt, _: CrateNum) -> String {
format!("query a crate is #![sanitizer_runtime]")
}
}
impl<'tcx> QueryDescription for queries::exported_symbol_ids<'tcx> {
fn describe(_tcx: TyCtxt, _: CrateNum) -> String {
format!("looking up the exported symbols of a crate")
}
}
impl<'tcx> QueryDescription for queries::native_libraries<'tcx> {
fn describe(_tcx: TyCtxt, _: CrateNum) -> String {
format!("looking up the native libraries of a linked crate")
}
}
impl<'tcx> QueryDescription for queries::plugin_registrar_fn<'tcx> {
fn describe(_tcx: TyCtxt, _: CrateNum) -> String {
format!("looking up the plugin registrar for a crate")
}
}
impl<'tcx> QueryDescription for queries::derive_registrar_fn<'tcx> {
fn describe(_tcx: TyCtxt, _: CrateNum) -> String {
format!("looking up the derive registrar for a crate")
}
}
impl<'tcx> QueryDescription for queries::crate_disambiguator<'tcx> {
fn describe(_tcx: TyCtxt, _: CrateNum) -> String {
format!("looking up the disambiguator a crate")
}
}
impl<'tcx> QueryDescription for queries::crate_hash<'tcx> {
fn describe(_tcx: TyCtxt, _: CrateNum) -> String {
format!("looking up the hash a crate")
}
}
impl<'tcx> QueryDescription for queries::original_crate_name<'tcx> {
fn describe(_tcx: TyCtxt, _: CrateNum) -> String {
format!("looking up the original name a crate")
}
}
impl<'tcx> QueryDescription for queries::implementations_of_trait<'tcx> {
fn describe(_tcx: TyCtxt, _: (CrateNum, DefId)) -> String {
format!("looking up implementations of a trait in a crate")
}
}
impl<'tcx> QueryDescription for queries::all_trait_implementations<'tcx> {
fn describe(_tcx: TyCtxt, _: CrateNum) -> String {
format!("looking up all (?) trait implementations")
}
}
impl<'tcx> QueryDescription for queries::link_args<'tcx> {
fn describe(_tcx: TyCtxt, _: CrateNum) -> String {
format!("looking up link arguments for a crate")
}
}
impl<'tcx> QueryDescription for queries::named_region_map<'tcx> {
fn describe(_tcx: TyCtxt, _: DefIndex) -> String {
format!("looking up a named region")
}
}
impl<'tcx> QueryDescription for queries::is_late_bound_map<'tcx> {
fn describe(_tcx: TyCtxt, _: DefIndex) -> String {
format!("testing if a region is late boudn")
}
}
impl<'tcx> QueryDescription for queries::object_lifetime_defaults_map<'tcx> {
fn describe(_tcx: TyCtxt, _: DefIndex) -> String {
format!("looking up lifetime defaults for a region")
}
}
impl<'tcx> QueryDescription for queries::dep_kind<'tcx> {
fn describe(_tcx: TyCtxt, _: CrateNum) -> String {
format!("fetching what a dependency looks like")
}
}
impl<'tcx> QueryDescription for queries::crate_name<'tcx> {
fn describe(_tcx: TyCtxt, _: CrateNum) -> String {
format!("fetching what a crate is named")
}
}
impl<'tcx> QueryDescription for queries::get_lang_items<'tcx> {
fn describe(_tcx: TyCtxt, _: CrateNum) -> String {
format!("calculating the lang items map")
}
}
impl<'tcx> QueryDescription for queries::defined_lang_items<'tcx> {
fn describe(_tcx: TyCtxt, _: CrateNum) -> String {
format!("calculating the lang items defined in a crate")
}
}
impl<'tcx> QueryDescription for queries::missing_lang_items<'tcx> {
fn describe(_tcx: TyCtxt, _: CrateNum) -> String {
format!("calculating the missing lang items in a crate")
}
}
impl<'tcx> QueryDescription for queries::visible_parent_map<'tcx> {
fn describe(_tcx: TyCtxt, _: CrateNum) -> String {
format!("calculating the visible parent map")
}
}
impl<'tcx> QueryDescription for queries::missing_extern_crate_item<'tcx> {
fn describe(_tcx: TyCtxt, _: CrateNum) -> String {
format!("seeing if we're missing an `extern crate` item for this crate")
}
}
impl<'tcx> QueryDescription for queries::used_crate_source<'tcx> {
fn describe(_tcx: TyCtxt, _: CrateNum) -> String {
format!("looking at the source for a crate")
}
}
impl<'tcx> QueryDescription for queries::postorder_cnums<'tcx> {
fn describe(_tcx: TyCtxt, _: CrateNum) -> String {
format!("generating a postorder list of CrateNums")
}
}
impl<'tcx> QueryDescription for queries::maybe_unused_extern_crates<'tcx> {
fn describe(_tcx: TyCtxt, _: CrateNum) -> String {
format!("looking up all possibly unused extern crates")
}
}
impl<'tcx> QueryDescription for queries::stability_index<'tcx> {
fn describe(_tcx: TyCtxt, _: CrateNum) -> String {
format!("calculating the stability index for the local crate")
}
}
impl<'tcx> QueryDescription for queries::all_crate_nums<'tcx> {
fn describe(_tcx: TyCtxt, _: CrateNum) -> String {
format!("fetching all foreign CrateNum instances")
}
}
impl<'tcx> QueryDescription for queries::exported_symbols<'tcx> {
fn describe(_tcx: TyCtxt, _: CrateNum) -> String {
format!("exported_symbols")
}
}
impl<'tcx> QueryDescription for queries::collect_and_partition_translation_items<'tcx> {
fn describe(_tcx: TyCtxt, _: CrateNum) -> String {
format!("collect_and_partition_translation_items")
}
}
impl<'tcx> QueryDescription for queries::codegen_unit<'tcx> {
fn describe(_tcx: TyCtxt, _: InternedString) -> String {
format!("codegen_unit")
}
}
impl<'tcx> QueryDescription for queries::compile_codegen_unit<'tcx> {
fn describe(_tcx: TyCtxt, _: InternedString) -> String {
format!("compile_codegen_unit")
}
}
impl<'tcx> QueryDescription for queries::output_filenames<'tcx> {
fn describe(_tcx: TyCtxt, _: CrateNum) -> String {
format!("output_filenames")
}
}

@ -0,0 +1,162 @@
// Copyright 2012-2015 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
//! Defines the set of legal keys that can be used in queries.
use hir::def_id::{CrateNum, DefId, LOCAL_CRATE, DefIndex};
use mir::transform::{MirSuite, MirPassIndex};
use ty::{self, Ty, TyCtxt};
use ty::subst::Substs;
use ty::fast_reject::SimplifiedType;
use std::fmt::Debug;
use std::hash::Hash;
use syntax_pos::{Span, DUMMY_SP};
use syntax_pos::symbol::InternedString;
/// The `Key` trait controls what types can legally be used as the key
/// for a query.
pub trait Key: Clone + Hash + Eq + Debug {
/// Given an instance of this key, what crate is it referring to?
/// This is used to find the provider.
fn map_crate(&self) -> CrateNum;
/// In the event that a cycle occurs, if no explicit span has been
/// given for a query with key `self`, what span should we use?
fn default_span(&self, tcx: TyCtxt) -> Span;
}
impl<'tcx> Key for ty::InstanceDef<'tcx> {
fn map_crate(&self) -> CrateNum {
LOCAL_CRATE
}
fn default_span(&self, tcx: TyCtxt) -> Span {
tcx.def_span(self.def_id())
}
}
impl<'tcx> Key for ty::Instance<'tcx> {
fn map_crate(&self) -> CrateNum {
LOCAL_CRATE
}
fn default_span(&self, tcx: TyCtxt) -> Span {
tcx.def_span(self.def_id())
}
}
impl Key for CrateNum {
fn map_crate(&self) -> CrateNum {
*self
}
fn default_span(&self, _: TyCtxt) -> Span {
DUMMY_SP
}
}
impl Key for DefIndex {
fn map_crate(&self) -> CrateNum {
LOCAL_CRATE
}
fn default_span(&self, _tcx: TyCtxt) -> Span {
DUMMY_SP
}
}
impl Key for DefId {
fn map_crate(&self) -> CrateNum {
self.krate
}
fn default_span(&self, tcx: TyCtxt) -> Span {
tcx.def_span(*self)
}
}
impl Key for (DefId, DefId) {
fn map_crate(&self) -> CrateNum {
self.0.krate
}
fn default_span(&self, tcx: TyCtxt) -> Span {
self.1.default_span(tcx)
}
}
impl Key for (CrateNum, DefId) {
fn map_crate(&self) -> CrateNum {
self.0
}
fn default_span(&self, tcx: TyCtxt) -> Span {
self.1.default_span(tcx)
}
}
impl Key for (DefId, SimplifiedType) {
fn map_crate(&self) -> CrateNum {
self.0.krate
}
fn default_span(&self, tcx: TyCtxt) -> Span {
self.0.default_span(tcx)
}
}
impl<'tcx> Key for (DefId, &'tcx Substs<'tcx>) {
fn map_crate(&self) -> CrateNum {
self.0.krate
}
fn default_span(&self, tcx: TyCtxt) -> Span {
self.0.default_span(tcx)
}
}
impl Key for (MirSuite, DefId) {
fn map_crate(&self) -> CrateNum {
self.1.map_crate()
}
fn default_span(&self, tcx: TyCtxt) -> Span {
self.1.default_span(tcx)
}
}
impl Key for (MirSuite, MirPassIndex, DefId) {
fn map_crate(&self) -> CrateNum {
self.2.map_crate()
}
fn default_span(&self, tcx: TyCtxt) -> Span {
self.2.default_span(tcx)
}
}
impl<'tcx> Key for Ty<'tcx> {
fn map_crate(&self) -> CrateNum {
LOCAL_CRATE
}
fn default_span(&self, _: TyCtxt) -> Span {
DUMMY_SP
}
}
impl<'tcx, T: Key> Key for ty::ParamEnvAnd<'tcx, T> {
fn map_crate(&self) -> CrateNum {
self.value.map_crate()
}
fn default_span(&self, tcx: TyCtxt) -> Span {
self.value.default_span(tcx)
}
}
impl Key for InternedString {
fn map_crate(&self) -> CrateNum {
LOCAL_CRATE
}
fn default_span(&self, _tcx: TyCtxt) -> Span {
DUMMY_SP
}
}

src/librustc/ty/maps/mod.rs

@ -0,0 +1,453 @@
// Copyright 2012-2015 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
use dep_graph::{DepConstructor, DepNode};
use errors::DiagnosticBuilder;
use hir::def_id::{CrateNum, DefId, DefIndex};
use hir::def::{Def, Export};
use hir::{self, TraitCandidate, ItemLocalId};
use hir::svh::Svh;
use lint;
use middle::const_val;
use middle::cstore::{ExternCrate, LinkagePreference, NativeLibrary,
ExternBodyNestedBodies};
use middle::cstore::{NativeLibraryKind, DepKind, CrateSource, ExternConstBody};
use middle::privacy::AccessLevels;
use middle::reachable::ReachableSet;
use middle::region;
use middle::resolve_lifetime::{Region, ObjectLifetimeDefault};
use middle::stability::{self, DeprecationEntry};
use middle::lang_items::{LanguageItems, LangItem};
use middle::exported_symbols::SymbolExportLevel;
use middle::trans::{CodegenUnit, Stats};
use mir;
use session::CompileResult;
use session::config::OutputFilenames;
use traits::specialization_graph;
use ty::{self, CrateInherentImpls, Ty, TyCtxt};
use ty::layout::{Layout, LayoutError};
use ty::steal::Steal;
use ty::subst::Substs;
use util::nodemap::{DefIdSet, DefIdMap};
use util::common::{profq_msg, ProfileQueriesMsg};
use rustc_data_structures::indexed_set::IdxSetBuf;
use rustc_back::PanicStrategy;
use rustc_data_structures::indexed_vec::IndexVec;
use rustc_data_structures::fx::{FxHashMap, FxHashSet};
use std::cell::{RefCell, Cell};
use std::ops::Deref;
use std::rc::Rc;
use std::sync::Arc;
use syntax_pos::{Span, DUMMY_SP};
use syntax_pos::symbol::InternedString;
use syntax::attr;
use syntax::ast;
use syntax::symbol::Symbol;
#[macro_use]
mod plumbing;
use self::plumbing::*;
mod keys;
pub use self::keys::Key;
mod values;
use self::values::Value;
mod config;
pub use self::config::QueryConfig;
use self::config::QueryDescription;
// Each of these maps also corresponds to a method on a
// `Provider` trait for requesting a value of that type,
// and a method on `Maps` itself for doing that in a
// way that memoizes and does dep-graph tracking,
// wrapping around the actual chain of providers that
// the driver creates (using several `rustc_*` crates).
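//
// As an illustrative sketch (commentary only, not additional generated
// code): a single entry below such as
//
//     [] fn type_of: TypeOfItem(DefId) -> Ty<'tcx>,
//
// gives callers a memoized, dep-graph-tracked `tcx.type_of(def_id)`
// method, and adds a `type_of` field to the `Providers` struct that the
// driver fills in with the actual implementation for each crate.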
define_maps! { <'tcx>
/// Records the type of every item.
[] fn type_of: TypeOfItem(DefId) -> Ty<'tcx>,
/// Maps from the def-id of an item (trait/struct/enum/fn) to its
/// associated generics and predicates.
[] fn generics_of: GenericsOfItem(DefId) -> &'tcx ty::Generics,
[] fn predicates_of: PredicatesOfItem(DefId) -> ty::GenericPredicates<'tcx>,
/// Maps from the def-id of a trait to the list of
/// super-predicates. This is a subset of the full list of
/// predicates. We store these in a separate map because we must
/// evaluate them even during type conversion, often before the
/// full predicates are available (note that supertraits have
/// additional acyclicity requirements).
[] fn super_predicates_of: SuperPredicatesOfItem(DefId) -> ty::GenericPredicates<'tcx>,
/// To avoid cycles within the predicates of a single item we compute
/// per-type-parameter predicates for resolving `T::AssocTy`.
[] fn type_param_predicates: type_param_predicates((DefId, DefId))
-> ty::GenericPredicates<'tcx>,
[] fn trait_def: TraitDefOfItem(DefId) -> &'tcx ty::TraitDef,
[] fn adt_def: AdtDefOfItem(DefId) -> &'tcx ty::AdtDef,
[] fn adt_destructor: AdtDestructor(DefId) -> Option<ty::Destructor>,
[] fn adt_sized_constraint: SizedConstraint(DefId) -> &'tcx [Ty<'tcx>],
[] fn adt_dtorck_constraint: DtorckConstraint(DefId) -> ty::DtorckConstraint<'tcx>,
/// True if this is a const fn
[] fn is_const_fn: IsConstFn(DefId) -> bool,
/// True if this is a foreign item (i.e., linked via `extern { ... }`).
[] fn is_foreign_item: IsForeignItem(DefId) -> bool,
/// True if this is a default impl (aka impl Foo for ..)
[] fn is_default_impl: IsDefaultImpl(DefId) -> bool,
/// Get a map with the variance of every item; use `item_variance`
/// instead.
[] fn crate_variances: crate_variances(CrateNum) -> Rc<ty::CrateVariancesMap>,
/// Maps from def-id of a type or region parameter to its
/// (inferred) variance.
[] fn variances_of: ItemVariances(DefId) -> Rc<Vec<ty::Variance>>,
/// Maps from an impl/trait def-id to a list of the def-ids of its items
[] fn associated_item_def_ids: AssociatedItemDefIds(DefId) -> Rc<Vec<DefId>>,
/// Maps from a trait item to the trait item "descriptor"
[] fn associated_item: AssociatedItems(DefId) -> ty::AssociatedItem,
[] fn impl_trait_ref: ImplTraitRef(DefId) -> Option<ty::TraitRef<'tcx>>,
[] fn impl_polarity: ImplPolarity(DefId) -> hir::ImplPolarity,
/// Maps a DefId of a type to a list of its inherent impls.
/// Contains implementations of methods that are inherent to a type.
/// Methods in these implementations don't need to be exported.
[] fn inherent_impls: InherentImpls(DefId) -> Rc<Vec<DefId>>,
/// Set of all the def-ids in this crate that have MIR associated with
/// them. This includes all the body owners, but also things like struct
/// constructors.
[] fn mir_keys: mir_keys(CrateNum) -> Rc<DefIdSet>,
/// Maps DefId's that have an associated Mir to the result
/// of the MIR qualify_consts pass. The actual meaning of
/// the value isn't known except to the pass itself.
[] fn mir_const_qualif: MirConstQualif(DefId) -> (u8, Rc<IdxSetBuf<mir::Local>>),
/// Fetch the MIR for a given def-id up till the point where it is
/// ready for const evaluation.
///
/// See the README for the `mir` module for details.
[] fn mir_const: MirConst(DefId) -> &'tcx Steal<mir::Mir<'tcx>>,
[] fn mir_validated: MirValidated(DefId) -> &'tcx Steal<mir::Mir<'tcx>>,
/// MIR after our optimization passes have run. This is MIR that is ready
/// for trans. This is also the only query that can fetch non-local MIR, at present.
[] fn optimized_mir: MirOptimized(DefId) -> &'tcx mir::Mir<'tcx>,
/// Type of each closure. The def ID is the ID of the
/// expression defining the closure.
[] fn closure_kind: ClosureKind(DefId) -> ty::ClosureKind,
/// The signature of functions and closures.
[] fn fn_sig: FnSignature(DefId) -> ty::PolyFnSig<'tcx>,
/// Records the signature of each generator. The def ID is the ID of the
/// expression defining the closure.
[] fn generator_sig: GenSignature(DefId) -> Option<ty::PolyGenSig<'tcx>>,
/// Caches CoerceUnsized kinds for impls on custom types.
[] fn coerce_unsized_info: CoerceUnsizedInfo(DefId)
-> ty::adjustment::CoerceUnsizedInfo,
[] fn typeck_item_bodies: typeck_item_bodies_dep_node(CrateNum) -> CompileResult,
[] fn typeck_tables_of: TypeckTables(DefId) -> &'tcx ty::TypeckTables<'tcx>,
[] fn has_typeck_tables: HasTypeckTables(DefId) -> bool,
[] fn coherent_trait: coherent_trait_dep_node((CrateNum, DefId)) -> (),
[] fn borrowck: BorrowCheck(DefId) -> (),
// FIXME: shouldn't this return a `Result<(), BorrowckErrors>` instead?
[] fn mir_borrowck: MirBorrowCheck(DefId) -> (),
/// Gets a complete map from all types to their inherent impls.
/// Not meant to be used directly outside of coherence.
/// (Defined only for LOCAL_CRATE)
[] fn crate_inherent_impls: crate_inherent_impls_dep_node(CrateNum) -> CrateInherentImpls,
/// Checks all types in the krate for overlap in their inherent impls. Reports errors.
/// Not meant to be used directly outside of coherence.
/// (Defined only for LOCAL_CRATE)
[] fn crate_inherent_impls_overlap_check: inherent_impls_overlap_check_dep_node(CrateNum) -> (),
/// Results of evaluating const items or constants embedded in
/// other items (such as enum variant explicit discriminants).
[] fn const_eval: const_eval_dep_node(ty::ParamEnvAnd<'tcx, (DefId, &'tcx Substs<'tcx>)>)
-> const_val::EvalResult<'tcx>,
/// Performs the privacy check and computes "access levels".
[] fn privacy_access_levels: PrivacyAccessLevels(CrateNum) -> Rc<AccessLevels>,
[] fn reachable_set: reachability_dep_node(CrateNum) -> ReachableSet,
/// Per-body `region::ScopeTree`. The `DefId` should be the owner-def-id for the body;
/// in the case of closures, this will be redirected to the enclosing function.
[] fn region_scope_tree: RegionScopeTree(DefId) -> Rc<region::ScopeTree>,
[] fn mir_shims: mir_shim_dep_node(ty::InstanceDef<'tcx>) -> &'tcx mir::Mir<'tcx>,
[] fn def_symbol_name: SymbolName(DefId) -> ty::SymbolName,
[] fn symbol_name: symbol_name_dep_node(ty::Instance<'tcx>) -> ty::SymbolName,
[] fn describe_def: DescribeDef(DefId) -> Option<Def>,
[] fn def_span: DefSpan(DefId) -> Span,
[] fn lookup_stability: LookupStability(DefId) -> Option<&'tcx attr::Stability>,
[] fn lookup_deprecation_entry: LookupDeprecationEntry(DefId) -> Option<DeprecationEntry>,
[] fn item_attrs: ItemAttrs(DefId) -> Rc<[ast::Attribute]>,
[] fn fn_arg_names: FnArgNames(DefId) -> Vec<ast::Name>,
[] fn impl_parent: ImplParent(DefId) -> Option<DefId>,
[] fn trait_of_item: TraitOfItem(DefId) -> Option<DefId>,
[] fn is_exported_symbol: IsExportedSymbol(DefId) -> bool,
[] fn item_body_nested_bodies: ItemBodyNestedBodies(DefId) -> ExternBodyNestedBodies,
[] fn const_is_rvalue_promotable_to_static: ConstIsRvaluePromotableToStatic(DefId) -> bool,
[] fn is_mir_available: IsMirAvailable(DefId) -> bool,
[] fn trait_impls_of: TraitImpls(DefId) -> Rc<ty::trait_def::TraitImpls>,
[] fn specialization_graph_of: SpecializationGraph(DefId) -> Rc<specialization_graph::Graph>,
[] fn is_object_safe: ObjectSafety(DefId) -> bool,
// Get the ParameterEnvironment for a given item; this environment
// will be in "user-facing" mode, meaning that it is suitable for
// type-checking etc, and it does not normalize specializable
// associated types. This is almost always what you want,
// unless you are doing MIR optimizations, in which case you
// might want to use `reveal_all()` method to change modes.
[] fn param_env: ParamEnv(DefId) -> ty::ParamEnv<'tcx>,
// Trait selection queries. These are best used by invoking `ty.moves_by_default()`,
// `ty.is_copy()`, etc, since that will prune the environment where possible.
[] fn is_copy_raw: is_copy_dep_node(ty::ParamEnvAnd<'tcx, Ty<'tcx>>) -> bool,
[] fn is_sized_raw: is_sized_dep_node(ty::ParamEnvAnd<'tcx, Ty<'tcx>>) -> bool,
[] fn is_freeze_raw: is_freeze_dep_node(ty::ParamEnvAnd<'tcx, Ty<'tcx>>) -> bool,
[] fn needs_drop_raw: needs_drop_dep_node(ty::ParamEnvAnd<'tcx, Ty<'tcx>>) -> bool,
[] fn layout_raw: layout_dep_node(ty::ParamEnvAnd<'tcx, Ty<'tcx>>)
-> Result<&'tcx Layout, LayoutError<'tcx>>,
[] fn dylib_dependency_formats: DylibDepFormats(CrateNum)
-> Rc<Vec<(CrateNum, LinkagePreference)>>,
[] fn is_panic_runtime: IsPanicRuntime(CrateNum) -> bool,
[] fn is_compiler_builtins: IsCompilerBuiltins(CrateNum) -> bool,
[] fn has_global_allocator: HasGlobalAllocator(CrateNum) -> bool,
[] fn is_sanitizer_runtime: IsSanitizerRuntime(CrateNum) -> bool,
[] fn is_profiler_runtime: IsProfilerRuntime(CrateNum) -> bool,
[] fn panic_strategy: GetPanicStrategy(CrateNum) -> PanicStrategy,
[] fn is_no_builtins: IsNoBuiltins(CrateNum) -> bool,
[] fn extern_crate: ExternCrate(DefId) -> Rc<Option<ExternCrate>>,
[] fn specializes: specializes_node((DefId, DefId)) -> bool,
[] fn in_scope_traits_map: InScopeTraits(DefIndex)
-> Option<Rc<FxHashMap<ItemLocalId, Rc<Vec<TraitCandidate>>>>>,
[] fn module_exports: ModuleExports(DefId) -> Option<Rc<Vec<Export>>>,
[] fn lint_levels: lint_levels_node(CrateNum) -> Rc<lint::LintLevelMap>,
[] fn impl_defaultness: ImplDefaultness(DefId) -> hir::Defaultness,
[] fn exported_symbol_ids: ExportedSymbolIds(CrateNum) -> Rc<DefIdSet>,
[] fn native_libraries: NativeLibraries(CrateNum) -> Rc<Vec<NativeLibrary>>,
[] fn plugin_registrar_fn: PluginRegistrarFn(CrateNum) -> Option<DefId>,
[] fn derive_registrar_fn: DeriveRegistrarFn(CrateNum) -> Option<DefId>,
[] fn crate_disambiguator: CrateDisambiguator(CrateNum) -> Symbol,
[] fn crate_hash: CrateHash(CrateNum) -> Svh,
[] fn original_crate_name: OriginalCrateName(CrateNum) -> Symbol,
[] fn implementations_of_trait: implementations_of_trait_node((CrateNum, DefId))
-> Rc<Vec<DefId>>,
[] fn all_trait_implementations: AllTraitImplementations(CrateNum)
-> Rc<Vec<DefId>>,
[] fn is_dllimport_foreign_item: IsDllimportForeignItem(DefId) -> bool,
[] fn is_statically_included_foreign_item: IsStaticallyIncludedForeignItem(DefId) -> bool,
[] fn native_library_kind: NativeLibraryKind(DefId)
-> Option<NativeLibraryKind>,
[] fn link_args: link_args_node(CrateNum) -> Rc<Vec<String>>,
[] fn named_region_map: NamedRegion(DefIndex) ->
Option<Rc<FxHashMap<ItemLocalId, Region>>>,
[] fn is_late_bound_map: IsLateBound(DefIndex) ->
Option<Rc<FxHashSet<ItemLocalId>>>,
[] fn object_lifetime_defaults_map: ObjectLifetimeDefaults(DefIndex)
-> Option<Rc<FxHashMap<ItemLocalId, Rc<Vec<ObjectLifetimeDefault>>>>>,
[] fn visibility: Visibility(DefId) -> ty::Visibility,
[] fn dep_kind: DepKind(CrateNum) -> DepKind,
[] fn crate_name: CrateName(CrateNum) -> Symbol,
[] fn item_children: ItemChildren(DefId) -> Rc<Vec<Export>>,
[] fn extern_mod_stmt_cnum: ExternModStmtCnum(DefId) -> Option<CrateNum>,
[] fn get_lang_items: get_lang_items_node(CrateNum) -> Rc<LanguageItems>,
[] fn defined_lang_items: DefinedLangItems(CrateNum) -> Rc<Vec<(DefId, usize)>>,
[] fn missing_lang_items: MissingLangItems(CrateNum) -> Rc<Vec<LangItem>>,
[] fn extern_const_body: ExternConstBody(DefId) -> ExternConstBody<'tcx>,
[] fn visible_parent_map: visible_parent_map_node(CrateNum)
-> Rc<DefIdMap<DefId>>,
[] fn missing_extern_crate_item: MissingExternCrateItem(CrateNum) -> bool,
[] fn used_crate_source: UsedCrateSource(CrateNum) -> Rc<CrateSource>,
[] fn postorder_cnums: postorder_cnums_node(CrateNum) -> Rc<Vec<CrateNum>>,
[] fn freevars: Freevars(DefId) -> Option<Rc<Vec<hir::Freevar>>>,
[] fn maybe_unused_trait_import: MaybeUnusedTraitImport(DefId) -> bool,
[] fn maybe_unused_extern_crates: maybe_unused_extern_crates_node(CrateNum)
-> Rc<Vec<(DefId, Span)>>,
[] fn stability_index: stability_index_node(CrateNum) -> Rc<stability::Index<'tcx>>,
[] fn all_crate_nums: all_crate_nums_node(CrateNum) -> Rc<Vec<CrateNum>>,
[] fn exported_symbols: ExportedSymbols(CrateNum)
-> Arc<Vec<(String, Option<DefId>, SymbolExportLevel)>>,
[] fn collect_and_partition_translation_items:
collect_and_partition_translation_items_node(CrateNum)
-> (Arc<DefIdSet>, Arc<Vec<Arc<CodegenUnit<'tcx>>>>),
[] fn export_name: ExportName(DefId) -> Option<Symbol>,
[] fn contains_extern_indicator: ContainsExternIndicator(DefId) -> bool,
[] fn is_translated_function: IsTranslatedFunction(DefId) -> bool,
[] fn codegen_unit: CodegenUnit(InternedString) -> Arc<CodegenUnit<'tcx>>,
[] fn compile_codegen_unit: CompileCodegenUnit(InternedString) -> Stats,
[] fn output_filenames: output_filenames_node(CrateNum)
-> Arc<OutputFilenames>,
}
//////////////////////////////////////////////////////////////////////
// These functions are little shims used to find the dep-node for a
// given query when there is not a *direct* mapping:
fn type_param_predicates<'tcx>((item_id, param_id): (DefId, DefId)) -> DepConstructor<'tcx> {
DepConstructor::TypeParamPredicates {
item_id,
param_id
}
}
fn coherent_trait_dep_node<'tcx>((_, def_id): (CrateNum, DefId)) -> DepConstructor<'tcx> {
DepConstructor::CoherenceCheckTrait(def_id)
}
fn crate_inherent_impls_dep_node<'tcx>(_: CrateNum) -> DepConstructor<'tcx> {
DepConstructor::Coherence
}
fn inherent_impls_overlap_check_dep_node<'tcx>(_: CrateNum) -> DepConstructor<'tcx> {
DepConstructor::CoherenceInherentImplOverlapCheck
}
fn reachability_dep_node<'tcx>(_: CrateNum) -> DepConstructor<'tcx> {
DepConstructor::Reachability
}
fn mir_shim_dep_node<'tcx>(instance_def: ty::InstanceDef<'tcx>) -> DepConstructor<'tcx> {
DepConstructor::MirShim {
instance_def
}
}
fn symbol_name_dep_node<'tcx>(instance: ty::Instance<'tcx>) -> DepConstructor<'tcx> {
DepConstructor::InstanceSymbolName { instance }
}
fn typeck_item_bodies_dep_node<'tcx>(_: CrateNum) -> DepConstructor<'tcx> {
DepConstructor::TypeckBodiesKrate
}
fn const_eval_dep_node<'tcx>(_: ty::ParamEnvAnd<'tcx, (DefId, &'tcx Substs<'tcx>)>)
-> DepConstructor<'tcx> {
DepConstructor::ConstEval
}
fn mir_keys<'tcx>(_: CrateNum) -> DepConstructor<'tcx> {
DepConstructor::MirKeys
}
fn crate_variances<'tcx>(_: CrateNum) -> DepConstructor<'tcx> {
DepConstructor::CrateVariances
}
fn is_copy_dep_node<'tcx>(_: ty::ParamEnvAnd<'tcx, Ty<'tcx>>) -> DepConstructor<'tcx> {
DepConstructor::IsCopy
}
fn is_sized_dep_node<'tcx>(_: ty::ParamEnvAnd<'tcx, Ty<'tcx>>) -> DepConstructor<'tcx> {
DepConstructor::IsSized
}
fn is_freeze_dep_node<'tcx>(_: ty::ParamEnvAnd<'tcx, Ty<'tcx>>) -> DepConstructor<'tcx> {
DepConstructor::IsFreeze
}
fn needs_drop_dep_node<'tcx>(_: ty::ParamEnvAnd<'tcx, Ty<'tcx>>) -> DepConstructor<'tcx> {
DepConstructor::NeedsDrop
}
fn layout_dep_node<'tcx>(_: ty::ParamEnvAnd<'tcx, Ty<'tcx>>) -> DepConstructor<'tcx> {
DepConstructor::Layout
}
fn lint_levels_node<'tcx>(_: CrateNum) -> DepConstructor<'tcx> {
DepConstructor::LintLevels
}
fn specializes_node<'tcx>((a, b): (DefId, DefId)) -> DepConstructor<'tcx> {
DepConstructor::Specializes { impl1: a, impl2: b }
}
fn implementations_of_trait_node<'tcx>((krate, trait_id): (CrateNum, DefId))
-> DepConstructor<'tcx>
{
DepConstructor::ImplementationsOfTrait { krate, trait_id }
}
fn link_args_node<'tcx>(_: CrateNum) -> DepConstructor<'tcx> {
DepConstructor::LinkArgs
}
fn get_lang_items_node<'tcx>(_: CrateNum) -> DepConstructor<'tcx> {
DepConstructor::GetLangItems
}
fn visible_parent_map_node<'tcx>(_: CrateNum) -> DepConstructor<'tcx> {
DepConstructor::VisibleParentMap
}
fn postorder_cnums_node<'tcx>(_: CrateNum) -> DepConstructor<'tcx> {
DepConstructor::PostorderCnums
}
fn maybe_unused_extern_crates_node<'tcx>(_: CrateNum) -> DepConstructor<'tcx> {
DepConstructor::MaybeUnusedExternCrates
}
fn stability_index_node<'tcx>(_: CrateNum) -> DepConstructor<'tcx> {
DepConstructor::StabilityIndex
}
fn all_crate_nums_node<'tcx>(_: CrateNum) -> DepConstructor<'tcx> {
DepConstructor::AllCrateNums
}
fn collect_and_partition_translation_items_node<'tcx>(_: CrateNum) -> DepConstructor<'tcx> {
DepConstructor::CollectAndPartitionTranslationItems
}
fn output_filenames_node<'tcx>(_: CrateNum) -> DepConstructor<'tcx> {
DepConstructor::OutputFilenames
}

@ -0,0 +1,494 @@
// Copyright 2012-2015 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
//! The implementation of the query system itself. Defines the macros
//! that generate the actual methods on tcx which find and execute the
//! provider, manage the caches, and so forth.
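//!
//! As a rough sketch of the generated surface (using the `type_of`
//! query as an example; none of this code lives in this module):
//!
//! ```ignore
//! // memoized, dep-graph-tracked access through the tcx:
//! let ty = tcx.type_of(def_id);
//! // the same query, but attributed to a particular span:
//! let ty = tcx.at(span).type_of(def_id);
//! ```
//!
//! There is also a `queries::type_of::try_get(tcx, span, def_id)` form
//! that returns a `Result`, letting callers handle cycle errors themselves.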
use dep_graph::{DepNodeIndex};
use errors::{Diagnostic, DiagnosticBuilder};
use ty::{TyCtxt};
use ty::maps::Query; // NB: actually generated by the macros in this file
use ty::maps::config::QueryDescription;
use ty::item_path;
use rustc_data_structures::fx::{FxHashMap};
use std::cell::{RefMut, Cell};
use std::marker::PhantomData;
use std::mem;
use syntax_pos::Span;
pub(super) struct QueryMap<D: QueryDescription> {
phantom: PhantomData<D>,
pub(super) map: FxHashMap<D::Key, QueryValue<D::Value>>,
}
pub(super) struct QueryValue<T> {
pub(super) value: T,
pub(super) index: DepNodeIndex,
pub(super) diagnostics: Option<Box<QueryDiagnostics>>,
}
pub(super) struct QueryDiagnostics {
pub(super) diagnostics: Vec<Diagnostic>,
pub(super) emitted_diagnostics: Cell<bool>,
}
impl<M: QueryDescription> QueryMap<M> {
pub(super) fn new() -> QueryMap<M> {
QueryMap {
phantom: PhantomData,
map: FxHashMap(),
}
}
}
pub(super) struct CycleError<'a, 'tcx: 'a> {
span: Span,
cycle: RefMut<'a, [(Span, Query<'tcx>)]>,
}
impl<'a, 'gcx, 'tcx> TyCtxt<'a, 'gcx, 'tcx> {
pub(super) fn report_cycle(self, CycleError { span, cycle }: CycleError)
-> DiagnosticBuilder<'a>
{
// Subtle: release the refcell lock before invoking `describe()`
// below by dropping `cycle`.
let stack = cycle.to_vec();
mem::drop(cycle);
assert!(!stack.is_empty());
// Disable naming impls with types in this path, since that
// sometimes cycles itself, leading to extra cycle errors.
// (And cycle errors around impls tend to occur during the
// collect/coherence phases anyhow.)
item_path::with_forced_impl_filename_line(|| {
let mut err =
struct_span_err!(self.sess, span, E0391,
"unsupported cyclic reference between types/traits detected");
err.span_label(span, "cyclic reference");
err.span_note(stack[0].0, &format!("the cycle begins when {}...",
stack[0].1.describe(self)));
for &(span, ref query) in &stack[1..] {
err.span_note(span, &format!("...which then requires {}...",
query.describe(self)));
}
err.note(&format!("...which then again requires {}, completing the cycle.",
stack[0].1.describe(self)));
return err
})
}
pub(super) fn cycle_check<F, R>(self, span: Span, query: Query<'gcx>, compute: F)
-> Result<R, CycleError<'a, 'gcx>>
where F: FnOnce() -> R
{
{
let mut stack = self.maps.query_stack.borrow_mut();
if let Some((i, _)) = stack.iter().enumerate().rev()
.find(|&(_, &(_, ref q))| *q == query) {
return Err(CycleError {
span,
cycle: RefMut::map(stack, |stack| &mut stack[i..])
});
}
stack.push((span, query));
}
let result = compute();
self.maps.query_stack.borrow_mut().pop();
Ok(result)
}
}
// If enabled, send a message to the profile-queries thread
macro_rules! profq_msg {
($tcx:expr, $msg:expr) => {
if cfg!(debug_assertions) {
if $tcx.sess.profile_queries() {
profq_msg($msg)
}
}
}
}
// If enabled, format a key using its debug string, which can be
// expensive to compute (in terms of time).
macro_rules! profq_key {
($tcx:expr, $key:expr) => {
if cfg!(debug_assertions) {
if $tcx.sess.profile_queries_and_keys() {
Some(format!("{:?}", $key))
} else { None }
} else { None }
}
}
macro_rules! define_maps {
(<$tcx:tt>
$($(#[$attr:meta])*
[$($modifiers:tt)*] fn $name:ident: $node:ident($K:ty) -> $V:ty,)*) => {
define_map_struct! {
tcx: $tcx,
input: ($(([$($modifiers)*] [$($attr)*] [$name]))*)
}
impl<$tcx> Maps<$tcx> {
pub fn new(providers: IndexVec<CrateNum, Providers<$tcx>>)
-> Self {
Maps {
providers,
query_stack: RefCell::new(vec![]),
$($name: RefCell::new(QueryMap::new())),*
}
}
}
#[allow(bad_style)]
#[derive(Copy, Clone, Debug, PartialEq, Eq)]
pub enum Query<$tcx> {
$($(#[$attr])* $name($K)),*
}
#[allow(bad_style)]
#[derive(Clone, Debug, PartialEq, Eq)]
pub enum QueryMsg {
$($name(Option<String>)),*
}
impl<$tcx> Query<$tcx> {
pub fn describe(&self, tcx: TyCtxt) -> String {
let (r, name) = match *self {
$(Query::$name(key) => {
(queries::$name::describe(tcx, key), stringify!($name))
})*
};
if tcx.sess.verbose() {
format!("{} [{}]", r, name)
} else {
r
}
}
}
pub mod queries {
use std::marker::PhantomData;
$(#[allow(bad_style)]
pub struct $name<$tcx> {
data: PhantomData<&$tcx ()>
})*
}
$(impl<$tcx> QueryConfig for queries::$name<$tcx> {
type Key = $K;
type Value = $V;
}
impl<'a, $tcx, 'lcx> queries::$name<$tcx> {
#[allow(unused)]
fn to_dep_node(tcx: TyCtxt<'a, $tcx, 'lcx>, key: &$K) -> DepNode {
use dep_graph::DepConstructor::*;
DepNode::new(tcx, $node(*key))
}
fn try_get_with<F, R>(tcx: TyCtxt<'a, $tcx, 'lcx>,
mut span: Span,
key: $K,
f: F)
-> Result<R, CycleError<'a, $tcx>>
where F: FnOnce(&$V) -> R
{
debug!("ty::queries::{}::try_get_with(key={:?}, span={:?})",
stringify!($name),
key,
span);
profq_msg!(tcx,
ProfileQueriesMsg::QueryBegin(
span.clone(),
QueryMsg::$name(profq_key!(tcx, key))
)
);
if let Some(value) = tcx.maps.$name.borrow().map.get(&key) {
if let Some(ref d) = value.diagnostics {
if !d.emitted_diagnostics.get() {
d.emitted_diagnostics.set(true);
let handle = tcx.sess.diagnostic();
for diagnostic in d.diagnostics.iter() {
DiagnosticBuilder::new_diagnostic(handle, diagnostic.clone())
.emit();
}
}
}
profq_msg!(tcx, ProfileQueriesMsg::CacheHit);
tcx.dep_graph.read_index(value.index);
return Ok(f(&value.value));
}
// else, we are going to run the provider:
profq_msg!(tcx, ProfileQueriesMsg::ProviderBegin);
// FIXME(eddyb) Get more valid Span's on queries.
// The def_span guard is necessary to prevent a recursive loop:
// default_span calls the def_span query internally.
if span == DUMMY_SP && stringify!($name) != "def_span" {
span = key.default_span(tcx)
}
let dep_node = Self::to_dep_node(tcx, &key);
let res = tcx.cycle_check(span, Query::$name(key), || {
tcx.sess.diagnostic().track_diagnostics(|| {
if dep_node.kind.is_anon() {
tcx.dep_graph.with_anon_task(dep_node.kind, || {
let provider = tcx.maps.providers[key.map_crate()].$name;
provider(tcx.global_tcx(), key)
})
} else {
fn run_provider<'a, 'tcx, 'lcx>(tcx: TyCtxt<'a, 'tcx, 'lcx>,
key: $K)
-> $V {
let provider = tcx.maps.providers[key.map_crate()].$name;
provider(tcx.global_tcx(), key)
}
tcx.dep_graph.with_task(dep_node, tcx, key, run_provider)
}
})
})?;
profq_msg!(tcx, ProfileQueriesMsg::ProviderEnd);
let ((result, dep_node_index), diagnostics) = res;
tcx.dep_graph.read_index(dep_node_index);
let value = QueryValue {
value: result,
index: dep_node_index,
diagnostics: if diagnostics.len() == 0 {
None
} else {
Some(Box::new(QueryDiagnostics {
diagnostics,
emitted_diagnostics: Cell::new(true),
}))
},
};
Ok(f(&tcx.maps
.$name
.borrow_mut()
.map
.entry(key)
.or_insert(value)
.value))
}
pub fn try_get(tcx: TyCtxt<'a, $tcx, 'lcx>, span: Span, key: $K)
-> Result<$V, DiagnosticBuilder<'a>> {
match Self::try_get_with(tcx, span, key, Clone::clone) {
Ok(e) => Ok(e),
Err(e) => Err(tcx.report_cycle(e)),
}
}
pub fn force(tcx: TyCtxt<'a, $tcx, 'lcx>, span: Span, key: $K) {
// Ignore dependencies, since we are not reading the computed value
let _task = tcx.dep_graph.in_ignore();
match Self::try_get_with(tcx, span, key, |_| ()) {
Ok(()) => {}
Err(e) => tcx.report_cycle(e).emit(),
}
}
})*
#[derive(Copy, Clone)]
pub struct TyCtxtAt<'a, 'gcx: 'a+'tcx, 'tcx: 'a> {
pub tcx: TyCtxt<'a, 'gcx, 'tcx>,
pub span: Span,
}
impl<'a, 'gcx, 'tcx> Deref for TyCtxtAt<'a, 'gcx, 'tcx> {
type Target = TyCtxt<'a, 'gcx, 'tcx>;
fn deref(&self) -> &Self::Target {
&self.tcx
}
}
impl<'a, $tcx, 'lcx> TyCtxt<'a, $tcx, 'lcx> {
/// Return a transparent wrapper for `TyCtxt` which uses
/// `span` as the location of queries performed through it.
pub fn at(self, span: Span) -> TyCtxtAt<'a, $tcx, 'lcx> {
TyCtxtAt {
tcx: self,
span
}
}
$($(#[$attr])*
pub fn $name(self, key: $K) -> $V {
self.at(DUMMY_SP).$name(key)
})*
}
impl<'a, $tcx, 'lcx> TyCtxtAt<'a, $tcx, 'lcx> {
$($(#[$attr])*
pub fn $name(self, key: $K) -> $V {
queries::$name::try_get(self.tcx, self.span, key).unwrap_or_else(|mut e| {
e.emit();
Value::from_cycle_error(self.global_tcx())
})
})*
}
define_provider_struct! {
tcx: $tcx,
input: ($(([$($modifiers)*] [$name] [$K] [$V]))*),
output: ()
}
impl<$tcx> Copy for Providers<$tcx> {}
impl<$tcx> Clone for Providers<$tcx> {
fn clone(&self) -> Self { *self }
}
}
}
macro_rules! define_map_struct {
// Initial state
(tcx: $tcx:tt,
input: $input:tt) => {
define_map_struct! {
tcx: $tcx,
input: $input,
output: ()
}
};
// Final output
(tcx: $tcx:tt,
input: (),
output: ($($output:tt)*)) => {
pub struct Maps<$tcx> {
providers: IndexVec<CrateNum, Providers<$tcx>>,
query_stack: RefCell<Vec<(Span, Query<$tcx>)>>,
$($output)*
}
};
// Field recognized and ready to shift into the output
(tcx: $tcx:tt,
ready: ([$($pub:tt)*] [$($attr:tt)*] [$name:ident]),
input: $input:tt,
output: ($($output:tt)*)) => {
define_map_struct! {
tcx: $tcx,
input: $input,
output: ($($output)*
$(#[$attr])* $($pub)* $name: RefCell<QueryMap<queries::$name<$tcx>>>,)
}
};
// No modifiers left? This is a private item.
(tcx: $tcx:tt,
input: (([] $attrs:tt $name:tt) $($input:tt)*),
output: $output:tt) => {
define_map_struct! {
tcx: $tcx,
ready: ([] $attrs $name),
input: ($($input)*),
output: $output
}
};
// Skip other modifiers
(tcx: $tcx:tt,
input: (([$other_modifier:tt $($modifiers:tt)*] $($fields:tt)*) $($input:tt)*),
output: $output:tt) => {
define_map_struct! {
tcx: $tcx,
input: (([$($modifiers)*] $($fields)*) $($input)*),
output: $output
}
};
}
macro_rules! define_provider_struct {
// Initial state:
(tcx: $tcx:tt, input: $input:tt) => {
define_provider_struct! {
tcx: $tcx,
input: $input,
output: ()
}
};
// Final state:
(tcx: $tcx:tt,
input: (),
output: ($(([$name:ident] [$K:ty] [$R:ty]))*)) => {
pub struct Providers<$tcx> {
$(pub $name: for<'a> fn(TyCtxt<'a, $tcx, $tcx>, $K) -> $R,)*
}
impl<$tcx> Default for Providers<$tcx> {
fn default() -> Self {
$(fn $name<'a, $tcx>(_: TyCtxt<'a, $tcx, $tcx>, key: $K) -> $R {
bug!("tcx.maps.{}({:?}) unsupported by its crate",
stringify!($name), key);
})*
Providers { $($name),* }
}
}
};
// Something ready to shift:
(tcx: $tcx:tt,
ready: ($name:tt $K:tt $V:tt),
input: $input:tt,
output: ($($output:tt)*)) => {
define_provider_struct! {
tcx: $tcx,
input: $input,
output: ($($output)* ($name $K $V))
}
};
// Regular queries produce a `V` only.
(tcx: $tcx:tt,
input: (([] $name:tt $K:tt $V:tt) $($input:tt)*),
output: $output:tt) => {
define_provider_struct! {
tcx: $tcx,
ready: ($name $K $V),
input: ($($input)*),
output: $output
}
};
// Skip modifiers.
(tcx: $tcx:tt,
input: (([$other_modifier:tt $($modifiers:tt)*] $($fields:tt)*) $($input:tt)*),
output: $output:tt) => {
define_provider_struct! {
tcx: $tcx,
input: (([$($modifiers)*] $($fields)*) $($input)*),
output: $output
}
};
}

@ -0,0 +1,49 @@
// Copyright 2012-2015 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
use ty::{self, Ty, TyCtxt};
use syntax::symbol::Symbol;
pub(super) trait Value<'tcx>: Sized {
fn from_cycle_error<'a>(tcx: TyCtxt<'a, 'tcx, 'tcx>) -> Self;
}
impl<'tcx, T> Value<'tcx> for T {
default fn from_cycle_error<'a>(tcx: TyCtxt<'a, 'tcx, 'tcx>) -> T {
tcx.sess.abort_if_errors();
bug!("Value::from_cycle_error called without errors");
}
}
impl<'tcx, T: Default> Value<'tcx> for T {
default fn from_cycle_error<'a>(_: TyCtxt<'a, 'tcx, 'tcx>) -> T {
T::default()
}
}
impl<'tcx> Value<'tcx> for Ty<'tcx> {
fn from_cycle_error<'a>(tcx: TyCtxt<'a, 'tcx, 'tcx>) -> Ty<'tcx> {
tcx.types.err
}
}
impl<'tcx> Value<'tcx> for ty::DtorckConstraint<'tcx> {
fn from_cycle_error<'a>(_: TyCtxt<'a, 'tcx, 'tcx>) -> Self {
Self::empty()
}
}
impl<'tcx> Value<'tcx> for ty::SymbolName {
fn from_cycle_error<'a>(_: TyCtxt<'a, 'tcx, 'tcx>) -> Self {
ty::SymbolName { name: Symbol::intern("<error>").as_str() }
}
}

@ -0,0 +1,6 @@
NB: This crate is part of the Rust compiler. For an overview of the
compiler as a whole, see
[the README.md file found in `librustc`](../librustc/README.md).
`librustc_back` contains some very low-level details that are
specific to different LLVM targets and so forth.

@ -0,0 +1,12 @@
NB: This crate is part of the Rust compiler. For an overview of the
compiler as a whole, see
[the README.md file found in `librustc`](../librustc/README.md).
The `driver` crate is effectively the "main" function for the Rust
compiler. It orchestrates the compilation process and "knits together"
the code from the other crates within rustc. This crate itself does
not contain any of the "main logic" of the compiler (though it does
have some code related to pretty printing or other minor compiler
options).

@ -1 +1,7 @@
See [librustc/README.md](../librustc/README.md).
NB: This crate is part of the Rust compiler. For an overview of the
compiler as a whole, see
[the README.md file found in `librustc`](../librustc/README.md).
The `trans` crate contains the code to convert from MIR into LLVM IR,
and then from LLVM IR into machine code. In general it contains code
that runs towards the end of the compilation process.

@ -0,0 +1,48 @@
NB: This crate is part of the Rust compiler. For an overview of the
compiler as a whole, see
[the README.md file found in `librustc`](../librustc/README.md).
The `rustc_typeck` crate contains the source for "type collection" and
"type checking", as well as a few other bits of related functionality.
(It draws heavily on the [type inferencing][infer] and
[trait solving][traits] code found in librustc.)
[infer]: ../librustc/infer/README.md
[traits]: ../librustc/traits/README.md
## Type collection
Type "collection" is the process of convering the types found in the
HIR (`hir::Ty`), which represent the syntactic things that the user
wrote, into the **internal representation** used by the compiler
(`Ty<'tcx>`) -- we also do similar conversions for where-clauses and
other bits of the function signature.
To try and get a sense for the difference, consider this function:
```rust
struct Foo { }
fn foo(x: Foo, y: self::Foo) { }
//        ^^^     ^^^^^^^^^
```
Those two parameters `x` and `y` each have the same type: but they
will have distinct `hir::Ty` nodes. Those nodes will have different
spans, and of course they encode the path somewhat differently. But
once they are "collected" into `Ty<'tcx>` nodes, they will be
represented by the exact same internal type.
Collection is defined as a bundle of queries (e.g., `type_of`) for
computing information about the various functions, traits, and other
items in the crate being compiled. Note that each of these queries is
concerned with *interprocedural* things -- for example, for a function
definition, collection will figure out the type and signature of the
function, but it will not visit the *body* of the function in any way,
nor examine type annotations on local variables (that's the job of
type *checking*).
For more details, see the `collect` module.
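To make that concrete, here is a rough sketch of the wiring (illustrative
only: the rustc-internal types `TyCtxt`, `DefId`, `Ty`, and `Providers` are
assumed to be in scope, and the provider body is elided; the real
definitions live in the `collect` module and in `librustc/ty/maps`):
```rust
// A consumer anywhere in the compiler just asks the tcx; collection runs
// on demand the first time and the result is memoized and tracked in the
// dependency graph:
//
//     let ty = tcx.type_of(def_id);
//
// This crate, in turn, registers the functions that answer those queries
// through its `provide` function (sketch):
pub fn provide(providers: &mut Providers) {
    *providers = Providers {
        type_of,
        ..*providers
    };
}

fn type_of<'a, 'tcx>(tcx: TyCtxt<'a, 'tcx, 'tcx>, def_id: DefId) -> Ty<'tcx> {
    // ... convert the item's HIR into its internal `Ty<'tcx>` representation ...
    unimplemented!()
}
```
The `Providers` struct itself is generated by the `define_maps!` macro in
`librustc/ty/maps`, with one function pointer field per query.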
## Type checking
TODO

@ -8,50 +8,21 @@
// option. This file may not be copied, modified, or distributed
// except according to those terms.
/*
# Collect phase
The collect phase of type check has the job of visiting all items,
determining their type, and writing that type into the `tcx.types`
table. Despite its name, this table does not really operate as a
*cache*, at least not for the types of items defined within the
current crate: we assume that after the collect phase, the types of
all local items will be present in the table.
Unlike most of the types that are present in Rust, the types computed
for each item are in fact type schemes. This means that they are
generic types that may have type parameters. TypeSchemes are
represented by a pair of `Generics` and `Ty`. Type
parameters themselves are represented as `ty_param()` instances.
The phasing of type conversion is somewhat complicated. There is no
clear set of phases we can enforce (e.g., converting traits first,
then types, or something like that) because the user can introduce
arbitrary interdependencies. So instead we generally convert things
lazily and on demand, and include logic that checks for cycles.
Demand is driven by calls to `AstConv::get_item_type_scheme` or
`AstConv::trait_def`.
Currently, we "convert" types and traits in two phases (note that
conversion only affects the types of items / enum variants / methods;
it does not e.g. compute the types of individual expressions):
0. Intrinsics
1. Trait/Type definitions
Conversion itself is done by simply walking each of the items in turn
and invoking an appropriate function (e.g., `trait_def_of_item` or
`convert_item`). However, it is possible that while converting an
item, we may need to compute the *type scheme* or *trait definition*
for other items.
There are some shortcomings in this design:
- Because the item generics include defaults, cycles through type
parameter defaults are illegal even if those defaults are never
employed. This is not necessarily a bug.
*/
//! "Collection" is the process of determining the type and other external
//! details of each item in Rust. Collection is specifically concerned
//! with *interprocedural* things -- for example, for a function
//! definition, collection will figure out the type and signature of the
//! function, but it will not visit the *body* of the function in any way,
//! nor examine type annotations on local variables (that's the job of
//! type *checking*).
//!
//! Collecting is ultimately defined by a bundle of queries that
//! inquire after various facts about the items in the crate (e.g.,
//! `type_of`, `generics_of`, `predicates_of`, etc). See the `provide` function
//! for the full set.
//!
//! At present, however, we do run collection across all items in the
//! crate as a kind of pass. This should eventually be factored away.
use astconv::{AstConv, Bounds};
use lint;

src/libsyntax/README.md

@ -0,0 +1,7 @@
NB: This crate is part of the Rust compiler. For an overview of the
compiler as a whole, see
[the README.md file found in `librustc`](../librustc/README.md).
The `syntax` crate contains those things concerned purely with syntax
-- that is, the AST ("abstract syntax tree"), parser, pretty-printer,
lexer, macro expander, and utilities for traversing ASTs.