Commit Graph

13 Commits

Author SHA1 Message Date
Aaron Hill
3321d70161
Address review comments 2021-02-13 13:04:54 -05:00
Aaron Hill
0b411f56e1
Require passing an AttrWrapper to collect_tokens_trailing_token
This is a pure refactoring split out from #80689.
It represents the most invasive part of that PR, requiring changes in
every caller of `parse_outer_attributes`.

In order to eagerly expand `#[cfg]` attributes while preserving the
original `TokenStream`, we need to know the range of tokens that
corresponds to every attribute target. This is accomplished by making
`parse_outer_attributes` return an opaque `AttrWrapper` struct. An
`AttrWrapper` must be converted to a plain `AttrVec` by passing it to
`collect_tokens_trailing_token`. This makes it difficult to accidentally
construct an AST node with attributes without calling `collect_tokens_trailing_token`,
since AST nodes store an `AttrVec`, not an `AttrWrapper`.

As a result, we now call `collect_tokens_trailing_token` for attribute
targets which only support inert attributes, such as generic arguments
and struct fields. Currently, the constructed `LazyTokenStream` is
simply discarded. Future PRs will record the token range corresponding
to the attribute target, allowing those tokens to be removed from an
enclosing `collect_tokens_trailing_token` call if necessary.
2021-02-13 12:07:15 -05:00
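A minimal sketch of the opaque-wrapper pattern described in the commit above, using hypothetical, heavily simplified stand-ins for the rustc_parse types (`Parser`, `AttrVec`), not the real API: because `AttrWrapper`'s field is private, the only way to get a plain `AttrVec` for an AST node is to go through `collect_tokens_trailing_token`.

```rust
// Hypothetical, simplified stand-ins for the rustc_parse types.
type AttrVec = Vec<String>;

/// Opaque wrapper: the field is private, so callers of
/// `parse_outer_attributes` cannot extract the attributes directly.
struct AttrWrapper {
    attrs: AttrVec,
}

struct Parser;

impl Parser {
    /// Returns the wrapper instead of a bare `AttrVec`.
    fn parse_outer_attributes(&mut self) -> AttrWrapper {
        AttrWrapper { attrs: vec!["example_attr".to_string()] }
    }

    /// The only way to unwrap an `AttrWrapper`: the closure builds the AST
    /// node from the now-plain `AttrVec`, while this function gets a chance
    /// to capture the tokens of the attribute target.
    fn collect_tokens_trailing_token<R>(
        &mut self,
        attrs: AttrWrapper,
        f: impl FnOnce(&mut Self, AttrVec) -> R,
    ) -> R {
        // The real parser records a `LazyTokenStream` covering the target
        // here; this sketch simply forwards the attributes.
        f(self, attrs.attrs)
    }
}

fn main() {
    let mut p = Parser;
    let attrs = p.parse_outer_attributes();
    let node = p.collect_tokens_trailing_token(attrs, |_p, attrs| {
        format!("node with {} attribute(s)", attrs.len())
    });
    println!("{node}");
}
```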
Vadim Petrochenkov
dbdbd30bf2 expand/resolve: Turn #[derive] into a regular macro attribute 2021-02-07 20:08:45 +03:00
Aaron Hill
a961e6785c
Set tokens on AST node in collect_tokens
A new `HasTokens` trait is introduced, which is used to move logic from
the callers of `collect_tokens` into the body of `collect_tokens`.

In addition to reducing duplication, this paves the way for PR #80689,
which needs to perform additional logic during token collection.
2021-01-13 22:10:36 -05:00
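A rough sketch of the `HasTokens` idea from the commit above, again with assumed simplified types rather than the real rustc API: because the parsed node implements the trait, `collect_tokens` can set the captured tokens on it directly instead of each caller doing so.

```rust
// Hypothetical, simplified types illustrating the idea.
#[derive(Clone, Debug)]
struct LazyTokenStream(Vec<String>);

trait HasTokens {
    fn set_tokens(&mut self, tokens: LazyTokenStream);
}

#[derive(Debug)]
struct Expr {
    kind: String,
    tokens: Option<LazyTokenStream>,
}

impl HasTokens for Expr {
    fn set_tokens(&mut self, tokens: LazyTokenStream) {
        self.tokens = Some(tokens);
    }
}

struct Parser;

impl Parser {
    /// Because the returned node implements `HasTokens`, the token-setting
    /// logic lives here rather than in every caller.
    fn collect_tokens<N: HasTokens>(&mut self, f: impl FnOnce(&mut Self) -> N) -> N {
        let mut node = f(self);
        // Stand-in for the lazily captured stream.
        node.set_tokens(LazyTokenStream(vec!["captured".to_string()]));
        node
    }
}

fn main() {
    let mut p = Parser;
    let expr = p.collect_tokens(|_p| Expr { kind: "literal".to_string(), tokens: None });
    println!("{expr:?}");
}
```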
Caleb Cartwright
5930a8ab12 rustc_parse: restore pub vis on parse_attribute 2020-11-26 12:54:09 -06:00
Vadim Petrochenkov
dfb690eaa9 resolve/expand: Misc cleanup 2020-11-19 19:25:20 +03:00
Vadim Petrochenkov
12de1e8985 Do not collect tokens for doc comments 2020-11-09 01:47:11 +03:00
Aaron Hill
5c7d8d049c
Only call collect_tokens when we have an attribute to parse 2020-10-22 15:17:40 -04:00
Aaron Hill
920bed1213
Don't create an empty LazyTokenStream 2020-10-22 10:09:08 -04:00
Aaron Hill
b9b2546417
Unconditionally capture tokens for attributes.
This allows us to avoid synthesizing tokens in `prepend_attr`, since we
have the original tokens available.

We still need to synthesize tokens when expanding `cfg_attr`,
but this is an unavoidable consequence of the syntax of `cfg_attr` -
the user does not supply the `#` and `[]` tokens that a `cfg_attr`
expands to.
2020-10-21 18:57:29 -04:00
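A small user-level illustration of the `cfg_attr` point made in the commit above (ordinary Rust code, nothing rustc-internal): the user writes only the predicate and the inner attribute, never the `#[...]` of the expanded form, so those tokens have to be synthesized.

```rust
// The user writes only `cfg_attr(predicate, attribute)`:
#[cfg_attr(debug_assertions, derive(Debug))]
struct Config {
    name: &'static str,
}

// When the predicate holds, this behaves as if the source contained
//
//     #[derive(Debug)]
//     struct Config { name: &'static str }
//
// but the `#`, `[`, and `]` of that form never appear in the user's source,
// so the compiler must synthesize those tokens during expansion.

fn main() {
    let c = Config { name: "example" };
    #[cfg(debug_assertions)]
    println!("{c:?}");
    #[cfg(not(debug_assertions))]
    let _ = c;
}
```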
Aaron Hill
593fdd3d45
Rewrite collect_tokens implementations to use a flattened buffer
Instead of trying to collect tokens at each depth, we 'flatten' the
stream as we go along, pushing open/close delimiters to our buffer
just like regular tokens. Once capturing is complete, we reconstruct a
nested `TokenTree::Delimited` structure, producing a normal
`TokenStream`.

The reconstructed `TokenStream` is not created immediately - instead, it is
produced on-demand by a closure (wrapped in a new `LazyTokenStream` type). This
closure stores a clone of the original `TokenCursor`, plus a record of the
number of calls to `next()/next_desugared()`. This is sufficient to reconstruct
the tokenstream seen by the callback without storing any additional state. If
the tokenstream is never used (e.g. when a captured `macro_rules!` argument is
never passed to a proc macro), we never actually create a `TokenStream`.

This implementation has a number of advantages over the previous one:

* It is significantly simpler, with no edge cases around capturing the
  start/end of a delimited group.

* It can be easily extended to allow replacing tokens at an arbitrary
  'depth' by just using `Vec::splice` at the proper position. This is
  important for PR #76130, which requires us to track information about
  attributes along with tokens.

* The lazy approach to `TokenStream` construction allows us to easily
  parse an AST struct, and then decide after the fact whether we need a
  `TokenStream`. This will be useful when we start collecting tokens for
  `Attribute` - we can discard the `LazyTokenStream` if the parsed
  attribute doesn't need tokens (e.g. is a builtin attribute).

The performance impact seems to be negligible (see
https://github.com/rust-lang/rust/pull/77250#issuecomment-703960604). There is a
small slowdown on a few benchmarks, but it only rises above 1% for incremental
builds, where it represents a larger fraction of the much smaller instruction
count. There is a ~1% speedup on a few other incremental benchmarks - my guess is
that the speedups and slowdowns will usually cancel out in practice.
2020-10-19 13:59:18 -04:00
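A toy sketch of the lazy capture scheme described in the commit above, with assumed simplified types (`FlatToken`, `TokenCursor`) rather than rustc's real ones: the captured stream is represented only by a cloned cursor plus a count of `next()` calls, and is materialized on demand.

```rust
use std::rc::Rc;

// Simplified stand-ins for rustc's token types: delimiters are pushed into
// the flat buffer just like ordinary tokens.
#[derive(Clone, Debug)]
enum FlatToken {
    Token(String),
    OpenDelim,
    CloseDelim,
}

#[derive(Clone)]
struct TokenCursor {
    tokens: Rc<Vec<FlatToken>>,
    pos: usize,
}

impl TokenCursor {
    fn next(&mut self) -> Option<FlatToken> {
        let tok = self.tokens.get(self.pos).cloned();
        self.pos += 1;
        tok
    }
}

/// "Lazy" stream: a snapshot of the cursor plus the number of `next()` calls
/// made while parsing. Replaying them reproduces the captured tokens (a real
/// implementation would also rebuild the nested `Delimited` structure).
struct LazyTokenStream {
    start_cursor: TokenCursor,
    num_calls: usize,
}

impl LazyTokenStream {
    fn create(&self) -> Vec<FlatToken> {
        let mut cursor = self.start_cursor.clone();
        (0..self.num_calls).filter_map(|_| cursor.next()).collect()
    }
}

fn main() {
    let tokens = Rc::new(vec![
        FlatToken::Token("fn".to_string()),
        FlatToken::Token("f".to_string()),
        FlatToken::OpenDelim,
        FlatToken::CloseDelim,
    ]);
    let mut cursor = TokenCursor { tokens, pos: 0 };

    // Snapshot the cursor, "parse" by consuming tokens, and count the calls.
    let start = cursor.clone();
    let mut num_calls = 0;
    while cursor.next().is_some() {
        num_calls += 1;
    }

    // Only materialize the stream if something (e.g. a proc macro) needs it.
    let lazy = LazyTokenStream { start_cursor: start, num_calls };
    println!("{:?}", lazy.create());
}
```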
Aaron Hill
3815e91ccd
Attach tokens to NtMeta (ast::AttrItem)
An `AttrItem` does not have outer attributes, so we only capture tokens
when parsing a `macro_rules!` matcher.
2020-09-10 17:33:06 -04:00
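For context, a plain user-level example (not rustc code) of the `macro_rules!` case mentioned in the commit above: the `$m:meta` fragment is where the parser produces an `NtMeta`/`ast::AttrItem`, and its tokens matter if the fragment is later re-emitted as a real attribute.

```rust
// `$m:meta` matches an attribute item such as `derive(Debug)` without the
// surrounding `#[...]`; the expansion re-wraps it as a real attribute.
macro_rules! apply_meta {
    ($m:meta, $item:item) => {
        #[$m]
        $item
    };
}

apply_meta!(derive(Debug), struct Point { x: i32, y: i32 });

fn main() {
    println!("{:?}", Point { x: 1, y: 2 });
}
```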
mark
9e5f7d5631 mv compiler to compiler/ 2020-08-30 18:45:07 +03:00