Commit Graph

9 Commits

Caleb Cartwright
5930a8ab12 rustc_parse: restore pub vis on parse_attribute 2020-11-26 12:54:09 -06:00
Vadim Petrochenkov
dfb690eaa9 resolve/expand: Misc cleanup 2020-11-19 19:25:20 +03:00
Vadim Petrochenkov
12de1e8985 Do not collect tokens for doc comments 2020-11-09 01:47:11 +03:00
Aaron Hill
5c7d8d049c Only call collect_tokens when we have an attribute to parse 2020-10-22 15:17:40 -04:00
Aaron Hill
920bed1213 Don't create an empty LazyTokenStream 2020-10-22 10:09:08 -04:00
Aaron Hill
b9b2546417 Unconditionally capture tokens for attributes.
This allows us to avoid synthesizing tokens in `prepend_attr`, since we
have the original tokens available.

We still need to synthesize tokens when expanding `cfg_attr`,
but this is an unavoidable consequence of the syntax of `cfg_attr` -
the user does not supply the `#` and `[]` tokens that a `cfg_attr`
expands to.
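
For illustration, a minimal example of the `cfg_attr` expansion being described here (using the built-in `debug_assertions` cfg):

```rust
// The user writes:
#[cfg_attr(debug_assertions, derive(Debug))]
struct Point {
    x: i32,
    y: i32,
}

// In a debug build this expands to:
//
//     #[derive(Debug)]
//     struct Point { ... }
//
// The `#` and `[]` wrapping `derive(Debug)` exist nowhere in the user's
// source, so the compiler must synthesize those tokens during expansion.
fn main() {
    let _p = Point { x: 1, y: 2 };
    #[cfg(debug_assertions)]
    println!("{:?}", _p);
}
```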
2020-10-21 18:57:29 -04:00
Aaron Hill
593fdd3d45 Rewrite collect_tokens implementations to use a flattened buffer
Instead of trying to collect tokens at each depth, we 'flatten' the
stream as we go along, pushing open/close delimiters to our buffer
just like regular tokens. Once capturing is complete, we reconstruct a
nested `TokenTree::Delimited` structure, producing a normal
`TokenStream`.
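
A minimal sketch of the flattening idea, using toy stand-in types rather than rustc's real `Token`/`TokenTree` (tokens simplified to single characters):

```rust
// Toy stand-ins for the flattened buffer entries: delimiters are pushed
// into the buffer just like ordinary tokens.
#[derive(Debug)]
enum FlatToken {
    Token(char),
    OpenDelim(char),
    CloseDelim(char),
}

// Toy stand-in for the nested structure rebuilt after capturing.
#[derive(Debug)]
enum TokenTree {
    Token(char),
    Delimited(char, Vec<TokenTree>),
}

// Rebuild the nested `Delimited` structure from the flat buffer.
fn reconstruct(flat: &[FlatToken]) -> Vec<TokenTree> {
    // Stack of partially built groups; the bottom entry is the top level.
    let mut stack: Vec<(char, Vec<TokenTree>)> = vec![(' ', Vec::new())];
    for tok in flat {
        match tok {
            FlatToken::Token(c) => {
                stack.last_mut().unwrap().1.push(TokenTree::Token(*c));
            }
            FlatToken::OpenDelim(d) => stack.push((*d, Vec::new())),
            FlatToken::CloseDelim(_) => {
                let (delim, trees) = stack.pop().unwrap();
                stack.last_mut().unwrap().1.push(TokenTree::Delimited(delim, trees));
            }
        }
    }
    stack.pop().unwrap().1
}

fn main() {
    // Flattened capture of `a ( b c )`:
    let flat = [
        FlatToken::Token('a'),
        FlatToken::OpenDelim('('),
        FlatToken::Token('b'),
        FlatToken::Token('c'),
        FlatToken::CloseDelim(')'),
    ];
    println!("{:?}", reconstruct(&flat));
}
```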

The reconstructed `TokenStream` is not created immediately - instead, it is
produced on-demand by a closure (wrapped in a new `LazyTokenStream` type). This
closure stores a clone of the original `TokenCursor`, plus a record of the
number of calls to `next()/next_desugared()`. This is sufficient to reconstruct
the tokenstream seen by the callback without storing any additional state. If
the tokenstream is never used (e.g. when a captured `macro_rules!` argument is
never passed to a proc macro), we never actually create a `TokenStream`.
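
The lazy construction can be sketched in miniature (toy types again; the real `LazyTokenStream` stores a clone of the parser's `TokenCursor`):

```rust
// Toy cursor over a token buffer; cloning it is a cheap snapshot.
#[derive(Clone)]
struct TokenCursor {
    tokens: Vec<char>,
    pos: usize,
}

impl TokenCursor {
    fn next(&mut self) -> Option<char> {
        let tok = self.tokens.get(self.pos).copied();
        self.pos += 1;
        tok
    }
}

// The lazy stream is just a closure that can replay the captured tokens.
struct LazyTokenStream(Box<dyn Fn() -> Vec<char>>);

fn main() {
    let mut cursor = TokenCursor { tokens: "#[test]".chars().collect(), pos: 0 };

    // Snapshot the cursor before parsing starts.
    let snapshot = cursor.clone();

    // "Parse", recording only the number of `next()` calls made.
    let mut calls = 0;
    while cursor.next().is_some() {
        calls += 1;
    }

    // Nothing is materialized yet: the stream is rebuilt on demand by
    // replaying `calls` tokens from the snapshot, and never built at all
    // if the closure is never invoked.
    let lazy = LazyTokenStream(Box::new(move || {
        let mut replay = snapshot.clone();
        (0..calls).filter_map(|_| replay.next()).collect()
    }));

    println!("{:?}", (lazy.0)());
}
```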

This implementation has a number of advantages over the previous one:

* It is significantly simpler, with no edge cases around capturing the
  start/end of a delimited group.

* It can be easily extended to allow replacing tokens at an arbitrary
  'depth' by just using `Vec::splice` at the proper position (see the
  sketch after this list). This is important for PR #76130, which
  requires us to track information about attributes along with tokens.

* The lazy approach to `TokenStream` construction allows us to easily
  parse an AST struct, and then decide after the fact whether we need a
  `TokenStream`. This will be useful when we start collecting tokens for
  `Attribute` - we can discard the `LazyTokenStream` if the parsed
  attribute doesn't need tokens (e.g. is a builtin attribute).
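
As a toy illustration of the `Vec::splice` point above (plain standard library code, not the compiler's):

```rust
fn main() {
    // Flat buffer for `a ( b ) c`, with delimiters stored inline.
    let mut flat: Vec<char> = vec!['a', '(', 'b', ')', 'c'];

    // Replace the nested `( b )` group in one operation, even though it
    // sits one level "deep": splice out positions 1..4 and insert new
    // tokens in their place.
    let removed: Vec<char> = flat.splice(1..4, ['x', 'y']).collect();

    assert_eq!(removed, vec!['(', 'b', ')']);
    assert_eq!(flat, vec!['a', 'x', 'y', 'c']);
}
```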

The performance impact seems to be negligible (see
https://github.com/rust-lang/rust/pull/77250#issuecomment-703960604). There is a
small slowdown on a few benchmarks, but it only rises above 1% for incremental
builds, where it represents a larger fraction of the much smaller instruction
count. There is a ~1% speedup on a few other incremental benchmarks - my guess is
that the speedups and slowdowns will usually cancel out in practice.
2020-10-19 13:59:18 -04:00
Aaron Hill
3815e91ccd Attach tokens to NtMeta (ast::AttrItem)
An `AttrItem` does not have outer attributes, so we only capture tokens
when parsing a `macro_rules!` matcher.
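
A hedged illustration of that capture site, as ordinary user code (the `$m:meta` fragment is where an `AttrItem` gets parsed and its tokens recorded):

```rust
// `$m:meta` captures an `AttrItem` as an `NtMeta` nonterminal. Attaching
// the original tokens means anything that later re-emits `$m` (e.g. a
// proc macro) sees the exact source tokens instead of synthesized ones.
macro_rules! apply_meta {
    ($m:meta) => {
        #[$m]
        struct Tagged;
    };
}

apply_meta!(derive(Clone));

fn main() {
    let _copy = Tagged.clone();
}
```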
2020-09-10 17:33:06 -04:00
mark
9e5f7d5631 mv compiler to compiler/ 2020-08-30 18:45:07 +03:00