Commit Graph

444 Commits

Author SHA1 Message Date
Zalathar
7fdac5eef0 coverage: Defer the filtering of hole spans 2025-03-21 21:23:50 +11:00
Zalathar
83b56eb059 coverage: Separate span-extraction from unexpansion 2025-03-21 21:23:50 +11:00
Zalathar
cc8336b6c1 coverage: Don't store a body span in FunctionCoverageInfo 2025-03-18 23:18:24 +11:00
bors
7d8c6e781d Auto merge of #135726 - jdonszelmann:attr-parsing, r=oli-obk
New attribute parsing infrastructure

Another step in the plan outlined in https://github.com/rust-lang/rust/issues/131229

This PR introduces infrastructure for structured attribute parsers, and converts a couple of complex attributes to use such structured parsers.

This PR may prove too large to review. I left some of my own comments to guide the review a little. Some general notes:

- The first commit is basically standalone. It just preps some mostly unrelated sources for the rest of the PR to work. It might not have enormous merit on its own, but it doesn't have negative merit either. It could be merged alone, though that wouldn't make the review a whole lot easier (it's only +274 -209).
- The second commit is the one that introduces the new infrastructure. It's the important one to review.
- The 3rd commit uses the new infrastructure, showing how some of the more complex attributes can be parsed with it. It could theoretically be split up, though the parsers in this commit are the ones that really exercise the new infrastructure and show that it all works.
- The 4th commit fixes up rustdoc and clippy. After the previous two commits they didn't compile yet, even though the compiler itself did. I separated them out to keep concerns apart and make the rest more palatable.
- The 5th commit blesses some test outputs. Sometimes that's just because a diagnostic happens slightly earlier than before, which I'd say is acceptable. Sometimes a diagnostic is now only emitted once where it would've been emitted twice before (yay, fixed some bugs!). One test I actually moved from crashes to fixed, because it simply doesn't crash anymore; that's why this PR closes #132391. I think most choices I made here are generally reasonable, but let me know if you disagree anywhere.
- The 6th commit adds a derive to pretty-print attributes.
- The 7th commit removes the SMIR APIs for attributes, for the time being. That API will at some point be replaced by one based on `rustc_ast_data_structures::AttributeKind`.

In general, a lot of the additions here are comments. I've found it very important to document the new things in the 2nd commit well, so that other people can start using them.

Closes #132391
Closes #136717
2025-02-24 23:07:24 +00:00
Jana Dönszelmann
115b3b03b0 Change span field accesses to method calls 2025-02-24 14:22:31 +01:00
Jacob Pratt
6aa015ae9d Rollup merge of #136610 - Jarcho:range_idx, r=Noratrieb
Allow `IndexSlice` to be indexed by ranges.

This comes with some annoyances, as the index type can no longer be inferred from indexing expressions. The biggest offender for this is `IndexVec::from_fn_n(|idx| ..., n)`, where the index type won't be inferred from the call site or from any index expressions inside the closure.

My main use case for this is mapping a `Place` to `Range<Idx>` for value tracking where the range represents all the values the place contains.
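A minimal sketch of the annotation now needed, assuming an in-tree compiler crate that already depends on `rustc_index` (so the usual nightly features behind `newtype_index!` are available); `ValueIdx` and `build` are made-up names for illustration:

```rust
use rustc_index::IndexVec;

rustc_index::newtype_index! {
    pub struct ValueIdx {}
}

fn build(n: usize) -> IndexVec<ValueIdx, u32> {
    // The closure parameter now needs an explicit type, since the index type
    // is no longer inferred from indexing expressions inside the closure.
    IndexVec::from_fn_n(|idx: ValueIdx| idx.as_u32() * 2, n)
}
```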
2025-02-24 02:11:32 -05:00
Michael Goulet
12e3911d81 Greatly simplify lifetime captures in edition 2024 2025-02-22 22:24:52 +00:00
Jason Newcomb
162fb713ac Allow `IndexSlice` to be indexed by ranges. 2025-02-21 16:10:31 -05:00
Zalathar
d38f6880c0 coverage: Make HolesVisitor::visit_hole_span a regular method 2025-02-19 14:02:29 +11:00
Zalathar
51f704f0ff coverage: Get hole spans from nested items without fully visiting them
It turns out that this visitor doesn't actually need `nested_filter::All` to
handle nested items; it just needs to override `visit_nested_item` and look up
the item's span.
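A rough sketch of that pattern (not the commit's actual code; `HoleSpans` and the use of `def_span` are assumptions): override `visit_nested_item` to record the nested item's span without requiring `nested_filter::All` or walking into the item.

```rust
use rustc_hir::intravisit::Visitor;
use rustc_hir::ItemId;
use rustc_middle::ty::TyCtxt;
use rustc_span::Span;

struct HoleSpans<'tcx> {
    tcx: TyCtxt<'tcx>,
    holes: Vec<Span>,
}

impl<'tcx> Visitor<'tcx> for HoleSpans<'tcx> {
    fn visit_nested_item(&mut self, id: ItemId) {
        // Only the nested item's span is needed; its contents are not visited.
        self.holes.push(self.tcx.def_span(id.owner_id.def_id));
    }
}
```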
2025-02-19 14:01:46 +11:00
Nicholas Nethercote
661f99ba03 Overhaul the intravisit::Map trait.
First of all, note that `Map` has three different relevant meanings.
- The `intravisit::Map` trait.
- The `map::Map` struct.
- The `NestedFilter::Map` associated type.

The `intravisit::Map` trait is impl'd twice.
- For `!`, where the methods are all unreachable.
- For `map::Map`, which gets HIR stuff from the `TyCtxt`.

As part of getting rid of `map::Map`, this commit changes `impl intravisit::Map for map::Map` to `impl intravisit::Map for TyCtxt`. It's fairly straightforward, except that various things are renamed because the existing names would no longer have made sense.

- `trait intravisit::Map` becomes `trait intravisit::HirTyCtxt`, so named
  because it gets some HIR stuff from a `TyCtxt`.
- `NestedFilter::Map` assoc type becomes `NestedFilter::MaybeTyCtxt`,
  because it's always `!` or `TyCtxt`.
- `Visitor::nested_visit_map` becomes `Visitor::maybe_tcx`.

I deliberately made the new trait and associated type names different to
avoid the old `type Map: Map` situation, which I found confusing. We now
have `type MaybeTyCtxt: HirTyCtxt`.
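A sketch of what a typical visitor impl looks like after the rename, with the old spelling shown for comparison (illustrative only; `SpanCollector` is made up and the `OnlyBodies` nested filter is an assumption):

```rust
use rustc_hir::intravisit::Visitor;
use rustc_middle::hir::nested_filter;
use rustc_middle::ty::TyCtxt;

struct SpanCollector<'tcx> {
    tcx: TyCtxt<'tcx>,
}

impl<'tcx> Visitor<'tcx> for SpanCollector<'tcx> {
    type NestedFilter = nested_filter::OnlyBodies;

    // Before: `fn nested_visit_map(&mut self) -> Self::Map { self.tcx.hir() }`
    // After: the method is `maybe_tcx` and it hands back the `TyCtxt` itself.
    fn maybe_tcx(&mut self) -> Self::MaybeTyCtxt {
        self.tcx
    }
}
```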
2025-02-17 13:21:35 +11:00
Nicholas Nethercote
f86f7ad5f2 Move some Map methods onto TyCtxt.
The end goal is to eliminate `Map` altogether.

I added a `hir_` prefix to all of them; that seemed simplest. The exceptions are `module_items`, which became `hir_module_free_items` because there was already a `hir_module_items`, and `items`, which became `hir_free_items` for consistency with `hir_module_free_items`.
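An illustrative call-site change under this scheme (the surrounding function is made up; the method names are the ones given in the message):

```rust
use rustc_middle::ty::TyCtxt;

fn count_free_items(tcx: TyCtxt<'_>) -> usize {
    // Before (roughly): tcx.hir().items().count()
    tcx.hir_free_items().count()
}
```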
2025-02-17 13:21:02 +11:00
Zalathar
ab786d3b98 coverage: Eliminate more counters by giving them to unreachable nodes
When preparing a function's coverage counters and metadata during codegen, any
part of the original coverage graph that was removed by MIR optimizations can
be treated as having an execution count of zero.

Somewhat counter-intuitively, if we give those unreachable nodes a _higher_
priority for receiving physical counters (instead of counter expressions), that
ends up reducing the total number of physical counters needed.

This works because if a node is unreachable, we don't actually create a
physical counter for it. Instead that node gets a fixed zero counter, and any
other node that would have relied on that physical counter in its counter
expression can just ignore that term completely.
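A toy model (not compiler code) of the effect described above: a term that refers to an unreachable node becomes a fixed zero, so it drops out of counter expressions and no physical counter is ever emitted for that node.

```rust
#[derive(Clone, Copy, Debug, PartialEq)]
enum Term {
    Counter(u32), // a real `llvm.instrprof.increment` counter ID
    Zero,         // the node was removed by MIR opts; no counter is emitted
}

// Drop zero terms from a sum-of-terms counter expression.
fn simplify(terms: &[Term]) -> Vec<Term> {
    terms.iter().copied().filter(|t| *t != Term::Zero).collect()
}

fn main() {
    // count(A) = count(B) + count(C), where C turned out to be unreachable.
    // Giving the "physical" counter slot to C costs nothing (it becomes Zero),
    // so only one real counter (B's) is needed for this fragment.
    let a = simplify(&[Term::Counter(1), Term::Zero]);
    assert_eq!(a, vec![Term::Counter(1)]);
}
```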
2025-02-13 13:45:53 +11:00
Jubilee
7f8108afc8 Rollup merge of #136053 - Zalathar:defer-counters, r=saethlin
coverage: Defer part of counter-creation until codegen

Follow-up to #135481 and #135873.

One of the pleasant properties of the new counter-assignment algorithm is that we can stop partway through the process, store the intermediate state in MIR, and then resume the rest of the algorithm during codegen. This lets it take into account which parts of the control-flow graph were eliminated by MIR opts, resulting in fewer physical counters and simpler counter expressions.

Those improvements end up completely obsoleting much larger chunks of code that were previously responsible for cleaning up the coverage metadata after MIR opts, while also doing a more thorough cleanup job.

(That change also unlocks some further simplifications that I've kept out of this PR to limit its scope.)
2025-02-10 00:51:49 -08:00
bjorn3
1fcae03369 Rustfmt 2025-02-08 22:12:13 +00:00
Zalathar
bd855b6c9e coverage: Remove the old code for simplifying counters after MIR opts 2025-02-06 21:44:31 +11:00
Zalathar
bf1f254b13 coverage: Don't create counters for code that was removed by MIR opts 2025-02-06 21:44:31 +11:00
Zalathar
20d051ec87 coverage: Defer part of counter-creation until codegen 2025-02-06 21:44:31 +11:00
Zalathar
ee7dc06cf1 coverage: Store BCB node IDs in mappings, and resolve them in codegen
Even though the coverage graph itself is no longer available during codegen,
its nodes can still be used as opaque IDs.
2025-02-06 21:44:29 +11:00
Nicholas Nethercote
e661514bda Remove hook calling via TyCtxtAt.
All hooks receive a `TyCtxtAt` argument.

Currently hooks can be called through `TyCtxtAt` or `TyCtxt`. In the
latter case, a `TyCtxtAt` is constructed with a dummy span and passed to
the hook.

However, in practice hooks are never called through `TyCtxtAt`, and
always receive a dummy span. (I confirmed this via code inspection, and
double-checked it by temporarily making the `TyCtxtAt` code path panic
and running all the tests.)

This commit removes all the `TyCtxtAt` machinery for hooks. All hooks
now receive `TyCtxt` instead of `TyCtxtAt`. There are two existing hooks
that use `TyCtxtAt::span`: `const_caller_location_provider` and
`try_destructure_mir_constant_for_user_output`. For both hooks the span
is always a dummy span, probably unintentionally. This dummy span use is
now explicit. If a non-dummy span is needed for these two hooks it would
be easy to add it as an extra argument because hooks are less
constrained than queries.
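A before/after sketch of what a hook provider receives (illustrative only; `my_hook` and its argument are made up, and the `declare_hooks!` plumbing is elided):

```rust
use rustc_middle::ty::TyCtxt;

// Before: fn my_hook<'tcx>(tcx: TyCtxtAt<'tcx>, n: usize) -> usize,
// where `tcx.span` was always a dummy span in practice.

// After: the hook receives a plain `TyCtxt`; if a real span were ever needed,
// it could be added as an explicit extra argument.
fn my_hook<'tcx>(_tcx: TyCtxt<'tcx>, n: usize) -> usize {
    n
}
```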
2025-02-03 17:02:33 +11:00
Zalathar
d36e2b88d6 Incorporate iter_nodes into graph::DirectedGraph
This assumes that the set of valid node IDs is exactly `0..num_nodes`.

In practice, we have a lot of graph-algorithm code that already assumes that
nodes are densely numbered, by using `num_nodes` to allocate per-node indexed
data structures.
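A sketch of roughly what such a node iterator can look like under that assumption (signature approximate, written here as a free function rather than a trait method):

```rust
use rustc_data_structures::graph::DirectedGraph;
use rustc_index::Idx;

// Because node IDs densely cover `0..num_nodes`, iterating the nodes is just
// iterating that range and wrapping each index in the node type.
fn iter_nodes<G: DirectedGraph>(graph: &G) -> impl Iterator<Item = G::Node> {
    (0..graph.num_nodes()).map(<G::Node as Idx>::new)
}
```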
2025-01-26 14:08:42 +11:00
bors
6365178a6b Auto merge of #128657 - clubby789:optimize-none, r=fee1-dead,WaffleLapkin
Add `#[optimize(none)]`

cc #54882

This extends the `optimize` attribute to add `none`, which corresponds to the LLVM `OptimizeNone` attribute.

Not sure if an MCP is required for this; happy to file one if so.
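A minimal usage sketch (assuming a nightly toolchain where this has landed; the attribute sits behind the unstable `optimize_attribute` feature):

```rust
#![feature(optimize_attribute)]

// Corresponds to LLVM's `optnone`: the function is excluded from most
// optimizations, which can make it easier to inspect or debug.
#[optimize(none)]
fn not_optimized() -> u32 {
    (0..10u32).sum()
}

fn main() {
    println!("{}", not_optimized());
}
```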
2025-01-25 05:50:36 +00:00
Zalathar
2bdc67a75e coverage: Treat the "merged node flow graph" as a plain data struct
By removing all methods from this struct and treating it as a collection of
data fields, we make it easier for a future PR to store that data in a query
result, without having to move all of its methods into `rustc_middle`.
2025-01-24 16:13:12 +11:00
Zalathar
4b20a27ae0 coverage: Replace FrozenUnionFind with a plain IndexVec
This dedicated type seemed like a good idea at the time, but if we want to
store this information in a query result then a plainer data type is more
convenient.
2025-01-24 16:13:12 +11:00
Zalathar
52c1bfa7bb coverage: Simplify how counter terms are stored
Using `SmallVec` here was fine when it was a module-private detail, but if we
want to pass these terms across query boundaries then it's not worth the extra
hassle.

Replacing a method call with direct field access is slightly simpler.

Using the name `counter_terms` is more consistent with other code that tries to
maintain a distinction between (physical) "counters" and "expressions".
2025-01-24 16:13:12 +11:00
Zalathar
ff48331588 coverage: Make query coverage_ids_info return an Option
This reflects the fact that we can't compute meaningful info for a function
that wasn't instrumented and therefore doesn't have `function_coverage_info`.
2025-01-24 16:13:11 +11:00
Zalathar
ec6fc95d6d coverage: Remove some dead code from MC/DC branch mapping conversion 2025-01-24 16:06:18 +11:00
clubby789
7a9661d768 Disable non-required MIR opts with optimize(none)
Co-authored-by: Waffle Lapkin <waffle.lapkin@gmail.com>
2025-01-23 17:40:41 +00:00
Yotam Ofek
264fa0fc54 Run clippy --fix for unnecessary_map_or lint 2025-01-19 19:15:00 +00:00
Zalathar
ea0c86c434 coverage: Add a few more comments to counter creation 2025-01-18 22:14:16 +11:00
Zalathar
6000d5c462 coverage: Remove BcbCounter and BcbExpression
Making these separate types from `CovTerm` and `Expression` was historically
very helpful, but now that most of the counter-creation work is handled by
`node_flow` they are no longer needed.
2025-01-18 22:14:16 +11:00
Zalathar
4170b93cdc coverage: Flatten top-level counter creation into plain functions
- Move `make_bcb_counters` out of `CoverageCounters`
- Split out `make_node_counter_priority_list`
- Flatten `Transcriber` into the function `transcribe_counters`
2025-01-18 22:12:47 +11:00
Zalathar
48399442d4 coverage: Move phys_counter_for_node into CoverageCounters 2025-01-18 18:51:22 +11:00
Zalathar
a14c35c507 coverage: Remove the Site enum now that we only instrument nodes 2025-01-18 17:12:18 +11:00
Zalathar
6eabf03526 coverage: Make yank_to_spantree_root iterative instead of recursive
This avoids having to worry about stack space when traversing very long
spantree paths, e.g. when instrumenting a long sequence of if/else statements.
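A generic illustration of the transformation (not the actual compiler code; the parent-array representation is an assumption): the recursive walk up to the root becomes a loop, so stack depth no longer grows with the path length.

```rust
fn path_to_root(parent: &[Option<usize>], mut node: usize) -> Vec<usize> {
    let mut path = vec![node];
    // What was a recursive call on `parent[node]` is now a loop iteration.
    while let Some(p) = parent[node] {
        path.push(p);
        node = p;
    }
    path
}

fn main() {
    // A chain 0 <- 1 <- 2 <- 3, where node 0 is the root.
    let parent = [None, Some(0), Some(1), Some(2)];
    assert_eq!(path_to_root(&parent, 3), vec![3, 2, 1, 0]);
}
```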
2025-01-16 22:07:18 +11:00
Zalathar
f1300c860e coverage: Completely overhaul counter assignment, using node-flow graphs 2025-01-16 22:07:18 +11:00
Rémy Rakic
a13354bea0 rename BitSet to DenseBitSet
This should make it clearer that this bitset is dense, with the
advantages and disadvantages that it entails.
2025-01-11 11:34:01 +00:00
Matthias Krüger
1c619373f9 remove more redundant into() conversions 2025-01-10 07:08:28 +01:00
Zalathar
544809e48a coverage: Rename basic_coverage_blocks to just graph
During coverage instrumentation, this variable always holds the coverage graph,
which is a simplified view of the MIR control-flow graph. The new name is
clearer in context, and also shorter.
2024-12-20 17:48:59 +11:00
Zalathar
34ed51cb83 coverage: Store coverage source regions as Span until codegen 2024-12-19 18:09:09 +11:00
León Orell Valerian Liehr
bb8a20678c Rollup merge of #134029 - Zalathar:zero, r=oli-obk
coverage: Use a query to find counters/expressions that must be zero

As of #133446, this query (`coverage_ids_info`) determines which counter/expression IDs are unused. So with only a little extra work, we can take the code that was using that information to determine which coverage counters/expressions must be zero, and move that inside the query as well.

There should be no change in compiler output.
2024-12-10 08:55:59 +01:00
Zalathar
2022ef7f12 coverage: Use a query to find counters/expressions that must be zero
This query (`coverage_ids_info`) already determines which counter/expression
IDs are unused, so it only takes a little extra effort to also determine which
counters/expressions must have a value of zero.
2024-12-08 20:53:39 +11:00
Zalathar
f3f7c20f7b coverage: Move CoverageIdsInfo into mir::coverage 2024-12-08 17:50:42 +11:00
Zalathar
ac815ff6b0 coverage: Prefer to visit nodes whose predecessors have been visited 2024-12-07 12:13:12 +11:00
Zalathar
ba08056d47 coverage: Remove the expression simplifier from CoverageCounters
These simplifications are now handled by the transcribe step.
2024-12-04 17:55:57 +11:00
Zalathar
d7090f335c coverage: Use a separate counter type during counter creation 2024-12-04 17:55:53 +11:00
Zalathar
44e4e4515c coverage: Add an extra "transcribe" step after counter creation 2024-12-04 17:50:52 +11:00
Zalathar
aca6dba6d1 coverage: Use a single make_phys_counter method
This is more convenient for subsequent patches.
2024-12-04 17:00:25 +11:00
Zalathar
7ecc677f5b coverage: Rename CounterIncrementSite to just Site
A "site" is a node or edge in the coverage graph.
2024-12-04 17:00:25 +11:00
Zalathar
2a3b4a0afd coverage: Extract subtracted_sum in counter creation 2024-12-04 17:00:25 +11:00