Auto merge of #132384 - matthiaskrgr:rollup-0ze5wc4, r=matthiaskrgr

Rollup of 4 pull requests

Successful merges:

 - #132347 (Remove `ValueAnalysis` and `ValueAnalysisWrapper`.)
 - #132365 (pass `RUSTC_HOST_FLAGS` at once without the for loop)
 - #132366 (Do not enforce `~const` constness effects in typeck if `rustc_do_not_const_check`)
 - #132376 (Annotate `input` reference tests)

r? `@ghost`
`@rustbot` modify labels: rollup
bors 2024-10-31 06:22:57 +00:00
commit 4d296eabe4
28 changed files with 355 additions and 511 deletions


@@ -851,6 +851,11 @@ impl<'a, 'tcx> FnCtxt<'a, 'tcx> {
return;
}
// If we have `rustc_do_not_const_check`, do not check `~const` bounds.
if self.tcx.has_attr(self.body_id, sym::rustc_do_not_const_check) {
return;
}
let host = match self.tcx.hir().body_const_context(self.body_id) {
Some(hir::ConstContext::Const { .. } | hir::ConstContext::Static(_)) => {
ty::BoundConstness::Const


@@ -1,38 +1,3 @@
//! This module provides a framework on top of the normal MIR dataflow framework to simplify the
//! implementation of analyses that track information about the values stored in certain places.
//! We are using the term "place" here to refer to a `mir::Place` (a place expression) instead of
//! an `interpret::Place` (a memory location).
//!
//! The default methods of [`ValueAnalysis`] (prefixed with `super_` instead of `handle_`)
//! provide some behavior that should be valid for all abstract domains that are based only on the
//! value stored in a certain place. On top of these default rules, an implementation should
//! override some of the `handle_` methods. For an example, see `ConstAnalysis`.
//!
//! An implementation must also provide a [`Map`]. Before the analysis begins, all places that
//! should be tracked during the analysis must be registered. During the analysis, no new places
//! can be registered. The [`State`] can be queried to retrieve the abstract value stored for a
//! certain place by passing the map.
//!
//! This framework is currently experimental. Originally, it supported shared references and enum
//! variants. However, it was discovered that both of these were unsound, and especially references
//! had subtle but serious issues. In the future, they could be added back in, but we should clarify
//! the rules for optimizations that rely on the aliasing model first.
//!
//!
//! # Notes
//!
//! - The bottom state denotes uninitialized memory. Because we are only doing a sound approximation
//! of the actual execution, we can also use this state for places where access would be UB.
//!
//! - The assignment logic in `State::insert_place_idx` assumes that the places are non-overlapping,
//! or identical. Note that this refers to place expressions, not memory locations.
//!
//! - Currently, places that have their reference taken cannot be tracked. Although this would be
//! possible, it has to rely on some aliasing model, which we are not ready to commit to yet.
//! Because of that, we can assume that the only way to change the value behind a tracked place is
//! by direct assignment.
use std::assert_matches::assert_matches;
use std::fmt::{Debug, Formatter};
use std::ops::Range;
@@ -42,359 +7,14 @@ use rustc_data_structures::fx::{FxHashMap, FxIndexSet, StdEntry};
use rustc_data_structures::stack::ensure_sufficient_stack;
use rustc_index::IndexVec;
use rustc_index::bit_set::BitSet;
use rustc_middle::bug;
use rustc_middle::mir::tcx::PlaceTy;
use rustc_middle::mir::visit::{MutatingUseContext, PlaceContext, Visitor};
use rustc_middle::mir::*;
use rustc_middle::ty::{self, Ty, TyCtxt};
use tracing::debug;
use crate::fmt::DebugWithContext;
use crate::JoinSemiLattice;
use crate::lattice::{HasBottom, HasTop};
use crate::{Analysis, JoinSemiLattice, SwitchIntEdgeEffects};
pub trait ValueAnalysis<'tcx> {
/// For each place of interest, the analysis tracks a value of the given type.
type Value: Clone + JoinSemiLattice + HasBottom + HasTop + Debug;
const NAME: &'static str;
fn map(&self) -> &Map<'tcx>;
fn handle_statement(&self, statement: &Statement<'tcx>, state: &mut State<Self::Value>) {
self.super_statement(statement, state)
}
fn super_statement(&self, statement: &Statement<'tcx>, state: &mut State<Self::Value>) {
match &statement.kind {
StatementKind::Assign(box (place, rvalue)) => {
self.handle_assign(*place, rvalue, state);
}
StatementKind::SetDiscriminant { box place, variant_index } => {
self.handle_set_discriminant(*place, *variant_index, state);
}
StatementKind::Intrinsic(box intrinsic) => {
self.handle_intrinsic(intrinsic, state);
}
StatementKind::StorageLive(local) | StatementKind::StorageDead(local) => {
// StorageLive leaves the local in an uninitialized state.
// StorageDead makes it UB to access the local afterwards.
state.flood_with(Place::from(*local).as_ref(), self.map(), Self::Value::BOTTOM);
}
StatementKind::Deinit(box place) => {
// Deinit makes the place uninitialized.
state.flood_with(place.as_ref(), self.map(), Self::Value::BOTTOM);
}
StatementKind::Retag(..) => {
// We don't track references.
}
StatementKind::ConstEvalCounter
| StatementKind::Nop
| StatementKind::FakeRead(..)
| StatementKind::PlaceMention(..)
| StatementKind::Coverage(..)
| StatementKind::AscribeUserType(..) => (),
}
}
fn handle_set_discriminant(
&self,
place: Place<'tcx>,
variant_index: VariantIdx,
state: &mut State<Self::Value>,
) {
self.super_set_discriminant(place, variant_index, state)
}
fn super_set_discriminant(
&self,
place: Place<'tcx>,
_variant_index: VariantIdx,
state: &mut State<Self::Value>,
) {
state.flood_discr(place.as_ref(), self.map());
}
fn handle_intrinsic(
&self,
intrinsic: &NonDivergingIntrinsic<'tcx>,
state: &mut State<Self::Value>,
) {
self.super_intrinsic(intrinsic, state);
}
fn super_intrinsic(
&self,
intrinsic: &NonDivergingIntrinsic<'tcx>,
_state: &mut State<Self::Value>,
) {
match intrinsic {
NonDivergingIntrinsic::Assume(..) => {
// Could use this, but ignoring it is sound.
}
NonDivergingIntrinsic::CopyNonOverlapping(CopyNonOverlapping {
dst: _,
src: _,
count: _,
}) => {
// This statement represents `*dst = *src`, `count` times.
}
}
}
fn handle_assign(
&self,
target: Place<'tcx>,
rvalue: &Rvalue<'tcx>,
state: &mut State<Self::Value>,
) {
self.super_assign(target, rvalue, state)
}
fn super_assign(
&self,
target: Place<'tcx>,
rvalue: &Rvalue<'tcx>,
state: &mut State<Self::Value>,
) {
let result = self.handle_rvalue(rvalue, state);
state.assign(target.as_ref(), result, self.map());
}
fn handle_rvalue(
&self,
rvalue: &Rvalue<'tcx>,
state: &mut State<Self::Value>,
) -> ValueOrPlace<Self::Value> {
self.super_rvalue(rvalue, state)
}
fn super_rvalue(
&self,
rvalue: &Rvalue<'tcx>,
state: &mut State<Self::Value>,
) -> ValueOrPlace<Self::Value> {
match rvalue {
Rvalue::Use(operand) => self.handle_operand(operand, state),
Rvalue::CopyForDeref(place) => self.handle_operand(&Operand::Copy(*place), state),
Rvalue::Ref(..) | Rvalue::RawPtr(..) => {
// We don't track such places.
ValueOrPlace::TOP
}
Rvalue::Repeat(..)
| Rvalue::ThreadLocalRef(..)
| Rvalue::Len(..)
| Rvalue::Cast(..)
| Rvalue::BinaryOp(..)
| Rvalue::NullaryOp(..)
| Rvalue::UnaryOp(..)
| Rvalue::Discriminant(..)
| Rvalue::Aggregate(..)
| Rvalue::ShallowInitBox(..) => {
// No modification is possible through these r-values.
ValueOrPlace::TOP
}
}
}
fn handle_operand(
&self,
operand: &Operand<'tcx>,
state: &mut State<Self::Value>,
) -> ValueOrPlace<Self::Value> {
self.super_operand(operand, state)
}
fn super_operand(
&self,
operand: &Operand<'tcx>,
state: &mut State<Self::Value>,
) -> ValueOrPlace<Self::Value> {
match operand {
Operand::Constant(box constant) => {
ValueOrPlace::Value(self.handle_constant(constant, state))
}
Operand::Copy(place) | Operand::Move(place) => {
// On move, we would ideally flood the place with bottom. But with the current
// framework this is not possible (similar to `InterpCx::eval_operand`).
self.map()
.find(place.as_ref())
.map(ValueOrPlace::Place)
.unwrap_or(ValueOrPlace::TOP)
}
}
}
fn handle_constant(
&self,
constant: &ConstOperand<'tcx>,
state: &mut State<Self::Value>,
) -> Self::Value {
self.super_constant(constant, state)
}
fn super_constant(
&self,
_constant: &ConstOperand<'tcx>,
_state: &mut State<Self::Value>,
) -> Self::Value {
Self::Value::TOP
}
/// The effect of a successful function call return should not be
/// applied here, see [`Analysis::apply_terminator_effect`].
fn handle_terminator<'mir>(
&self,
terminator: &'mir Terminator<'tcx>,
state: &mut State<Self::Value>,
) -> TerminatorEdges<'mir, 'tcx> {
self.super_terminator(terminator, state)
}
fn super_terminator<'mir>(
&self,
terminator: &'mir Terminator<'tcx>,
state: &mut State<Self::Value>,
) -> TerminatorEdges<'mir, 'tcx> {
match &terminator.kind {
TerminatorKind::Call { .. } | TerminatorKind::InlineAsm { .. } => {
// Effect is applied by `handle_call_return`.
}
TerminatorKind::Drop { place, .. } => {
state.flood_with(place.as_ref(), self.map(), Self::Value::BOTTOM);
}
TerminatorKind::Yield { .. } => {
// They would have an effect, but are not allowed in this phase.
bug!("encountered disallowed terminator");
}
TerminatorKind::SwitchInt { discr, targets } => {
return self.handle_switch_int(discr, targets, state);
}
TerminatorKind::TailCall { .. } => {
// FIXME(explicit_tail_calls): determine if we need to do something here (probably not)
}
TerminatorKind::Goto { .. }
| TerminatorKind::UnwindResume
| TerminatorKind::UnwindTerminate(_)
| TerminatorKind::Return
| TerminatorKind::Unreachable
| TerminatorKind::Assert { .. }
| TerminatorKind::CoroutineDrop
| TerminatorKind::FalseEdge { .. }
| TerminatorKind::FalseUnwind { .. } => {
// These terminators have no effect on the analysis.
}
}
terminator.edges()
}
fn handle_call_return(
&self,
return_places: CallReturnPlaces<'_, 'tcx>,
state: &mut State<Self::Value>,
) {
self.super_call_return(return_places, state)
}
fn super_call_return(
&self,
return_places: CallReturnPlaces<'_, 'tcx>,
state: &mut State<Self::Value>,
) {
return_places.for_each(|place| {
state.flood(place.as_ref(), self.map());
})
}
fn handle_switch_int<'mir>(
&self,
discr: &'mir Operand<'tcx>,
targets: &'mir SwitchTargets,
state: &mut State<Self::Value>,
) -> TerminatorEdges<'mir, 'tcx> {
self.super_switch_int(discr, targets, state)
}
fn super_switch_int<'mir>(
&self,
discr: &'mir Operand<'tcx>,
targets: &'mir SwitchTargets,
_state: &mut State<Self::Value>,
) -> TerminatorEdges<'mir, 'tcx> {
TerminatorEdges::SwitchInt { discr, targets }
}
fn wrap(self) -> ValueAnalysisWrapper<Self>
where
Self: Sized,
{
ValueAnalysisWrapper(self)
}
}
pub struct ValueAnalysisWrapper<T>(pub T);
impl<'tcx, T: ValueAnalysis<'tcx>> Analysis<'tcx> for ValueAnalysisWrapper<T> {
type Domain = State<T::Value>;
const NAME: &'static str = T::NAME;
fn bottom_value(&self, _body: &Body<'tcx>) -> Self::Domain {
State::Unreachable
}
fn initialize_start_block(&self, body: &Body<'tcx>, state: &mut Self::Domain) {
// The initial state maps all tracked places of argument projections to and the rest to ⊥.
assert_matches!(state, State::Unreachable);
*state = State::new_reachable();
for arg in body.args_iter() {
state.flood(PlaceRef { local: arg, projection: &[] }, self.0.map());
}
}
fn apply_statement_effect(
&mut self,
state: &mut Self::Domain,
statement: &Statement<'tcx>,
_location: Location,
) {
if state.is_reachable() {
self.0.handle_statement(statement, state);
}
}
fn apply_terminator_effect<'mir>(
&mut self,
state: &mut Self::Domain,
terminator: &'mir Terminator<'tcx>,
_location: Location,
) -> TerminatorEdges<'mir, 'tcx> {
if state.is_reachable() {
self.0.handle_terminator(terminator, state)
} else {
TerminatorEdges::None
}
}
fn apply_call_return_effect(
&mut self,
state: &mut Self::Domain,
_block: BasicBlock,
return_places: CallReturnPlaces<'_, 'tcx>,
) {
if state.is_reachable() {
self.0.handle_call_return(return_places, state)
}
}
fn apply_switch_int_edge_effects(
&mut self,
_block: BasicBlock,
_discr: &Operand<'tcx>,
_apply_edge_effects: &mut impl SwitchIntEdgeEffects<Self::Domain>,
) {
}
}
rustc_index::newtype_index!(
/// This index uniquely identifies a place.
@@ -464,7 +84,7 @@ impl<V: JoinSemiLattice + Clone + HasBottom> JoinSemiLattice for StateData<V> {
}
}
/// The dataflow state for an instance of [`ValueAnalysis`].
/// Dataflow state.
///
/// Every instance specifies a lattice that represents the possible values of a single tracked
/// place. If we call this lattice `V` and set of tracked places `P`, then a [`State`] is an
@@ -514,7 +134,7 @@ impl<V: Clone + HasBottom> State<V> {
}
}
fn is_reachable(&self) -> bool {
pub fn is_reachable(&self) -> bool {
matches!(self, State::Reachable(_))
}
@@ -1317,34 +937,6 @@ pub fn excluded_locals(body: &Body<'_>) -> BitSet<Local> {
collector.result
}
/// This is used to visualize the dataflow analysis.
impl<'tcx, T> DebugWithContext<ValueAnalysisWrapper<T>> for State<T::Value>
where
T: ValueAnalysis<'tcx>,
T::Value: Debug,
{
fn fmt_with(&self, ctxt: &ValueAnalysisWrapper<T>, f: &mut Formatter<'_>) -> std::fmt::Result {
match self {
State::Reachable(values) => debug_with_context(values, None, ctxt.0.map(), f),
State::Unreachable => write!(f, "unreachable"),
}
}
fn fmt_diff_with(
&self,
old: &Self,
ctxt: &ValueAnalysisWrapper<T>,
f: &mut Formatter<'_>,
) -> std::fmt::Result {
match (self, old) {
(State::Reachable(this), State::Reachable(old)) => {
debug_with_context(this, Some(old), ctxt.0.map(), f)
}
_ => Ok(()), // Consider printing something here.
}
}
}
fn debug_with_context_rec<V: Debug + Eq + HasBottom>(
place: PlaceIndex,
place_str: &str,
@@ -1391,7 +983,7 @@ fn debug_with_context_rec<V: Debug + Eq + HasBottom>(
Ok(())
}
fn debug_with_context<V: Debug + Eq + HasBottom>(
pub fn debug_with_context<V: Debug + Eq + HasBottom>(
new: &StateData<V>,
old: Option<&StateData<V>>,
map: &Map<'_>,


@@ -2,6 +2,9 @@
//!
//! Currently, this pass only propagates scalar values.
use std::assert_matches::assert_matches;
use std::fmt::Formatter;
use rustc_abi::{BackendRepr, FIRST_VARIANT, FieldIdx, Size, VariantIdx};
use rustc_const_eval::const_eval::{DummyMachine, throw_machine_stop_str};
use rustc_const_eval::interpret::{
@@ -15,9 +18,10 @@ use rustc_middle::mir::visit::{MutVisitor, PlaceContext, Visitor};
use rustc_middle::mir::*;
use rustc_middle::ty::layout::{HasParamEnv, LayoutOf};
use rustc_middle::ty::{self, Ty, TyCtxt};
use rustc_mir_dataflow::lattice::FlatSet;
use rustc_mir_dataflow::fmt::DebugWithContext;
use rustc_mir_dataflow::lattice::{FlatSet, HasBottom};
use rustc_mir_dataflow::value_analysis::{
Map, PlaceIndex, State, TrackElem, ValueAnalysis, ValueAnalysisWrapper, ValueOrPlace,
Map, PlaceIndex, State, TrackElem, ValueOrPlace, debug_with_context,
};
use rustc_mir_dataflow::{Analysis, Results, ResultsVisitor};
use rustc_span::DUMMY_SP;
@@ -58,8 +62,8 @@ impl<'tcx> crate::MirPass<'tcx> for DataflowConstProp {
// Perform the actual dataflow analysis.
let analysis = ConstAnalysis::new(tcx, body, map);
let mut results = debug_span!("analyze")
.in_scope(|| analysis.wrap().iterate_to_fixpoint(tcx, body, None));
let mut results =
debug_span!("analyze").in_scope(|| analysis.iterate_to_fixpoint(tcx, body, None));
// Collect results and patch the body afterwards.
let mut visitor = Collector::new(tcx, &body.local_decls);
@@ -69,6 +73,10 @@ impl<'tcx> crate::MirPass<'tcx> for DataflowConstProp {
}
}
// Note: Currently, places that have their reference taken cannot be tracked. Although this would
// be possible, it has to rely on some aliasing model, which we are not ready to commit to yet.
// Because of that, we can assume that the only way to change the value behind a tracked place is
// by direct assignment.
struct ConstAnalysis<'a, 'tcx> {
map: Map<'tcx>,
tcx: TyCtxt<'tcx>,
@@ -77,20 +85,198 @@ struct ConstAnalysis<'a, 'tcx> {
param_env: ty::ParamEnv<'tcx>,
}
impl<'tcx> ValueAnalysis<'tcx> for ConstAnalysis<'_, 'tcx> {
impl<'tcx> Analysis<'tcx> for ConstAnalysis<'_, 'tcx> {
type Value = FlatSet<Scalar>;
type Domain = State<FlatSet<Scalar>>;
const NAME: &'static str = "ConstAnalysis";
fn map(&self) -> &Map<'tcx> {
&self.map
// The bottom state denotes uninitialized memory. Because we are only doing a sound
// approximation of the actual execution, we can also use this state for places where access
// would be UB.
fn bottom_value(&self, _body: &Body<'tcx>) -> Self::Domain {
State::Unreachable
}
fn initialize_start_block(&self, body: &Body<'tcx>, state: &mut Self::Domain) {
// The initial state maps all tracked places of argument projections to and the rest to ⊥.
assert_matches!(state, State::Unreachable);
*state = State::new_reachable();
for arg in body.args_iter() {
state.flood(PlaceRef { local: arg, projection: &[] }, &self.map);
}
}
fn apply_statement_effect(
&mut self,
state: &mut Self::Domain,
statement: &Statement<'tcx>,
_location: Location,
) {
if state.is_reachable() {
self.handle_statement(statement, state);
}
}
fn apply_terminator_effect<'mir>(
&mut self,
state: &mut Self::Domain,
terminator: &'mir Terminator<'tcx>,
_location: Location,
) -> TerminatorEdges<'mir, 'tcx> {
if state.is_reachable() {
self.handle_terminator(terminator, state)
} else {
TerminatorEdges::None
}
}
fn apply_call_return_effect(
&mut self,
state: &mut Self::Domain,
_block: BasicBlock,
return_places: CallReturnPlaces<'_, 'tcx>,
) {
if state.is_reachable() {
self.handle_call_return(return_places, state)
}
}
}
impl<'a, 'tcx> ConstAnalysis<'a, 'tcx> {
fn new(tcx: TyCtxt<'tcx>, body: &'a Body<'tcx>, map: Map<'tcx>) -> Self {
let param_env = tcx.param_env_reveal_all_normalized(body.source.def_id());
Self {
map,
tcx,
local_decls: &body.local_decls,
ecx: InterpCx::new(tcx, DUMMY_SP, param_env, DummyMachine),
param_env,
}
}
fn handle_statement(&self, statement: &Statement<'tcx>, state: &mut State<FlatSet<Scalar>>) {
match &statement.kind {
StatementKind::Assign(box (place, rvalue)) => {
self.handle_assign(*place, rvalue, state);
}
StatementKind::SetDiscriminant { box place, variant_index } => {
self.handle_set_discriminant(*place, *variant_index, state);
}
StatementKind::Intrinsic(box intrinsic) => {
self.handle_intrinsic(intrinsic);
}
StatementKind::StorageLive(local) | StatementKind::StorageDead(local) => {
// StorageLive leaves the local in an uninitialized state.
// StorageDead makes it UB to access the local afterwards.
state.flood_with(
Place::from(*local).as_ref(),
&self.map,
FlatSet::<Scalar>::BOTTOM,
);
}
StatementKind::Deinit(box place) => {
// Deinit makes the place uninitialized.
state.flood_with(place.as_ref(), &self.map, FlatSet::<Scalar>::BOTTOM);
}
StatementKind::Retag(..) => {
// We don't track references.
}
StatementKind::ConstEvalCounter
| StatementKind::Nop
| StatementKind::FakeRead(..)
| StatementKind::PlaceMention(..)
| StatementKind::Coverage(..)
| StatementKind::AscribeUserType(..) => (),
}
}
fn handle_intrinsic(&self, intrinsic: &NonDivergingIntrinsic<'tcx>) {
match intrinsic {
NonDivergingIntrinsic::Assume(..) => {
// Could use this, but ignoring it is sound.
}
NonDivergingIntrinsic::CopyNonOverlapping(CopyNonOverlapping {
dst: _,
src: _,
count: _,
}) => {
// This statement represents `*dst = *src`, `count` times.
}
}
}
fn handle_operand(
&self,
operand: &Operand<'tcx>,
state: &mut State<FlatSet<Scalar>>,
) -> ValueOrPlace<FlatSet<Scalar>> {
match operand {
Operand::Constant(box constant) => {
ValueOrPlace::Value(self.handle_constant(constant, state))
}
Operand::Copy(place) | Operand::Move(place) => {
// On move, we would ideally flood the place with bottom. But with the current
// framework this is not possible (similar to `InterpCx::eval_operand`).
self.map.find(place.as_ref()).map(ValueOrPlace::Place).unwrap_or(ValueOrPlace::TOP)
}
}
}
/// The effect of a successful function call return should not be
/// applied here, see [`Analysis::apply_terminator_effect`].
fn handle_terminator<'mir>(
&self,
terminator: &'mir Terminator<'tcx>,
state: &mut State<FlatSet<Scalar>>,
) -> TerminatorEdges<'mir, 'tcx> {
match &terminator.kind {
TerminatorKind::Call { .. } | TerminatorKind::InlineAsm { .. } => {
// Effect is applied by `handle_call_return`.
}
TerminatorKind::Drop { place, .. } => {
state.flood_with(place.as_ref(), &self.map, FlatSet::<Scalar>::BOTTOM);
}
TerminatorKind::Yield { .. } => {
// They would have an effect, but are not allowed in this phase.
bug!("encountered disallowed terminator");
}
TerminatorKind::SwitchInt { discr, targets } => {
return self.handle_switch_int(discr, targets, state);
}
TerminatorKind::TailCall { .. } => {
// FIXME(explicit_tail_calls): determine if we need to do something here (probably
// not)
}
TerminatorKind::Goto { .. }
| TerminatorKind::UnwindResume
| TerminatorKind::UnwindTerminate(_)
| TerminatorKind::Return
| TerminatorKind::Unreachable
| TerminatorKind::Assert { .. }
| TerminatorKind::CoroutineDrop
| TerminatorKind::FalseEdge { .. }
| TerminatorKind::FalseUnwind { .. } => {
// These terminators have no effect on the analysis.
}
}
terminator.edges()
}
fn handle_call_return(
&self,
return_places: CallReturnPlaces<'_, 'tcx>,
state: &mut State<FlatSet<Scalar>>,
) {
return_places.for_each(|place| {
state.flood(place.as_ref(), &self.map);
})
}
fn handle_set_discriminant(
&self,
place: Place<'tcx>,
variant_index: VariantIdx,
state: &mut State<Self::Value>,
state: &mut State<FlatSet<Scalar>>,
) {
state.flood_discr(place.as_ref(), &self.map);
if self.map.find_discr(place.as_ref()).is_some() {
@@ -109,17 +295,17 @@ impl<'tcx> ValueAnalysis<'tcx> for ConstAnalysis<'_, 'tcx> {
&self,
target: Place<'tcx>,
rvalue: &Rvalue<'tcx>,
state: &mut State<Self::Value>,
state: &mut State<FlatSet<Scalar>>,
) {
match rvalue {
Rvalue::Use(operand) => {
state.flood(target.as_ref(), self.map());
state.flood(target.as_ref(), &self.map);
if let Some(target) = self.map.find(target.as_ref()) {
self.assign_operand(state, target, operand);
}
}
Rvalue::CopyForDeref(rhs) => {
state.flood(target.as_ref(), self.map());
state.flood(target.as_ref(), &self.map);
if let Some(target) = self.map.find(target.as_ref()) {
self.assign_operand(state, target, &Operand::Copy(*rhs));
}
@@ -127,9 +313,9 @@ impl<'tcx> ValueAnalysis<'tcx> for ConstAnalysis<'_, 'tcx> {
Rvalue::Aggregate(kind, operands) => {
// If we assign `target = Enum::Variant#0(operand)`,
// we must make sure that all `target as Variant#i` are `Top`.
state.flood(target.as_ref(), self.map());
state.flood(target.as_ref(), &self.map);
let Some(target_idx) = self.map().find(target.as_ref()) else { return };
let Some(target_idx) = self.map.find(target.as_ref()) else { return };
let (variant_target, variant_index) = match **kind {
AggregateKind::Tuple | AggregateKind::Closure(..) => (Some(target_idx), None),
@@ -148,14 +334,14 @@ impl<'tcx> ValueAnalysis<'tcx> for ConstAnalysis<'_, 'tcx> {
if let Some(variant_target_idx) = variant_target {
for (field_index, operand) in operands.iter_enumerated() {
if let Some(field) =
self.map().apply(variant_target_idx, TrackElem::Field(field_index))
self.map.apply(variant_target_idx, TrackElem::Field(field_index))
{
self.assign_operand(state, field, operand);
}
}
}
if let Some(variant_index) = variant_index
&& let Some(discr_idx) = self.map().apply(target_idx, TrackElem::Discriminant)
&& let Some(discr_idx) = self.map.apply(target_idx, TrackElem::Discriminant)
{
// We are assigning the discriminant as part of an aggregate.
// This discriminant can only alias a variant field's value if the operand
@@ -170,23 +356,23 @@ impl<'tcx> ValueAnalysis<'tcx> for ConstAnalysis<'_, 'tcx> {
}
Rvalue::BinaryOp(op, box (left, right)) if op.is_overflowing() => {
// Flood everything now, so we can use `insert_value_idx` directly later.
state.flood(target.as_ref(), self.map());
let Some(target) = self.map().find(target.as_ref()) else { return };
let value_target = self.map().apply(target, TrackElem::Field(0_u32.into()));
let overflow_target = self.map().apply(target, TrackElem::Field(1_u32.into()));
state.flood(target.as_ref(), &self.map);
let Some(target) = self.map.find(target.as_ref()) else { return };
let value_target = self.map.apply(target, TrackElem::Field(0_u32.into()));
let overflow_target = self.map.apply(target, TrackElem::Field(1_u32.into()));
if value_target.is_some() || overflow_target.is_some() {
let (val, overflow) = self.binary_op(state, *op, left, right);
if let Some(value_target) = value_target {
// We have flooded `target` earlier.
state.insert_value_idx(value_target, val, self.map());
state.insert_value_idx(value_target, val, &self.map);
}
if let Some(overflow_target) = overflow_target {
// We have flooded `target` earlier.
state.insert_value_idx(overflow_target, overflow, self.map());
state.insert_value_idx(overflow_target, overflow, &self.map);
}
}
}
@@ -196,27 +382,30 @@ impl<'tcx> ValueAnalysis<'tcx> for ConstAnalysis<'_, 'tcx> {
_,
) => {
let pointer = self.handle_operand(operand, state);
state.assign(target.as_ref(), pointer, self.map());
state.assign(target.as_ref(), pointer, &self.map);
if let Some(target_len) = self.map().find_len(target.as_ref())
if let Some(target_len) = self.map.find_len(target.as_ref())
&& let operand_ty = operand.ty(self.local_decls, self.tcx)
&& let Some(operand_ty) = operand_ty.builtin_deref(true)
&& let ty::Array(_, len) = operand_ty.kind()
&& let Some(len) = Const::Ty(self.tcx.types.usize, *len)
.try_eval_scalar_int(self.tcx, self.param_env)
{
state.insert_value_idx(target_len, FlatSet::Elem(len.into()), self.map());
state.insert_value_idx(target_len, FlatSet::Elem(len.into()), &self.map);
}
}
_ => self.super_assign(target, rvalue, state),
_ => {
let result = self.handle_rvalue(rvalue, state);
state.assign(target.as_ref(), result, &self.map);
}
}
}
fn handle_rvalue(
&self,
rvalue: &Rvalue<'tcx>,
state: &mut State<Self::Value>,
) -> ValueOrPlace<Self::Value> {
state: &mut State<FlatSet<Scalar>>,
) -> ValueOrPlace<FlatSet<Scalar>> {
let val = match rvalue {
Rvalue::Len(place) => {
let place_ty = place.ty(self.local_decls, self.tcx);
@@ -225,7 +414,7 @@ impl<'tcx> ValueAnalysis<'tcx> for ConstAnalysis<'_, 'tcx> {
.try_eval_scalar(self.tcx, self.param_env)
.map_or(FlatSet::Top, FlatSet::Elem)
} else if let [ProjectionElem::Deref] = place.projection[..] {
state.get_len(place.local.into(), self.map())
state.get_len(place.local.into(), &self.map)
} else {
FlatSet::Top
}
@@ -296,8 +485,24 @@ impl<'tcx> ValueAnalysis<'tcx> for ConstAnalysis<'_, 'tcx> {
};
FlatSet::Elem(Scalar::from_target_usize(val, &self.tcx))
}
Rvalue::Discriminant(place) => state.get_discr(place.as_ref(), self.map()),
_ => return self.super_rvalue(rvalue, state),
Rvalue::Discriminant(place) => state.get_discr(place.as_ref(), &self.map),
Rvalue::Use(operand) => return self.handle_operand(operand, state),
Rvalue::CopyForDeref(place) => {
return self.handle_operand(&Operand::Copy(*place), state);
}
Rvalue::Ref(..) | Rvalue::RawPtr(..) => {
// We don't track such places.
return ValueOrPlace::TOP;
}
Rvalue::Repeat(..)
| Rvalue::ThreadLocalRef(..)
| Rvalue::Cast(..)
| Rvalue::BinaryOp(..)
| Rvalue::Aggregate(..)
| Rvalue::ShallowInitBox(..) => {
// No modification is possible through these r-values.
return ValueOrPlace::TOP;
}
};
ValueOrPlace::Value(val)
}
@@ -305,8 +510,8 @@ impl<'tcx> ValueAnalysis<'tcx> for ConstAnalysis<'_, 'tcx> {
fn handle_constant(
&self,
constant: &ConstOperand<'tcx>,
_state: &mut State<Self::Value>,
) -> Self::Value {
_state: &mut State<FlatSet<Scalar>>,
) -> FlatSet<Scalar> {
constant
.const_
.try_eval_scalar(self.tcx, self.param_env)
@@ -317,11 +522,11 @@ impl<'tcx> ValueAnalysis<'tcx> for ConstAnalysis<'_, 'tcx> {
&self,
discr: &'mir Operand<'tcx>,
targets: &'mir SwitchTargets,
state: &mut State<Self::Value>,
state: &mut State<FlatSet<Scalar>>,
) -> TerminatorEdges<'mir, 'tcx> {
let value = match self.handle_operand(discr, state) {
ValueOrPlace::Value(value) => value,
ValueOrPlace::Place(place) => state.get_idx(place, self.map()),
ValueOrPlace::Place(place) => state.get_idx(place, &self.map),
};
match value {
// We are branching on uninitialized data, this is UB, treat it as unreachable.
@@ -334,19 +539,6 @@ impl<'tcx> ValueAnalysis<'tcx> for ConstAnalysis<'_, 'tcx> {
FlatSet::Top => TerminatorEdges::SwitchInt { discr, targets },
}
}
}
impl<'a, 'tcx> ConstAnalysis<'a, 'tcx> {
fn new(tcx: TyCtxt<'tcx>, body: &'a Body<'tcx>, map: Map<'tcx>) -> Self {
let param_env = tcx.param_env_reveal_all_normalized(body.source.def_id());
Self {
map,
tcx,
local_decls: &body.local_decls,
ecx: InterpCx::new(tcx, DUMMY_SP, param_env, DummyMachine),
param_env,
}
}
/// The caller must have flooded `place`.
fn assign_operand(
@@ -537,16 +729,40 @@ impl<'a, 'tcx> ConstAnalysis<'a, 'tcx> {
}
}
pub(crate) struct Patch<'tcx> {
/// This is used to visualize the dataflow analysis.
impl<'tcx> DebugWithContext<ConstAnalysis<'_, 'tcx>> for State<FlatSet<Scalar>> {
fn fmt_with(&self, ctxt: &ConstAnalysis<'_, 'tcx>, f: &mut Formatter<'_>) -> std::fmt::Result {
match self {
State::Reachable(values) => debug_with_context(values, None, &ctxt.map, f),
State::Unreachable => write!(f, "unreachable"),
}
}
fn fmt_diff_with(
&self,
old: &Self,
ctxt: &ConstAnalysis<'_, 'tcx>,
f: &mut Formatter<'_>,
) -> std::fmt::Result {
match (self, old) {
(State::Reachable(this), State::Reachable(old)) => {
debug_with_context(this, Some(old), &ctxt.map, f)
}
_ => Ok(()), // Consider printing something here.
}
}
}
struct Patch<'tcx> {
tcx: TyCtxt<'tcx>,
/// For a given MIR location, this stores the values of the operands used by that location. In
/// particular, this is before the effect, such that the operands of `_1 = _1 + _2` are
/// properly captured. (This may become UB soon, but it is currently emitted even by safe code.)
pub(crate) before_effect: FxHashMap<(Location, Place<'tcx>), Const<'tcx>>,
before_effect: FxHashMap<(Location, Place<'tcx>), Const<'tcx>>,
/// Stores the assigned values for assignments where the Rvalue is constant.
pub(crate) assignments: FxHashMap<Location, Const<'tcx>>,
assignments: FxHashMap<Location, Const<'tcx>>,
}
impl<'tcx> Patch<'tcx> {
@@ -725,8 +941,7 @@ fn try_write_constant<'tcx>(
interp_ok(())
}
impl<'mir, 'tcx>
ResultsVisitor<'mir, 'tcx, Results<'tcx, ValueAnalysisWrapper<ConstAnalysis<'_, 'tcx>>>>
impl<'mir, 'tcx> ResultsVisitor<'mir, 'tcx, Results<'tcx, ConstAnalysis<'_, 'tcx>>>
for Collector<'_, 'tcx>
{
type Domain = State<FlatSet<Scalar>>;
@@ -734,7 +949,7 @@ impl<'mir, 'tcx>
#[instrument(level = "trace", skip(self, results, statement))]
fn visit_statement_before_primary_effect(
&mut self,
results: &mut Results<'tcx, ValueAnalysisWrapper<ConstAnalysis<'_, 'tcx>>>,
results: &mut Results<'tcx, ConstAnalysis<'_, 'tcx>>,
state: &Self::Domain,
statement: &'mir Statement<'tcx>,
location: Location,
@@ -744,8 +959,8 @@ impl<'mir, 'tcx>
OperandCollector {
state,
visitor: self,
ecx: &mut results.analysis.0.ecx,
map: &results.analysis.0.map,
ecx: &mut results.analysis.ecx,
map: &results.analysis.map,
}
.visit_rvalue(rvalue, location);
}
@@ -756,7 +971,7 @@ impl<'mir, 'tcx>
#[instrument(level = "trace", skip(self, results, statement))]
fn visit_statement_after_primary_effect(
&mut self,
results: &mut Results<'tcx, ValueAnalysisWrapper<ConstAnalysis<'_, 'tcx>>>,
results: &mut Results<'tcx, ConstAnalysis<'_, 'tcx>>,
state: &Self::Domain,
statement: &'mir Statement<'tcx>,
location: Location,
@@ -767,10 +982,10 @@ impl<'mir, 'tcx>
}
StatementKind::Assign(box (place, _)) => {
if let Some(value) = self.try_make_constant(
&mut results.analysis.0.ecx,
&mut results.analysis.ecx,
place,
state,
&results.analysis.0.map,
&results.analysis.map,
) {
self.patch.assignments.insert(location, value);
}
@@ -781,7 +996,7 @@ impl<'mir, 'tcx>
fn visit_terminator_before_primary_effect(
&mut self,
results: &mut Results<'tcx, ValueAnalysisWrapper<ConstAnalysis<'_, 'tcx>>>,
results: &mut Results<'tcx, ConstAnalysis<'_, 'tcx>>,
state: &Self::Domain,
terminator: &'mir Terminator<'tcx>,
location: Location,
@@ -789,8 +1004,8 @@ impl<'mir, 'tcx>
OperandCollector {
state,
visitor: self,
ecx: &mut results.analysis.0.ecx,
map: &results.analysis.0.map,
ecx: &mut results.analysis.ecx,
map: &results.analysis.map,
}
.visit_terminator(terminator, location);
}


@@ -175,9 +175,7 @@ fn main() {
// Find any host flags that were passed by bootstrap.
// The flags are stored in a RUSTC_HOST_FLAGS variable, separated by spaces.
if let Ok(flags) = std::env::var("RUSTC_HOST_FLAGS") {
for flag in flags.split(' ') {
cmd.arg(flag);
}
cmd.args(flags.split(' '));
}
}
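For context on the bootstrap change above: a minimal, stand-alone sketch (hypothetical flag string and invocation, not bootstrap's actual shim) showing that `Command::args` accepts the iterator produced by `split`, which is what lets the per-flag loop collapse into a single call:

use std::process::Command;

fn main() {
    // Hypothetical flag string; bootstrap reads this from the RUSTC_HOST_FLAGS env var.
    let flags = "--edition 2021 --verbose";

    let mut cmd = Command::new("rustc");
    // Passes the whole iterator at once; equivalent to
    // `for flag in flags.split(' ') { cmd.arg(flag); }`.
    cmd.args(flags.split(' '));
    cmd.arg("--version");

    let status = cmd.status().expect("failed to spawn rustc");
    println!("exit status: {status}");
}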


@@ -1,6 +1,8 @@
// (This line has BOM so it's ignored by compiletest for directives)
//
//@ compile-flags: --json=diagnostic-short --error-format=json
//@ reference: input.byte-order-mark
//@ reference: input.crlf
// ignore-tidy-cr
#[path = "json-bom-plus-crlf-multifile-aux.rs"]


@@ -1,6 +1,8 @@
// (This line has BOM so it's ignored by compiletest for directives)
//
//@ compile-flags: --json=diagnostic-short --error-format=json
//@ reference: input.byte-order-mark
//@ reference: input.crlf
// ignore-tidy-cr
// For easier verifying, the byte offsets in this file should match those


@@ -24,7 +24,7 @@ This error occurs when an expression was used in a place where the compiler
expected an expression of a different type. It can occur in several cases, the
most common being when calling a function and passing an argument which has a
different type than the matching type in the function declaration.
"},"level":"error","spans":[{"file_name":"$DIR/json-bom-plus-crlf.rs","byte_start":607,"byte_end":608,"line_start":16,"line_end":16,"column_start":22,"column_end":23,"is_primary":true,"text":[{"text":" let s : String = 1; // Error in the middle of line.","highlight_start":22,"highlight_end":23}],"label":"expected `String`, found integer","suggested_replacement":null,"suggestion_applicability":null,"expansion":null},{"file_name":"$DIR/json-bom-plus-crlf.rs","byte_start":598,"byte_end":604,"line_start":16,"line_end":16,"column_start":13,"column_end":19,"is_primary":false,"text":[{"text":" let s : String = 1; // Error in the middle of line.","highlight_start":13,"highlight_end":19}],"label":"expected due to this","suggested_replacement":null,"suggestion_applicability":null,"expansion":null}],"children":[{"message":"try using a conversion method","code":null,"level":"help","spans":[{"file_name":"$DIR/json-bom-plus-crlf.rs","byte_start":608,"byte_end":608,"line_start":16,"line_end":16,"column_start":23,"column_end":23,"is_primary":true,"text":[{"text":" let s : String = 1; // Error in the middle of line.","highlight_start":23,"highlight_end":23}],"label":null,"suggested_replacement":".to_string()","suggestion_applicability":"MaybeIncorrect","expansion":null}],"children":[],"rendered":null}],"rendered":"$DIR/json-bom-plus-crlf.rs:16:22: error[E0308]: mismatched types: expected `String`, found integer "},"level":"error","spans":[{"file_name":"$DIR/json-bom-plus-crlf.rs","byte_start":672,"byte_end":673,"line_start":18,"line_end":18,"column_start":22,"column_end":23,"is_primary":true,"text":[{"text":" let s : String = 1; // Error in the middle of line.","highlight_start":22,"highlight_end":23}],"label":"expected `String`, found integer","suggested_replacement":null,"suggestion_applicability":null,"expansion":null},{"file_name":"$DIR/json-bom-plus-crlf.rs","byte_start":663,"byte_end":669,"line_start":18,"line_end":18,"column_start":13,"column_end":19,"is_primary":false,"text":[{"text":" let s : String = 1; // Error in the middle of line.","highlight_start":13,"highlight_end":19}],"label":"expected due to this","suggested_replacement":null,"suggestion_applicability":null,"expansion":null}],"children":[{"message":"try using a conversion method","code":null,"level":"help","spans":[{"file_name":"$DIR/json-bom-plus-crlf.rs","byte_start":673,"byte_end":673,"line_start":18,"line_end":18,"column_start":23,"column_end":23,"is_primary":true,"text":[{"text":" let s : String = 1; // Error in the middle of line.","highlight_start":23,"highlight_end":23}],"label":null,"suggested_replacement":".to_string()","suggestion_applicability":"MaybeIncorrect","expansion":null}],"children":[],"rendered":null}],"rendered":"$DIR/json-bom-plus-crlf.rs:18:22: error[E0308]: mismatched types: expected `String`, found integer
"} "}
{"$message_type":"diagnostic","message":"mismatched types","code":{"code":"E0308","explanation":"Expected type did not match the received type. {"$message_type":"diagnostic","message":"mismatched types","code":{"code":"E0308","explanation":"Expected type did not match the received type.
@ -52,7 +52,7 @@ This error occurs when an expression was used in a place where the compiler
expected an expression of a different type. It can occur in several cases, the expected an expression of a different type. It can occur in several cases, the
most common being when calling a function and passing an argument which has a most common being when calling a function and passing an argument which has a
different type than the matching type in the function declaration. different type than the matching type in the function declaration.
"},"level":"error","spans":[{"file_name":"$DIR/json-bom-plus-crlf.rs","byte_start":667,"byte_end":668,"line_start":18,"line_end":18,"column_start":22,"column_end":23,"is_primary":true,"text":[{"text":" let s : String = 1","highlight_start":22,"highlight_end":23}],"label":"expected `String`, found integer","suggested_replacement":null,"suggestion_applicability":null,"expansion":null},{"file_name":"$DIR/json-bom-plus-crlf.rs","byte_start":658,"byte_end":664,"line_start":18,"line_end":18,"column_start":13,"column_end":19,"is_primary":false,"text":[{"text":" let s : String = 1","highlight_start":13,"highlight_end":19}],"label":"expected due to this","suggested_replacement":null,"suggestion_applicability":null,"expansion":null}],"children":[{"message":"try using a conversion method","code":null,"level":"help","spans":[{"file_name":"$DIR/json-bom-plus-crlf.rs","byte_start":668,"byte_end":668,"line_start":18,"line_end":18,"column_start":23,"column_end":23,"is_primary":true,"text":[{"text":" let s : String = 1","highlight_start":23,"highlight_end":23}],"label":null,"suggested_replacement":".to_string()","suggestion_applicability":"MaybeIncorrect","expansion":null}],"children":[],"rendered":null}],"rendered":"$DIR/json-bom-plus-crlf.rs:18:22: error[E0308]: mismatched types: expected `String`, found integer "},"level":"error","spans":[{"file_name":"$DIR/json-bom-plus-crlf.rs","byte_start":732,"byte_end":733,"line_start":20,"line_end":20,"column_start":22,"column_end":23,"is_primary":true,"text":[{"text":" let s : String = 1","highlight_start":22,"highlight_end":23}],"label":"expected `String`, found integer","suggested_replacement":null,"suggestion_applicability":null,"expansion":null},{"file_name":"$DIR/json-bom-plus-crlf.rs","byte_start":723,"byte_end":729,"line_start":20,"line_end":20,"column_start":13,"column_end":19,"is_primary":false,"text":[{"text":" let s : String = 1","highlight_start":13,"highlight_end":19}],"label":"expected due to this","suggested_replacement":null,"suggestion_applicability":null,"expansion":null}],"children":[{"message":"try using a conversion method","code":null,"level":"help","spans":[{"file_name":"$DIR/json-bom-plus-crlf.rs","byte_start":733,"byte_end":733,"line_start":20,"line_end":20,"column_start":23,"column_end":23,"is_primary":true,"text":[{"text":" let s : String = 1","highlight_start":23,"highlight_end":23}],"label":null,"suggested_replacement":".to_string()","suggestion_applicability":"MaybeIncorrect","expansion":null}],"children":[],"rendered":null}],"rendered":"$DIR/json-bom-plus-crlf.rs:20:22: error[E0308]: mismatched types: expected `String`, found integer
"} "}
{"$message_type":"diagnostic","message":"mismatched types","code":{"code":"E0308","explanation":"Expected type did not match the received type. {"$message_type":"diagnostic","message":"mismatched types","code":{"code":"E0308","explanation":"Expected type did not match the received type.
@ -80,7 +80,7 @@ This error occurs when an expression was used in a place where the compiler
expected an expression of a different type. It can occur in several cases, the expected an expression of a different type. It can occur in several cases, the
most common being when calling a function and passing an argument which has a most common being when calling a function and passing an argument which has a
different type than the matching type in the function declaration. different type than the matching type in the function declaration.
"},"level":"error","spans":[{"file_name":"$DIR/json-bom-plus-crlf.rs","byte_start":731,"byte_end":732,"line_start":22,"line_end":22,"column_start":1,"column_end":2,"is_primary":true,"text":[{"text":"1; // Error after the newline.","highlight_start":1,"highlight_end":2}],"label":"expected `String`, found integer","suggested_replacement":null,"suggestion_applicability":null,"expansion":null},{"file_name":"$DIR/json-bom-plus-crlf.rs","byte_start":721,"byte_end":727,"line_start":21,"line_end":21,"column_start":13,"column_end":19,"is_primary":false,"text":[{"text":" let s : String =","highlight_start":13,"highlight_end":19}],"label":"expected due to this","suggested_replacement":null,"suggestion_applicability":null,"expansion":null}],"children":[{"message":"try using a conversion method","code":null,"level":"help","spans":[{"file_name":"$DIR/json-bom-plus-crlf.rs","byte_start":732,"byte_end":732,"line_start":22,"line_end":22,"column_start":2,"column_end":2,"is_primary":true,"text":[{"text":"1; // Error after the newline.","highlight_start":2,"highlight_end":2}],"label":null,"suggested_replacement":".to_string()","suggestion_applicability":"MaybeIncorrect","expansion":null}],"children":[],"rendered":null}],"rendered":"$DIR/json-bom-plus-crlf.rs:22:1: error[E0308]: mismatched types: expected `String`, found integer "},"level":"error","spans":[{"file_name":"$DIR/json-bom-plus-crlf.rs","byte_start":796,"byte_end":797,"line_start":24,"line_end":24,"column_start":1,"column_end":2,"is_primary":true,"text":[{"text":"1; // Error after the newline.","highlight_start":1,"highlight_end":2}],"label":"expected `String`, found integer","suggested_replacement":null,"suggestion_applicability":null,"expansion":null},{"file_name":"$DIR/json-bom-plus-crlf.rs","byte_start":786,"byte_end":792,"line_start":23,"line_end":23,"column_start":13,"column_end":19,"is_primary":false,"text":[{"text":" let s : String =","highlight_start":13,"highlight_end":19}],"label":"expected due to this","suggested_replacement":null,"suggestion_applicability":null,"expansion":null}],"children":[{"message":"try using a conversion method","code":null,"level":"help","spans":[{"file_name":"$DIR/json-bom-plus-crlf.rs","byte_start":797,"byte_end":797,"line_start":24,"line_end":24,"column_start":2,"column_end":2,"is_primary":true,"text":[{"text":"1; // Error after the newline.","highlight_start":2,"highlight_end":2}],"label":null,"suggested_replacement":".to_string()","suggestion_applicability":"MaybeIncorrect","expansion":null}],"children":[],"rendered":null}],"rendered":"$DIR/json-bom-plus-crlf.rs:24:1: error[E0308]: mismatched types: expected `String`, found integer
"} "}
{"$message_type":"diagnostic","message":"mismatched types","code":{"code":"E0308","explanation":"Expected type did not match the received type. {"$message_type":"diagnostic","message":"mismatched types","code":{"code":"E0308","explanation":"Expected type did not match the received type.
@ -108,7 +108,7 @@ This error occurs when an expression was used in a place where the compiler
expected an expression of a different type. It can occur in several cases, the expected an expression of a different type. It can occur in several cases, the
most common being when calling a function and passing an argument which has a most common being when calling a function and passing an argument which has a
different type than the matching type in the function declaration. different type than the matching type in the function declaration.
"},"level":"error","spans":[{"file_name":"$DIR/json-bom-plus-crlf.rs","byte_start":787,"byte_end":795,"line_start":24,"line_end":25,"column_start":22,"column_end":6,"is_primary":true,"text":[{"text":" let s : String = (","highlight_start":22,"highlight_end":23},{"text":" ); // Error spanning the newline.","highlight_start":1,"highlight_end":6}],"label":"expected `String`, found `()`","suggested_replacement":null,"suggestion_applicability":null,"expansion":null},{"file_name":"$DIR/json-bom-plus-crlf.rs","byte_start":778,"byte_end":784,"line_start":24,"line_end":24,"column_start":13,"column_end":19,"is_primary":false,"text":[{"text":" let s : String = (","highlight_start":13,"highlight_end":19}],"label":"expected due to this","suggested_replacement":null,"suggestion_applicability":null,"expansion":null}],"children":[],"rendered":"$DIR/json-bom-plus-crlf.rs:24:22: error[E0308]: mismatched types: expected `String`, found `()` "},"level":"error","spans":[{"file_name":"$DIR/json-bom-plus-crlf.rs","byte_start":852,"byte_end":860,"line_start":26,"line_end":27,"column_start":22,"column_end":6,"is_primary":true,"text":[{"text":" let s : String = (","highlight_start":22,"highlight_end":23},{"text":" ); // Error spanning the newline.","highlight_start":1,"highlight_end":6}],"label":"expected `String`, found `()`","suggested_replacement":null,"suggestion_applicability":null,"expansion":null},{"file_name":"$DIR/json-bom-plus-crlf.rs","byte_start":843,"byte_end":849,"line_start":26,"line_end":26,"column_start":13,"column_end":19,"is_primary":false,"text":[{"text":" let s : String = (","highlight_start":13,"highlight_end":19}],"label":"expected due to this","suggested_replacement":null,"suggestion_applicability":null,"expansion":null}],"children":[],"rendered":"$DIR/json-bom-plus-crlf.rs:26:22: error[E0308]: mismatched types: expected `String`, found `()`
"} "}
{"$message_type":"diagnostic","message":"aborting due to 4 previous errors","code":null,"level":"error","spans":[],"children":[],"rendered":"error: aborting due to 4 previous errors {"$message_type":"diagnostic","message":"aborting due to 4 previous errors","code":null,"level":"error","spans":[],"children":[],"rendered":"error: aborting due to 4 previous errors
"} "}


@@ -1,4 +1,5 @@
//@ run-pass
//@ reference: input.crlf
// ignore-tidy-cr
// ignore-tidy-cr (repeated again because of tidy bug)
// license is ignored because tidy can't handle the CRLF here properly.


@@ -1,4 +1,6 @@
//@ error-pattern: did not contain valid UTF-8
//@ reference: input.encoding.utf8
//@ reference: input.encoding.invalid
fn foo() {
include!("not-utf8.bin")


@@ -1,5 +1,5 @@
error: couldn't read $DIR/not-utf8.bin: stream did not contain valid UTF-8
--> $DIR/not-utf8.rs:4:5
--> $DIR/not-utf8.rs:6:5
|
LL | include!("not-utf8.bin")
| ^^^^^^^^^^^^^^^^^^^^^^^^


@@ -1,2 +1,4 @@
#!B //~ expected `[`, found `B`
//@ reference: input.shebang


@@ -1,6 +1,7 @@
#!
[allow(unused_variables)]
//@ check-pass
//@ reference: input.shebang.inner-attribute
fn main() {
let x = 5;


@@ -1,5 +1,6 @@
#![allow(unused_variables)]
//@ check-pass
//@ reference: input.shebang.inner-attribute
fn main() {
let x = 5;
}


@@ -1,6 +1,7 @@
#!/usr/bin/env run-cargo-script
//@ check-pass
//@ reference: input.shebang.inner-attribute
#![allow(unused_variables)]


@@ -1,6 +1,7 @@
#!//bin/bash
//@ check-pass
//@ reference: input.shebang
fn main() {
println!("a valid shebang (that is also a rust comment)")
}


@@ -1,3 +1,5 @@
#!///bin/bash
[allow(unused_variables)]
//~^ ERROR expected item, found `[`
//@ reference: input.shebang.inner-attribute


@@ -1,4 +1,5 @@
#!
//@ check-pass
//@ reference: input.shebang
fn main() {}


@@ -1,6 +1,8 @@
// something on the first line for tidy
#!/bin/bash //~ expected `[`, found `/`
//@ reference: input.shebang
fn main() {
println!("ok!");
}


@@ -1,5 +1,6 @@
#!
//@ check-pass
//@ reference: input.shebang
// ignore-tidy-end-whitespace
fn main() {}


@@ -11,6 +11,7 @@
[allow(unused_variables)]
//@ check-pass
//@ reference: input.shebang.inner-attribute
fn main() {
let x = 5;
}


@@ -1,6 +1,7 @@
#!/usr/bin/env run-cargo-script
//@ check-pass
//@ reference: input.shebang
fn main() {
println!("Hello World!");
}


@@ -1,5 +1,5 @@
error: unknown start of token: \u{0}
- --> $DIR/utf16-be-without-bom.rs:4:1
+ --> $DIR/utf16-be-without-bom.rs:5:1
|
LL | ␀f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
| ^
@@ -7,7 +7,7 @@ LL | ␀f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
= help: source files must contain UTF-8 encoded text, unexpected null bytes might occur when a different encoding is used
error: unknown start of token: \u{0}
- --> $DIR/utf16-be-without-bom.rs:4:3
+ --> $DIR/utf16-be-without-bom.rs:5:3
|
LL | ␀f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
| ^
@@ -15,7 +15,7 @@ LL | ␀f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
= help: source files must contain UTF-8 encoded text, unexpected null bytes might occur when a different encoding is used
error: unknown start of token: \u{0}
- --> $DIR/utf16-be-without-bom.rs:4:5
+ --> $DIR/utf16-be-without-bom.rs:5:5
|
LL | ␀f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
| ^
@@ -23,7 +23,7 @@ LL | ␀f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
= help: source files must contain UTF-8 encoded text, unexpected null bytes might occur when a different encoding is used
error: unknown start of token: \u{0}
- --> $DIR/utf16-be-without-bom.rs:4:7
+ --> $DIR/utf16-be-without-bom.rs:5:7
|
LL | ␀f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
| ^
@@ -31,7 +31,7 @@ LL | ␀f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
= help: source files must contain UTF-8 encoded text, unexpected null bytes might occur when a different encoding is used
error: unknown start of token: \u{0}
- --> $DIR/utf16-be-without-bom.rs:4:9
+ --> $DIR/utf16-be-without-bom.rs:5:9
|
LL | ␀f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
| ^
@@ -39,7 +39,7 @@ LL | ␀f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
= help: source files must contain UTF-8 encoded text, unexpected null bytes might occur when a different encoding is used
error: unknown start of token: \u{0}
- --> $DIR/utf16-be-without-bom.rs:4:11
+ --> $DIR/utf16-be-without-bom.rs:5:11
|
LL | ␀f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
| ^
@@ -47,7 +47,7 @@ LL | ␀f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
= help: source files must contain UTF-8 encoded text, unexpected null bytes might occur when a different encoding is used
error: unknown start of token: \u{0}
- --> $DIR/utf16-be-without-bom.rs:4:13
+ --> $DIR/utf16-be-without-bom.rs:5:13
|
LL | ␀f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
| ^
@@ -55,7 +55,7 @@ LL | ␀f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
= help: source files must contain UTF-8 encoded text, unexpected null bytes might occur when a different encoding is used
error: unknown start of token: \u{0}
- --> $DIR/utf16-be-without-bom.rs:4:15
+ --> $DIR/utf16-be-without-bom.rs:5:15
|
LL | ␀f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
| ^
@@ -63,7 +63,7 @@ LL | ␀f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
= help: source files must contain UTF-8 encoded text, unexpected null bytes might occur when a different encoding is used
error: unknown start of token: \u{0}
- --> $DIR/utf16-be-without-bom.rs:4:17
+ --> $DIR/utf16-be-without-bom.rs:5:17
|
LL | ␀f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
| ^
@@ -71,7 +71,7 @@ LL | ␀f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
= help: source files must contain UTF-8 encoded text, unexpected null bytes might occur when a different encoding is used
error: unknown start of token: \u{0}
- --> $DIR/utf16-be-without-bom.rs:4:19
+ --> $DIR/utf16-be-without-bom.rs:5:19
|
LL | ␀f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
| ^
@@ -79,7 +79,7 @@ LL | ␀f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
= help: source files must contain UTF-8 encoded text, unexpected null bytes might occur when a different encoding is used
error: unknown start of token: \u{0}
- --> $DIR/utf16-be-without-bom.rs:4:21
+ --> $DIR/utf16-be-without-bom.rs:5:21
|
LL | ␀f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
| ^
@@ -87,7 +87,7 @@ LL | ␀f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
= help: source files must contain UTF-8 encoded text, unexpected null bytes might occur when a different encoding is used
error: unknown start of token: \u{0}
- --> $DIR/utf16-be-without-bom.rs:4:23
+ --> $DIR/utf16-be-without-bom.rs:5:23
|
LL | ␀f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
| ^
@@ -95,7 +95,7 @@ LL | ␀f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
= help: source files must contain UTF-8 encoded text, unexpected null bytes might occur when a different encoding is used
error: unknown start of token: \u{0}
- --> $DIR/utf16-be-without-bom.rs:4:25
+ --> $DIR/utf16-be-without-bom.rs:5:25
|
LL | ␀f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
| ^
@@ -103,7 +103,7 @@ LL | ␀f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
= help: source files must contain UTF-8 encoded text, unexpected null bytes might occur when a different encoding is used
error: expected one of `!` or `::`, found `n`
- --> $DIR/utf16-be-without-bom.rs:4:4
+ --> $DIR/utf16-be-without-bom.rs:5:4
|
LL | ␀f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
| ^ expected one of `!` or `::`

View File

@@ -1,5 +1,5 @@
error: unknown start of token: \u{0}
- --> $DIR/utf16-le-without-bom.rs:4:2
+ --> $DIR/utf16-le-without-bom.rs:5:2
|
LL | f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
| ^
@@ -7,7 +7,7 @@ LL | f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
= help: source files must contain UTF-8 encoded text, unexpected null bytes might occur when a different encoding is used
error: unknown start of token: \u{0}
- --> $DIR/utf16-le-without-bom.rs:4:4
+ --> $DIR/utf16-le-without-bom.rs:5:4
|
LL | f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
| ^
@@ -15,7 +15,7 @@ LL | f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
= help: source files must contain UTF-8 encoded text, unexpected null bytes might occur when a different encoding is used
error: unknown start of token: \u{0}
- --> $DIR/utf16-le-without-bom.rs:4:6
+ --> $DIR/utf16-le-without-bom.rs:5:6
|
LL | f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
| ^
@@ -23,7 +23,7 @@ LL | f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
= help: source files must contain UTF-8 encoded text, unexpected null bytes might occur when a different encoding is used
error: unknown start of token: \u{0}
- --> $DIR/utf16-le-without-bom.rs:4:8
+ --> $DIR/utf16-le-without-bom.rs:5:8
|
LL | f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
| ^
@@ -31,7 +31,7 @@ LL | f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
= help: source files must contain UTF-8 encoded text, unexpected null bytes might occur when a different encoding is used
error: unknown start of token: \u{0}
- --> $DIR/utf16-le-without-bom.rs:4:10
+ --> $DIR/utf16-le-without-bom.rs:5:10
|
LL | f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
| ^
@@ -39,7 +39,7 @@ LL | f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
= help: source files must contain UTF-8 encoded text, unexpected null bytes might occur when a different encoding is used
error: unknown start of token: \u{0}
- --> $DIR/utf16-le-without-bom.rs:4:12
+ --> $DIR/utf16-le-without-bom.rs:5:12
|
LL | f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
| ^
@@ -47,7 +47,7 @@ LL | f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
= help: source files must contain UTF-8 encoded text, unexpected null bytes might occur when a different encoding is used
error: unknown start of token: \u{0}
- --> $DIR/utf16-le-without-bom.rs:4:14
+ --> $DIR/utf16-le-without-bom.rs:5:14
|
LL | f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
| ^
@@ -55,7 +55,7 @@ LL | f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
= help: source files must contain UTF-8 encoded text, unexpected null bytes might occur when a different encoding is used
error: unknown start of token: \u{0}
- --> $DIR/utf16-le-without-bom.rs:4:16
+ --> $DIR/utf16-le-without-bom.rs:5:16
|
LL | f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
| ^
@@ -63,7 +63,7 @@ LL | f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
= help: source files must contain UTF-8 encoded text, unexpected null bytes might occur when a different encoding is used
error: unknown start of token: \u{0}
- --> $DIR/utf16-le-without-bom.rs:4:18
+ --> $DIR/utf16-le-without-bom.rs:5:18
|
LL | f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
| ^
@@ -71,7 +71,7 @@ LL | f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
= help: source files must contain UTF-8 encoded text, unexpected null bytes might occur when a different encoding is used
error: unknown start of token: \u{0}
- --> $DIR/utf16-le-without-bom.rs:4:20
+ --> $DIR/utf16-le-without-bom.rs:5:20
|
LL | f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
| ^
@@ -79,7 +79,7 @@ LL | f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
= help: source files must contain UTF-8 encoded text, unexpected null bytes might occur when a different encoding is used
error: unknown start of token: \u{0}
- --> $DIR/utf16-le-without-bom.rs:4:22
+ --> $DIR/utf16-le-without-bom.rs:5:22
|
LL | f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
| ^
@@ -87,7 +87,7 @@ LL | f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
= help: source files must contain UTF-8 encoded text, unexpected null bytes might occur when a different encoding is used
error: unknown start of token: \u{0}
- --> $DIR/utf16-le-without-bom.rs:4:24
+ --> $DIR/utf16-le-without-bom.rs:5:24
|
LL | f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
| ^
@@ -95,7 +95,7 @@ LL | f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
= help: source files must contain UTF-8 encoded text, unexpected null bytes might occur when a different encoding is used
error: unknown start of token: \u{0}
- --> $DIR/utf16-le-without-bom.rs:5:1
+ --> $DIR/utf16-le-without-bom.rs:6:1
|
LL | ␀
| ^
@@ -103,7 +103,7 @@ LL | ␀
= help: source files must contain UTF-8 encoded text, unexpected null bytes might occur when a different encoding is used
error: expected one of `!` or `::`, found `n`
- --> $DIR/utf16-le-without-bom.rs:4:3
+ --> $DIR/utf16-le-without-bom.rs:5:3
|
LL | f␀n␀ ␀m␀a␀i␀n␀(␀)␀ ␀{␀}␀
| ^ expected one of `!` or `::`

View File

@@ -1,5 +1,6 @@
//@ check-pass
- #![feature(const_trait_impl, rustc_attrs)]
+ #![feature(const_trait_impl, rustc_attrs, effects)]
+ //~^ WARN the feature `effects` is incomplete
#[const_trait]
trait IntoIter {

View File

@@ -0,0 +1,11 @@
+ warning: the feature `effects` is incomplete and may not be safe to use and/or cause compiler crashes
+ --> $DIR/do-not-const-check.rs:2:43
+ |
+ LL | #![feature(const_trait_impl, rustc_attrs, effects)]
+ | ^^^^^^^
+ |
+ = note: see issue #102090 <https://github.com/rust-lang/rust/issues/102090> for more information
+ = note: `#[warn(incomplete_features)]` on by default
+ warning: 1 warning emitted

View File

@@ -1,4 +1,5 @@
// This file has utf-8 BOM, it should be compiled normally without error.
//@ run-pass
+ //@ reference: input.byte-order-mark
pub fn main() {}