Auto merge of #46922 - kennytm:rollup, r=kennytm
Rollup of 14 pull requests - Successful merges: #46636, #46780, #46784, #46809, #46814, #46820, #46839, #46847, #46858, #46878, #46884, #46890, #46898, #46918 - Failed merges:
This commit is contained in: commit ba2741594b
@ -112,14 +112,17 @@ There are a large number of options provided in this config file that will alter the
configuration used in the build process. Some options to note:

#### `[llvm]`:

- `assertions = true` - This enables LLVM assertions, which make LLVM misuse cause an assertion failure instead of weird misbehavior. This also slows down the compiler's runtime by ~20%.
- `ccache = true` - Use ccache when building LLVM.

#### `[build]`:

- `compiler-docs = true` - Build compiler documentation.

#### `[rust]`:

- `debuginfo = true` - Build a compiler with debuginfo. Makes building rustc slower, but then you can use a debugger to debug `rustc`.
- `debuginfo-lines = true` - An alternative to `debuginfo = true` that doesn't let you use a debugger, but doesn't make building rustc slower and still gives you line numbers in backtraces.
- `debug-assertions = true` - Makes the log output of `debug!` work.
- `optimize = false` - Disable optimizations to speed up compilation of stage1 rust, but makes the stage1 compiler about 100x slower.

For more options, the `config.toml` file contains commented out defaults, with
descriptions of what each option will do.
@ -273,6 +276,27 @@ build, you'll need to build rustdoc specially, since it's not normally built in
stage 1. `python x.py build --stage 1 src/libstd src/tools/rustdoc` will build
rustdoc and libstd, which will allow rustdoc to be run with that toolchain.)

### Out-of-tree builds
[out-of-tree-builds]: #out-of-tree-builds

Rust's `x.py` script fully supports out-of-tree builds - it looks for
the Rust source code from the directory `x.py` was found in, but it
reads the `config.toml` configuration file from the directory it's
run in, and places all build artifacts within a subdirectory named `build`.

This means that if you want to do an out-of-tree build, you can just do it:
```
$ cd my/build/dir
$ cp ~/my-config.toml config.toml # Or fill in config.toml otherwise
$ path/to/rust/x.py build
...
$ # This will use the Rust source code in `path/to/rust`, but build
$ # artifacts will now be in ./build
```

It's absolutely fine to have multiple build directories with different
`config.toml` configurations using the same code.

## Pull Requests
[pull-requests]: #pull-requests
@ -446,14 +470,14 @@ failed to run: ~/rust/build/x86_64-unknown-linux-gnu/stage0/bin/cargo build --ma
If you haven't used the `[patch]`
section of `Cargo.toml` before, there is [some relevant documentation about it
in the cargo docs](http://doc.crates.io/manifest.html#the-patch-section). In
addition to that, you should read the
[Overriding dependencies](http://doc.crates.io/specifying-dependencies.html#overriding-dependencies)
section of the documentation as well.

Specifically, the following [section in Overriding dependencies](http://doc.crates.io/specifying-dependencies.html#testing-a-bugfix) reveals what the problem is:

> Next up we need to ensure that our lock file is updated to use this new version of uuid so our project uses the locally checked out copy instead of one from crates.io. The way [patch] works is that it'll load the dependency at ../path/to/uuid and then whenever crates.io is queried for versions of uuid it'll also return the local version.
>
> This means that the version number of the local checkout is significant and will affect whether the patch is used. Our manifest declared uuid = "1.0" which means we'll only resolve to >= 1.0.0, < 2.0.0, and Cargo's greedy resolution algorithm also means that we'll resolve to the maximum version within that range. Typically this doesn't matter as the version of the git repository will already be greater or match the maximum version published on crates.io, but it's important to keep this in mind!

This says that when we updated the submodule, the version number in our
@ -224,8 +224,10 @@ use Bound::{Excluded, Included, Unbounded};
/// types inside a `Vec`, it will not allocate space for them. *Note that in this case
/// the `Vec` may not report a [`capacity`] of 0*. `Vec` will allocate if and only
/// if [`mem::size_of::<T>`]`() * capacity() > 0`. In general, `Vec`'s allocation
/// details are very subtle — if you intend to allocate memory using a `Vec`
/// and use it for something else (either to pass to unsafe code, or to build your
/// own memory-backed collection), be sure to deallocate this memory by using
/// `from_raw_parts` to recover the `Vec` and then dropping it.
///
/// If a `Vec` *has* allocated memory, then the memory it points to is on the heap
/// (as defined by the allocator Rust is configured to use by default), and its
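A minimal sketch (not part of the change itself) of the deallocation pattern the new wording describes: decompose the `Vec`, hand its raw parts elsewhere, then rebuild it with `Vec::from_raw_parts` so that dropping it frees the memory.

```rust
fn main() {
    let mut v: Vec<u8> = Vec::with_capacity(16);
    v.extend_from_slice(b"hello");

    // Take the Vec apart; after this, `v` must not be dropped.
    let ptr = v.as_mut_ptr();
    let (len, cap) = (v.len(), v.capacity());
    std::mem::forget(v);

    // ... `ptr` can now back unsafe code or a hand-rolled collection ...

    // Rebuild the Vec so that dropping it releases the allocation with the
    // same length and capacity it was created with.
    let rebuilt = unsafe { Vec::from_raw_parts(ptr, len, cap) };
    drop(rebuilt);
}
```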
@ -1 +1 @@
Subproject commit 18feaccbfd0dfbd5ab5d0a2a6eac9c04be667266
Subproject commit 000d06a57a622eb4db4a15d0f76db48571f4d8e4
@ -114,7 +114,7 @@ macro_rules! define_bignum {
/// copying it recklessly may result in a performance hit.
/// Thus this is intentionally not `Copy`.
///
/// All operations available to bignums panic in the case of overflows.
/// The caller is responsible for using large enough bignum types.
pub struct $name {
    /// One plus the offset to the maximum "digit" in use.
@ -98,10 +98,7 @@ pub mod diy_float;
|
||||
|
||||
// `Int` + `SignedInt` implemented for signed integers
|
||||
macro_rules! int_impl {
|
||||
($SelfT:ty, $ActualT:ident, $UnsignedT:ty, $BITS:expr,
|
||||
$add_with_overflow:path,
|
||||
$sub_with_overflow:path,
|
||||
$mul_with_overflow:path) => {
|
||||
($SelfT:ty, $ActualT:ident, $UnsignedT:ty, $BITS:expr) => {
|
||||
/// Returns the smallest value that can be represented by this integer type.
|
||||
///
|
||||
/// # Examples
|
||||
@ -402,7 +399,7 @@ macro_rules! int_impl {
|
||||
}
|
||||
|
||||
/// Checked integer subtraction. Computes `self - rhs`, returning
|
||||
/// `None` if underflow occurred.
|
||||
/// `None` if overflow occurred.
|
||||
///
|
||||
/// # Examples
|
||||
///
|
||||
@ -420,7 +417,7 @@ macro_rules! int_impl {
|
||||
}
|
||||
|
||||
/// Checked integer multiplication. Computes `self * rhs`, returning
|
||||
/// `None` if underflow or overflow occurred.
|
||||
/// `None` if overflow occurred.
|
||||
///
|
||||
/// # Examples
|
||||
///
|
||||
@ -438,7 +435,7 @@ macro_rules! int_impl {
|
||||
}
|
||||
|
||||
/// Checked integer division. Computes `self / rhs`, returning `None`
|
||||
/// if `rhs == 0` or the operation results in underflow or overflow.
|
||||
/// if `rhs == 0` or the operation results in overflow.
|
||||
///
|
||||
/// # Examples
|
||||
///
|
||||
@ -460,7 +457,7 @@ macro_rules! int_impl {
        }

        /// Checked integer remainder. Computes `self % rhs`, returning `None`
        /// if `rhs == 0` or the operation results in overflow.
        ///
        /// # Examples
        ///
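As a quick illustration of the wording these hunks correct (a standalone sketch, not part of the diff), the checked methods return `None` whenever the mathematical result is not representable, whether that comes from overflow, division by zero, or the `MIN / -1` case:

```rust
fn main() {
    assert_eq!(5i32.checked_sub(3), Some(2));
    assert_eq!(i32::min_value().checked_sub(1), None);   // overflow
    assert_eq!(10u8.checked_mul(30), None);              // 300 does not fit in a u8
    assert_eq!(7i32.checked_div(0), None);               // division by zero
    assert_eq!(i32::min_value().checked_div(-1), None);  // overflow
}
```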
@ -865,11 +862,11 @@ macro_rules! int_impl {
|
||||
#[inline]
|
||||
#[stable(feature = "wrapping", since = "1.7.0")]
|
||||
pub fn overflowing_add(self, rhs: Self) -> (Self, bool) {
|
||||
unsafe {
|
||||
let (a, b) = $add_with_overflow(self as $ActualT,
|
||||
rhs as $ActualT);
|
||||
(a as Self, b)
|
||||
}
|
||||
let (a, b) = unsafe {
|
||||
intrinsics::add_with_overflow(self as $ActualT,
|
||||
rhs as $ActualT)
|
||||
};
|
||||
(a as Self, b)
|
||||
}
|
||||
|
||||
/// Calculates `self` - `rhs`
|
||||
@ -891,11 +888,11 @@ macro_rules! int_impl {
|
||||
#[inline]
|
||||
#[stable(feature = "wrapping", since = "1.7.0")]
|
||||
pub fn overflowing_sub(self, rhs: Self) -> (Self, bool) {
|
||||
unsafe {
|
||||
let (a, b) = $sub_with_overflow(self as $ActualT,
|
||||
rhs as $ActualT);
|
||||
(a as Self, b)
|
||||
}
|
||||
let (a, b) = unsafe {
|
||||
intrinsics::sub_with_overflow(self as $ActualT,
|
||||
rhs as $ActualT)
|
||||
};
|
||||
(a as Self, b)
|
||||
}
|
||||
|
||||
/// Calculates the multiplication of `self` and `rhs`.
|
||||
@ -915,11 +912,11 @@ macro_rules! int_impl {
|
||||
#[inline]
|
||||
#[stable(feature = "wrapping", since = "1.7.0")]
|
||||
pub fn overflowing_mul(self, rhs: Self) -> (Self, bool) {
|
||||
unsafe {
|
||||
let (a, b) = $mul_with_overflow(self as $ActualT,
|
||||
rhs as $ActualT);
|
||||
(a as Self, b)
|
||||
}
|
||||
let (a, b) = unsafe {
|
||||
intrinsics::mul_with_overflow(self as $ActualT,
|
||||
rhs as $ActualT)
|
||||
};
|
||||
(a as Self, b)
|
||||
}
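For context on the `overflowing_*` hunks above and below (a usage sketch only, not part of the change), these methods return the wrapped result together with a flag saying whether the true result overflowed:

```rust
fn main() {
    assert_eq!(i8::max_value().overflowing_add(1), (i8::min_value(), true));
    assert_eq!(5i8.overflowing_add(2), (7, false));
    assert_eq!(100u8.overflowing_mul(3), (44, true)); // 300 wraps to 44
}
```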
|
||||
|
||||
/// Calculates the divisor when `self` is divided by `rhs`.
|
||||
@ -1207,82 +1204,50 @@ macro_rules! int_impl {
|
||||
|
||||
#[lang = "i8"]
|
||||
impl i8 {
|
||||
int_impl! { i8, i8, u8, 8,
|
||||
intrinsics::add_with_overflow,
|
||||
intrinsics::sub_with_overflow,
|
||||
intrinsics::mul_with_overflow }
|
||||
int_impl! { i8, i8, u8, 8 }
|
||||
}
|
||||
|
||||
#[lang = "i16"]
|
||||
impl i16 {
|
||||
int_impl! { i16, i16, u16, 16,
|
||||
intrinsics::add_with_overflow,
|
||||
intrinsics::sub_with_overflow,
|
||||
intrinsics::mul_with_overflow }
|
||||
int_impl! { i16, i16, u16, 16 }
|
||||
}
|
||||
|
||||
#[lang = "i32"]
|
||||
impl i32 {
|
||||
int_impl! { i32, i32, u32, 32,
|
||||
intrinsics::add_with_overflow,
|
||||
intrinsics::sub_with_overflow,
|
||||
intrinsics::mul_with_overflow }
|
||||
int_impl! { i32, i32, u32, 32 }
|
||||
}
|
||||
|
||||
#[lang = "i64"]
|
||||
impl i64 {
|
||||
int_impl! { i64, i64, u64, 64,
|
||||
intrinsics::add_with_overflow,
|
||||
intrinsics::sub_with_overflow,
|
||||
intrinsics::mul_with_overflow }
|
||||
int_impl! { i64, i64, u64, 64 }
|
||||
}
|
||||
|
||||
#[lang = "i128"]
|
||||
impl i128 {
|
||||
int_impl! { i128, i128, u128, 128,
|
||||
intrinsics::add_with_overflow,
|
||||
intrinsics::sub_with_overflow,
|
||||
intrinsics::mul_with_overflow }
|
||||
int_impl! { i128, i128, u128, 128 }
|
||||
}
|
||||
|
||||
#[cfg(target_pointer_width = "16")]
|
||||
#[lang = "isize"]
|
||||
impl isize {
|
||||
int_impl! { isize, i16, u16, 16,
|
||||
intrinsics::add_with_overflow,
|
||||
intrinsics::sub_with_overflow,
|
||||
intrinsics::mul_with_overflow }
|
||||
int_impl! { isize, i16, u16, 16 }
|
||||
}
|
||||
|
||||
#[cfg(target_pointer_width = "32")]
|
||||
#[lang = "isize"]
|
||||
impl isize {
|
||||
int_impl! { isize, i32, u32, 32,
|
||||
intrinsics::add_with_overflow,
|
||||
intrinsics::sub_with_overflow,
|
||||
intrinsics::mul_with_overflow }
|
||||
int_impl! { isize, i32, u32, 32 }
|
||||
}
|
||||
|
||||
#[cfg(target_pointer_width = "64")]
|
||||
#[lang = "isize"]
|
||||
impl isize {
|
||||
int_impl! { isize, i64, u64, 64,
|
||||
intrinsics::add_with_overflow,
|
||||
intrinsics::sub_with_overflow,
|
||||
intrinsics::mul_with_overflow }
|
||||
int_impl! { isize, i64, u64, 64 }
|
||||
}
|
||||
|
||||
// `Int` + `UnsignedInt` implemented for unsigned integers
|
||||
macro_rules! uint_impl {
|
||||
($SelfT:ty, $ActualT:ty, $BITS:expr,
|
||||
$ctpop:path,
|
||||
$ctlz:path,
|
||||
$ctlz_nonzero:path,
|
||||
$cttz:path,
|
||||
$bswap:path,
|
||||
$add_with_overflow:path,
|
||||
$sub_with_overflow:path,
|
||||
$mul_with_overflow:path) => {
|
||||
($SelfT:ty, $ActualT:ty, $BITS:expr) => {
|
||||
/// Returns the smallest value that can be represented by this integer type.
|
||||
///
|
||||
/// # Examples
|
||||
@ -1346,7 +1311,7 @@ macro_rules! uint_impl {
|
||||
#[stable(feature = "rust1", since = "1.0.0")]
|
||||
#[inline]
|
||||
pub fn count_ones(self) -> u32 {
|
||||
unsafe { $ctpop(self as $ActualT) as u32 }
|
||||
unsafe { intrinsics::ctpop(self as $ActualT) as u32 }
|
||||
}
|
||||
|
||||
/// Returns the number of zeros in the binary representation of `self`.
|
||||
@ -1381,7 +1346,7 @@ macro_rules! uint_impl {
|
||||
#[stable(feature = "rust1", since = "1.0.0")]
|
||||
#[inline]
|
||||
pub fn leading_zeros(self) -> u32 {
|
||||
unsafe { $ctlz(self as $ActualT) as u32 }
|
||||
unsafe { intrinsics::ctlz(self as $ActualT) as u32 }
|
||||
}
|
||||
|
||||
/// Returns the number of trailing zeros in the binary representation
|
||||
@ -1480,7 +1445,7 @@ macro_rules! uint_impl {
|
||||
#[stable(feature = "rust1", since = "1.0.0")]
|
||||
#[inline]
|
||||
pub fn swap_bytes(self) -> Self {
|
||||
unsafe { $bswap(self as $ActualT) as Self }
|
||||
unsafe { intrinsics::bswap(self as $ActualT) as Self }
|
||||
}
|
||||
|
||||
/// Converts an integer from big endian to the target's endianness.
|
||||
@ -1598,7 +1563,7 @@ macro_rules! uint_impl {
|
||||
}
|
||||
|
||||
/// Checked integer subtraction. Computes `self - rhs`, returning
|
||||
/// `None` if underflow occurred.
|
||||
/// `None` if overflow occurred.
|
||||
///
|
||||
/// # Examples
|
||||
///
|
||||
@ -1616,7 +1581,7 @@ macro_rules! uint_impl {
|
||||
}
|
||||
|
||||
/// Checked integer multiplication. Computes `self * rhs`, returning
|
||||
/// `None` if underflow or overflow occurred.
|
||||
/// `None` if overflow occurred.
|
||||
///
|
||||
/// # Examples
|
||||
///
|
||||
@ -1634,7 +1599,7 @@ macro_rules! uint_impl {
|
||||
}
|
||||
|
||||
/// Checked integer division. Computes `self / rhs`, returning `None`
|
||||
/// if `rhs == 0` or the operation results in underflow or overflow.
|
||||
/// if `rhs == 0` or the operation results in overflow.
|
||||
///
|
||||
/// # Examples
|
||||
///
|
||||
@ -1654,7 +1619,7 @@ macro_rules! uint_impl {
|
||||
}
|
||||
|
||||
/// Checked integer remainder. Computes `self % rhs`, returning `None`
|
||||
/// if `rhs == 0` or the operation results in underflow or overflow.
|
||||
/// if `rhs == 0` or the operation results in overflow.
|
||||
///
|
||||
/// # Examples
|
||||
///
|
||||
@ -1984,11 +1949,11 @@ macro_rules! uint_impl {
|
||||
#[inline]
|
||||
#[stable(feature = "wrapping", since = "1.7.0")]
|
||||
pub fn overflowing_add(self, rhs: Self) -> (Self, bool) {
|
||||
unsafe {
|
||||
let (a, b) = $add_with_overflow(self as $ActualT,
|
||||
rhs as $ActualT);
|
||||
(a as Self, b)
|
||||
}
|
||||
let (a, b) = unsafe {
|
||||
intrinsics::add_with_overflow(self as $ActualT,
|
||||
rhs as $ActualT)
|
||||
};
|
||||
(a as Self, b)
|
||||
}
|
||||
|
||||
/// Calculates `self` - `rhs`
|
||||
@ -2010,11 +1975,11 @@ macro_rules! uint_impl {
|
||||
#[inline]
|
||||
#[stable(feature = "wrapping", since = "1.7.0")]
|
||||
pub fn overflowing_sub(self, rhs: Self) -> (Self, bool) {
|
||||
unsafe {
|
||||
let (a, b) = $sub_with_overflow(self as $ActualT,
|
||||
rhs as $ActualT);
|
||||
(a as Self, b)
|
||||
}
|
||||
let (a, b) = unsafe {
|
||||
intrinsics::sub_with_overflow(self as $ActualT,
|
||||
rhs as $ActualT)
|
||||
};
|
||||
(a as Self, b)
|
||||
}
|
||||
|
||||
/// Calculates the multiplication of `self` and `rhs`.
|
||||
@ -2034,11 +1999,11 @@ macro_rules! uint_impl {
|
||||
#[inline]
|
||||
#[stable(feature = "wrapping", since = "1.7.0")]
|
||||
pub fn overflowing_mul(self, rhs: Self) -> (Self, bool) {
|
||||
unsafe {
|
||||
let (a, b) = $mul_with_overflow(self as $ActualT,
|
||||
rhs as $ActualT);
|
||||
(a as Self, b)
|
||||
}
|
||||
let (a, b) = unsafe {
|
||||
intrinsics::mul_with_overflow(self as $ActualT,
|
||||
rhs as $ActualT)
|
||||
};
|
||||
(a as Self, b)
|
||||
}
|
||||
|
||||
/// Calculates the divisor when `self` is divided by `rhs`.
|
||||
@ -2223,7 +2188,7 @@ macro_rules! uint_impl {
|
||||
// (such as intel pre-haswell) have more efficient ctlz
|
||||
// intrinsics when the argument is non-zero.
|
||||
let p = self - 1;
|
||||
let z = unsafe { $ctlz_nonzero(p) };
|
||||
let z = unsafe { intrinsics::ctlz_nonzero(p) };
|
||||
<$SelfT>::max_value() >> z
|
||||
}
|
||||
|
||||
@ -2270,15 +2235,7 @@ macro_rules! uint_impl {
|
||||
|
||||
#[lang = "u8"]
|
||||
impl u8 {
|
||||
uint_impl! { u8, u8, 8,
|
||||
intrinsics::ctpop,
|
||||
intrinsics::ctlz,
|
||||
intrinsics::ctlz_nonzero,
|
||||
intrinsics::cttz,
|
||||
intrinsics::bswap,
|
||||
intrinsics::add_with_overflow,
|
||||
intrinsics::sub_with_overflow,
|
||||
intrinsics::mul_with_overflow }
|
||||
uint_impl! { u8, u8, 8 }
|
||||
|
||||
|
||||
/// Checks if the value is within the ASCII range.
|
||||
@ -2824,95 +2781,39 @@ impl u8 {
|
||||
|
||||
#[lang = "u16"]
|
||||
impl u16 {
|
||||
uint_impl! { u16, u16, 16,
|
||||
intrinsics::ctpop,
|
||||
intrinsics::ctlz,
|
||||
intrinsics::ctlz_nonzero,
|
||||
intrinsics::cttz,
|
||||
intrinsics::bswap,
|
||||
intrinsics::add_with_overflow,
|
||||
intrinsics::sub_with_overflow,
|
||||
intrinsics::mul_with_overflow }
|
||||
uint_impl! { u16, u16, 16 }
|
||||
}
|
||||
|
||||
#[lang = "u32"]
|
||||
impl u32 {
|
||||
uint_impl! { u32, u32, 32,
|
||||
intrinsics::ctpop,
|
||||
intrinsics::ctlz,
|
||||
intrinsics::ctlz_nonzero,
|
||||
intrinsics::cttz,
|
||||
intrinsics::bswap,
|
||||
intrinsics::add_with_overflow,
|
||||
intrinsics::sub_with_overflow,
|
||||
intrinsics::mul_with_overflow }
|
||||
uint_impl! { u32, u32, 32 }
|
||||
}
|
||||
|
||||
#[lang = "u64"]
|
||||
impl u64 {
|
||||
uint_impl! { u64, u64, 64,
|
||||
intrinsics::ctpop,
|
||||
intrinsics::ctlz,
|
||||
intrinsics::ctlz_nonzero,
|
||||
intrinsics::cttz,
|
||||
intrinsics::bswap,
|
||||
intrinsics::add_with_overflow,
|
||||
intrinsics::sub_with_overflow,
|
||||
intrinsics::mul_with_overflow }
|
||||
uint_impl! { u64, u64, 64 }
|
||||
}
|
||||
|
||||
#[lang = "u128"]
|
||||
impl u128 {
|
||||
uint_impl! { u128, u128, 128,
|
||||
intrinsics::ctpop,
|
||||
intrinsics::ctlz,
|
||||
intrinsics::ctlz_nonzero,
|
||||
intrinsics::cttz,
|
||||
intrinsics::bswap,
|
||||
intrinsics::add_with_overflow,
|
||||
intrinsics::sub_with_overflow,
|
||||
intrinsics::mul_with_overflow }
|
||||
uint_impl! { u128, u128, 128 }
|
||||
}
|
||||
|
||||
#[cfg(target_pointer_width = "16")]
|
||||
#[lang = "usize"]
|
||||
impl usize {
|
||||
uint_impl! { usize, u16, 16,
|
||||
intrinsics::ctpop,
|
||||
intrinsics::ctlz,
|
||||
intrinsics::ctlz_nonzero,
|
||||
intrinsics::cttz,
|
||||
intrinsics::bswap,
|
||||
intrinsics::add_with_overflow,
|
||||
intrinsics::sub_with_overflow,
|
||||
intrinsics::mul_with_overflow }
|
||||
uint_impl! { usize, u16, 16 }
|
||||
}
|
||||
#[cfg(target_pointer_width = "32")]
|
||||
#[lang = "usize"]
|
||||
impl usize {
|
||||
uint_impl! { usize, u32, 32,
|
||||
intrinsics::ctpop,
|
||||
intrinsics::ctlz,
|
||||
intrinsics::ctlz_nonzero,
|
||||
intrinsics::cttz,
|
||||
intrinsics::bswap,
|
||||
intrinsics::add_with_overflow,
|
||||
intrinsics::sub_with_overflow,
|
||||
intrinsics::mul_with_overflow }
|
||||
uint_impl! { usize, u32, 32 }
|
||||
}
|
||||
|
||||
#[cfg(target_pointer_width = "64")]
|
||||
#[lang = "usize"]
|
||||
impl usize {
|
||||
uint_impl! { usize, u64, 64,
|
||||
intrinsics::ctpop,
|
||||
intrinsics::ctlz,
|
||||
intrinsics::ctlz_nonzero,
|
||||
intrinsics::cttz,
|
||||
intrinsics::bswap,
|
||||
intrinsics::add_with_overflow,
|
||||
intrinsics::sub_with_overflow,
|
||||
intrinsics::mul_with_overflow }
|
||||
uint_impl! { usize, u64, 64 }
|
||||
}
|
||||
|
||||
/// A classification of floating point numbers.
|
||||
|
@ -581,8 +581,7 @@ impl<T: ?Sized> *const T {
|
||||
/// * Both the starting and resulting pointer must be either in bounds or one
|
||||
/// byte past the end of an allocated object.
|
||||
///
|
||||
/// * The computed offset, **in bytes**, cannot overflow or underflow an
|
||||
/// `isize`.
|
||||
/// * The computed offset, **in bytes**, cannot overflow an `isize`.
|
||||
///
|
||||
/// * The offset being in bounds cannot rely on "wrapping around" the address
|
||||
/// space. That is, the infinite-precision sum, **in bytes** must fit in a usize.
|
||||
@ -714,8 +713,7 @@ impl<T: ?Sized> *const T {
|
||||
/// * Both the starting and resulting pointer must be either in bounds or one
|
||||
/// byte past the end of an allocated object.
|
||||
///
|
||||
/// * The computed offset, **in bytes**, cannot overflow or underflow an
|
||||
/// `isize`.
|
||||
/// * The computed offset, **in bytes**, cannot overflow an `isize`.
|
||||
///
|
||||
/// * The offset being in bounds cannot rely on "wrapping around" the address
|
||||
/// space. That is, the infinite-precision sum must fit in a `usize`.
|
||||
@ -1219,8 +1217,7 @@ impl<T: ?Sized> *mut T {
|
||||
/// * Both the starting and resulting pointer must be either in bounds or one
|
||||
/// byte past the end of an allocated object.
|
||||
///
|
||||
/// * The computed offset, **in bytes**, cannot overflow or underflow an
|
||||
/// `isize`.
|
||||
/// * The computed offset, **in bytes**, cannot overflow an `isize`.
|
||||
///
|
||||
/// * The offset being in bounds cannot rely on "wrapping around" the address
|
||||
/// space. That is, the infinite-precision sum, **in bytes** must fit in a usize.
|
||||
@ -1419,8 +1416,7 @@ impl<T: ?Sized> *mut T {
|
||||
/// * Both the starting and resulting pointer must be either in bounds or one
|
||||
/// byte past the end of an allocated object.
|
||||
///
|
||||
/// * The computed offset, **in bytes**, cannot overflow or underflow an
|
||||
/// `isize`.
|
||||
/// * The computed offset, **in bytes**, cannot overflow an `isize`.
|
||||
///
|
||||
/// * The offset being in bounds cannot rely on "wrapping around" the address
|
||||
/// space. That is, the infinite-precision sum must fit in a `usize`.
|
||||
|
@ -413,27 +413,14 @@ impl<'a> Id<'a> {
|
||||
/// quotes, ...) will return an empty `Err` value.
|
||||
pub fn new<Name: IntoCow<'a, str>>(name: Name) -> Result<Id<'a>, ()> {
|
||||
let name = name.into_cow();
|
||||
{
|
||||
let mut chars = name.chars();
|
||||
match chars.next() {
|
||||
Some(c) if is_letter_or_underscore(c) => {}
|
||||
_ => return Err(()),
|
||||
}
|
||||
if !chars.all(is_constituent) {
|
||||
return Err(());
|
||||
}
|
||||
match name.chars().next() {
|
||||
Some(c) if c.is_ascii_alphabetic() || c == '_' => {}
|
||||
_ => return Err(()),
|
||||
}
|
||||
if !name.chars().all(|c| c.is_ascii_alphanumeric() || c == '_' ) {
|
||||
return Err(());
|
||||
}
|
||||
return Ok(Id { name: name });
|
||||
|
||||
fn is_letter_or_underscore(c: char) -> bool {
|
||||
in_range('a', c, 'z') || in_range('A', c, 'Z') || c == '_'
|
||||
}
|
||||
fn is_constituent(c: char) -> bool {
|
||||
is_letter_or_underscore(c) || in_range('0', c, '9')
|
||||
}
|
||||
fn in_range(low: char, c: char, high: char) -> bool {
|
||||
low as usize <= c as usize && c as usize <= high as usize
|
||||
}
|
||||
}
|
||||
|
||||
pub fn as_slice(&'a self) -> &'a str {
|
||||
@ -484,8 +471,7 @@ pub trait Labeller<'a> {
|
||||
/// Maps `e` to a label that will be used in the rendered output.
|
||||
/// The label need not be unique, and may be the empty string; the
|
||||
/// default is in fact the empty string.
|
||||
fn edge_label(&'a self, e: &Self::Edge) -> LabelText<'a> {
|
||||
let _ignored = e;
|
||||
fn edge_label(&'a self, _e: &Self::Edge) -> LabelText<'a> {
|
||||
LabelStr("".into_cow())
|
||||
}
|
||||
|
||||
@ -655,79 +641,58 @@ pub fn render_opts<'a, N, E, G, W>(g: &'a G,
|
||||
G: Labeller<'a, Node=N, Edge=E> + GraphWalk<'a, Node=N, Edge=E>,
|
||||
W: Write
|
||||
{
|
||||
fn writeln<W: Write>(w: &mut W, arg: &[&str]) -> io::Result<()> {
|
||||
for &s in arg {
|
||||
w.write_all(s.as_bytes())?;
|
||||
}
|
||||
write!(w, "\n")
|
||||
}
|
||||
|
||||
fn indent<W: Write>(w: &mut W) -> io::Result<()> {
|
||||
w.write_all(b" ")
|
||||
}
|
||||
|
||||
writeln(w, &["digraph ", g.graph_id().as_slice(), " {"])?;
|
||||
writeln!(w, "digraph {} {{", g.graph_id().as_slice())?;
|
||||
for n in g.nodes().iter() {
|
||||
indent(w)?;
|
||||
write!(w, " ")?;
|
||||
let id = g.node_id(n);
|
||||
|
||||
let escaped = &g.node_label(n).to_dot_string();
|
||||
let shape;
|
||||
|
||||
let mut text = vec![id.as_slice()];
|
||||
let mut text = Vec::new();
|
||||
write!(text, "{}", id.as_slice()).unwrap();
|
||||
|
||||
if !options.contains(&RenderOption::NoNodeLabels) {
|
||||
text.push("[label=");
|
||||
text.push(escaped);
|
||||
text.push("]");
|
||||
write!(text, "[label={}]", escaped).unwrap();
|
||||
}
|
||||
|
||||
let style = g.node_style(n);
|
||||
if !options.contains(&RenderOption::NoNodeStyles) && style != Style::None {
|
||||
text.push("[style=\"");
|
||||
text.push(style.as_slice());
|
||||
text.push("\"]");
|
||||
write!(text, "[style=\"{}\"]", style.as_slice()).unwrap();
|
||||
}
|
||||
|
||||
if let Some(s) = g.node_shape(n) {
|
||||
shape = s.to_dot_string();
|
||||
text.push("[shape=");
|
||||
text.push(&shape);
|
||||
text.push("]");
|
||||
write!(text, "[shape={}]", &s.to_dot_string()).unwrap();
|
||||
}
|
||||
|
||||
text.push(";");
|
||||
writeln(w, &text)?;
|
||||
writeln!(text, ";").unwrap();
|
||||
w.write_all(&text[..])?;
|
||||
}
|
||||
|
||||
for e in g.edges().iter() {
|
||||
let escaped_label = &g.edge_label(e).to_dot_string();
|
||||
indent(w)?;
|
||||
write!(w, " ")?;
|
||||
let source = g.source(e);
|
||||
let target = g.target(e);
|
||||
let source_id = g.node_id(&source);
|
||||
let target_id = g.node_id(&target);
|
||||
|
||||
let mut text = vec![source_id.as_slice(), " -> ", target_id.as_slice()];
|
||||
let mut text = Vec::new();
|
||||
write!(text, "{} -> {}", source_id.as_slice(), target_id.as_slice()).unwrap();
|
||||
|
||||
if !options.contains(&RenderOption::NoEdgeLabels) {
|
||||
text.push("[label=");
|
||||
text.push(escaped_label);
|
||||
text.push("]");
|
||||
write!(text, "[label={}]", escaped_label).unwrap();
|
||||
}
|
||||
|
||||
let style = g.edge_style(e);
|
||||
if !options.contains(&RenderOption::NoEdgeStyles) && style != Style::None {
|
||||
text.push("[style=\"");
|
||||
text.push(style.as_slice());
|
||||
text.push("\"]");
|
||||
write!(text, "[style=\"{}\"]", style.as_slice()).unwrap();
|
||||
}
|
||||
|
||||
text.push(";");
|
||||
writeln(w, &text)?;
|
||||
writeln!(text, ";").unwrap();
|
||||
w.write_all(&text[..])?;
|
||||
}
|
||||
|
||||
writeln(w, &["}"])
|
||||
writeln!(w, "}}")
|
||||
}
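The hunks above replace hand-assembled `Vec<&str>` fragments with `write!` calls into a `Vec<u8>` buffer. A standalone sketch of that pattern (the names and output are illustrative, this is not the library code):

```rust
use std::io::Write;

fn main() -> std::io::Result<()> {
    // `Vec<u8>` implements `io::Write`, so formatted pieces can be appended
    // directly instead of collecting string slices and joining them later.
    let mut text = Vec::new();
    write!(text, "{} -> {}", "node_a", "node_b")?;
    write!(text, "[label={}]", "\"an edge\"")?;
    writeln!(text, ";")?;
    std::io::stdout().write_all(&text)
}
```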
|
||||
|
||||
pub trait IntoCow<'a, B: ?Sized> where B: ToOwned {
|
||||
|
@ -341,7 +341,7 @@ impl<'gcx> HashStable<StableHashingContext<'gcx>> for Span {
|
||||
std_hash::Hash::hash(&TAG_VALID_SPAN, hasher);
|
||||
// We truncate the stable_id hash and line and col numbers. The chances
|
||||
// of causing a collision this way should be minimal.
|
||||
std_hash::Hash::hash(&file_lo.name, hasher);
|
||||
std_hash::Hash::hash(&(file_lo.name_hash as u64), hasher);
|
||||
|
||||
let col = (col_lo.0 as u64) & 0xFF;
|
||||
let line = ((line_lo as u64) & 0xFF_FF_FF) << 8;
|
||||
|
@ -387,7 +387,8 @@ impl<'gcx> HashStable<StableHashingContext<'gcx>> for FileMap {
|
||||
hcx: &mut StableHashingContext<'gcx>,
|
||||
hasher: &mut StableHasher<W>) {
|
||||
let FileMap {
|
||||
ref name,
|
||||
name: _, // We hash the smaller name_hash instead of this
|
||||
name_hash,
|
||||
name_was_remapped,
|
||||
unmapped_path: _,
|
||||
crate_of_origin,
|
||||
@ -402,7 +403,7 @@ impl<'gcx> HashStable<StableHashingContext<'gcx>> for FileMap {
|
||||
ref non_narrow_chars,
|
||||
} = *self;
|
||||
|
||||
name.hash_stable(hcx, hasher);
|
||||
(name_hash as u64).hash_stable(hcx, hasher);
|
||||
name_was_remapped.hash_stable(hcx, hasher);
|
||||
|
||||
DefId {
|
||||
|
@ -558,24 +558,29 @@ impl<'a, 'gcx, 'tcx> ExprUseVisitor<'a, 'gcx, 'tcx> {
|
||||
}
|
||||
ty::TyError => { }
|
||||
_ => {
|
||||
let def_id = self.mc.tables.type_dependent_defs()[call.hir_id].def_id();
|
||||
let call_scope = region::Scope::Node(call.hir_id.local_id);
|
||||
match OverloadedCallType::from_method_id(self.tcx(), def_id) {
|
||||
FnMutOverloadedCall => {
|
||||
let call_scope_r = self.tcx().mk_region(ty::ReScope(call_scope));
|
||||
self.borrow_expr(callee,
|
||||
call_scope_r,
|
||||
ty::MutBorrow,
|
||||
ClosureInvocation);
|
||||
if let Some(def) = self.mc.tables.type_dependent_defs().get(call.hir_id) {
|
||||
let def_id = def.def_id();
|
||||
let call_scope = region::Scope::Node(call.hir_id.local_id);
|
||||
match OverloadedCallType::from_method_id(self.tcx(), def_id) {
|
||||
FnMutOverloadedCall => {
|
||||
let call_scope_r = self.tcx().mk_region(ty::ReScope(call_scope));
|
||||
self.borrow_expr(callee,
|
||||
call_scope_r,
|
||||
ty::MutBorrow,
|
||||
ClosureInvocation);
|
||||
}
|
||||
FnOverloadedCall => {
|
||||
let call_scope_r = self.tcx().mk_region(ty::ReScope(call_scope));
|
||||
self.borrow_expr(callee,
|
||||
call_scope_r,
|
||||
ty::ImmBorrow,
|
||||
ClosureInvocation);
|
||||
}
|
||||
FnOnceOverloadedCall => self.consume_expr(callee),
|
||||
}
|
||||
FnOverloadedCall => {
|
||||
let call_scope_r = self.tcx().mk_region(ty::ReScope(call_scope));
|
||||
self.borrow_expr(callee,
|
||||
call_scope_r,
|
||||
ty::ImmBorrow,
|
||||
ClosureInvocation);
|
||||
}
|
||||
FnOnceOverloadedCall => self.consume_expr(callee),
|
||||
} else {
|
||||
self.tcx().sess.delay_span_bug(call.span,
|
||||
"no type-dependent def for overloaded call");
|
||||
}
|
||||
}
|
||||
}
|
||||
|
@ -528,6 +528,25 @@ impl OutputFilenames {
|
||||
pub fn filestem(&self) -> String {
|
||||
format!("{}{}", self.out_filestem, self.extra)
|
||||
}
|
||||
|
||||
pub fn contains_path(&self, input_path: &PathBuf) -> bool {
|
||||
let input_path = input_path.canonicalize().ok();
|
||||
if input_path.is_none() {
|
||||
return false
|
||||
}
|
||||
match self.single_output_file {
|
||||
Some(ref output_path) => output_path.canonicalize().ok() == input_path,
|
||||
None => {
|
||||
for k in self.outputs.keys() {
|
||||
let output_path = self.path(k.to_owned());
|
||||
if output_path.canonicalize().ok() == input_path {
|
||||
return true;
|
||||
}
|
||||
}
|
||||
false
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
pub fn host_triple() -> &'static str {
|
||||
@ -596,6 +615,12 @@ impl Options {
|
||||
).map(|(src, dst)| (src.clone(), dst.clone())).collect()
|
||||
)
|
||||
}
|
||||
|
||||
/// True if there will be an output file generated
|
||||
pub fn will_create_output_file(&self) -> bool {
|
||||
!self.debugging_opts.parse_only && // The file is just being parsed
|
||||
!self.debugging_opts.ls // The file is just being queried
|
||||
}
|
||||
}
|
||||
|
||||
// The type of entry function, so
|
||||
|
@ -1484,27 +1484,25 @@ impl<'a, 'tcx> LayoutDetails {
|
||||
Some(niche) => niche,
|
||||
None => continue
|
||||
};
|
||||
let mut align = dl.aggregate_align;
|
||||
let st = variants.iter().enumerate().map(|(j, v)| {
|
||||
let mut st = univariant_uninterned(v,
|
||||
&def.repr, StructKind::AlwaysSized)?;
|
||||
st.variants = Variants::Single { index: j };
|
||||
|
||||
align = align.max(st.align);
|
||||
|
||||
Ok(st)
|
||||
}).collect::<Result<Vec<_>, _>>()?;
|
||||
|
||||
let offset = st[i].fields.offset(field_index) + offset;
|
||||
let LayoutDetails { mut size, mut align, .. } = st[i];
|
||||
let size = st[i].size;
|
||||
|
||||
let mut niche_align = niche.value.align(dl);
|
||||
let abi = if offset.bytes() == 0 && niche.value.size(dl) == size {
|
||||
Abi::Scalar(niche.clone())
|
||||
} else {
|
||||
if offset.abi_align(niche_align) != offset {
|
||||
niche_align = dl.i8_align;
|
||||
}
|
||||
Abi::Aggregate { sized: true }
|
||||
};
|
||||
align = align.max(niche_align);
|
||||
size = size.abi_align(align);
|
||||
|
||||
return Ok(tcx.intern_layout(LayoutDetails {
|
||||
variants: Variants::NicheFilling {
|
||||
|
@ -71,6 +71,7 @@ use profile;
|
||||
|
||||
pub fn compile_input(sess: &Session,
|
||||
cstore: &CStore,
|
||||
input_path: &Option<PathBuf>,
|
||||
input: &Input,
|
||||
outdir: &Option<PathBuf>,
|
||||
output: &Option<PathBuf>,
|
||||
@ -142,6 +143,20 @@ pub fn compile_input(sess: &Session,
|
||||
};
|
||||
|
||||
let outputs = build_output_filenames(input, outdir, output, &krate.attrs, sess);
|
||||
|
||||
// Ensure the source file isn't accidentally overwritten during compilation.
|
||||
match *input_path {
|
||||
Some(ref input_path) => {
|
||||
if outputs.contains_path(input_path) && sess.opts.will_create_output_file() {
|
||||
sess.err(&format!(
|
||||
"the input file \"{}\" would be overwritten by the generated executable",
|
||||
input_path.display()));
|
||||
return Err(CompileIncomplete::Stopped);
|
||||
}
|
||||
},
|
||||
None => {}
|
||||
}
|
||||
|
||||
let crate_name =
|
||||
::rustc_trans_utils::link::find_crate_name(Some(sess), &krate.attrs, input);
|
||||
let ExpansionResult { expanded_crate, defs, analysis, resolutions, mut hir_forest } = {
|
||||
|
@ -232,7 +232,7 @@ pub fn run_compiler<'a>(args: &[String],
|
||||
let loader = file_loader.unwrap_or(box RealFileLoader);
|
||||
let codemap = Rc::new(CodeMap::with_file_loader(loader, sopts.file_path_mapping()));
|
||||
let mut sess = session::build_session_with_codemap(
|
||||
sopts, input_file_path, descriptions, codemap, emitter_dest,
|
||||
sopts, input_file_path.clone(), descriptions, codemap, emitter_dest,
|
||||
);
|
||||
rustc_trans::init(&sess);
|
||||
rustc_lint::register_builtins(&mut sess.lint_store.borrow_mut(), Some(&sess));
|
||||
@ -252,6 +252,7 @@ pub fn run_compiler<'a>(args: &[String],
|
||||
let control = callbacks.build_controller(&sess, &matches);
|
||||
(driver::compile_input(&sess,
|
||||
&cstore,
|
||||
&input_file_path,
|
||||
&input,
|
||||
&odir,
|
||||
&ofile,
|
||||
|
@ -349,7 +349,27 @@ impl MissingDoc {
|
||||
}
|
||||
}
|
||||
|
||||
let has_doc = attrs.iter().any(|a| a.is_value_str() && a.check_name("doc"));
|
||||
fn has_doc(attr: &ast::Attribute) -> bool {
|
||||
if !attr.check_name("doc") {
|
||||
return false;
|
||||
}
|
||||
|
||||
if attr.is_value_str() {
|
||||
return true;
|
||||
}
|
||||
|
||||
if let Some(list) = attr.meta_item_list() {
|
||||
for meta in list {
|
||||
if meta.check_name("include") {
|
||||
return true;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
false
|
||||
}
|
||||
|
||||
let has_doc = attrs.iter().any(|a| has_doc(a));
|
||||
if !has_doc {
|
||||
cx.span_lint(MISSING_DOCS,
|
||||
cx.tcx.sess.codemap().def_span(sp),
|
||||
|
@ -1129,6 +1129,7 @@ impl<'a, 'tcx> CrateMetadata {
|
||||
lines,
|
||||
multibyte_chars,
|
||||
non_narrow_chars,
|
||||
name_hash,
|
||||
.. } = filemap_to_import;
|
||||
|
||||
let source_length = (end_pos - start_pos).to_usize();
|
||||
@ -1155,6 +1156,7 @@ impl<'a, 'tcx> CrateMetadata {
|
||||
name_was_remapped,
|
||||
self.cnum.as_u32(),
|
||||
src_hash,
|
||||
name_hash,
|
||||
source_length,
|
||||
lines,
|
||||
multibyte_chars,
|
||||
|
@ -28,8 +28,10 @@ use rustc::ty::codec::{self as ty_codec, TyEncoder};
|
||||
use rustc::session::config::{self, CrateTypeProcMacro};
|
||||
use rustc::util::nodemap::{FxHashMap, NodeSet};
|
||||
|
||||
use rustc_data_structures::stable_hasher::StableHasher;
|
||||
use rustc_serialize::{Encodable, Encoder, SpecializedEncoder, opaque};
|
||||
|
||||
use std::hash::Hash;
|
||||
use std::io::prelude::*;
|
||||
use std::io::Cursor;
|
||||
use std::path::Path;
|
||||
@ -290,6 +292,11 @@ impl<'a, 'tcx> EncodeContext<'a, 'tcx> {
|
||||
} else {
|
||||
let mut adapted = (**filemap).clone();
|
||||
adapted.name = Path::new(&working_dir).join(name).into();
|
||||
adapted.name_hash = {
|
||||
let mut hasher: StableHasher<u128> = StableHasher::new();
|
||||
adapted.name.hash(&mut hasher);
|
||||
hasher.finish()
|
||||
};
|
||||
Rc::new(adapted)
|
||||
}
|
||||
},
|
||||
|
@ -377,6 +377,17 @@ fn merge_codegen_units<'tcx>(initial_partitioning: &mut PreInliningPartitioning<
|
||||
assert!(target_cgu_count >= 1);
|
||||
let codegen_units = &mut initial_partitioning.codegen_units;
|
||||
|
||||
// Note that at this point in time the `codegen_units` here may not be in a
|
||||
// deterministic order (but we know they're deterministically the same set).
|
||||
// We want this merging to produce a deterministic ordering of codegen units
|
||||
// from the input.
|
||||
//
|
||||
// Due to basically how we've implemented the merging below (merge the two
|
||||
// smallest into each other) we're sure to start off with a deterministic
|
||||
// order (sorted by name). This'll mean that if two cgus have the same size
|
||||
// the stable sort below will keep everything nice and deterministic.
|
||||
codegen_units.sort_by_key(|cgu| cgu.name().clone());
|
||||
|
||||
// Merge the two smallest codegen units until the target size is reached.
|
||||
// Note that "size" is estimated here rather inaccurately as the number of
|
||||
// translation items in a given unit. This could be improved on.
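A toy sketch of the merging strategy the comments above describe (simplified made-up types, not the compiler's `CodegenUnit` machinery): sort by name for a deterministic starting order, then repeatedly fold the smallest unit into the next-smallest until the target count is reached.

```rust
fn merge_smallest(mut units: Vec<(String, usize)>, target: usize) -> Vec<(String, usize)> {
    // Deterministic starting order, independent of how the units were produced.
    // Assumes `target >= 1`, mirroring the assert in the real code.
    units.sort_by_key(|&(ref name, _)| name.clone());
    while units.len() > target {
        // Stable sort, largest first, so the two smallest sit at the end.
        units.sort_by_key(|&(_, size)| std::cmp::Reverse(size));
        let (name, size) = units.pop().unwrap();
        let last = units.last_mut().unwrap();
        last.1 += size;          // "size" is just an item count in this sketch
        last.0.push('+');
        last.0.push_str(&name);
    }
    units
}

fn main() {
    let merged = merge_smallest(
        vec![("cgu-a".into(), 3), ("cgu-b".into(), 1), ("cgu-c".into(), 1), ("cgu-d".into(), 5)],
        2,
    );
    println!("{:?}", merged); // [("cgu-d", 5), ("cgu-a+cgu-b+cgu-c", 5)]
}
```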
|
||||
|
@ -3720,9 +3720,10 @@ fn sidebar_assoc_items(it: &clean::Item) -> String {
|
||||
})).and_then(|did| c.impls.get(&did));
|
||||
if let Some(impls) = inner_impl {
|
||||
out.push_str("<a class=\"sidebar-title\" href=\"#deref-methods\">");
|
||||
out.push_str(&format!("Methods from {:#}<Target={:#}>",
|
||||
impl_.inner_impl().trait_.as_ref().unwrap(),
|
||||
target));
|
||||
out.push_str(&format!("Methods from {}<Target={}>",
|
||||
Escape(&format!("{:#}",
|
||||
impl_.inner_impl().trait_.as_ref().unwrap())),
|
||||
Escape(&format!("{:#}", target))));
|
||||
out.push_str("</a>");
|
||||
let ret = impls.iter()
|
||||
.filter(|i| i.inner_impl().trait_.is_none())
|
||||
|
@ -263,7 +263,7 @@ fn run_test(test: &str, cratename: &str, filename: &FileName, cfgs: Vec<String>,
|
||||
}
|
||||
|
||||
let res = panic::catch_unwind(AssertUnwindSafe(|| {
|
||||
driver::compile_input(&sess, &cstore, &input, &out, &None, None, &control)
|
||||
driver::compile_input(&sess, &cstore, &None, &input, &out, &None, None, &control)
|
||||
}));
|
||||
|
||||
let compile_result = match res {
|
||||
@ -533,7 +533,7 @@ impl Collector {
|
||||
should_panic: testing::ShouldPanic::No,
|
||||
allow_fail,
|
||||
},
|
||||
testfn: testing::DynTestFn(box move |()| {
|
||||
testfn: testing::DynTestFn(box move || {
|
||||
let panic = io::set_panic(None);
|
||||
let print = io::set_print(None);
|
||||
match {
|
||||
|
@ -263,7 +263,7 @@ impl<R: Seek> Seek for BufReader<R> {
    /// See `std::io::Seek` for more details.
    ///
    /// Note: In the edge case where you're seeking with `SeekFrom::Current(n)`
    /// where `n` minus the internal buffer length overflows an `i64`, two
    /// seeks will be performed instead of one. If the second seek returns
    /// `Err`, the underlying reader will be left at the same position it would
    /// have if you seeked to `SeekFrom::Current(0)`.
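In the common case only one underlying seek happens; a small usage sketch (a `Cursor` stands in for any seekable reader, not taken from the diff):

```rust
use std::io::{BufReader, Cursor, Seek, SeekFrom};

fn main() -> std::io::Result<()> {
    let data = Cursor::new(vec![0u8; 1024]);
    let mut reader = BufReader::new(data);

    reader.seek(SeekFrom::Current(10))?;          // one seek on the underlying reader
    let pos = reader.seek(SeekFrom::Current(0))?; // query the logical position
    assert_eq!(pos, 10);
    Ok(())
}
```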
@ -290,7 +290,7 @@ impl Duration {
    }

    /// Checked `Duration` subtraction. Computes `self - other`, returning [`None`]
    /// if the result would be negative or if overflow occurred.
    ///
    /// [`None`]: ../../std/option/enum.Option.html#variant.None
    ///
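A quick sketch of the behaviour the reworded sentence describes (illustrative values only):

```rust
use std::time::Duration;

fn main() {
    let one = Duration::new(1, 0);
    let two = Duration::new(2, 0);
    assert_eq!(one.checked_sub(two), None);                      // would be negative
    assert_eq!(two.checked_sub(one), Some(Duration::new(1, 0)));
}
```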
@ -246,6 +246,7 @@ impl CodeMap {
|
||||
name_was_remapped: bool,
|
||||
crate_of_origin: u32,
|
||||
src_hash: u128,
|
||||
name_hash: u128,
|
||||
source_len: usize,
|
||||
mut file_local_lines: Vec<BytePos>,
|
||||
mut file_local_multibyte_chars: Vec<MultiByteChar>,
|
||||
@ -282,6 +283,7 @@ impl CodeMap {
|
||||
lines: RefCell::new(file_local_lines),
|
||||
multibyte_chars: RefCell::new(file_local_multibyte_chars),
|
||||
non_narrow_chars: RefCell::new(file_local_non_narrow_chars),
|
||||
name_hash,
|
||||
});
|
||||
|
||||
files.push(filemap.clone());
|
||||
|
@ -1115,15 +1115,19 @@ impl<'a, 'b> Folder for InvocationCollector<'a, 'b> {
|
||||
match File::open(&filename).and_then(|mut f| f.read_to_end(&mut buf)) {
|
||||
Ok(..) => {}
|
||||
Err(e) => {
|
||||
self.cx.span_warn(at.span,
|
||||
&format!("couldn't read {}: {}",
|
||||
filename.display(),
|
||||
e));
|
||||
self.cx.span_err(at.span,
|
||||
&format!("couldn't read {}: {}",
|
||||
filename.display(),
|
||||
e));
|
||||
}
|
||||
}
|
||||
|
||||
match String::from_utf8(buf) {
|
||||
Ok(src) => {
|
||||
// Add this input file to the code map to make it available as
|
||||
// dependency information
|
||||
self.cx.codemap().new_filemap_and_lines(&filename, &src);
|
||||
|
||||
let include_info = vec![
|
||||
dummy_spanned(ast::NestedMetaItemKind::MetaItem(
|
||||
attr::mk_name_value_item_str("file".into(),
|
||||
@ -1137,9 +1141,9 @@ impl<'a, 'b> Folder for InvocationCollector<'a, 'b> {
|
||||
attr::mk_list_item("include".into(), include_info))));
|
||||
}
|
||||
Err(_) => {
|
||||
self.cx.span_warn(at.span,
|
||||
&format!("{} wasn't a utf-8 file",
|
||||
filename.display()));
|
||||
self.cx.span_err(at.span,
|
||||
&format!("{} wasn't a utf-8 file",
|
||||
filename.display()));
|
||||
}
|
||||
}
|
||||
} else {
|
||||
|
@ -30,7 +30,7 @@ use std::borrow::Cow;
|
||||
use std::cell::{Cell, RefCell};
|
||||
use std::cmp::{self, Ordering};
|
||||
use std::fmt;
|
||||
use std::hash::Hasher;
|
||||
use std::hash::{Hasher, Hash};
|
||||
use std::ops::{Add, Sub};
|
||||
use std::path::PathBuf;
|
||||
use std::rc::Rc;
|
||||
@ -691,6 +691,8 @@ pub struct FileMap {
|
||||
pub multibyte_chars: RefCell<Vec<MultiByteChar>>,
|
||||
/// Width of characters that are not narrow in the source code
|
||||
pub non_narrow_chars: RefCell<Vec<NonNarrowChar>>,
|
||||
/// A hash of the filename, used for speeding up the incr. comp. hashing.
|
||||
pub name_hash: u128,
|
||||
}
|
||||
|
||||
impl Encodable for FileMap {
|
||||
@ -752,6 +754,9 @@ impl Encodable for FileMap {
|
||||
})?;
|
||||
s.emit_struct_field("non_narrow_chars", 8, |s| {
|
||||
(*self.non_narrow_chars.borrow()).encode(s)
|
||||
})?;
|
||||
s.emit_struct_field("name_hash", 9, |s| {
|
||||
self.name_hash.encode(s)
|
||||
})
|
||||
})
|
||||
}
|
||||
@ -801,6 +806,8 @@ impl Decodable for FileMap {
|
||||
d.read_struct_field("multibyte_chars", 7, |d| Decodable::decode(d))?;
|
||||
let non_narrow_chars: Vec<NonNarrowChar> =
|
||||
d.read_struct_field("non_narrow_chars", 8, |d| Decodable::decode(d))?;
|
||||
let name_hash: u128 =
|
||||
d.read_struct_field("name_hash", 9, |d| Decodable::decode(d))?;
|
||||
Ok(FileMap {
|
||||
name,
|
||||
name_was_remapped,
|
||||
@ -816,7 +823,8 @@ impl Decodable for FileMap {
|
||||
external_src: RefCell::new(ExternalSource::AbsentOk),
|
||||
lines: RefCell::new(lines),
|
||||
multibyte_chars: RefCell::new(multibyte_chars),
|
||||
non_narrow_chars: RefCell::new(non_narrow_chars)
|
||||
non_narrow_chars: RefCell::new(non_narrow_chars),
|
||||
name_hash,
|
||||
})
|
||||
})
|
||||
}
|
||||
@ -836,9 +844,16 @@ impl FileMap {
|
||||
start_pos: BytePos) -> FileMap {
|
||||
remove_bom(&mut src);
|
||||
|
||||
let mut hasher: StableHasher<u128> = StableHasher::new();
|
||||
hasher.write(src.as_bytes());
|
||||
let src_hash = hasher.finish();
|
||||
let src_hash = {
|
||||
let mut hasher: StableHasher<u128> = StableHasher::new();
|
||||
hasher.write(src.as_bytes());
|
||||
hasher.finish()
|
||||
};
|
||||
let name_hash = {
|
||||
let mut hasher: StableHasher<u128> = StableHasher::new();
|
||||
name.hash(&mut hasher);
|
||||
hasher.finish()
|
||||
};
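The same hash-once-and-store idea, sketched with the standard library's hasher rather than the compiler's `StableHasher` (purely illustrative, not the compiler code):

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

fn name_hash(name: &str) -> u64 {
    // Hash the file name a single time up front so later comparisons and
    // incremental-compilation hashing can reuse the small fingerprint.
    let mut hasher = DefaultHasher::new();
    name.hash(&mut hasher);
    hasher.finish()
}

fn main() {
    println!("{:016x}", name_hash("src/lib.rs"));
}
```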
|
||||
let end_pos = start_pos.to_usize() + src.len();
|
||||
|
||||
FileMap {
|
||||
@ -854,6 +869,7 @@ impl FileMap {
|
||||
lines: RefCell::new(Vec::new()),
|
||||
multibyte_chars: RefCell::new(Vec::new()),
|
||||
non_narrow_chars: RefCell::new(Vec::new()),
|
||||
name_hash,
|
||||
}
|
||||
}
|
||||
|
||||
|
@ -35,6 +35,7 @@
|
||||
#![deny(warnings)]
|
||||
|
||||
#![feature(asm)]
|
||||
#![feature(fnbox)]
|
||||
#![cfg_attr(unix, feature(libc))]
|
||||
#![feature(set_stdio)]
|
||||
#![feature(panic_unwind)]
|
||||
@ -56,6 +57,7 @@ use self::OutputLocation::*;
|
||||
|
||||
use std::panic::{catch_unwind, AssertUnwindSafe};
|
||||
use std::any::Any;
|
||||
use std::boxed::FnBox;
|
||||
use std::cmp;
|
||||
use std::collections::BTreeMap;
|
||||
use std::env;
|
||||
@ -133,16 +135,6 @@ pub trait TDynBenchFn: Send {
|
||||
fn run(&self, harness: &mut Bencher);
|
||||
}
|
||||
|
||||
pub trait FnBox<T>: Send + 'static {
|
||||
fn call_box(self: Box<Self>, t: T);
|
||||
}
|
||||
|
||||
impl<T, F: FnOnce(T) + Send + 'static> FnBox<T> for F {
|
||||
fn call_box(self: Box<F>, t: T) {
|
||||
(*self)(t)
|
||||
}
|
||||
}
|
||||
|
||||
// A function that runs a test. If the function returns successfully,
|
||||
// the test succeeds; if the function panics then the test fails. We
|
||||
// may need to come up with a more clever definition of test in order
|
||||
@ -150,7 +142,7 @@ impl<T, F: FnOnce(T) + Send + 'static> FnBox<T> for F {
|
||||
pub enum TestFn {
|
||||
StaticTestFn(fn()),
|
||||
StaticBenchFn(fn(&mut Bencher)),
|
||||
DynTestFn(Box<FnBox<()>>),
|
||||
DynTestFn(Box<FnBox() + Send>),
|
||||
DynBenchFn(Box<TDynBenchFn + 'static>),
|
||||
}
|
||||
|
||||
@ -1337,14 +1329,14 @@ pub fn convert_benchmarks_to_tests(tests: Vec<TestDescAndFn>) -> Vec<TestDescAnd
|
||||
tests.into_iter().map(|x| {
|
||||
let testfn = match x.testfn {
|
||||
DynBenchFn(bench) => {
|
||||
DynTestFn(Box::new(move |()| {
|
||||
DynTestFn(Box::new(move || {
|
||||
bench::run_once(|b| {
|
||||
__rust_begin_short_backtrace(|| bench.run(b))
|
||||
})
|
||||
}))
|
||||
}
|
||||
StaticBenchFn(benchfn) => {
|
||||
DynTestFn(Box::new(move |()| {
|
||||
DynTestFn(Box::new(move || {
|
||||
bench::run_once(|b| {
|
||||
__rust_begin_short_backtrace(|| benchfn(b))
|
||||
})
|
||||
@ -1379,7 +1371,7 @@ pub fn run_test(opts: &TestOpts,
|
||||
fn run_test_inner(desc: TestDesc,
|
||||
monitor_ch: Sender<MonitorMsg>,
|
||||
nocapture: bool,
|
||||
testfn: Box<FnBox<()>>) {
|
||||
testfn: Box<FnBox() + Send>) {
|
||||
struct Sink(Arc<Mutex<Vec<u8>>>);
|
||||
impl Write for Sink {
|
||||
fn write(&mut self, data: &[u8]) -> io::Result<usize> {
|
||||
@ -1405,9 +1397,7 @@ pub fn run_test(opts: &TestOpts,
|
||||
None
|
||||
};
|
||||
|
||||
let result = catch_unwind(AssertUnwindSafe(|| {
|
||||
testfn.call_box(())
|
||||
}));
|
||||
let result = catch_unwind(AssertUnwindSafe(testfn));
|
||||
|
||||
if let Some((printio, panicio)) = oldio {
|
||||
io::set_print(printio);
|
||||
@ -1449,14 +1439,14 @@ pub fn run_test(opts: &TestOpts,
|
||||
return;
|
||||
}
|
||||
DynTestFn(f) => {
|
||||
let cb = move |()| {
|
||||
__rust_begin_short_backtrace(|| f.call_box(()))
|
||||
let cb = move || {
|
||||
__rust_begin_short_backtrace(f)
|
||||
};
|
||||
run_test_inner(desc, monitor_ch, opts.nocapture, Box::new(cb))
|
||||
}
|
||||
StaticTestFn(f) =>
|
||||
run_test_inner(desc, monitor_ch, opts.nocapture,
|
||||
Box::new(move |()| __rust_begin_short_backtrace(f))),
|
||||
Box::new(move || __rust_begin_short_backtrace(f))),
|
||||
}
|
||||
}
|
||||
|
||||
@ -1720,7 +1710,7 @@ mod tests {
|
||||
should_panic: ShouldPanic::No,
|
||||
allow_fail: false,
|
||||
},
|
||||
testfn: DynTestFn(Box::new(move |()| f())),
|
||||
testfn: DynTestFn(Box::new(f)),
|
||||
};
|
||||
let (tx, rx) = channel();
|
||||
run_test(&TestOpts::new(), false, desc, tx);
|
||||
@ -1738,7 +1728,7 @@ mod tests {
|
||||
should_panic: ShouldPanic::No,
|
||||
allow_fail: false,
|
||||
},
|
||||
testfn: DynTestFn(Box::new(move |()| f())),
|
||||
testfn: DynTestFn(Box::new(f)),
|
||||
};
|
||||
let (tx, rx) = channel();
|
||||
run_test(&TestOpts::new(), false, desc, tx);
|
||||
@ -1758,7 +1748,7 @@ mod tests {
|
||||
should_panic: ShouldPanic::Yes,
|
||||
allow_fail: false,
|
||||
},
|
||||
testfn: DynTestFn(Box::new(move |()| f())),
|
||||
testfn: DynTestFn(Box::new(f)),
|
||||
};
|
||||
let (tx, rx) = channel();
|
||||
run_test(&TestOpts::new(), false, desc, tx);
|
||||
@ -1778,7 +1768,7 @@ mod tests {
|
||||
should_panic: ShouldPanic::YesWithMessage("error message"),
|
||||
allow_fail: false,
|
||||
},
|
||||
testfn: DynTestFn(Box::new(move |()| f())),
|
||||
testfn: DynTestFn(Box::new(f)),
|
||||
};
|
||||
let (tx, rx) = channel();
|
||||
run_test(&TestOpts::new(), false, desc, tx);
|
||||
@ -1800,7 +1790,7 @@ mod tests {
|
||||
should_panic: ShouldPanic::YesWithMessage(expected),
|
||||
allow_fail: false,
|
||||
},
|
||||
testfn: DynTestFn(Box::new(move |()| f())),
|
||||
testfn: DynTestFn(Box::new(f)),
|
||||
};
|
||||
let (tx, rx) = channel();
|
||||
run_test(&TestOpts::new(), false, desc, tx);
|
||||
@ -1818,7 +1808,7 @@ mod tests {
|
||||
should_panic: ShouldPanic::Yes,
|
||||
allow_fail: false,
|
||||
},
|
||||
testfn: DynTestFn(Box::new(move |()| f())),
|
||||
testfn: DynTestFn(Box::new(f)),
|
||||
};
|
||||
let (tx, rx) = channel();
|
||||
run_test(&TestOpts::new(), false, desc, tx);
|
||||
@ -1852,7 +1842,7 @@ mod tests {
|
||||
should_panic: ShouldPanic::No,
|
||||
allow_fail: false,
|
||||
},
|
||||
testfn: DynTestFn(Box::new(move |()| {})),
|
||||
testfn: DynTestFn(Box::new(move || {})),
|
||||
},
|
||||
TestDescAndFn {
|
||||
desc: TestDesc {
|
||||
@ -1861,7 +1851,7 @@ mod tests {
|
||||
should_panic: ShouldPanic::No,
|
||||
allow_fail: false,
|
||||
},
|
||||
testfn: DynTestFn(Box::new(move |()| {})),
|
||||
testfn: DynTestFn(Box::new(move || {})),
|
||||
}];
|
||||
let filtered = filter_tests(&opts, tests);
|
||||
|
||||
@ -1885,7 +1875,7 @@ mod tests {
|
||||
should_panic: ShouldPanic::No,
|
||||
allow_fail: false,
|
||||
},
|
||||
testfn: DynTestFn(Box::new(move |()| {}))
|
||||
testfn: DynTestFn(Box::new(move || {}))
|
||||
})
|
||||
.collect()
|
||||
}
|
||||
@ -1967,7 +1957,7 @@ mod tests {
|
||||
should_panic: ShouldPanic::No,
|
||||
allow_fail: false,
|
||||
},
|
||||
testfn: DynTestFn(Box::new(move |()| testfn())),
|
||||
testfn: DynTestFn(Box::new(testfn)),
|
||||
};
|
||||
tests.push(test);
|
||||
}
|
||||
|
@ -35,20 +35,24 @@ The error levels that you can have are:
## Summary of Header Commands

Header commands specify something about the entire test file as a
whole. They are normally put right after the copyright comment, e.g.:

```Rust
// Copyright blah blah blah
// except according to those terms.

// ignore-test This doesn't actually work
```

### Ignoring tests

These are used to ignore the test in some situations, which means the test won't
be compiled or run.

* `ignore-X` where `X` is a target detail or stage will ignore the test accordingly (see below)
* `ignore-pretty` will not compile the pretty-printed test (this is done to test the pretty-printer, but might not always work)
* `ignore-test` always ignores the test
* `ignore-lldb` and `ignore-gdb` will skip a debuginfo test on that debugger.

Some examples of `X` in `ignore-X`:

@ -58,6 +62,22 @@ Some examples of `X` in `ignore-X`:
* Pointer width: `32bit`, `64bit`.
* Stage: `stage0`, `stage1`, `stage2`.

### Other Header Commands

* `min-{gdb,lldb}-version`
* `min-llvm-version`
* `must-compile-successfully` for UI tests, indicates that the test is supposed
  to compile, as opposed to the default where the test is supposed to error out.
* `compile-flags` passes extra command-line args to the compiler,
  e.g. `compile-flags -g` which forces debuginfo to be enabled.
* `should-fail` indicates that the test should fail; used for "meta testing",
  where we test the compiletest program itself to check that it will generate
  errors in appropriate scenarios. This header is ignored for pretty-printer tests.
* `gate-test-X` where `X` is a feature marks the test as "gate test" for feature X.
  Such tests are supposed to ensure that the compiler errors when usage of a gated
  feature is attempted without the proper `#![feature(X)]` tag.
  Each unstable lang feature is required to have a gate test.

## Revisions

Certain classes of tests support "revisions" (as of the time of this
@ -109,6 +129,12 @@ fails, we will print out the current output, but it is also saved in
printed as part of the test failure message), so you can run `diff` and
so forth.

Normally, the test-runner checks that UI tests fail compilation. If you want
to do a UI test for code that *compiles* (e.g. to test warnings, or if you
have a collection of tests, only some of which error out), you can use the
`// must-compile-successfully` header command to have the test runner instead
check that the test compiles successfully.
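A hypothetical UI test using that header (the file body and the warning it triggers are made up for illustration, not taken from the suite):

```rust
// must-compile-successfully

#![warn(unused_variables)]

fn main() {
    let unused = 3; // the warning this produces would be captured in the reference output
}
```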
### Editing and updating the reference files
|
||||
|
||||
If you have changed the compiler's output intentionally, or you are
|
||||
|
src/test/compile-fail/external-doc-error.rs (new file, 16 lines)
@ -0,0 +1,16 @@
|
||||
// Copyright 2012-2015 The Rust Project Developers. See the COPYRIGHT
|
||||
// file at the top-level directory of this distribution and at
|
||||
// http://rust-lang.org/COPYRIGHT.
|
||||
//
|
||||
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
|
||||
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
|
||||
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
|
||||
// option. This file may not be copied, modified, or distributed
|
||||
// except according to those terms.
|
||||
|
||||
#![feature(external_doc)]
|
||||
|
||||
#[doc(include = "not-a-file.md")] //~ ERROR: couldn't read
|
||||
pub struct SomeStruct;
|
||||
|
||||
fn main() {}
|
src/test/compile-fail/issue-46771.rs (new file, 14 lines)
@ -0,0 +1,14 @@
|
||||
// Copyright 2017 The Rust Project Developers. See the COPYRIGHT
|
||||
// file at the top-level directory of this distribution and at
|
||||
// http://rust-lang.org/COPYRIGHT.
|
||||
//
|
||||
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
|
||||
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
|
||||
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
|
||||
// option. This file may not be copied, modified, or distributed
|
||||
// except according to those terms.
|
||||
|
||||
fn main() {
|
||||
struct Foo;
|
||||
(1 .. 2).find(|_| Foo(0) == 0); //~ ERROR expected function, found `main::Foo`
|
||||
}
|
@ -8,7 +8,7 @@ ifneq ($(shell uname),FreeBSD)
|
||||
ifndef IS_WINDOWS
|
||||
all:
|
||||
$(RUSTC) --emit dep-info main.rs
|
||||
$(CGREP) "input.txt" "input.bin" < $(TMPDIR)/main.d
|
||||
$(CGREP) "input.txt" "input.bin" "input.md" < $(TMPDIR)/main.d
|
||||
else
|
||||
all:
|
||||
|
||||
|
src/test/run-make/include_bytes_deps/input.md (new file, 1 line)
@ -0,0 +1 @@
|
||||
# Hello, world!
|
@ -8,6 +8,11 @@
|
||||
// option. This file may not be copied, modified, or distributed
|
||||
// except according to those terms.
|
||||
|
||||
#![feature(external_doc)]
|
||||
|
||||
#[doc(include="input.md")]
|
||||
pub struct SomeStruct;
|
||||
|
||||
pub fn main() {
|
||||
const INPUT_TXT: &'static str = include_str!("input.txt");
|
||||
const INPUT_BIN: &'static [u8] = include_bytes!("input.bin");
|
||||
|
@ -71,5 +71,5 @@ fn compile(code: String, output: PathBuf, sysroot: PathBuf) {
|
||||
let (sess, cstore) = basic_sess(sysroot);
|
||||
let control = CompileController::basic();
|
||||
let input = Input::Str { name: FileName::Anon, input: code };
|
||||
let _ = compile_input(&sess, &cstore, &input, &None, &Some(output), None, &control);
|
||||
let _ = compile_input(&sess, &cstore, &None, &input, &None, &Some(output), None, &control);
|
||||
}
|
||||
|
src/test/run-make/output-filename-overwrites-input/Makefile (new file, 10 lines)
@ -0,0 +1,10 @@
|
||||
-include ../tools.mk
|
||||
|
||||
all:
|
||||
cp foo.rs $(TMPDIR)/foo
|
||||
$(RUSTC) $(TMPDIR)/foo 2>&1 \
|
||||
| $(CGREP) -e "the input file \".*foo\" would be overwritten by the generated executable"
|
||||
$(RUSTC) foo.rs 2>&1 && $(RUSTC) -Z ls $(TMPDIR)/foo 2>&1
|
||||
cp foo.rs $(TMPDIR)/foo.rs
|
||||
$(RUSTC) $(TMPDIR)/foo.rs -o $(TMPDIR)/foo.rs 2>&1 \
|
||||
| $(CGREP) -e "the input file \".*foo.rs\" would be overwritten by the generated executable"
|
src/test/run-make/output-filename-overwrites-input/foo.rs (new file, 11 lines)
@ -0,0 +1,11 @@
|
||||
// Copyright 2017 The Rust Project Developers. See the COPYRIGHT
|
||||
// file at the top-level directory of this distribution and at
|
||||
// http://rust-lang.org/COPYRIGHT.
|
||||
//
|
||||
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
|
||||
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
|
||||
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
|
||||
// option. This file may not be copied, modified, or distributed
|
||||
// except according to those terms.
|
||||
|
||||
fn main() {}
|
@ -7,8 +7,8 @@ all:
|
||||
cp foo.rs $(TMPDIR)/.foo.bar
|
||||
$(RUSTC) $(TMPDIR)/.foo.bar 2>&1 \
|
||||
| $(CGREP) -e "invalid character.*in crate name:"
|
||||
cp foo.rs $(TMPDIR)/+foo+bar
|
||||
$(RUSTC) $(TMPDIR)/+foo+bar 2>&1 \
|
||||
cp foo.rs $(TMPDIR)/+foo+bar.rs
|
||||
$(RUSTC) $(TMPDIR)/+foo+bar.rs 2>&1 \
|
||||
| $(CGREP) -e "invalid character.*in crate name:"
|
||||
cp foo.rs $(TMPDIR)/-foo.rs
|
||||
$(RUSTC) $(TMPDIR)/-foo.rs 2>&1 \
|
||||
|
@ -19,7 +19,9 @@ impl<T: Copy> Clone for Packed<T> {
|
||||
fn sanity_check_size<T: Copy>(one: T) {
|
||||
let two = [one, one];
|
||||
let stride = (&two[1] as *const _ as usize) - (&two[0] as *const _ as usize);
|
||||
assert_eq!(stride, std::mem::size_of_val(&one));
|
||||
let (size, align) = (std::mem::size_of::<T>(), std::mem::align_of::<T>());
|
||||
assert_eq!(stride, size);
|
||||
assert_eq!(size % align, 0);
|
||||
}
|
||||
|
||||
fn main() {
|
||||
@ -32,5 +34,12 @@ fn main() {
|
||||
// In #46769, `Option<(Packed<&()>, bool)>` was found to have
|
||||
// pointer alignment, without actually being aligned in size.
|
||||
// E.g. on 64-bit platforms, it had alignment `8` but size `9`.
|
||||
sanity_check_size(Some((Packed(&()), true)));
|
||||
type PackedRefAndBool<'a> = (Packed<&'a ()>, bool);
|
||||
sanity_check_size::<Option<PackedRefAndBool>>(Some((Packed(&()), true)));
|
||||
|
||||
// Make sure we don't pay for the enum optimization in size,
|
||||
// e.g. we shouldn't need extra padding after the packed data.
|
||||
assert_eq!(std::mem::align_of::<Option<PackedRefAndBool>>(), 1);
|
||||
assert_eq!(std::mem::size_of::<Option<PackedRefAndBool>>(),
|
||||
std::mem::size_of::<PackedRefAndBool>());
|
||||
}
|
||||
|
@ -9,6 +9,7 @@
|
||||
// except according to those terms.
|
||||
|
||||
#![feature(external_doc)]
|
||||
#![deny(missing_doc)]
|
||||
|
||||
#[doc(include="external-cross-doc.md")]
|
||||
pub struct NeedMoreDocs;
|
||||
|
src/test/rustdoc/escape-deref-methods.rs (new file, 45 lines)
@ -0,0 +1,45 @@
|
||||
// Copyright 2017 The Rust Project Developers. See the COPYRIGHT
|
||||
// file at the top-level directory of this distribution and at
|
||||
// http://rust-lang.org/COPYRIGHT.
|
||||
//
|
||||
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
|
||||
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
|
||||
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
|
||||
// option. This file may not be copied, modified, or distributed
|
||||
// except according to those terms.
|
||||
|
||||
#![crate_name = "foo"]
|
||||
|
||||
use std::ops::{Deref, DerefMut};
|
||||
|
||||
#[derive(Debug, Clone)]
|
||||
pub struct Title {
|
||||
name: String,
|
||||
}
|
||||
|
||||
#[derive(Debug, Clone)]
|
||||
pub struct TitleList {
|
||||
pub members: Vec<Title>,
|
||||
}
|
||||
|
||||
impl TitleList {
|
||||
pub fn new() -> Self {
|
||||
TitleList { members: Vec::new() }
|
||||
}
|
||||
}
|
||||
|
||||
impl Deref for TitleList {
|
||||
type Target = Vec<Title>;
|
||||
|
||||
fn deref(&self) -> &Self::Target {
|
||||
&self.members
|
||||
}
|
||||
}
|
||||
|
||||
// @has foo/struct.TitleList.html
|
||||
// @has - '//*[@class="sidebar-title"]' 'Methods from Deref<Target=Vec<Title>>'
|
||||
impl DerefMut for TitleList {
|
||||
fn deref_mut(&mut self) -> &mut Self::Target {
|
||||
&mut self.members
|
||||
}
|
||||
}
|
@ -711,7 +711,7 @@ pub fn make_test_name(config: &Config, testpaths: &TestPaths) -> test::TestName
|
||||
pub fn make_test_closure(config: &Config, testpaths: &TestPaths) -> test::TestFn {
|
||||
let config = config.clone();
|
||||
let testpaths = testpaths.clone();
|
||||
test::DynTestFn(Box::new(move |()| runtest::run(config, &testpaths)))
|
||||
test::DynTestFn(Box::new(move || runtest::run(config, &testpaths)))
|
||||
}
|
||||
|
||||
/// Returns (Path to GDB, GDB Version, GDB has Rust Support)
|
||||
|