Auto merge of #58669 - Centril:rollup, r=Centril

Rollup of 16 pull requests

Successful merges:

 - #58100 (Transition librustdoc to Rust 2018)
 - #58122 (RangeInclusive internal iteration performance improvement.)
 - #58199 (Add better error message for partial move)
 - #58227 (Updated RELEASES.md for 1.33.0)
 - #58353 (Check the Self-type of inherent associated constants)
 - #58453 (SGX target: fix panic = abort)
 - #58476 (Remove `LazyTokenStream`.)
 - #58526 (Special suggestion for illegal unicode curly quote pairs)
 - #58595 (Turn duration consts into associated consts)
 - #58609 (Allow Self::Module to be mutated.)
 - #58628 (Optimise vec![false; N] to zero-alloc)
 - #58643 (Don't generate minification variables if minification disabled)
 - #58648 (Update tests to account for cross-platform testing and miri.)
 - #58654 (Do not underflow after resetting unmatched braces count)
 - #58658 (Add expected/provided byte alignments to validation error message)
 - #58667 (Reduce Miri-related Code Repetition like `(n << amt) >> amt`)

Failed merges:

r? @ghost
bors 2019-02-23 14:14:04 +00:00
commit 7f19f161f2
94 changed files with 1125 additions and 765 deletions

View file

@ -1,3 +1,151 @@
Version 1.33.0 (2019-02-28)
==========================
Language
--------
- [You can now use the `cfg(target_vendor)` attribute.][57465] E.g.
`#[cfg(target_vendor="apple")] fn main() { println!("Hello Apple!"); }`
- [Integer patterns such as in a match expression can now be exhaustive.][56362]
E.g. you can have a `match` statement on a `u8` that covers `0..=255` and
no longer be required to include a `_ => unreachable!()` arm.
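For instance, a minimal sketch (names here are illustrative) of an exhaustive `u8` match that needs no wildcard arm:
```rust
fn describe(byte: u8) -> &'static str {
    // Every `u8` value is covered, so no `_ => unreachable!()` arm is required.
    match byte {
        0 => "zero",
        1..=127 => "low",
        128..=255 => "high",
    }
}

fn main() {
    assert_eq!(describe(200), "high");
}
```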
- [You can now have multiple patterns in `if let` and `while let`
expressions.][57532] You can do this with the same syntax as a `match`
expression. E.g.
```rust
enum Creature {
Crab(String),
Lobster(String),
Person(String),
}
fn main() {
let state = Creature::Crab(String::from("Ferris"));
if let Creature::Crab(name) | Creature::Person(name) = state {
println!("This creature's name is: {}", name);
}
}
```
- [You can now have irrefutable `if let` and `while let` patterns.][57535] Using
this feature will by default produce a warning as this behaviour can be
unintuitive. E.g. `if let _ = 5 {}`
- [You can now use `let` bindings, assignments, expression statements,
and irrefutable pattern destructuring in const functions.][57175]
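A small sketch (the function and constant names are made up for illustration) combining the newly allowed `let` bindings, assignment, and irrefutable destructuring inside a `const fn`:
```rust
const fn checksum(pair: (u32, u32)) -> u32 {
    let (a, b) = pair;             // irrefutable pattern destructuring
    let mut total = a;             // `let` binding
    total = total.wrapping_add(b); // assignment as an expression statement
    total
}

const SUM: u32 = checksum((40, 2));

fn main() {
    assert_eq!(SUM, 42);
}
```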
- [You can now call unsafe const functions.][57067] E.g.
```rust
const unsafe fn foo() -> i32 { 5 }
const fn bar() -> i32 {
unsafe { foo() }
}
```
- [You can now specify multiple attributes in a `cfg_attr` attribute.][57332]
E.g. `#[cfg_attr(all(), must_use, optimize)]`
- [You can now specify a specific alignment with the `#[repr(packed)]`
attribute.][57049] E.g. `#[repr(packed(2))] struct Foo(i16, i32);` is a struct
with an alignment of 2 bytes and a size of 6 bytes.
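A quick way to check that claim (a sketch using `std::mem`, not taken from the release notes themselves):
```rust
use std::mem::{align_of, size_of};

#[repr(packed(2))]
struct Foo(i16, i32);

fn main() {
    // `packed(2)` caps the struct's alignment at 2 bytes, so the `i32`
    // field is placed at offset 2 and the total size is 6 bytes.
    assert_eq!(align_of::<Foo>(), 2);
    assert_eq!(size_of::<Foo>(), 6);
}
```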
- [You can now import an item from a module as an `_`.][56303] This allows you to
import a trait's impls, and not have the name in the namespace. E.g.
```rust
use std::io::Read as _;
// Allowed as there is only one `Read` in the module.
pub trait Read {}
```
- [`extern` functions will now abort by default when panicking.][55982]
This was previously undefined behaviour.
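As a minimal illustration (hypothetical function), a panic inside an `extern "C"` function now aborts the process instead of unwinding into foreign code:
```rust
#[no_mangle]
pub extern "C" fn process(len: usize) {
    // A panic here now aborts the process rather than unwinding across
    // the FFI boundary into the foreign caller (previously undefined behaviour).
    assert!(len > 0, "len must be non-zero");
}

fn main() {
    process(1); // fine; calling with 0 would panic and therefore abort
}
```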
Compiler
--------
- [You can now set a linker flavor for `rustc` with the `-Clinker-flavor`
command line argument.][56351]
- [The minimum required LLVM version has been bumped to 6.0.][56642]
- [Added support for the PowerPC64 architecture on FreeBSD.][57615]
- [The `x86_64-fortanix-unknown-sgx` target support has been upgraded to
tier 2 support.][57130] Visit the [platform support][platform-support] page for
information on Rust's platform support.
- [Added support for the `thumbv7neon-linux-androideabi` and
`thumbv7neon-unknown-linux-gnueabihf` targets.][56947]
- [Added support for the `x86_64-unknown-uefi` target.][56769]
Libraries
---------
- [The methods `overflowing_{add, sub, mul, shl, shr}` are now `const`
functions for all numeric types.][57566]
- [The methods `rotate_left`, `rotate_right`, and `wrapping_{add, sub, mul, shl, shr}`
are now `const` functions for all numeric types.][57105]
- [The methods `is_positive` and `is_negative` are now `const` functions for
all signed numeric types.][57105]
- [The `get` method for all `NonZero` types is now `const`.][57167]
- [The methods `count_ones`, `count_zeros`, `leading_zeros`, `trailing_zeros`,
`swap_bytes`, `from_be`, `from_le`, `to_be`, `to_le` are now `const` for all
numeric types.][57234]
- [`Ipv4Addr::new` is now a `const` function][57234]
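Taken together, these allow more numeric setup to move into `const` items; a small sketch (the constants are illustrative, not from the notes):
```rust
use std::net::Ipv4Addr;

const LOCALHOST: Ipv4Addr = Ipv4Addr::new(127, 0, 0, 1);
const HIGH_BIT: u32 = 1u32.rotate_left(31);              // 0x8000_0000
const NO_LEADING_ZEROS: u32 = HIGH_BIT.leading_zeros();  // 0

fn main() {
    assert_eq!(LOCALHOST, Ipv4Addr::new(127, 0, 0, 1));
    assert_eq!(HIGH_BIT, 0x8000_0000);
    assert_eq!(NO_LEADING_ZEROS, 0);
}
```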
Stabilized APIs
---------------
- [`unix::FileExt::read_exact_at`]
- [`unix::FileExt::write_all_at`]
- [`Option::transpose`]
- [`Result::transpose`]
- [`convert::identity`]
- [`pin::Pin`]
- [`marker::Unpin`]
- [`marker::PhantomPinned`]
- [`Vec::resize_with`]
- [`VecDeque::resize_with`]
- [`Duration::as_millis`]
- [`Duration::as_micros`]
- [`Duration::as_nanos`]
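A few of these stabilized APIs in action (a sketch; the values are chosen only for illustration):
```rust
use std::time::Duration;

fn main() {
    let d = Duration::new(2, 500_000_000);
    assert_eq!(d.as_millis(), 2_500);
    assert_eq!(d.as_micros(), 2_500_000);

    // `transpose` swaps the nesting of Option and Result.
    let x: Option<Result<u32, String>> = Some(Ok(5));
    assert_eq!(x.transpose(), Ok(Some(5)));

    // `resize_with` grows a Vec using a closure to produce new elements.
    let mut v = vec![1, 2, 3];
    v.resize_with(5, Default::default);
    assert_eq!(v, [1, 2, 3, 0, 0]);
}
```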
Cargo
-----
- [Cargo should now rebuild a crate if a file was modified during the initial
build.][cargo/6484]
Compatibility Notes
-------------------
- The methods `str::{trim_left, trim_right, trim_left_matches, trim_right_matches}`
are now deprecated in the standard library, and their usage will now produce a warning.
Please use the `str::{trim_start, trim_end, trim_start_matches, trim_end_matches}`
methods instead.
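For reference, the renamed methods behave the same as the old directional names; a minimal sketch:
```rust
fn main() {
    let s = "  hello  ";
    assert_eq!(s.trim_start(), "hello  ");
    assert_eq!(s.trim_end(), "  hello");
    assert_eq!("xxfoo".trim_start_matches('x'), "foo");
}
```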
[57615]: https://github.com/rust-lang/rust/pull/57615/
[57465]: https://github.com/rust-lang/rust/pull/57465/
[57532]: https://github.com/rust-lang/rust/pull/57532/
[57535]: https://github.com/rust-lang/rust/pull/57535/
[57566]: https://github.com/rust-lang/rust/pull/57566/
[57130]: https://github.com/rust-lang/rust/pull/57130/
[57167]: https://github.com/rust-lang/rust/pull/57167/
[57175]: https://github.com/rust-lang/rust/pull/57175/
[57234]: https://github.com/rust-lang/rust/pull/57234/
[57332]: https://github.com/rust-lang/rust/pull/57332/
[56947]: https://github.com/rust-lang/rust/pull/56947/
[57049]: https://github.com/rust-lang/rust/pull/57049/
[57067]: https://github.com/rust-lang/rust/pull/57067/
[56769]: https://github.com/rust-lang/rust/pull/56769/
[56642]: https://github.com/rust-lang/rust/pull/56642/
[56303]: https://github.com/rust-lang/rust/pull/56303/
[56351]: https://github.com/rust-lang/rust/pull/56351/
[55982]: https://github.com/rust-lang/rust/pull/55982/
[56362]: https://github.com/rust-lang/rust/pull/56362
[57105]: https://github.com/rust-lang/rust/pull/57105
[cargo/6484]: https://github.com/rust-lang/cargo/pull/6484/
[`unix::FileExt::read_exact_at`]: https://doc.rust-lang.org/std/os/unix/fs/trait.FileExt.html#method.read_exact_at
[`unix::FileExt::write_all_at`]: https://doc.rust-lang.org/std/os/unix/fs/trait.FileExt.html#method.write_all_at
[`Option::transpose`]: https://doc.rust-lang.org/std/option/enum.Option.html#method.transpose
[`Result::transpose`]: https://doc.rust-lang.org/std/result/enum.Result.html#method.transpose
[`convert::identity`]: https://doc.rust-lang.org/std/convert/fn.identity.html
[`pin::Pin`]: https://doc.rust-lang.org/std/pin/struct.Pin.html
[`marker::Unpin`]: https://doc.rust-lang.org/stable/std/marker/trait.Unpin.html
[`marker::PhantomPinned`]: https://doc.rust-lang.org/nightly/std/marker/struct.PhantomPinned.html
[`Vec::resize_with`]: https://doc.rust-lang.org/std/vec/struct.Vec.html#method.resize_with
[`VecDeque::resize_with`]: https://doc.rust-lang.org/std/collections/struct.VecDeque.html#method.resize_with
[`Duration::as_millis`]: https://doc.rust-lang.org/std/time/struct.Duration.html#method.as_millis
[`Duration::as_micros`]: https://doc.rust-lang.org/std/time/struct.Duration.html#method.as_micros
[`Duration::as_nanos`]: https://doc.rust-lang.org/std/time/struct.Duration.html#method.as_nanos
[platform-support]: https://forge.rust-lang.org/platform-support.html
Version 1.32.0 (2019-01-17)
==========================

View file

@ -1610,6 +1610,7 @@ impl_is_zero!(u64, |x| x == 0);
impl_is_zero!(u128, |x| x == 0);
impl_is_zero!(usize, |x| x == 0);
impl_is_zero!(bool, |x| x == false);
impl_is_zero!(char, |x| x == '\0');
impl_is_zero!(f32, |x: f32| x.to_bits() == 0);
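With `bool` (and `char`) covered by the `IsZero` specialization shown above, calls like these can now be served by a zeroed allocation instead of an element-by-element fill; a hypothetical usage sketch:
```rust
fn main() {
    // `false` and `'\0'` are all-zero bit patterns, so both vectors can
    // take the zeroed-allocation fast path rather than a write loop.
    let flags = vec![false; 4096];
    let padding = vec!['\0'; 4096];
    assert_eq!(flags.len(), 4096);
    assert_eq!(padding.len(), 4096);
}
```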

View file

@ -1,6 +1,6 @@
use convert::TryFrom;
use mem;
use ops::{self, Add, Sub};
use ops::{self, Add, Sub, Try};
use usize;
use super::{FusedIterator, TrustedLen};
@ -368,11 +368,11 @@ impl<A: Step> Iterator for ops::RangeInclusive<A> {
Some(Less) => {
self.is_empty = Some(false);
self.start = plus_n.add_one();
return Some(plus_n)
return Some(plus_n);
}
Some(Equal) => {
self.is_empty = Some(true);
return Some(plus_n)
return Some(plus_n);
}
_ => {}
}
@ -382,6 +382,34 @@ impl<A: Step> Iterator for ops::RangeInclusive<A> {
None
}
#[inline]
fn try_fold<B, F, R>(&mut self, init: B, mut f: F) -> R
where
Self: Sized, F: FnMut(B, Self::Item) -> R, R: Try<Ok=B>
{
self.compute_is_empty();
if self.is_empty() {
return Try::from_ok(init);
}
let mut accum = init;
while self.start < self.end {
let n = self.start.add_one();
let n = mem::replace(&mut self.start, n);
accum = f(accum, n)?;
}
self.is_empty = Some(true);
if self.start == self.end {
accum = f(accum, self.start.clone())?;
}
Try::from_ok(accum)
}
#[inline]
fn last(mut self) -> Option<A> {
self.next_back()
@ -415,6 +443,33 @@ impl<A: Step> DoubleEndedIterator for ops::RangeInclusive<A> {
self.end.clone()
})
}
#[inline]
fn try_rfold<B, F, R>(&mut self, init: B, mut f: F) -> R where
Self: Sized, F: FnMut(B, Self::Item) -> R, R: Try<Ok=B>
{
self.compute_is_empty();
if self.is_empty() {
return Try::from_ok(init);
}
let mut accum = init;
while self.start < self.end {
let n = self.end.sub_one();
let n = mem::replace(&mut self.end, n);
accum = f(accum, n)?;
}
self.is_empty = Some(true);
if self.start == self.end {
accum = f(accum, self.start.clone())?;
}
Try::from_ok(accum)
}
}
#[stable(feature = "fused", since = "1.26.0")]
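These `try_fold`/`try_rfold` overrides give `RangeInclusive` a cheaper internal-iteration path than repeatedly calling `next`. A small check of the observable behaviour (a sketch, not part of the diff):
```rust
fn main() {
    // Adapters built on internal iteration benefit from the specialized paths.
    let forward: i32 = (1..=10).sum();
    let backward: i32 = (1..=10).rev().sum();
    assert_eq!(forward, 55);
    assert_eq!(backward, 55);

    let mut it = 1..=10;
    assert_eq!(it.try_fold(0i32, i32::checked_add), Some(55));
}
```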

View file

@ -334,12 +334,14 @@ pub struct RangeInclusive<Idx> {
trait RangeInclusiveEquality: Sized {
fn canonicalized_is_empty(range: &RangeInclusive<Self>) -> bool;
}
impl<T> RangeInclusiveEquality for T {
#[inline]
default fn canonicalized_is_empty(range: &RangeInclusive<Self>) -> bool {
range.is_empty.unwrap_or_default()
}
}
impl<T: PartialOrd> RangeInclusiveEquality for T {
#[inline]
fn canonicalized_is_empty(range: &RangeInclusive<Self>) -> bool {

View file

@ -1741,19 +1741,37 @@ fn test_range_inclusive_folds() {
assert_eq!((1..=10).sum::<i32>(), 55);
assert_eq!((1..=10).rev().sum::<i32>(), 55);
let mut it = 40..=50;
let mut it = 44..=50;
assert_eq!(it.try_fold(0, i8::checked_add), None);
assert_eq!(it, 44..=50);
assert_eq!(it, 47..=50);
assert_eq!(it.try_fold(0, i8::checked_add), None);
assert_eq!(it, 50..=50);
assert_eq!(it.try_fold(0, i8::checked_add), Some(50));
assert!(it.is_empty());
assert_eq!(it.try_fold(0, i8::checked_add), Some(0));
assert!(it.is_empty());
let mut it = 40..=47;
assert_eq!(it.try_rfold(0, i8::checked_add), None);
assert_eq!(it, 44..=47);
assert_eq!(it, 40..=44);
assert_eq!(it.try_rfold(0, i8::checked_add), None);
assert_eq!(it, 40..=41);
assert_eq!(it.try_rfold(0, i8::checked_add), Some(81));
assert!(it.is_empty());
assert_eq!(it.try_rfold(0, i8::checked_add), Some(0));
assert!(it.is_empty());
let mut it = 10..=20;
assert_eq!(it.try_fold(0, |a,b| Some(a+b)), Some(165));
assert!(it.is_empty());
assert_eq!(it.try_fold(0, |a,b| Some(a+b)), Some(0));
assert!(it.is_empty());
let mut it = 10..=20;
assert_eq!(it.try_rfold(0, |a,b| Some(a+b)), Some(165));
assert!(it.is_empty());
assert_eq!(it.try_rfold(0, |a,b| Some(a+b)), Some(0));
assert!(it.is_empty());
}
#[test]

View file

@ -23,22 +23,6 @@ const MILLIS_PER_SEC: u64 = 1_000;
const MICROS_PER_SEC: u64 = 1_000_000;
const MAX_NANOS_F64: f64 = ((u64::MAX as u128 + 1)*(NANOS_PER_SEC as u128)) as f64;
/// The duration of one second.
#[unstable(feature = "duration_constants", issue = "57391")]
pub const SECOND: Duration = Duration::from_secs(1);
/// The duration of one millisecond.
#[unstable(feature = "duration_constants", issue = "57391")]
pub const MILLISECOND: Duration = Duration::from_millis(1);
/// The duration of one microsecond.
#[unstable(feature = "duration_constants", issue = "57391")]
pub const MICROSECOND: Duration = Duration::from_micros(1);
/// The duration of one nanosecond.
#[unstable(feature = "duration_constants", issue = "57391")]
pub const NANOSECOND: Duration = Duration::from_nanos(1);
/// A `Duration` type to represent a span of time, typically used for system
/// timeouts.
///
@ -75,6 +59,58 @@ pub struct Duration {
}
impl Duration {
/// The duration of one second.
///
/// # Examples
///
/// ```
/// #![feature(duration_constants)]
/// use std::time::Duration;
///
/// assert_eq!(Duration::SECOND, Duration::from_secs(1));
/// ```
#[unstable(feature = "duration_constants", issue = "57391")]
pub const SECOND: Duration = Duration::from_secs(1);
/// The duration of one millisecond.
///
/// # Examples
///
/// ```
/// #![feature(duration_constants)]
/// use std::time::Duration;
///
/// assert_eq!(Duration::MILLISECOND, Duration::from_millis(1));
/// ```
#[unstable(feature = "duration_constants", issue = "57391")]
pub const MILLISECOND: Duration = Duration::from_millis(1);
/// The duration of one microsecond.
///
/// # Examples
///
/// ```
/// #![feature(duration_constants)]
/// use std::time::Duration;
///
/// assert_eq!(Duration::MICROSECOND, Duration::from_micros(1));
/// ```
#[unstable(feature = "duration_constants", issue = "57391")]
pub const MICROSECOND: Duration = Duration::from_micros(1);
/// The duration of one nanosecond.
///
/// # Examples
///
/// ```
/// #![feature(duration_constants)]
/// use std::time::Duration;
///
/// assert_eq!(Duration::NANOSECOND, Duration::from_nanos(1));
/// ```
#[unstable(feature = "duration_constants", issue = "57391")]
pub const NANOSECOND: Duration = Duration::from_nanos(1);
/// Creates a new `Duration` from the specified number of whole seconds and
/// additional nanoseconds.
///

View file

@ -58,8 +58,9 @@ pub unsafe extern fn __rust_start_panic(_payload: usize) -> u32 {
#[cfg(all(target_vendor="fortanix", target_env="sgx"))]
unsafe fn abort() -> ! {
extern "C" { pub fn panic_exit() -> !; }
panic_exit();
// call std::sys::abort_internal
extern "C" { pub fn __rust_abort() -> !; }
__rust_abort();
}
}

View file

@ -1124,19 +1124,19 @@ impl<'a> LoweringContext<'a> {
TokenTree::Delimited(span, delim, tts) => TokenTree::Delimited(
span,
delim,
self.lower_token_stream(tts.into()).into(),
self.lower_token_stream(tts),
).into(),
}
}
fn lower_token(&mut self, token: Token, span: Span) -> TokenStream {
match token {
Token::Interpolated(_) => {}
other => return TokenTree::Token(span, other).into(),
Token::Interpolated(nt) => {
let tts = nt.to_tokenstream(&self.sess.parse_sess, span);
self.lower_token_stream(tts)
}
other => TokenTree::Token(span, other).into(),
}
let tts = token.interpolated_to_tokenstream(&self.sess.parse_sess, span);
self.lower_token_stream(tts)
}
fn lower_arm(&mut self, arm: &Arm) -> hir::Arm {

View file

@ -339,7 +339,7 @@ impl<'a> visit::Visitor<'a> for DefCollector<'a> {
fn visit_token(&mut self, t: Token) {
if let Token::Interpolated(nt) = t {
if let token::NtExpr(ref expr) = nt.0 {
if let token::NtExpr(ref expr) = *nt {
if let ExprKind::Mac(..) = expr.node {
self.visit_macro_invoc(expr.id);
}

View file

@ -9,7 +9,7 @@ use rustc_allocator::{ALLOCATOR_METHODS, AllocatorTy};
use crate::ModuleLlvm;
use crate::llvm::{self, False, True};
pub(crate) unsafe fn codegen(tcx: TyCtxt, mods: &ModuleLlvm, kind: AllocatorKind) {
pub(crate) unsafe fn codegen(tcx: TyCtxt, mods: &mut ModuleLlvm, kind: AllocatorKind) {
let llcx = &*mods.llcx;
let llmod = mods.llmod();
let usize = match &tcx.sess.target.target.target_pointer_width[..] {

View file

@ -46,7 +46,7 @@ use crate::value::Value;
pub fn write_metadata<'a, 'gcx>(
tcx: TyCtxt<'a, 'gcx, 'gcx>,
llvm_module: &ModuleLlvm
llvm_module: &mut ModuleLlvm
) -> EncodedMetadata {
use std::io::Write;
use flate2::Compression;

View file

@ -120,11 +120,11 @@ impl ExtraBackendMethods for LlvmCodegenBackend {
fn write_metadata<'b, 'gcx>(
&self,
tcx: TyCtxt<'b, 'gcx, 'gcx>,
metadata: &ModuleLlvm
metadata: &mut ModuleLlvm
) -> EncodedMetadata {
base::write_metadata(tcx, metadata)
}
fn codegen_allocator(&self, tcx: TyCtxt, mods: &ModuleLlvm, kind: AllocatorKind) {
fn codegen_allocator(&self, tcx: TyCtxt, mods: &mut ModuleLlvm, kind: AllocatorKind) {
unsafe { allocator::codegen(tcx, mods, kind) }
}
fn compile_codegen_unit<'a, 'tcx: 'a>(

View file

@ -551,9 +551,9 @@ pub fn codegen_crate<B: ExtraBackendMethods>(
&["crate"],
Some("metadata")).as_str()
.to_string();
let metadata_llvm_module = backend.new_metadata(tcx, &metadata_cgu_name);
let mut metadata_llvm_module = backend.new_metadata(tcx, &metadata_cgu_name);
let metadata = time(tcx.sess, "write metadata", || {
backend.write_metadata(tcx, &metadata_llvm_module)
backend.write_metadata(tcx, &mut metadata_llvm_module)
});
tcx.sess.profiler(|p| p.end_activity(ProfileCategory::Codegen));
@ -636,9 +636,9 @@ pub fn codegen_crate<B: ExtraBackendMethods>(
&["crate"],
Some("allocator")).as_str()
.to_string();
let modules = backend.new_metadata(tcx, &llmod_id);
let mut modules = backend.new_metadata(tcx, &llmod_id);
time(tcx.sess, "write allocator module", || {
backend.codegen_allocator(tcx, &modules, kind)
backend.codegen_allocator(tcx, &mut modules, kind)
});
Some(ModuleCodegen {

View file

@ -36,9 +36,9 @@ pub trait ExtraBackendMethods: CodegenBackend + WriteBackendMethods + Sized + Se
fn write_metadata<'b, 'gcx>(
&self,
tcx: TyCtxt<'b, 'gcx, 'gcx>,
metadata: &Self::Module,
metadata: &mut Self::Module,
) -> EncodedMetadata;
fn codegen_allocator(&self, tcx: TyCtxt, mods: &Self::Module, kind: AllocatorKind);
fn codegen_allocator(&self, tcx: TyCtxt, mods: &mut Self::Module, kind: AllocatorKind);
fn compile_codegen_unit<'a, 'tcx: 'a>(
&self,
tcx: TyCtxt<'a, 'tcx, 'tcx>,

View file

@ -130,6 +130,11 @@ impl<'cx, 'gcx, 'tcx> MirBorrowckCtxt<'cx, 'gcx, 'tcx> {
);
let mut is_loop_move = false;
let is_partial_move = move_site_vec.iter().any(|move_site| {
let move_out = self.move_data.moves[(*move_site).moi];
let moved_place = &self.move_data.move_paths[move_out.path].place;
used_place != moved_place && used_place.is_prefix_of(moved_place)
});
for move_site in &move_site_vec {
let move_out = self.move_data.moves[(*move_site).moi];
let moved_place = &self.move_data.move_paths[move_out.path].place;
@ -175,8 +180,9 @@ impl<'cx, 'gcx, 'tcx> MirBorrowckCtxt<'cx, 'gcx, 'tcx> {
err.span_label(
span,
format!(
"value {} here after move",
desired_action.as_verb_in_past_tense()
"value {} here {}",
desired_action.as_verb_in_past_tense(),
if is_partial_move { "after partial move" } else { "after move" },
),
);
}
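The `is_partial_move` check above distinguishes moving a whole value from moving only a piece of it; a hypothetical snippet of user code that now gets the "after partial move" wording:
```rust
struct Pair {
    left: String,
    right: String,
}

fn main() {
    let p = Pair { left: String::from("a"), right: String::from("b") };
    let _left = p.left; // moves only the `left` field out of `p`

    // Using `p` as a whole afterwards is now reported with the
    // "here after partial move" label instead of the generic "after move".
    // drop(p); // uncommenting this line triggers the new diagnostic
}
```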

View file

@ -7,6 +7,7 @@ use std::hash::Hash;
use rustc::hir;
use rustc::mir;
use rustc::mir::interpret::truncate;
use rustc::ty::{self, Ty};
use rustc::ty::layout::{self, Size, Align, LayoutOf, TyLayout, HasDataLayout, VariantIdx};
use rustc::ty::TypeFoldable;
@ -965,8 +966,7 @@ where
// their computation, but the in-memory tag is the smallest possible
// representation
let size = tag.value.size(self);
let shift = 128 - size.bits();
let discr_val = (discr_val << shift) >> shift;
let discr_val = truncate(discr_val, size);
let discr_dest = self.place_field(dest, 0)?;
self.write_scalar(Scalar::from_uint(discr_val, size), discr_dest)?;

View file

@ -357,8 +357,10 @@ impl<'rt, 'a, 'mir, 'tcx, M: Machine<'a, 'mir, 'tcx>>
match err.kind {
EvalErrorKind::InvalidNullPointerUsage =>
return validation_failure!("NULL reference", self.path),
EvalErrorKind::AlignmentCheckFailed { .. } =>
return validation_failure!("unaligned reference", self.path),
EvalErrorKind::AlignmentCheckFailed { required, has } =>
return validation_failure!(format!("unaligned reference \
(required {} byte alignment but found {})",
required.bytes(), has.bytes()), self.path),
_ =>
return validation_failure!(
"dangling (out-of-bounds) reference (might be NULL at \

View file

@ -1025,7 +1025,7 @@ impl<'a, 'b> Visitor<'a> for BuildReducedGraphVisitor<'a, 'b> {
fn visit_token(&mut self, t: Token) {
if let Token::Interpolated(nt) = t {
if let token::NtExpr(ref expr) = nt.0 {
if let token::NtExpr(ref expr) = *nt {
if let ast::ExprKind::Mac(..) = expr.node {
self.visit_invoc(expr.id);
}

View file

@ -2234,7 +2234,7 @@ impl<'a, 'gcx, 'tcx> FnCtxt<'a, 'gcx, 'tcx> {
hir_id, def_id, substs, user_self_ty, self.tag(),
);
if !substs.is_noop() {
if Self::can_contain_user_lifetime_bounds((substs, user_self_ty)) {
let canonicalized = self.infcx.canonicalize_user_type_annotation(
&UserType::TypeOf(def_id, UserSubsts {
substs,
@ -2429,15 +2429,7 @@ impl<'a, 'gcx, 'tcx> FnCtxt<'a, 'gcx, 'tcx> {
let ty = self.to_ty(ast_ty);
debug!("to_ty_saving_user_provided_ty: ty={:?}", ty);
// If the type given by the user has free regions, save it for
// later, since NLL would like to enforce those. Also pass in
// types that involve projections, since those can resolve to
// `'static` bounds (modulo #54940, which hopefully will be
// fixed by the time you see this comment, dear reader,
// although I have my doubts). Also pass in types with inference
// types, because they may be repeated. Other sorts of things
// are already sufficiently enforced with erased regions. =)
if ty.has_free_regions() || ty.has_projections() || ty.has_infer_types() {
if Self::can_contain_user_lifetime_bounds(ty) {
let c_ty = self.infcx.canonicalize_response(&UserType::Ty(ty));
debug!("to_ty_saving_user_provided_ty: c_ty={:?}", c_ty);
self.tables.borrow_mut().user_provided_types_mut().insert(ast_ty.hir_id, c_ty);
@ -2446,6 +2438,20 @@ impl<'a, 'gcx, 'tcx> FnCtxt<'a, 'gcx, 'tcx> {
ty
}
// If the type given by the user has free regions, save it for later, since
// NLL would like to enforce those. Also pass in types that involve
// projections, since those can resolve to `'static` bounds (modulo #54940,
// which hopefully will be fixed by the time you see this comment, dear
// reader, although I have my doubts). Also pass in types with inference
// types, because they may be repeated. Other sorts of things are already
// sufficiently enforced with erased regions. =)
fn can_contain_user_lifetime_bounds<T>(t: T) -> bool
where
T: TypeFoldable<'tcx>
{
t.has_free_regions() || t.has_projections() || t.has_infer_types()
}
pub fn node_ty(&self, id: hir::HirId) -> Ty<'tcx> {
match self.tables.borrow().node_types().get(id) {
Some(&t) => t,

View file

@ -2,6 +2,7 @@
authors = ["The Rust Project Developers"]
name = "rustdoc"
version = "0.0.0"
edition = "2018"
[lib]
name = "rustdoc"

View file

@ -220,7 +220,10 @@ impl<'a, 'tcx, 'rcx> AutoTraitFinder<'a, 'tcx, 'rcx> {
}
}
fn get_lifetime(&self, region: Region, names_map: &FxHashMap<String, Lifetime>) -> Lifetime {
fn get_lifetime(
&self, region: Region<'_>,
names_map: &FxHashMap<String, Lifetime>
) -> Lifetime {
self.region_name(region)
.map(|name| {
names_map.get(&name).unwrap_or_else(|| {
@ -231,7 +234,7 @@ impl<'a, 'tcx, 'rcx> AutoTraitFinder<'a, 'tcx, 'rcx> {
.clone()
}
fn region_name(&self, region: Region) -> Option<String> {
fn region_name(&self, region: Region<'_>) -> Option<String> {
match region {
&ty::ReEarlyBound(r) => Some(r.name.to_string()),
_ => None,
@ -259,7 +262,7 @@ impl<'a, 'tcx, 'rcx> AutoTraitFinder<'a, 'tcx, 'rcx> {
// we need to create the Generics.
let mut finished: FxHashMap<_, Vec<_>> = Default::default();
let mut vid_map: FxHashMap<RegionTarget, RegionDeps> = Default::default();
let mut vid_map: FxHashMap<RegionTarget<'_>, RegionDeps<'_>> = Default::default();
// Flattening is done in two parts. First, we insert all of the constraints
// into a map. Each RegionTarget (either a RegionVid or a Region) maps
@ -842,7 +845,7 @@ impl<'a, 'tcx, 'rcx> AutoTraitFinder<'a, 'tcx, 'rcx> {
vec.sort_by_cached_key(|x| format!("{:?}", x))
}
fn is_fn_ty(&self, tcx: &TyCtxt, ty: &Type) -> bool {
fn is_fn_ty(&self, tcx: &TyCtxt<'_, '_, '_>, ty: &Type) -> bool {
match &ty {
&&Type::ResolvedPath { ref did, .. } => {
*did == tcx.require_lang_item(lang_items::FnTraitLangItem)

View file

@ -5,7 +5,7 @@ use rustc::ty::subst::Subst;
use rustc::infer::InferOk;
use syntax_pos::DUMMY_SP;
use core::DocAccessLevels;
use crate::core::DocAccessLevels;
use super::*;

View file

@ -14,7 +14,7 @@ use syntax::feature_gate::Features;
use syntax_pos::Span;
use html::escape::Escape;
use crate::html::escape::Escape;
#[derive(Clone, RustcEncodable, RustcDecodable, Debug, PartialEq, Eq, Hash)]
pub enum Cfg {
@ -261,7 +261,7 @@ impl ops::BitOr for Cfg {
struct Html<'a>(&'a Cfg, bool);
fn write_with_opt_paren<T: fmt::Display>(
fmt: &mut fmt::Formatter,
fmt: &mut fmt::Formatter<'_>,
has_paren: bool,
obj: T,
) -> fmt::Result {
@ -277,7 +277,7 @@ fn write_with_opt_paren<T: fmt::Display>(
impl<'a> fmt::Display for Html<'a> {
fn fmt(&self, fmt: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, fmt: &mut fmt::Formatter<'_>) -> fmt::Result {
match *self.0 {
Cfg::Not(ref child) => match **child {
Cfg::Any(ref sub_cfgs) => {

View file

@ -1,8 +1,8 @@
use core::DocContext;
use crate::core::DocContext;
use super::*;
pub fn get_def_from_def_id<F>(cx: &DocContext,
pub fn get_def_from_def_id<F>(cx: &DocContext<'_, '_, '_>,
def_id: DefId,
callback: &F,
) -> Vec<Item>
@ -38,7 +38,7 @@ where F: Fn(& dyn Fn(DefId) -> Def) -> Vec<Item> {
}
}
pub fn get_def_from_node_id<F>(cx: &DocContext,
pub fn get_def_from_node_id<F>(cx: &DocContext<'_, '_, '_>,
id: ast::NodeId,
name: String,
callback: &F,

View file

@ -13,9 +13,9 @@ use rustc_metadata::cstore::LoadedMacro;
use rustc::ty;
use rustc::util::nodemap::FxHashSet;
use core::{DocContext, DocAccessLevels};
use doctree;
use clean::{
use crate::core::{DocContext, DocAccessLevels};
use crate::doctree;
use crate::clean::{
self,
GetDefId,
ToSource,
@ -35,7 +35,12 @@ use super::Clean;
///
/// The returned value is `None` if the definition could not be inlined,
/// and `Some` of a vector of items if it was successfully expanded.
pub fn try_inline(cx: &DocContext, def: Def, name: ast::Name, visited: &mut FxHashSet<DefId>)
pub fn try_inline(
cx: &DocContext<'_, '_, '_>,
def: Def,
name: ast::Name,
visited: &mut FxHashSet<DefId>
)
-> Option<Vec<clean::Item>> {
let did = if let Some(did) = def.opt_def_id() {
did
@ -124,7 +129,7 @@ pub fn try_inline(cx: &DocContext, def: Def, name: ast::Name, visited: &mut FxHa
Some(ret)
}
pub fn try_inline_glob(cx: &DocContext, def: Def, visited: &mut FxHashSet<DefId>)
pub fn try_inline_glob(cx: &DocContext<'_, '_, '_>, def: Def, visited: &mut FxHashSet<DefId>)
-> Option<Vec<clean::Item>>
{
if def == Def::Err { return None }
@ -141,7 +146,7 @@ pub fn try_inline_glob(cx: &DocContext, def: Def, visited: &mut FxHashSet<DefId>
}
}
pub fn load_attrs(cx: &DocContext, did: DefId) -> clean::Attributes {
pub fn load_attrs(cx: &DocContext<'_, '_, '_>, did: DefId) -> clean::Attributes {
cx.tcx.get_attrs(did).clean(cx)
}
@ -149,7 +154,7 @@ pub fn load_attrs(cx: &DocContext, did: DefId) -> clean::Attributes {
///
/// These names are used later on by HTML rendering to generate things like
/// source links back to the original item.
pub fn record_extern_fqn(cx: &DocContext, did: DefId, kind: clean::TypeKind) {
pub fn record_extern_fqn(cx: &DocContext<'_, '_, '_>, did: DefId, kind: clean::TypeKind) {
let mut crate_name = cx.tcx.crate_name(did.krate).to_string();
if did.is_local() {
crate_name = cx.crate_name.clone().unwrap_or(crate_name);
@ -177,7 +182,7 @@ pub fn record_extern_fqn(cx: &DocContext, did: DefId, kind: clean::TypeKind) {
}
}
pub fn build_external_trait(cx: &DocContext, did: DefId) -> clean::Trait {
pub fn build_external_trait(cx: &DocContext<'_, '_, '_>, did: DefId) -> clean::Trait {
let auto_trait = cx.tcx.trait_def(did).has_auto_impl;
let trait_items = cx.tcx.associated_items(did).map(|item| item.clean(cx)).collect();
let predicates = cx.tcx.predicates_of(did);
@ -197,7 +202,7 @@ pub fn build_external_trait(cx: &DocContext, did: DefId) -> clean::Trait {
}
}
fn build_external_function(cx: &DocContext, did: DefId) -> clean::Function {
fn build_external_function(cx: &DocContext<'_, '_, '_>, did: DefId) -> clean::Function {
let sig = cx.tcx.fn_sig(did);
let constness = if cx.tcx.is_min_const_fn(did) {
@ -219,7 +224,7 @@ fn build_external_function(cx: &DocContext, did: DefId) -> clean::Function {
}
}
fn build_enum(cx: &DocContext, did: DefId) -> clean::Enum {
fn build_enum(cx: &DocContext<'_, '_, '_>, did: DefId) -> clean::Enum {
let predicates = cx.tcx.predicates_of(did);
clean::Enum {
@ -229,7 +234,7 @@ fn build_enum(cx: &DocContext, did: DefId) -> clean::Enum {
}
}
fn build_struct(cx: &DocContext, did: DefId) -> clean::Struct {
fn build_struct(cx: &DocContext<'_, '_, '_>, did: DefId) -> clean::Struct {
let predicates = cx.tcx.predicates_of(did);
let variant = cx.tcx.adt_def(did).non_enum_variant();
@ -245,7 +250,7 @@ fn build_struct(cx: &DocContext, did: DefId) -> clean::Struct {
}
}
fn build_union(cx: &DocContext, did: DefId) -> clean::Union {
fn build_union(cx: &DocContext<'_, '_, '_>, did: DefId) -> clean::Union {
let predicates = cx.tcx.predicates_of(did);
let variant = cx.tcx.adt_def(did).non_enum_variant();
@ -257,7 +262,7 @@ fn build_union(cx: &DocContext, did: DefId) -> clean::Union {
}
}
fn build_type_alias(cx: &DocContext, did: DefId) -> clean::Typedef {
fn build_type_alias(cx: &DocContext<'_, '_, '_>, did: DefId) -> clean::Typedef {
let predicates = cx.tcx.predicates_of(did);
clean::Typedef {
@ -266,7 +271,7 @@ fn build_type_alias(cx: &DocContext, did: DefId) -> clean::Typedef {
}
}
pub fn build_impls(cx: &DocContext, did: DefId) -> Vec<clean::Item> {
pub fn build_impls(cx: &DocContext<'_, '_, '_>, did: DefId) -> Vec<clean::Item> {
let tcx = cx.tcx;
let mut impls = Vec::new();
@ -277,7 +282,7 @@ pub fn build_impls(cx: &DocContext, did: DefId) -> Vec<clean::Item> {
impls
}
pub fn build_impl(cx: &DocContext, did: DefId, ret: &mut Vec<clean::Item>) {
pub fn build_impl(cx: &DocContext<'_, '_, '_>, did: DefId, ret: &mut Vec<clean::Item>) {
if !cx.renderinfo.borrow_mut().inlined.insert(did) {
return
}
@ -387,7 +392,11 @@ pub fn build_impl(cx: &DocContext, did: DefId, ret: &mut Vec<clean::Item>) {
});
}
fn build_module(cx: &DocContext, did: DefId, visited: &mut FxHashSet<DefId>) -> clean::Module {
fn build_module(
cx: &DocContext<'_, '_, '_>,
did: DefId,
visited: &mut FxHashSet<DefId>
) -> clean::Module {
let mut items = Vec::new();
fill_in(cx, did, &mut items, visited);
return clean::Module {
@ -395,7 +404,7 @@ fn build_module(cx: &DocContext, did: DefId, visited: &mut FxHashSet<DefId>) ->
is_crate: false,
};
fn fill_in(cx: &DocContext, did: DefId, items: &mut Vec<clean::Item>,
fn fill_in(cx: &DocContext<'_, '_, '_>, did: DefId, items: &mut Vec<clean::Item>,
visited: &mut FxHashSet<DefId>) {
// If we're re-exporting a re-export it may actually re-export something in
// two namespaces, so the target may be listed twice. Make sure we only
@ -412,7 +421,7 @@ fn build_module(cx: &DocContext, did: DefId, visited: &mut FxHashSet<DefId>) ->
}
}
pub fn print_inlined_const(cx: &DocContext, did: DefId) -> String {
pub fn print_inlined_const(cx: &DocContext<'_, '_, '_>, did: DefId) -> String {
if let Some(node_id) = cx.tcx.hir().as_local_node_id(did) {
cx.tcx.hir().node_to_pretty_string(node_id)
} else {
@ -420,14 +429,14 @@ pub fn print_inlined_const(cx: &DocContext, did: DefId) -> String {
}
}
fn build_const(cx: &DocContext, did: DefId) -> clean::Constant {
fn build_const(cx: &DocContext<'_, '_, '_>, did: DefId) -> clean::Constant {
clean::Constant {
type_: cx.tcx.type_of(did).clean(cx),
expr: print_inlined_const(cx, did)
}
}
fn build_static(cx: &DocContext, did: DefId, mutable: bool) -> clean::Static {
fn build_static(cx: &DocContext<'_, '_, '_>, did: DefId, mutable: bool) -> clean::Static {
clean::Static {
type_: cx.tcx.type_of(did).clean(cx),
mutability: if mutable {clean::Mutable} else {clean::Immutable},
@ -435,7 +444,7 @@ fn build_static(cx: &DocContext, did: DefId, mutable: bool) -> clean::Static {
}
}
fn build_macro(cx: &DocContext, did: DefId, name: ast::Name) -> clean::ItemEnum {
fn build_macro(cx: &DocContext<'_, '_, '_>, did: DefId, name: ast::Name) -> clean::ItemEnum {
let imported_from = cx.tcx.original_crate_name(did.krate);
match cx.cstore.load_macro_untracked(did, cx.sess()) {
LoadedMacro::MacroDef(def) => {
@ -537,7 +546,7 @@ fn separate_supertrait_bounds(mut g: clean::Generics)
(g, ty_bounds)
}
pub fn record_extern_trait(cx: &DocContext, did: DefId) {
pub fn record_extern_trait(cx: &DocContext<'_, '_, '_>, did: DefId) {
if did.is_local() {
return;
}

View file

@ -48,11 +48,12 @@ use std::u32;
use parking_lot::ReentrantMutex;
use core::{self, DocContext};
use doctree;
use visit_ast;
use html::render::{cache, ExternalLocation};
use html::item_type::ItemType;
use crate::core::{self, DocContext};
use crate::doctree;
use crate::visit_ast;
use crate::html::render::{cache, ExternalLocation};
use crate::html::item_type::ItemType;
use self::cfg::Cfg;
use self::auto_trait::AutoTraitFinder;
@ -70,56 +71,56 @@ thread_local!(pub static MAX_DEF_ID: RefCell<FxHashMap<CrateNum, DefId>> = Defau
const FN_OUTPUT_NAME: &'static str = "Output";
// extract the stability index for a node from tcx, if possible
fn get_stability(cx: &DocContext, def_id: DefId) -> Option<Stability> {
fn get_stability(cx: &DocContext<'_, '_, '_>, def_id: DefId) -> Option<Stability> {
cx.tcx.lookup_stability(def_id).clean(cx)
}
fn get_deprecation(cx: &DocContext, def_id: DefId) -> Option<Deprecation> {
fn get_deprecation(cx: &DocContext<'_, '_, '_>, def_id: DefId) -> Option<Deprecation> {
cx.tcx.lookup_deprecation(def_id).clean(cx)
}
pub trait Clean<T> {
fn clean(&self, cx: &DocContext) -> T;
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> T;
}
impl<T: Clean<U>, U> Clean<Vec<U>> for [T] {
fn clean(&self, cx: &DocContext) -> Vec<U> {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Vec<U> {
self.iter().map(|x| x.clean(cx)).collect()
}
}
impl<T: Clean<U>, U, V: Idx> Clean<IndexVec<V, U>> for IndexVec<V, T> {
fn clean(&self, cx: &DocContext) -> IndexVec<V, U> {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> IndexVec<V, U> {
self.iter().map(|x| x.clean(cx)).collect()
}
}
impl<T: Clean<U>, U> Clean<U> for P<T> {
fn clean(&self, cx: &DocContext) -> U {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> U {
(**self).clean(cx)
}
}
impl<T: Clean<U>, U> Clean<U> for Rc<T> {
fn clean(&self, cx: &DocContext) -> U {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> U {
(**self).clean(cx)
}
}
impl<T: Clean<U>, U> Clean<Option<U>> for Option<T> {
fn clean(&self, cx: &DocContext) -> Option<U> {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Option<U> {
self.as_ref().map(|v| v.clean(cx))
}
}
impl<T, U> Clean<U> for ty::Binder<T> where T: Clean<U> {
fn clean(&self, cx: &DocContext) -> U {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> U {
self.skip_binder().clean(cx)
}
}
impl<T: Clean<U>, U> Clean<Vec<U>> for P<[T]> {
fn clean(&self, cx: &DocContext) -> Vec<U> {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Vec<U> {
self.iter().map(|x| x.clean(cx)).collect()
}
}
@ -139,8 +140,8 @@ pub struct Crate {
}
impl<'a, 'tcx, 'rcx> Clean<Crate> for visit_ast::RustdocVisitor<'a, 'tcx, 'rcx> {
fn clean(&self, cx: &DocContext) -> Crate {
use ::visit_lib::LibEmbargoVisitor;
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Crate {
use crate::visit_lib::LibEmbargoVisitor;
{
let mut r = cx.renderinfo.borrow_mut();
@ -233,7 +234,7 @@ pub struct ExternalCrate {
}
impl Clean<ExternalCrate> for CrateNum {
fn clean(&self, cx: &DocContext) -> ExternalCrate {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> ExternalCrate {
let root = DefId { krate: *self, index: CRATE_DEF_INDEX };
let krate_span = cx.tcx.def_span(root);
let krate_src = cx.sess().source_map().span_to_filename(krate_span);
@ -365,7 +366,7 @@ pub struct Item {
}
impl fmt::Debug for Item {
fn fmt(&self, fmt: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, fmt: &mut fmt::Formatter<'_>) -> fmt::Result {
let fake = MAX_DEF_ID.with(|m| m.borrow().get(&self.def_id.krate)
.map(|id| self.def_id >= *id).unwrap_or(false));
@ -581,7 +582,7 @@ pub struct Module {
}
impl Clean<Item> for doctree::Module {
fn clean(&self, cx: &DocContext) -> Item {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Item {
let name = if self.name.is_some() {
self.name.expect("No name provided").clean(cx)
} else {
@ -949,7 +950,8 @@ impl Attributes {
///
/// Cache must be populated before call
pub fn links(&self, krate: &CrateNum) -> Vec<(String, String)> {
use html::format::href;
use crate::html::format::href;
self.links.iter().filter_map(|&(ref s, did, ref fragment)| {
match did {
Some(did) => {
@ -1019,7 +1021,7 @@ impl AttributesExt for Attributes {
}
impl Clean<Attributes> for [ast::Attribute] {
fn clean(&self, cx: &DocContext) -> Attributes {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Attributes {
Attributes::from_ast(cx.sess().diagnostic(), self)
}
}
@ -1031,7 +1033,7 @@ pub enum GenericBound {
}
impl GenericBound {
fn maybe_sized(cx: &DocContext) -> GenericBound {
fn maybe_sized(cx: &DocContext<'_, '_, '_>) -> GenericBound {
let did = cx.tcx.require_lang_item(lang_items::SizedTraitLangItem);
let empty = cx.tcx.intern_substs(&[]);
let path = external_path(cx, &cx.tcx.item_name(did).as_str(),
@ -1048,7 +1050,7 @@ impl GenericBound {
}, hir::TraitBoundModifier::Maybe)
}
fn is_sized_bound(&self, cx: &DocContext) -> bool {
fn is_sized_bound(&self, cx: &DocContext<'_, '_, '_>) -> bool {
use rustc::hir::TraitBoundModifier as TBM;
if let GenericBound::TraitBound(PolyTrait { ref trait_, .. }, TBM::None) = *self {
if trait_.def_id() == cx.tcx.lang_items().sized_trait() {
@ -1074,7 +1076,7 @@ impl GenericBound {
}
impl Clean<GenericBound> for hir::GenericBound {
fn clean(&self, cx: &DocContext) -> GenericBound {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> GenericBound {
match *self {
hir::GenericBound::Outlives(lt) => GenericBound::Outlives(lt.clean(cx)),
hir::GenericBound::Trait(ref t, modifier) => {
@ -1084,8 +1086,8 @@ impl Clean<GenericBound> for hir::GenericBound {
}
}
fn external_generic_args(cx: &DocContext, trait_did: Option<DefId>, has_self: bool,
bindings: Vec<TypeBinding>, substs: &Substs) -> GenericArgs {
fn external_generic_args(cx: &DocContext<'_, '_, '_>, trait_did: Option<DefId>, has_self: bool,
bindings: Vec<TypeBinding>, substs: &Substs<'_>) -> GenericArgs {
let lifetimes = substs.regions().filter_map(|v| v.clean(cx)).collect();
let types = substs.types().skip(has_self as usize).collect::<Vec<_>>();
@ -1126,8 +1128,8 @@ fn external_generic_args(cx: &DocContext, trait_did: Option<DefId>, has_self: bo
// trait_did should be set to a trait's DefId if called on a TraitRef, in order to sugar
// from Fn<(A, B,), C> to Fn(A, B) -> C
fn external_path(cx: &DocContext, name: &str, trait_did: Option<DefId>, has_self: bool,
bindings: Vec<TypeBinding>, substs: &Substs) -> Path {
fn external_path(cx: &DocContext<'_, '_, '_>, name: &str, trait_did: Option<DefId>, has_self: bool,
bindings: Vec<TypeBinding>, substs: &Substs<'_>) -> Path {
Path {
global: false,
def: Def::Err,
@ -1139,7 +1141,7 @@ fn external_path(cx: &DocContext, name: &str, trait_did: Option<DefId>, has_self
}
impl<'a, 'tcx> Clean<GenericBound> for (&'a ty::TraitRef<'tcx>, Vec<TypeBinding>) {
fn clean(&self, cx: &DocContext) -> GenericBound {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> GenericBound {
let (trait_ref, ref bounds) = *self;
inline::record_extern_fqn(cx, trait_ref.def_id, TypeKind::Trait);
let path = external_path(cx, &cx.tcx.item_name(trait_ref.def_id).as_str(),
@ -1183,13 +1185,13 @@ impl<'a, 'tcx> Clean<GenericBound> for (&'a ty::TraitRef<'tcx>, Vec<TypeBinding>
}
impl<'tcx> Clean<GenericBound> for ty::TraitRef<'tcx> {
fn clean(&self, cx: &DocContext) -> GenericBound {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> GenericBound {
(self, vec![]).clean(cx)
}
}
impl<'tcx> Clean<Option<Vec<GenericBound>>> for Substs<'tcx> {
fn clean(&self, cx: &DocContext) -> Option<Vec<GenericBound>> {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Option<Vec<GenericBound>> {
let mut v = Vec::new();
v.extend(self.regions().filter_map(|r| r.clean(cx)).map(GenericBound::Outlives));
v.extend(self.types().map(|t| GenericBound::TraitBound(PolyTrait {
@ -1216,7 +1218,7 @@ impl Lifetime {
}
impl Clean<Lifetime> for hir::Lifetime {
fn clean(&self, cx: &DocContext) -> Lifetime {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Lifetime {
if self.id != ast::DUMMY_NODE_ID {
let def = cx.tcx.named_region(self.hir_id);
match def {
@ -1235,7 +1237,7 @@ impl Clean<Lifetime> for hir::Lifetime {
}
impl Clean<Lifetime> for hir::GenericParam {
fn clean(&self, _: &DocContext) -> Lifetime {
fn clean(&self, _: &DocContext<'_, '_, '_>) -> Lifetime {
match self.kind {
hir::GenericParamKind::Lifetime { .. } => {
if self.bounds.len() > 0 {
@ -1259,7 +1261,7 @@ impl Clean<Lifetime> for hir::GenericParam {
}
impl Clean<Constant> for hir::ConstArg {
fn clean(&self, cx: &DocContext) -> Constant {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Constant {
Constant {
type_: cx.tcx.type_of(cx.tcx.hir().body_owner_def_id(self.value.body)).clean(cx),
expr: print_const_expr(cx, self.value.body),
@ -1268,13 +1270,13 @@ impl Clean<Constant> for hir::ConstArg {
}
impl<'tcx> Clean<Lifetime> for ty::GenericParamDef {
fn clean(&self, _cx: &DocContext) -> Lifetime {
fn clean(&self, _cx: &DocContext<'_, '_, '_>) -> Lifetime {
Lifetime(self.name.to_string())
}
}
impl Clean<Option<Lifetime>> for ty::RegionKind {
fn clean(&self, cx: &DocContext) -> Option<Lifetime> {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Option<Lifetime> {
match *self {
ty::ReStatic => Some(Lifetime::statik()),
ty::ReLateBound(_, ty::BrNamed(_, name)) => Some(Lifetime(name.to_string())),
@ -1303,7 +1305,7 @@ pub enum WherePredicate {
}
impl Clean<WherePredicate> for hir::WherePredicate {
fn clean(&self, cx: &DocContext) -> WherePredicate {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> WherePredicate {
match *self {
hir::WherePredicate::BoundPredicate(ref wbp) => {
WherePredicate::BoundPredicate {
@ -1330,7 +1332,7 @@ impl Clean<WherePredicate> for hir::WherePredicate {
}
impl<'a> Clean<Option<WherePredicate>> for ty::Predicate<'a> {
fn clean(&self, cx: &DocContext) -> Option<WherePredicate> {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Option<WherePredicate> {
use rustc::ty::Predicate;
match *self {
@ -1349,7 +1351,7 @@ impl<'a> Clean<Option<WherePredicate>> for ty::Predicate<'a> {
}
impl<'a> Clean<WherePredicate> for ty::TraitPredicate<'a> {
fn clean(&self, cx: &DocContext) -> WherePredicate {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> WherePredicate {
WherePredicate::BoundPredicate {
ty: self.trait_ref.self_ty().clean(cx),
bounds: vec![self.trait_ref.clean(cx)]
@ -1358,7 +1360,7 @@ impl<'a> Clean<WherePredicate> for ty::TraitPredicate<'a> {
}
impl<'tcx> Clean<WherePredicate> for ty::SubtypePredicate<'tcx> {
fn clean(&self, _cx: &DocContext) -> WherePredicate {
fn clean(&self, _cx: &DocContext<'_, '_, '_>) -> WherePredicate {
panic!("subtype predicates are an internal rustc artifact \
and should not be seen by rustdoc")
}
@ -1367,7 +1369,7 @@ impl<'tcx> Clean<WherePredicate> for ty::SubtypePredicate<'tcx> {
impl<'tcx> Clean<Option<WherePredicate>> for
ty::OutlivesPredicate<ty::Region<'tcx>,ty::Region<'tcx>> {
fn clean(&self, cx: &DocContext) -> Option<WherePredicate> {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Option<WherePredicate> {
let ty::OutlivesPredicate(ref a, ref b) = *self;
match (a, b) {
@ -1385,7 +1387,7 @@ impl<'tcx> Clean<Option<WherePredicate>> for
}
impl<'tcx> Clean<Option<WherePredicate>> for ty::OutlivesPredicate<Ty<'tcx>, ty::Region<'tcx>> {
fn clean(&self, cx: &DocContext) -> Option<WherePredicate> {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Option<WherePredicate> {
let ty::OutlivesPredicate(ref ty, ref lt) = *self;
match lt {
@ -1401,7 +1403,7 @@ impl<'tcx> Clean<Option<WherePredicate>> for ty::OutlivesPredicate<Ty<'tcx>, ty:
}
impl<'tcx> Clean<WherePredicate> for ty::ProjectionPredicate<'tcx> {
fn clean(&self, cx: &DocContext) -> WherePredicate {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> WherePredicate {
WherePredicate::EqPredicate {
lhs: self.projection_ty.clean(cx),
rhs: self.ty.clean(cx)
@ -1410,7 +1412,7 @@ impl<'tcx> Clean<WherePredicate> for ty::ProjectionPredicate<'tcx> {
}
impl<'tcx> Clean<Type> for ty::ProjectionTy<'tcx> {
fn clean(&self, cx: &DocContext) -> Type {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Type {
let trait_ = match self.trait_ref(cx.tcx).clean(cx) {
GenericBound::TraitBound(t, _) => t.trait_,
GenericBound::Outlives(_) => panic!("cleaning a trait got a lifetime"),
@ -1458,7 +1460,7 @@ impl GenericParamDef {
}
impl<'tcx> Clean<GenericParamDef> for ty::GenericParamDef {
fn clean(&self, cx: &DocContext) -> GenericParamDef {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> GenericParamDef {
let (name, kind) = match self.kind {
ty::GenericParamDefKind::Lifetime => {
(self.name.to_string(), GenericParamDefKind::Lifetime)
@ -1488,7 +1490,7 @@ impl<'tcx> Clean<GenericParamDef> for ty::GenericParamDef {
}
impl Clean<GenericParamDef> for hir::GenericParam {
fn clean(&self, cx: &DocContext) -> GenericParamDef {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> GenericParamDef {
let (name, kind) = match self.kind {
hir::GenericParamKind::Lifetime { .. } => {
let name = if self.bounds.len() > 0 {
@ -1538,7 +1540,7 @@ pub struct Generics {
}
impl Clean<Generics> for hir::Generics {
fn clean(&self, cx: &DocContext) -> Generics {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Generics {
// Synthetic type-parameters are inserted after normal ones.
// In order for normal parameters to be able to refer to synthetic ones,
// scans them first.
@ -1608,7 +1610,7 @@ impl Clean<Generics> for hir::Generics {
impl<'a, 'tcx> Clean<Generics> for (&'a ty::Generics,
&'a Lrc<ty::GenericPredicates<'tcx>>) {
fn clean(&self, cx: &DocContext) -> Generics {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Generics {
use self::WherePredicate as WP;
let (gens, preds) = *self;
@ -1689,7 +1691,7 @@ pub struct Method {
}
impl<'a> Clean<Method> for (&'a hir::MethodSig, &'a hir::Generics, hir::BodyId) {
fn clean(&self, cx: &DocContext) -> Method {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Method {
let (generics, decl) = enter_impl_trait(cx, || {
(self.1.clean(cx), (&*self.0.decl, self.2).clean(cx))
});
@ -1716,7 +1718,7 @@ pub struct Function {
}
impl Clean<Item> for doctree::Function {
fn clean(&self, cx: &DocContext) -> Item {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Item {
let (generics, decl) = enter_impl_trait(cx, || {
(self.generics.clean(cx), (&self.decl, self.body).clean(cx))
});
@ -1788,7 +1790,7 @@ pub struct Arguments {
}
impl<'a> Clean<Arguments> for (&'a [hir::Ty], &'a [ast::Ident]) {
fn clean(&self, cx: &DocContext) -> Arguments {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Arguments {
Arguments {
values: self.0.iter().enumerate().map(|(i, ty)| {
let mut name = self.1.get(i).map(|ident| ident.to_string())
@ -1806,7 +1808,7 @@ impl<'a> Clean<Arguments> for (&'a [hir::Ty], &'a [ast::Ident]) {
}
impl<'a> Clean<Arguments> for (&'a [hir::Ty], hir::BodyId) {
fn clean(&self, cx: &DocContext) -> Arguments {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Arguments {
let body = cx.tcx.hir().body(self.1);
Arguments {
@ -1823,7 +1825,7 @@ impl<'a> Clean<Arguments> for (&'a [hir::Ty], hir::BodyId) {
impl<'a, A: Copy> Clean<FnDecl> for (&'a hir::FnDecl, A)
where (&'a [hir::Ty], A): Clean<Arguments>
{
fn clean(&self, cx: &DocContext) -> FnDecl {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> FnDecl {
FnDecl {
inputs: (&self.0.inputs[..], self.1).clean(cx),
output: self.0.output.clean(cx),
@ -1834,7 +1836,7 @@ impl<'a, A: Copy> Clean<FnDecl> for (&'a hir::FnDecl, A)
}
impl<'a, 'tcx> Clean<FnDecl> for (DefId, ty::PolyFnSig<'tcx>) {
fn clean(&self, cx: &DocContext) -> FnDecl {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> FnDecl {
let (did, sig) = *self;
let mut names = if cx.tcx.hir().as_local_node_id(did).is_some() {
vec![].into_iter()
@ -1895,7 +1897,7 @@ pub enum FunctionRetTy {
}
impl Clean<FunctionRetTy> for hir::FunctionRetTy {
fn clean(&self, cx: &DocContext) -> FunctionRetTy {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> FunctionRetTy {
match *self {
hir::Return(ref typ) => Return(typ.clean(cx)),
hir::DefaultReturn(..) => DefaultReturn,
@ -1924,7 +1926,7 @@ pub struct Trait {
}
impl Clean<Item> for doctree::Trait {
fn clean(&self, cx: &DocContext) -> Item {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Item {
let attrs = self.attrs.clean(cx);
let is_spotlight = attrs.has_doc_flag("spotlight");
Item {
@ -1955,7 +1957,7 @@ pub struct TraitAlias {
}
impl Clean<Item> for doctree::TraitAlias {
fn clean(&self, cx: &DocContext) -> Item {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Item {
let attrs = self.attrs.clean(cx);
Item {
name: Some(self.name.clean(cx)),
@ -1974,7 +1976,7 @@ impl Clean<Item> for doctree::TraitAlias {
}
impl Clean<bool> for hir::IsAuto {
fn clean(&self, _: &DocContext) -> bool {
fn clean(&self, _: &DocContext<'_, '_, '_>) -> bool {
match *self {
hir::IsAuto::Yes => true,
hir::IsAuto::No => false,
@ -1983,13 +1985,13 @@ impl Clean<bool> for hir::IsAuto {
}
impl Clean<Type> for hir::TraitRef {
fn clean(&self, cx: &DocContext) -> Type {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Type {
resolve_type(cx, self.path.clean(cx), self.ref_id)
}
}
impl Clean<PolyTrait> for hir::PolyTraitRef {
fn clean(&self, cx: &DocContext) -> PolyTrait {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> PolyTrait {
PolyTrait {
trait_: self.trait_ref.clean(cx),
generic_params: self.bound_generic_params.clean(cx)
@ -1998,7 +2000,7 @@ impl Clean<PolyTrait> for hir::PolyTraitRef {
}
impl Clean<Item> for hir::TraitItem {
fn clean(&self, cx: &DocContext) -> Item {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Item {
let inner = match self.node {
hir::TraitItemKind::Const(ref ty, default) => {
AssociatedConstItem(ty.clean(cx),
@ -2035,7 +2037,7 @@ impl Clean<Item> for hir::TraitItem {
}
impl Clean<Item> for hir::ImplItem {
fn clean(&self, cx: &DocContext) -> Item {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Item {
let inner = match self.node {
hir::ImplItemKind::Const(ref ty, expr) => {
AssociatedConstItem(ty.clean(cx),
@ -2067,7 +2069,7 @@ impl Clean<Item> for hir::ImplItem {
}
impl<'tcx> Clean<Item> for ty::AssociatedItem {
fn clean(&self, cx: &DocContext) -> Item {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Item {
let inner = match self.kind {
ty::AssociatedKind::Const => {
let ty = cx.tcx.type_of(self.def_id);
@ -2385,7 +2387,7 @@ impl GetDefId for Type {
fn def_id(&self) -> Option<DefId> {
match *self {
ResolvedPath { did, .. } => Some(did),
Primitive(p) => ::html::render::cache().primitive_locations.get(&p).cloned(),
Primitive(p) => crate::html::render::cache().primitive_locations.get(&p).cloned(),
BorrowedRef { type_: box Generic(..), .. } =>
Primitive(PrimitiveType::Reference).def_id(),
BorrowedRef { ref type_, .. } => type_.def_id(),
@ -2509,7 +2511,7 @@ impl From<ast::FloatTy> for PrimitiveType {
}
impl Clean<Type> for hir::Ty {
fn clean(&self, cx: &DocContext) -> Type {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Type {
use rustc::hir::*;
match self.node {
@ -2708,7 +2710,7 @@ impl Clean<Type> for hir::Ty {
}
impl<'tcx> Clean<Type> for Ty<'tcx> {
fn clean(&self, cx: &DocContext) -> Type {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Type {
match self.sty {
ty::Never => Never,
ty::Bool => Primitive(PrimitiveType::Bool),
@ -2903,7 +2905,7 @@ impl<'tcx> Clean<Type> for Ty<'tcx> {
}
impl Clean<Item> for hir::StructField {
fn clean(&self, cx: &DocContext) -> Item {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Item {
Item {
name: Some(self.ident.name).clean(cx),
attrs: self.attrs.clean(cx),
@ -2918,7 +2920,7 @@ impl Clean<Item> for hir::StructField {
}
impl<'tcx> Clean<Item> for ty::FieldDef {
fn clean(&self, cx: &DocContext) -> Item {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Item {
Item {
name: Some(self.ident.name).clean(cx),
attrs: cx.tcx.get_attrs(self.did).clean(cx),
@ -2941,7 +2943,7 @@ pub enum Visibility {
}
impl Clean<Option<Visibility>> for hir::Visibility {
fn clean(&self, cx: &DocContext) -> Option<Visibility> {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Option<Visibility> {
Some(match self.node {
hir::VisibilityKind::Public => Visibility::Public,
hir::VisibilityKind::Inherited => Visibility::Inherited,
@ -2956,7 +2958,7 @@ impl Clean<Option<Visibility>> for hir::Visibility {
}
impl Clean<Option<Visibility>> for ty::Visibility {
fn clean(&self, _: &DocContext) -> Option<Visibility> {
fn clean(&self, _: &DocContext<'_, '_, '_>) -> Option<Visibility> {
Some(if *self == ty::Visibility::Public { Public } else { Inherited })
}
}
@ -2978,7 +2980,7 @@ pub struct Union {
}
impl Clean<Item> for doctree::Struct {
fn clean(&self, cx: &DocContext) -> Item {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Item {
Item {
name: Some(self.name.clean(cx)),
attrs: self.attrs.clean(cx),
@ -2998,7 +3000,7 @@ impl Clean<Item> for doctree::Struct {
}
impl Clean<Item> for doctree::Union {
fn clean(&self, cx: &DocContext) -> Item {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Item {
Item {
name: Some(self.name.clean(cx)),
attrs: self.attrs.clean(cx),
@ -3028,7 +3030,7 @@ pub struct VariantStruct {
}
impl Clean<VariantStruct> for ::rustc::hir::VariantData {
fn clean(&self, cx: &DocContext) -> VariantStruct {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> VariantStruct {
VariantStruct {
struct_type: doctree::struct_type_from_def(self),
fields: self.fields().iter().map(|x| x.clean(cx)).collect(),
@ -3045,7 +3047,7 @@ pub struct Enum {
}
impl Clean<Item> for doctree::Enum {
fn clean(&self, cx: &DocContext) -> Item {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Item {
Item {
name: Some(self.name.clean(cx)),
attrs: self.attrs.clean(cx),
@ -3069,7 +3071,7 @@ pub struct Variant {
}
impl Clean<Item> for doctree::Variant {
fn clean(&self, cx: &DocContext) -> Item {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Item {
Item {
name: Some(self.name.clean(cx)),
attrs: self.attrs.clean(cx),
@ -3086,7 +3088,7 @@ impl Clean<Item> for doctree::Variant {
}
impl<'tcx> Clean<Item> for ty::VariantDef {
fn clean(&self, cx: &DocContext) -> Item {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Item {
let kind = match self.ctor_kind {
CtorKind::Const => VariantKind::CLike,
CtorKind::Fn => {
@ -3134,7 +3136,7 @@ pub enum VariantKind {
}
impl Clean<VariantKind> for hir::VariantData {
fn clean(&self, cx: &DocContext) -> VariantKind {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> VariantKind {
if self.is_struct() {
VariantKind::Struct(self.clean(cx))
} else if self.is_unit() {
@ -3165,7 +3167,7 @@ impl Span {
}
impl Clean<Span> for syntax_pos::Span {
fn clean(&self, cx: &DocContext) -> Span {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Span {
if self.is_dummy() {
return Span::empty();
}
@ -3198,7 +3200,7 @@ impl Path {
}
impl Clean<Path> for hir::Path {
fn clean(&self, cx: &DocContext) -> Path {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Path {
Path {
global: self.is_global(),
def: self.def,
@ -3221,7 +3223,7 @@ pub enum GenericArgs {
}
impl Clean<GenericArgs> for hir::GenericArgs {
fn clean(&self, cx: &DocContext) -> GenericArgs {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> GenericArgs {
if self.parenthesized {
let output = self.bindings[0].ty.clean(cx);
GenericArgs::Parenthesized {
@ -3263,7 +3265,7 @@ pub struct PathSegment {
}
impl Clean<PathSegment> for hir::PathSegment {
fn clean(&self, cx: &DocContext) -> PathSegment {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> PathSegment {
PathSegment {
name: self.ident.name.clean(cx),
args: self.with_generic_args(|generic_args| generic_args.clean(cx))
@ -3335,21 +3337,21 @@ fn qpath_to_string(p: &hir::QPath) -> String {
impl Clean<String> for Ident {
#[inline]
fn clean(&self, cx: &DocContext) -> String {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> String {
self.name.clean(cx)
}
}
impl Clean<String> for ast::Name {
#[inline]
fn clean(&self, _: &DocContext) -> String {
fn clean(&self, _: &DocContext<'_, '_, '_>) -> String {
self.to_string()
}
}
impl Clean<String> for InternedString {
#[inline]
fn clean(&self, _: &DocContext) -> String {
fn clean(&self, _: &DocContext<'_, '_, '_>) -> String {
self.to_string()
}
}
@ -3361,7 +3363,7 @@ pub struct Typedef {
}
impl Clean<Item> for doctree::Typedef {
fn clean(&self, cx: &DocContext) -> Item {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Item {
Item {
name: Some(self.name.clean(cx)),
attrs: self.attrs.clean(cx),
@ -3385,7 +3387,7 @@ pub struct Existential {
}
impl Clean<Item> for doctree::Existential {
fn clean(&self, cx: &DocContext) -> Item {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Item {
Item {
name: Some(self.name.clean(cx)),
attrs: self.attrs.clean(cx),
@ -3411,7 +3413,7 @@ pub struct BareFunctionDecl {
}
impl Clean<BareFunctionDecl> for hir::BareFnTy {
fn clean(&self, cx: &DocContext) -> BareFunctionDecl {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> BareFunctionDecl {
let (generic_params, decl) = enter_impl_trait(cx, || {
(self.generic_params.clean(cx), (&*self.decl, &self.arg_names[..]).clean(cx))
});
@ -3435,7 +3437,7 @@ pub struct Static {
}
impl Clean<Item> for doctree::Static {
fn clean(&self, cx: &DocContext) -> Item {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Item {
debug!("cleaning static {}: {:?}", self.name.clean(cx), self);
Item {
name: Some(self.name.clean(cx)),
@ -3461,7 +3463,7 @@ pub struct Constant {
}
impl Clean<Item> for doctree::Constant {
fn clean(&self, cx: &DocContext) -> Item {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Item {
Item {
name: Some(self.name.clean(cx)),
attrs: self.attrs.clean(cx),
@ -3485,7 +3487,7 @@ pub enum Mutability {
}
impl Clean<Mutability> for hir::Mutability {
fn clean(&self, _: &DocContext) -> Mutability {
fn clean(&self, _: &DocContext<'_, '_, '_>) -> Mutability {
match self {
&hir::MutMutable => Mutable,
&hir::MutImmutable => Immutable,
@ -3500,7 +3502,7 @@ pub enum ImplPolarity {
}
impl Clean<ImplPolarity> for hir::ImplPolarity {
fn clean(&self, _: &DocContext) -> ImplPolarity {
fn clean(&self, _: &DocContext<'_, '_, '_>) -> ImplPolarity {
match self {
&hir::ImplPolarity::Positive => ImplPolarity::Positive,
&hir::ImplPolarity::Negative => ImplPolarity::Negative,
@ -3521,30 +3523,44 @@ pub struct Impl {
pub blanket_impl: Option<Type>,
}
pub fn get_auto_traits_with_node_id(cx: &DocContext, id: ast::NodeId, name: String) -> Vec<Item> {
pub fn get_auto_traits_with_node_id(
cx: &DocContext<'_, '_, '_>,
id: ast::NodeId,
name: String
) -> Vec<Item> {
let finder = AutoTraitFinder::new(cx);
finder.get_with_node_id(id, name)
}
pub fn get_auto_traits_with_def_id(cx: &DocContext, id: DefId) -> Vec<Item> {
pub fn get_auto_traits_with_def_id(
cx: &DocContext<'_, '_, '_>,
id: DefId
) -> Vec<Item> {
let finder = AutoTraitFinder::new(cx);
finder.get_with_def_id(id)
}
pub fn get_blanket_impls_with_node_id(cx: &DocContext, id: ast::NodeId, name: String) -> Vec<Item> {
pub fn get_blanket_impls_with_node_id(
cx: &DocContext<'_, '_, '_>,
id: ast::NodeId,
name: String
) -> Vec<Item> {
let finder = BlanketImplFinder::new(cx);
finder.get_with_node_id(id, name)
}
pub fn get_blanket_impls_with_def_id(cx: &DocContext, id: DefId) -> Vec<Item> {
pub fn get_blanket_impls_with_def_id(
cx: &DocContext<'_, '_, '_>,
id: DefId
) -> Vec<Item> {
let finder = BlanketImplFinder::new(cx);
finder.get_with_def_id(id)
}
impl Clean<Vec<Item>> for doctree::Impl {
fn clean(&self, cx: &DocContext) -> Vec<Item> {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Vec<Item> {
let mut ret = Vec::new();
let trait_ = self.trait_.clean(cx);
let items = self.items.clean(cx);
@ -3586,7 +3602,7 @@ impl Clean<Vec<Item>> for doctree::Impl {
}
}
fn build_deref_target_impls(cx: &DocContext,
fn build_deref_target_impls(cx: &DocContext<'_, '_, '_>,
items: &[Item],
ret: &mut Vec<Item>) {
use self::PrimitiveType::*;
@ -3644,7 +3660,7 @@ fn build_deref_target_impls(cx: &DocContext,
}
impl Clean<Vec<Item>> for doctree::ExternCrate {
fn clean(&self, cx: &DocContext) -> Vec<Item> {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Vec<Item> {
let please_inline = self.vis.node.is_pub() && self.attrs.iter().any(|a| {
a.name() == "doc" && match a.meta_item_list() {
@ -3680,7 +3696,7 @@ impl Clean<Vec<Item>> for doctree::ExternCrate {
}
impl Clean<Vec<Item>> for doctree::Import {
fn clean(&self, cx: &DocContext) -> Vec<Item> {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Vec<Item> {
// We consider inlining the documentation of `pub use` statements, but we
// forcefully don't inline if this is not public or if the
// #[doc(no_inline)] attribute is present.
@ -3754,7 +3770,7 @@ pub struct ImportSource {
}
impl Clean<Vec<Item>> for hir::ForeignMod {
fn clean(&self, cx: &DocContext) -> Vec<Item> {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Vec<Item> {
let mut items = self.items.clean(cx);
for item in &mut items {
if let ForeignFunctionItem(ref mut f) = item.inner {
@ -3766,7 +3782,7 @@ impl Clean<Vec<Item>> for hir::ForeignMod {
}
impl Clean<Item> for hir::ForeignItem {
fn clean(&self, cx: &DocContext) -> Item {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Item {
let inner = match self.node {
hir::ForeignItemKind::Fn(ref decl, ref names, ref generics) => {
let (generics, decl) = enter_impl_trait(cx, || {
@ -3811,11 +3827,11 @@ impl Clean<Item> for hir::ForeignItem {
// Utilities
pub trait ToSource {
fn to_src(&self, cx: &DocContext) -> String;
fn to_src(&self, cx: &DocContext<'_, '_, '_>) -> String;
}
impl ToSource for syntax_pos::Span {
fn to_src(&self, cx: &DocContext) -> String {
fn to_src(&self, cx: &DocContext<'_, '_, '_>) -> String {
debug!("converting span {:?} to snippet", self.clean(cx));
let sn = match cx.sess().source_map().span_to_snippet(*self) {
Ok(x) => x,
@ -3862,7 +3878,7 @@ fn name_from_pat(p: &hir::Pat) -> String {
}
}
fn print_const(cx: &DocContext, n: ty::LazyConst) -> String {
fn print_const(cx: &DocContext<'_, '_, '_>, n: ty::LazyConst<'_>) -> String {
match n {
ty::LazyConst::Unevaluated(def_id, _) => {
if let Some(node_id) = cx.tcx.hir().as_local_node_id(def_id) {
@ -3884,12 +3900,12 @@ fn print_const(cx: &DocContext, n: ty::LazyConst) -> String {
}
}
fn print_const_expr(cx: &DocContext, body: hir::BodyId) -> String {
fn print_const_expr(cx: &DocContext<'_, '_, '_>, body: hir::BodyId) -> String {
cx.tcx.hir().hir_to_pretty_string(body.hir_id)
}
/// Given a type Path, resolve it to a Type using the TyCtxt
fn resolve_type(cx: &DocContext,
fn resolve_type(cx: &DocContext<'_, '_, '_>,
path: Path,
id: ast::NodeId) -> Type {
if id == ast::DUMMY_NODE_ID {
@ -3920,7 +3936,7 @@ fn resolve_type(cx: &DocContext,
ResolvedPath { path: path, typarams: None, did: did, is_generic: is_generic }
}
pub fn register_def(cx: &DocContext, def: Def) -> DefId {
pub fn register_def(cx: &DocContext<'_, '_, '_>, def: Def) -> DefId {
debug!("register_def({:?})", def);
let (did, kind) = match def {
@ -3955,7 +3971,7 @@ pub fn register_def(cx: &DocContext, def: Def) -> DefId {
did
}
fn resolve_use_source(cx: &DocContext, path: Path) -> ImportSource {
fn resolve_use_source(cx: &DocContext<'_, '_, '_>, path: Path) -> ImportSource {
ImportSource {
did: if path.def.opt_def_id().is_none() {
None
@ -3973,7 +3989,7 @@ pub struct Macro {
}
impl Clean<Item> for doctree::Macro {
fn clean(&self, cx: &DocContext) -> Item {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Item {
let name = self.name.clean(cx);
Item {
name: Some(name.clone()),
@ -4002,7 +4018,7 @@ pub struct ProcMacro {
}
impl Clean<Item> for doctree::ProcMacro {
fn clean(&self, cx: &DocContext) -> Item {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> Item {
Item {
name: Some(self.name.clean(cx)),
attrs: self.attrs.clean(cx),
@ -4036,7 +4052,7 @@ pub struct Deprecation {
}
impl Clean<Stability> for attr::Stability {
fn clean(&self, _: &DocContext) -> Stability {
fn clean(&self, _: &DocContext<'_, '_, '_>) -> Stability {
Stability {
level: stability::StabilityLevel::from_attr_level(&self.level),
feature: Some(self.feature.to_string()).filter(|f| !f.is_empty()),
@ -4063,13 +4079,13 @@ impl Clean<Stability> for attr::Stability {
}
impl<'a> Clean<Stability> for &'a attr::Stability {
fn clean(&self, dc: &DocContext) -> Stability {
fn clean(&self, dc: &DocContext<'_, '_, '_>) -> Stability {
(**self).clean(dc)
}
}
impl Clean<Deprecation> for attr::Deprecation {
fn clean(&self, _: &DocContext) -> Deprecation {
fn clean(&self, _: &DocContext<'_, '_, '_>) -> Deprecation {
Deprecation {
since: self.since.map(|s| s.to_string()).filter(|s| !s.is_empty()),
note: self.note.map(|n| n.to_string()).filter(|n| !n.is_empty()),
@ -4085,7 +4101,7 @@ pub struct TypeBinding {
}
impl Clean<TypeBinding> for hir::TypeBinding {
fn clean(&self, cx: &DocContext) -> TypeBinding {
fn clean(&self, cx: &DocContext<'_, '_, '_>) -> TypeBinding {
TypeBinding {
name: self.ident.name.clean(cx),
ty: self.ty.clean(cx)
@ -4093,7 +4109,11 @@ impl Clean<TypeBinding> for hir::TypeBinding {
}
}
pub fn def_id_to_path(cx: &DocContext, did: DefId, name: Option<String>) -> Vec<String> {
pub fn def_id_to_path(
cx: &DocContext<'_, '_, '_>,
did: DefId,
name: Option<String>
) -> Vec<String> {
let crate_name = name.unwrap_or_else(|| cx.tcx.crate_name(did.krate).to_string());
let relative = cx.tcx.def_path(did).data.into_iter().filter_map(|elem| {
// extern blocks have an empty name
@ -4107,7 +4127,7 @@ pub fn def_id_to_path(cx: &DocContext, did: DefId, name: Option<String>) -> Vec<
once(crate_name).chain(relative).collect()
}
pub fn enter_impl_trait<F, R>(cx: &DocContext, f: F) -> R
pub fn enter_impl_trait<F, R>(cx: &DocContext<'_, '_, '_>, f: F) -> R
where
F: FnOnce() -> R,
{
@ -4120,7 +4140,7 @@ where
// Start of code copied from rust-clippy
pub fn path_to_def_local(tcx: &TyCtxt, path: &[&str]) -> Option<DefId> {
pub fn path_to_def_local(tcx: &TyCtxt<'_, '_, '_>, path: &[&str]) -> Option<DefId> {
let krate = tcx.hir().krate();
let mut items = krate.module.item_ids.clone();
let mut path_it = path.iter().peekable();
@ -4145,7 +4165,7 @@ pub fn path_to_def_local(tcx: &TyCtxt, path: &[&str]) -> Option<DefId> {
}
}
pub fn path_to_def(tcx: &TyCtxt, path: &[&str]) -> Option<DefId> {
pub fn path_to_def(tcx: &TyCtxt<'_, '_, '_>, path: &[&str]) -> Option<DefId> {
let crates = tcx.crates();
let krate = crates
@ -4182,7 +4202,7 @@ pub fn path_to_def(tcx: &TyCtxt, path: &[&str]) -> Option<DefId> {
}
}
pub fn get_path_for_type<F>(tcx: TyCtxt, def_id: DefId, def_ctor: F) -> hir::Path
pub fn get_path_for_type<F>(tcx: TyCtxt<'_, '_, '_>, def_id: DefId, def_ctor: F) -> hir::Path
where F: Fn(DefId) -> Def {
#[derive(Debug)]
struct AbsolutePathBuffer {

View file

@ -17,12 +17,12 @@ use std::collections::BTreeMap;
use rustc::hir::def_id::DefId;
use rustc::ty;
use clean::GenericArgs as PP;
use clean::WherePredicate as WP;
use clean;
use core::DocContext;
use crate::clean::GenericArgs as PP;
use crate::clean::WherePredicate as WP;
use crate::clean;
use crate::core::DocContext;
pub fn where_clauses(cx: &DocContext, clauses: Vec<WP>) -> Vec<WP> {
pub fn where_clauses(cx: &DocContext<'_, '_, '_>, clauses: Vec<WP>) -> Vec<WP> {
// First, partition the where clause into its separate components
let mut params: BTreeMap<_, Vec<_>> = BTreeMap::new();
let mut lifetimes = Vec::new();
@ -141,7 +141,7 @@ fn ty_bounds(bounds: Vec<clean::GenericBound>) -> Vec<clean::GenericBound> {
bounds
}
fn trait_is_same_or_supertrait(cx: &DocContext, child: DefId,
fn trait_is_same_or_supertrait(cx: &DocContext<'_, '_, '_>, child: DefId,
trait_: DefId) -> bool {
if child == trait_ {
return true

View file

@ -15,14 +15,14 @@ use rustc_driver;
use rustc_target::spec::TargetTriple;
use syntax::edition::Edition;
use core::new_handler;
use externalfiles::ExternalHtml;
use html;
use html::markdown::IdMap;
use html::static_files;
use opts;
use passes::{self, DefaultPassOption};
use theme;
use crate::core::new_handler;
use crate::externalfiles::ExternalHtml;
use crate::html;
use crate::html::{static_files};
use crate::html::markdown::{IdMap};
use crate::opts;
use crate::passes::{self, DefaultPassOption};
use crate::theme;
/// Configuration options for rustdoc.
#[derive(Clone)]
@ -95,11 +95,11 @@ pub struct Options {
}
impl fmt::Debug for Options {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
struct FmtExterns<'a>(&'a Externs);
impl<'a> fmt::Debug for FmtExterns<'a> {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
f.debug_map()
.entries(self.0.iter())
.finish()
@ -204,7 +204,7 @@ impl Options {
nightly_options::check_nightly_options(&matches, &opts());
if matches.opt_present("h") || matches.opt_present("help") {
::usage("rustdoc");
crate::usage("rustdoc");
return Err(0);
} else if matches.opt_present("version") {
rustc_driver::version("rustdoc", &matches);
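
The other change repeated across these files is the 2018-edition path syntax: crate-local imports gain a `crate::` prefix, and the old `::usage("rustdoc")` call becomes `crate::usage("rustdoc")`. A minimal sketch of the before/after, reusing the `externalfiles`/`config` module names from the patch but with a simplified `ExternalHtml`:

```rust
// Sketch only: the struct body is invented; the path rules are the point.
mod externalfiles {
    #[derive(Default, Debug)]
    pub struct ExternalHtml {
        pub in_header: String,
    }
}

mod config {
    // The 2015 edition accepted `use externalfiles::ExternalHtml;` from any
    // module; in 2018 a crate-local path in `use` must start with `crate::`
    // (or `self::` / `super::`).
    use crate::externalfiles::ExternalHtml;

    pub fn default_header() -> ExternalHtml {
        ExternalHtml::default()
    }
}

fn main() {
    println!("{:?}", config::default_header());
}
```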

View file

@ -33,12 +33,13 @@ use rustc_data_structures::sync::{self, Lrc};
use std::rc::Rc;
use std::sync::Arc;
use visit_ast::RustdocVisitor;
use config::{Options as RustdocOptions, RenderOptions};
use clean;
use clean::{get_path_for_type, Clean, MAX_DEF_ID, AttributesExt};
use html::render::RenderInfo;
use passes;
use crate::visit_ast::RustdocVisitor;
use crate::config::{Options as RustdocOptions, RenderOptions};
use crate::clean;
use crate::clean::{get_path_for_type, Clean, MAX_DEF_ID, AttributesExt};
use crate::html::render::RenderInfo;
use crate::passes;
pub use rustc::session::config::{Input, Options, CodegenOptions};
pub use rustc::session::search_paths::SearchPath;

View file

@ -2,8 +2,9 @@ use std::fs;
use std::path::Path;
use std::str;
use errors;
use syntax::feature_gate::UnstableFeatures;
use html::markdown::{IdMap, ErrorCodes, Markdown};
use crate::syntax::feature_gate::UnstableFeatures;
use crate::html::markdown::{IdMap, ErrorCodes, Markdown};
use std::cell::RefCell;
#[derive(Clone, Debug)]

View file

@ -1,4 +1,4 @@
use clean::*;
use crate::clean::*;
pub struct StripItem(pub Item);

View file

@ -10,7 +10,7 @@ use std::fmt;
pub struct Escape<'a>(pub &'a str);
impl<'a> fmt::Display for Escape<'a> {
fn fmt(&self, fmt: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, fmt: &mut fmt::Formatter<'_>) -> fmt::Result {
// Because the internet is always right, turns out there's not that many
// characters to escape: http://stackoverflow.com/questions/7381974
let Escape(s) = *self;

View file

@ -12,10 +12,11 @@ use rustc::hir::def_id::DefId;
use rustc_target::spec::abi::Abi;
use rustc::hir;
use clean::{self, PrimitiveType};
use core::DocAccessLevels;
use html::item_type::ItemType;
use html::render::{self, cache, CURRENT_LOCATION_KEY};
use crate::clean::{self, PrimitiveType};
use crate::core::DocAccessLevels;
use crate::html::item_type::ItemType;
use crate::html::render::{self, cache, CURRENT_LOCATION_KEY};
/// Helper to render an optional visibility with a space after it (if the
/// visibility is present)
@ -42,7 +43,7 @@ pub struct RawMutableSpace(pub clean::Mutability);
/// Wrapper struct for emitting type parameter bounds.
pub struct GenericBounds<'a>(pub &'a [clean::GenericBound]);
/// Wrapper struct for emitting a comma-separated list of items
pub struct CommaSep<'a, T: 'a>(pub &'a [T]);
pub struct CommaSep<'a, T>(pub &'a [T]);
pub struct AbiSpace(pub Abi);
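
`CommaSep` also drops its `T: 'a` bound: outlives requirements on struct parameters are inferred from the field types, so the explicit bound is redundant and the 2018-idioms cleanup removes it. A small sketch of the slimmer form:

```rust
// Sketch only: the `T: 'a` bound that used to accompany the field below is
// inferred automatically, so the struct can be declared without it.
pub struct CommaSep<'a, T>(pub &'a [T]);

fn main() {
    let xs: &[i32] = &[1, 2, 3];
    let wrapped = CommaSep(xs);
    println!("{} items", wrapped.0.len());
}
```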
/// Wrapper struct for properly emitting a function or method declaration.
@ -94,7 +95,7 @@ impl ConstnessSpace {
}
impl<'a, T: fmt::Display> fmt::Display for CommaSep<'a, T> {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
for (i, item) in self.0.iter().enumerate() {
if i != 0 { write!(f, ", ")?; }
fmt::Display::fmt(item, f)?;
@ -104,7 +105,7 @@ impl<'a, T: fmt::Display> fmt::Display for CommaSep<'a, T> {
}
impl<'a> fmt::Display for GenericBounds<'a> {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
let &GenericBounds(bounds) = self;
for (i, bound) in bounds.iter().enumerate() {
if i > 0 {
@ -117,7 +118,7 @@ impl<'a> fmt::Display for GenericBounds<'a> {
}
impl fmt::Display for clean::GenericParamDef {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
match self.kind {
clean::GenericParamDefKind::Lifetime => write!(f, "{}", self.name),
clean::GenericParamDefKind::Type { ref bounds, ref default, .. } => {
@ -156,7 +157,7 @@ impl fmt::Display for clean::GenericParamDef {
}
impl fmt::Display for clean::Generics {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
let real_params = self.params
.iter()
.filter(|p| !p.is_synthetic_type_param())
@ -173,7 +174,7 @@ impl fmt::Display for clean::Generics {
}
impl<'a> fmt::Display for WhereClause<'a> {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
let &WhereClause { gens, indent, end_newline } = self;
if gens.where_predicates.is_empty() {
return Ok(());
@ -252,14 +253,14 @@ impl<'a> fmt::Display for WhereClause<'a> {
}
impl fmt::Display for clean::Lifetime {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
f.write_str(self.get_ref())?;
Ok(())
}
}
impl fmt::Display for clean::PolyTrait {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
if !self.generic_params.is_empty() {
if f.alternate() {
write!(f, "for<{:#}> ", CommaSep(&self.generic_params))?;
@ -276,7 +277,7 @@ impl fmt::Display for clean::PolyTrait {
}
impl fmt::Display for clean::GenericBound {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
match *self {
clean::GenericBound::Outlives(ref lt) => {
write!(f, "{}", *lt)
@ -297,7 +298,7 @@ impl fmt::Display for clean::GenericBound {
}
impl fmt::Display for clean::GenericArgs {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
match *self {
clean::GenericArgs::AngleBracketed {
ref lifetimes, ref types, ref bindings
@ -374,7 +375,7 @@ impl fmt::Display for clean::GenericArgs {
}
impl fmt::Display for clean::PathSegment {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
f.write_str(&self.name)?;
if f.alternate() {
write!(f, "{:#}", self.args)
@ -385,7 +386,7 @@ impl fmt::Display for clean::PathSegment {
}
impl fmt::Display for clean::Path {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
if self.global {
f.write_str("::")?
}
@ -445,7 +446,7 @@ pub fn href(did: DefId) -> Option<(String, ItemType, Vec<String>)> {
/// Used when rendering a `ResolvedPath` structure. This invokes the `path`
/// rendering function with the necessary arguments for linking to a local path.
fn resolved_path(w: &mut fmt::Formatter, did: DefId, path: &clean::Path,
fn resolved_path(w: &mut fmt::Formatter<'_>, did: DefId, path: &clean::Path,
print_all: bool, use_absolute: bool) -> fmt::Result {
let last = path.segments.last().unwrap();
@ -474,7 +475,7 @@ fn resolved_path(w: &mut fmt::Formatter, did: DefId, path: &clean::Path,
Ok(())
}
fn primitive_link(f: &mut fmt::Formatter,
fn primitive_link(f: &mut fmt::Formatter<'_>,
prim: clean::PrimitiveType,
name: &str) -> fmt::Result {
let m = cache();
@ -519,7 +520,7 @@ fn primitive_link(f: &mut fmt::Formatter,
}
/// Helper to render type parameters
fn tybounds(w: &mut fmt::Formatter,
fn tybounds(w: &mut fmt::Formatter<'_>,
typarams: &Option<Vec<clean::GenericBound>>) -> fmt::Result {
match *typarams {
Some(ref params) => {
@ -540,7 +541,7 @@ impl<'a> HRef<'a> {
}
impl<'a> fmt::Display for HRef<'a> {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
match href(self.did) {
Some((url, shortty, fqp)) => if !f.alternate() {
write!(f, "<a class=\"{}\" href=\"{}\" title=\"{} {}\">{}</a>",
@ -553,7 +554,7 @@ impl<'a> fmt::Display for HRef<'a> {
}
}
fn fmt_type(t: &clean::Type, f: &mut fmt::Formatter, use_absolute: bool) -> fmt::Result {
fn fmt_type(t: &clean::Type, f: &mut fmt::Formatter<'_>, use_absolute: bool) -> fmt::Result {
match *t {
clean::Generic(ref name) => {
f.write_str(name)
@ -745,13 +746,13 @@ fn fmt_type(t: &clean::Type, f: &mut fmt::Formatter, use_absolute: bool) -> fmt:
}
impl fmt::Display for clean::Type {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
fmt_type(self, f, false)
}
}
fn fmt_impl(i: &clean::Impl,
f: &mut fmt::Formatter,
f: &mut fmt::Formatter<'_>,
link_trait: bool,
use_absolute: bool) -> fmt::Result {
if f.alternate() {
@ -791,20 +792,20 @@ fn fmt_impl(i: &clean::Impl,
}
impl fmt::Display for clean::Impl {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
fmt_impl(self, f, true, false)
}
}
// The difference from above is that trait is not hyperlinked.
pub fn fmt_impl_for_trait_page(i: &clean::Impl,
f: &mut fmt::Formatter,
f: &mut fmt::Formatter<'_>,
use_absolute: bool) -> fmt::Result {
fmt_impl(i, f, false, use_absolute)
}
impl fmt::Display for clean::Arguments {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
for (i, input) in self.values.iter().enumerate() {
if !input.name.is_empty() {
write!(f, "{}: ", input.name)?;
@ -821,7 +822,7 @@ impl fmt::Display for clean::Arguments {
}
impl fmt::Display for clean::FunctionRetTy {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
match *self {
clean::Return(clean::Tuple(ref tys)) if tys.is_empty() => Ok(()),
clean::Return(ref ty) if f.alternate() => write!(f, " -> {:#}", ty),
@ -832,7 +833,7 @@ impl fmt::Display for clean::FunctionRetTy {
}
impl fmt::Display for clean::FnDecl {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
if self.variadic {
if f.alternate() {
write!(f, "({args:#}, ...){arrow:#}", args = self.inputs, arrow = self.output)
@ -850,7 +851,7 @@ impl fmt::Display for clean::FnDecl {
}
impl<'a> fmt::Display for Function<'a> {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
let &Function { decl, header_len, indent, asyncness } = self;
let amp = if f.alternate() { "&" } else { "&amp;" };
let mut args = String::new();
@ -947,7 +948,7 @@ impl<'a> fmt::Display for Function<'a> {
}
impl<'a> fmt::Display for VisSpace<'a> {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
match *self.get() {
Some(clean::Public) => f.write_str("pub "),
Some(clean::Inherited) | None => Ok(()),
@ -967,7 +968,7 @@ impl<'a> fmt::Display for VisSpace<'a> {
}
impl fmt::Display for UnsafetySpace {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
match self.get() {
hir::Unsafety::Unsafe => write!(f, "unsafe "),
hir::Unsafety::Normal => Ok(())
@ -976,7 +977,7 @@ impl fmt::Display for UnsafetySpace {
}
impl fmt::Display for ConstnessSpace {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
match self.get() {
hir::Constness::Const => write!(f, "const "),
hir::Constness::NotConst => Ok(())
@ -985,7 +986,7 @@ impl fmt::Display for ConstnessSpace {
}
impl fmt::Display for AsyncSpace {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
match self.0 {
hir::IsAsync::Async => write!(f, "async "),
hir::IsAsync::NotAsync => Ok(()),
@ -994,7 +995,7 @@ impl fmt::Display for AsyncSpace {
}
impl fmt::Display for clean::Import {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
match *self {
clean::Import::Simple(ref name, ref src) => {
if *name == src.path.last_name() {
@ -1015,7 +1016,7 @@ impl fmt::Display for clean::Import {
}
impl fmt::Display for clean::ImportSource {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
match self.did {
Some(did) => resolved_path(f, did, &self.path, true, false),
_ => {
@ -1032,7 +1033,7 @@ impl fmt::Display for clean::ImportSource {
}
impl fmt::Display for clean::TypeBinding {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
if f.alternate() {
write!(f, "{} = {:#}", self.name, self.ty)
} else {
@ -1042,7 +1043,7 @@ impl fmt::Display for clean::TypeBinding {
}
impl fmt::Display for MutableSpace {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
match *self {
MutableSpace(clean::Immutable) => Ok(()),
MutableSpace(clean::Mutable) => write!(f, "mut "),
@ -1051,7 +1052,7 @@ impl fmt::Display for MutableSpace {
}
impl fmt::Display for RawMutableSpace {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
match *self {
RawMutableSpace(clean::Immutable) => write!(f, "const "),
RawMutableSpace(clean::Mutable) => write!(f, "mut "),
@ -1060,7 +1061,7 @@ impl fmt::Display for RawMutableSpace {
}
impl fmt::Display for AbiSpace {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
let quot = if f.alternate() { "\"" } else { "&quot;" };
match self.0 {
Abi::Rust => Ok(()),

View file

@ -5,7 +5,7 @@
//!
//! Use the `render_with_highlighting` to highlight some rust code.
use html::escape::Escape;
use crate::html::escape::Escape;
use std::fmt::Display;
use std::io;

View file

@ -2,7 +2,7 @@
use std::fmt;
use syntax::ext::base::MacroKind;
use clean;
use crate::clean;
/// Item type. Corresponds to `clean::ItemEnum` variants.
///
@ -189,7 +189,7 @@ impl ItemType {
}
impl fmt::Display for ItemType {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
self.css_class().fmt(f)
}
}
@ -211,7 +211,7 @@ impl NameSpace {
}
impl fmt::Display for NameSpace {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
self.to_static_str().fmt(f)
}
}

View file

@ -2,9 +2,8 @@ use std::fmt;
use std::io;
use std::path::PathBuf;
use externalfiles::ExternalHtml;
use html::render::SlashChecker;
use crate::externalfiles::ExternalHtml;
use crate::html::render::SlashChecker;
#[derive(Clone)]
pub struct Layout {
@ -29,7 +28,7 @@ pub struct Page<'a> {
pub fn render<T: fmt::Display, S: fmt::Display>(
dst: &mut dyn io::Write,
layout: &Layout,
page: &Page,
page: &Page<'_>,
sidebar: &S,
t: &T,
css_file_extension: bool,

View file

@ -29,9 +29,9 @@ use std::ops::Range;
use std::str;
use syntax::edition::Edition;
use html::toc::TocBuilder;
use html::highlight;
use test;
use crate::html::toc::TocBuilder;
use crate::html::highlight;
use crate::test;
use pulldown_cmark::{html, Event, Tag, Parser};
use pulldown_cmark::{Options, OPTION_ENABLE_FOOTNOTES, OPTION_ENABLE_TABLES};
@ -101,7 +101,7 @@ impl<'a> Line<'a> {
// is done in the single # case. This inconsistency seems okay, if non-ideal. In
// order to fix it we'd have to iterate to find the first non-# character, and
// then reallocate to remove it; which would make us return a String.
fn map_line(s: &str) -> Line {
fn map_line(s: &str) -> Line<'_> {
let trimmed = s.trim();
if trimmed.starts_with("##") {
Line::Shown(Cow::Owned(s.replacen("##", "#", 1)))
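
For context on the function being touched here (only its signature changes): `map_line` implements the doctest convention where a leading `# ` hides a line from the rendered example and `##` escapes to a literal `#`. A rough stand-in for that logic, not the real implementation:

```rust
// Sketch only: mirrors the visible branches of map_line for rendered HTML.
fn for_html(line: &str) -> Option<String> {
    let trimmed = line.trim();
    if trimmed.starts_with("##") {
        // Shown, with the escape collapsed to a single `#`.
        Some(line.replacen("##", "#", 1))
    } else if trimmed.starts_with("# ") || trimmed == "#" {
        // Hidden from the rendered code block (still compiled and run).
        None
    } else {
        Some(line.to_string())
    }
}

fn main() {
    for l in ["# extern crate foo;", "## not hidden", "let x = 1;"] {
        println!("{:?} -> {:?}", l, for_html(l));
    }
}
```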
@ -185,7 +185,7 @@ impl<'a, I: Iterator<Item = Event<'a>>> Iterator for CodeBlocks<'a, I> {
}
}
let lines = origtext.lines().filter_map(|l| map_line(l).for_html());
let text = lines.collect::<Vec<Cow<str>>>().join("\n");
let text = lines.collect::<Vec<Cow<'_, str>>>().join("\n");
PLAYGROUND.with(|play| {
// insert newline to clearly separate it from the
// previous block so we can shorten the html output
@ -196,7 +196,7 @@ impl<'a, I: Iterator<Item = Event<'a>>> Iterator for CodeBlocks<'a, I> {
}
let test = origtext.lines()
.map(|l| map_line(l).for_code())
.collect::<Vec<Cow<str>>>().join("\n");
.collect::<Vec<Cow<'_, str>>>().join("\n");
let krate = krate.as_ref().map(|s| &**s);
let (test, _) = test::make_test(&test, krate, false,
&Default::default());
@ -386,7 +386,7 @@ impl<'a, I: Iterator<Item = Event<'a>>> SummaryLine<'a, I> {
}
}
fn check_if_allowed_tag(t: &Tag) -> bool {
fn check_if_allowed_tag(t: &Tag<'_>) -> bool {
match *t {
Tag::Paragraph
| Tag::Item
@ -523,7 +523,7 @@ impl<'a, I: Iterator<Item = Event<'a>>> Iterator for Footnotes<'a, I> {
pub struct TestableCodeError(());
impl fmt::Display for TestableCodeError {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
write!(f, "invalid start of a new code block")
}
}
@ -569,7 +569,7 @@ pub fn find_testable_code<T: test::Tester>(
}
if let Some(offset) = offset {
let lines = test_s.lines().map(|l| map_line(l).for_code());
let text = lines.collect::<Vec<Cow<str>>>().join("\n");
let text = lines.collect::<Vec<Cow<'_, str>>>().join("\n");
nb_lines += doc[prev_offset..offset].lines().count();
let line = tests.get_line() + (nb_lines - 1);
tests.add_test(text, block_info, line);
@ -681,7 +681,7 @@ impl LangString {
}
impl<'a> fmt::Display for Markdown<'a> {
fn fmt(&self, fmt: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, fmt: &mut fmt::Formatter<'_>) -> fmt::Result {
let Markdown(md, links, ref ids, codes) = *self;
let mut ids = ids.borrow_mut();
@ -714,7 +714,7 @@ impl<'a> fmt::Display for Markdown<'a> {
}
impl<'a> fmt::Display for MarkdownWithToc<'a> {
fn fmt(&self, fmt: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, fmt: &mut fmt::Formatter<'_>) -> fmt::Result {
let MarkdownWithToc(md, ref ids, codes) = *self;
let mut ids = ids.borrow_mut();
@ -742,7 +742,7 @@ impl<'a> fmt::Display for MarkdownWithToc<'a> {
}
impl<'a> fmt::Display for MarkdownHtml<'a> {
fn fmt(&self, fmt: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, fmt: &mut fmt::Formatter<'_>) -> fmt::Result {
let MarkdownHtml(md, ref ids, codes) = *self;
let mut ids = ids.borrow_mut();
@ -772,7 +772,7 @@ impl<'a> fmt::Display for MarkdownHtml<'a> {
}
impl<'a> fmt::Display for MarkdownSummaryLine<'a> {
fn fmt(&self, fmt: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, fmt: &mut fmt::Formatter<'_>) -> fmt::Result {
let MarkdownSummaryLine(md, links) = *self;
// This is actually common enough to special-case
if md.is_empty() { return Ok(()) }

View file

@ -55,18 +55,18 @@ use rustc::hir;
use rustc::util::nodemap::{FxHashMap, FxHashSet};
use rustc_data_structures::flock;
use clean::{self, AttributesExt, Deprecation, GetDefId, SelfTy, Mutability};
use config::RenderOptions;
use doctree;
use fold::DocFolder;
use html::escape::Escape;
use html::format::{AsyncSpace, ConstnessSpace};
use html::format::{GenericBounds, WhereClause, href, AbiSpace};
use html::format::{VisSpace, Function, UnsafetySpace, MutableSpace};
use html::format::fmt_impl_for_trait_page;
use html::item_type::ItemType;
use html::markdown::{self, Markdown, MarkdownHtml, MarkdownSummaryLine, ErrorCodes, IdMap};
use html::{highlight, layout, static_files};
use crate::clean::{self, AttributesExt, Deprecation, GetDefId, SelfTy, Mutability};
use crate::config::RenderOptions;
use crate::doctree;
use crate::fold::DocFolder;
use crate::html::escape::Escape;
use crate::html::format::{AsyncSpace, ConstnessSpace};
use crate::html::format::{GenericBounds, WhereClause, href, AbiSpace};
use crate::html::format::{VisSpace, Function, UnsafetySpace, MutableSpace};
use crate::html::format::fmt_impl_for_trait_page;
use crate::html::item_type::ItemType;
use crate::html::markdown::{self, Markdown, MarkdownHtml, MarkdownSummaryLine, ErrorCodes, IdMap};
use crate::html::{highlight, layout, static_files};
use minifier;
@ -76,7 +76,7 @@ pub type NameDoc = (String, Option<String>);
pub struct SlashChecker<'a>(pub &'a str);
impl<'a> Display for SlashChecker<'a> {
fn fmt(&self, f: &mut Formatter) -> fmt::Result {
fn fmt(&self, f: &mut Formatter<'_>) -> fmt::Result {
if !self.0.ends_with("/") && !self.0.is_empty() {
write!(f, "{}/", self.0)
} else {
@ -223,7 +223,7 @@ impl error::Error for Error {
}
impl Display for Error {
fn fmt(&self, f: &mut Formatter) -> fmt::Result {
fn fmt(&self, f: &mut Formatter<'_>) -> fmt::Result {
write!(f, "\"{}\": {}", self.file.display(), self.error)
}
}
@ -367,7 +367,7 @@ pub struct Cache {
#[derive(Default)]
pub struct RenderInfo {
pub inlined: FxHashSet<DefId>,
pub external_paths: ::core::ExternalPaths,
pub external_paths: crate::core::ExternalPaths,
pub external_typarams: FxHashMap<DefId, String>,
pub exact_paths: FxHashMap<DefId, Vec<String>>,
pub access_levels: AccessLevels<DefId>,
@ -1117,7 +1117,11 @@ themePicker.onblur = handleThemeButtonsBlur;
// with rustdoc running in parallel.
all_indexes.sort();
let mut w = try_err!(File::create(&dst), &dst);
try_err!(writeln!(&mut w, "var N=null,E=\"\",T=\"t\",U=\"u\",searchIndex={{}};"), &dst);
if options.enable_minification {
try_err!(writeln!(&mut w, "var N=null,E=\"\",T=\"t\",U=\"u\",searchIndex={{}};"), &dst);
} else {
try_err!(writeln!(&mut w, "var searchIndex={{}};"), &dst);
}
try_err!(write_minify_replacer(&mut w,
&format!("{}\n{}", variables.join(""), all_indexes.join("\n")),
options.enable_minification),
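
Besides the edition work, this hunk carries a functional change: the pre-minified preamble with the single-letter aliases (`N`, `E`, `T`, `U`) is only written when `options.enable_minification` is set, so a build with minification disabled gets the plain `var searchIndex={};` and stays readable. The same pattern in a self-contained sketch (the function name here is made up):

```rust
use std::io::{self, Write};

// Sketch only: emit the compact aliases solely for minified output.
fn write_search_index_preamble(w: &mut impl Write, minify: bool) -> io::Result<()> {
    if minify {
        writeln!(w, "var N=null,E=\"\",T=\"t\",U=\"u\",searchIndex={{}};")
    } else {
        writeln!(w, "var searchIndex={{}};")
    }
}

fn main() -> io::Result<()> {
    let mut out = Vec::new();
    write_search_index_preamble(&mut out, false)?;
    print!("{}", String::from_utf8_lossy(&out));
    Ok(())
}
```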
@ -1130,7 +1134,7 @@ themePicker.onblur = handleThemeButtonsBlur;
md_opts.output = cx.dst.clone();
md_opts.external_html = (*cx.shared).layout.external_html.clone();
::markdown::render(index_page, md_opts, diag);
crate::markdown::render(index_page, md_opts, diag);
} else {
let dst = cx.dst.join("index.html");
let mut w = BufWriter::new(try_err!(File::create(&dst), &dst));
@ -1808,7 +1812,7 @@ impl ItemEntry {
}
impl fmt::Display for ItemEntry {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
write!(f, "<a href='{}'>{}</a>", self.url, Escape(&self.name))
}
}
@ -1893,7 +1897,7 @@ impl AllTypes {
}
}
fn print_entries(f: &mut fmt::Formatter, e: &FxHashSet<ItemEntry>, title: &str,
fn print_entries(f: &mut fmt::Formatter<'_>, e: &FxHashSet<ItemEntry>, title: &str,
class: &str) -> fmt::Result {
if !e.is_empty() {
let mut e: Vec<&ItemEntry> = e.iter().collect();
@ -1908,7 +1912,7 @@ fn print_entries(f: &mut fmt::Formatter, e: &FxHashSet<ItemEntry>, title: &str,
}
impl fmt::Display for AllTypes {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
write!(f,
"<h1 class='fqn'>\
<span class='out-of-band'>\
@ -1965,7 +1969,7 @@ impl<'a> Settings<'a> {
}
impl<'a> fmt::Display for Settings<'a> {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
write!(f,
"<h1 class='fqn'>\
<span class='in-band'>Rustdoc settings</span>\
@ -2364,16 +2368,16 @@ impl<'a> Item<'a> {
}
}
fn wrap_into_docblock<F>(w: &mut fmt::Formatter,
fn wrap_into_docblock<F>(w: &mut fmt::Formatter<'_>,
f: F) -> fmt::Result
where F: Fn(&mut fmt::Formatter) -> fmt::Result {
where F: Fn(&mut fmt::Formatter<'_>) -> fmt::Result {
write!(w, "<div class=\"docblock type-decl hidden-by-usual-hider\">")?;
f(w)?;
write!(w, "</div>")
}
impl<'a> fmt::Display for Item<'a> {
fn fmt(&self, fmt: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, fmt: &mut fmt::Formatter<'_>) -> fmt::Result {
debug_assert!(!self.item.is_stripped());
// Write the breadcrumb trail header for the top
write!(fmt, "<h1 class='fqn'><span class='out-of-band'>")?;
@ -2516,7 +2520,7 @@ fn plain_summary_line_short(s: Option<&str>) -> String {
markdown::plain_summary_line_full(&line[..], true)
}
fn document(w: &mut fmt::Formatter, cx: &Context, item: &clean::Item) -> fmt::Result {
fn document(w: &mut fmt::Formatter<'_>, cx: &Context, item: &clean::Item) -> fmt::Result {
if let Some(ref name) = item.name {
info!("Documenting {}", name);
}
@ -2526,7 +2530,7 @@ fn document(w: &mut fmt::Formatter, cx: &Context, item: &clean::Item) -> fmt::Re
}
/// Render md_text as markdown.
fn render_markdown(w: &mut fmt::Formatter,
fn render_markdown(w: &mut fmt::Formatter<'_>,
cx: &Context,
md_text: &str,
links: Vec<(String, String)>,
@ -2541,8 +2545,13 @@ fn render_markdown(w: &mut fmt::Formatter,
cx.codes))
}
fn document_short(w: &mut fmt::Formatter, cx: &Context, item: &clean::Item, link: AssocItemLink,
prefix: &str, is_hidden: bool) -> fmt::Result {
fn document_short(
w: &mut fmt::Formatter<'_>,
cx: &Context,
item: &clean::Item,
link: AssocItemLink<'_>,
prefix: &str, is_hidden: bool
) -> fmt::Result {
if let Some(s) = item.doc_value() {
let markdown = if s.contains('\n') {
format!("{} [Read more]({})",
@ -2559,7 +2568,7 @@ fn document_short(w: &mut fmt::Formatter, cx: &Context, item: &clean::Item, link
Ok(())
}
fn document_full(w: &mut fmt::Formatter, item: &clean::Item,
fn document_full(w: &mut fmt::Formatter<'_>, item: &clean::Item,
cx: &Context, prefix: &str, is_hidden: bool) -> fmt::Result {
if let Some(s) = cx.shared.maybe_collapsed_doc_value(item) {
debug!("Doc block: =====\n{}\n=====", s);
@ -2572,7 +2581,7 @@ fn document_full(w: &mut fmt::Formatter, item: &clean::Item,
Ok(())
}
fn document_stability(w: &mut fmt::Formatter, cx: &Context, item: &clean::Item,
fn document_stability(w: &mut fmt::Formatter<'_>, cx: &Context, item: &clean::Item,
is_hidden: bool) -> fmt::Result {
let stabilities = short_stability(item, cx);
if !stabilities.is_empty() {
@ -2589,7 +2598,7 @@ fn document_non_exhaustive_header(item: &clean::Item) -> &str {
if item.is_non_exhaustive() { " (Non-exhaustive)" } else { "" }
}
fn document_non_exhaustive(w: &mut fmt::Formatter, item: &clean::Item) -> fmt::Result {
fn document_non_exhaustive(w: &mut fmt::Formatter<'_>, item: &clean::Item) -> fmt::Result {
if item.is_non_exhaustive() {
write!(w, "<div class='docblock non-exhaustive non-exhaustive-{}'>", {
if item.is_struct() { "struct" } else if item.is_enum() { "enum" } else { "type" }
@ -2637,7 +2646,7 @@ fn name_key(name: &str) -> (&str, u64, usize) {
}
}
fn item_module(w: &mut fmt::Formatter, cx: &Context,
fn item_module(w: &mut fmt::Formatter<'_>, cx: &Context,
item: &clean::Item, items: &[clean::Item]) -> fmt::Result {
document(w, cx, item)?;
@ -2741,7 +2750,7 @@ fn item_module(w: &mut fmt::Formatter, cx: &Context,
match myitem.inner {
clean::ExternCrateItem(ref name, ref src) => {
use html::format::HRef;
use crate::html::format::HRef;
match *src {
Some(ref src) => {
@ -2957,7 +2966,7 @@ fn short_stability(item: &clean::Item, cx: &Context) -> Vec<String> {
stability
}
fn item_constant(w: &mut fmt::Formatter, cx: &Context, it: &clean::Item,
fn item_constant(w: &mut fmt::Formatter<'_>, cx: &Context, it: &clean::Item,
c: &clean::Constant) -> fmt::Result {
write!(w, "<pre class='rust const'>")?;
render_attributes(w, it)?;
@ -2969,7 +2978,7 @@ fn item_constant(w: &mut fmt::Formatter, cx: &Context, it: &clean::Item,
document(w, cx, it)
}
fn item_static(w: &mut fmt::Formatter, cx: &Context, it: &clean::Item,
fn item_static(w: &mut fmt::Formatter<'_>, cx: &Context, it: &clean::Item,
s: &clean::Static) -> fmt::Result {
write!(w, "<pre class='rust static'>")?;
render_attributes(w, it)?;
@ -2982,7 +2991,7 @@ fn item_static(w: &mut fmt::Formatter, cx: &Context, it: &clean::Item,
document(w, cx, it)
}
fn item_function(w: &mut fmt::Formatter, cx: &Context, it: &clean::Item,
fn item_function(w: &mut fmt::Formatter<'_>, cx: &Context, it: &clean::Item,
f: &clean::Function) -> fmt::Result {
let header_len = format!(
"{}{}{}{}{:#}fn {}{:#}",
@ -3016,7 +3025,7 @@ fn item_function(w: &mut fmt::Formatter, cx: &Context, it: &clean::Item,
document(w, cx, it)
}
fn render_implementor(cx: &Context, implementor: &Impl, w: &mut fmt::Formatter,
fn render_implementor(cx: &Context, implementor: &Impl, w: &mut fmt::Formatter<'_>,
implementor_dups: &FxHashMap<&str, (DefId, bool)>) -> fmt::Result {
// If there's already another implementor that has the same abridged name, use the
// full path, for example in `std::iter::ExactSizeIterator`
@ -3033,7 +3042,7 @@ fn render_implementor(cx: &Context, implementor: &Impl, w: &mut fmt::Formatter,
Ok(())
}
fn render_impls(cx: &Context, w: &mut fmt::Formatter,
fn render_impls(cx: &Context, w: &mut fmt::Formatter<'_>,
traits: &[&&Impl],
containing_item: &clean::Item) -> fmt::Result {
for i in traits {
@ -3070,7 +3079,7 @@ fn compare_impl<'a, 'b>(lhs: &'a &&Impl, rhs: &'b &&Impl) -> Ordering {
}
fn item_trait(
w: &mut fmt::Formatter,
w: &mut fmt::Formatter<'_>,
cx: &Context,
it: &clean::Item,
t: &clean::Trait,
@ -3156,7 +3165,7 @@ fn item_trait(
document(w, cx, it)?;
fn write_small_section_header(
w: &mut fmt::Formatter,
w: &mut fmt::Formatter<'_>,
id: &str,
title: &str,
extra_content: &str,
@ -3167,11 +3176,11 @@ fn item_trait(
</h2>{2}", id, title, extra_content)
}
fn write_loading_content(w: &mut fmt::Formatter, extra_content: &str) -> fmt::Result {
fn write_loading_content(w: &mut fmt::Formatter<'_>, extra_content: &str) -> fmt::Result {
write!(w, "{}<span class='loading-content'>Loading content...</span>", extra_content)
}
fn trait_item(w: &mut fmt::Formatter, cx: &Context, m: &clean::Item, t: &clean::Item)
fn trait_item(w: &mut fmt::Formatter<'_>, cx: &Context, m: &clean::Item, t: &clean::Item)
-> fmt::Result {
let name = m.name.as_ref().unwrap();
let item_type = m.type_();
@ -3328,8 +3337,8 @@ fn item_trait(
Ok(())
}
fn naive_assoc_href(it: &clean::Item, link: AssocItemLink) -> String {
use html::item_type::ItemType::*;
fn naive_assoc_href(it: &clean::Item, link: AssocItemLink<'_>) -> String {
use crate::html::item_type::ItemType::*;
let name = it.name.as_ref().unwrap();
let ty = match it.type_() {
@ -3347,11 +3356,11 @@ fn naive_assoc_href(it: &clean::Item, link: AssocItemLink) -> String {
}
}
fn assoc_const(w: &mut fmt::Formatter,
fn assoc_const(w: &mut fmt::Formatter<'_>,
it: &clean::Item,
ty: &clean::Type,
_default: Option<&String>,
link: AssocItemLink) -> fmt::Result {
link: AssocItemLink<'_>) -> fmt::Result {
write!(w, "{}const <a href='{}' class=\"constant\"><b>{}</b></a>: {}",
VisSpace(&it.visibility),
naive_assoc_href(it, link),
@ -3363,7 +3372,7 @@ fn assoc_const(w: &mut fmt::Formatter,
fn assoc_type<W: fmt::Write>(w: &mut W, it: &clean::Item,
bounds: &[clean::GenericBound],
default: Option<&clean::Type>,
link: AssocItemLink) -> fmt::Result {
link: AssocItemLink<'_>) -> fmt::Result {
write!(w, "type <a href='{}' class=\"type\">{}</a>",
naive_assoc_href(it, link),
it.name.as_ref().unwrap())?;
@ -3389,22 +3398,22 @@ fn render_stability_since_raw<'a, T: fmt::Write>(
Ok(())
}
fn render_stability_since(w: &mut fmt::Formatter,
fn render_stability_since(w: &mut fmt::Formatter<'_>,
item: &clean::Item,
containing_item: &clean::Item) -> fmt::Result {
render_stability_since_raw(w, item.stable_since(), containing_item.stable_since())
}
fn render_assoc_item(w: &mut fmt::Formatter,
fn render_assoc_item(w: &mut fmt::Formatter<'_>,
item: &clean::Item,
link: AssocItemLink,
link: AssocItemLink<'_>,
parent: ItemType) -> fmt::Result {
fn method(w: &mut fmt::Formatter,
fn method(w: &mut fmt::Formatter<'_>,
meth: &clean::Item,
header: hir::FnHeader,
g: &clean::Generics,
d: &clean::FnDecl,
link: AssocItemLink,
link: AssocItemLink<'_>,
parent: ItemType)
-> fmt::Result {
let name = meth.name.as_ref().unwrap();
@ -3481,7 +3490,7 @@ fn render_assoc_item(w: &mut fmt::Formatter,
}
}
fn item_struct(w: &mut fmt::Formatter, cx: &Context, it: &clean::Item,
fn item_struct(w: &mut fmt::Formatter<'_>, cx: &Context, it: &clean::Item,
s: &clean::Struct) -> fmt::Result {
wrap_into_docblock(w, |w| {
write!(w, "<pre class='rust struct'>")?;
@ -3532,7 +3541,7 @@ fn item_struct(w: &mut fmt::Formatter, cx: &Context, it: &clean::Item,
render_assoc_items(w, cx, it, it.def_id, AssocItemRender::All)
}
fn item_union(w: &mut fmt::Formatter, cx: &Context, it: &clean::Item,
fn item_union(w: &mut fmt::Formatter<'_>, cx: &Context, it: &clean::Item,
s: &clean::Union) -> fmt::Result {
wrap_into_docblock(w, |w| {
write!(w, "<pre class='rust union'>")?;
@ -3577,7 +3586,7 @@ fn item_union(w: &mut fmt::Formatter, cx: &Context, it: &clean::Item,
render_assoc_items(w, cx, it, it.def_id, AssocItemRender::All)
}
fn item_enum(w: &mut fmt::Formatter, cx: &Context, it: &clean::Item,
fn item_enum(w: &mut fmt::Formatter<'_>, cx: &Context, it: &clean::Item,
e: &clean::Enum) -> fmt::Result {
wrap_into_docblock(w, |w| {
write!(w, "<pre class='rust enum'>")?;
@ -3666,7 +3675,7 @@ fn item_enum(w: &mut fmt::Formatter, cx: &Context, it: &clean::Item,
write!(w, "</code></span>")?;
document(w, cx, variant)?;
use clean::{Variant, VariantKind};
use crate::clean::{Variant, VariantKind};
if let clean::VariantItem(Variant {
kind: VariantKind::Struct(ref s)
}) = variant.inner {
@ -3678,7 +3687,7 @@ fn item_enum(w: &mut fmt::Formatter, cx: &Context, it: &clean::Item,
write!(w, "<h3>Fields of <b>{name}</b></h3><div>",
name = variant.name.as_ref().unwrap())?;
for field in &s.fields {
use clean::StructFieldItem;
use crate::clean::StructFieldItem;
if let StructFieldItem(ref ty) = field.inner {
let id = cx.derive_id(format!("variant.{}.field.{}",
variant.name.as_ref().unwrap(),
@ -3741,7 +3750,7 @@ const ATTRIBUTE_WHITELIST: &'static [&'static str] = &[
"non_exhaustive"
];
fn render_attributes(w: &mut fmt::Formatter, it: &clean::Item) -> fmt::Result {
fn render_attributes(w: &mut fmt::Formatter<'_>, it: &clean::Item) -> fmt::Result {
let mut attrs = String::new();
for attr in &it.attrs.other_attrs {
@ -3759,7 +3768,7 @@ fn render_attributes(w: &mut fmt::Formatter, it: &clean::Item) -> fmt::Result {
Ok(())
}
fn render_struct(w: &mut fmt::Formatter, it: &clean::Item,
fn render_struct(w: &mut fmt::Formatter<'_>, it: &clean::Item,
g: Option<&clean::Generics>,
ty: doctree::StructType,
fields: &[clean::Item],
@ -3835,7 +3844,7 @@ fn render_struct(w: &mut fmt::Formatter, it: &clean::Item,
Ok(())
}
fn render_union(w: &mut fmt::Formatter, it: &clean::Item,
fn render_union(w: &mut fmt::Formatter<'_>, it: &clean::Item,
g: Option<&clean::Generics>,
fields: &[clean::Item],
tab: &str,
@ -3893,11 +3902,11 @@ enum RenderMode {
ForDeref { mut_: bool },
}
fn render_assoc_items(w: &mut fmt::Formatter,
fn render_assoc_items(w: &mut fmt::Formatter<'_>,
cx: &Context,
containing_item: &clean::Item,
it: DefId,
what: AssocItemRender) -> fmt::Result {
what: AssocItemRender<'_>) -> fmt::Result {
let c = cache();
let v = match c.impls.get(&it) {
Some(v) => v,
@ -3955,7 +3964,7 @@ fn render_assoc_items(w: &mut fmt::Formatter,
struct RendererStruct<'a, 'b, 'c>(&'a Context, Vec<&'b &'b Impl>, &'c clean::Item);
impl<'a, 'b, 'c> fmt::Display for RendererStruct<'a, 'b, 'c> {
fn fmt(&self, fmt: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, fmt: &mut fmt::Formatter<'_>) -> fmt::Result {
render_impls(self.0, fmt, &self.1, self.2)
}
}
@ -3996,7 +4005,7 @@ fn render_assoc_items(w: &mut fmt::Formatter,
Ok(())
}
fn render_deref_methods(w: &mut fmt::Formatter, cx: &Context, impl_: &Impl,
fn render_deref_methods(w: &mut fmt::Formatter<'_>, cx: &Context, impl_: &Impl,
container_item: &clean::Item, deref_mut: bool) -> fmt::Result {
let deref_type = impl_.inner_impl().trait_.as_ref().unwrap();
let target = impl_.inner_impl().items.iter().filter_map(|item| {
@ -4107,7 +4116,7 @@ fn spotlight_decl(decl: &clean::FnDecl) -> Result<String, fmt::Error> {
Ok(out)
}
fn render_impl(w: &mut fmt::Formatter, cx: &Context, i: &Impl, link: AssocItemLink,
fn render_impl(w: &mut fmt::Formatter<'_>, cx: &Context, i: &Impl, link: AssocItemLink<'_>,
render_mode: RenderMode, outer_version: Option<&str>,
show_def_docs: bool, use_absolute: Option<bool>) -> fmt::Result {
if render_mode == RenderMode::Normal {
@ -4149,8 +4158,8 @@ fn render_impl(w: &mut fmt::Formatter, cx: &Context, i: &Impl, link: AssocItemLi
}
}
fn doc_impl_item(w: &mut fmt::Formatter, cx: &Context, item: &clean::Item,
link: AssocItemLink, render_mode: RenderMode,
fn doc_impl_item(w: &mut fmt::Formatter<'_>, cx: &Context, item: &clean::Item,
link: AssocItemLink<'_>, render_mode: RenderMode,
is_default_item: bool, outer_version: Option<&str>,
trait_: Option<&clean::Trait>, show_def_docs: bool) -> fmt::Result {
let item_type = item.type_();
@ -4264,7 +4273,7 @@ fn render_impl(w: &mut fmt::Formatter, cx: &Context, i: &Impl, link: AssocItemLi
false, outer_version, trait_, show_def_docs)?;
}
fn render_default_items(w: &mut fmt::Formatter,
fn render_default_items(w: &mut fmt::Formatter<'_>,
cx: &Context,
t: &clean::Trait,
i: &clean::Impl,
@ -4297,7 +4306,7 @@ fn render_impl(w: &mut fmt::Formatter, cx: &Context, i: &Impl, link: AssocItemLi
}
fn item_existential(
w: &mut fmt::Formatter,
w: &mut fmt::Formatter<'_>,
cx: &Context,
it: &clean::Item,
t: &clean::Existential,
@ -4319,7 +4328,7 @@ fn item_existential(
render_assoc_items(w, cx, it, it.def_id, AssocItemRender::All)
}
fn item_trait_alias(w: &mut fmt::Formatter, cx: &Context, it: &clean::Item,
fn item_trait_alias(w: &mut fmt::Formatter<'_>, cx: &Context, it: &clean::Item,
t: &clean::TraitAlias) -> fmt::Result {
write!(w, "<pre class='rust trait-alias'>")?;
render_attributes(w, it)?;
@ -4338,7 +4347,7 @@ fn item_trait_alias(w: &mut fmt::Formatter, cx: &Context, it: &clean::Item,
render_assoc_items(w, cx, it, it.def_id, AssocItemRender::All)
}
fn item_typedef(w: &mut fmt::Formatter, cx: &Context, it: &clean::Item,
fn item_typedef(w: &mut fmt::Formatter<'_>, cx: &Context, it: &clean::Item,
t: &clean::Typedef) -> fmt::Result {
write!(w, "<pre class='rust typedef'>")?;
render_attributes(w, it)?;
@ -4357,7 +4366,7 @@ fn item_typedef(w: &mut fmt::Formatter, cx: &Context, it: &clean::Item,
render_assoc_items(w, cx, it, it.def_id, AssocItemRender::All)
}
fn item_foreign_type(w: &mut fmt::Formatter, cx: &Context, it: &clean::Item) -> fmt::Result {
fn item_foreign_type(w: &mut fmt::Formatter<'_>, cx: &Context, it: &clean::Item) -> fmt::Result {
writeln!(w, "<pre class='rust foreigntype'>extern {{")?;
render_attributes(w, it)?;
write!(
@ -4373,7 +4382,7 @@ fn item_foreign_type(w: &mut fmt::Formatter, cx: &Context, it: &clean::Item) ->
}
impl<'a> fmt::Display for Sidebar<'a> {
fn fmt(&self, fmt: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, fmt: &mut fmt::Formatter<'_>) -> fmt::Result {
let cx = self.cx;
let it = self.item;
let parentlen = cx.current.len() - if it.is_mod() {1} else {0};
@ -4638,7 +4647,7 @@ fn sidebar_assoc_items(it: &clean::Item) -> String {
out
}
fn sidebar_struct(fmt: &mut fmt::Formatter, it: &clean::Item,
fn sidebar_struct(fmt: &mut fmt::Formatter<'_>, it: &clean::Item,
s: &clean::Struct) -> fmt::Result {
let mut sidebar = String::new();
let fields = get_struct_fields_name(&s.fields);
@ -4675,7 +4684,7 @@ fn is_negative_impl(i: &clean::Impl) -> bool {
i.polarity == Some(clean::ImplPolarity::Negative)
}
fn sidebar_trait(fmt: &mut fmt::Formatter, it: &clean::Item,
fn sidebar_trait(fmt: &mut fmt::Formatter<'_>, it: &clean::Item,
t: &clean::Trait) -> fmt::Result {
let mut sidebar = String::new();
@ -4787,7 +4796,7 @@ fn sidebar_trait(fmt: &mut fmt::Formatter, it: &clean::Item,
write!(fmt, "<div class=\"block items\">{}</div>", sidebar)
}
fn sidebar_primitive(fmt: &mut fmt::Formatter, it: &clean::Item,
fn sidebar_primitive(fmt: &mut fmt::Formatter<'_>, it: &clean::Item,
_p: &clean::PrimitiveType) -> fmt::Result {
let sidebar = sidebar_assoc_items(it);
@ -4797,7 +4806,7 @@ fn sidebar_primitive(fmt: &mut fmt::Formatter, it: &clean::Item,
Ok(())
}
fn sidebar_typedef(fmt: &mut fmt::Formatter, it: &clean::Item,
fn sidebar_typedef(fmt: &mut fmt::Formatter<'_>, it: &clean::Item,
_t: &clean::Typedef) -> fmt::Result {
let sidebar = sidebar_assoc_items(it);
@ -4822,7 +4831,7 @@ fn get_struct_fields_name(fields: &[clean::Item]) -> String {
.collect()
}
fn sidebar_union(fmt: &mut fmt::Formatter, it: &clean::Item,
fn sidebar_union(fmt: &mut fmt::Formatter<'_>, it: &clean::Item,
u: &clean::Union) -> fmt::Result {
let mut sidebar = String::new();
let fields = get_struct_fields_name(&u.fields);
@ -4840,7 +4849,7 @@ fn sidebar_union(fmt: &mut fmt::Formatter, it: &clean::Item,
Ok(())
}
fn sidebar_enum(fmt: &mut fmt::Formatter, it: &clean::Item,
fn sidebar_enum(fmt: &mut fmt::Formatter<'_>, it: &clean::Item,
e: &clean::Enum) -> fmt::Result {
let mut sidebar = String::new();
@ -4895,7 +4904,7 @@ fn item_ty_to_strs(ty: &ItemType) -> (&'static str, &'static str) {
}
}
fn sidebar_module(fmt: &mut fmt::Formatter, _it: &clean::Item,
fn sidebar_module(fmt: &mut fmt::Formatter<'_>, _it: &clean::Item,
items: &[clean::Item]) -> fmt::Result {
let mut sidebar = String::new();
@ -4927,7 +4936,7 @@ fn sidebar_module(fmt: &mut fmt::Formatter, _it: &clean::Item,
Ok(())
}
fn sidebar_foreign_type(fmt: &mut fmt::Formatter, it: &clean::Item) -> fmt::Result {
fn sidebar_foreign_type(fmt: &mut fmt::Formatter<'_>, it: &clean::Item) -> fmt::Result {
let sidebar = sidebar_assoc_items(it);
if !sidebar.is_empty() {
write!(fmt, "<div class=\"block items\">{}</div>", sidebar)?;
@ -4936,7 +4945,7 @@ fn sidebar_foreign_type(fmt: &mut fmt::Formatter, it: &clean::Item) -> fmt::Resu
}
impl<'a> fmt::Display for Source<'a> {
fn fmt(&self, fmt: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, fmt: &mut fmt::Formatter<'_>) -> fmt::Result {
let Source(s) = *self;
let lines = s.lines().count();
let mut cols = 0;
@ -4956,7 +4965,7 @@ impl<'a> fmt::Display for Source<'a> {
}
}
fn item_macro(w: &mut fmt::Formatter, cx: &Context, it: &clean::Item,
fn item_macro(w: &mut fmt::Formatter<'_>, cx: &Context, it: &clean::Item,
t: &clean::Macro) -> fmt::Result {
wrap_into_docblock(w, |w| {
w.write_str(&highlight::render_with_highlighting(&t.source,
@ -4967,7 +4976,7 @@ fn item_macro(w: &mut fmt::Formatter, cx: &Context, it: &clean::Item,
document(w, cx, it)
}
fn item_proc_macro(w: &mut fmt::Formatter, cx: &Context, it: &clean::Item, m: &clean::ProcMacro)
fn item_proc_macro(w: &mut fmt::Formatter<'_>, cx: &Context, it: &clean::Item, m: &clean::ProcMacro)
-> fmt::Result
{
let name = it.name.as_ref().expect("proc-macros always have names");
@ -5000,14 +5009,14 @@ fn item_proc_macro(w: &mut fmt::Formatter, cx: &Context, it: &clean::Item, m: &c
document(w, cx, it)
}
fn item_primitive(w: &mut fmt::Formatter, cx: &Context,
fn item_primitive(w: &mut fmt::Formatter<'_>, cx: &Context,
it: &clean::Item,
_p: &clean::PrimitiveType) -> fmt::Result {
document(w, cx, it)?;
render_assoc_items(w, cx, it, it.def_id, AssocItemRender::All)
}
fn item_keyword(w: &mut fmt::Formatter, cx: &Context,
fn item_keyword(w: &mut fmt::Formatter<'_>, cx: &Context,
it: &clean::Item,
_p: &str) -> fmt::Result {
document(w, cx, it)

View file

@ -166,13 +166,13 @@ impl TocBuilder {
}
impl fmt::Debug for Toc {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
fmt::Display::fmt(self, f)
}
}
impl fmt::Display for Toc {
fn fmt(&self, fmt: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, fmt: &mut fmt::Formatter<'_>) -> fmt::Result {
write!(fmt, "<ul>")?;
for entry in &self.entries {
// recursively format this table of contents (the

View file

@ -1,3 +1,5 @@
#![deny(rust_2018_idioms)]
#![doc(html_root_url = "https://doc.rust-lang.org/nightly/",
html_playground_url = "https://play.rust-lang.org/")]
@ -17,7 +19,6 @@
#![recursion_limit="256"]
extern crate arena;
extern crate getopts;
extern crate env_logger;
extern crate rustc;
@ -35,10 +36,6 @@ extern crate syntax_pos;
extern crate test as testing;
#[macro_use] extern crate log;
extern crate rustc_errors as errors;
extern crate pulldown_cmark;
extern crate tempfile;
extern crate minifier;
extern crate parking_lot;
extern crate serialize as rustc_serialize; // used by deriving
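
With librustdoc on the 2018 edition, plain `extern crate foo;` declarations become redundant: a Cargo dependency is addressable as a path without one, so the declarations worth keeping are renames such as `extern crate test as testing;` and the `#[macro_use]` import of `log`. A sketch of the rename case, using `core` so it compiles without any Cargo dependency:

```rust
// Sketch only: `core` stands in for a renamed dependency.
extern crate core as rust_core;

use rust_core::num::NonZeroU32;

fn main() {
    let n = NonZeroU32::new(42).expect("non-zero");
    println!("{}", n);
}
```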

View file

@ -1,4 +1,3 @@
use std::default::Default;
use std::fs::File;
use std::io::prelude::*;
use std::path::PathBuf;
@ -9,13 +8,12 @@ use testing;
use syntax::source_map::DUMMY_SP;
use syntax::feature_gate::UnstableFeatures;
use externalfiles::{LoadStringError, load_string};
use config::{Options, RenderOptions};
use html::escape::Escape;
use html::markdown;
use html::markdown::{ErrorCodes, IdMap, Markdown, MarkdownWithToc, find_testable_code};
use test::{TestOptions, Collector};
use crate::externalfiles::{LoadStringError, load_string};
use crate::config::{Options, RenderOptions};
use crate::html::escape::Escape;
use crate::html::markdown;
use crate::html::markdown::{ErrorCodes, IdMap, Markdown, MarkdownWithToc, find_testable_code};
use crate::test::{TestOptions, Collector};
/// Separate any lines at the start of the file that begin with `# ` or `%`.
fn extract_leading_metadata<'a>(s: &'a str) -> (Vec<&'a str>, &'a str) {

View file

@ -4,17 +4,17 @@ use syntax::parse::{ParseSess, token};
use syntax::source_map::FilePathMapping;
use syntax_pos::FileName;
use clean;
use core::DocContext;
use fold::DocFolder;
use html::markdown::{self, RustCodeBlock};
use passes::Pass;
use crate::clean;
use crate::core::DocContext;
use crate::fold::DocFolder;
use crate::html::markdown::{self, RustCodeBlock};
use crate::passes::Pass;
pub const CHECK_CODE_BLOCK_SYNTAX: Pass =
Pass::early("check-code-block-syntax", check_code_block_syntax,
"validates syntax inside Rust code blocks");
pub fn check_code_block_syntax(krate: clean::Crate, cx: &DocContext) -> clean::Crate {
pub fn check_code_block_syntax(krate: clean::Crate, cx: &DocContext<'_, '_, '_>) -> clean::Crate {
SyntaxChecker { cx }.fold_crate(krate)
}

View file

@ -1,7 +1,8 @@
use clean::{self, DocFragment, Item};
use fold;
use fold::DocFolder;
use passes::Pass;
use crate::clean::{self, DocFragment, Item};
use crate::fold;
use crate::fold::{DocFolder};
use crate::passes::Pass;
use std::mem::replace;
pub const COLLAPSE_DOCS: Pass =

View file

@ -10,19 +10,19 @@ use syntax_pos::DUMMY_SP;
use std::ops::Range;
use core::DocContext;
use fold::DocFolder;
use html::markdown::markdown_links;
use crate::core::DocContext;
use crate::fold::DocFolder;
use crate::html::markdown::markdown_links;
use crate::clean::*;
use crate::passes::{look_for_tests, Pass};
use clean::*;
use passes::{look_for_tests, Pass};
use super::span_of_attrs;
pub const COLLECT_INTRA_DOC_LINKS: Pass =
Pass::early("collect-intra-doc-links", collect_intra_doc_links,
"reads a crate's documentation to resolve intra-doc-links");
pub fn collect_intra_doc_links(krate: Crate, cx: &DocContext) -> Crate {
pub fn collect_intra_doc_links(krate: Crate, cx: &DocContext<'_, '_, '_>) -> Crate {
if !UnstableFeatures::from_environment().is_nightly_build() {
krate
} else {
@ -423,7 +423,7 @@ impl<'a, 'tcx, 'rcx> DocFolder for LinkCollector<'a, 'tcx, 'rcx> {
}
/// Resolves a string as a macro.
fn macro_resolve(cx: &DocContext, path_str: &str) -> Option<Def> {
fn macro_resolve(cx: &DocContext<'_, '_, '_>, path_str: &str) -> Option<Def> {
use syntax::ext::base::{MacroKind, SyntaxExtension};
let segment = ast::PathSegment::from_ident(Ident::from_str(path_str));
let path = ast::Path { segments: vec![segment], span: DUMMY_SP };
@ -451,7 +451,7 @@ fn macro_resolve(cx: &DocContext, path_str: &str) -> Option<Def> {
/// documentation attributes themselves. This is a little heavy-handed, so we display the markdown
/// line containing the failure as a note as well.
fn resolution_failure(
cx: &DocContext,
cx: &DocContext<'_, '_, '_>,
attrs: &Attributes,
path_str: &str,
dox: &str,
@ -493,7 +493,7 @@ fn resolution_failure(
diag.emit();
}
fn ambiguity_error(cx: &DocContext, attrs: &Attributes,
fn ambiguity_error(cx: &DocContext<'_, '_, '_>, attrs: &Attributes,
path_str: &str,
article1: &str, kind1: &str, disambig1: &str,
article2: &str, kind2: &str, disambig2: &str) {
@ -549,7 +549,7 @@ fn type_ns_kind(def: Def, path_str: &str) -> (&'static str, &'static str, String
}
/// Given an enum variant's def, return the def of its enum and the associated fragment.
fn handle_variant(cx: &DocContext, def: Def) -> Result<(Def, Option<String>), ()> {
fn handle_variant(cx: &DocContext<'_, '_, '_>, def: Def) -> Result<(Def, Option<String>), ()> {
use rustc::ty::DefIdTree;
let parent = if let Some(parent) = cx.tcx.parent(def.def_id()) {

View file

@ -1,17 +1,16 @@
use clean::*;
use crate::clean::*;
use crate::core::DocContext;
use crate::fold::DocFolder;
use super::Pass;
use rustc::util::nodemap::FxHashSet;
use rustc::hir::def_id::DefId;
use super::Pass;
use core::DocContext;
use fold::DocFolder;
pub const COLLECT_TRAIT_IMPLS: Pass =
Pass::early("collect-trait-impls", collect_trait_impls,
"retrieves trait impls for items in the crate");
pub fn collect_trait_impls(krate: Crate, cx: &DocContext) -> Crate {
pub fn collect_trait_impls(krate: Crate, cx: &DocContext<'_, '_, '_>) -> Crate {
let mut synth = SyntheticImplCollector::new(cx);
let mut krate = synth.fold_crate(krate);

View file

@ -11,12 +11,10 @@ use syntax::ast::NodeId;
use syntax_pos::{DUMMY_SP, Span};
use std::ops::Range;
use clean::{self, GetDefId, Item};
use core::{DocContext, DocAccessLevels};
use fold;
use fold::StripItem;
use html::markdown::{find_testable_code, ErrorCodes, LangString};
use crate::clean::{self, GetDefId, Item};
use crate::core::{DocContext, DocAccessLevels};
use crate::fold::{DocFolder, StripItem};
use crate::html::markdown::{find_testable_code, ErrorCodes, LangString};
mod collapse_docs;
pub use self::collapse_docs::COLLAPSE_DOCS;
@ -55,7 +53,7 @@ pub enum Pass {
/// traits and the like.
EarlyPass {
name: &'static str,
pass: fn(clean::Crate, &DocContext) -> clean::Crate,
pass: fn(clean::Crate, &DocContext<'_, '_, '_>) -> clean::Crate,
description: &'static str,
},
/// A "late pass" is run between crate cleaning and page generation.
@ -67,7 +65,7 @@ pub enum Pass {
}
impl fmt::Debug for Pass {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
let mut dbg = match *self {
Pass::EarlyPass { .. } => f.debug_struct("EarlyPass"),
Pass::LatePass { .. } => f.debug_struct("LatePass"),
@ -83,7 +81,7 @@ impl fmt::Debug for Pass {
impl Pass {
/// Constructs a new early pass.
pub const fn early(name: &'static str,
pass: fn(clean::Crate, &DocContext) -> clean::Crate,
pass: fn(clean::Crate, &DocContext<'_, '_, '_>) -> clean::Crate,
description: &'static str) -> Pass {
Pass::EarlyPass { name, pass, description }
}
@ -112,7 +110,7 @@ impl Pass {
}
/// If this pass is an early pass, returns the pointer to its function.
pub fn early_fn(self) -> Option<fn(clean::Crate, &DocContext) -> clean::Crate> {
pub fn early_fn(self) -> Option<fn(clean::Crate, &DocContext<'_, '_, '_>) -> clean::Crate> {
match self {
Pass::EarlyPass { pass, .. } => Some(pass),
_ => None,
@ -196,7 +194,7 @@ struct Stripper<'a> {
update_retained: bool,
}
impl<'a> fold::DocFolder for Stripper<'a> {
impl<'a> DocFolder for Stripper<'a> {
fn fold_item(&mut self, i: Item) -> Option<Item> {
match i.inner {
clean::StrippedItem(..) => {
@ -308,7 +306,7 @@ struct ImplStripper<'a> {
retained: &'a DefIdSet,
}
impl<'a> fold::DocFolder for ImplStripper<'a> {
impl<'a> DocFolder for ImplStripper<'a> {
fn fold_item(&mut self, i: Item) -> Option<Item> {
if let clean::ImplItem(ref imp) = i.inner {
// emptied none trait impls can be stripped
@ -345,7 +343,7 @@ impl<'a> fold::DocFolder for ImplStripper<'a> {
// This stripper discards all private import statements (`use`, `extern crate`)
struct ImportStripper;
impl fold::DocFolder for ImportStripper {
impl DocFolder for ImportStripper {
fn fold_item(&mut self, i: Item) -> Option<Item> {
match i.inner {
clean::ExternCrateItem(..) | clean::ImportItem(..)
@ -373,7 +371,7 @@ pub fn look_for_tests<'a, 'tcx: 'a, 'rcx: 'a>(
found_tests: usize,
}
impl ::test::Tester for Tests {
impl crate::test::Tester for Tests {
fn add_test(&mut self, _: String, _: LangString, _: usize) {
self.found_tests += 1;
}
@ -420,7 +418,7 @@ crate fn span_of_attrs(attrs: &clean::Attributes) -> Span {
/// attributes are not all sugared doc comments. It's difficult to calculate the correct span in
/// that case due to escaping and other source features.
crate fn source_span_for_markdown_range(
cx: &DocContext,
cx: &DocContext<'_, '_, '_>,
markdown: &str,
md_range: &Range<usize>,
attrs: &clean::Attributes,

View file

@ -1,9 +1,8 @@
use clean::*;
use crate::clean::*;
use crate::core::DocContext;
use crate::fold::DocFolder;
use crate::passes::{look_for_tests, Pass};
use core::DocContext;
use fold::DocFolder;
use passes::{look_for_tests, Pass};
pub const CHECK_PRIVATE_ITEMS_DOC_TESTS: Pass =
Pass::early("check-private-items-doc-tests", check_private_items_doc_tests,
@ -21,7 +20,7 @@ impl<'a, 'tcx, 'rcx> PrivateItemDocTestLinter<'a, 'tcx, 'rcx> {
}
}
pub fn check_private_items_doc_tests(krate: Crate, cx: &DocContext) -> Crate {
pub fn check_private_items_doc_tests(krate: Crate, cx: &DocContext<'_, '_, '_>) -> Crate {
let mut coll = PrivateItemDocTestLinter::new(cx);
coll.fold_crate(krate)

View file

@ -1,9 +1,9 @@
use std::sync::Arc;
use clean::{Crate, Item};
use clean::cfg::Cfg;
use fold::DocFolder;
use passes::Pass;
use crate::clean::{Crate, Item};
use crate::clean::cfg::Cfg;
use crate::fold::DocFolder;
use crate::passes::Pass;
pub const PROPAGATE_DOC_CFG: Pass =
Pass::late("propagate-doc-cfg", propagate_doc_cfg,

View file

@ -1,20 +1,18 @@
use rustc::util::nodemap::DefIdSet;
use std::mem;
use clean::{self, AttributesExt, NestedAttributesExt};
use clean::Item;
use core::DocContext;
use fold;
use fold::DocFolder;
use fold::StripItem;
use passes::{ImplStripper, Pass};
use crate::clean::{self, AttributesExt, NestedAttributesExt};
use crate::clean::Item;
use crate::core::DocContext;
use crate::fold::{DocFolder, StripItem};
use crate::passes::{ImplStripper, Pass};
pub const STRIP_HIDDEN: Pass =
Pass::early("strip-hidden", strip_hidden,
"strips all doc(hidden) items from the output");
/// Strip items marked `#[doc(hidden)]`
pub fn strip_hidden(krate: clean::Crate, _: &DocContext) -> clean::Crate {
pub fn strip_hidden(krate: clean::Crate, _: &DocContext<'_, '_, '_>) -> clean::Crate {
let mut retained = DefIdSet::default();
// strip all #[doc(hidden)] items
@ -35,7 +33,7 @@ struct Stripper<'a> {
update_retained: bool,
}
impl<'a> fold::DocFolder for Stripper<'a> {
impl<'a> DocFolder for Stripper<'a> {
fn fold_item(&mut self, i: Item) -> Option<Item> {
if i.attrs.lists("doc").has_word("hidden") {
debug!("strip_hidden: stripping {} {:?}", i.type_(), i.name);

View file

@ -1,11 +1,11 @@
use clean;
use core::DocContext;
use fold::DocFolder;
use passes::{ImportStripper, Pass};
use crate::clean;
use crate::fold::{DocFolder};
use crate::core::DocContext;
use crate::passes::{ImportStripper, Pass};
pub const STRIP_PRIV_IMPORTS: Pass = Pass::early("strip-priv-imports", strip_priv_imports,
"strips all private import statements (`use`, `extern crate`) from a crate");
pub fn strip_priv_imports(krate: clean::Crate, _: &DocContext) -> clean::Crate {
pub fn strip_priv_imports(krate: clean::Crate, _: &DocContext<'_, '_, '_>) -> clean::Crate {
ImportStripper.fold_crate(krate)
}

View file

@ -1,9 +1,9 @@
use rustc::util::nodemap::DefIdSet;
use clean;
use core::DocContext;
use fold::DocFolder;
use passes::{ImplStripper, ImportStripper, Stripper, Pass};
use crate::clean;
use crate::fold::{DocFolder};
use crate::core::DocContext;
use crate::passes::{ImplStripper, ImportStripper, Stripper, Pass};
pub const STRIP_PRIVATE: Pass =
Pass::early("strip-private", strip_private,
@ -12,7 +12,7 @@ pub const STRIP_PRIVATE: Pass =
/// Strip private items from the point of view of a crate or externally from a
/// crate, specified by the `xcrate` flag.
pub fn strip_private(mut krate: clean::Crate, cx: &DocContext) -> clean::Crate {
pub fn strip_private(mut krate: clean::Crate, cx: &DocContext<'_, '_, '_>) -> clean::Crate {
// This stripper collects all *retained* nodes.
let mut retained = DefIdSet::default();
let access_levels = cx.renderinfo.borrow().access_levels.clone();

View file

@ -2,9 +2,9 @@ use std::cmp;
use std::string::String;
use std::usize;
use clean::{self, DocFragment, Item};
use fold::{self, DocFolder};
use passes::Pass;
use crate::clean::{self, DocFragment, Item};
use crate::fold::{self, DocFolder};
use crate::passes::Pass;
pub const UNINDENT_COMMENTS: Pass =
Pass::late("unindent-comments", unindent_comments,

View file

@ -28,9 +28,9 @@ use std::process::Command;
use std::str;
use std::sync::{Arc, Mutex};
use clean::Attributes;
use config::Options;
use html::markdown::{self, ErrorCodes, LangString};
use crate::clean::Attributes;
use crate::config::Options;
use crate::html::markdown::{self, ErrorCodes, LangString};
#[derive(Clone, Default)]
pub struct TestOptions {

View file

@ -14,9 +14,10 @@ use syntax_pos::{self, Span};
use std::mem;
use core;
use clean::{self, AttributesExt, NestedAttributesExt, def_id_to_path};
use doctree::*;
use crate::core;
use crate::clean::{self, AttributesExt, NestedAttributesExt, def_id_to_path};
use crate::doctree::*;
// Looks to me like the first two of these are actually
// output parameters, maybe only mutated once; perhaps
@ -268,7 +269,7 @@ impl<'a, 'tcx, 'rcx> RustdocVisitor<'a, 'tcx, 'rcx> {
om: &mut Module,
please_inline: bool) -> bool {
fn inherits_doc_hidden(cx: &core::DocContext, mut node: ast::NodeId) -> bool {
fn inherits_doc_hidden(cx: &core::DocContext<'_, '_, '_>, mut node: ast::NodeId) -> bool {
while let Some(id) = cx.tcx.hir().get_enclosing_scope(node) {
node = id;
if cx.tcx.hir().attrs(node).lists("doc").has_word("hidden") {
@ -315,7 +316,7 @@ impl<'a, 'tcx, 'rcx> RustdocVisitor<'a, 'tcx, 'rcx> {
.insert(did, AccessLevel::Public);
},
Def::Mod(did) => if !self_is_hidden {
::visit_lib::LibEmbargoVisitor::new(self.cx).visit_mod(did);
crate::visit_lib::LibEmbargoVisitor::new(self.cx).visit_mod(did);
},
_ => {},
}

View file

@ -6,14 +6,14 @@ use rustc::util::nodemap::FxHashSet;
use std::cell::RefMut;
use clean::{AttributesExt, NestedAttributesExt};
use crate::clean::{AttributesExt, NestedAttributesExt};
// FIXME: this may not be exhaustive, but is sufficient for rustdoc's current uses
/// Similar to `librustc_privacy::EmbargoVisitor`, but also takes
/// specific rustdoc annotations into account (i.e., `doc(hidden)`)
pub struct LibEmbargoVisitor<'a, 'tcx: 'a, 'rcx: 'a> {
cx: &'a ::core::DocContext<'a, 'tcx, 'rcx>,
cx: &'a crate::core::DocContext<'a, 'tcx, 'rcx>,
// Accessibility levels for reachable nodes
access_levels: RefMut<'a, AccessLevels<DefId>>,
// Previous accessibility level, None means unreachable
@ -24,7 +24,7 @@ pub struct LibEmbargoVisitor<'a, 'tcx: 'a, 'rcx: 'a> {
impl<'a, 'tcx, 'rcx> LibEmbargoVisitor<'a, 'tcx, 'rcx> {
pub fn new(
cx: &'a ::core::DocContext<'a, 'tcx, 'rcx>
cx: &'a crate::core::DocContext<'a, 'tcx, 'rcx>
) -> LibEmbargoVisitor<'a, 'tcx, 'rcx> {
LibEmbargoVisitor {
cx,

View file

@ -204,6 +204,7 @@ pub unsafe extern "C" fn __rust_print_err(m: *mut u8, s: i32) {
}
#[no_mangle]
// NB. used by both libunwind and libpanic_abort
pub unsafe extern "C" fn __rust_abort() {
::sys::abort_internal();
}

View file

@ -23,9 +23,6 @@ use sys_common::mutex::Mutex;
#[stable(feature = "time", since = "1.3.0")]
pub use core::time::Duration;
#[unstable(feature = "duration_constants", issue = "57391")]
pub use core::time::{SECOND, MILLISECOND, MICROSECOND, NANOSECOND};
/// A measurement of a monotonically nondecreasing clock.
/// Opaque and useful only with `Duration`.
///
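With the free `SECOND`/`MILLISECOND`/... re-exports removed above, the constants are reached as associated consts on `Duration` instead. A minimal sketch of the new spelling, assuming a nightly toolchain with the `duration_constants` feature:

```rust
#![feature(duration_constants)]

use std::time::Duration;

fn main() {
    // Formerly free constants re-exported from `core::time`; now associated consts.
    assert_eq!(Duration::SECOND, Duration::from_secs(1));
    assert_eq!(Duration::MILLISECOND * 1_000, Duration::SECOND);
    assert_eq!(Duration::MICROSECOND, Duration::from_micros(1));
}
```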

View file

@ -517,7 +517,7 @@ impl MetaItem {
let span = span.with_hi(segments.last().unwrap().ident.span.hi());
Path { span, segments }
}
Some(TokenTree::Token(_, Token::Interpolated(ref nt))) => match nt.0 {
Some(TokenTree::Token(_, Token::Interpolated(nt))) => match *nt {
token::Nonterminal::NtIdent(ident, _) => Path::from_ident(ident),
token::Nonterminal::NtMeta(ref meta) => return Some(meta.clone()),
token::Nonterminal::NtPath(ref path) => path.clone(),
@ -682,7 +682,7 @@ impl LitKind {
match token {
Token::Ident(ident, false) if ident.name == "true" => Some(LitKind::Bool(true)),
Token::Ident(ident, false) if ident.name == "false" => Some(LitKind::Bool(false)),
Token::Interpolated(ref nt) => match nt.0 {
Token::Interpolated(nt) => match *nt {
token::NtExpr(ref v) | token::NtLiteral(ref v) => match v.node {
ExprKind::Lit(ref lit) => Some(lit.node.clone()),
_ => None,

View file

@ -266,7 +266,7 @@ impl<F> TTMacroExpander for F
impl MutVisitor for AvoidInterpolatedIdents {
fn visit_tt(&mut self, tt: &mut tokenstream::TokenTree) {
if let tokenstream::TokenTree::Token(_, token::Interpolated(nt)) = tt {
if let token::NtIdent(ident, is_raw) = nt.0 {
if let token::NtIdent(ident, is_raw) = **nt {
*tt = tokenstream::TokenTree::Token(ident.span,
token::Ident(ident, is_raw));
}

View file

@ -25,6 +25,7 @@ use syntax_pos::{Span, DUMMY_SP, FileName};
use syntax_pos::hygiene::ExpnFormat;
use rustc_data_structures::fx::FxHashMap;
use rustc_data_structures::sync::Lrc;
use std::fs;
use std::io::ErrorKind;
use std::{iter, mem};
@ -584,14 +585,14 @@ impl<'a, 'b> MacroExpander<'a, 'b> {
}
AttrProcMacro(ref mac, ..) => {
self.gate_proc_macro_attr_item(attr.span, &item);
let item_tok = TokenTree::Token(DUMMY_SP, Token::interpolated(match item {
let item_tok = TokenTree::Token(DUMMY_SP, Token::Interpolated(Lrc::new(match item {
Annotatable::Item(item) => token::NtItem(item),
Annotatable::TraitItem(item) => token::NtTraitItem(item.into_inner()),
Annotatable::ImplItem(item) => token::NtImplItem(item.into_inner()),
Annotatable::ForeignItem(item) => token::NtForeignItem(item.into_inner()),
Annotatable::Stmt(stmt) => token::NtStmt(stmt.into_inner()),
Annotatable::Expr(expr) => token::NtExpr(expr),
})).into();
}))).into();
let input = self.extract_proc_macro_attr_input(attr.tokens, attr.span);
let tok_result = mac.expand(self.cx, attr.span, input, item_tok);
let res = self.parse_ast_fragment(tok_result, invoc.fragment_kind,

View file

@ -88,6 +88,7 @@ use smallvec::{smallvec, SmallVec};
use syntax_pos::Span;
use rustc_data_structures::fx::FxHashMap;
use rustc_data_structures::sync::Lrc;
use std::collections::hash_map::Entry::{Occupied, Vacant};
use std::mem;
use std::ops::{Deref, DerefMut};
@ -179,7 +180,7 @@ struct MatcherPos<'root, 'tt: 'root> {
/// all bound matches from the submatcher into the shared top-level `matches` vector. If `sep`
/// and `up` are `Some`, then `matches` is _not_ the shared top-level list. Instead, if one
/// wants the shared `matches`, one should use `up.matches`.
matches: Box<[Rc<NamedMatchVec>]>,
matches: Box<[Lrc<NamedMatchVec>]>,
/// The position in `matches` corresponding to the first metavar in this matcher's sequence of
/// token trees. In other words, the first metavar in the first token of `top_elts` corresponds
/// to `matches[match_lo]`.
@ -218,7 +219,7 @@ struct MatcherPos<'root, 'tt: 'root> {
impl<'root, 'tt> MatcherPos<'root, 'tt> {
/// Adds `m` as a named match for the `idx`-th metavar.
fn push_match(&mut self, idx: usize, m: NamedMatch) {
let matches = Rc::make_mut(&mut self.matches[idx]);
let matches = Lrc::make_mut(&mut self.matches[idx]);
matches.push(m);
}
}
@ -295,11 +296,11 @@ pub fn count_names(ms: &[TokenTree]) -> usize {
}
/// `len` `Vec`s (initially shared and empty) that will store matches of metavars.
fn create_matches(len: usize) -> Box<[Rc<NamedMatchVec>]> {
fn create_matches(len: usize) -> Box<[Lrc<NamedMatchVec>]> {
if len == 0 {
vec![]
} else {
let empty_matches = Rc::new(SmallVec::new());
let empty_matches = Lrc::new(SmallVec::new());
vec![empty_matches; len]
}.into_boxed_slice()
}
@ -353,8 +354,8 @@ fn initial_matcher_pos<'root, 'tt>(ms: &'tt [TokenTree], open: Span) -> MatcherP
/// token tree it was derived from.
#[derive(Debug, Clone)]
pub enum NamedMatch {
MatchedSeq(Rc<NamedMatchVec>, DelimSpan),
MatchedNonterminal(Rc<Nonterminal>),
MatchedSeq(Lrc<NamedMatchVec>, DelimSpan),
MatchedNonterminal(Lrc<Nonterminal>),
}
/// Takes a sequence of token trees `ms` representing a matcher which successfully matched input
@ -561,7 +562,7 @@ fn inner_parse_loop<'root, 'tt>(
new_item.match_cur += seq.num_captures;
new_item.idx += 1;
for idx in item.match_cur..item.match_cur + seq.num_captures {
new_item.push_match(idx, MatchedSeq(Rc::new(smallvec![]), sp));
new_item.push_match(idx, MatchedSeq(Lrc::new(smallvec![]), sp));
}
cur_items.push(new_item);
}
@ -707,7 +708,7 @@ pub fn parse(
let matches = eof_items[0]
.matches
.iter_mut()
.map(|dv| Rc::make_mut(dv).pop().unwrap());
.map(|dv| Lrc::make_mut(dv).pop().unwrap());
return nameize(sess, ms, matches);
} else if eof_items.len() > 1 {
return Error(
@ -780,7 +781,7 @@ pub fn parse(
let match_cur = item.match_cur;
item.push_match(
match_cur,
MatchedNonterminal(Rc::new(parse_nt(&mut parser, span, &ident.as_str()))),
MatchedNonterminal(Lrc::new(parse_nt(&mut parser, span, &ident.as_str()))),
);
item.idx += 1;
item.match_cur += 1;
@ -829,7 +830,7 @@ fn may_begin_with(name: &str, token: &Token) -> bool {
},
"block" => match *token {
Token::OpenDelim(token::Brace) => true,
Token::Interpolated(ref nt) => match nt.0 {
Token::Interpolated(ref nt) => match **nt {
token::NtItem(_)
| token::NtPat(_)
| token::NtTy(_)
@ -843,9 +844,9 @@ fn may_begin_with(name: &str, token: &Token) -> bool {
},
"path" | "meta" => match *token {
Token::ModSep | Token::Ident(..) => true,
Token::Interpolated(ref nt) => match nt.0 {
Token::Interpolated(ref nt) => match **nt {
token::NtPath(_) | token::NtMeta(_) => true,
_ => may_be_ident(&nt.0),
_ => may_be_ident(&nt),
},
_ => false,
},
@ -862,12 +863,12 @@ fn may_begin_with(name: &str, token: &Token) -> bool {
Token::ModSep | // path
Token::Lt | // path (UFCS constant)
Token::BinOp(token::Shl) => true, // path (double UFCS)
Token::Interpolated(ref nt) => may_be_ident(&nt.0),
Token::Interpolated(ref nt) => may_be_ident(nt),
_ => false,
},
"lifetime" => match *token {
Token::Lifetime(_) => true,
Token::Interpolated(ref nt) => match nt.0 {
Token::Interpolated(ref nt) => match **nt {
token::NtLifetime(_) | token::NtTT(_) => true,
_ => false,
},
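The matcher hunks above replace `Rc` with `Lrc` (rustc's reference-counting alias from `rustc_data_structures::sync`, which can be backed by `Rc` or `Arc` depending on how the compiler is built) while keeping the same clone-on-write `make_mut` pattern. A standalone sketch of that pattern, with plain `Rc` standing in for `Lrc`:

```rust
use std::rc::Rc as Lrc; // stand-in for rustc_data_structures::sync::Lrc

fn main() {
    // A shared, initially empty match list, as in `create_matches`.
    let shared: Lrc<Vec<u32>> = Lrc::new(Vec::new());
    let mut item_matches = Lrc::clone(&shared);

    // `make_mut` clones the vector only because it is still shared,
    // mirroring `push_match` in the matcher above.
    Lrc::make_mut(&mut item_matches).push(42);

    assert_eq!(shared.len(), 0);       // the shared empty list is untouched
    assert_eq!(item_matches.len(), 1); // this matcher position got its own copy
}
```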

View file

@ -149,7 +149,7 @@ pub fn transcribe(cx: &ExtCtxt<'_>,
result.push(tt.clone().into());
} else {
sp = sp.apply_mark(cx.current_expansion.mark);
let token = TokenTree::Token(sp, Token::interpolated((**nt).clone()));
let token = TokenTree::Token(sp, Token::Interpolated(nt.clone()));
result.push(token.into());
}
} else {

View file

@ -581,9 +581,8 @@ pub fn noop_visit_token<T: MutVisitor>(t: &mut Token, vis: &mut T) {
token::Ident(id, _is_raw) => vis.visit_ident(id),
token::Lifetime(id) => vis.visit_ident(id),
token::Interpolated(nt) => {
let nt = Lrc::make_mut(nt);
vis.visit_interpolated(&mut nt.0);
nt.1 = token::LazyTokenStream::new();
let mut nt = Lrc::make_mut(nt);
vis.visit_interpolated(&mut nt);
}
_ => {}
}

View file

@ -141,7 +141,7 @@ impl<'a> Parser<'a> {
/// The delimiters or `=` are still put into the resulting token stream.
crate fn parse_meta_item_unrestricted(&mut self) -> PResult<'a, (ast::Path, TokenStream)> {
let meta = match self.token {
token::Interpolated(ref nt) => match nt.0 {
token::Interpolated(ref nt) => match **nt {
Nonterminal::NtMeta(ref meta) => Some(meta.clone()),
_ => None,
},
@ -227,7 +227,7 @@ impl<'a> Parser<'a> {
/// meta_item_inner : (meta_item | UNSUFFIXED_LIT) (',' meta_item_inner)? ;
pub fn parse_meta_item(&mut self) -> PResult<'a, ast::MetaItem> {
let nt_meta = match self.token {
token::Interpolated(ref nt) => match nt.0 {
token::Interpolated(ref nt) => match **nt {
token::NtMeta(ref e) => Some(e.clone()),
_ => None,
},

View file

@ -125,6 +125,28 @@ impl<'a> StringReader<'a> {
Ok(ret_val)
}
/// Immutably extracts the string between the given delimiters, if found at the current position
pub fn peek_delimited(&self, from_ch: char, to_ch: char) -> Option<String> {
let mut pos = self.pos;
let mut idx = self.src_index(pos);
let mut ch = char_at(&self.src, idx);
if ch != from_ch {
return None;
}
pos = pos + Pos::from_usize(ch.len_utf8());
let start_pos = pos;
idx = self.src_index(pos);
while idx < self.end_src_index {
ch = char_at(&self.src, idx);
if ch == to_ch {
return Some(self.src[self.src_index(start_pos)..self.src_index(pos)].to_string());
}
pos = pos + Pos::from_usize(ch.len_utf8());
idx = self.src_index(pos);
}
return None;
}
fn try_real_token(&mut self) -> Result<TokenAndSpan, ()> {
let mut t = self.try_next_token()?;
loop {

View file

@ -1,7 +1,7 @@
// Characters and their corresponding confusables were collected from
// http://www.unicode.org/Public/security/10.0.0/confusables.txt
use syntax_pos::{Span, NO_EXPANSION};
use syntax_pos::{Span, Pos, NO_EXPANSION};
use errors::{Applicability, DiagnosticBuilder};
use super::StringReader;
@ -333,14 +333,27 @@ crate fn check_for_substitution<'a>(reader: &StringReader<'a>,
let span = Span::new(reader.pos, reader.next_pos, NO_EXPANSION);
match ASCII_ARRAY.iter().find(|&&(c, _)| c == ascii_char) {
Some(&(ascii_char, ascii_name)) => {
let msg =
format!("Unicode character '{}' ({}) looks like '{}' ({}), but it is not",
ch, u_name, ascii_char, ascii_name);
err.span_suggestion(
span,
&msg,
ascii_char.to_string(),
Applicability::MaybeIncorrect);
// special help suggestion for "directed" double quotes
if let Some(s) = reader.peek_delimited('“', '”') {
let msg = format!("Unicode characters '“' (Left Double Quotation Mark) and \
'”' (Right Double Quotation Mark) look like '{}' ({}), but are not",
ascii_char, ascii_name);
err.span_suggestion(
Span::new(reader.pos, reader.next_pos + Pos::from_usize(s.len()) +
Pos::from_usize('”'.len_utf8()), NO_EXPANSION),
&msg,
format!("\"{}\"", s),
Applicability::MaybeIncorrect);
} else {
let msg =
format!("Unicode character '{}' ({}) looks like '{}' ({}), but it is not",
ch, u_name, ascii_char, ascii_name);
err.span_suggestion(
span,
&msg,
ascii_char.to_string(),
Applicability::MaybeIncorrect);
}
true
},
None => {
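The `peek_delimited` helper added to the lexer above scans forward from the current position without consuming input, and the suggestion path uses it to rewrite a curly-quoted string as a plain `"…"` literal. A simplified standalone sketch of the same scan (operating on a plain `&str` rather than the `StringReader`):

```rust
// Returns the text between `from_ch` and `to_ch` if `src` starts with
// `from_ch` and a matching `to_ch` appears later; the input is not consumed.
fn peek_delimited(src: &str, from_ch: char, to_ch: char) -> Option<&str> {
    let mut chars = src.char_indices();
    let (_, first) = chars.next()?;
    if first != from_ch {
        return None;
    }
    let start = from_ch.len_utf8();
    for (idx, ch) in chars {
        if ch == to_ch {
            return Some(&src[start..idx]);
        }
    }
    None
}

fn main() {
    // The lexer sees `“hello world”` and can suggest `"hello world"` instead.
    let rest = "“hello world”);";
    assert_eq!(peek_delimited(rest, '“', '”'), Some("hello world"));
    assert_eq!(peek_delimited("no quotes", '“', '”'), None);
}
```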

View file

@ -119,7 +119,7 @@ enum BlockMode {
macro_rules! maybe_whole_expr {
($p:expr) => {
if let token::Interpolated(nt) = $p.token.clone() {
match nt.0 {
match *nt {
token::NtExpr(ref e) | token::NtLiteral(ref e) => {
$p.bump();
return Ok((*e).clone());
@ -146,7 +146,7 @@ macro_rules! maybe_whole_expr {
macro_rules! maybe_whole {
($p:expr, $constructor:ident, |$x:ident| $e:expr) => {
if let token::Interpolated(nt) = $p.token.clone() {
if let token::$constructor($x) = nt.0.clone() {
if let token::$constructor($x) = (*nt).clone() {
$p.bump();
return Ok($e);
}
@ -1184,8 +1184,10 @@ impl<'a> Parser<'a> {
match ate {
Some(_) => {
// See doc comment for `unmatched_angle_bracket_count`.
self.unmatched_angle_bracket_count -= 1;
debug!("expect_gt: (decrement) count={:?}", self.unmatched_angle_bracket_count);
if self.unmatched_angle_bracket_count > 0 {
self.unmatched_angle_bracket_count -= 1;
debug!("expect_gt: (decrement) count={:?}", self.unmatched_angle_bracket_count);
}
Ok(())
},
@ -1570,7 +1572,7 @@ impl<'a> Parser<'a> {
Some(body)
}
token::Interpolated(ref nt) => {
match &nt.0 {
match **nt {
token::NtBlock(..) => {
*at_end = true;
let (inner_attrs, body) = self.parse_inner_attrs_and_block()?;
@ -1913,7 +1915,7 @@ impl<'a> Parser<'a> {
fn is_named_argument(&mut self) -> bool {
let offset = match self.token {
token::Interpolated(ref nt) => match nt.0 {
token::Interpolated(ref nt) => match **nt {
token::NtPat(..) => return self.look_ahead(1, |t| t == &token::Colon),
_ => 0,
}
@ -2099,7 +2101,7 @@ impl<'a> Parser<'a> {
/// Matches `token_lit = LIT_INTEGER | ...`.
fn parse_lit_token(&mut self) -> PResult<'a, LitKind> {
let out = match self.token {
token::Interpolated(ref nt) => match nt.0 {
token::Interpolated(ref nt) => match **nt {
token::NtExpr(ref v) | token::NtLiteral(ref v) => match v.node {
ExprKind::Lit(ref lit) => { lit.node.clone() }
_ => { return self.unexpected_last(&self.token); }
@ -2248,8 +2250,10 @@ impl<'a> Parser<'a> {
// See doc comment for `unmatched_angle_bracket_count`.
self.expect(&token::Gt)?;
self.unmatched_angle_bracket_count -= 1;
debug!("parse_qpath: (decrement) count={:?}", self.unmatched_angle_bracket_count);
if self.unmatched_angle_bracket_count > 0 {
self.unmatched_angle_bracket_count -= 1;
debug!("parse_qpath: (decrement) count={:?}", self.unmatched_angle_bracket_count);
}
self.expect(&token::ModSep)?;
@ -2299,7 +2303,7 @@ impl<'a> Parser<'a> {
/// attributes.
pub fn parse_path_allowing_meta(&mut self, style: PathStyle) -> PResult<'a, ast::Path> {
let meta_ident = match self.token {
token::Interpolated(ref nt) => match nt.0 {
token::Interpolated(ref nt) => match **nt {
token::NtMeta(ref meta) => match meta.node {
ast::MetaItemKind::Word => Some(meta.ident.clone()),
_ => None,
@ -3271,7 +3275,7 @@ impl<'a> Parser<'a> {
self.meta_var_span = Some(self.span);
// Interpolated identifier and lifetime tokens are replaced with usual identifier
// and lifetime tokens, so the former are never encountered during normal parsing.
match nt.0 {
match **nt {
token::NtIdent(ident, is_raw) => (token::Ident(ident, is_raw), ident.span),
token::NtLifetime(ident) => (token::Lifetime(ident), ident.span),
_ => return,
@ -3403,7 +3407,7 @@ impl<'a> Parser<'a> {
// can't continue an expression after an ident
token::Ident(ident, is_raw) => token::ident_can_begin_expr(ident, is_raw),
token::Literal(..) | token::Pound => true,
token::Interpolated(ref nt) => match nt.0 {
token::Interpolated(ref nt) => match **nt {
token::NtIdent(..) | token::NtExpr(..) |
token::NtBlock(..) | token::NtPath(..) => true,
_ => false,
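Both parser hunks above wrap the `unmatched_angle_bracket_count` decrement in a `> 0` check so that error recovery, which may already have reset the counter, cannot drive the unsigned value past zero. A minimal sketch of the guarded-decrement pattern (the counter name is illustrative):

```rust
fn main() {
    let mut unmatched_angle_bracket_count: u32 = 0;

    // An unconditional `count -= 1` here would underflow (panicking in
    // debug builds), because recovery already reset the counter to zero.
    if unmatched_angle_bracket_count > 0 {
        unmatched_angle_bracket_count -= 1;
    }

    // `saturating_sub` expresses the same intent in one step.
    unmatched_angle_bracket_count = unmatched_angle_bracket_count.saturating_sub(1);

    assert_eq!(unmatched_angle_bracket_count, 0);
}
```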

View file

@ -13,16 +13,15 @@ use crate::syntax::parse::parse_stream_from_source_str;
use crate::syntax::parse::parser::emit_unclosed_delims;
use crate::tokenstream::{self, DelimSpan, TokenStream, TokenTree};
use serialize::{Decodable, Decoder, Encodable, Encoder};
use syntax_pos::symbol::{self, Symbol};
use syntax_pos::{self, Span, FileName};
use log::info;
use std::{cmp, fmt};
use std::fmt;
use std::mem;
#[cfg(target_arch = "x86_64")]
use rustc_data_structures::static_assert;
use rustc_data_structures::sync::{Lrc, Lock};
use rustc_data_structures::sync::Lrc;
#[derive(Clone, PartialEq, RustcEncodable, RustcDecodable, Hash, Debug, Copy)]
pub enum BinOpToken {
@ -87,7 +86,7 @@ impl Lit {
}
}
// See comments in `interpolated_to_tokenstream` for why we care about
// See comments in `Nonterminal::to_tokenstream` for why we care about
// *probably* equal here rather than actual equality
fn probably_equal_for_proc_macro(&self, other: &Lit) -> bool {
mem::discriminant(self) == mem::discriminant(other)
@ -184,9 +183,8 @@ pub enum Token {
Ident(ast::Ident, /* is_raw */ bool),
Lifetime(ast::Ident),
// The `LazyTokenStream` is a pure function of the `Nonterminal`,
// and so the `LazyTokenStream` can be ignored by Eq, Hash, etc.
Interpolated(Lrc<(Nonterminal, LazyTokenStream)>),
Interpolated(Lrc<Nonterminal>),
// Can be expanded into several tokens.
/// A doc comment.
DocComment(ast::Name),
@ -209,10 +207,6 @@ pub enum Token {
static_assert!(MEM_SIZE_OF_STATEMENT: mem::size_of::<Token>() == 16);
impl Token {
pub fn interpolated(nt: Nonterminal) -> Token {
Token::Interpolated(Lrc::new((nt, LazyTokenStream::new())))
}
/// Recovers a `Token` from an `ast::Ident`. This creates a raw identifier if necessary.
pub fn from_ast_ident(ident: ast::Ident) -> Token {
Ident(ident, ident.is_raw_guess())
@ -244,7 +238,7 @@ impl Token {
ModSep | // global path
Lifetime(..) | // labeled loop
Pound => true, // expression attributes
Interpolated(ref nt) => match nt.0 {
Interpolated(ref nt) => match **nt {
NtLiteral(..) |
NtIdent(..) |
NtExpr(..) |
@ -272,7 +266,7 @@ impl Token {
Lifetime(..) | // lifetime bound in trait object
Lt | BinOp(Shl) | // associated path
ModSep => true, // global path
Interpolated(ref nt) => match nt.0 {
Interpolated(ref nt) => match **nt {
NtIdent(..) | NtTy(..) | NtPath(..) | NtLifetime(..) => true,
_ => false,
},
@ -284,7 +278,7 @@ impl Token {
pub fn can_begin_const_arg(&self) -> bool {
match self {
OpenDelim(Brace) => true,
Interpolated(ref nt) => match nt.0 {
Interpolated(ref nt) => match **nt {
NtExpr(..) => true,
NtBlock(..) => true,
NtLiteral(..) => true,
@ -316,7 +310,7 @@ impl Token {
BinOp(Minus) => true,
Ident(ident, false) if ident.name == keywords::True.name() => true,
Ident(ident, false) if ident.name == keywords::False.name() => true,
Interpolated(ref nt) => match nt.0 {
Interpolated(ref nt) => match **nt {
NtLiteral(..) => true,
_ => false,
},
@ -328,7 +322,7 @@ impl Token {
pub fn ident(&self) -> Option<(ast::Ident, /* is_raw */ bool)> {
match *self {
Ident(ident, is_raw) => Some((ident, is_raw)),
Interpolated(ref nt) => match nt.0 {
Interpolated(ref nt) => match **nt {
NtIdent(ident, is_raw) => Some((ident, is_raw)),
_ => None,
},
@ -339,7 +333,7 @@ impl Token {
pub fn lifetime(&self) -> Option<ast::Ident> {
match *self {
Lifetime(ident) => Some(ident),
Interpolated(ref nt) => match nt.0 {
Interpolated(ref nt) => match **nt {
NtLifetime(ident) => Some(ident),
_ => None,
},
@ -367,7 +361,7 @@ impl Token {
/// Returns `true` if the token is an interpolated path.
fn is_path(&self) -> bool {
if let Interpolated(ref nt) = *self {
if let NtPath(..) = nt.0 {
if let NtPath(..) = **nt {
return true;
}
}
@ -508,98 +502,7 @@ impl Token {
}
}
pub fn interpolated_to_tokenstream(&self, sess: &ParseSess, span: Span)
-> TokenStream
{
let nt = match *self {
Token::Interpolated(ref nt) => nt,
_ => panic!("only works on interpolated tokens"),
};
// An `Interpolated` token means that we have a `Nonterminal`
// which is often a parsed AST item. At this point we now need
// to convert the parsed AST to an actual token stream, e.g.
// un-parse it basically.
//
// Unfortunately there's not really a great way to do that in a
// guaranteed lossless fashion right now. The fallback here is
// to just stringify the AST node and reparse it, but this loses
// all span information.
//
// As a result, some AST nodes are annotated with the token
// stream they came from. Here we attempt to extract these
// lossless token streams before we fall back to the
// stringification.
let mut tokens = None;
match nt.0 {
Nonterminal::NtItem(ref item) => {
tokens = prepend_attrs(sess, &item.attrs, item.tokens.as_ref(), span);
}
Nonterminal::NtTraitItem(ref item) => {
tokens = prepend_attrs(sess, &item.attrs, item.tokens.as_ref(), span);
}
Nonterminal::NtImplItem(ref item) => {
tokens = prepend_attrs(sess, &item.attrs, item.tokens.as_ref(), span);
}
Nonterminal::NtIdent(ident, is_raw) => {
let token = Token::Ident(ident, is_raw);
tokens = Some(TokenTree::Token(ident.span, token).into());
}
Nonterminal::NtLifetime(ident) => {
let token = Token::Lifetime(ident);
tokens = Some(TokenTree::Token(ident.span, token).into());
}
Nonterminal::NtTT(ref tt) => {
tokens = Some(tt.clone().into());
}
_ => {}
}
let tokens_for_real = nt.1.force(|| {
// FIXME(#43081): Avoid this pretty-print + reparse hack
let source = pprust::token_to_string(self);
let filename = FileName::macro_expansion_source_code(&source);
let (tokens, errors) = parse_stream_from_source_str(
filename, source, sess, Some(span));
emit_unclosed_delims(&errors, &sess.span_diagnostic);
tokens
});
// During early phases of the compiler the AST could get modified
// directly (e.g., attributes added or removed) and the internal cache
// of tokens may not be invalidated or updated. Consequently if the
// "lossless" token stream disagrees with our actual stringification
// (which has historically been much more battle-tested) then we go
// with the lossy stream anyway (losing span information).
//
// Note that the comparison isn't `==` here to avoid comparing spans,
// but it *also* is a "probable" equality which is a pretty weird
// definition. We mostly want to catch actual changes to the AST
// like a `#[cfg]` being processed or some weird `macro_rules!`
// expansion.
//
// What we *don't* want to catch is the fact that a user-defined
// literal like `0xf` is stringified as `15`, causing the cached token
// stream to not be literal `==` token-wise (ignoring spans) to the
// token stream we got from stringification.
//
// Instead the "probably equal" check here is "does each token
// recursively have the same discriminant?" We basically don't look at
// the token values here and assume that such fine grained token stream
// modifications, including adding/removing typically non-semantic
// tokens such as extra braces and commas, don't happen.
if let Some(tokens) = tokens {
if tokens.probably_equal_for_proc_macro(&tokens_for_real) {
return tokens
}
info!("cached tokens found, but they're not \"probably equal\", \
going with stringified version");
}
return tokens_for_real
}
// See comments in `interpolated_to_tokenstream` for why we care about
// See comments in `Nonterminal::to_tokenstream` for why we care about
// *probably* equal here rather than actual equality
crate fn probably_equal_for_proc_macro(&self, other: &Token) -> bool {
if mem::discriminant(self) != mem::discriminant(other) {
@ -731,6 +634,85 @@ impl fmt::Debug for Nonterminal {
}
}
impl Nonterminal {
pub fn to_tokenstream(&self, sess: &ParseSess, span: Span) -> TokenStream {
// A `Nonterminal` is often a parsed AST item. At this point we now
// need to convert the parsed AST to an actual token stream, e.g.
// un-parse it basically.
//
// Unfortunately there's not really a great way to do that in a
// guaranteed lossless fashion right now. The fallback here is to just
// stringify the AST node and reparse it, but this loses all span
// information.
//
// As a result, some AST nodes are annotated with the token stream they
// came from. Here we attempt to extract these lossless token streams
// before we fall back to the stringification.
let tokens = match *self {
Nonterminal::NtItem(ref item) => {
prepend_attrs(sess, &item.attrs, item.tokens.as_ref(), span)
}
Nonterminal::NtTraitItem(ref item) => {
prepend_attrs(sess, &item.attrs, item.tokens.as_ref(), span)
}
Nonterminal::NtImplItem(ref item) => {
prepend_attrs(sess, &item.attrs, item.tokens.as_ref(), span)
}
Nonterminal::NtIdent(ident, is_raw) => {
let token = Token::Ident(ident, is_raw);
Some(TokenTree::Token(ident.span, token).into())
}
Nonterminal::NtLifetime(ident) => {
let token = Token::Lifetime(ident);
Some(TokenTree::Token(ident.span, token).into())
}
Nonterminal::NtTT(ref tt) => {
Some(tt.clone().into())
}
_ => None,
};
// FIXME(#43081): Avoid this pretty-print + reparse hack
let source = pprust::nonterminal_to_string(self);
let filename = FileName::macro_expansion_source_code(&source);
let (tokens_for_real, errors) =
parse_stream_from_source_str(filename, source, sess, Some(span));
emit_unclosed_delims(&errors, &sess.span_diagnostic);
// During early phases of the compiler the AST could get modified
// directly (e.g., attributes added or removed) and the internal cache
// of tokens may not be invalidated or updated. Consequently if the
// "lossless" token stream disagrees with our actual stringification
// (which has historically been much more battle-tested) then we go
// with the lossy stream anyway (losing span information).
//
// Note that the comparison isn't `==` here to avoid comparing spans,
// but it *also* is a "probable" equality which is a pretty weird
// definition. We mostly want to catch actual changes to the AST
// like a `#[cfg]` being processed or some weird `macro_rules!`
// expansion.
//
// What we *don't* want to catch is the fact that a user-defined
// literal like `0xf` is stringified as `15`, causing the cached token
// stream to not be literal `==` token-wise (ignoring spans) to the
// token stream we got from stringification.
//
// Instead the "probably equal" check here is "does each token
// recursively have the same discriminant?" We basically don't look at
// the token values here and assume that such fine grained token stream
// modifications, including adding/removing typically non-semantic
// tokens such as extra braces and commas, don't happen.
if let Some(tokens) = tokens {
if tokens.probably_equal_for_proc_macro(&tokens_for_real) {
return tokens
}
info!("cached tokens found, but they're not \"probably equal\", \
going with stringified version");
}
return tokens_for_real
}
}
crate fn is_op(tok: &Token) -> bool {
match *tok {
OpenDelim(..) | CloseDelim(..) | Literal(..) | DocComment(..) |
@ -740,52 +722,6 @@ crate fn is_op(tok: &Token) -> bool {
}
}
#[derive(Clone)]
pub struct LazyTokenStream(Lock<Option<TokenStream>>);
impl cmp::Eq for LazyTokenStream {}
impl PartialEq for LazyTokenStream {
fn eq(&self, _other: &LazyTokenStream) -> bool {
true
}
}
impl fmt::Debug for LazyTokenStream {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
fmt::Debug::fmt(&self.clone().0.into_inner(), f)
}
}
impl LazyTokenStream {
pub fn new() -> Self {
LazyTokenStream(Lock::new(None))
}
fn force<F: FnOnce() -> TokenStream>(&self, f: F) -> TokenStream {
let mut opt_stream = self.0.lock();
if opt_stream.is_none() {
*opt_stream = Some(f());
}
opt_stream.clone().unwrap()
}
}
impl Encodable for LazyTokenStream {
fn encode<S: Encoder>(&self, _: &mut S) -> Result<(), S::Error> {
Ok(())
}
}
impl Decodable for LazyTokenStream {
fn decode<D: Decoder>(_: &mut D) -> Result<LazyTokenStream, D::Error> {
Ok(LazyTokenStream::new())
}
}
impl ::std::hash::Hash for LazyTokenStream {
fn hash<H: ::std::hash::Hasher>(&self, _hasher: &mut H) {}
}
fn prepend_attrs(sess: &ParseSess,
attrs: &[ast::Attribute],
tokens: Option<&tokenstream::TokenStream>,
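The comment in `Nonterminal::to_tokenstream` above describes a deliberately loose comparison: the cached token stream and the re-parsed one count as "probably equal" when their tokens have the same shape, ignoring values and spans. A toy sketch of that discriminant-only check outside the compiler (the `Token` enum here is illustrative, not libsyntax's):

```rust
use std::mem;

// A toy token type; the real `Token` lives in libsyntax.
#[derive(Debug)]
enum Token {
    Ident(String),
    Literal(String),
    Comma,
}

// "Probably equal": same length and, token by token, the same enum variant.
fn probably_equal(a: &[Token], b: &[Token]) -> bool {
    a.len() == b.len()
        && a.iter()
            .zip(b.iter())
            .all(|(x, y)| mem::discriminant(x) == mem::discriminant(y))
}

fn main() {
    // A cached stream for `x, 0xf` and a re-parsed one where the literal
    // was stringified as `15`: not `==`, but "probably equal".
    let cached = [Token::Ident("x".into()), Token::Comma, Token::Literal("0xf".into())];
    let reparsed = [Token::Ident("x".into()), Token::Comma, Token::Literal("15".into())];
    assert!(probably_equal(&cached, &reparsed));
}
```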

View file

@ -4,7 +4,7 @@ use crate::ast::{Attribute, MacDelimiter, GenericArg};
use crate::util::parser::{self, AssocOp, Fixity};
use crate::attr;
use crate::source_map::{self, SourceMap, Spanned};
use crate::parse::token::{self, BinOpToken, Token};
use crate::parse::token::{self, BinOpToken, Nonterminal, Token};
use crate::parse::lexer::comments;
use crate::parse::{self, ParseSess};
use crate::print::pp::{self, Breaks};
@ -257,29 +257,33 @@ pub fn token_to_string(tok: &Token) -> String {
token::Comment => "/* */".to_string(),
token::Shebang(s) => format!("/* shebang: {}*/", s),
token::Interpolated(ref nt) => match nt.0 {
token::NtExpr(ref e) => expr_to_string(e),
token::NtMeta(ref e) => meta_item_to_string(e),
token::NtTy(ref e) => ty_to_string(e),
token::NtPath(ref e) => path_to_string(e),
token::NtItem(ref e) => item_to_string(e),
token::NtBlock(ref e) => block_to_string(e),
token::NtStmt(ref e) => stmt_to_string(e),
token::NtPat(ref e) => pat_to_string(e),
token::NtIdent(e, false) => ident_to_string(e),
token::NtIdent(e, true) => format!("r#{}", ident_to_string(e)),
token::NtLifetime(e) => ident_to_string(e),
token::NtLiteral(ref e) => expr_to_string(e),
token::NtTT(ref tree) => tt_to_string(tree.clone()),
token::NtArm(ref e) => arm_to_string(e),
token::NtImplItem(ref e) => impl_item_to_string(e),
token::NtTraitItem(ref e) => trait_item_to_string(e),
token::NtGenerics(ref e) => generic_params_to_string(&e.params),
token::NtWhereClause(ref e) => where_clause_to_string(e),
token::NtArg(ref e) => arg_to_string(e),
token::NtVis(ref e) => vis_to_string(e),
token::NtForeignItem(ref e) => foreign_item_to_string(e),
}
token::Interpolated(ref nt) => nonterminal_to_string(nt),
}
}
pub fn nonterminal_to_string(nt: &Nonterminal) -> String {
match *nt {
token::NtExpr(ref e) => expr_to_string(e),
token::NtMeta(ref e) => meta_item_to_string(e),
token::NtTy(ref e) => ty_to_string(e),
token::NtPath(ref e) => path_to_string(e),
token::NtItem(ref e) => item_to_string(e),
token::NtBlock(ref e) => block_to_string(e),
token::NtStmt(ref e) => stmt_to_string(e),
token::NtPat(ref e) => pat_to_string(e),
token::NtIdent(e, false) => ident_to_string(e),
token::NtIdent(e, true) => format!("r#{}", ident_to_string(e)),
token::NtLifetime(e) => ident_to_string(e),
token::NtLiteral(ref e) => expr_to_string(e),
token::NtTT(ref tree) => tt_to_string(tree.clone()),
token::NtArm(ref e) => arm_to_string(e),
token::NtImplItem(ref e) => impl_item_to_string(e),
token::NtTraitItem(ref e) => trait_item_to_string(e),
token::NtGenerics(ref e) => generic_params_to_string(&e.params),
token::NtWhereClause(ref e) => where_clause_to_string(e),
token::NtArg(ref e) => arg_to_string(e),
token::NtVis(ref e) => vis_to_string(e),
token::NtForeignItem(ref e) => foreign_item_to_string(e),
}
}

View file

@ -72,7 +72,7 @@ impl TokenTree {
}
}
// See comments in `interpolated_to_tokenstream` for why we care about
// See comments in `Nonterminal::to_tokenstream` for why we care about
// *probably* equal here rather than actual equality
//
// This is otherwise the same as `eq_unspanned`, only recursing with a
@ -310,7 +310,7 @@ impl TokenStream {
t1.next().is_none() && t2.next().is_none()
}
// See comments in `interpolated_to_tokenstream` for why we care about
// See comments in `Nonterminal::to_tokenstream` for why we care about
// *probably* equal here rather than actual equality
//
// This is otherwise the same as `eq_unspanned`, only recursing with a

View file

@ -2,6 +2,7 @@ use crate::proc_macro_impl::EXEC_STRATEGY;
use crate::proc_macro_server;
use errors::FatalError;
use rustc_data_structures::sync::Lrc;
use syntax::ast::{self, ItemKind, Attribute, Mac};
use syntax::attr::{mark_used, mark_known};
use syntax::source_map::Span;
@ -65,7 +66,7 @@ impl MultiItemModifier for ProcMacroDerive {
// Mark attributes as known, and used.
MarkAttrs(&self.attrs).visit_item(&item);
let token = Token::interpolated(token::NtItem(item));
let token = Token::Interpolated(Lrc::new(token::NtItem(item)));
let input = tokenstream::TokenTree::Token(DUMMY_SP, token).into();
let server = proc_macro_server::Rustc::new(ecx);

View file

@ -178,8 +178,8 @@ impl FromInternal<(TreeAndJoint, &'_ ParseSess, &'_ mut Vec<Self>)>
tt!(Punct::new('#', false))
}
Interpolated(_) => {
let stream = token.interpolated_to_tokenstream(sess, span);
Interpolated(nt) => {
let stream = nt.to_tokenstream(sess, span);
TokenTree::Group(Group {
delimiter: Delimiter::None,
stream,

View file

@ -20,7 +20,7 @@ error[E0382]: use of moved value: `line2`
LL | let _moved = (line2.origin, line2.middle);
| ------------ value moved here
LL | line2.consume(); //[ast]~ ERROR use of partially moved value: `line2` [E0382]
| ^^^^^ value used here after move
| ^^^^^ value used here after partial move
|
= note: move occurs because `line2.middle` has type `Point`, which does not implement the `Copy` trait

View file

@ -20,7 +20,7 @@ error[E0382]: use of moved value: `line2`
LL | let _moved = (line2.origin, line2.middle);
| ------------ value moved here
LL | line2.consume(); //[ast]~ ERROR use of partially moved value: `line2` [E0382]
| ^^^^^ value used here after move
| ^^^^^ value used here after partial move
|
= note: move occurs because `line2.middle` has type `Point`, which does not implement the `Copy` trait

View file

@ -3,11 +3,11 @@
//
// This test is checking the count in an array expression.
// FIXME (#23926): the error output is not consistent between a
// self-hosted and a cross-compiled setup; therefore resorting to
// error-pattern for now.
// error-pattern: attempt to add with overflow
#![allow(unused_imports)]
@ -18,6 +18,7 @@ use std::{u8, u16, u32, u64, usize};
const A_I8_I
: [u32; (i8::MAX as usize) + 1]
= [0; (i8::MAX + 1) as usize];
//~^ ERROR evaluation of constant value failed
fn main() {
foo(&A_I8_I[..]);

View file

@ -7,11 +7,11 @@
// types for the left- and right-hand sides of the addition do not
// match (as well as overflow).
// FIXME (#23926): the error output is not consistent between a
// self-hosted and a cross-compiled setup; therefore resorting to
// error-pattern for now.
// error-pattern: mismatched types
#![allow(unused_imports)]
@ -22,6 +22,8 @@ use std::{u8, u16, u32, u64, usize};
const A_I8_I
: [u32; (i8::MAX as usize) + 1]
= [0; (i8::MAX + 1u8) as usize];
//~^ ERROR mismatched types
//~| ERROR cannot add `u8` to `i8`
fn main() {
foo(&A_I8_I[..]);

View file

@ -1,3 +1,4 @@
// ignore-tidy-linelength
#![feature(const_transmute)]
#![allow(const_err)] // make sure we cannot allow away the errors tested here
@ -5,6 +6,7 @@ use std::mem;
const UNALIGNED: &u16 = unsafe { mem::transmute(&[0u8; 4]) };
//~^ ERROR it is undefined behavior to use this value
//~^^ type validation failed: encountered unaligned reference (required 2 byte alignment but found 1)
const NULL: &u16 = unsafe { mem::transmute(0usize) };
//~^ ERROR it is undefined behavior to use this value

View file

@ -1,13 +1,13 @@
error[E0080]: it is undefined behavior to use this value
--> $DIR/ub-ref.rs:6:1
--> $DIR/ub-ref.rs:7:1
|
LL | const UNALIGNED: &u16 = unsafe { mem::transmute(&[0u8; 4]) };
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ type validation failed: encountered unaligned reference
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ type validation failed: encountered unaligned reference (required 2 byte alignment but found 1)
|
= note: The rules on what exactly is undefined behavior aren't clear, so this check might be overzealous. Please open an issue on the rust compiler repository if you believe it should not be considered undefined behavior
error[E0080]: it is undefined behavior to use this value
--> $DIR/ub-ref.rs:9:1
--> $DIR/ub-ref.rs:11:1
|
LL | const NULL: &u16 = unsafe { mem::transmute(0usize) };
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ type validation failed: encountered 0, but expected something greater or equal to 1
@ -15,7 +15,7 @@ LL | const NULL: &u16 = unsafe { mem::transmute(0usize) };
= note: The rules on what exactly is undefined behavior aren't clear, so this check might be overzealous. Please open an issue on the rust compiler repository if you believe it should not be considered undefined behavior
error[E0080]: it is undefined behavior to use this value
--> $DIR/ub-ref.rs:12:1
--> $DIR/ub-ref.rs:14:1
|
LL | const REF_AS_USIZE: usize = unsafe { mem::transmute(&0) };
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ type validation failed: encountered a pointer, but expected initialized plain (non-pointer) bytes
@ -23,7 +23,7 @@ LL | const REF_AS_USIZE: usize = unsafe { mem::transmute(&0) };
= note: The rules on what exactly is undefined behavior aren't clear, so this check might be overzealous. Please open an issue on the rust compiler repository if you believe it should not be considered undefined behavior
error[E0080]: it is undefined behavior to use this value
--> $DIR/ub-ref.rs:15:1
--> $DIR/ub-ref.rs:17:1
|
LL | const REF_AS_USIZE_SLICE: &[usize] = &[unsafe { mem::transmute(&0) }];
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ type validation failed: encountered a pointer at .<deref>, but expected plain (non-pointer) bytes
@ -31,7 +31,7 @@ LL | const REF_AS_USIZE_SLICE: &[usize] = &[unsafe { mem::transmute(&0) }];
= note: The rules on what exactly is undefined behavior aren't clear, so this check might be overzealous. Please open an issue on the rust compiler repository if you believe it should not be considered undefined behavior
error[E0080]: it is undefined behavior to use this value
--> $DIR/ub-ref.rs:18:1
--> $DIR/ub-ref.rs:20:1
|
LL | const USIZE_AS_REF: &'static u8 = unsafe { mem::transmute(1337usize) };
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ type validation failed: encountered integer pointer in non-ZST reference

View file

@ -1,11 +1,16 @@
// FIXME (#23926): the error output is not consistent between a
// self-hosted and a cross-compiled setup. Skipping for now.
// ignore-test FIXME(#23926)
// error-pattern: too big for the current architecture
// normalize-stderr-test "; \d+]" -> "; N]"
#![allow(exceeding_bitshifts)]
#[cfg(target_pointer_width = "64")]
fn main() {
let _fat : [u8; (1<<61)+(1<<31)] =
[0; (1u64<<61) as usize +(1u64<<31) as usize];
}
#[cfg(target_pointer_width = "32")]
fn main() {
let _fat : [u8; (1<<31)+(1<<15)] =
[0; (1u32<<31) as usize +(1u32<<15) as usize];
}

View file

@ -0,0 +1,4 @@
error: the type `[u8; N]` is too big for the current architecture
error: aborting due to previous error

View file

@ -5,7 +5,7 @@ LL | Some(right) => consume(right),
| ----- value moved here
...
LL | consume(node) + r //~ ERROR use of partially moved value: `node`
| ^^^^ value used here after move
| ^^^^ value used here after partial move
|
= note: move occurs because value has type `std::boxed::Box<List>`, which does not implement the `Copy` trait

View file

@ -5,7 +5,7 @@ LL | Foo {f} => {}
| - value moved here
...
LL | touch(&x); //~ ERROR use of partially moved value: `x`
| ^^ value borrowed here after move
| ^^ value borrowed here after partial move
|
= note: move occurs because `x.f` has type `std::string::String`, which does not implement the `Copy` trait

View file

@ -4,7 +4,7 @@ error[E0382]: use of moved value: `x`
LL | drop(x.0);
| --- value moved here
LL | drop(x); //~ ERROR use of moved value
| ^ value used here after move
| ^ value used here after partial move
|
= note: move occurs because `x.0` has type `std::vec::Vec<i32>`, which does not implement the `Copy` trait

View file

@ -15,5 +15,7 @@ fn main() {
SomeStruct::<_> { t: 22 }; // Nothing interesting given, no annotation.
SomeStruct::<u32> { t: 22 }; //~ ERROR [u32]
SomeStruct::<u32> { t: 22 }; // No lifetime bounds given.
SomeStruct::<&'static u32> { t: &22 }; //~ ERROR [&ReStatic u32]
}

View file

@ -1,8 +1,8 @@
error: user substs: UserSubsts { substs: [u32], user_self_ty: None }
--> $DIR/dump-adt-brace-struct.rs:18:5
error: user substs: UserSubsts { substs: [&ReStatic u32], user_self_ty: None }
--> $DIR/dump-adt-brace-struct.rs:20:5
|
LL | SomeStruct::<u32> { t: 22 }; //~ ERROR [u32]
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^
LL | SomeStruct::<&'static u32> { t: &22 }; //~ ERROR [&ReStatic u32]
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
error: aborting due to previous error

View file

@ -11,7 +11,7 @@ trait Bazoom<T> {
fn method<U>(&self, arg: T, arg2: U) { }
}
impl<T, U> Bazoom<U> for T {
impl<S, T> Bazoom<T> for S {
}
fn foo<'a, T>(_: T) { }
@ -22,20 +22,29 @@ fn main() {
let x = foo;
x(22);
// Here: `u32` is given.
let x = foo::<u32>; //~ ERROR [u32]
// Here: `u32` is given, which doesn't contain any lifetimes, so we don't
// have any annotation.
let x = foo::<u32>;
x(22);
let x = foo::<&'static u32>; //~ ERROR [&ReStatic u32]
x(&22);
// Here: we only want the `T` to be given, the rest should be variables.
//
// (`T` refers to the declaration of `Bazoom`)
let x = <_ as Bazoom<u32>>::method::<_>; //~ ERROR [^0, u32, ^1]
x(&22, 44, 66);
// Here: all are given
let x = <u8 as Bazoom<u16>>::method::<u32>; //~ ERROR [u8, u16, u32]
// Here: all are given and definitely contain no lifetimes, so we
// don't have any annotation.
let x = <u8 as Bazoom<u16>>::method::<u32>;
x(&22, 44, 66);
// Here: all are given and we have a lifetime.
let x = <u8 as Bazoom<&'static u16>>::method::<u32>; //~ ERROR [u8, &ReStatic u16, u32]
x(&22, &44, 66);
// Here: we want in particular that *only* the method `U`
// annotation is given, the rest are variables.
//

View file

@ -1,23 +1,23 @@
error: user substs: UserSubsts { substs: [u32], user_self_ty: None }
--> $DIR/dump-fn-method.rs:26:13
error: user substs: UserSubsts { substs: [&ReStatic u32], user_self_ty: None }
--> $DIR/dump-fn-method.rs:30:13
|
LL | let x = foo::<u32>; //~ ERROR [u32]
| ^^^^^^^^^^
LL | let x = foo::<&'static u32>; //~ ERROR [&ReStatic u32]
| ^^^^^^^^^^^^^^^^^^^
error: user substs: UserSubsts { substs: [^0, u32, ^1], user_self_ty: None }
--> $DIR/dump-fn-method.rs:32:13
--> $DIR/dump-fn-method.rs:36:13
|
LL | let x = <_ as Bazoom<u32>>::method::<_>; //~ ERROR [^0, u32, ^1]
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
error: user substs: UserSubsts { substs: [u8, u16, u32], user_self_ty: None }
--> $DIR/dump-fn-method.rs:36:13
error: user substs: UserSubsts { substs: [u8, &ReStatic u16, u32], user_self_ty: None }
--> $DIR/dump-fn-method.rs:45:13
|
LL | let x = <u8 as Bazoom<u16>>::method::<u32>; //~ ERROR [u8, u16, u32]
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
LL | let x = <u8 as Bazoom<&'static u16>>::method::<u32>; //~ ERROR [u8, &ReStatic u16, u32]
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
error: user substs: UserSubsts { substs: [^0, ^1, u32], user_self_ty: None }
--> $DIR/dump-fn-method.rs:44:5
--> $DIR/dump-fn-method.rs:53:5
|
LL | y.method::<u32>(44, 66); //~ ERROR [^0, ^1, u32]
| ^^^^^^^^^^^^^^^^^^^^^^^

View file

@ -0,0 +1,17 @@
#![feature(nll)]
struct A<'a>(&'a ());
impl A<'static> {
const IC: i32 = 10;
}
fn non_wf_associated_const<'a>(x: i32) {
A::<'a>::IC; //~ ERROR lifetime may not live long enough
}
fn wf_associated_const<'a>(x: i32) {
A::<'static>::IC;
}
fn main() {}

View file

@ -0,0 +1,10 @@
error: lifetime may not live long enough
--> $DIR/inherent-associated-constants.rs:10:5
|
LL | fn non_wf_associated_const<'a>(x: i32) {
| -- lifetime `'a` defined here
LL | A::<'a>::IC; //~ ERROR lifetime may not live long enough
| ^^^^^^^^^^^ requires that `'a` must outlive `'static`
error: aborting due to previous error

View file

@ -0,0 +1,7 @@
// ignore-tidy-linelength
fn main() {
println!(“hello world”);
//~^ ERROR unknown start of token: \u{201c}
//~^^ HELP Unicode characters '“' (Left Double Quotation Mark) and '”' (Right Double Quotation Mark) look like '"' (Quotation Mark), but are not
}

View file

@ -0,0 +1,12 @@
error: unknown start of token: \u{201c}
--> $DIR/unicode-quote-chars.rs:4:14
|
LL | println!(“hello world”);
| ^
help: Unicode characters '“' (Left Double Quotation Mark) and '”' (Right Double Quotation Mark) look like '"' (Quotation Mark), but are not
|
LL | println!("hello world");
| ^^^^^^^^^^^^^
error: aborting due to previous error

View file

@ -25,7 +25,7 @@ LL | (Some(y), ()) => {},
| - value moved here
...
LL | x; //~ ERROR use of partially moved value
| ^ value used here after move
| ^ value used here after partial move
|
= note: move occurs because value has type `std::vec::Vec<i32>`, which does not implement the `Copy` trait

View file

@ -5,7 +5,7 @@ LL | let y = *x;
| -- value moved here
LL | drop_unsized(y);
LL | println!("{}", &x);
| ^^ value borrowed here after move
| ^^ value borrowed here after partial move
|
= note: move occurs because `*x` has type `str`, which does not implement the `Copy` trait
@ -27,7 +27,7 @@ LL | let y = *x;
| -- value moved here
LL | y.foo();
LL | println!("{}", &x);
| ^^ value borrowed here after move
| ^^ value borrowed here after partial move
|
= note: move occurs because `*x` has type `str`, which does not implement the `Copy` trait
@ -48,7 +48,7 @@ error[E0382]: borrow of moved value: `x`
LL | x.foo();
| - value moved here
LL | println!("{}", &x);
| ^^ value borrowed here after move
| ^^ value borrowed here after partial move
|
= note: move occurs because `*x` has type `str`, which does not implement the `Copy` trait

View file

@ -14,7 +14,7 @@ error[E0382]: use of moved value: `x`
LL | let _y = *x;
| -- value moved here
LL | drop_unsized(x); //~ERROR use of moved value
| ^ value used here after move
| ^ value used here after partial move
|
= note: move occurs because `*x` has type `str`, which does not implement the `Copy` trait